
Depth Sensor Visualizer Update

March 23, 2022 | Sensors


When we first published the Depth Sensor Visualizer, we were encouraged by the warm reception it received in the perception community. It seems that, like us, other perception engineers were looking for more convenient ways to compare and contrast sensor systems without having to scour the internet for the right information. We continue pursuing that goal with the latest update to the Depth Sensor Visualizer.

See the updated visualizer here: https://www.tangramvision.com/resources/depth-sensor-visualizer

Code hosted here: https://gitlab.com/tangram-vision/oss/fov-visualizer

This update brings an interesting addition: omnidirectional depth! Companies like DreamVu are creating novel depth sensing technology that brings a full 360° view to your scene. We... didn’t have a way to render this, to be honest! In fact, sensors like these made us think twice about how we represented ranged cameras in general. The end result is an improved depth sensor visualization tool.

[Image: An Orbbec Astra compared to a DreamVu PAL Mini]
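To give a sense of what changed in the rendering, here is a minimal three.js-style sketch of the two coverage shapes involved: a conventional depth camera's view volume versus an omnidirectional sensor's full 360° band. This is an illustration only, not the visualizer's actual code; the function names and geometry approximations are our own assumptions.

```javascript
// Sketch only: contrasting a conventional depth-camera FOV with a 360°
// omnidirectional coverage volume. Uses three.js primitives; this is NOT
// the Depth Sensor Visualizer's actual rendering code.
import * as THREE from "three";

// A standard depth camera's coverage, approximated as an open cone:
// apex at the sensor, sized by horizontal FOV and maximum range.
function makeFrustumMesh(hFovDeg, maxRangeM) {
  const halfAngle = THREE.MathUtils.degToRad(hFovDeg / 2);
  const radius = maxRangeM * Math.tan(halfAngle);
  const geometry = new THREE.ConeGeometry(radius, maxRangeM, 32, 1, true);
  const material = new THREE.MeshBasicMaterial({ color: 0x4488ff, wireframe: true });
  return new THREE.Mesh(geometry, material);
}

// An omnidirectional sensor's coverage, approximated as an open cylinder:
// full 360° horizontally, with height set by the vertical FOV at max range.
function makeOmnidirectionalMesh(vFovDeg, maxRangeM) {
  const halfAngle = THREE.MathUtils.degToRad(vFovDeg / 2);
  const height = 2 * maxRangeM * Math.sin(halfAngle);
  const geometry = new THREE.CylinderGeometry(maxRangeM, maxRangeM, height, 64, 1, true);
  const material = new THREE.MeshBasicMaterial({ color: 0xff8844, wireframe: true });
  return new THREE.Mesh(geometry, material);
}
```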

Given the popularity of the Depth Sensor Visualizer (and our own use of the modality here at Tangram Vision), we plan to develop a LiDAR Sensor Visualizer as well. Subscribe to our newsletter to get notified when that goes live!

How Can I Add My Sensor Line?

If you represent a company that manufactures depth sensors and would like to see your 3D sensor included in the Depth Sensor Visualizer, we've made it easy to do just that.

There are two ways to do this. Note that in both cases, the datasheet for your sensor must be publicly available to anyone visiting the Depth Sensor Visualizer. If your datasheet is a PDF, we are more than happy to host it in the Tangram Vision Datasheet Library.

Option 1 (Fast): Create a Merge Request in the Repository

Those who know their way around GitLab can submit a Merge Request to the fov-visualizer repository with the right data. Every sensor has an entry in sensors.js with basic information and a link to that sensor’s datasheet. This is by far the fastest way to get your sensor added to the Visualizer; we just have to review the MR and click “merge”!

Link to sensors.js: https://gitlab.com/tangram-vision/oss/fov-visualizer/-/blob/main/sensors.js
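For reference, a hypothetical entry might look something like the block below. The field names here are placeholders we've invented for illustration; match the schema of the existing entries in sensors.js when you put together your Merge Request.

```javascript
// Hypothetical sensors.js entry — field names below are illustrative only.
// Mirror the structure of the existing entries in sensors.js for a real MR.
{
  name: "Acme DepthCam 3000",   // display name shown in the visualizer
  horizontalFov: 86,            // degrees
  verticalFov: 57,              // degrees
  minRange: 0.3,                // meters
  maxRange: 10.0,               // meters
  datasheetUrl: "https://example.com/acme-depthcam-3000-datasheet.pdf",
},
```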

Option 2 (Slower): Send Us Your Datasheet

Send us a PDF or link to the datasheet describing your depth sensor through our contact page. We will review the datasheet, pull out the relevant information, and add it to the Depth Sensor Visualizer manually.

Accessible Perception With Tangram

We often close blog posts by mentioning that Tangram Vision is committed to creating a better foundation for perception developers. What’s often left unsaid is that we work to support perception sensor manufacturers as well. We get it: hardware is hard! Making and marketing novel hardware (like, say, a 360° depth camera) is even harder. However, these solutions could be the key to great perception, and we would be remiss in our mission if we didn’t work to support them. We will continue to improve our tools and Platform to bring out the best in every system.
