
MetriCal v9.0.0 Multimodal Sensor Calibration Tool Release

March 20, 2024

Good news, everyone! MetriCal v9.0.0 is officially out!

MetriCal is Tangram Vision’s flagship multimodal sensor calibration system, which can calibrate cameras, depth sensors, LiDARs, IMUs, and soon, radars, too. All at once, with just one data capture. Learn more here.

This release brings a slew of new features and improvements that everyone will benefit from.

💡 Need to calibrate fiducial-free? We’ve also got AutoCal.

New: IMU Intrinsics

MetriCal now calibrates the intrinsics of IMUs, along with IMU extrinsics. There are five different model combinations to choose from:

  • No intrinsics (i.e. just extrinsics, thanks)
  • Scale: scale factor for gyroscope and accelerometer
  • Scale + Shear: both scale and shear factor for gyroscope and accelerometer
  • Scale + Shear + Rotation: Scale and shear, plus the rotation between the gyroscope and accelerometer
  • Scale + Shear + Rotation + G-Sensitivity: All of the above, plus G-sensitivity. G-sensitivity is a property of oscillating gyroscopes that causes them to register a bias on their measurements when they experience a specific force (the quantity an accelerometer measures).

Read all about these models, and how to apply them, in the documentation.
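To make those model combinations concrete, here is a minimal numerical sketch of how intrinsic terms like these are commonly applied to raw IMU measurements. The matrix layout and parameter names below are illustrative assumptions, not MetriCal's actual parameterization; the docs have the real definitions.

```python
import numpy as np

# Illustrative sketch of a common IMU intrinsic formulation (an assumption,
# not MetriCal's exact model): per-axis scale on the diagonal, shear (axis
# misalignment) off the diagonal, a rotation between the gyro and accel
# triads, and a g-sensitivity matrix mapping specific force to gyro bias.
accel_scale_shear = np.array([
    [1.010, 0.000, 0.000],
    [0.002, 0.990, 0.000],
    [-0.001, 0.003, 1.020],
])
gyro_scale_shear = np.array([
    [0.995, 0.000, 0.000],
    [0.001, 1.005, 0.000],
    [0.002, -0.001, 0.998],
])
R_accel_from_gyro = np.eye(3)     # rotation between the gyro and accel frames
g_sensitivity = np.zeros((3, 3))  # rad/s of gyro bias per m/s^2 of specific force

accel_raw = np.array([0.05, -0.02, 9.81])    # m/s^2
gyro_raw = np.array([0.010, -0.003, 0.001])  # rad/s

accel_corrected = accel_scale_shear @ accel_raw
gyro_corrected = (
    R_accel_from_gyro @ (gyro_scale_shear @ gyro_raw)
    - g_sensitivity @ accel_corrected
)
print(accel_corrected, gyro_corrected)
```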

New: Omnidirectional Camera Model

By customer request, MetriCal now offers full support for the omnidirectional camera model (keyword omni). This model is used for catadioptric cameras, or camera systems with both lenses and mirrors. These kinds of camera systems often have fields of view nearing a complete 360 degrees, hence the name of the model.

The Omnidirectional camera model also implements the same radial and tangential distortion terms as OpenCV RadTan. However, while OpenCV RadTan uses three radial distortion terms, the omnidirectional model only uses two. The reason for this? Everyone else did it this way, so now it's just convention. Ah well.
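For a sense of how the pieces fit together, here is a minimal sketch of a unified (Mei-style) omnidirectional projection with two radial and two tangential distortion terms. It illustrates the general model family; consult the docs for MetriCal's exact parameterization.

```python
import numpy as np

# Sketch of a unified/Mei-style omnidirectional projection (assumed model
# family, not MetriCal's exact math): project onto the unit sphere, shift
# the projection center by xi, apply RadTan-style distortion with two
# radial terms, then map into pixels with the pinhole intrinsics.
def project_omni(point_3d, xi, k1, k2, p1, p2, fx, fy, cx, cy):
    xs, ys, zs = point_3d / np.linalg.norm(point_3d)
    x, y = xs / (zs + xi), ys / (zs + xi)

    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y

    return np.array([fx * x_d + cx, fy * y_d + cy])

print(project_omni(np.array([0.3, -0.1, 1.0]), xi=0.9,
                   k1=-0.2, k2=0.05, p1=0.001, p2=-0.0005,
                   fx=600.0, fy=600.0, cx=640.0, cy=400.0))
```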

Given the wide variety of camera models MetriCal supports, this version also introduces big improvements to calibration initialization for cameras. Cameras now have a better way of figuring out their place in the world (literally) no matter what the model, which allows for much more efficient calibrations.

Read more about the Omnidirectional model in the documentation. This model joins the seven other distortion models available in MetriCal, all of which you'll also find in our docs.

A Focus On Clarity

MetriCal v9.0.0 updates its output to cover all of the supported sensor modalities and clarifies its assumptions about inputs and data. Taken together, these little changes give you a better idea of what MetriCal needs and what it provides.

Object Relative Extrinsics (OREs) by Default

For those not in the know, “object space” is MetriCal's all-encompassing term for the fiducials, markers, and features used in the calibration process. And because MetriCal is based on a global bundle adjustment, it doesn't just optimize the extrinsics of your sensors; it does the same for your object spaces as well. As of v9.0.0, all spatial relationships between these object spaces, aka Object Relative Extrinsics (OREs), are solved for and output by default.

As with any other data source, more object spaces mean more OREs, and more OREs add time to the optimization; it's just more to solve for! If you're not interested in surveying the extrinsics between object spaces, and only care about your device's sensor (“component-side”) extrinsics, we recommend setting --disable-ore-inference. Note that this flag shouldn't dramatically change the component-side calibration either way.
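If the term is unfamiliar: an ORE is simply the rigid transform between two object spaces. Here is a small conceptual sketch of what that relationship represents, using made-up 4x4 poses rather than MetriCal's output format.

```python
import numpy as np

# Conceptual sketch: given each target's pose in a common world frame, the
# ORE from board A to board B is the composition below. Values are made up
# purely for illustration.
def inverse(T):
    R, t = T[:3, :3], T[:3, 3]
    T_inv = np.eye(4)
    T_inv[:3, :3] = R.T
    T_inv[:3, 3] = -R.T @ t
    return T_inv

T_world_from_A = np.eye(4)
T_world_from_B = np.eye(4)
T_world_from_B[:3, 3] = [2.0, 0.0, 0.0]  # board B sits 2 m to the side of board A

# Object relative extrinsic: pose of board B expressed in board A's frame.
T_A_from_B = inverse(T_world_from_A) @ T_world_from_B
print(T_A_from_B[:3, 3])  # -> [2. 0. 0.]
```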

“Detect Interior Points”

Sometimes, the data coming out of a LiDAR can be extremely noisy, and relying on this data too heavily can skew or completely ruin calibration results. If this is the case, we have a remedy: the circle detector now has a mandatory detect_interior_points variable (docs). If you're finding that the accuracy of your results is worse than expected, setting this flag to false can greatly improve them.

X-Offset and Y-Offset

For camera-LiDAR calibrations, MetriCal uses a circular target ringed with retroreflective material. To construct these targets, we start with a typical square Charuco board and die-cut it into a circle. However, before v9.0.0, MetriCal just assumed that the X offset, Y offset, and radius of target boards were all the same value. Turns out, that only applied to the first board we tested; everyone else constructed their boards their own way!

The circle detector now takes x_offset and y_offset variables to describe the center of the circle with respect to the full board frame (docs). This change should improve camera-LiDAR extrinsics results dramatically, since the alignment between the circle center and the board center is no longer ambiguous.
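As a quick illustration of the geometry these parameters describe: the circle center sits at (x_offset, y_offset) in the board frame, and the radius is specified independently rather than assumed equal to the offsets. The frame conventions and values in this sketch are assumptions for illustration, not MetriCal's schema.

```python
import numpy as np

# Hypothetical target: a square board die-cut into a 0.5 m radius circle
# whose center is NOT at the board frame origin. All names/values illustrative.
x_offset, y_offset, radius = 0.55, 0.60, 0.50  # meters

circle_center_in_board = np.array([x_offset, y_offset])

def on_circular_target(point_in_board_xy, tol=0.01):
    """True if a board-frame point falls on the die-cut circular target."""
    return np.linalg.norm(point_in_board_xy - circle_center_in_board) <= radius + tol

print(on_circular_target(np.array([0.30, 0.60])))  # True: inside the circle
print(on_circular_target(np.array([1.15, 1.15])))  # False: board corner, cut away
```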

Clearer Summary Statistics

MetriCal's summary statistics provide an overview of the final results of a calibration, including information on overall accuracy and precision. However, these statistics hadn't kept up with the rest of the program; they were still very camera-centric, a relic of times gone by. We've updated these tables to actually report data that you care about. Summary statistics now include an Optimization table (docs), a Camera table (docs), and a LiDAR table (docs). Expect an IMU table in a future release.

Rendering Overhaul

MetriCal delivers exceptionally detailed calibration metrics. But, sometimes a picture is worth a thousand words (or, in this case, a thousand numbers?). That’s why MetriCal includes great visualizations to provide added context about your calibration results.

And in v9.0.0, those great visualizations get even better! MetriCal’s rendering code has been completely overhauled. Plus, we are now officially on Rerun 0.14 🎊! Thanks to the Rerun team for their continued support and development efforts.

This overhaul comes with its own focus on clarity:

  • LiDAR-LiDAR datasets are now rendered and registered along with camera-LiDAR
  • Object relative extrinsics are now rendered when available
  • Images now use lookup tables properly for quick correction (see the sketch after this list)
  • Spaces have been reorganized for clarity and ease of use

It's now easier than ever to see what's actually happening with your calibration data. And, we have to admit, sometimes these adjustments are just pretty to look at.

Production line calibration for sophisticated devices (like an autonomous vehicle, for instance) calls for a sophisticated calibration environment. Need to use 20 fiducials in your calibration environment? Rest assured that MetriCal can handle it. And, for what it's worth, MetriCal can handle 20…200…2000…you get the idea. It's powerful!

That’s all for this release! Stay tuned for even more updates, posts, and webinars on MetriCal in the coming months.

If you’re interested in testing MetriCal for your company or project, or just have a feature request, email metrical@tangramvision.com with a little information about what you’re up to.
