Stereolabs ZED Stereo Camera combined with Jetson TX1 brings advanced 3D mapping to drones

A drone with a camera attached to it is nothing new. But when that camera happens to be the ZED stereo camera by Stereolabs powered by the new NVIDIA Jetson TX1 supercomputer, you suddenly have a first-of-its-kind drone capable of stereo 3D Simultaneous Localization and Mapping (SLAM). In the past, power and weight limitations would have made this nearly impossible.

The small form factor of the Jetson TX1 enables San Francisco-based Stereolabs to bring advanced computer vision capabilities to smaller and smaller systems.

This video demo shows a drone 3D scanning a French chateau in real time.

Being capable of 3D SLAM means that a drone or other vehicle can easily capture a detailed understanding of the world around it, opening up new opportunities in autonomous navigation for any developer.

“The single greatest barrier to autonomous navigation is the capability to see and understand,” says Cecile Schmollgruber, CEO of Stereolabs. “In simulations, computers have been able to navigate around obstacles for a long time. But they’re blind in the real world. With our sensor, a machine can see the world almost as well as a human can.”

Schmollgruber’s comparison with human vision is more than a metaphor. The ZED stereo camera is modeled on human vision: two cameras (“eyes”) send a video feed to a GPU, where Stereolabs’ software computes depth maps by measuring the disparity between the left and right images, much as the human visual cortex does. This technique lets the ZED capture depth at resolutions up to 2.2K and frame rates up to 120 fps. Developers can capture the world in 3D and build 3D models of indoor and outdoor scenes at ranges of up to 20 meters.
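To make the disparity idea concrete, here is a minimal sketch of the same principle using OpenCV’s stereo matcher rather than the ZED SDK itself; the focal length, baseline, and image paths below are placeholder assumptions, not the ZED’s actual calibration.

```python
# Illustrative sketch of disparity-to-depth stereo vision using OpenCV,
# not the ZED SDK's own pipeline. Camera parameters are placeholders.
import cv2
import numpy as np

FOCAL_LENGTH_PX = 700.0   # assumed focal length in pixels
BASELINE_M = 0.12         # assumed distance between the two "eyes" in meters

# Load a rectified left/right image pair (paths are placeholders).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching estimates, for each pixel, how far a feature
# shifts between the two views (the disparity).
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to pixels

# Depth falls out of triangulation: depth = focal_length * baseline / disparity.
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = (FOCAL_LENGTH_PX * BASELINE_M) / disparity[valid]
```

The key relationship is that depth is inversely proportional to disparity: nearby objects shift more between the two views, just as they do between a person’s two eyes.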

“In addition to depth sensing, we’re able to run tracking and mapping, in real time, aboard the drone thanks to Jetson TX1 and ZED,” says Stereolabs CTO Edwin Azzam.

This technology can also be applied to augmented reality in new and clever ways. For instance, game developers can use the ZED to create and enhance their imaginary worlds, while a real estate agent could use it to create immersive digital versions of her listings. Modern mapmakers can use 3D mapping to chart tough terrain quickly and easily.

“What we’re showing here is how we capture high-quality depth maps, track the camera, and fuse the 3D reconstructions together to create a 3D mesh of the environment,” says Azzam. “The resulting mesh can be imported into any 3D software, and has really simplified the process of creating and editing a 3D model.”

To make things easier for developers, the 3D SLAM feature will be added to the existing ZED SDK, giving every developer access to depth, tracking, and mapping data. This makes it possible for any machine to understand its position and the free space around it, and eventually to plan its own trajectory.
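As a rough illustration of how such an SDK workflow can fit together, the sketch below strings depth sensing, positional tracking, and spatial mapping into one capture loop. The class and method names follow later versions of the ZED Python API (pyzed) and are an assumption here, so the exact calls may differ from the SDK described in this announcement.

```python
# Hedged sketch of a depth + tracking + spatial-mapping loop. Names follow
# recent versions of the ZED Python API and are assumptions, not a
# reproduction of the SDK release covered in this article.
import pyzed.sl as sl

zed = sl.Camera()
if zed.open(sl.InitParameters()) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Could not open the ZED camera")

# Positional tracking estimates the camera pose; spatial mapping fuses
# depth measurements into a 3D mesh as the drone moves.
zed.enable_positional_tracking(sl.PositionalTrackingParameters())
zed.enable_spatial_mapping(sl.SpatialMappingParameters())

runtime = sl.RuntimeParameters()
pose = sl.Pose()
for _ in range(500):  # grab a few hundred frames while flying
    if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        zed.get_position(pose)  # current camera pose in the world frame

# Pull the fused mesh and write it out for use in any 3D software.
mesh = sl.Mesh()
zed.extract_whole_spatial_map(mesh)
mesh.save("scene_mesh.obj")
zed.close()
```

The structure mirrors the pipeline Azzam describes: each grabbed frame updates both the camera pose and the fused map, and the mesh is extracted once at the end rather than rebuilt per frame.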

“The ZED camera is very popular among our developer community and our partners,” said Deepu Talla, VP and GM, Tegra at NVIDIA. “That includes MIT, where we’ve heard great feedback from students using ZED cameras to build autonomous cars.”

For more information, check out the Stereolabs website.