NVIDIA Researchers Release Trailblazing Deep Learning-Based Framework for Autonomous Drone Navigation

NVIDIA’s autonomous mobile robotics team today released a framework that enables developers to create autonomous drones capable of navigating complex, unmapped places without GPS, relying on deep learning and computer vision powered by NVIDIA Jetson TX1/TX2 embedded AI supercomputers.
The drone, nicknamed Redtail, can fly along forest trails autonomously, achieving record-breaking long-range flights of more than one kilometer (about six-tenths of a mile) in the lower forest canopy.

The Redtail drone avoids obstacles and maintains a steady position in the center of the trail.

The team has released the deep learning models and code on GitHub as an open source project so that the robotics community can use them to build smarter mobile robots. The technology can turn any drone into one that’s autonomous, capable of navigating along roads, forest trails, tunnels, under bridges, and inside buildings by relying only on visual sensors. All that’s needed is a path the drone can recognize visually.
The framework consists of Robot Operating System (ROS) nodes and includes a deep neural network (DNN) called TrailNet, which estimates the drone’s orientation and lateral offset with respect to the navigation path. The provided control system uses the estimated pose to fly the drone along the path.
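To make that pose-to-command step concrete, here is a minimal controller sketch in Python. It assumes TrailNet emits two three-way softmax heads, one over view orientation (facing left, center, or right of the trail) and one over lateral offset, as described in the team’s paper; the gains, function name, and sign convention below are illustrative assumptions rather than the released code.

```python
# Minimal sketch of a trail-following controller driven by TrailNet-style
# outputs. Gains, names, and sign conventions are illustrative assumptions,
# not taken from the released code.

def steering_angle(view_probs, offset_probs, k_view=1.0, k_offset=0.5):
    """Map the two 3-way softmax heads to a single yaw command.

    view_probs   -- (p_left, p_center, p_right): heading relative to the trail
    offset_probs -- (p_left, p_center, p_right): lateral position on the trail
    """
    p_vl, _, p_vr = view_probs
    p_sl, _, p_sr = offset_probs
    # Turn so as to cancel both the heading error and the lateral drift.
    return k_view * (p_vr - p_vl) + k_offset * (p_sr - p_sl)

# Example: the network believes the drone is rotated left of the trail and
# shifted left of center, so the controller returns a corrective yaw.
print(steering_angle((0.6, 0.3, 0.1), (0.6, 0.3, 0.1)))  # -> -0.75
```

Because the command is a weighted sum of probabilities, the correction stays gentle when the network is uncertain and grows as the probability mass shifts to one side, which is one way to obtain the smooth, wobble-free flight described above.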
“The use of special training techniques has allowed us to achieve smooth and stable autonomous flights without sudden movements that would make it wobble,” said NVIDIA deep learning expert Alexey Kamenev.

The Redtail drone follows a trail in the forest near the researchers’ Redmond, Wash., office. Areas in green are where the drone decided to fly and areas in red are those it rejected.

In addition to the TrailNet DNN, the framework includes an object detection DNN to locate humans, vehicles, and other objects of interest. The team has also provided specifications to build a drone capable of visual navigation without GPS. All components run in real time on the NVIDIA Jetson TX1/TX2 on board the UAVs.
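As a rough picture of how these pieces fit together on each frame, the sketch below feeds a single camera image to both networks and turns the result into a command. Every function here is a hypothetical stand-in, including the decision to hold position when an obstacle is detected; the released framework organizes this as separate ROS nodes rather than one loop.

```python
# Rough sketch of the per-frame on-board flow: one camera image feeds both
# TrailNet and the object detector. All functions are hypothetical stand-ins;
# the released framework structures this as separate ROS nodes instead.

def trailnet(frame):
    """Stand-in for the TrailNet forward pass: two 3-way softmax heads."""
    return (0.1, 0.8, 0.1), (0.2, 0.6, 0.2)

def detect_objects(frame):
    """Stand-in for the object-detection DNN: list of detected obstacles."""
    return []  # e.g. [("person", bounding_box), ...]

def control_step(frame):
    """One tick of the navigation loop: perceive, then choose a command."""
    view_probs, offset_probs = trailnet(frame)
    if detect_objects(frame):                 # something on the trail: stop
        return ("hold", 0.0)
    # Steer toward the trail center (same idea as the controller sketch above).
    yaw = (view_probs[2] - view_probs[0]) + 0.5 * (offset_probs[2] - offset_probs[0])
    return ("fly", yaw)

print(control_step(frame=None))  # -> ('fly', 0.0)
```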

Example of a drone equipped with a navigation system and Jetson supercomputer

“We chose forests as a proving ground because it’s one of the most difficult places to navigate,” said Nikolai Smolyanskiy, the team’s technical lead. “We figured if we could use deep learning to navigate in that environment, we could navigate anywhere.”
Unlike a more urban setting, where there’s generally uniformity in the height of curbs, the shape of mailboxes, and the width of sidewalks, a forest is relatively chaotic. Trails in the woods often have no markings. Light filters unevenly through the leaves, ranging from bright sunlight to dark shadow, and trees vary in height, width, angle, and branch structure.
The training process uses a special camera rig and an automatic labeling technique to prepare datasets. The TrailNet DNN can easily be retrained for different environments and can be used for land-based or aerial robots. The team has already used the framework to train a drone to follow train tracks, and has ported the system to a wheeled robot that traverses office hallways.
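The automatic-labeling idea can be pictured with a short sketch: a rig of cameras angled left, center, and right records the trail simultaneously, and each frame inherits its orientation label from the camera that shot it, so no manual annotation is required. The directory layout and class names below are assumptions for illustration.

```python
# Sketch of rig-based automatic labeling: each frame is labeled by the camera
# that captured it. Directory layout and class names are illustrative only.
import pathlib

# A camera angled left records what the drone would see if it were rotated
# left relative to the trail, so its frames train the "view_left" class.
RIG_LABELS = {
    "cam_left": "view_left",
    "cam_center": "view_center",
    "cam_right": "view_right",
}

def build_dataset(root="trail_footage"):
    """Yield (frame_path, orientation_label) pairs from the rig's footage."""
    for cam_dir, label in RIG_LABELS.items():
        for frame in sorted(pathlib.Path(root, cam_dir).glob("*.jpg")):
            yield frame, label

for path, label in build_dataset():
    print(path, "->", label)
```

Under this scheme, retraining for a new environment amounts to carrying the rig down the new path and rerunning the same pipeline.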
The Redtail drone team: Nikolai Smolyanskiy, Alexey Kamenev, Jeffrey Smith

The team continues to enhance its autonomous mobile robot navigation framework, and new contributions to the open source project on GitHub are welcome. For more information about the team’s work, see the paper “Toward Low-Flying Autonomous MAV Trail Navigation using Deep Neural Networks for Environmental Awareness” or the team’s GTC 2017 presentation.
