The NVIDIA® DriveWorks SDK is the foundation for all autonomous vehicle (AV) software development. It provides an extensive set of fundamental AV capabilities, including processing modules, tools, and frameworks that are required for advanced AV development.


With the DriveWorks SDK, developers can begin innovating their own AV solution, instead of spending time developing basic low-level functionality. DriveWorks is modular, open, and readily customizable. Developers can use a single module within their own software stack to achieve a specific function, or use multiple modules to accomplish a higher-level objective.

DriveWorks is suited for the following objectives:

  • Integrate automotive sensors within your software.
  • Accelerate image and Lidar data processing for AV algorithms.
  • Interface with a vehicle’s ECUs and receive the vehicle’s state.
  • Accelerate neural network inference for AV perception.
  • Capture and post-process data from multiple sensors.
  • Calibrate multiple sensors with precision.
  • Track and predict a vehicle’s pose.

Details


Sensor Abstraction Layer



The Sensor Abstraction Layer (SAL) provides:

  • A unified interface to the vehicle’s sensors.
  • Synchronized timestamping of sensor data.
  • Ability to serialize the sensor data for recording.
  • Ability to replay recorded data.
  • Image signal processing acceleration via the Xavier SoC’s ISP hardware engine.

All DriveWorks modules are compatible with the SAL, reducing the need for specialized sensor data processing. The SAL supports a diverse set of automotive sensors out of the box. Additional support can be added with the robust Sensor Plugin architecture.
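To make the unified-interface idea concrete, here is a minimal sketch in Python. It mirrors the pattern of creating a sensor from a protocol name plus parameters and driving every sensor through the same lifecycle with a common timestamp field; the class and protocol names are illustrative and are not the DriveWorks C API.

```python
# Conceptual sketch of a SAL-style unified sensor interface (illustrative
# names only, not the DriveWorks API).
import time
from abc import ABC, abstractmethod

class Sensor(ABC):
    """Common lifecycle shared by every sensor type."""
    def start(self): pass
    def stop(self): pass
    @abstractmethod
    def read(self): ...

class VirtualCan(Sensor):
    """A stand-in 'virtual' CAN sensor that replays from a file path."""
    def __init__(self, params):
        self.file = params.get("file", "recording.can")
    def read(self):
        # Every sample carries a host timestamp, so data from different
        # sensors can be aligned on one clock.
        return {"timestamp_us": int(time.time() * 1e6), "data": b"\x00"}

# A registry maps protocol strings to implementations, so application code
# never depends on a concrete sensor type.
REGISTRY = {"can.virtual": VirtualCan}

def create_sensor(protocol, params):
    return REGISTRY[protocol](params)

sensor = create_sensor("can.virtual", {"file": "demo.can"})
sensor.start()
frame = sensor.read()
sensor.stop()
print(sorted(frame))   # ['data', 'timestamp_us']
```

Because all sensors share the `start`/`read`/`stop` shape and a common timestamp, downstream modules can consume any of them without specialized handling.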




These Sensor Plugins provide flexible options for interfacing new AV sensors with the SAL. Features such as sensor lifecycle management, timestamp synchronization, and sensor data replay can be implemented with minimal development effort.

  • Custom Camera Feature: DriveWorks enables external developers to add DriveWorks support for any image sensor using the GMSL interface on DRIVE AGX Developer Kit.

  • Comprehensive Sensor Plugin Framework: This enables developers to bring new Lidar, Radar, IMU, GPS, and CAN-based sensors into the DriveWorks SAL that are not natively supported by DriveWorks.
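The plugin idea can be sketched as follows: a new sensor type registers itself with the host framework, which then owns the create/start/read/stop lifecycle. Everything here is hypothetical shorthand; the real DriveWorks plugin framework uses C callback tables, not these names.

```python
# Hypothetical sketch of a sensor plugin framework: new sensor types
# register themselves, and the framework drives their lifecycle.
PLUGINS = {}

def register_plugin(name):
    """Decorator that adds a sensor class to the framework's registry."""
    def deco(cls):
        PLUGINS[name] = cls
        return cls
    return deco

@register_plugin("lidar.custom")
class CustomLidar:
    def __init__(self): self.running = False
    def start(self): self.running = True
    def read_packet(self):
        # A real plugin would parse raw device packets here.
        return {"timestamp_us": 123456, "points": [(1.0, 2.0, 0.5)]}
    def stop(self): self.running = False

def run_once(name):
    # The framework, not the plugin author, manages the lifecycle.
    dev = PLUGINS[name]()
    dev.start()
    pkt = dev.read_packet()
    dev.stop()
    return pkt

print(run_once("lidar.custom")["points"][0])   # (1.0, 2.0, 0.5)
```

The key property is that the host framework never changes when a new sensor is added; only a new plugin entry is registered.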


Image and Point Cloud Processing



DriveWorks provides a wide array of optimized low-level image and point cloud processing modules that prepare incoming sensor data for higher-level perception, mapping, and planning algorithms. Selected modules can be seamlessly run and accelerated on different DRIVE AGX hardware engines (such as PVA or GPU), giving the developer options and control over their application. Image processing capabilities include feature detection and tracking, structure from motion, and rectification. Point cloud processing capabilities include Lidar packet accumulation, registration, and planar segmentation, among others. New and improved modules are delivered with each release.
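To make one of the named point cloud capabilities concrete, here is a small, self-contained RANSAC plane fit illustrating planar segmentation (e.g., separating the ground plane from obstacles). It demonstrates the technique only and uses none of the DriveWorks APIs; the scene data is synthetic.

```python
# Illustrative planar segmentation via RANSAC (not the DriveWorks API).
import random

def fit_plane(p1, p2, p3):
    # Plane normal from the cross product of two edge vectors; d from p1.
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
    d = -sum(n[i] * p1[i] for i in range(3))
    return n, d

def ransac_plane(points, iters=200, tol=0.05, seed=0):
    rng = random.Random(seed)
    best = ([], None)
    for _ in range(iters):
        n, d = fit_plane(*rng.sample(points, 3))
        norm = sum(c * c for c in n) ** 0.5
        if norm < 1e-9:           # degenerate (collinear) sample
            continue
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) + d) / norm < tol]
        if len(inliers) > len(best[0]):
            best = (inliers, (n, d))
    return best

# Synthetic scene: a flat ground plane at z = 0 plus two obstacle points.
ground = [(x * 0.1, y * 0.1, 0.0) for x in range(10) for y in range(10)]
obstacles = [(1.0, 1.0, 1.5), (2.0, 0.5, 0.8)]
inliers, _ = ransac_plane(ground + obstacles)
print(len(inliers))   # 100 -- all ground points, no obstacles
```

In a real pipeline this kind of segmentation runs on accumulated Lidar sweeps, and the inlier/outlier split feeds obstacle detection and drivable-space estimation.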



VehicleIO


The VehicleIO module supports multiple production drive-by-wire backends to send commands to and receive status from the vehicle. In the event your drive-by-wire device is not supported out of the box, the VehicleIO Plugin framework enables easy integration with custom interfaces.
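The command/state pattern described above can be sketched as follows: one front-end sends normalized commands and reads back vehicle state, while interchangeable backends (a supported drive-by-wire unit, or a custom plugin) do the translation. The classes, field names, and units here are illustrative, not the DriveWorks VehicleIO API.

```python
# Hedged sketch of a VehicleIO-style command/state interface with a
# swappable drive-by-wire backend (illustrative names only).
class DbwBackend:
    """Minimal mock backend standing in for a drive-by-wire interface."""
    def __init__(self):
        self._state = {"speed_mps": 0.0, "steering_rad": 0.0}
    def send(self, cmd):
        # A real backend would encode the command onto the vehicle bus
        # (e.g. CAN); here we just update a fake state.
        self._state["steering_rad"] = cmd["steering_rad"]
        self._state["speed_mps"] += cmd["throttle"] - cmd["brake"]
    def state(self):
        return dict(self._state)

class VehicleIO:
    """Front-end: application code talks only to this, never the backend."""
    def __init__(self, backend):
        self.backend = backend
    def send_command(self, steering_rad=0.0, throttle=0.0, brake=0.0):
        self.backend.send({"steering_rad": steering_rad,
                           "throttle": throttle, "brake": brake})
    def read_state(self):
        return self.backend.state()

vio = VehicleIO(DbwBackend())
vio.send_command(steering_rad=0.1, throttle=1.0)
print(vio.read_state())   # {'speed_mps': 1.0, 'steering_rad': 0.1}
```

Supporting a new drive-by-wire device then means writing one new backend, not touching the application.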


DNN Framework


The DriveWorks Deep Neural Network (DNN) Framework can be used for loading and running inference on TensorRT models that have either been provided in the DRIVE Networks library or have been independently trained. The DNN Plugins module lets models containing layers not natively supported by TensorRT still benefit from TensorRT's efficiency. The DNN Framework enables inference acceleration using the integrated GPU (in the Xavier SoC), a discrete GPU (in DRIVE AGX Pegasus), or the integrated DLA (Deep Learning Accelerator in the Xavier SoC).
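The plugin idea behind DNN Plugins can be illustrated with a toy inference loop: layer types the engine knows run natively, and any unknown layer type is looked up in a user-supplied plugin table. This is a conceptual sketch, not TensorRT or DriveWorks code, and the layer names are invented.

```python
# Conceptual sketch of custom-layer plugins in an inference engine
# (toy layers on lists of floats; not TensorRT).
NATIVE = {
    "relu":  lambda xs: [max(0.0, x) for x in xs],
    "scale": lambda xs: [2.0 * x for x in xs],
}

def run_network(layers, xs, plugins=None):
    """Run layers in order; unknown layer types fall back to plugins."""
    plugins = plugins or {}
    for layer in layers:
        op = NATIVE.get(layer) or plugins.get(layer)
        if op is None:
            raise ValueError(f"layer '{layer}' needs a plugin")
        xs = op(xs)
    return xs

# 'clip3' is an unsupported layer type supplied by the user as a plugin.
plugins = {"clip3": lambda xs: [min(x, 3.0) for x in xs]}
out = run_network(["scale", "relu", "clip3"], [-1.0, 1.0, 2.5], plugins)
print(out)   # [0.0, 2.0, 3.0]
```

The whole network still runs through the one engine loop; only the unsupported layer is delegated, which is how such models keep most of the engine's acceleration.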


Recorder

In order to improve developer productivity, DriveWorks provides an extensive set of tools, reference applications, and documentation including:

  • Sensor Data Recording and Post-Recording Tools: A suite of tools to record, synchronize, and play back the data captured from multiple sensors interfaced to the NVIDIA DRIVE™ AGX platform. Recorded data can be used as a high-quality synchronized source for training and other development purposes.

Calibration

If a sensor’s position or inherent properties deviate from the nominal/assumed parameters, then any downstream processing may be faulty, whether it’s for self-driving or data collection. Calibration gives you a reliable foundation for building AV solutions, with the assurance of high-fidelity, consistent, up-to-date sensor data. DriveWorks supports calibration of the vehicle’s camera, Lidar, radar, and inertial measurement unit (IMU) sensors that are compatible with the DriveWorks Sensor Abstraction Layer.

  • Static Calibration Tools: Measure manufacturing variation for multiple AV sensors to a high degree of accuracy. Camera Calibration includes both extrinsic and intrinsic calibration, while the IMU Calibration Tool calibrates the IMU’s orientation with respect to the vehicle coordinate system.


  • Self-Calibration: Provides real-time compensation for environmental changes or mechanical stress on sensors caused by events such as changes in road gradient, tire pressure, vehicle passenger loading, and other minor changes. It corrects the nominal calibration parameters (captured using the Static Calibration Tools) based on current sensor measurements in real time. The algorithms are performant, designed for safety compliance, and optimized for the platform.
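The correction loop behind self-calibration can be sketched with a simple recursive estimator: start from the nominal value measured offline and continuously nudge it toward what live measurements imply. The filter, parameter (a camera pitch offset), and numbers below are our illustration, not the DriveWorks algorithms.

```python
# Illustrative self-calibration loop: a low-gain recursive filter tracks
# slow drift of a calibration parameter away from its nominal value.
def self_calibrate(nominal, measurements, gain=0.1):
    estimate = nominal
    for m in measurements:
        # Low gain: each measurement makes only a small correction, so
        # noise averages out while slow drift (load, tire pressure) is
        # still tracked.
        estimate += gain * (m - estimate)
    return estimate

nominal_pitch = 0.00                                   # radians, from static calibration
observed = [0.021, 0.019, 0.020, 0.022, 0.018] * 10    # live estimates, drifted ~0.02
print(round(self_calibrate(nominal_pitch, observed), 3))   # 0.02
```

The same structure scales to the real problem: nominal extrinsics from the Static Calibration Tools are the starting point, and per-frame measurements pull the estimate toward the sensor's current pose.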

Egomotion

The DriveWorks Egomotion module uses a motion model to track and predict a vehicle’s pose. DriveWorks uses two types of motion models: an odometry-only model and, if an IMU is available, a model based on IMU and odometry.
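The odometry-only model described above can be sketched with a kinematic bicycle model that integrates speed and steering angle into a 2D pose (x, y, heading). The wheelbase value and function names are illustrative; this is not the DriveWorks Egomotion API.

```python
# Minimal odometry-only motion model: integrate speed and steering angle
# through a kinematic bicycle model (illustrative, not DriveWorks code).
import math

def step_pose(x, y, yaw, v, steering, dt, wheelbase=2.8):
    # Kinematic bicycle model: yaw rate = v * tan(steering) / wheelbase.
    yaw += v * math.tan(steering) / wheelbase * dt
    x += v * math.cos(yaw) * dt
    y += v * math.sin(yaw) * dt
    return x, y, yaw

# Drive straight for 10 s at 5 m/s in 0.1 s steps.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = step_pose(*pose, v=5.0, steering=0.0, dt=0.1)
print(round(pose[0], 1), round(pose[1], 1))   # 50.0 0.0
```

An IMU+odometry model refines this by fusing measured yaw rate and acceleration instead of relying on the steering geometry alone.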


Developing with DriveWorks


How to set up

You will need an NVIDIA DRIVE AGX Developer Kit and a host PC with NVIDIA SDK Manager installed.

Steps:

  • Install DriveWorks using the SDK Manager.
  • Try out the DriveWorks "Hello World" getting started tutorial.
  • Refer to the DRIVE Software Documentation to experiment with DriveWorks samples. The DriveWorks Samples are demonstrations of key DriveWorks capabilities and are intended to be used as a starting point for developing and optimizing code.

How to develop

For the development tasks below, refer to the “samples” folder of the DriveWorks SDK Reference Guide contained in the DRIVE Software Documentation.

Development Tasks
Integrate your sensors into the DriveWorks SAL


To verify that your sensors are fully integrated with the SAL, and for more detail and examples of how to perform the SAL integration, refer to the following sensor samples:

  • USB Camera Capture Sample : Displays video captured from generic USB cameras.
  • GMSL Camera Capture Sample : Displays the video from the first camera on the selected camera group.
  • RAW GMSL Camera Capture Sample : Writes to disk the RAW data from the first camera on the selected camera-group.
  • Multiple GMSL Camera Capture Sample : Displays the input from all selected cameras on the specified camera groups.
  • GMSL SIPL Camera Custom Capture Sample : Displays and records video from a camera set up using the DriveWorks Custom Camera feature.
  • Video Replay Sample : Uses a hardware decoder to playback and display an H.264 video stream.
  • Camera Seek Sample : Demonstrates how to replay a video and use the seek to timestamp/event feature to seek to any point in the video.
  • Simple Sensor Recording Sample : Records data from CAN, GPS, LIDAR or RADAR sensors.
  • Sensor Enumeration Sample : Enumerates all available sensor drivers in the SDK.
  • CAN Message Interpreter : Uses a DBC interpreter to decode CAN messages and displays them.
  • CAN Bus Message Logger Sample : Displays messages from the CAN bus.
  • CAN Plugin Sample : Implements a driver for a CAN sensor using the plugin framework.
  • GPS Location Logger Sample : Displays messages from GPS devices.
  • GPS Plugin Sample : Implements a driver for a CAN-based GPS sensor.
  • IMU Logger Sample : Displays messages from the IMU sensor.
  • Lidar Replay Sample : Displays the 3D point cloud from a Lidar.
  • Radar Replay Sample : Connects to a Radar and displays the generated point cloud in 3D.
Record and playback using DriveWorks Tools
  • Follow the 'Recording Sensor Data Tutorial' in the 'Tutorials' section of the DriveWorks SDK Reference Guide contained in the DRIVE Software Documentation
  • For additional information, refer to "Recording Tools" under the "Tools" section of the DriveWorks SDK Reference Guide contained in the DRIVE Software Documentation
Accelerate image processing using DriveWorks Image Processing modules

For more detail and examples on how to use the DriveWorks image processing modules, please refer to the following samples:

  • Image Capture Sample : Demonstrates how to record video from a CUDA image or from rendered output.
  • Image Streamer Cross-Process Sample : Demonstrates how to use an image streamer across multiple processes.
  • Image Streamer Multi-Thread Sample : Demonstrates how to use an image streamer in a multi-thread environment.
  • Image Streamer Simple Sample : Demonstrates how to use an image streamer in a simple environment.
  • Image Transformation Sample : Demonstrates basic image processing functionalities with DriveWorks, such as image scaling.
  • Camera Color Correction Sample : Demonstrates H.264 playback with color correction. Displays video for four provided video files and corrects their color based on a selected master camera index.
  • Video Rectification Sample : Removes fisheye distortion from a video captured on a camera with a fisheye lens.
  • Stereo Disparity Sample : Demonstrates the stereo pipeline and how to produce a confidence map and final stereo output.
  • Connected Components Sample : Demonstrates connected components labeling algorithm.
  • Feature Tracker Sample : Demonstrates the feature detection and feature tracking capabilities of the Features module.
  • Template Tracker Sample : Demonstrates the template tracking capabilities of the Features module.
  • Structure from Motion Sample : Demonstrates triangulation by estimating a car pose from CAN data using the egomotion module.
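To show the idea behind the Video Rectification sample, here is a toy radial distortion model: rectification samples the source image at the position where the fisheye lens actually imaged each output point. The polynomial form and coefficients are made up for illustration; the real samples use calibrated camera models.

```python
# Toy radial distortion model for rectification (illustrative coefficients,
# normalized image coordinates; not the DriveWorks camera model).
def distort(xu, yu, k1=-0.2, k2=0.05):
    """Map an undistorted point to its distorted (lens) position."""
    r2 = xu * xu + yu * yu
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return xu * scale, yu * scale

# To rectify an image, sample the source at distort(x, y) for every output
# pixel (x, y). At the image center the mapping is the identity; toward the
# edges this barrel model pulls sample positions inward.
print(distort(0.0, 0.0))       # (0.0, 0.0)
xd, yd = distort(0.5, 0.0)
print(xd < 0.5)                # True
```

Building this mapping once as a lookup table is what makes rectification cheap enough to run per frame on a hardware engine.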
Accelerate lidar processing algorithms using DriveWorks Point Cloud Processing modules

For more detail and examples on how to use the DriveWorks point cloud processing modules, please refer to the following sample:

  • Point Cloud Processing Sample : Demonstrates how to use point cloud processing APIs for primitive processing. APIs include Point Cloud Accumulation, Plane Extraction, Range Image Creation, ICP, and Stitching.
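Of the capabilities the sample lists, accumulation and stitching are easy to illustrate: packets arrive in the sensor frame while the vehicle moves, so each packet is transformed by the ego pose at its timestamp before being merged into one common-frame cloud. The 2D poses and points below are synthetic, and none of this is the DriveWorks API.

```python
# Illustrative point cloud stitching: transform each packet by the ego
# pose at its timestamp into a common world frame (2D for brevity).
import math

def transform(point, pose):
    """Rigid 2D transform of a sensor-frame point by an (x, y, yaw) pose."""
    px, py = point
    x, y, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (x + c * px - s * py, y + s * px + c * py)

packets = [
    # (ego pose at the packet's timestamp, points in the sensor frame)
    ((0.0, 0.0, 0.0), [(1.0, 0.0), (2.0, 0.0)]),
    ((1.0, 0.0, 0.0), [(1.0, 0.0), (2.0, 0.0)]),  # vehicle moved 1 m forward
]

cloud = [transform(p, pose) for pose, pts in packets for p in pts]
print(cloud)   # [(1.0, 0.0), (2.0, 0.0), (2.0, 0.0), (3.0, 0.0)]
```

Registration (e.g., ICP) plays the complementary role: when poses are imperfect, it refines the transform between overlapping clouds before stitching.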
Use the DriveWorks DNN Framework to load and infer neural networks

For more detail and examples on how to use the DriveWorks DNN Framework modules, please refer to the following samples:

  • Basic DNN Plugin Sample : Performs DNN inference to classify a handwritten digit using an MNIST model with a custom layer implemented as a plugin.
  • Basic Object Detector and Tracker Sample : For each frame in a video stream, detects the object locations and tracks the objects between video frames.
Interface your software solution with the vehicle using the DriveWorks VehicleIO module

For more detail and examples on how to use the DriveWorks VehicleIO modules, please refer to the following samples:

  • VehicleIO Sample : Controls the vehicle (steering/throttle/brakes) and reads the vehicle state.
  • Dataspeed Bridge Sample : Shows how to convert CAN signals to and from a Dataspeed drive-by-wire system into a generic NVIDIA-defined CAN bus format.
Use the DriveWorks Static Calibration Tools for one-time calibration of sensors mounted on a vehicle (i.e., to measure and record nominal calibration parameters)

Follow the “Camera Calibration Tutorial” in the “Tutorials” section of the DriveWorks SDK Reference Guide contained in the DRIVE Software Documentation.

For IMU Calibration and additional information, refer to "Calibration Tools" under the "Tools" section of the DriveWorks SDK Reference Guide contained in the DRIVE Software Documentation.

Develop dynamic sensor calibration capabilities using the DriveWorks Self-Calibration modules

For more detail and examples on how to use the DriveWorks self-calibration modules, please refer to the following samples:

  • Camera Calibration Sample : Demonstrates the ability to estimate camera extrinsics using the DriveWorks Calibration Engine.
  • IMU Calibration Sample : Demonstrates the ability to estimate IMU extrinsics using the DriveWorks Calibration Engine.
  • LIDAR Calibration Sample : Demonstrates the ability to estimate LIDAR extrinsics using the DriveWorks Calibration Engine.
  • Radar Calibration Sample : Demonstrates how to use the radar self-calibration module.
Use the DriveWorks Egomotion module to model and predict the vehicle’s motion

For more details and examples on how to use the DriveWorks Egomotion module, please refer to the following sample:

  • Egomotion Sample : Uses CAN measurements of steering angle and velocity to compute the position of the car in the world coordinate system.


Additional Development Resources:

Documentation