The DRIVE Core layer of the NVIDIA DriveWorks SDK is the foundation for all autonomous vehicle (AV) software development. It provides an extensive set of fundamental AV capabilities, including the processing modules, tools, and frameworks required to develop advanced AV functionality.


With DRIVE Core, developers can begin innovating on their own AV solutions instead of spending time developing basic low-level functionality. DRIVE Core is modular, open, and readily customizable. Developers can use a single DRIVE Core module within their own software stack to achieve a specific function, or combine multiple modules to accomplish a higher-level objective.

DRIVE Core is suited for the following objectives:

  • Bring up automotive sensors.
  • Use tools to capture data from sensors.
  • Accelerate image and lidar data processing for AV algorithms.
  • Accelerate neural network inference for AV perception.
  • Interface with a vehicle’s ECUs.

Details


Sensor Abstraction Layer



The Sensor Abstraction Layer (SAL) provides:

  • a unified interface to the vehicle’s sensors
  • sensor lifecycle management
  • synchronized timestamping of sensor data
  • ability to serialize the sensor data for recording
  • ability to replay recorded data
  • image processing acceleration via the Xavier SoC ISP hardware engine

All DriveWorks modules are compatible with the SAL, reducing the need for specialized sensor data processing. The SAL supports a diverse set of automotive sensors out of the box. Additional support can be added with the robust sensor plugin architecture.
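As an illustration of the unified-interface idea, here is a simplified sketch in C. This is not the actual DriveWorks API; every name here is hypothetical. The point is only that each sensor type, once behind such an interface, shares the same start/read/stop contract and delivers timestamps on a common clock.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Hypothetical unified sensor interface in the spirit of the SAL:
 * every sensor type exposes the same lifecycle and read entry points,
 * so higher-level code never special-cases a vendor. */
typedef struct {
    int  (*start)(void *ctx);
    int  (*read)(void *ctx, uint8_t *buf, size_t cap, int64_t *timestamp_us);
    int  (*stop)(void *ctx);
    void *ctx;
} sensor_if;

/* A trivial virtual GPS sensor used to exercise the interface. */
typedef struct { int running; int64_t clock_us; } fake_gps;

static int fake_gps_start(void *ctx) { ((fake_gps *)ctx)->running = 1; return 0; }
static int fake_gps_stop(void *ctx)  { ((fake_gps *)ctx)->running = 0; return 0; }
static int fake_gps_read(void *ctx, uint8_t *buf, size_t cap, int64_t *ts) {
    fake_gps *g = (fake_gps *)ctx;
    if (!g->running || cap < 4) return -1;
    buf[0] = 'G'; buf[1] = 'P'; buf[2] = 'G'; buf[3] = 'A'; /* NMEA-ish stub */
    *ts = g->clock_us += 100000; /* synchronized timestamp, here 10 Hz */
    return 4;
}

sensor_if make_fake_gps(fake_gps *g) {
    sensor_if s = { fake_gps_start, fake_gps_read, fake_gps_stop, g };
    return s;
}
```

In the real SAL, sensors are instead created from protocol strings and the framework supplies the synchronized timestamps and serialization; the sketch only conveys why a single interface removes per-sensor special cases from downstream modules.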

List of supported sensors

Sensor Plugins




Using Custom Sensors with NVIDIA DriveWorks

Watch the webinar recording

DriveWorks provides flexible options for interfacing new AV sensors with the SAL so that features such as sensor lifecycle management, timestamp synchronization, and sensor data replay can be realized with minimal development effort.

  • Custom Camera Feature: DriveWorks enables external developers to add support for any image sensor that uses the GMSL interface on the DRIVE AGX Developer Kit.

  • Comprehensive Sensor Plugin Framework: Enables developers to bring sensors into the DriveWorks SAL that are not natively supported by DriveWorks. In addition to decoding plugins, the framework supports implementing the transport and protocol layers necessary to communicate with the sensor. IMU and lidar plugin support is available in DRIVE Software 9.0.

  • Lidar and Radar Decoder Plugins: Developers can add support for their own lidar and radar sensors that are compliant with the existing DriveWorks-supported sensor interfaces. DriveWorks manages the sensor interfaces; developers only need to write a decoding plugin that parses the raw sensor data into the expected SAL data structure.
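To make the decoding-plugin role concrete, here is a minimal sketch assuming an invented 7-byte-per-return wire format. The struct, function, and format are illustrative only; the real SAL plugin entry points and data structures are documented in the Sensor Plugins section.

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* Invented wire format for illustration: each return is 7 bytes,
 * little-endian int16 x/y/z in centimeters plus a uint8 intensity.
 * Real sensors and the real SAL structures differ. */
typedef struct { float x, y, z, intensity; } lidar_point;

static int16_t read_le16(const uint8_t *p) {
    return (int16_t)(p[0] | ((uint16_t)p[1] << 8));
}

/* Decode raw packet bytes into points; returns the number decoded.
 * This parsing step is what a decoder plugin supplies, while the SAL
 * takes care of transport, timestamping, and lifecycle. */
size_t decode_lidar_packet(const uint8_t *raw, size_t len,
                           lidar_point *out, size_t max_points) {
    size_t n = 0;
    while (len >= 7 && n < max_points) {
        out[n].x = read_le16(raw)     / 100.0f;
        out[n].y = read_le16(raw + 2) / 100.0f;
        out[n].z = read_le16(raw + 4) / 100.0f;
        out[n].intensity = raw[6] / 255.0f;
        raw += 7; len -= 7; n++;
    }
    return n;
}
```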

Image and Point Cloud Processing



DriveWorks provides a wide array of optimized low-level image and point cloud processing modules that prepare incoming sensor data for higher-level perception, mapping, and planning algorithms. Selected modules can be seamlessly run and accelerated on different DRIVE AGX hardware engines (such as the PVA or GPU), giving developers options and control over their applications. Image processing capabilities include feature detection and tracking, structure from motion, and rectification. Point cloud processing capabilities include lidar packet accumulation, registration, and planar segmentation, among others. New and improved modules are delivered with each release.


VehicleIO module


The VehicleIO module supports multiple production drive-by-wire backends to send commands to, and receive status from, the vehicle. If your drive-by-wire device is not supported out of the box, the VehicleIO Plugin framework enables easy integration with custom interfaces.
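To illustrate the kind of translation a drive-by-wire backend performs, here is a sketch with an invented encoding. Real backends use their vendor's message formats, which DriveWorks handles for you; nothing below is an actual protocol.

```c
#include <assert.h>
#include <stdint.h>

/* Invented encoding for illustration only: steering angle in radians
 * packed as signed 16-bit fixed point (0.001 rad resolution) into the
 * first two bytes of an 8-byte CAN payload, plus the inverse used on
 * the status path. Real backends use their vendor's formats. */
void encode_steering(float angle_rad, uint8_t payload[8]) {
    int16_t fixed = (int16_t)(angle_rad * 1000.0f);
    payload[0] = (uint8_t)(fixed & 0xFF);
    payload[1] = (uint8_t)((fixed >> 8) & 0xFF);
}

float decode_steering(const uint8_t payload[8]) {
    int16_t fixed = (int16_t)(payload[0] | ((uint16_t)payload[1] << 8));
    return (float)fixed / 1000.0f;
}
```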


DNN Framework


The DriveWorks Deep Neural Network (DNN) Framework can be used to load and run inference on TensorRT models, whether provided in the DRIVE Networks library or independently trained. The DNN Plugins module lets models that contain layers not natively supported by TensorRT still benefit from TensorRT's efficiency. The DNN Framework accelerates inference on the integrated GPU (in the Xavier SoC), the discrete GPU (in DRIVE AGX Pegasus), or the integrated DLA (Deep Learning Accelerator in the Xavier SoC).
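Whatever engine runs the network, application code typically post-processes the raw output buffer afterwards; for a classifier, that is simply an argmax over the per-class scores. This is generic post-processing, not a DriveWorks API:

```c
#include <assert.h>
#include <stddef.h>

/* Generic classification post-processing: the network writes one score
 * per class into an output buffer; the predicted class is the argmax.
 * This step is independent of how inference was accelerated. */
size_t argmax(const float *scores, size_t n) {
    size_t best = 0;
    for (size_t i = 1; i < n; i++)
        if (scores[i] > scores[best]) best = i;
    return best;
}
```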


DriveWorks Tools and Samples

To improve developer productivity, DriveWorks provides an extensive set of tools, reference applications, and documentation, including:

  • Sensor Data Recording and Post-Recording Tools: A suite of tools to record, synchronize, and play back data captured from multiple sensors interfaced to the NVIDIA DRIVE™ AGX platform. Recorded data can be used as a high-quality synchronized source for training and other development purposes.

  • DriveWorks Samples: The DriveWorks Samples are demonstrations of core DriveWorks capabilities and are intended to be used as a starting point for developing and optimizing code.

Developing with DRIVE Core


How to set up

You will need:

Steps:

  • Install DRIVE Core via the SDK Manager.
  • Try out the DriveWorks "Hello World" getting started tutorial.
  • Experiment with DriveWorks samples.
  • To cross-compile and experiment with samples on a DRIVE AGX System instead of the host PC, try the Cross compilation tutorial.

How to develop

The tasks below pair each development objective with resources for getting started.
Integrate your sensors into the DRIVE Core SAL

Several samples are included in the “samples” folder; see the DriveWorks SDK Reference Guide in the DRIVE Software Documentation.

To ensure that your sensors are fully integrated with the SAL, and for more detail and examples of how to perform the SAL integration, refer to the following DRIVE Core sensor samples:

  • USB Camera Capture Sample: Displays video captured from generic USB cameras.
  • GMSL Camera Capture Sample: Displays the video from the first camera on the selected camera group.
  • RAW GMSL Camera Capture Sample: Writes to disk the RAW data from the first camera on the selected camera group.
  • Multiple GMSL Camera Capture Sample: Displays the input from all selected cameras on the specified camera groups.
  • Video Replay Sample: Uses a hardware decoder to play back and display an H.264 video stream.
  • Camera Seek Sample: Demonstrates how to replay a video and use the seek-to-timestamp/event feature to jump to any point in the video.
  • Simple Sensor Recording Sample: Records data from CAN, GPS, lidar, or radar sensors.
  • Sensor Enumeration Sample: Enumerates all available sensor drivers in the SDK.
  • CAN Message Interpreter Sample: Uses a DBC interpreter to decode and display CAN messages.
  • CAN Bus Message Logger Sample: Displays messages from the CAN bus.
  • GPS Location Logger Sample: Displays messages from GPS devices.
  • IMU Logger Sample: Displays messages from the IMU sensor.
  • Lidar Replay Sample: Displays the 3D point cloud from a lidar.
  • Radar Replay Sample: Connects to a radar and displays the generated point cloud in 3D.
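Several of the samples above revolve around decoding raw sensor streams. As a flavor of what a DBC-driven interpreter does internally, here is a simplified, self-contained sketch of extracting one unsigned little-endian signal from a CAN payload; a full interpreter also handles signed, big-endian, and multiplexed signals.

```c
#include <assert.h>
#include <stdint.h>

/* Simplified DBC-style extraction: pull an unsigned little-endian bit
 * field out of an 8-byte CAN payload, then apply the signal's scale
 * and offset as specified in the DBC file. */
double extract_signal(const uint8_t payload[8],
                      unsigned start_bit, unsigned length,
                      double scale, double offset) {
    uint64_t raw = 0;
    for (int i = 7; i >= 0; i--)
        raw = (raw << 8) | payload[i]; /* assemble little-endian word */
    raw >>= start_bit;
    if (length < 64)
        raw &= (1ULL << length) - 1;
    return (double)raw * scale + offset;
}
```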
Record and play back using DRIVE Core Tools
  • DRIVE Hyperion is recommended but not required.
  • Follow the 'Recording Sensor Data Tutorial' in the 'Tutorials' section of the DriveWorks SDK Reference Guide contained in the DRIVE Software Documentation.
  • For additional information, refer to "Recording Tools" under the "Tools" section of the DriveWorks SDK Reference Guide.
Accelerate image processing using DRIVE Core image processing modules


For more detail and examples on how to use the DRIVE Core image processing modules, please refer to the following samples:

  • Image Capture Sample: Demonstrates how to record video from a CUDA image or from rendered output.
  • Image Streamer Cross-Process Sample: Demonstrates how to use an image streamer across multiple processes.
  • Image Streamer Multi-Thread Sample: Demonstrates how to use an image streamer in a multi-threaded environment.
  • Image Streamer Simple Sample: Demonstrates how to use an image streamer in a simple environment.
  • Camera Color Correction Sample: Demonstrates H.264 playback with color correction. Displays video from four provided video files and corrects their color based on a selected master camera index.
  • Video Rectification Sample: Removes fisheye distortion from a video captured on a camera with a fisheye lens.
  • Stereo Disparity Sample: Demonstrates the stereo pipeline and how to produce a confidence map and the final stereo output.
  • Connected Components Sample: Demonstrates a connected-components labeling algorithm.
  • Feature Tracker Sample: Demonstrates the feature detection and feature tracking capabilities of the Features module.
  • Template Tracker Sample: Demonstrates the template tracking capabilities of the Features module.
  • Structure from Motion Sample: Demonstrates triangulation by estimating the car's pose from CAN data using the egomotion module.
Accelerate lidar processing algorithms using DRIVE Core point cloud processing modules


For more detail and examples on how to use the DRIVE Core point cloud processing modules, please refer to the following sample:

  • Point Cloud Processing Sample: Demonstrates how to use point cloud processing APIs for primitive processing. APIs include Point Cloud Accumulation, Plane Extraction, Range Image Creation, ICP, and Stitching.
Use the DRIVE Core DNN Framework to load and infer neural networks


For more detail and examples on how to use the DRIVE Core DNN Framework, please refer to the following samples:

  • Basic DNN Plugin Sample: Performs DNN inference to classify a handwritten digit, using an MNIST model with a custom layer implemented as a plugin.
  • Basic Object Detector and Tracker Sample: For each frame in a video stream, detects object locations and tracks the objects between video frames.
Interface your software solution with the vehicle using the DRIVE Core VehicleIO module


For more detail and examples on how to use the DRIVE Core VehicleIO module, please refer to the following samples:

  • VehicleIO Sample: Controls the vehicle (steering/throttle/brakes) and reads the vehicle state.
  • Dataspeed Bridge Sample: Shows how to convert CAN signals traveling to and from a Dataspeed drive-by-wire system into a generic NVIDIA-defined CAN bus format.


Additional Development Resources:

  • Documentation
  • DRIVE Developer Kit accessories and sensors