NVIDIA DriveWorks SDK

The NVIDIA® DriveWorks SDK is the foundation for autonomous vehicle (AV) software development. It provides an automotive-grade middleware with accelerated algorithms and versatile tools.



DriveWorks at a Glance

With the DriveWorks SDK, developers can focus on innovating their own AV solution instead of spending time developing basic low-level functionality. DriveWorks is modular, open, and readily customizable. Developers can use a single module within their own software stack to achieve a specific function, or use multiple modules to accomplish a higher-level objective.



Key Features


Compute Graph Framework and STM Scheduler

DriveWorks 5 for the DRIVE Orin™ SoC introduces major new features. With the Compute Graph Framework (CGF) and System Task Manager (STM) Scheduler, DriveWorks becomes a versatile automotive middleware.


CGF enables developers to express their application as nodes of a directed acyclic graph (DAG). Nodes can be reused, and the graph structure provides an intuitive way to organize complex applications. Compute resource consumption of nodes is defined in a fine-grained fashion:

  • Execution paths (“passes”) are defined individually with the compute engine (CPU core, GPU, DLA, etc.) and Worst Case Execution Time (WCET) to enable efficient scheduling.

  • For scheduling, the graph is handed over to the STM scheduler. STM is a static, non-preemptive scheduler: its compiler runs offline to derive an optimal schedule that execution follows at runtime. Because execution is never interrupted, behavior is deterministic and highly reliable, enabling safety-critical applications.
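
To make the node, pass, and WCET terminology concrete, here is a purely illustrative C++ sketch of the kind of information such a graph description carries. The types below are hypothetical stand-ins, not the CGF API (which has its own node headers and description files); they only show what CGF and the STM compiler reason about.

    // Conceptual sketch only: NOT the DriveWorks CGF API.
    // Models an application as a DAG of nodes whose passes each carry a
    // compute engine assignment and a worst-case execution time (WCET).
    #include <cstdint>
    #include <string>
    #include <vector>

    enum class Engine { CPU, GPU, DLA };

    struct Pass {                       // one execution path inside a node
        std::string name;
        Engine      engine;             // where the pass runs
        uint32_t    wcetUs;             // WCET budget in microseconds
    };

    struct Node {
        std::string       name;
        std::vector<Pass> passes;
    };

    struct Edge { std::string from, to; };   // data dependency between nodes

    int main() {
        std::vector<Node> nodes = {
            {"CameraNode",    {{"capture",    Engine::CPU, 500}}},
            {"DetectionNode", {{"preprocess", Engine::GPU, 800},
                               {"inference",  Engine::DLA, 4000}}},
            {"PlanningNode",  {{"plan",       Engine::CPU, 2000}}},
        };
        std::vector<Edge> dag = {{"CameraNode", "DetectionNode"},
                                 {"DetectionNode", "PlanningNode"}};
        // In DriveWorks, an equivalent description is handed to the STM compiler,
        // which derives the static, non-preemptive runtime schedule offline.
        return static_cast<int>(nodes.size() + dag.size());
    }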


Sensor Abstraction Layer


The Sensor Abstraction Layer (SAL) provides:

  • A unified interface to the vehicle’s sensors.
  • Synchronized timestamping of sensor data.
  • Ability to serialize the sensor data for recording.
  • Ability to replay recorded data.
  • Image signal processing acceleration via the Image Signal Processing (ISP) hardware engine.

All DriveWorks modules are compatible with the SAL, reducing the need for specialized sensor data processing. The SAL supports a diverse set of Hyperion-compatible sensors out of the box, and support for additional sensors can be added using the robust sensor plugin architecture.
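
The bring-up pattern is the same for any sensor: create a DriveWorks context, initialize the SAL, and create the sensor from a protocol string and rig-specific parameters. The sketch below follows the structure of the DriveWorks sensor samples; exact signatures and release/cleanup conventions vary between SDK releases, and the protocol and parameter strings are placeholders.

    // Minimal sketch of sensor bring-up through the SAL (based on the sample pattern;
    // verify signatures against your DriveWorks release).
    #include <dw/core/Context.h>
    #include <dw/sensors/Sensors.h>

    int main() {
        dwContextParameters sdkParams{};
        dwContextHandle_t ctx = DW_NULL_HANDLE;
        dwInitialize(&ctx, DW_VERSION, &sdkParams);    // DriveWorks context

        dwSALHandle_t sal = DW_NULL_HANDLE;
        dwSAL_initialize(&sal, ctx);                   // Sensor Abstraction Layer

        dwSensorParams params{};
        params.protocol   = "camera.virtual";          // e.g. replay of a recorded stream
        params.parameters = "video=recording.h264";    // placeholder rig-specific parameters

        dwSensorHandle_t camera = DW_NULL_HANDLE;
        dwSAL_createSensor(&camera, params, sal);      // same call pattern for lidar, radar, IMU, ...
        dwSensor_start(camera);

        // ... read and process timestamped sensor events here ...

        dwSensor_stop(camera);
        dwSAL_releaseSensor(camera);                   // release convention differs by release
        dwSAL_release(sal);
        dwRelease(ctx);
        return 0;
    }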


Image and Point Cloud Processing

DriveWorks provides a wide array of optimized low-level image and point cloud processing modules that prepare incoming sensor data for use in higher-level perception, mapping, and planning algorithms. Selected modules can be seamlessly run and accelerated on different DRIVE AGX hardware engines (such as the Programmable Vision Accelerator or GPU), giving developers options and control over their application. Image processing capabilities include feature detection and tracking, structure from motion, and rectification. Point cloud processing capabilities include lidar packet accumulation, registration, planar segmentation, and more.
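
As a purely conceptual illustration (not the DriveWorks API), the sketch below shows the typical ordering of the point cloud stages named above, with each stage parameterized by the hardware engine it would target; the corresponding DriveWorks modules provide these stages as accelerated primitives.

    // Conceptual pipeline sketch: accumulate lidar packets into a spin, align it to a
    // previous spin (registration), then separate ground from obstacles (segmentation).
    #include <utility>
    #include <vector>

    struct Point { float x, y, z; };
    using PointCloud = std::vector<Point>;

    enum class Engine { GPU, PVA, CPU };   // DRIVE AGX engines a stage could target

    PointCloud accumulate(const std::vector<PointCloud>& packets, Engine) {
        PointCloud spin;
        for (const auto& p : packets) spin.insert(spin.end(), p.begin(), p.end());
        return spin;                       // full sweep assembled from individual packets
    }

    PointCloud alignTo(const PointCloud& current, const PointCloud& /*previous*/, Engine) {
        return current;                    // registration would estimate and apply a pose here
    }

    std::pair<PointCloud, PointCloud> splitGround(const PointCloud& cloud, Engine) {
        PointCloud ground, obstacles;
        for (const auto& pt : cloud) (pt.z < 0.0f ? ground : obstacles).push_back(pt);
        return {ground, obstacles};        // crude stand-in for planar segmentation
    }

    int main() {
        std::vector<PointCloud> packets = {{{1.f, 0.f, -0.2f}}, {{0.f, 1.f, 0.5f}}};
        PointCloud spin    = accumulate(packets, Engine::GPU);
        PointCloud aligned = alignTo(spin, spin, Engine::GPU);
        auto segmented     = splitGround(aligned, Engine::PVA);
        return static_cast<int>(segmented.first.size() + segmented.second.size());
    }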


VehicleIO


The VehicleIO module supports multiple production drive-by-wire backends to send commands to and receive status from the vehicle. If your drive-by-wire device is not supported out of the box, the VehicleIO plugin framework enables easy integration with custom interfaces.
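
The command path generally looks like the sketch below. The header path, command struct fields, and the state query shown here are assumptions based on the pattern of the VehicleIO samples, not verified signatures, so check them against the headers of your DriveWorks release.

    // Sketch of the VehicleIO command/status pattern (names are assumptions to verify).
    #include <dw/control/vehicleio/VehicleIO.h>   // header location assumed

    void driveByWireStep(dwVehicleIOHandle_t vehicleIO)
    {
        dwVehicleIOCommand cmd{};
        // Populate the desired actuation (steering, throttle, brake) and the matching
        // enable/valid flags here; field names depend on the SDK release.

        dwVehicleIO_sendCommand(&cmd, vehicleIO);  // forwarded to the drive-by-wire backend

        dwVehicleIOState state{};                         // type and getter name assumed
        dwVehicleIO_getVehicleState(&state, vehicleIO);   // status reported from the vehicle
    }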


DNN Framework

The DriveWorks Deep Neural Network (DNN) Framework is used to load and run inference on TensorRT models that have been trained independently. If a DNN contains layers that TensorRT does not support, plugins can be used to introduce the custom layers while still leveraging the acceleration provided by the inference engine. The DNN Framework also enables inference acceleration on the DRIVE AGX’s integrated GPU.
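
Loading a TensorRT-optimized model typically follows the pattern below, mirroring the DNN samples. The call names are those used in the samples, but exact signatures (in particular the plugin configuration argument and the release convention) vary between releases, so treat this as a sketch rather than reference code.

    // Minimal sketch of loading a TensorRT model with the DNN Framework.
    #include <dw/dnn/DNN.h>

    void loadModel(dwContextHandle_t ctx)
    {
        dwDNNHandle_t dnn = DW_NULL_HANDLE;

        // "model.bin" is a placeholder for a model optimized offline with the
        // TensorRT optimizer tool. DW_PROCESSOR_TYPE_GPU selects the integrated GPU
        // for inference; the third argument can carry a plugin configuration that
        // supplies custom layers TensorRT does not support natively.
        dwDNN_initializeTensorRTFromFile(&dnn, "model.bin", nullptr,
                                         DW_PROCESSOR_TYPE_GPU, ctx);

        // ... bind input/output tensors and run inference per frame ...

        dwDNN_release(dnn);   // release signature differs across releases
    }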


Recorder


DriveWorks includes a suite of sensor data recording and post-recording tools to record, synchronize, and play back data captured from multiple sensors interfaced to the NVIDIA DRIVE™ AGX platform. Recorded data can be used as a high-quality, synchronized source for training and other development purposes.


Calibration

Calibration provides a reliable foundation for building AV solutions with high-fidelity, consistent, up-to-date sensor data. It supports alignment of the vehicle’s camera, lidar, radar, and Inertial Measurement Unit (IMU) sensors that are compatible with the DriveWorks Sensor Abstraction Layer.

  • Static Calibration Tools: Measure manufacturing variation for multiple AV sensors to a high degree of accuracy. Camera calibration covers both extrinsic and intrinsic parameters, while the IMU Calibration Tool calibrates vehicle orientation with respect to the coordinate system.
  • Self-Calibration: Provides real-time compensation for environmental changes or mechanical stress on sensors caused by events such as changes in road gradient, tire pressure, vehicle passenger loading, and other minor changes. It corrects the nominal calibration parameters (captured using the Static Calibration Tools) based on current sensor measurements in real time; the algorithms are performant, safety-compliant, and optimized for the platform.

Egomotion


The DriveWorks Egomotion module uses a motion model to track and predict a vehicle’s pose. DriveWorks uses two types of motion models: an odometry-only model and, if an IMU is available, a model based on IMU and odometry.
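
Usage follows a feed-and-query pattern: select a motion model, push odometry and (optionally) IMU measurements as they arrive, and query the latest pose estimate. The sketch below mirrors the Egomotion sample; enum and function names are taken from sample code and may differ slightly between releases.

    // Minimal sketch of the Egomotion feed-and-query pattern.
    #include <dw/egomotion/Egomotion.h>

    void trackPose(dwContextHandle_t ctx)
    {
        dwEgomotionParameters params{};
        params.motionModel = DW_EGOMOTION_IMU_ODOMETRY;   // odometry-only model also available

        dwEgomotionHandle_t egomotion = DW_NULL_HANDLE;
        dwEgomotion_initialize(&egomotion, &params, ctx);

        // Per sensor event: feed odometry (speed, steering) and IMU frames, then update, e.g.
        //   dwEgomotion_addOdometry(...);
        //   dwEgomotion_addIMUMeasurement(...);
        //   dwEgomotion_update(timestamp, egomotion);

        // Query the most recent pose/velocity estimate, e.g.
        //   dwEgomotionResult state{};
        //   dwEgomotion_getEstimation(&state, egomotion);

        dwEgomotion_release(egomotion);   // release convention differs by release
    }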


Resources

Peek under the hood to experience NVIDIA’s latest autonomous driving innovations via DRIVE Labs and DRIVE Dispatch.


View DRIVE Videos