The Robot Operating System (ROS) is an extremely popular middleware used by roboticists and autonomous vehicle researchers around the globe. It offers a great deal of flexibility and a plethora of software libraries and tools to be leveraged by its users. As a result, there is a great deal of interest in understanding how ROS can be used on the NVIDIA DRIVE AGX platform.
The NVIDIA DRIVE AGX platform includes the DriveWorks SDK middleware that is the foundation for autonomous vehicle (AV) software development. It provides the necessary frameworks, tools, and processing modules that most efficiently use the NVIDIA DRIVE AGX compute engines. DriveWorks supports rapid development while complying with automotive industry standards, making it easier to transition research and development into production code.
In this post, we show how ROS and DriveWorks can be used to build AV applications, with the help of a ROS package that we have put together. This package demonstrates how to cross-compile ROS Melodic nodes with the DriveWorks SDK to run on the NVIDIA DRIVE AGX platform. The example in the package cross-compiles a ROS node that uses DriveWorks to publish camera sensor data on a ROS topic. The source code shows you how to initialize a DriveWorks SDK context, create a hardware abstraction layer (HAL) module of the SDK, and set up a ROS publisher to pass sensor data. Any subscriber on a separate ROS node that subscribes to that topic can receive the imaging data. The source is posted on the NVIDIA/dw-ros GitHub repo.
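As a rough outline of what that source demonstrates, the skeleton of such a producer node might look like the following. This is a minimal sketch, not the actual dw-ros implementation: the topic name, queue size, and the elided frame-reading loop are illustrative assumptions, and error handling is omitted.

```cpp
// Sketch: DriveWorks context + HAL (SAL) + ROS publisher in one node.
// Assumes the DriveWorks SDK and ROS Melodic are installed; not runnable
// without them. Topic name and queue size are arbitrary illustrations.
#include <dw/core/Context.h>
#include <dw/sensors/Sensors.h>
#include <ros/ros.h>
#include <sensor_msgs/Image.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "nv_sensor_producer");
    ros::NodeHandle nh;
    ros::Publisher pub = nh.advertise<sensor_msgs::Image>("camera/image", 10);

    // Initialize a DriveWorks SDK context.
    dwContextHandle_t context = DW_NULL_HANDLE;
    dwContextParameters params{};
    dwInitialize(&context, DW_VERSION, &params);

    // Create the hardware abstraction layer (SAL) module of the SDK.
    dwSALHandle_t hal = DW_NULL_HANDLE;
    dwSAL_initialize(&hal, context);

    // ... create and start the camera via dwSAL_createSensor(), then in a
    // loop read frames, convert each one to a sensor_msgs::Image, and hand
    // it to pub.publish() ...

    // Release the SAL and the context when done (dwSAL_release / dwRelease;
    // exact signatures vary slightly between DriveWorks SDK versions).
    return 0;
}
```

The key point is the ordering: the DriveWorks context must exist before the SAL can be initialized, and the SAL must exist before any sensor can be created from it.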
The project includes a README file that describes the cross-compilation process step by step. The nv_sensor_producer executable is created as part of this process. When you test the cross-compiled program on the NVIDIA DRIVE AGX platform with ROS, nv_sensor_producer is launched first, followed by the camera_start ROS service. The nv_sensor_producer executable is responsible for initializing the sensor through DriveWorks APIs and for setting up the producer. When the camera_start ROS service is up and running, it provides nv_sensor_producer with the necessary information for sensor initialization. After initialization completes, the producer starts producing and publishing the data.
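On the target, that launch order might look roughly like the following. The exact commands, service name, and argument syntax are defined in the dw-ros README; the lines below are an illustrative sketch, not the literal invocations.

```shell
# 1. Launch the producer first; it waits for sensor-initialization
#    parameters before it starts publishing.
./nv_sensor_producer &

# 2. Then bring up the camera_start ROS service, which passes the sensor
#    parameters to nv_sensor_producer. (Service name and argument string
#    here are illustrative; see the repository README for the real form.)
rosservice call /camera_start "camera-name=SF3324,interface=csi-a,link=0,output-format=processed"
```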
The camera_start ROS service supports the following options:
- Live sensor: The arguments camera-name=SF3324,interface=csi-a,link=0,output-format=processed are provided to define a live sensor. The specification of camera.gmsl defines the protocol that is necessary for initializing a physical camera over the Gigabit Multimedia Serial Link (GMSL) interface. This is followed by a string of parameters (key-value pairs):
  - camera-name=SF3324 describes the live camera to be instantiated as the default 2.3MP camera module used on the NVIDIA DRIVE AGX platform.
  - interface=csi-a,link=0 describes the physical interface that the camera is connected to.
  - output-format=processed specifies the desired output format as RGBA8.
- Video playback: The arguments video=/usr/local/driveworks/data/samples/recordings/highway0/video_first.h264 are example arguments for video playback. The specification of camera.virtual defines the protocol for sensor initialization (this is a virtual camera), and the virtual camera produces data from the recorded video at the specified location.
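Put together, the two modes correspond to parameter strings along these lines. The protocol/parameter split follows the DriveWorks sensor convention; exactly how camera_start accepts these strings is specified in the dw-ros README.

```shell
# Live GMSL camera: the camera.gmsl protocol plus key-value parameters.
LIVE_PARAMS="camera-name=SF3324,interface=csi-a,link=0,output-format=processed"

# Virtual camera: the camera.virtual protocol replaying a recorded file
# shipped with the DriveWorks SDK samples.
VIRTUAL_PARAMS="video=/usr/local/driveworks/data/samples/recordings/highway0/video_first.h264"
```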
The final step of the test is to verify that the published data is received and can be consumed in ROS. For this, use the simple image viewer provided by ROS. To visualize the camera data being streamed from the publisher, complete the final steps provided in the README file, which use the image_view tool. Figure 1 shows the image_view playback with video_first.h264 on a host system.
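For reference, the standard way to view a published image topic with image_view is a single rosrun invocation. The topic name below is an assumption for illustration; use the topic name given in the README.

```shell
# Subscribe to the image topic and display it in a window.
# The /camera/image topic name is an assumption; substitute the topic
# that nv_sensor_producer actually publishes on (see the README).
rosrun image_view image_view image:=/camera/image
```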
This project is intended to show an example of running ROS nodes with NVIDIA DriveWorks on the NVIDIA DRIVE AGX platform. With steps similar to the ones supplied in this example, you can realize many other possibilities by cross-compiling other DriveWorks sample applications to run under the ROS environment on the NVIDIA DRIVE AGX platform.
If you have questions or suggestions, please comment below.