The best way to learn is by doing, and to help you get started, we have assembled a series of tutorials and instructional materials featuring the latest developer innovations.
The GPU Technology Conference (GTC) highlights the latest breakthroughs in autonomous vehicles, AI, HPC, accelerated data science, healthcare, graphics, and more.
Browse the extensive catalog of recorded presentations on the future of self-driving technology on NVIDIA On Demand below.
You can also view our DRIVE Developer Day sessions below, which offer deep dives into safe and robust autonomous vehicle development.
Register for upcoming automotive developer webinars to learn more about the NVIDIA DRIVE® platform.
In each hour-long session, NVIDIA experts will dive into the details of various aspects of the end-to-end AV computational pipeline and will be available for live Q&A.
Stay tuned for upcoming automotive developer webinars to learn more about the NVIDIA DRIVE platform. In the meantime, check out the past webinars below.
Catch up on past DRIVE webinar recordings at your convenience!
|DRIVE AGX Hardware Update with NVIDIA Orin||DRIVE AGX|
Get an early look at the next generation of DRIVE AGX hardware platforms based on the upcoming Orin SoC.
|Turbocharge Autonomous Vehicle Development with DRIVE OS and DriveWorks||DRIVE OS|
Learn how NVIDIA DRIVE OS and DriveWorks turbocharge autonomous vehicle development, delivering foundational autonomous tools and functional safety while simultaneously optimizing NVIDIA DRIVE AGX compute performance.
|DRIVE AV Perception Overview||DRIVE Perception|
The ability to interpret a scene with 360° awareness is a critical function of an autonomous vehicle. In this session, we'll highlight the NVIDIA DRIVE AV Perception software stack, including an architecture overview and our latest algorithmic results.
|Mapping and Localization with DRIVE AV||DRIVE Mapping|
The use of HD maps is a key part of ensuring a safe and comfortable journey. In this session, we'll provide an overview of NVIDIA's end-to-end solution for creating and maintaining crowdsourced HD maps, and how they're used for vehicle localization.
|A Modular Approach to AV Planning and Control||DRIVE Planning|
Planning and control executes maneuvers using input from perception, prediction, and mapping. In this session, we'll review the NVIDIA DRIVE AV modular approach to planning and control software and the variety of capabilities it enables.
|Leveraging EGX and DGX for Developing AV Platforms and Supporting Connected Services||DRIVE Infrastructure|
We'll look at how NVIDIA DGX and NVIDIA EGX are used to create the network of data centers and edge devices necessary for developing an AV platform and delivering functionality and connected services to vehicles of the future.
|Automated Testing at Scale to Enable Deployment of Autonomous Vehicles||DRIVE Infrastructure|
In this webinar, we'll discuss the use of simulation and computing infrastructure for AV development. We'll also demonstrate a scalable and automated set of solutions for end-to-end testing to enable AV deployment on the road, according to safety standards.
|NVIDIA DRIVE Sim and Omniverse||DRIVE Sim|
We'll introduce NVIDIA DRIVE Sim, the simulation platform built on NVIDIA Omniverse, and show how it's used to generate the scenarios and sensor data needed to develop and validate autonomous vehicles in virtual environments.
|Optimization Strategies for Deploying Self-Driving DNNs with NVIDIA TensorRT||DriveWorks|
This webinar focuses on how popular model architectures map onto various tasks on DRIVE AGX, along with practical considerations for balancing task accuracy against compute and memory resources when deploying with TensorRT. The focus is on the most generic, and typically most expensive, component of DNNs for computer vision: the model backbone, for example the popular ResNet-50 architecture.
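As a back-of-the-envelope illustration of the accuracy-versus-resources trade-off discussed above, the parameter and multiply-accumulate counts of a convolutional layer can be estimated in a few lines. This is a generic sketch; `conv2d_cost` is an illustrative helper, and the layer shape below is a hypothetical example typical of a ResNet-50 stage, not taken from any specific DRIVE model.

```python
def conv2d_cost(c_in, c_out, k, h_out, w_out):
    """Estimate parameters and multiply-accumulate ops (MACs) for one conv layer."""
    params = k * k * c_in * c_out + c_out          # weights + biases
    macs = k * k * c_in * c_out * h_out * w_out    # one MAC per weight per output pixel
    return params, macs

# Example: a 3x3 convolution, 256 -> 256 channels, 56x56 output feature map
params, macs = conv2d_cost(256, 256, 3, 56, 56)
print(f"params: {params:,}  MACs: {macs:,}")
```

Summing these counts over a backbone gives a first-order view of its memory footprint and compute cost, which is the quantity being traded against accuracy when choosing or pruning an architecture.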
|Calibrating AV Sensors With NVIDIA DriveWorks SDK||DriveWorks|
In this webinar, we will cover the steps to perform static (offline) extrinsic and intrinsic camera calibration for AV sensors. We will also cover the DriveWorks APIs and walk through sample code that performs camera extrinsic self-calibration while the autonomous vehicle is driving.
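To make the notion of an extrinsic calibration concrete, here is a minimal pure-Python sketch of applying a rig-to-sensor extrinsic, a rotation plus translation, to a 3D point. This is not DriveWorks API code; the function names and the simple yaw-only rotation are illustrative assumptions.

```python
import math

def yaw_rotation(theta):
    """3x3 rotation matrix for a rotation of theta radians about the z-axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def apply_extrinsic(R, t, p):
    """Transform point p from the rig frame into the sensor frame: p' = R @ p + t."""
    return [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]

# A camera yawed 90 degrees relative to the rig, co-located at the rig origin:
R = yaw_rotation(math.pi / 2)
t = [0.0, 0.0, 0.0]
p_cam = apply_extrinsic(R, t, [1.0, 0.0, 0.0])  # rig x-axis maps onto camera y-axis
```

Static calibration estimates R and t once on the bench; self-calibration refines them continuously while the vehicle drives, compensating for mounting drift.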
|Real-Time Sensor Recording and Replay Tools with the NVIDIA DriveWorks SDK||DriveWorks|
Experts from NVIDIA discuss the recording pipeline and available tools in the DriveWorks SDK, enabling developers to record and replay high quality sensor data. This webinar covers how to manage the sensor configuration, both on the development bench and in the vehicle, distributed recording as well as post-recording tools and replay capabilities.
|Autonomous Driving at Scale: Architect and Deploy Object Detection Inference Pipelines||DRIVE Infrastructure|
Experts from NVIDIA and system integrator Tata Consultancy Services (TCS) cover the data annotation pipeline for autonomous vehicle training. They address sensitivity to variations in data distribution, data types, annotation requirements, quality parameters, and data volume, as well as the effectiveness of automation, all of which are critical to achieving optimal annotation accuracy and speed.
|Developing Intelligent In-Cabin Experience Using DRIVE IX||DRIVE IX|
The webinar covers the architecture of the NVIDIA DRIVE IX open platform and how to build intelligent occupant-centric applications, as well as future capabilities coming down the pipeline. With the modular design of DRIVE IX, developers can use a wide array of functionalities and try out different technologies for the same feature using plugins. We walk through sample code that demonstrates how to build intelligent applications using APIs from the DRIVE IX platform.
|NVIDIA DRIVE Infrastructure – The Complete Datacenter Infrastructure to Build Autonomous Vehicles||DRIVE Infrastructure|
This webinar introduces NVIDIA's infrastructure for building and maintaining autonomous vehicles. It includes techniques for managing the lifecycle of deep learning models, from definition, training, and deployment to reloading and lifelong learning. It also covers the powerful NVIDIA DRIVE Constellation™ AV simulator, which is enabling the industry to safely drive billions of qualified miles in virtual reality. This two-server simulation platform makes it possible to test an autonomous vehicle in a near-infinite variety of conditions and scenarios before it ever reaches the road. Finally, we introduce the large-scale deployment of this validation infrastructure and the ecosystem around it.
|Planning and Control Architecture and Implementation for Autonomous Vehicles||DRIVE Planning|
An autonomous vehicle's planning and control stack is responsible for generating a safe and comfortable plan to drive the car. In this webinar, we review the general architecture of the NVIDIA DRIVE Planning and Control stack and show examples of how the stack drives the car in different scenarios. We'll discuss an example of a module API and explore methods to test and benchmark performance of planning and control functions.
|Perception and Mapping Architecture for Autonomous Vehicles||DRIVE Perception|
Perception, mapping and localization are critical for robust autonomous vehicle planning and control. In this webinar, we’ll walk through the NVIDIA DRIVE Perception and NVIDIA DRIVE Mapping general architectures. We also show how the stack provides a perception signal and how that perception signal is used to build a map and localize the ego-car.
|Building AI-Enabled AV Applications with NVIDIA DRIVE Software||DRIVE Software|
The open and scalable NVIDIA DRIVE Software includes DriveWorks and DRIVE AV SDKs, which provide the building blocks for developers to implement highly optimized autonomous vehicle software applications that leverage the computing power of the NVIDIA DRIVE AGX platform. In this webinar, we walk through DriveWorks and DRIVE AV design principles, the algorithms included in each release, our approach to leveraging the power of the NVIDIA Xavier SoC, as well as how to develop optimized applications.
|Introducing NVIDIA DRIVE OS, the Functional Safety Operating System for Autonomous Vehicles||DRIVE OS|
NVIDIA DRIVE OS is the foundation of the NVIDIA DRIVE Software stack. In this presentation, we introduce DRIVE OS for Safety, the first functional safety operating system designed specifically for accelerated computing and artificial intelligence. Certifying an open platform like DRIVE OS is a monumental undertaking, involving ground-breaking processes, tools, methodologies and technologies, fine-tuned to provide a rich set of features, cybersecurity and a performant real-time architecture.
In this webinar, you'll learn how we approached this vital task and how to implement functional safety in the autonomous vehicle development process.
|NVIDIA DRIVE AGX Solutions for Scalable Autonomous Vehicle Development||DRIVE AGX|
NVIDIA DRIVE AGX is an open, scalable architecture for autonomous driving capabilities, from NCAP through robotaxi. NVIDIA has developed a unique suite of SoC, GPU, and Smart Network computational and acceleration options for flexible autonomous vehicle development. This session provides details on our latest SoC, GPU, and Smart Network products and how they can be used in a vehicle computer architecture. We also go over DRIVE AGX Hyperion sensor solutions for both passenger cars and commercial trucks.
|Integrating DNN Inference into Autonomous Vehicle Applications with NVIDIA DriveWorks SDK||DriveWorks|
In this webinar, we’ll cover the steps to perform inference on a pretrained network with DriveWorks. We’ll first review DriveWorks basics before exploring the DriveWorks DNN APIs and tools to convert, optimize, and run inference. Finally, we’ll walk through sample code that demonstrates how to integrate your DNN into your software pipeline.
|Part 3: Using CUDA Kernel Concurrency and GPU Application Profiling for Optimizing Inference on DRIVE AGX||DRIVE AGX|
Concurrent execution of multiple GPU inference tasks can deliver significant performance gains over serialized execution. As a real-world use case, we implement a multi-network inference pipeline for object detection and lane segmentation. In building this application, we show how to achieve kernel concurrency using multiple CUDA streams and CUDA Graphs. We then introduce NVIDIA Nsight Systems to profile the application, showing the performance gains from implementing concurrency.
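The core idea, that independent tasks overlap instead of running back-to-back, can be previewed without a GPU. The sketch below stands in for CUDA streams using Python threads and `time.sleep` as a placeholder workload; it is an analogy for the scheduling benefit, not CUDA code.

```python
import threading
import time

def inference_task(duration):
    """Placeholder for one GPU inference workload (e.g. detection or segmentation)."""
    time.sleep(duration)

# Serial execution: the two "networks" run back-to-back
start = time.perf_counter()
inference_task(0.2)
inference_task(0.2)
serial = time.perf_counter() - start

# Concurrent execution: the two tasks overlap, analogous to two CUDA streams
start = time.perf_counter()
threads = [threading.Thread(target=inference_task, args=(0.2,)) for _ in range(2)]
for th in threads:
    th.start()
for th in threads:
    th.join()
concurrent = time.perf_counter() - start

print(f"serial: {serial:.2f}s  concurrent: {concurrent:.2f}s")
```

On a GPU the same overlap is achieved by issuing independent kernels on separate streams (or capturing them in a CUDA graph), and a profiler such as Nsight Systems makes the overlap visible on a timeline.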
|Part 2: Extending NVIDIA TensorRT with custom layers using CUDA||DRIVE AGX|
The second installment of this webinar series explains how to extend TensorRT with custom operations, running custom layers through TensorRT using the plugin interface. For the fastest implementation of custom layers, the CUDA kernels should be built for the same GPU on which the optimized engine will run. We present optimized CUDA kernel implementations, then cover TensorRT plugins and how to wrap a CUDA kernel in a TensorRT plugin for DNN model optimization, with a sample application.
|Part 1: Optimizing DNN inference using CUDA and TensorRT on DRIVE AGX||DRIVE AGX|
In this webinar, we introduce CUDA cores, threads, blocks, grids, and streams, as well as the TensorRT workflow. We also cover CUDA memory management and TensorRT optimization, and show how to deploy optimized deep learning networks using TensorRT samples on NVIDIA DRIVE AGX.
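The thread-indexing arithmetic behind threads, blocks, and grids can be previewed without a GPU. This Python sketch mirrors how a 1D CUDA kernel computes each thread's global index (`blockIdx.x * blockDim.x + threadIdx.x`) and applies the usual bounds check; it emulates the launch serially and is an illustration, not CUDA code.

```python
def global_index(block_idx, block_dim, thread_idx):
    """CUDA-style 1D global thread index: blockIdx.x * blockDim.x + threadIdx.x."""
    return block_idx * block_dim + thread_idx

def launch_1d(grid_dim, block_dim, n, kernel):
    """Emulate a 1D kernel launch: invoke `kernel` once per in-bounds thread."""
    for b in range(grid_dim):
        for t in range(block_dim):
            i = global_index(b, block_dim, t)
            if i < n:  # bounds check, exactly as in a real CUDA kernel
                kernel(i)

# Vector add over n=10 elements with blocks of 4 threads (ceil(10/4) = 3 blocks)
a = list(range(10))
b = list(range(10))
out = [0] * 10

def vec_add(i):
    out[i] = a[i] + b[i]

launch_1d(3, 4, 10, vec_add)
# out is now [0, 2, 4, 6, 8, 10, 12, 14, 16, 18]
```

On the GPU, all of these per-thread invocations execute in parallel; the grid size is chosen as the ceiling of the problem size divided by the block size, which is why the in-bounds check is needed.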
|Developing a Camera Pipeline Using NVIDIA DriveWorks||DriveWorks|
This webinar covers the steps to develop camera image processing software with the DriveWorks SDK. Using this platform, developers can implement a range of capabilities seamlessly and with high performance. The webinar includes DriveWorks image basics, low-level computer vision modules, and feature tracking and DNN samples.
|Integrating Custom Sensors Using NVIDIA DriveWorks||DriveWorks|
This webinar covers how to implement and use the sensor plugins for different sensor types such as radar, lidar, and camera. Such plugins will make it possible for developers to bring new sensors into the DriveWorks SAL and to implement the transport and protocol layers necessary to communicate with the sensor.
Peek under the hood of NVIDIA DRIVE Software with our latest video series.
Deep Learning Institute (DLI)
Learn how to integrate your sensor of choice for NVIDIA DRIVE.
In this course, you will be guided through a working example of a custom plugin for a sensor using the DriveWorks SDK and implement core elements of the plugin.
At the end of the course, you will have implemented a fully working plugin that you can test with prerecorded data. This implementation covers all aspects of a sensor integration and can be used as a template for other sensor integrations in your own projects.