Cloud-Native technologies offer the flexibility and agility needed for rapid product development and continual product upgrades.

Jetson brings Cloud-Native to the edge, enabling technologies like containers and container orchestration that have revolutionized cloud applications.

NVIDIA JetPack includes the NVIDIA Container Runtime with Docker integration, enabling GPU-accelerated containerized applications on the Jetson platform. Developers can package an application for Jetson with all its dependencies into a single container that runs the same way in any deployment environment.
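
For example, an image from NGC can be launched with GPU access by selecting the NVIDIA runtime. The sketch below assumes Docker is invoked with sudo; the image tag is a placeholder, so substitute the tag that matches your JetPack release:

    # Confirm that the NVIDIA runtime is registered with Docker (listed under "Runtimes").
    sudo docker info | grep -i runtime

    # Launch a container with GPU access by selecting the NVIDIA runtime.
    sudo docker run -it --rm --runtime nvidia --network host nvcr.io/nvidia/l4t-base:<tag>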

Manage the lifecycle of your containerized applications on the Jetson platform at scale using container orchestration technologies such as Kubernetes.
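
As a minimal sketch, a containerized Jetson application that has been pushed to a registry can be rolled out across a cluster of Jetson nodes with standard kubectl commands; the deployment name and image below are hypothetical:

    # Create a Deployment from a hypothetical Jetson application image and scale it out.
    kubectl create deployment my-jetson-app --image=registry.example.com/my-jetson-app:latest
    kubectl scale deployment my-jetson-app --replicas=3

    # Watch the rollout complete across the Jetson nodes.
    kubectl rollout status deployment/my-jetson-app

One common approach for GPU access inside pods is to configure the NVIDIA runtime as the default container runtime on each Jetson node.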


Several development and deployment containers for Jetson are hosted on NVIDIA NGC. They can be run on JetPack or directly on Jetson Linux using the NVIDIA Container Runtime:

  • L4T-Base container image: This is the base image for all containerized applications on Jetson. The NVIDIA Container Runtime mounts platform-specific libraries and device nodes into the l4t-base container from the underlying host.

  • JetPack container image: All JetPack components are included in this image. It provides a containerized way of running JetPack, is useful for development, and also serves as a recipe for creating a custom container for Jetson.

  • CUDA runtime container image: This image contains the CUDA runtime components, so the NVIDIA Container Runtime does not mount them from the host. This container image is useful for containerizing CUDA applications for deployment.

  • TensorRT runtime container image: TensorRT and cuDNN runtime components are included in this image and are not mounted from the host. (CUDA runtime components are also included via its parent image, cuda-runtime.) This container image is useful for containerizing AI applications for deployment.

  • DeepStream container images: These container images include plugins and libraries that are part of DeepStream SDK. Three different images with varying contents are available: Base, Samples and IoT.

  • TensorFlow container image: This image includes TensorFlow pre-installed in a Python environment. Developers can use this to set up a TensorFlow development environment quickly. This container can be used as a base image for containerizing TensorFlow applications.

  • PyTorch container image: Contains PyTorch and TorchVision pre-installed in a Python environment. Developers can quickly set up a PyTorch development environment and use this container as a base image for containerizing PyTorch applications (a sketch follows this list).

  • Machine Learning container image: Contains TensorFlow, PyTorch, JupyterLab, and other popular ML and data science frameworks such as scikit-learn, scipy, and Pandas pre-installed in a Python environment.
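
To illustrate the base-image workflow mentioned above, the sketch below containerizes a hypothetical PyTorch script on top of the l4t-pytorch image; the tag, application file, and image name are placeholders:

    # Create a minimal Dockerfile that layers a hypothetical application onto the
    # l4t-pytorch image (replace <tag> with the tag that matches your JetPack release).
    printf '%s\n' \
      'FROM nvcr.io/nvidia/l4t-pytorch:<tag>' \
      'COPY app.py /opt/app/app.py' \
      'CMD ["python3", "/opt/app/app.py"]' > Dockerfile

    # Build the application container on the Jetson device and run it with GPU access.
    sudo docker build -t my-pytorch-app .
    sudo docker run --rm --runtime nvidia my-pytorch-app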

NVIDIA also hosts the following containers for Jetson that can be used on an x86 host machine:

  • JetPack Cross Compilation container image: This container simplifies cross compilation by providing the required cross-compilation tools and a preconfigured build environment for cross compiling JetPack components on an x86 host running Linux.

  • Jetson Linux Flashing container image: Provides a lightweight environment for flashing Jetson modules and developer kits without installing any prerequisites on your x86 host running Linux (a sketch follows below).
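
As a rough sketch, the flashing container generally needs privileged access to the host's USB bus so that it can reach a Jetson device in recovery mode; the image name below is a placeholder for the flashing image published on NGC:

    # Run the flashing container with access to the host's USB devices.
    # <flashing-image> is a placeholder; use the Jetson Linux flashing image from NGC.
    sudo docker run -it --rm --privileged -v /dev/bus/usb:/dev/bus/usb <flashing-image>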

Resources