Cloud-Native on Jetson
Cloud-Native technologies offer the flexibility and agility needed for rapid product development and continual product upgrades.
Jetson brings Cloud-Native to the edge, enabling technologies such as containers and container orchestration that revolutionized cloud applications.
NVIDIA JetPack includes NVIDIA Container Runtime with Docker integration, enabling GPU-accelerated containerized applications on the Jetson platform. Developers can package an application for Jetson with all its dependencies into a single container that works consistently across deployment environments.
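As a quick sketch, launching a GPU-enabled container on Jetson uses the `--runtime nvidia` flag so the NVIDIA Container Runtime can mount the driver libraries and device nodes; the image tag below is illustrative and should match your L4T release:

```shell
# Start an interactive shell in the l4t-base container with GPU access.
# The tag (r32.4.3 here) must correspond to the L4T version on the host.
sudo docker run -it --rm --runtime nvidia --network host \
    nvcr.io/nvidia/l4t-base:r32.4.3 /bin/bash
```

Inside the container, CUDA and other JetPack libraries mounted from the host are available to the application.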
Manage the lifecycle of your containerized application on the Jetson platform at scale using container orchestration technologies like Kubernetes.
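For illustration, a minimal Kubernetes Pod manifest targeting a Jetson (arm64) node might look like the following; the Pod name and image are hypothetical, and this sketch assumes the NVIDIA runtime has been configured as the default container runtime on the Jetson node so the workload gets GPU access:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: jetson-inference            # hypothetical Pod name
spec:
  nodeSelector:
    kubernetes.io/arch: arm64       # schedule onto a Jetson (arm64) node
  containers:
  - name: app
    # Hypothetical application image built on one of the NGC base images below
    image: myregistry/jetson-app:latest
```

A real deployment would typically use a Deployment or DaemonSet rather than a bare Pod, letting Kubernetes handle restarts and rolling upgrades across a fleet of Jetson devices.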
Several development and deployment containers for Jetson are hosted on NVIDIA NGC:
- L4T-Base container image: This is the base image for all containerized applications on Jetson. The NVIDIA Container Runtime mounts platform-specific libraries and device nodes into the l4t-base container from the underlying host. Similarly, CUDA, TensorRT, VPI, and other user-level libraries in JetPack are made available from the host for use by containerized applications.
- CUDA runtime container image: This image contains the CUDA runtime components (the NVIDIA Container Runtime does not mount them from the host). This container image is useful for containerizing CUDA applications for deployment.
- TensorRT runtime container image: TensorRT and cuDNN runtime components are included in this image and are not mounted from the host. (CUDA runtime components are also included via its parent image, cuda-runtime.) This container image is useful for containerizing AI applications for deployment.
- DeepStream container images: These container images include the plugins and libraries that are part of the DeepStream SDK. Three variants with different contents are available: Base, Samples, and IoT.
- TensorFlow container image: This image includes TensorFlow pre-installed in a Python 3.6 environment. Developers can use this to set up a TensorFlow development environment quickly. This container can be used as a base image for containerizing TensorFlow applications.
- PyTorch container image: Contains PyTorch and TorchVision pre-installed in a Python 3.6 environment. Developers can quickly set up a PyTorch development environment and can use this container as a base image for containerizing PyTorch applications.
- Machine Learning container image: Contains TensorFlow, PyTorch, JupyterLab, and other popular ML and data science frameworks such as scikit-learn, SciPy, and pandas pre-installed in a Python 3.6 environment.
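To show how these images serve as bases for deployment, here is a minimal Dockerfile sketch that layers an application on top of l4t-base; the tag, directory layout, and script name are hypothetical placeholders:

```dockerfile
# Base image from NGC (tag is illustrative; pick the one matching
# the L4T release on your Jetson host)
FROM nvcr.io/nvidia/l4t-base:r32.4.3

# Copy the application into the image
# (the app/ directory and entry script are hypothetical)
COPY app/ /opt/app/
WORKDIR /opt/app

# Run the application; GPU libraries are mounted in by the
# NVIDIA Container Runtime at container start
CMD ["python3", "run_inference.py"]
```

Building this with `docker build` on the Jetson device (or on an arm64 build host) produces a single deployable artifact that carries the application and its dependencies, while the platform libraries continue to come from the host at runtime.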