JetPack 4.6.5

JetPack 4.6.5 is the latest production release and a minor update to JetPack 4.6.4. It is the same as JetPack 4.6.4 but includes Jetson Linux 32.7.5, which brings in security fixes. It also adds support for additional memory parts covered by PCNs 211141, 211181, 210641, 210521, and 210541.

JetPack 4.6.5 can be installed using SDK Manager.

JetPack 4.6.4

JetPack 4.6.4 is a production release and a minor update to JetPack 4.6.3. It is the same as JetPack 4.6.3 but includes Jetson Linux 32.7.4, which brings in security fixes. It also adds image-based OTA support for Jetson AGX Xavier 32GB modules with Hynix DRAM and Western Digital eMMC (PCN 208560).

JetPack 4.6.4 can be installed using SDK Manager.

JetPack 4.6.2

JetPack 4.6.2 is a production release and a minor update to JetPack 4.6.1. It is the same as JetPack 4.6.1 but includes L4T 32.7.2, which brings in security fixes. All other features remain the same.

JetPack 4.6.2 can be installed using SDK Manager.

JetPack 4.6.1

JetPack 4.6.1 is a production release and a minor update to JetPack 4.6. It supports all Jetson modules, including the new Jetson AGX Xavier 64GB and Jetson Xavier NX 16GB. JetPack 4.6.1 includes TensorRT 8.2, DLA 1.3.7, VPI 1.2 with production-quality Python bindings, and L4T 32.7.1.

See the highlights below for the full list of features added in JetPack 4.6.1.

In addition to the L4T-base container, CUDA runtime and TensorRT runtime containers are now released on NGC for JetPack 4.6.1.

Installing JetPack

NVIDIA SDK Manager Method

NVIDIA SDK Manager, running on a Linux host computer, can be used to flash a Jetson developer kit and install JetPack on it.

Debian Packages Method

JetPack can also be installed or upgraded using the Debian package management tool on Jetson. We also host Debian packages for JetPack components for installation on the host. Refer to the JetPack documentation for instructions.

Key Features in JetPack

OS

NVIDIA L4T provides the bootloader, Linux kernel 4.9, necessary firmware, NVIDIA drivers, a sample root filesystem based on Ubuntu 18.04, and more.

JetPack 4.6.1 includes L4T 32.7.1 with these highlights:

  • Support for Jetson AGX Xavier 64GB and Jetson Xavier NX 16GB

TensorRT

TensorRT is a high-performance deep learning inference runtime for image classification, segmentation, and object detection neural networks. TensorRT is built on CUDA, NVIDIA’s parallel programming model, and enables you to optimize inference for all deep learning frameworks. It includes a deep learning inference optimizer and runtime that deliver low latency and high throughput for deep learning inference applications.

JetPack 4.6.1 includes TensorRT 8.2.1.
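
As a rough illustration of the TensorRT workflow, the sketch below builds a serialized engine from an ONNX model with the TensorRT 8.2 Python API. The model file name, workspace size, and FP16 flag are assumptions made for the example, not part of this release.

```python
# Minimal sketch: build a TensorRT engine from an ONNX model (TensorRT 8.x Python API).
# "model.onnx" and the build settings are placeholders; adjust them for your network.
import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open("model.onnx", "rb") as f:
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise SystemExit("Failed to parse the ONNX model")

config = builder.create_builder_config()
config.max_workspace_size = 1 << 28          # 256 MiB of build workspace
if builder.platform_has_fast_fp16:
    config.set_flag(trt.BuilderFlag.FP16)    # enable FP16 kernels where beneficial

engine_bytes = builder.build_serialized_network(network, config)
with open("model.engine", "wb") as f:
    f.write(engine_bytes)
```

The serialized engine can later be deserialized with trt.Runtime and executed through an execution context; see the TensorRT documentation for the full inference path.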

cuDNN

The CUDA Deep Neural Network library (cuDNN) provides high-performance primitives for deep learning frameworks. It includes highly tuned implementations of standard routines such as forward and backward convolution, pooling, normalization, and activation layers.

JetPack 4.6.1 includes cuDNN 8.2.1.

CUDA

CUDA Toolkit provides a comprehensive development environment for C and C++ developers building GPU-accelerated applications. The toolkit includes a compiler for NVIDIA GPUs, math libraries, and tools for debugging and optimizing the performance of your applications.

JetPack 4.6.1 includes CUDA 10.2.
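
To give a feel for the CUDA toolchain on Jetson, the sketch below launches a small SAXPY kernel from Python. PyCUDA is a third-party package and not part of JetPack; it is assumed here only so the example stays in Python, and it compiles the embedded CUDA C kernel with the installed CUDA 10.2 toolkit.

```python
# Minimal sketch: compile and launch a CUDA C kernel from Python via PyCUDA.
# PyCUDA is assumed to be installed separately; under the hood it invokes the
# CUDA compiler shipped with JetPack (make sure nvcc is on your PATH).
import numpy as np
import pycuda.autoinit              # creates a CUDA context on the default GPU
import pycuda.driver as drv
from pycuda.compiler import SourceModule

mod = SourceModule("""
__global__ void saxpy(float a, const float *x, float *y, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}
""")
saxpy = mod.get_function("saxpy")

n = 1 << 20
x = np.random.rand(n).astype(np.float32)
y = np.random.rand(n).astype(np.float32)
expected = 2.0 * x + y

saxpy(np.float32(2.0), drv.In(x), drv.InOut(y), np.int32(n),
      block=(256, 1, 1), grid=((n + 255) // 256, 1))

assert np.allclose(y, expected)     # y was updated in place on the GPU
```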

Multimedia API

The Jetson Multimedia API package provides low-level APIs for flexible application development.

Camera application API: libargus offers a low-level frame-synchronous API for camera applications, with per-frame camera parameter control, multiple (including synchronized) camera support, and EGL stream outputs. RAW output CSI cameras needing ISP can be used with either libargus or the GStreamer plugin. In either case, the V4L2 media-controller sensor driver API is used.

Sensor driver API: V4L2 API enables video decode, encode, format conversion and scaling functionality. V4L2 for encode opens up many features like bit rate control, quality presets, low latency encode, temporal tradeoff, motion vector maps, and more.
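
As a sketch of how the hardware encoder is typically reached from an application, the pipeline below uses the Jetson GStreamer elements (which sit on top of the V4L2 encoder API) from Python. The element names, properties, and output file name are assumptions based on common accelerated GStreamer usage on Jetson and should be checked against the documentation for your camera and JetPack version.

```python
# Minimal sketch: capture from a CSI camera and encode H.264 with the hardware
# encoder through GStreamer. The pipeline string (nvarguscamerasrc, nvv4l2h264enc,
# bitrate, resolution, output path) is illustrative, not prescriptive.
import gi
gi.require_version("Gst", "1.0")
from gi.repository import Gst

Gst.init(None)

pipeline = Gst.parse_launch(
    "nvarguscamerasrc num-buffers=300 ! "
    "video/x-raw(memory:NVMM), width=1920, height=1080, framerate=30/1 ! "
    "nvv4l2h264enc bitrate=8000000 ! h264parse ! qtmux ! "
    "filesink location=capture.mp4"
)

pipeline.set_state(Gst.State.PLAYING)
bus = pipeline.get_bus()
# Block until all buffers have been encoded (EOS) or an error is reported.
bus.timed_pop_filtered(Gst.CLOCK_TIME_NONE,
                       Gst.MessageType.EOS | Gst.MessageType.ERROR)
pipeline.set_state(Gst.State.NULL)
```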

JetPack 4.6.1 includes the following multimedia highlights:

  • Support for Scalable Video Coding (SVC) H.264 encoding
  • Support for YUV444 8-bit and 10-bit encoding and decoding

Computer Vision

VPI (Vision Programming Interface) is a software library that provides computer vision and image processing algorithms implemented on the PVA[1] (Programmable Vision Accelerator), GPU, and CPU.

OpenCV is a leading open source library for computer vision, image processing and machine learning.

VisionWorks[2] is a software development package for Computer Vision (CV) and image processing.

JetPack 4.6.1 includes VPI 1.2 with the following highlights (a usage sketch follows this list):

  • Production quality support for Python bindings
  • Multi-Stream support in Python bindings to allow creation of multiple streams to parallelize operations
  • Support for calling Python scripts in a VPI Stream
  • New algorithms:
    • Image Erode/Dilate algorithm on CPU and GPU backends
    • Image Min/Max location algorithm on CPU and GPU backends
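
A minimal usage sketch of the VPI Python bindings follows, assuming VPI 1.2 from this release: it wraps a NumPy array, creates a stream, and runs a box filter on the CUDA backend. Method names follow the VPI Python samples; verify the exact signatures, including those of the new Erode/Dilate and Min/Max location algorithms, against the VPI 1.2 documentation.

```python
# Minimal sketch of the VPI 1.2 Python bindings: wrap a NumPy image and run a
# box filter on the CUDA backend using an explicitly created stream.
# Treat the method names as approximations and check the VPI 1.2 docs.
import numpy as np
import vpi

# Stand-in for a real frame (for example, one loaded with OpenCV or PIL).
frame = (np.random.rand(480, 640) * 255).astype(np.uint8)

stream = vpi.Stream()              # several streams can be used to parallelize work

with vpi.Backend.CUDA, stream:
    image = vpi.asimage(frame)     # wrap the NumPy array as a VPI image
    blurred = image.box_filter(5)  # 5x5 box filter submitted to the stream

stream.sync()                      # wait for the submitted work to finish
```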

JetPack 4.6.1 includes OpenCV 4.1.1.
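
For comparison, the OpenCV Python bindings included with JetPack can run the same kind of morphology and min/max-location operations listed for VPI above, on the CPU. The random input image below is just a placeholder.

```python
# Minimal sketch with the OpenCV Python bindings: erode/dilate a grayscale
# image and find its min/max pixel locations on the CPU.
import cv2
import numpy as np

img = (np.random.rand(480, 640) * 255).astype(np.uint8)   # placeholder image

kernel = np.ones((3, 3), np.uint8)
eroded = cv2.erode(img, kernel, iterations=1)
dilated = cv2.dilate(img, kernel, iterations=1)

min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(img)
print("min", min_val, "at", min_loc, "| max", max_val, "at", max_loc)
```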

JetPack 4.6.1 includes VisionWorks 1.6.

[1] PVA is available only on the Jetson AGX Xavier series and Jetson Xavier NX.
[2] VisionWorks will no longer be included after JetPack 4.6.x. Developers should instead use the VPI library for computer vision and image processing. Refer to the announcement here.

Developer Tools

CUDA Toolkit provides a comprehensive development environment for C and C++ developers building high-performance GPU-accelerated applications with CUDA libraries. The toolkit includes Nsight Eclipse Edition, debugging and profiling tools including Nsight Compute, and a toolchain for cross-compiling applications.

NVIDIA Nsight Systems is a low overhead system-wide profiling tool, providing the insights developers need to analyze and optimize software performance.

NVIDIA Nsight Graphics is a standalone application for debugging and profiling graphics applications.

JetPack 4.6.1 includes NVIDIA Nsight Systems 2021.5.

JetPack 4.6.1 includes NVIDIA Nsight Graphics 2021.2.

Refer to the release notes for more details.

Supported SDKs and Tools

NVIDIA DeepStream SDK is a complete analytics toolkit for AI-based multi-sensor processing and video and audio understanding.

DeepStream SDK 6.0 supports JetPack 4.6.1.

NVIDIA Triton™ Inference Server simplifies deployment of AI models at scale. Triton Inference Server is open source and supports deployment of trained AI models from NVIDIA TensorRT, TensorFlow, and ONNX Runtime on Jetson. On Jetson, Triton Inference Server is provided as a shared library for direct integration with the C API.

PowerEstimator is a web application that simplifies the creation of custom power mode profiles and estimates Jetson module power consumption.

PowerEstimator v1.1 supports JetPack 4.6.

Cloud Native

Jetson brings cloud-native technologies to the edge, enabling containers and container orchestration. NVIDIA JetPack includes the NVIDIA Container Runtime with Docker integration, enabling GPU-accelerated containerized applications on the Jetson platform.

NVIDIA hosts several container images for Jetson on NVIDIA NGC. Some are suitable for software development with samples and documentation and others are suitable for production software deployment, containing only runtime components. Find more information and a list of all container images at the Cloud-Native on Jetson page.

Security

NVIDIA Jetson modules include various security features, including Hardware Root of Trust, Secure Boot, Hardware Cryptographic Acceleration, Trusted Execution Environment, Disk and Memory Encryption, Physical Attack Protection, and more. Learn about these features in the security section of the Jetson Linux Developer Guide.

Functional Safety

The NVIDIA Jetson approach to functional safety is to give access to the hardware error diagnostics foundation that can be used in the context of safety-related system design. The Jetson Safety Extension Package (JSEP) provides an error diagnostic and error reporting framework for implementing safety functions and achieving compliance with functional safety standards. Learn more.


For a full list of samples and documentation, see the JetPack documentation.

For older versions of JetPack, please visit the JetPack Archive.