Boost Edge AI Performance with the New NVIDIA Jetson Orin NX 16 GB


Building on the momentum from last year’s expansion of NVIDIA Jetson edge AI devices, the NVIDIA Jetson Orin NX 16 GB module is now available for purchase worldwide. 

The Jetson Orin NX 16 GB module delivers unmatched performance and efficiency for small form-factor, low-power robots and autonomous machines, making it ideal for products such as drones and handheld devices. The module supports advanced applications across industries such as manufacturing, logistics, retail, agriculture, healthcare, and life sciences, all in a truly compact, power-efficient package.

It is the smallest Jetson form factor, delivering up to 100 TOPS of AI performance with power configurable between 10 W and 25 W. It gives developers up to 3x the performance of the NVIDIA Jetson AGX Xavier and up to 5x the performance of the NVIDIA Jetson Xavier NX.

The system-on-module supports multiple concurrent AI application pipelines with an NVIDIA Ampere architecture GPU, next-generation deep learning and vision accelerators, high-speed I/O, and fast memory bandwidth. You can develop solutions using your largest and most complex AI models for natural language understanding, 3D perception, and multi-sensor fusion.

Showcasing this leap in performance, NVIDIA ran computer vision benchmarks using NVIDIA JetPack 5.1. Testing covered dense INT8 and FP16 pretrained models from the NGC catalog. For comparison, the same models were also run on Jetson Xavier NX.

Following is the complete list of benchmarks:

Taking the geometric mean of these benchmarks, Jetson Orin NX shows a 2.1x performance increase over Jetson Xavier NX. With future software optimizations, this is expected to approach 3.1x for dense benchmarks. Other Jetson devices have seen roughly 1.5x performance gains since their first supporting software release, and a similar trajectory is anticipated for the Jetson Orin NX 16 GB.
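For readers reproducing this kind of summary, the geometric mean of per-model speedup ratios can be computed in a few lines. The ratios below are hypothetical placeholders for illustration, not the measured benchmark numbers:

```python
import math

# Hypothetical per-model speedup ratios (Orin NX vs. Xavier NX).
# Substitute the measured ratios from your own benchmark runs.
speedups = [1.8, 2.0, 2.3, 2.4]

# Geometric mean: the n-th root of the product of the ratios.
# Preferred over the arithmetic mean for summarizing ratios,
# since it is not skewed by a single outlier model.
geomean = math.prod(speedups) ** (1 / len(speedups))
print(f"{geomean:.2f}x")
```

The geometric mean is the standard way to aggregate speedup ratios across a benchmark suite, because averaging ratios arithmetically would overweight the models with the largest gains.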

Jetson Orin NX also brings support for sparsity, which will enable even greater performance. With sparsity, you can take advantage of the fine-grained structured sparsity in deep learning networks to increase the throughput for Tensor Core operations.
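The fine-grained structured sparsity that Ampere Tensor Cores accelerate is the 2:4 pattern: in every group of four weights, at least two are zero. A minimal NumPy sketch of pruning a weight tensor to this pattern (the function name `prune_2_to_4` is ours, for illustration):

```python
import numpy as np

def prune_2_to_4(weights: np.ndarray) -> np.ndarray:
    """Zero the two smallest-magnitude values in every group of four,
    producing the 2:4 structured sparsity pattern that Ampere
    Tensor Cores can exploit for higher throughput."""
    w = weights.reshape(-1, 4).copy()
    # Indices of the two smallest |w| in each group of four.
    drop = np.argsort(np.abs(w), axis=1)[:, :2]
    np.put_along_axis(w, drop, 0.0, axis=1)
    return w.reshape(weights.shape)

w = np.array([0.9, -0.1, 0.05, -0.7, 0.2, 0.8, -0.3, 0.01])
print(prune_2_to_4(w))  # exactly half the weights become zero
```

In practice, pruning is followed by fine-tuning to recover accuracy; tools such as TensorRT can then pack the sparse weights and skip the zeroed multiplications at inference time.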

Figure 1. Computer vision benchmarks for pretrained models: measured dense benchmarks show Jetson Orin NX is 2.1x faster than Jetson Xavier NX, with future software optimizations expected to bring this to 3.1x.

All Jetson Orin modules run the world-standard NVIDIA AI software stack. Supported by an ecosystem of services and products, your road to market has never been faster. NVIDIA JetPack 5.1, also released today, brings support for the Orin NX 16 GB and the latest CUDA-X stack on Jetson Orin.

Additionally, the Jetson partner ecosystem supports a broad range of carrier boards and peripherals for the Jetson Orin NX 16 GB module, such as sensors, cameras, connectivity modules (5G, 4G, Wi-Fi), and more. 

Options include: 

The NVIDIA Jetson AGX Orin Developer Kit is also available now and supports software emulation for the entire family of Jetson Orin modules, including the Jetson Orin NX 16 GB module. 

Get started with the Jetson AGX Orin Developer Kit now. 

Read the NVIDIA Jetson Orin NX documentation available at the Jetson download center. For more information and support, see the NVIDIA Embedded Developer page and the Jetson forums.

The Jetson Orin NX 16 GB is available now for $599 (1KU+).
