The NVIDIA Isaac™ autonomous mobile robot (AMR) platform is an end-to-end, modular autonomy stack that lets a fleet of coordinated robots function robustly and safely among humans in large, highly dynamic, unstructured environments. It gives you a powerful, proven solution for developing state-of-the-art autonomy to use with mobile robots, automated guided vehicles (AGVs), and forklifts.

Apply For Early Access

End-to-End Performance

Isaac AMR delivers robust operation of fully autonomous and collaborative robots.


Rich 3D Maps With Semantics

Create 2D and 3D occupancy maps of large spaces and add semantic labels with the integrated NVIDIA Isaac navigation stack.
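To illustrate what an occupancy map stores under the hood, here is a minimal log-odds occupancy grid in Python. This is a toy sketch of the standard technique, not the Isaac API; the class name and parameters are hypothetical.

```python
import numpy as np

# Minimal log-odds occupancy grid (toy sketch; not the Isaac API).
# Each cell stores the log-odds of being occupied; 0 means unknown (p = 0.5).
class OccupancyGrid2D:
    def __init__(self, width, height, l_occ=0.85, l_free=-0.4):
        self.logodds = np.zeros((height, width))
        self.l_occ, self.l_free = l_occ, l_free

    def update(self, hit_cells, free_cells):
        # Cells where a lidar return landed become more likely occupied;
        # cells the beam passed through become more likely free.
        for x, y in hit_cells:
            self.logodds[y, x] += self.l_occ
        for x, y in free_cells:
            self.logodds[y, x] += self.l_free

    def probability(self):
        # Convert log-odds back to occupancy probability in [0, 1].
        return 1.0 / (1.0 + np.exp(-self.logodds))

grid = OccupancyGrid2D(10, 10)
grid.update(hit_cells=[(5, 5)], free_cells=[(3, 5), (4, 5)])
p = grid.probability()
```

Semantic labels can then be layered on top of such a grid, tagging regions (aisles, docks, keep-out zones) by cell index.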

Lidar Navigation Stack

Use the lidar navigation stack and reconfigure the app (low code/no code) to run in simulation or on the robot. You can also use the cloud/edge server-based Mission Control to optimize route plans with NVIDIA® cuOpt™.
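Route optimization of the kind cuOpt performs is, at its core, a vehicle-routing problem. The toy greedy heuristic below illustrates the shape of that problem with hypothetical warehouse coordinates; a real solver such as cuOpt optimizes the ordering globally rather than greedily, and this sketch does not use cuOpt's API.

```python
import math

def nearest_neighbor_route(depot, stops):
    """Greedy route ordering: repeatedly visit the closest unvisited stop.
    A toy stand-in for the global optimization a real solver performs."""
    route, current = [], depot
    remaining = list(stops)
    while remaining:
        nxt = min(remaining, key=lambda s: math.dist(current, s))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

# Hypothetical pick-up locations on a warehouse floor (metres).
stops = [(8.0, 1.0), (1.0, 2.0), (4.0, 4.0)]
route = nearest_neighbor_route((0.0, 0.0), stops)
```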

Robot Deployment and Management at Scale

Use Mission Control to send missions to a robot fleet, use the navigation stack integrated with Isaac Sim for software-in-the-loop (SIL) testing, and build with secure software updates from NGC (NVIDIA GPU Cloud).

Isaac AMR Software

Isaac AMR 2.0 features an autonomous navigation stack that includes lidar-based grid mapping, global and continuous localization, a global route planner, a mission client, a behavior planner, and wheel-IMU odometry, as well as a new path and trajectory planner and controller.
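To ground one of the stack's components, the global route planner searches a waypoint graph of the facility for the shortest path to a goal. The Dijkstra sketch below is a minimal stand-in for that kind of search; the graph and waypoint names are hypothetical, and this is not the Isaac planner itself.

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra shortest path over a waypoint graph; a minimal stand-in
    for what a global route planner computes. (Illustrative only.)"""
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Reconstruct the waypoint sequence from goal back to start.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Hypothetical waypoint graph of a small floor plan (edge weights in metres).
graph = {
    "dock": [("aisle1", 4.0), ("aisle2", 9.0)],
    "aisle1": [("aisle2", 3.0), ("pick", 8.0)],
    "aisle2": [("pick", 2.0)],
}
path, cost = shortest_route(graph, "dock", "pick")
```

The behavior planner and trajectory planner then refine such a coarse route into drivable, collision-free motion.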

This release also includes tools for data collection and a cloud-based map creation service. You can use the on-premises data center-based mission control for optimizing route planning with the NVIDIA cuOpt engine and delivering up-to-date maps to the robots.

Isaac AMR Cloud Services and Map as a Service (MaaS)

NVIDIA Nova Carter

Isaac AMR Reference Robot

The NVIDIA Nova Carter robot, powered by Nova Orin, is the reference robot for developing with and evaluating the Isaac AMR end-to-end stack. The robot comes fully assembled and ready to use.

It includes an NVIDIA Jetson AGX Orin system-on-module with up to 275 TOPS of compute and uses lidar and a set of stereo cameras for surround perception, mapping, and navigation. The robot is calibrated and tested to work out of the box, giving developers valuable time to innovate on new features and capabilities.

Among possible use cases, Nova Carter can collect data for mapping test areas such as a warehouse or a factory. The processed data can then be deployed to Carter to achieve full autonomy for a specific use case.

NVIDIA Carter - Isaac AMR Reference Robot

Nova Sensors for Full Autonomy

The NVIDIA Nova Carter robot comes with sensors that enhance 3D perception, helping it operate in very large, dynamic environments. This 3D understanding also contributes to higher uptime and availability.



A mix of wide-angle and fisheye cameras for depth, spatial perception, visual localization, grid mapping, and teleoperation


A combination of 3D lidar and 2D lidar for mapping, localization, and navigation, suiting a wide range of autonomous machines, from low-profile robots to forklifts


Other Sensors

An IMU and wheel encoders, fused with the other sensor modalities for precise odometry and localization
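A common way to fuse these two sources is to take translation from the wheel encoders and heading change from the IMU gyro, which drifts less than heading inferred from differential wheel slip. The sketch below is an illustrative dead-reckoning step, not the Isaac odometry implementation; all names and values are hypothetical.

```python
import math

def integrate_odometry(pose, d_left, d_right, gyro_yaw_rate, dt):
    """One dead-reckoning step fusing wheel encoders with an IMU gyro.
    Translation comes from the encoders; rotation from the gyro.
    (Illustrative sketch, not the Isaac odometry module.)"""
    x, y, theta = pose
    d_center = 0.5 * (d_left + d_right)   # distance travelled by the chassis
    d_theta = gyro_yaw_rate * dt          # trust the gyro for heading change
    theta_mid = theta + 0.5 * d_theta     # midpoint heading approximates the arc
    x += d_center * math.cos(theta_mid)
    y += d_center * math.sin(theta_mid)
    return (x, y, theta + d_theta)

# Drive straight for 1 m in ten 0.1 m encoder steps with zero rotation.
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = integrate_odometry(pose, 0.1, 0.1, 0.0, 0.05)
```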

Sensors in Nova Carter

Sensor        | Location | Type         | Interface | Model               | Resolution / Rate                                         | FOV
Stereo Camera | Front    | Stereo       | GMSL2     | LI Hawk 1.1         | 2.3MP RGGB                                                | 120°
Stereo Camera | Back     | Stereo       | GMSL2     | LI Hawk 1.1         | 2.3MP RGGB                                                | 120°
Stereo Camera | Left     | Stereo       | GMSL2     | LI Hawk 1.1         | 2.3MP RGGB                                                | 120°
Stereo Camera | Right    | Stereo       | GMSL2     | LI Hawk 1.1         | 2.3MP RGGB                                                | 120°
Camera        | Front    | Fisheye      | GMSL2     | LI Owl              | 2.3MP RGGB                                                | 202°
Camera        | Back     | Fisheye      | GMSL2     | LI Owl              | 2.3MP RGGB                                                | 202°
Camera        | Left     | Fisheye      | GMSL2     | LI Owl              | 2.3MP RGGB                                                | 202°
Camera        | Right    | Fisheye      | GMSL2     | LI Owl              | 2.3MP RGGB                                                | 202°
IMU           | Front    | Stereo IMU   | I2C       | Bosch BMI088        | 12.5-1600Hz                                               | -
IMU           | Chassis  | IMU          | I2C       | Bosch BMI088        | 12.5-1600Hz                                               | -
Mag.          | Chassis  | Magnetometer | I2C       | Bosch BMM150        | 10Hz                                                      | -
Bar.          | Chassis  | Barometer    | I2C       | Bosch BMP390        | ≤ 200Hz                                                   | -
2D Lidar      | Front    | 2D Lidar     | Ethernet  | Slamtec RPLIDAR S2E | 1600 points/scan                                          | 360°
2D Lidar      | Back     | 2D Lidar     | Ethernet  | Slamtec RPLIDAR S2E | 1600 points/scan                                          | 360°
3D Lidar      | Top      | 3D Lidar     | Ethernet  | Hesai PandarXT-32   | 120m range, 1.28M points/s, 1cm accuracy, 0.5cm precision | -

Compute for AI and Accelerated Workloads

Jetson AGX Orin

Nova Carter includes Jetson AGX Orin with high-throughput interfaces (GMSL) for high-resolution, high-speed cameras and a large number of I/O ports. This sensor connectivity enables synchronized, simultaneous camera capture and global timestamping across all sensors.
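Global timestamping is what lets downstream perception pair measurements from different sensors, for example matching each camera frame to the closest lidar scan on a shared clock. The sketch below shows that nearest-timestamp pairing with hypothetical timestamps; it is a generic illustration, not Isaac's synchronization code.

```python
import bisect

def match_nearest(target_ts, source_ts, tolerance):
    """For each target timestamp, find the nearest source timestamp within
    `tolerance` seconds (both lists in seconds, source sorted ascending).
    Hardware-synchronized global clocks make this pairing reliable.
    Returns a list of (target, source-or-None) pairs."""
    pairs = []
    for t in target_ts:
        i = bisect.bisect_left(source_ts, t)
        # Only the neighbors around the insertion point can be nearest.
        candidates = source_ts[max(0, i - 1):i + 1]
        best = min(candidates, key=lambda s: abs(s - t), default=None)
        if best is not None and abs(best - t) <= tolerance:
            pairs.append((t, best))
        else:
            pairs.append((t, None))
    return pairs

camera_ts = [0.000, 0.033, 0.066]   # hypothetical 30 Hz camera frames
lidar_ts = [0.001, 0.101]           # hypothetical 10 Hz lidar scans
pairs = match_nearest(camera_ts, lidar_ts, tolerance=0.005)
```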

Jetson AGX Orin features an NVIDIA Ampere architecture GPU and Arm Cortex-A78AE CPUs, along with next-generation deep learning and vision accelerators. High-speed interfaces, faster memory bandwidth, and multimodal sensor support let it feed multiple concurrent AI application pipelines.

Learn More

Integration With Omniverse Isaac Sim


Digital Twin

The Isaac Navigation Stack is fully integrated with NVIDIA Isaac Sim™ for software-in-the-loop development and testing. Simulated 3D lidar and stereo cameras, part of the Isaac Nova sensor suite, are included.

Learn More

Get started today with the most advanced autonomy stack for robots.

Apply For Early Access