Robotics

Advanced Sensor Physics, Customization, and Model Benchmarking Coming to NVIDIA Isaac Sim and NVIDIA Isaac Lab


At COMPUTEX 2025, NVIDIA announced new updates to its robotics simulation reference application, NVIDIA Isaac Sim, and its robot learning framework, NVIDIA Isaac Lab, to accelerate the development of robots of all embodiments.

Isaac Sim and Isaac Lab will be fully compatible with NVIDIA RTX PRO 6000 workstations and servers, offering a single architecture for every robot development workload across training, synthetic data generation, robot learning, and simulation.


In this post, we further explore what’s new in Isaac Sim and Isaac Lab, which will be available in Q2 of this year, and how these updates accelerate robotics workflows.

What’s new in Isaac Sim 5.0

The new Isaac Sim, built on NVIDIA Omniverse and OpenUSD, will be open and customizable, offer faster development through NVIDIA Brev, and support advanced synthetic data generation pipelines to accelerate robotics simulation.

Open and fully customizable

Isaac Sim 5.0, available soon on GitHub, will be open, fully customizable, and extensible, enabling users to tailor simulations to specific needs, such as synthetic data generation (SDG) and software-in-the-loop testing.

Enterprise support for the underlying Omniverse Kit and commercial redistribution rights will require an NVIDIA Omniverse Enterprise license.

Get started faster with NVIDIA Brev

Isaac Sim 5.0 will be accessible through NVIDIA Brev, which provides direct access to NVIDIA GPU instances across major cloud providers. Brev gives developers an instant way to launch Isaac Sim, eliminating infrastructure overhead and accelerating iteration cycles.

Choose between a one-click launchable deployment or customize your deployment by selecting instance types and port configurations.

Figure 1. Isaac Sim Brev window

Advanced synthetic data generation pipelines

Synthetic data generated from simulation ensures accurate modeling of real-world physics, including motion data and environmental interactions. This also provides a way to capture rare scenarios that are dangerous to collect in the real world.

MobilityGen

High-quality motion data is the foundation for training smarter, more adaptable robots capable of operating safely and efficiently in real-world environments. This requires realistic motion dynamics and rich ground-truth data to develop adaptable and efficient mobility models that can generalize across robots and environments.

Video 1. MobilityGen used with Isaac Sim for synthetic data generation

MobilityGen will be available as an extension in Isaac Sim for generating diverse physics-based data and perception model training data, such as occupancy maps, robot states, poses, velocities, and images. It will also support various data collection methods, including teleoperation, automated actions, and customizable path planning. The generated data can be used to train autonomous mobile robots, quadrupeds, and humanoids; data from MobilityGen was used to train X-Mobility, an end-to-end generalizable navigation model.
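To make the idea of this data concrete, here is a minimal sketch of how robot pose logs become an occupancy map, one of the data products described above. The grid size, pose format, and function name are illustrative, not MobilityGen's actual API.

```python
# Toy example: accumulate simulated robot (x, y) poses into an
# occupancy grid, the kind of ground-truth map MobilityGen produces.
import numpy as np

def poses_to_occupancy(poses, grid_size=10, cell=1.0):
    """Mark each (x, y) pose as occupied in a grid_size x grid_size map."""
    grid = np.zeros((grid_size, grid_size), dtype=np.uint8)
    for x, y in poses:
        i, j = int(x // cell), int(y // cell)
        if 0 <= i < grid_size and 0 <= j < grid_size:
            grid[i, j] = 1
    return grid

# A short straight-line trajectory visits three distinct cells.
trajectory = [(0.5, 0.5), (1.5, 0.5), (2.5, 0.5)]
occupancy = poses_to_occupancy(trajectory)
print(int(occupancy.sum()))  # 3 occupied cells
```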

Expanding synthetic data generation to physical spaces

Video 2. Extensions in Isaac Sim to generate synthetic image and video data

Physical AI enables systems like robots, autonomous vehicles, and smart spaces to perceive and act in real-world environments such as factories, warehouses, and cities. To enable better spatial understanding of environments, developers need training data that accurately simulates what cameras in these complex 3D environments would see with multiple moving actors and objects.

The new extensions in Isaac Sim generate synthetic image and video data to train vision AI models and include:

  • Isaacsim.Replicator.Agent: Simulates human and robot characters performing actions like walking, sitting, or lifting in 3D environments.
  • Isaacsim.Replicator.Object: Generates synthetic datasets for object detection with configurable scenes and domain randomization.
  • Incident extension (coming in Isaac Sim 5.0): Produces incident-based data such as fires, spills, or falling objects.
  • Caption extension (coming in Isaac Sim 5.0): Creates image-caption pairs for training vision-language models on scene understanding.
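The domain randomization these extensions rely on can be sketched as sampling scene parameters anew for every rendered frame, so a detection model sees varied lighting, poses, and materials. The parameter names and ranges below are illustrative, not the extensions' actual configuration.

```python
# Minimal domain-randomization sketch: sample scene parameters per frame.
import random

def randomize_scene(rng):
    return {
        "light_intensity": rng.uniform(500.0, 2000.0),  # illustrative lux range
        "object_yaw_deg": rng.uniform(0.0, 360.0),
        "texture": rng.choice(["wood", "metal", "cardboard"]),
        "camera_height_m": rng.uniform(1.0, 2.5),
    }

rng = random.Random(42)  # fixed seed for repeatable datasets
frames = [randomize_scene(rng) for _ in range(3)]
for frame in frames:
    print(frame["texture"], round(frame["object_yaw_deg"], 1))
```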

Improved data format compatibility for Cosmos world foundation models 

World foundation models (WFMs) such as NVIDIA Cosmos Transfer can help augment the synthetic data generated from Isaac Sim to the photorealism required for model training. The new writer for NVIDIA Omniverse Replicator is optimized for Cosmos Transfer input, making it easy for users to generate and export high-quality synthetic data for ingestion. It supports standalone workflows and the Script Editor, and can be seamlessly integrated into existing Isaac Sim synthetic data generation scripts.

Enhanced sensor simulation 

Figure 2. Depth sensor simulation output

Accurate sensor simulation is essential for successful real-world deployment of robotics and computer vision applications. Isaac Sim will support a generic depth map noise model that enables developers to realistically simulate the noise characteristics of stereo cameras. This feature helps produce depth images with noise patterns similar to real sensor data, further improving the realism of synthetic outputs. 
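As a rough illustration of what such a noise model does: stereo depth error typically grows with distance (roughly quadratically), and real sensors also report invalid "hole" pixels. The sketch below is a generic, made-up model with illustrative coefficients, not the actual Isaac Sim implementation.

```python
# Hedged sketch of a depth-map noise model: depth-dependent Gaussian
# noise plus random pixel dropout, mimicking stereo-camera artifacts.
import numpy as np

def add_depth_noise(depth, noise_coeff=0.005, dropout_prob=0.02, seed=0):
    rng = np.random.default_rng(seed)
    # Noise std dev scales with depth^2, a common stereo error model.
    sigma = noise_coeff * depth**2
    noisy = depth + rng.normal(0.0, 1.0, depth.shape) * sigma
    # Randomly invalidate pixels, as real sensors report holes (depth 0).
    holes = rng.random(depth.shape) < dropout_prob
    noisy[holes] = 0.0
    return noisy

clean = np.full((4, 4), 2.0)  # flat surface 2 m from the camera
noisy = add_depth_noise(clean)
print(noisy.shape)
```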

Improved actuator models

Isaac Sim will support a new joint friction model defined through an OpenUSD schema, including actuator and friction parameters, developed in collaboration with Hexagon Robotics and maxon. Using real-world data from manufacturer datasheets, these models ensure that the actuation of joints and motors in simulation closely mirrors actual behavior, reducing the simulation-to-real gap. With more accurate actuator modeling, reinforcement learning (RL) policies can be better trained, leading to smoother transitions from simulation to physical hardware and more reliable robot performance in the real world.
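A classic joint friction model of the kind such a schema encodes combines a viscous term (proportional to joint velocity) and a Coulomb term (constant, opposing the direction of motion). The sketch below uses made-up parameter values, not figures from any datasheet or the actual USD schema.

```python
# Illustrative joint friction model: net torque = command torque minus
# viscous and Coulomb friction terms.
import math

def net_torque(tau_cmd, joint_vel, viscous=0.05, coulomb=0.2):
    """Viscous friction scales with velocity; Coulomb opposes its sign."""
    friction = viscous * joint_vel + coulomb * math.copysign(1.0, joint_vel)
    return tau_cmd - friction

# At +2 rad/s: friction = 0.05 * 2 + 0.2 = 0.3, so 1.0 - 0.3 = 0.7
print(net_torque(1.0, 2.0))  # 0.7
```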

Streamline robotic workflows with standardized ROS 2 interfaces and ZMQ bridge

The new standardized ROS 2 simulation interfaces address a persistent challenge in robotics: controlling different simulators through a single, consistent ROS 2 interface. Spearheaded by Robotec.ai and developed in collaboration with Gazebo, Open 3D Engine, and NVIDIA, the interfaces simplify integration by giving developers one consistent way to control their simulations from ROS 2.

The ZMQ bridge, an extension introduced in Isaac Sim 4.5, enables fast, bidirectional communication with external applications beyond ROS. It supports software-in-the-loop testing and can extend to hardware-in-the-loop on edge devices. Use cases include streaming camera data, transferring ground truth, sending control commands, and exchanging auxiliary data.
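Conceptually, a bridge like this exchanges serialized messages, control commands in one direction, state and ground truth in the other. The sketch below shows length-prefixed JSON framing of the kind commonly sent over ZMQ or raw sockets; the message fields are illustrative, not the bridge's actual wire format.

```python
# Conceptual message framing for a simulator bridge: a 4-byte length
# prefix followed by a JSON payload.
import json
import struct

def pack(msg: dict) -> bytes:
    body = json.dumps(msg).encode()
    return struct.pack(">I", len(body)) + body  # big-endian length prefix

def unpack(data: bytes) -> dict:
    (length,) = struct.unpack(">I", data[:4])
    return json.loads(data[4 : 4 + length])

# Control command in, ground-truth pose out.
cmd = pack({"type": "cmd_vel", "linear": 0.5, "angular": 0.1})
state = unpack(pack({"type": "pose", "x": 1.0, "y": 2.0, "yaw": 0.0}))
print(state["x"], state["y"])  # 1.0 2.0
```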

What’s new in Isaac Lab 2.2

The latest Isaac Lab update features benchmarking scripts for GR00T N1 models, enhanced synthetic motion data generation, faster load times with Omniverse Fabric, and improved suction cup gripper modeling.

Benchmarking and evaluating NVIDIA Isaac GR00T N models

The latest version of Isaac Lab will support environment benchmarking scripts for closed-loop evaluation of the NVIDIA Isaac GR00T N family of foundation models. Developers can load these prebuilt environments and industrial tasks, such as nut pouring and pipe sorting, to run closed-loop benchmarks.
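The shape of a closed-loop benchmark can be sketched as a policy rolled out in an environment step by step, with success aggregated across episodes. The environment, policy, and task below are trivial stand-ins, not the GR00T N benchmark scripts themselves.

```python
# Toy closed-loop evaluation harness: roll a policy out and report
# success rate over multiple episodes.

def run_episode(policy, target=5, max_steps=20):
    state = 0
    for _ in range(max_steps):
        state += policy(state)  # policy chooses an action from the state
        if state == target:
            return True         # task solved within the step budget
    return False

def success_rate(policy, episodes=10):
    return sum(run_episode(policy) for _ in range(episodes)) / episodes

def always_forward(state):
    return 1                    # trivial stand-in "policy"

print(success_rate(always_forward))  # 1.0
```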

Figure 3. A fine-tuned Isaac GR00T N1 model, post-trained with GR00T-Mimic data, deployed in Isaac Lab on the Fourier GR1 humanoid robot for a bi-manipulation pipe-sorting task

In addition to new environments and benchmarks, Isaac Lab will also support the LeRobot data format, enabling synthetic data generated with the Isaac GR00T blueprint for synthetic manipulation motion generation to be converted for post-training the Isaac GR00T N1 model. The trained policy can then be further validated in Isaac Sim against a multitude of "what-if" scenarios.

Enhanced synthetic motion generation with GR00T-Mimic 

Building on the bimanual manipulation introduced in Isaac Lab 2.1, Isaac Lab 2.2 will include pre-built environments, as demonstrated at NVIDIA GTC 2025 and shown in Figure 4.

Figure 4. Fourier GR1 humanoid robot performing complex bi-manipulation task using GR00T N1

Specifically designed for data collection, these sample environments come pre-populated with the Fourier GR1 humanoid robot, making it easier to collect synthetic motion data for training robot policy models. These new training environments use visual inputs from the humanoid robot’s point of view, along with robot state information, such as joint positions for the GR00T N model. 

There will also be sample data, scripts, prompts, and model checkpoints for augmenting synthetic data with Cosmos-Transfer1.

Improving quality of life 

The new version of Isaac Lab will use NVIDIA Fabric, an Omniverse library that enables high-performance creation, modification, and access of scene data, as well as efficient communication of scene data between the CPU, GPU, and other Fabric clients across a network.

Fabric indexes data and provides a query API, enabling systems such as physics and rendering to access only the data they need without traversing the entire scene each frame. This will result in faster load times and improved scaling, with tasks such as physics simulation and sensor data collection running in parallel with user interaction.
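The access pattern described above, querying only the attributes a consumer needs rather than traversing the whole scene, can be illustrated with a toy attribute index. The structure below is purely conceptual and is not the Fabric API.

```python
# Toy attribute index: each consumer (physics, rendering) queries only
# the attribute it needs, with no full-scene traversal per frame.
from collections import defaultdict

class SceneIndex:
    def __init__(self):
        self._by_attr = defaultdict(dict)  # attr name -> {prim path: value}

    def set(self, prim, attr, value):
        self._by_attr[attr][prim] = value

    def query(self, attr):
        """Return only the prims carrying this attribute."""
        return self._by_attr[attr]

scene = SceneIndex()
scene.set("/robot/base", "mass", 12.0)
scene.set("/robot/base", "color", "gray")
scene.set("/wall", "color", "white")

# Physics asks for masses and touches one prim, not the whole scene.
print(sorted(scene.query("mass")))  # ['/robot/base']
```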

Tensorized suction cups

Surface grippers are a major component in robot manipulation. Physically accurate simulation and tensorized access are essential to enable suction cup-based reinforcement learning tasks and control behavior. 

Previously, the surface gripper was limited to a single joint connection point, which prevented it from picking up multiple objects. It also lacked the flexibility to adjust its physical behavior, such as using suction to pull an object—it could only grip objects statically at their current positions. Additionally, each gripper had to be individually mapped for each environment, rather than using a more streamlined, tensorized access method. 

Video 3. The new and improved surface gripper in Isaac Lab

Now, Isaac Sim and Isaac Lab will let users create and configure surface grippers that are deformable, with the ability to measure grip forces and update suction-cup parameters in a tensorized fashion.
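"Tensorized" access means grip quantities for all parallel environments are computed in one vectorized operation instead of a per-environment loop. The sketch below shows that pattern with a deliberately simplified force model (pressure times cup area where the seal is engaged); the values and model are made up for illustration.

```python
# Vectorized suction grip forces across parallel environments.
import numpy as np

def grip_forces(pressures, cup_area, engaged):
    """Force = pressure * area where the cup is sealed, else 0."""
    return pressures * cup_area * engaged

# Four parallel environments; cups sealed in environments 0 and 2.
pressures = np.array([80e3, 80e3, 80e3, 80e3])  # Pa below ambient
engaged = np.array([1.0, 0.0, 1.0, 0.0])
forces = grip_forces(pressures, cup_area=1e-3, engaged=engaged)  # area in m^2
print(forces.tolist())  # [80.0, 0.0, 80.0, 0.0]
```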

NVIDIA Isaac Sim 5.0 and Isaac Lab 2.2 will be available this summer. 

Ecosystem adoption

Agility Robotics, Boston Dynamics, Fourier, Mentee Robotics, Neura Robotics, and XPENG Robotics are simulating and validating their humanoid robots using NVIDIA Isaac Sim and Isaac Lab. Skild AI is using the simulation frameworks to develop general robot intelligence, and General Robotics is integrating them into its robot simulation platform. 

Additionally, Taiwan’s leading electronics and robotics manufacturers, including Adata, Advantech, Delta Electronics, Foxconn, Foxlink, Solomon, Techman, and Wistron, are using NVIDIA Isaac Sim and Isaac Lab to develop the next wave of AI-enabled robots.

Get started developing your robotics solutions

To receive updates on the following additional resources and reference architectures that support your development goals, sign up for the NVIDIA Developer Program.

  • NVIDIA Isaac Sim, built on NVIDIA Omniverse, is a reference application enabling developers to design, simulate, test, and train AI-based robots and autonomous machines in a physically based virtual environment. 
  • NVIDIA Isaac Lab is a lightweight, open-source framework for robot learning.
  • NVIDIA Isaac Perceptor is a reference workflow for autonomous mobile robots (AMRs).
  • NVIDIA Isaac Manipulator offers new foundation models and a reference workflow for industrial robotic arms.
  • NVIDIA Isaac ROS, built on the open-source ROS 2 software framework, is a collection of accelerated computing packages and AI models, bringing NVIDIA acceleration to ROS developers everywhere.
  • NVIDIA Jetson is the leading platform for autonomous machines and embedded applications.

Watch the COMPUTEX keynote from NVIDIA founder and CEO Jensen Huang, as well as NVIDIA GTC Taipei 2025 sessions.

Tune into our upcoming OpenUSD Insiders livestream, Wednesday, May 28, at 11 am PDT for a recap of the NVIDIA Isaac Sim and Isaac Lab release and other physical AI announcements from GTC Taipei at Computex.

Stay up to date by subscribing to our newsletter and following NVIDIA Robotics on LinkedIn, Instagram, X, and Facebook. Explore NVIDIA documentation and YouTube channels, and join the NVIDIA Developer Robotics forum. To start your robotics journey, enroll in our free NVIDIA Robotics Fundamentals courses today.

Get started with NVIDIA Isaac libraries and AI models for developing physical AI systems.
