
Fast-Track Robot Learning in Simulation Using NVIDIA Isaac Lab

Robots need to be adaptable, readily learning new skills and adjusting to their surroundings. Yet traditional training methods can limit a robot’s ability to apply learned skills in new situations. This is often due to the gap between perception and action, as well as the challenges in transferring skills across different contexts.

NVIDIA Isaac Lab is an open-source, modular framework for robot learning that addresses these limitations. Its high-fidelity simulations provide diverse training environments for physical AI, backed by GPU-accelerated physics.

Isaac Lab supports both imitation learning (mimicking humans) and reinforcement learning (learning through trial and error), providing flexible training approaches for any robot embodiment. It offers a user-friendly environment for building training scenarios, helping robot makers add or update robot skills as business needs change.

Many industry collaborators are using Isaac Lab to train humanoid robots efficiently. These include Fourier Intelligence, whose GR1 humanoid robot has human-like degrees of freedom, and Mentee Robotics, whose MenteeBot is built for household-to-warehouse applications.

Figure 1. The Fourier Intelligence humanoid robots trained using NVIDIA Isaac Lab

ORBIT-Surgical is a simulation framework based on Isaac Lab that trains robots like the da Vinci Research Kit (dVRK) to assist surgeons and reduce their mental workload. The framework uses reinforcement learning and imitation learning, running on NVIDIA RTX GPUs, to enable robots to manipulate both rigid and soft objects. Additionally, NVIDIA Omniverse helps generate high-fidelity synthetic data that can be used to train AI models for segmenting surgical tools in real-world hospital operating room videos. 

Boston Dynamics is using Isaac Lab and NVIDIA Jetson AGX Orin to enable simulated policies to be directly deployed for inference, simplifying the deployment process. For more information, see Closing the Sim-to-Real Gap: Training Spot Quadruped Locomotion with NVIDIA Isaac Lab.

This post provides an overview of key NVIDIA Isaac Lab features and a preview of the Isaac Lab ecosystem collaborators in the NVIDIA Humanoid Robot Developer Program. It also explains how to scale robot workflows using the NVIDIA OSMO platform.

Isaac Lab features for accelerated robot learning

Key features available in Isaac Lab include reinforcement and imitation learning for seamless and effective robot policy training, fast and accurate physics simulation provided by PhysX, tiled rendering APIs for vectorized rendering, domain randomization for improving robustness and adaptability, and support for running in the cloud.

Multiple robot training techniques in one tool: Isaac Lab is a simulation application that enables robot learning through reinforcement learning and imitation learning.

Figure 2. NVIDIA Isaac Lab enables multiple embodiments with multiple trained policies in a single environment

Reinforcement learning (RL): Robots learn through trial and error, making them more adaptable to new situations and potentially exceeding human performance on some tasks. However, RL can be slow and requires carefully designed reward functions to guide the robot’s learning. Isaac Lab supports RL through wrappers for different RL libraries, which convert environment data into the argument and return types each library expects.

Figure 3. MenteeBot trained using NVIDIA Isaac Lab reinforcement learning framework
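
As a rough illustration, the sketch below creates a task and wraps it for an RL library. The module paths, helper functions, wrapper class, and task ID are assumptions that change between Isaac Lab releases, so check the isaac-sim/IsaacLab documentation for your version before running it.

```python
# A minimal sketch, assuming Isaac Lab's Python modules are installed.
# Module paths, helpers, and task IDs are assumptions; they vary across releases.
import argparse

from omni.isaac.lab.app import AppLauncher  # assumed launcher for Isaac Sim

# The simulator must be running before environment modules are imported.
parser = argparse.ArgumentParser()
AppLauncher.add_app_launcher_args(parser)
args_cli = parser.parse_args()
simulation_app = AppLauncher(args_cli).app

import gymnasium as gym

import omni.isaac.lab_tasks  # noqa: F401  # importing registers bundled tasks with Gymnasium (assumed)
from omni.isaac.lab_tasks.utils import parse_env_cfg  # assumed config helper
from omni.isaac.lab_tasks.utils.wrappers.rsl_rl import RslRlVecEnvWrapper  # assumed RSL-RL adapter

task_name = "Isaac-Cartpole-v0"  # example task ID; your install may expose different tasks
env_cfg = parse_env_cfg(task_name, num_envs=64)

# The wrapper converts observations, rewards, and resets into the argument and
# return types the RL library (here RSL-RL) expects.
env = gym.make(task_name, cfg=env_cfg)
env = RslRlVecEnvWrapper(env)

env.close()
simulation_app.close()
```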

Imitation learning: Robots learn by mimicking human demonstrations. This method is ideal for tasks with specific movements or behaviors, requiring less data and leveraging human expertise. Imitation learning support comes through the Robomimic learning framework, with demonstration data saved in HDF5 format.
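
As a hedged example, the snippet below inspects a demonstration file with h5py. The group and dataset names follow a robomimic-style layout (a data group of demo_* episodes with an actions dataset); the exact keys and the file path are assumptions that depend on how your data was collected.

```python
# Minimal sketch: inspecting recorded demonstrations stored in HDF5.
# Group/dataset names follow a robomimic-style layout and may differ in your file.
import h5py

with h5py.File("demos.hdf5", "r") as f:  # hypothetical file path
    demos = f["data"]
    print(f"episodes: {len(demos)}")
    for name, episode in demos.items():
        actions = episode["actions"][...]  # (num_steps, action_dim) array
        print(f"{name}: {actions.shape[0]} steps, action dim {actions.shape[1]}")
```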

Flexibility with task design workflows: Build robot training environments in two ways, manager-based or direct. With the manager-based workflow, you can easily switch out different parts of the environment. To optimize performance for complex logic, the direct workflow is recommended.
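
The sketch below is a conceptual illustration of the manager-based idea only; the class names are hypothetical and are not Isaac Lab’s actual configuration API. The point is that swapping behavior means replacing one configurable piece rather than rewriting the environment.

```python
# Conceptual sketch only (not Isaac Lab's actual classes): a manager-based environment
# is composed from swappable configuration pieces.
from dataclasses import dataclass, field


@dataclass
class RewardManagerCfg:  # hypothetical reward manager configuration
    terms: dict = field(default_factory=lambda: {"upright": 1.0, "energy": -0.01})


@dataclass
class ManagerBasedEnvCfg:  # hypothetical environment configuration
    rewards: RewardManagerCfg = field(default_factory=RewardManagerCfg)


# Changing behavior means swapping a manager config, not rewriting the environment.
cfg = ManagerBasedEnvCfg(rewards=RewardManagerCfg(terms={"tracking": 2.0}))
print(cfg.rewards.terms)
```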

Tiled rendering: Isaac Lab is currently the only industry tool offering high-fidelity rendering for robot learning, helping reduce the sim-to-real gap. Tiled rendering reduces rendering time by consolidating input from multiple cameras into a single large image. It provides a streamlined API for handling vision data, where the rendered output directly serves as observational data for simulation learning.
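
Conceptually, the tiled output maps back to a batch of per-environment observations. The sketch below shows that mapping with illustrative shapes; it is not Isaac Lab’s internal tile layout or API.

```python
# Conceptual sketch of tiled rendering: many per-environment camera views are packed
# into one large image, then sliced back into a batched observation tensor.
import torch


def split_tiled_image(tiled: torch.Tensor, rows: int, cols: int) -> torch.Tensor:
    """Split a (H*rows, W*cols, C) tiled render into (rows*cols, H, W, C) observations."""
    big_h, big_w, c = tiled.shape
    h, w = big_h // rows, big_w // cols
    tiles = tiled.reshape(rows, h, cols, w, c).permute(0, 2, 1, 3, 4)
    return tiles.reshape(rows * cols, h, w, c)


# Example: a single 512x512 RGB render holding a 4x4 grid of 128x128 camera views.
obs = split_tiled_image(torch.zeros(512, 512, 3), rows=4, cols=4)
print(obs.shape)  # torch.Size([16, 128, 128, 3])
```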

Multi-GPU and multi-node support: For complex reinforcement learning environments, it may be desirable to scale up training across multiple GPUs. This is possible in Isaac Lab through the use of the PyTorch distributed framework.

Figure 4. Scale training across multiple GPUs in Isaac Lab through the PyTorch distributed framework
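
As a rough sketch of the pattern behind Figure 4, the code below shows the generic PyTorch distributed setup that one process per GPU would run. An actual Isaac Lab multi-GPU run is launched through its own training scripts, so treat this as illustrative only.

```python
# Generic PyTorch distributed setup: one process per GPU, launched by a tool such as
# torchrun, which sets RANK, WORLD_SIZE, and LOCAL_RANK in the environment.
import os

import torch
import torch.distributed as dist


def init_distributed() -> int:
    """Initialize the process group and bind this process to its local GPU."""
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)
    return local_rank


if __name__ == "__main__":
    local_rank = init_distributed()
    # Each rank steps its own batch of simulated environments and synchronizes
    # gradients (or rollout statistics) with the other ranks during learning.
    print(f"rank {dist.get_rank()} of {dist.get_world_size()} on GPU {local_rank}")
    dist.destroy_process_group()
```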

Vectorized APIs: Tap into enhanced View APIs with improved usability, eliminating the need to pre-initialize buffers or cache indices for different objects, and adding support for multiple view objects in the scene.

Easy deployment to public clouds: Supports deployment on AWS, GCP, Azure, and Alibaba Cloud, with Docker integration for efficient RL task execution in containers, as well as seamless scaling of multi-GPU and multi-node jobs using OSMO.

Accurate physics simulation: Tap into the latest GPU-accelerated PhysX version through Isaac Lab, including support for deformables, for fast and accurate physics simulation augmented by domain randomization.
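
As a hedged illustration of domain randomization, the sketch below samples per-environment physics parameters. The parameter names and ranges are made up for the example and are not Isaac Lab’s randomization API.

```python
# Hedged sketch of domain randomization: sample per-environment physics parameters
# (friction, mass offsets, disturbance velocities) so a policy trained in simulation
# tolerates variation in the real world. Names and ranges are illustrative only.
import torch


def randomize_physics(num_envs: int, device: str = "cpu") -> dict[str, torch.Tensor]:
    """Draw one set of physics parameters per simulated environment."""
    return {
        "friction": torch.empty(num_envs, device=device).uniform_(0.4, 1.2),
        "added_base_mass": torch.empty(num_envs, device=device).uniform_(-1.0, 3.0),   # kg
        "push_velocity": torch.empty(num_envs, 2, device=device).uniform_(-0.5, 0.5),  # m/s
    }


params = randomize_physics(num_envs=4096)
print({name: tuple(value.shape) for name, value in params.items()})
```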

Scale robot workflows with NVIDIA OSMO

NVIDIA OSMO is a cloud-native workflow orchestration platform that helps to orchestrate, visualize, and manage a range of tasks. These include generating synthetic data, training foundation models, and implementing software-in-the-loop systems for any robot embodiment. 

With NVIDIA OSMO, enterprises can train robots efficiently without extensive in-house IT expertise. Request early access to NVIDIA OSMO.

Figure 5. Orchestrate, visualize, and manage tasks with the cloud-native NVIDIA OSMO platform

AI foundation models for humanoid robot learning

NVIDIA Project GR00T is an initiative to develop general-purpose foundation models for humanoid robots. Isaac Lab is enabling industry collaborators to perform robot learning, including 1X, The AI Institute, Boston Dynamics, ByteDance Research, Field AI, Fourier, Galbot, LimX Dynamics, Mentee, NEURA Robotics, RobotEra, and Skild AI.

The complexity of modeling humanoid dynamics increases exponentially with each added degree of freedom. RL and imitation learning are the only scalable ways to develop policies for humanoids that work across a wide variety of tasks and environments. 

Video 1. Streamline robot training for physical AI using NVIDIA Isaac Sim, Isaac Lab, OSMO, and GR00T

Enterprises building robotic applications with humanoid robots are invited to apply to the NVIDIA Humanoid Robot Developer Program.

Get started 

NVIDIA Isaac Lab offers modular capabilities with customizable environments, sensors, and training scenarios, along with techniques like reinforcement learning and imitation learning that help any robot embodiment learn quickly from demonstrations.

Ready to get started training your robot embodiment policies faster? Isaac Lab is open-sourced under the BSD-3 license and is available through isaac-sim/IsaacLab on GitHub. 

If you’re an existing NVIDIA Isaac Gym (predecessor of Isaac Lab) user, we recommend migrating to Isaac Lab to ensure you have access to the latest advancements in robot learning and a powerful development environment to accelerate your robot training efforts. 

At SIGGRAPH 2024, NVIDIA CEO Jensen Huang sat down for fireside chats with Meta founder and CEO Mark Zuckerberg and WIRED Sr. Writer Lauren Goode. Watch the fireside chats and other sessions from NVIDIA at SIGGRAPH 2024 on demand.

Stay up to date by subscribing to our robotics developer newsletter, and following NVIDIA Robotics on YouTube, Discord, and developer forums.
