Physical AI-powered robots need to autonomously sense, plan, and perform complex tasks in the physical world, such as transporting and manipulating objects safely and efficiently in dynamic and unpredictable environments.
Robot simulation enables developers to train, test, and validate these advanced systems through virtual robot learning and testing, all within physics-based digital representations of environments such as warehouses and factories, prior to deployment.
In this post, we’ll show you how to simulate and validate your robot stack by leveraging your ROS 2 packages with NVIDIA Isaac Sim, a reference application built on the NVIDIA Omniverse platform. We’ll also discuss use cases that Isaac Sim can unlock for AI-enabled robots.
Isaac Sim is built on Universal Scene Description (OpenUSD), the foundational framework for its simulations. As a developer, you can efficiently design, import, build, and share robot models and virtual training environments using Isaac Sim. OpenUSD is also instrumental in streamlining the connection between a robot’s brain and its virtual world through the ROS 2 interface, full-featured Python scripting, and versatile plug-ins for importing robot and environment models.
Isaac Sim with ROS 2 workflow
The Isaac Sim to ROS 2 workflow is similar to workflows executed with other robot simulators such as Gazebo. At a high level, it starts with bringing your robot model into a prebuilt Isaac Sim environment. You then add sensors to the robot, connect the relevant components to a ROS 2 action graph, and simulate the robot by controlling it through your ROS 2 packages.
URDF: A common starting point for simulation in Isaac Sim
The ROS 2 workflow in Isaac Sim usually starts with importing the robot model through the URDF importer. URDF is a widely accepted format for working with robot models in simulation tools.
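The importer can also be driven from Python scripting. Below is a minimal sketch following the Kit command pattern used by recent releases of the URDF importer; the command names may differ in your Isaac Sim version, and the URDF path is a placeholder for your own robot description.

```python
import omni.kit.commands

# Create a default import configuration, then adjust a few options.
# Command names follow recent Isaac Sim releases; verify against your
# version's URDF importer documentation.
status, import_config = omni.kit.commands.execute("URDFCreateImportConfig")
import_config.merge_fixed_joints = False  # keep each link as its own prim
import_config.fix_base = False            # mobile robot: don't weld the base to the world

# Parse the URDF and create the robot prim on the current stage.
status, robot_prim_path = omni.kit.commands.execute(
    "URDFParseAndImportFile",
    urdf_path="/path/to/my_robot.urdf",  # placeholder path
    import_config=import_config,
)
print(f"Imported robot at {robot_prim_path}")
```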
You can also use the built-in wizard to introduce additional files, data, and environments from third-party tools and services. By answering a few simple questions, you can find the relevant steps to bring the right assets, such as robot models, tools, and sensors, into your simulation scene.
Prepopulated scenes and SimReady assets
As with any robot simulation, you’ll need a scene in which to simulate your robot’s actions. Isaac Sim offers many prebuilt 3D scenes, from simple office environments to large, complex environments such as warehouses. You can also bring in more complex 3D scenes from other tools through USD Connections.
Beyond the 3D scenes, you can also leverage more than a thousand SimReady assets: physically accurate 3D objects with correct physical properties, behaviors, and connected data streams that represent the real world in simulated digital worlds.
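For instance, a prebuilt environment can be referenced onto the current stage from Python. This is a minimal sketch using Isaac Sim’s core utilities; the warehouse asset path is only an example and may vary across asset pack versions.

```python
from omni.isaac.core.utils.nucleus import get_assets_root_path
from omni.isaac.core.utils.stage import add_reference_to_stage

# Locate the Isaac Sim asset library (on Nucleus or in a local asset pack).
assets_root = get_assets_root_path()

# Reference a prebuilt warehouse scene onto the current stage.
# The exact asset path is an example; browse the asset library for others.
add_reference_to_stage(
    usd_path=assets_root + "/Isaac/Environments/Simple_Warehouse/warehouse.usd",
    prim_path="/World/Warehouse",
)
```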
Adding sensors
Sensors enable robots to perceive their environment and take appropriate action. Robots such as humanoids, manipulators, and AMRs carry multiple onboard sensors, including stereo cameras, 2D and 3D lidar, and radar, as well as physical sensors such as contact and inertial sensors.
Isaac Sim includes models of many third-party sensors from manufacturers including Intel, Orbbec, Stereolabs, HESAI, SICK, and SLAMTEC. You can also build your own custom sensors for simulation.
Leveraging NVIDIA RTX technology, you can generate photorealistic images from physically accurate simulation. These rendered images can then be used for training AI models and for software-in-the-loop testing.
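As an example, a camera sensor can be attached to a robot from Python with the Camera class (module paths vary across Isaac Sim releases); the prim path and mounting pose below are placeholders.

```python
import numpy as np
from omni.isaac.sensor import Camera  # module path varies by Isaac Sim release

# Attach an RTX-rendered camera to the robot; the prim path is a placeholder.
camera = Camera(
    prim_path="/World/my_robot/base_link/front_camera",
    translation=np.array([0.2, 0.0, 0.1]),  # 20 cm forward, 10 cm up from the link
    frequency=30,                           # capture at 30 Hz
    resolution=(1280, 720),
)
camera.initialize()

# After stepping the simulation, grab the latest rendered frame.
rgb = camera.get_rgba()
```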
Connecting to ROS 2
Isaac Sim connects to ROS 2 through the ROS 2 Bridge extension. This extension consists of various OmniGraph (OG) nodes designed for ROS developers. OG nodes are not the same as ROS nodes; each one provides an encapsulated piece of functionality that can be used in a connected graph structure to perform complex tasks in simulation.
For example, publishing simulation time to ROS 2 uses two main OG nodes: the Isaac Read Simulation Time node, which reads the current simulation time, and the ROS 2 Publish Clock node, which takes that time as input and publishes it to the /clock topic.
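In script form, that two-node graph (plus a playback tick to trigger it) can be assembled with the OmniGraph controller. This sketch follows the pattern in the Isaac Sim ROS 2 tutorials; node type names have shifted between releases, so treat them as assumptions to verify against your version.

```python
import omni.graph.core as og

# Build an action graph that reads simulation time and publishes it to /clock.
og.Controller.edit(
    {"graph_path": "/ActionGraph", "evaluator_name": "execution"},
    {
        og.Controller.Keys.CREATE_NODES: [
            # Fires on every frame while the simulation is playing.
            ("OnPlaybackTick", "omni.graph.action.OnPlaybackTick"),
            # Reads the current simulation time.
            ("ReadSimTime", "omni.isaac.core_nodes.IsaacReadSimulationTime"),
            # Publishes a rosgraph_msgs/Clock message.
            ("PublishClock", "omni.isaac.ros2_bridge.ROS2PublishClock"),
        ],
        og.Controller.Keys.CONNECT: [
            ("OnPlaybackTick.outputs:tick", "PublishClock.inputs:execIn"),
            ("ReadSimTime.outputs:simulationTime", "PublishClock.inputs:timeStamp"),
        ],
        og.Controller.Keys.SET_VALUES: [
            ("PublishClock.inputs:topicName", "/clock"),
        ],
    },
)
```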
The ROS 2 Bridge provides access to a variety of OG nodes useful for robotics tasks. These OG nodes can publish data from a simulated camera or lidar, publish the transform tree of a robot, and subscribe to velocity messages. Parameters such as queue size, topic name, context, and QoS can also be modified. Connected together, the nodes form a ROS 2 action graph that enables complex tasks like navigation and manipulation with popular ROS 2 packages.
Enabling the ROS 2 Bridge also gives access to rclpy, the ROS 2 client library for Python. When scripting in Python, you can write custom ROS 2 code containing nodes, services, and actions that directly access and modify data from the scene and the simulated robot.
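For example, a small rclpy node can run alongside the simulation. Inside Isaac Sim, it is typically pumped with spin_once from the simulation loop rather than a blocking spin; the node and topic names below are illustrative.

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import String


class SimStatusPublisher(Node):
    """Minimal node that publishes a status string from within Isaac Sim."""

    def __init__(self):
        super().__init__("sim_status_publisher")
        # Topic name is illustrative; choose one that fits your stack.
        self.pub = self.create_publisher(String, "/sim/status", 10)

    def publish_status(self, text: str):
        msg = String()
        msg.data = text
        self.pub.publish(msg)


rclpy.init()
node = SimStatusPublisher()

# Inside Isaac Sim, call these once per simulation step instead of a
# blocking rclpy.spin(), so the simulator's main loop keeps running.
node.publish_status("simulation running")
rclpy.spin_once(node, timeout_sec=0.0)
```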
ROS 2 custom message support is enabled by sourcing your workspace before launching Isaac Sim. You can also write custom Python or C++ OG nodes for task-specific requirements, for example, publishing contact sensor state on a custom message topic.
Moving to perception AI-enabled robots
The previous sections detailed the Isaac Sim to ROS 2 workflow, which should mirror your existing workflows. In the following sections, we’ll look at some Isaac Sim features for AI-enabled robots that use perception and cognition models.
Scaling from simple to complex simulations
To achieve autonomy at scale, you’ll need to simulate robots in complex and diverse environments. The OpenUSD foundation of Isaac Sim makes it highly extensible and scalable for robotics workflows. You can quickly scale your simulation from a single robot in a work cell to multiple robots in a complex environment such as a factory or warehouse by modeling all the key elements of the real facility. The video below illustrates an example of the complex simulations you can run within Isaac Sim.
Synthetic data generation for model training
Training the perception AI models that power these robots requires a lot of data. In most instances, real-world data is extremely difficult to obtain, and it may not be diverse enough to capture the multitude of scenarios and edge cases.
To overcome this data gap, sensors in Isaac Sim can be used to generate synthetic data. A key feature is domain randomization, which lets you vary many parameters in a simulation scene, including location, lighting, color, texture, and background, to generate a diverse set of training data. An additional benefit of synthetic data is that you can iterate quickly to improve model KPIs.
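With Isaac Sim’s Replicator library, a domain randomization pipeline can be scripted in a few lines. The sketch below randomizes prop poses and lighting on every captured frame and writes RGB images with bounding-box labels; the prop asset path and output directory are placeholders.

```python
import omni.replicator.core as rep

# Placeholder asset; substitute a prop from your scene or the SimReady library.
PROP_USD = "omniverse://localhost/Props/crate.usd"

with rep.new_layer():
    camera = rep.create.camera(position=(0, 0, 4), look_at=(0, 0, 0))
    render_product = rep.create.render_product(camera, (1024, 1024))

    # On every captured frame, scatter props and randomize the lighting.
    with rep.trigger.on_frame(num_frames=100):
        props = rep.randomizer.instantiate([PROP_USD], size=8)
        with props:
            rep.modify.pose(
                position=rep.distribution.uniform((-2.0, -2.0, 0.0), (2.0, 2.0, 0.0)),
                rotation=rep.distribution.uniform((0, 0, 0), (0, 0, 360)),
            )
        rep.create.light(
            light_type="Dome",
            intensity=rep.distribution.uniform(500, 2000),
        )

    # Write RGB frames and tight 2D bounding boxes for training.
    writer = rep.WriterRegistry.get("BasicWriter")
    writer.initialize(output_dir="_out_sdg", rgb=True, bounding_box_2d_tight=True)
    writer.attach([render_product])
```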
FoundationPose, a unified foundation model for object pose estimation and tracking, was trained purely on synthetic data and can be deployed without any fine-tuning. Scaling up from a single robot to larger scenes, synthetic data has also been used to develop AI models that enhance operational efficiency in retail and warehouse environments.
Multi-agent software-in-loop testing
Facilities and warehouses usually house multiple types of robots, such as industrial arms, AMRs, and even humanoids, that need to perform complex tasks autonomously. Isaac Sim can be used to test and validate the behavior and performance of an entire fleet of robots, each running its own perception stack, across a multitude of scenarios that would otherwise be difficult to cover in the real world.
More workflows and use cases
Isaac Sim can be extended to additional workflows and use cases, giving you the flexibility to create custom robotics simulators and extensions tailored to your specific needs. This versatility supports advanced robot learning techniques and scalable training, making it adaptable to a wide range of use cases.
Robot learning
Isaac Sim also extends to robot learning, which is required to ensure that a robot can perform its tasks repeatably and efficiently. For example, Isaac Lab is a lightweight, open-source reference framework built on Isaac Sim for photorealistic, fast simulation. Isaac Lab uses methods such as reinforcement learning and imitation learning, and gives developers a way to scale robot policy training across multi-GPU and multi-node systems.
Custom simulators and extensions
Isaac Sim can also be used to build your own custom robotics simulators or extensions for your use cases. Foxglove, a member of the NVIDIA Inception Program for startups, builds visualization tools that allow robotics developers to gain actionable insights, promoting more effective collaboration, verification, and faster iteration.
The Foxglove extension uses the WebSocket protocol to seamlessly link any Isaac Sim project to a Foxglove visualization interface. It automatically detects all cameras, IMUs, and articulations in the simulation stage and makes their data, along with the complete transform tree, available in Foxglove. To prevent excessive data flow, sensors are queried only when the user toggles them on in Foxglove.
Get started with robotic simulation
This post has explained how to connect your existing ROS workflows to Isaac Sim for testing and validation. We also explored some of the features and workflows Isaac Sim enables for synthetic data generation and software-in-the-loop testing of robots powered by perception AI.
To get started with NVIDIA Isaac Sim, download the standard license for free. To get started with ROS workflows on Isaac Sim, check out these resources:
- Isaac Sim ROS 2 Reference Architecture
- Introductory Tutorials on Isaac Sim and ROS/ROS 2
- Robotics Simulation Use Case
- Introduction to Robotic Simulations in Isaac Sim Self-Paced Course
Join the robotics community on the NVIDIA Developer forums, Discord server, and YouTube channels. Stay up to date on LinkedIn, Instagram, X, and Facebook.