The NVIDIA® Isaac Software Development Kit (SDK) is a developer toolbox for accelerating the development and deployment of AI-powered robots. The SDK includes the Isaac Robot Engine, packages with high-performance robotics algorithms, and hardware reference applications. It will accelerate robot development for manufacturers, researchers and startups by making it easier to add AI for perception and navigation into next-generation robots.

Isaac SDK is currently available as a developer preview to the robotics community. The Isaac SDK team would like to receive feedback and suggestions to make Isaac SDK the robotics toolkit you need. We welcome sensor or robotics platform suppliers to join the Isaac ecosystem. You can post your questions on our developer forum, which is moderated by our product team.

The current release, version 2019.1, marks the beginning of a rolling release schedule for Isaac SDK. We will incrementally add new features and improvements, and we plan to release new versions of Isaac SDK about every three months. The next release, 2019.2, is planned for June 2019.


Overview of Isaac SDK and Sim.

ROBOT ENGINE

The Isaac Robot Engine is a framework to easily build modular robotics applications. It enables high-performance image processing and deep learning for intelligent robots.

Using computational graphs and an entity component system, the Robot Engine allows developers to break down a complex robotics use case into small building blocks. Robotics applications developed on the Isaac Robot Engine can seamlessly run on edge devices such as the NVIDIA® Jetson AGX Xavier and NVIDIA® Jetson Nano, as well as on a workstation with a discrete NVIDIA® GPU.
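
The sketch below illustrates the building-block idea: a single codelet that ticks periodically as one component in an application graph. It is modeled on the C++ codelet tutorials shipped with the SDK; the header path, base class, and registration macro are assumptions and may differ in your release.

```cpp
// Minimal codelet sketch, modeled on the Isaac SDK C++ tutorials
// (header path, base class, and macro names are assumptions).
#include "engine/alice/alice_codelet.hpp"

// A codelet is one building block in the application's compute graph.
class Ping : public isaac::alice::Codelet {
 public:
  void start() override {
    // Tick at the period configured for this component in the application graph.
    tickPeriodically();
  }
  void tick() override {
    // Application logic goes here; complex use cases are decomposed into many
    // such codelets connected by message channels.
    LOG_INFO("ping");
  }
};

ISAAC_ALICE_REGISTER_CODELET(Ping);
```

A larger application is then a graph of such components, connected through message edges in its configuration.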

The Isaac Robot Engine also comes with a customizable visualization framework for creating visualizations such as variable plots, drawings, camera image overlays, and 3D scenes. Developers can use Isaac WebSight to inspect and debug their robotics applications in a web browser. Other capabilities include experimental recording and replay of sensor data, as well as message passing over network sockets.
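
As a small illustration, a codelet can publish a numeric variable so that WebSight plots it over time. The show() helper used below is an assumption based on the Sight documentation, and the codelet and variable names are hypothetical.

```cpp
// Sketch of plotting a variable in WebSight from a codelet
// (show() is assumed from the Sight documentation; names are hypothetical).
#include "engine/alice/alice_codelet.hpp"

class VoltageMonitor : public isaac::alice::Codelet {
 public:
  void start() override { tickPeriodically(); }
  void tick() override {
    const double voltage = 12.6;        // hypothetical measurement
    show("battery_voltage", voltage);   // appears as a time-series plot in WebSight
  }
};

ISAAC_ALICE_REGISTER_CODELET(VoltageMonitor);
```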


Compute Graph

WebSight

GEMS

GEMS are modular capabilities for sensing, planning, or actuation that can be easily plugged into a robotics application. For example, developers can add obstacle detection, stereo depth estimation, or human speech recognition to enrich their robot use cases. Similarly, developers can add the Isaac navigation stack, a collection of modules for building robot navigation, to their robot; a sketch of consuming a GEM's output is shown after the list below.


GEMS currently available in Isaac SDK:

  • Perception
    • Stereo Depth Estimation
    • Stereo Visual Odometry
    • Scan-based Localization
    • Object Detection
    • AprilTag Detection
  • Sensing and Actuation
    • Segway
    • Velodyne VLP16
    • Various Camera Drivers
  • Planning
    • LQR Path Planning
  • Human-Machine Interface
    • Audio Keyword Detection
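
To sketch how a GEM plugs into an application, the codelet below ticks whenever an upstream detection GEM publishes a message on a channel wired to it in the application graph. The ISAAC_PROTO_RX macro, tickOnMessage(), the Detections2Proto message type, and the header paths follow the messaging tutorials and should be treated as assumptions.

```cpp
// Sketch of consuming a GEM's output (channel macros, message type, and header
// paths are assumptions based on the Isaac SDK messaging tutorials).
#include "engine/alice/alice_codelet.hpp"
#include "messages/messages.hpp"  // assumed aggregate header for generated protos

class DetectionLogger : public isaac::alice::Codelet {
 public:
  void start() override {
    // Tick once per message arriving on the "detections" channel, which is
    // connected to the object-detection GEM by an edge in the application graph.
    tickOnMessage(rx_detections());
  }
  void tick() override {
    LOG_INFO("received a detection message from the upstream GEM");
  }

 private:
  // Assumed receiving-channel declaration for the detection GEM's output.
  ISAAC_PROTO_RX(Detections2Proto, detections);
};

ISAAC_ALICE_REGISTER_CODELET(DetectionLogger);
```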

GEMS

REFERENCE APPLICATIONS

Carter, the delivery robot

Carter

Carter is an Isaac SDK reference robot platform for autonomous indoor delivery and logistics, based on the Jetson platform.

Developers can build their own Carter robot using the sample applications. NVIDIA Isaac SDK helps developers with localization, autonomous navigation, map editing, path planning, and state machines.

Kaya, the robot for getting started with robotics

Kaya

Developers who would like to get started with Isaac SDK can build their own small robot using the Kaya reference robot platform.

Kaya can be built using off-the-shelf components and 3D-printed parts. The low-cost Kaya robot takes advantage of the new NVIDIA® Jetson Nano. Sample applications built for the platform include AprilTag detection, obstacle avoidance, remote operation, object classification based on YOLO, and more.

ISAAC SIM

Isaac Sim is a virtual robotics laboratory and a high-fidelity 3D world simulator. It accelerates research, design, and development in robotics by reducing cost and risk. Developers can quickly and easily train and test their robots in detailed, highly realistic scenarios.

Developers can use virtual robots with simulated sensors (RGB, stereo, depth, segmentation, LIDAR, IMU) in Isaac Sim to test their applications in a high-fidelity simulation environment. Once tested, applications can be deployed to NVIDIA® Jetson AGX Xavier™, Jetson™ TX2, or Jetson Nano™ running on physical robots.

Robots in Isaac Sim are tightly coupled to the tools and frameworks in Isaac SDK, enabling easy transfer of algorithms and data between physical and virtual robots.

Navigation with Carter

Isaac Sim includes three environments for testing the navigation stack of the Carter robot: Warehouse, Hospital, and Office. These environments include human figures walking around and performing environment-specific actions to create critical scenarios for testing and debugging navigation.

Domain Randomization

Using synthetic data and controlling the randomization parameters in a scene, developers can create large, customized datasets. In Isaac Sim, developers can seamlessly randomize lighting, object materials, object colors, object poses, and camera properties.

Robot Builder

The most common robot model format is URDF (Unified Robot Description Format). Isaac Sim has an embedded URDF loader that allows you to load the URDF model of your robot into Isaac Sim and simulate its joints and movements. The URDF models for several mobile bases and manipulators have been tested in Isaac Sim.

Kaya

Kaya is the smallest member of the Isaac family of robots. It can be controlled by joystick in Isaac Sim. There are fun experiments developers can perform in simulation with Kaya, such as pushing objects, object detection, and the Follow Me sample application.