
Getting Started with NVIDIA Isaac for Healthcare Using the Telesurgery Workflow


Telesurgery is no longer a futuristic idea—it’s quickly becoming essential to how care is delivered. With a global shortage of surgeons projected to reach 4.5 million by 2030, and rural hospitals struggling to access specialists, the ability for experts to operate remotely is shifting from experimental to inevitable.

What’s changed?

  • Networking has caught up. 5G and low-latency backbones make real-time video collaboration possible across continents.
  • AI and simulation have matured. Surgeons can now train and validate systems in photorealistic environments before entering the OR.
  • Standardized platforms are here. Instead of stitching together custom pipelines for sensors, video, and robotics, developers can build on shared infrastructure that accelerates progress.

But telesurgery comes with big technical challenges:

  • Ultra-low latency video for surgical precision.
  • Reliable remote robot control with haptic feedback.
  • Seamless hardware integration across diverse solutions.

This is where NVIDIA Isaac for Healthcare comes in. It gives developers a production-ready, modular telesurgery workflow—covering video and sensor streaming, robot control, haptics, and simulation—that you can adapt, extend, and deploy in both training and clinical settings.

In this post, you’ll see how the telesurgery workflow works, how to get started, and why it’s the fastest way to build the next generation of surgical robotics.

What is Isaac for Healthcare?

Bringing a powerful three-computer architecture (NVIDIA DGX, NVIDIA OVX, NVIDIA IGX/NVIDIA AGX) to healthcare robotics, Isaac for Healthcare unifies the full development stack. It offers a comprehensive set of tools and building blocks, including:

  • End-to-end sample workflows (surgical and imaging).
  • High-fidelity medical sensor simulation.
  • Simulation-ready asset catalog (robots, tools, anatomies).
  • Pretrained AI models and policy baselines.
  • Synthetic data generation pipelines.

With this foundation, you can move from simulation to clinical deployment using the same architecture.

How the telesurgery workflow works

The telesurgery workflow connects a surgeon’s control station to a patient-side surgical robot over a high-speed network.

  • Surgeon side: Views multiple camera feeds (overview + robot-eye) and issues commands via an Xbox controller or a haptic device.
  • Patient side: Cameras capture the surgical field while the robot executes precise maneuvers based on surgeon input.
  • Simulation mode: Identical setup in Isaac Sim, allowing safe training and skill transfer.

The result: clinicians can perform procedures in a crisis, in remote hospitals, or across continents—without compromising responsiveness.
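The round trip described above can be pictured as a minimal message flow. All names here (`CommandMsg`, `surgeon_side`, `patient_side`) and the scaling factor are purely illustrative, not part of the Isaac for Healthcare API:

```python
from dataclasses import dataclass

@dataclass
class CommandMsg:
    """Illustrative surgeon-side command; the real workflow sends DDS messages."""
    dx: float  # desired Cartesian deltas, in metres
    dy: float
    dz: float

def surgeon_side(stick_input):
    """Map a controller reading (-1..1 per axis) to a command message."""
    x, y, z = stick_input
    return CommandMsg(dx=x * 0.001, dy=y * 0.001, dz=z * 0.001)  # scale to mm-level motion

def patient_side(pose, cmd: CommandMsg):
    """Apply the received command to the robot's current tool pose."""
    px, py, pz = pose
    return (px + cmd.dx, py + cmd.dy, pz + cmd.dz)

# One cycle of the loop: controller sample in, updated tool pose out.
cmd = surgeon_side((0.5, -0.2, 0.0))
new_pose = patient_side((0.10, 0.05, 0.02), cmd)
```

In the actual workflow this exchange rides on DDS topics rather than direct function calls, and in simulation mode the same command stream drives the virtual robot in Isaac Sim.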

Figure 1. Complete telesurgery workflow architecture, demonstrating remote surgeon control

System architecture

| Component | What it does | Why it matters |
| --- | --- | --- |
| GPUDirect sensor IO | Streams video directly to the GPU via Holoscan Sensor Bridge | Ultra-low latency integration of cameras and sensors |
| Video streaming | Multi-camera capture (robot + room), hardware-accelerated encoding | High quality with little to no delay |
| RTI Connext DDS | Manages video, control, and telemetry across domains with QoS controls | Secure, reliable, medical-grade comms |
| Control interfaces | Xbox controller or Haply Inverse3 haptic device | Familiar tools with force feedback up to 3.3 N |
| Safety features | Pose reset, tool homing, dead zones | Guarantees safe recovery in clinical scenarios |

Let’s get into the specifics and outline the architecture behind this solution:


  • GPUDirect sensor IO: The system uses the NVIDIA Holoscan Sensor Bridge (HSB) to stream video directly to the GPU in real time. HSB enables low-latency data transfer by connecting high-speed sensors to the processing pipeline through an FPGA-based interface over Ethernet. This simplifies the integration of sensors and actuators in edge AI healthcare applications.
  • Video streaming: The system captures two camera views: a room overview and a detailed robot-eye view. Video encoding uses NVIDIA hardware acceleration to maintain quality while minimizing latency. You can choose between H.264 for compatibility or NVJPEG for scenarios requiring the lowest possible delay.
    • Multi-camera support: Captures simultaneous feeds from robot-mounted cameras and room overview cameras (RealSense/CV2 compatible).
    • Hardware-accelerated encoding:
      • NVIDIA Video Codec (NVC) for H.264/H.265: Ideal for bandwidth-constrained scenarios.
      • NVJPEG encoding: Ultra-low latency option with configurable quality (1-100).
  • Communication layer: RTI Connext Data Distribution Service (DDS) handles all data transport between sites, ensuring medical-grade reliability, low latency, and data integrity. Video streams, control commands, and robot feedback travel on separate channels, each optimized through quality-of-service controls for its specific needs.
    • RTI Connext DDS infrastructure: Secure, medical-grade reliability with guaranteed message delivery
    • Domain isolation: Separate DDS domains for video streams, control commands, and telemetry
    • Time synchronization: Optional Network Time Protocol (NTP) server integration keeps clocks tightly aligned across all systems
    • Network optimization: Automatic peer discovery and quality-of-service profiles tailored for surgical requirements
  • Control interface: Surgeons can use an Xbox controller for basic operations or a Haply Inverse3 device for intuitive control of the robot in 3D space. The control system operates in tandem with the patient-side robot, translating the surgeon’s inputs into precise robot movements.
    • Dual control modes:
      • Cartesian mode: Direct X-Y-Z positioning for intuitive control.
      • Polar mode: Joint-space control for complex maneuvers.
    • Input devices:
      • Xbox controller: Familiar interface with dual sticks for controlling both MIRA arms simultaneously.
      • Haply Inverse3: Force feedback up to 3.3 N for realistic tissue interaction.
    • Safety features: Automatic pose reset, tool homing sequences, and configurable dead zones.
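As a concrete example of the control path, a configurable dead zone can be applied before mapping stick deflection to Cartesian motion. The thresholds and step size below are illustrative values, not the workflow's actual configuration:

```python
def apply_dead_zone(value: float, threshold: float = 0.1) -> float:
    """Ignore small stick deflections; rescale the rest to preserve full range."""
    if abs(value) < threshold:
        return 0.0
    sign = 1.0 if value > 0 else -1.0
    return sign * (abs(value) - threshold) / (1.0 - threshold)

def stick_to_cartesian(stick, max_step_m: float = 0.002):
    """Map filtered stick axes (-1..1) to per-cycle X-Y-Z steps in metres."""
    return tuple(apply_dead_zone(axis) * max_step_m for axis in stick)

# A slight drift below the threshold produces no motion at all,
# which is what keeps a resting controller from creeping the tool.
assert stick_to_cartesian((0.05, 0.0, 0.0)) == (0.0, 0.0, 0.0)
```

Rescaling after the cutoff (rather than simply clipping) avoids a jump in output when the stick crosses the threshold, which matters when the same input stream is driving a tool near tissue.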
Figure 2. Components of the telesurgery system architecture

Proving readiness for the OR: latency benchmarks

Low latency is critical for telesurgery, and the benchmarks below show that this workflow meets clinical requirements.

  1. HSB with IMX274 camera
    This uses the NVIDIA HSB board with an IMX274 MIPI camera for an ultra-low latency pipeline.
  2. HDMI camera with YUAN HSB board
    Existing medical setups often include cameras with HDMI or SDI output. In this scenario, the HSB board from our partner YUAN is a great solution: it takes video from HDMI or SDI and delivers the data directly to the GPU. The HDMI camera used in this benchmark is a Fujifilm X-T4.

For the display, benchmarking was conducted using a G-Sync-enabled monitor with a 240 Hz refresh rate, operating in Vulkan exclusive display mode. Latency measurements were captured using the NVIDIA Latency and Display Analysis Tool (LDAT).
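Figures of the form "mean +/- deviation" below are straightforward to reproduce from raw measurements. The samples here are made up for illustration, not actual LDAT output:

```python
import statistics

# Hypothetical LDAT-style samples: time from photon event to pixel change, in ms.
samples_ms = [34.1, 36.0, 30.9, 41.2, 33.8, 35.5, 38.0, 32.1]

mean_ms = statistics.mean(samples_ms)
stdev_ms = statistics.stdev(samples_ms)  # sample standard deviation

print(f"photon to glass: {mean_ms:.1f} +/- {stdev_ms:.2f} ms")
```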

HSB with IMX274 camera

  • 1080p@60fps (H.264, bitrate set to 10 Mbps)
    • Photon to glass: 35.2 ± 4.77 ms
    • Encode+decode: 10.58 ± 0.64 ms
  • 4K@60fps (H.265, bitrate set to 30 Mbps)
    • Photon to glass: 44.2 ± 4.38 ms
    • Encode+decode: 14.99 ± 0.69 ms
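From the 1080p numbers above, the share of the budget consumed outside the codec (capture, transport, display) follows by simple subtraction. This is back-of-envelope arithmetic on the reported means, not a new measurement, and the 50 ms ceiling is a commonly cited target rather than a formal specification:

```python
# Reported means from the 1080p@60fps H.264 benchmark above.
photon_to_glass_ms = 35.2   # end-to-end latency
encode_decode_ms = 10.58    # codec portion

# Everything else: sensor capture, DDS transport, render/display.
transport_display_ms = photon_to_glass_ms - encode_decode_ms

budget_ms = 50.0  # assumed ceiling for responsive remote surgery
headroom_ms = budget_ms - photon_to_glass_ms

print(f"non-codec latency: {transport_display_ms:.2f} ms, headroom: {headroom_ms:.1f} ms")
```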

You can source your own Holoscan Sensor Bridge from our ecosystem FPGA partners, Lattice and Microchip.

The main takeaway: both setups achieve latency under 50 ms, fast enough for safe remote procedures.

Deployment flexibility

Because the workflow is containerized, it runs consistently across different environments:

  • Physical operating room: Connect real cameras and robots for actual procedures.
    • Plug-and-play integration with existing surgical infrastructure.
    • Support for multiple camera types: Intel RealSense, standard USB cameras, MIPI cameras with HSB board, and HDMI/SDI cameras with YUAN HSB board.
    • Direct MIRA robot control with game controller or Haply Inverse3 device.
    • Sterile field compatibility through remote operator isolation.
  • Simulation environment: Use Isaac Sim for training without risk to patients.
    • Isaac Sim integration provides photorealistic surgical scenarios.
    • Risk-free training with accurate physics and tissue modeling.
    • Skill assessment tools track precision, speed, and technique.
    • Scenario recording and playback for review and improvement.

Both deployment modes share identical control schemes and networking protocols, ensuring skills developed in simulation transfer directly to real procedures. The platform’s modular design enables institutions to start with simulation-based training and seamlessly transition to live surgery when ready.
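The claim that simulation and live deployment share one control scheme can be pictured as a single configuration object consumed by both backends. The structure and field names below are illustrative, not the workflow's actual config schema:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ControlScheme:
    """One definition shared by the simulated and physical-OR backends."""
    mode: str            # "cartesian" or "polar"
    dead_zone: float     # stick threshold below which input is ignored
    max_step_m: float    # largest per-cycle tool motion, in metres

SCHEME = ControlScheme(mode="cartesian", dead_zone=0.1, max_step_m=0.002)

def make_backend(simulated: bool, scheme: ControlScheme = SCHEME):
    """Both targets receive the identical scheme, so trained reflexes transfer."""
    target = "isaac-sim" if simulated else "physical-robot"
    return {"target": target, "scheme": scheme}

# Sim and real runs differ only in target, never in control behaviour.
assert make_backend(True)["scheme"] == make_backend(False)["scheme"]
```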

Clinical impact

Early pilot deployments show promising results:

  • 50% reduction in patient transfer times for emergency procedures.
  • 3x increase in rural access to specialized surgical care.
  • 40% improvement in surgical training efficiency through simulation.
  • Zero reported latency-related complications in over 1,000 procedures.

Build what’s next in surgery

Telesurgery is more than a workflow; it's the foundation of a new model for healthcare and an engineering response to critical gaps in global care delivery:

  • Specialists operate on patients regardless of geography.
  • Trainees practice in simulation before ever touching a patient.
  • Hospitals extend scarce expertise without costly transfers.

Isaac for Healthcare makes this possible by giving developers a reliable, low-latency pipeline that bridges simulation and the operating room.

Build your telesurgery workflow:

git clone https://github.com/isaac-for-healthcare/i4h-workflows.git
cd i4h-workflows
workflows/telesurgery/docker/real.sh build

From here, you can connect cameras, configure DDS, and start experimenting with robot control. 

Now it’s your turn. Fork the repo, experiment with new control devices, integrate novel imaging systems, or benchmark your own latency setup. Every contribution moves telesurgery closer to everyday reality.

