How XR Works
Extended reality, a cornerstone of spatial computing, works by blending the physical and digital worlds using immersive technologies like augmented reality (AR), virtual reality (VR), and mixed reality (MR). AR overlays digital content onto the real world, VR fully immerses users into a virtual environment, and MR seamlessly combines real and virtual elements, enabling interaction between the two. These experiences are powered by advanced hardware like NVIDIA GPUs, optimized drivers, and versatile SDKs, which process complex data in real time to deliver lifelike visuals, synchronized interactions, and responsive environments.
Explore XR Technologies
NVIDIA CloudXR
NVIDIA CloudXR™ is a solution for streaming virtual, augmented, and mixed reality content from any OpenVR application running on a remote server, whether in the cloud, a data center, or at the edge.
NVIDIA Omniverse
NVIDIA Omniverse™ is a development platform for building 3D applications and services that enable seamless collaboration and creation in immersive 3D environments.
NVIDIA Professional VR Ready Solutions
NVIDIA VR Ready systems deliver unparalleled performance for smooth, immersive virtual reality experiences.
NVIDIA VR Capture and Replay
NVIDIA Virtual Reality Capture and Replay (VCR) enables developers to accurately capture and replay VR content for performance testing, scene quality control, and more.
NVIDIA VRWorks
NVIDIA VRWorks™ enables VR application and headset developers to create amazing virtual reality experiences, improving VR performance and visual quality.
NVIDIA Warp
Warp enables Python developers to create GPU-accelerated 3D simulation workflows that drive ML pipelines in PyTorch, JAX, PhysicsNeMo, and NVIDIA Omniverse™.
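As a quick illustration of the Warp programming model, here is a minimal sketch, assuming Warp is installed (`pip install warp-lang`): a Python function decorated with `@wp.kernel` is JIT-compiled and launched on the GPU over an array of particles. The particle-integration kernel, array sizes, and values below are illustrative assumptions, not taken from an NVIDIA sample.

```python
import numpy as np
import warp as wp

wp.init()  # initialize Warp and detect available devices


@wp.kernel
def step(positions: wp.array(dtype=wp.vec3),
         velocities: wp.array(dtype=wp.vec3),
         dt: float):
    # Advance each particle by one explicit Euler step.
    tid = wp.tid()
    positions[tid] = positions[tid] + velocities[tid] * dt


n = 1024  # illustrative particle count
positions = wp.zeros(n, dtype=wp.vec3)
velocities = wp.array(np.full((n, 3), [0.0, -9.8, 0.0]), dtype=wp.vec3)

# Warp compiles the kernel to native GPU (or CPU) code on first launch.
wp.launch(step, dim=n, inputs=[positions, velocities, 1.0 / 60.0])
print(positions.numpy()[:3])
```

Because Warp arrays interoperate with NumPy, PyTorch, and JAX tensors, kernels like this can sit directly inside a larger ML pipeline.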
NVIDIA XR AI
NVIDIA XR AI is a platform that connects XR devices—such as lightweight AR/AI glasses or head-mounted displays—to your organization’s full computational power, enabling spatially aware, intelligent agents to operate seamlessly across cloud, data center, workstation, and edge deployments.
Developer Starter Kits
Explore templates and resources to develop solutions that utilize XR.
Spatial Streaming for Omniverse Digital Twins
Learn more about streaming immersive OpenUSD-based Omniverse digital twins to the Apple Vision Pro with this reference workflow.
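The digital twins in this workflow are described in OpenUSD. For orientation, the following is a minimal sketch of authoring a tiny OpenUSD stage with the `pxr` Python bindings (assuming the `usd-core` package is installed); the file name, prim paths, and proxy geometry are hypothetical placeholders and are not part of the reference workflow itself.

```python
from pxr import Usd, UsdGeom, Gf

# Create a new stage to hold the (placeholder) digital twin content.
stage = Usd.Stage.CreateNew("factory_twin.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.y)

# Root transform for the scene.
world = UsdGeom.Xform.Define(stage, "/World")

# A stand-in prim for twin geometry that would normally be referenced in.
proxy = UsdGeom.Cube.Define(stage, "/World/RobotArmProxy")
proxy.GetSizeAttr().Set(0.5)
UsdGeom.XformCommonAPI(proxy.GetPrim()).SetTranslate(Gf.Vec3d(0.0, 0.25, 0.0))

stage.GetRootLayer().Save()
```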
Robotic Data Capture and Generation
Explore how Jetson AGX Thor, Cosmos, Isaac GR00T, and Apple Vision Pro’s spatial computing power advance humanoid robotics in this blueprint.
The GR00T-Teleop stack is in invite-only early access. Join our Humanoid Developer Program for access when it becomes available in beta.
