
Build 3D Virtual Worlds at Human Scale with the NVIDIA Omniverse XR App

Users can now produce 3D virtual worlds at human scale with the new Omniverse XR App available in beta from the NVIDIA Omniverse launcher.

At Computex this week, NVIDIA announced new releases and expansions to the collaborative graphics platform's growing ecosystem, including the beta release of Omniverse XR. With this Omniverse app, creators and developers can drop into 3D scenes in VR and AR with full RTX ray tracing. Users can also review, manipulate, and annotate high-poly production assets without preprocessing.

Effortless, ray-traced VR

With the Omniverse XR App, users can load Universal Scene Description (USD) production assets directly into the editor, then jump straight into VR or AR to view and interact with them, fully ray traced, without preprocessing.

Unlike today’s interactive VR engines, Omniverse XR loads USD scenes directly for editing. This removes the need for pregenerated levels of detail since ray tracing is less sensitive to geometry complexity. 

The scene below was modeled with 70 million polygons; with instancing, the total scene contains 18 billion polygons.

Figure 1. Production-quality USD-based assets load directly into a scene in Omniverse XR.
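Instancing is what keeps a scene like this tractable: the 70 million unique polygons are referenced many times rather than copied. As a rough sketch of how this looks in USD's text format, the fragment below references a single high-poly asset twice with `instanceable` enabled; the asset path and prim names are hypothetical:

```usda
#usda 1.0

def Xform "World"
{
    # Hypothetical high-poly asset, referenced rather than duplicated.
    # Both prims share one prototype in the composed scenegraph.
    def Xform "Chair_01" (
        instanceable = true
        prepend references = @./assets/chair_highpoly.usd@
    )
    {
        double3 xformOp:translate = (0, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }

    def Xform "Chair_02" (
        instanceable = true
        prepend references = @./assets/chair_highpoly.usd@
    )
    {
        double3 xformOp:translate = (1.5, 0, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because each instance stores only a reference and a transform, the authored file stays small even when the composed scene reaches billions of polygons.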

Realism out of the box

Fully ray-traced VR contributes substantially to realism. The human eye notices when contact shadows don't match a scene's geometry or when reflections don't move with body motion.

With Omniverse XR, engineers, designers, and researchers see soft shadows, translucency, and real reflections out of the box, making the experience feel truly immersive and realistic.

Figure 2. View a complete 3D scene in VR with the Omniverse XR App.

Navigate and manipulate in VR

Omniverse XR users can teleport and manipulate scene objects in VR. Currently, Oculus and HTC Vive controllers are supported. Because Omniverse Kit renders with real-time ray tracing, ray casting comes at little extra cost, enabling higher precision in close-range interactions.

Figure 3. Achieve precise object manipulation in Omniverse XR.
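Controller-based picking of this kind generally reduces to intersecting a ray from the controller with scene geometry. The sketch below is a generic ray-sphere intersection test in plain Python, not Omniverse Kit API, just to illustrate the underlying idea:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None on a miss.

    `direction` must be a normalized 3-vector; the sphere is a stand-in
    for any pickable scene object.
    """
    # Offset of the ray origin from the sphere center.
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    # Quadratic coefficients for |origin + t*direction - center|^2 = radius^2
    # (the t^2 coefficient is 1 because direction is unit length).
    b = 2 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2  # nearest of the two roots
    return t if t >= 0 else None    # reject hits behind the ray origin

# A ray fired straight ahead hits a unit sphere 5 m away at distance 4 m.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # 4.0
```

A real engine casts such rays against acceleration structures rather than analytic spheres, but the precision benefit is the same: the hit point follows the controller exactly.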

Continuous foveated rendering

Omniverse XR uses foveated rendering. This technique shades the region of the HMD screen the user's gaze falls on at a higher rate, shrinking the number of fully rendered pixels to about 30% of the image plane; the remaining pixels are ones the user won't perceive in full resolution.

The foveation technique in Omniverse XR is built specifically for real-time ray tracing. This removes the boundaries between resolution regions and avoids reshaping the image plane when the fovea is off center.

Figure 4. Foveated rendering off (left) and on (right) in an interior kitchen scene with sunlight and shadow.
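To make the 30% figure concrete, here is a small self-contained sketch (not Omniverse code) that sizes a circular foveal region so roughly 30% of the image plane is shaded at full rate, leaving the periphery for a reduced shading rate:

```python
import math

def full_res_fraction(width, height, gaze_x, gaze_y, fovea_radius):
    """Fraction of image-plane pixels inside the circular foveal region,
    i.e. the pixels shaded at full rate; the rest get a reduced rate."""
    full = 0
    for y in range(height):
        for x in range(width):
            if math.hypot(x - gaze_x, y - gaze_y) <= fovea_radius:
                full += 1
    return full / (width * height)

# Pick a radius so that pi * r^2 is 30% of the plane's area, matching
# the figure quoted above. Resolution here is arbitrary for illustration.
w, h = 200, 200
r = math.sqrt(0.30 * w * h / math.pi)
frac = full_res_fraction(w, h, w // 2, h // 2, r)
print(f"{frac:.2f}")  # close to 0.30
```

In practice the gaze point moves with the user's eyes or head, so the region is recomputed per frame; with a continuous technique like the one described above, that recentering needs no change to the image plane itself.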

One-step AR streaming 

With Omniverse XR, you can also experience tablet AR streaming through the NVIDIA CloudXR streaming platform. This mode opens a virtual camera on the tablet, linked to the Omniverse XR session.

To use Tablet AR mode, content creators and developers can download the Omniverse Streaming Client for iOS, available from the App Store for iPads running iOS 14.5 and later, or use the APK and code examples available for Android tablets.

Figure 5. Experience tablet AR streaming.

Explore Omniverse XR

The Omniverse XR App is now available in beta from the Omniverse Launcher. To learn more, check out the tutorial video or attend the Q&A livestream on May 25 at 11 a.m. PDT / 8 p.m. CEST.

Additional resources 

Learn more by diving into the Omniverse Resource Center, which details how developers can build custom applications and extensions for the platform. 

For additional support, explore the Omniverse forums and Medium channel, follow tutorials on Twitter and YouTube, and join our Discord server to chat with the community.
