Building XR Applications on PC for High-Quality Graphics

When it comes to creating immersive, virtual environments, users want the experience to look as realistic and lifelike as possible. And while all-in-one (AIO) headsets provide mobility and freedom for VR users, they don’t always have enough power to render photorealistic scenes with accurate physics and lighting.

Using the cloud and professional GPUs, you can generate realistic graphics for immersive environments. NVIDIA CloudXR delivers the best of both worlds, combining the mobility of AIO headsets with the power of NVIDIA RTX GPUs:

  • Enhance the VR experience on headsets by connecting them to NVIDIA GPUs
  • Bring AI, photorealistic rendering, and accurate physics to VR environments with NVIDIA RTX technology
  • Achieve high-quality displays on devices with modest compute to create greater immersion

Learn more about the three major benefits that you can experience when building XR apps on PCs with NVIDIA RTX GPUs.

Use higher-fidelity models, geometry, and materials

Most developers create VR environments that can run seamlessly on any platform. But there are many instances where a higher level of graphical fidelity is needed.

The graphical quality, and the complexity, of VR scenes has a huge impact on the effectiveness of the experience. Scenarios like virtual training for an operating room or running a digital twin are highly complex, and the massive number of objects in these scenes requires the compute power that only PCs can deliver.

By creating those complex environments with a GPU, you can take the graphics experience to the next level. Developing applications on PC enables higher fidelity and depth of information, as well as larger, fully dynamic environments. NVIDIA RTX provides the combination of speed, performance, and massive memory that enables VR applications to easily handle large, complex models in photorealistic environments.

For example, the developers at Brightline have created a multi-user, dynamic training system with a scenario builder. Because the content is fully dynamic and instructor-driven, and the environment highly realistic, the teams at Brightline used NVIDIA RTX to spend more time on functionality and accuracy, and less time on optimization and graphical shortcuts to hit VR frame rates with the modest compute available on mobile devices.

Brightline used NVIDIA CloudXR over a wireless connection, allowing for full-scale rendering with accurate and dynamic lighting, coupled with the benefits of wireless freedom and inside-out tracking offered by AIO devices.

“With NVIDIA CloudXR, we’re able to take advantage of NVIDIA RTX GPUs at the edge, so a user with an AIO headset or other resource-constrained device can experience an accurately lit dynamic world,” said Jason Powers, Chief Creative Technologist at Brightline.
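Conceptually, this kind of remote rendering boils down to a pose-in, frame-out loop: the headset streams tracking poses to a server that owns the RTX GPU, and the server renders, encodes, and streams frames back for the headset to decode and display. The sketch below illustrates that flow in plain Python with placeholder functions; it is not the CloudXR API, which handles the transport, video codecs, and latency hiding for you.

# Minimal sketch of the pose-in, frame-out loop behind remote XR rendering.
# All function names here are illustrative placeholders, not the NVIDIA
# CloudXR API.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple      # head position (x, y, z) in meters
    orientation: tuple   # orientation quaternion (x, y, z, w)

def render_frame(pose: Pose) -> bytes:
    # Server side: an RTX-class renderer would rasterize or ray trace the
    # scene for this pose. Here we just return a dummy frame buffer.
    return b"\x00" * (1024 * 1024 * 4)

def encode(frame: bytes) -> bytes:
    # Server side: hardware video encode before streaming over the network.
    return frame[:1024]  # placeholder for a compressed bitstream

def stream_loop(poses):
    # Each iteration: the headset sends its latest pose, the server renders
    # at full quality, encodes the frame, and streams it back to the client.
    for pose in poses:
        frame = render_frame(pose)   # heavy lifting stays on the server GPU
        yield encode(frame)          # the thin client only decodes and displays

if __name__ == "__main__":
    demo_poses = [Pose((0.0, 1.6, 0.0), (0.0, 0.0, 0.0, 1.0))] * 3
    for packet in stream_loop(demo_poses):
        print(f"streamed {len(packet)} bytes")

The key design point is that the headset’s job shrinks to tracking, decoding, and display, which is why even modest AIO devices can show frames rendered at full RTX quality.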

Create graphics with higher-quality lighting, from reflections to refractions

You can develop applications to be of the highest visual fidelity by taking full advantage of what the latest GPUs, SDKs, and APIs have to offer. You can use advanced technologies like NVIDIA DLSS and NVIDIA RTXGI to create XR environments with realistic lighting, incredible reflections, and accurate shadows through real-time ray tracing.

Teams can also apply physically based rendering (PBR) shaders and materials, with techniques such as bump mapping, ambient occlusion, and roughness mapping to bring finer details to life. And with NVIDIA RTX and NVIDIA CloudXR, these XR experiences can be streamed from the cloud, so you don’t have to worry about users running the experience on an older workstation.
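As a concrete illustration of what a roughness map actually drives, the following sketch evaluates the GGX (Trowbridge-Reitz) normal distribution term used by most PBR specular models. Production shaders evaluate this per pixel on the GPU; this standalone Python version only shows how the roughness value shapes the specular highlight.

import math

def ggx_distribution(n_dot_h: float, roughness: float) -> float:
    # GGX normal distribution function D(h); alpha is the conventional
    # perceptual-roughness-to-linear remap (alpha = roughness^2).
    alpha = roughness * roughness
    alpha2 = alpha * alpha
    denom = n_dot_h * n_dot_h * (alpha2 - 1.0) + 1.0
    return alpha2 / (math.pi * denom * denom)

# Peak of the specular lobe (half vector aligned with the surface normal):
# smoother surfaces concentrate reflected energy into a far brighter,
# tighter highlight, while rough surfaces spread it out.
for roughness in (0.1, 0.5, 0.9):
    print(roughness, round(ggx_distribution(1.0, roughness), 3))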

Many developers are already using these technologies to bring high-quality graphics to life. For example, the team at Agile Lens relies on powerful engines such as Unreal Engine, which provides advanced capabilities for PC VR experiences. For a project at ViveCon, Agile Lens built a MetaHuman demo for the Vive Focus 3 AIO device. Getting the character and theater environment to look and run well on the standalone device required an enormous amount of work, but with NVIDIA CloudXR the team could easily go back to the highest-quality versions of the digital human and environment. They were able to show two experiences: one running locally, tethered to a desktop computer on-site, and one streaming wirelessly from the cloud. No one could tell the two apart.

“From the audience’s side, experiencing live mo-capped MetaHumans in high-fidelity environments using RTXGI and DLSS in Unreal Engine 5 was a breathtaking experience,” said Alex Coulombe, CEO of Agile Lens: Immersive Design. “And on the developer side, there was freedom in knowing everyone who accessed the show would be streaming their experience from a perfect copy of a cloud virtual machine with an NVIDIA RTX A6000, which means that powerful config is the only one that must be tested—no one will be limited to running the experience from their 10-year old laptop.”

Video 1: The same Nanite map in Unreal Engine 5.1 running on Quest and desktop VR (Source: Agile Lens)

Develop environments with enhanced AI-based interactions

Combining AI and XR enables users to interact in immersive environments just as they would in the real world. Developers today are trying to represent immersive environments accurately and make interactions as natural as possible to improve users’ comfort levels and productivity.

Project Aurora is an example of a purpose-built NVIDIA platform where AI-driven virtual assistance will be integrated into XR environments. Users can also interact with realistic AI avatars for a much more engaging, interactive experience. With AI technologies, 3D characters can see, hear, understand, and communicate with people. And with solutions like NVIDIA Omniverse Avatar Cloud Engine (ACE), you can easily customize and deploy interactive avatars.

Omniverse ACE is a suite of cloud-native AI microservices that make it easier to build and deploy intelligent virtual assistants and digital humans at scale. These interactive avatars can be designed for organizations across industries such as gaming, entertainment, transportation, and hospitality, enabling teams to enhance existing workflows and unlock new business opportunities.
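To make the microservice idea concrete, the snippet below shows the general shape of calling such a service over HTTP from an application. The endpoint, payload, and field names are hypothetical placeholders invented for this sketch, not the actual Omniverse ACE interfaces.

import json
import urllib.request

# Placeholder address for whatever avatar service is deployed; hypothetical.
AVATAR_SERVICE_URL = "http://localhost:8000/v1/avatar/say"

def ask_avatar(text: str) -> dict:
    # Send an utterance to the service and return its JSON response, which
    # might contain synthesized speech and facial-animation data.
    payload = json.dumps({"utterance": text}).encode("utf-8")
    request = urllib.request.Request(
        AVATAR_SERVICE_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Assumes a service is actually listening at the placeholder address above.
print(ask_avatar("Welcome to the showroom. How can I help?"))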

In the future, AI-based content will become more prominent in XR experiences. From synthetic data generation to image recognition, AI will help you build immersive content filled with physically accurate details and photorealistic graphics. And new technologies will enable AI to better understand real-world content, so it can create synthetic versions for virtual worlds.

Learn more at GTC, taking place from March 20-23. Registration is free—join us and see how NVIDIA CloudXR, NVIDIA RTX, and other technologies are being used to develop XR applications.

Do you have Early Access feedback on NVIDIA CloudXR? Please share with our engineers in the CloudXR forum.

Featured image: Brockman Hall for Opera, courtesy of Agile Lens and Fisher Dachs Associates
