New Ray-Tracing, AI, Cloud, and Virtual World Tools Simplify Game Development at GDC 2022

This week at GDC, NVIDIA announced a number of new tools to help game developers save time, integrate NVIDIA RTX more easily, and streamline the creation of virtual worlds. Watch the overview of three exciting new tools now available.

Figure 1. NVIDIA announces a number of new tools to help game developers save time, integrate NVIDIA RTX more easily, and streamline the creation of virtual worlds.

Simplify integration with the new NVIDIA Streamline

Since NVIDIA Deep Learning Super Sampling (DLSS) launched in 2019, a variety of super-resolution technologies have shipped from both hardware vendors and engine providers. To support these various technologies, game developers have had to integrate multiple SDKs, often with varying integration points and compatibility.

Today NVIDIA is releasing Streamline, an open-source cross-IHV framework that aims to simplify integration of multiple super-resolution technologies and other graphics effects in games and applications.

Streamline offers a single integration with a plug-and-play framework. It sits between the game and the render API, and abstracts the SDK-specific API calls into an easy-to-use Streamline framework. Instead of manually integrating each SDK, developers simply identify which resources (motion vectors, depth, and so on) are required for the target super-resolution plug-ins, and then set where they want the plug-ins to run in their graphics pipeline. The framework is also extensible beyond super-resolution SDKs; for example, developers can add NVIDIA Real-time Denoisers (NRD) to their games through Streamline. By making multiple technologies easier for developers to integrate, Streamline benefits gamers with more technologies in more games.
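To make that flow concrete, here is a minimal C++ sketch of the integration pattern described above. The sl namespace, enums, and functions below (init, tagResource, evaluate) are illustrative placeholders with stub bodies so the sketch compiles; they are not the actual Streamline API, so consult the SDK on GitHub for the real entry points and signatures.

```cpp
// Minimal sketch of the Streamline-style integration flow described above.
// Everything in namespace sl is an illustrative stand-in, not the real SDK.
#include <cstdio>

namespace sl {
    enum Feature     { kFeatureDLSS, kFeatureDLAA, kFeatureNRD };
    enum ResourceTag { kTagMotionVectors, kTagDepth, kTagColorInput, kTagColorOutput };

    // Stub bodies so the sketch compiles; the real SDK provides these.
    bool init(void* device)                   { return device != nullptr; } // hook the render API once
    void tagResource(ResourceTag, void*)      {}  // hand a required buffer to the framework
    void evaluate(Feature, void* /*cmdList*/) {}  // run a plug-in at the chosen pipeline point
}

// Called once per frame from the engine's render loop.
void renderFrame(void* device, void* cmdList,
                 void* motionVectors, void* depth,
                 void* lowResColor, void* upscaledColor)
{
    static bool ok = sl::init(device);
    if (!ok) { std::puts("Streamline unavailable; using the native render path"); return; }

    // 1. Identify the resources the target super-resolution plug-in needs.
    sl::tagResource(sl::kTagMotionVectors, motionVectors);
    sl::tagResource(sl::kTagDepth,         depth);
    sl::tagResource(sl::kTagColorInput,    lowResColor);
    sl::tagResource(sl::kTagColorOutput,   upscaledColor);

    // 2. Choose where in the graphics pipeline the plug-in runs
    //    (here: after the main pass, before post-processing and UI).
    sl::evaluate(sl::kFeatureDLSS, cmdList);
}
```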

Figure 2. Streamline’s plug-and-play framework sits between the game and render API.

The Streamline SDK is available today on GitHub with support for NVIDIA DLSS and Deep Learning Anti-Aliasing, and NVIDIA Image Scaling support is coming soon. Streamline is open source, and hardware vendors can create their own plug-ins. For instance, Intel is working on Streamline support for XeSS.

Easier ray-tracing integration for games with Kickstart RT 

In 2018, the NVIDIA Turing architecture changed the game with real-time ray tracing. Now, over 150 top games and applications use RTX to deliver realistic graphics with incredibly fast performance, including Cyberpunk 2077, Far Cry 6, and Marvel’s Guardians of the Galaxy.

Available now, Kickstart RT makes it easier than ever to add real-time ray tracing to game engines, producing realistic reflections, shadows, ambient occlusion, and global illumination.

Figure 3. A Marbles scene highlighting ray-traced global illumination, ambient occlusion, and shadows, enabled through the Kickstart RT SDK.

Traditionally, game engines must bind all active materials in a scene to render ray-traced effects. Kickstart RT delivers beautiful ray-traced effects while forgoing that legacy requirement and avoiding invasive changes to existing material systems.

Kickstart RT gives developers a convenient starting point for adding realistic dynamic lighting of complex scenes to their game engines in a much shorter timespan than traditional methods. It is also helpful for those who may find upgrading their engine to the DirectX 12 API difficult.
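To illustrate why no material binding is needed, the following hypothetical C++ sketch shows a Kickstart RT-style flow: the engine registers geometry, injects its already-lit frame, and requests the ray-traced outputs it wants. The krt namespace and functions are placeholders, not the actual SDK interface; see the Kickstart RT repository for the real API.

```cpp
// Hypothetical sketch of a Kickstart RT-style integration: the engine registers
// geometry and injects its already-shaded output, then requests ray-traced
// effects. No per-material shader bindings are involved. The krt names are
// placeholders with stub bodies so the sketch compiles.
namespace krt {
    struct GeometryHandle { int id; };

    GeometryHandle registerGeometry(const void* /*vertices*/, const void* /*indices*/) { return {0}; }
    void injectDirectLighting(void* /*litSceneColor*/, void* /*depth*/)                {}
    void renderReflections(void* /*output*/)                                           {}
    void renderGI(void* /*output*/)                                                    {}
    void renderAmbientOcclusion(void* /*output*/)                                      {}
    void renderShadows(void* /*output*/)                                               {}
}

void rayTracedPass(void* litSceneColor, void* depth,
                   void* reflections, void* gi, void* ao, void* shadows)
{
    // Geometry registration happens once (or when meshes change); the SDK
    // builds and maintains the acceleration structures internally.
    static krt::GeometryHandle level = krt::registerGeometry(nullptr, nullptr);
    (void)level;

    // Feed the engine's existing lit output instead of rebinding materials...
    krt::injectDirectLighting(litSceneColor, depth);

    // ...and ask for whichever ray-traced effects the engine wants to composite.
    krt::renderReflections(reflections);
    krt::renderGI(gi);
    krt::renderAmbientOcclusion(ao);
    krt::renderShadows(shadows);
}
```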

Learn more about Kickstart RT.

Make game testing faster and easier with GeForce NOW

Figure 4. The NVIDIA GeForce NOW platform, including the GFN SDK and Playtest.

With a network of more than 30 data centers serving gamers in over 80 countries, GeForce NOW (GFN) uses the power of the cloud to take the GeForce PC gaming experience to any device. This infrastructure also provides a developer platform for studios and publishers to perform game development work virtually, starting with playtesting.

With GeForce NOW Cloud Playtest, players and observers can be anywhere in the world. Game developers can upload a build to the cloud and schedule a playtest that players and observers can manage on their calendars.

During a playtest session, a player uses the GFN app to play the targeted game while streaming their camera and microphone, and observers watch the gameplay and webcam feeds from the cloud. Developers need innovative ways to perform this vital part of game development safely and securely, and all of this is possible with Cloud Playtest on the GeForce NOW Developer Platform.

Learn more about GeForce NOW Cloud Playtest.

Creating virtual worlds with Omniverse

Virtual world simulation technology is opening new portals for game developers, and the NVIDIA Omniverse platform is bringing them a new era of opportunities.

Users can plug into any layer of the modular Omniverse stack to build advanced tools that simplify workflows, integrate advanced AI and simulation technologies, or help connect complex production pipelines. 

Figure 5. Game developers have been building virtual worlds for more than a decade (a Minecraft scene with ray-traced shadows pictured, courtesy of The Vokselians).

This GDC, see how Omniverse is supercharging game development pipelines with over 20 years of NVIDIA rendering, simulation, and AI technology.

Lastly, NVIDIA announced the #MadeInMachinima contest, in which participants remix iconic game characters into a cinematic short using the Omniverse Machinima app for a chance to win RTX-accelerated NVIDIA Studio laptops. Users can experiment with AI-enabled tools and create intuitively in real time using assets from Squad, Mount & Blade II: Bannerlord, and MechWarrior 5. The contest is free to enter.

Learn more about Omniverse for developers.

Enhanced RTX SDKs for open-world games

With Kickstart RT, it has never been easier to integrate real-time ray tracing into your games. However, for those looking for specific solutions, we’ve also updated our individual ray-tracing SDKs.

The RTX Global Illumination plug-in is now available for Unreal Engine 5 (UE5), giving developers a head start on dynamic lighting for their open worlds in UE5. The Unreal Engine 4.27 plug-in has also been updated with performance and quality improvements, while the plug-in for the NVIDIA branch of UE4 has received ray-traced reflections and translucency support alongside skylight enhancements.

Figure 6. RTX highlighted in a forest scene in Icarus.

RTX Direct Illumination has received image-quality improvements for glossy surfaces. NVIDIA Real-Time Denoisers introduces NVIDIA Image Scaling and a path-tracing mode within the sample application, along with a new performance mode optimized for lower-spec systems.

We’ve recently announced new latency-measurement enhancements for NVIDIA Reflex that can be seen in Valorant, Overwatch, and Fortnite. These new features include per-frame PC Latency measurement and automatic configuration for Reflex Analyzer. Today, the Reflex 1.6 SDK is available to all developers, making latency monitoring as easy as measuring FPS.
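Per-frame latency measurement works by having the game mark the stages of each frame so that the time from simulation to display can be reported. The C++ sketch below illustrates that idea only; the reflex::setMarker function and marker enum are placeholders with stub bodies, not the actual Reflex SDK calls.

```cpp
// Conceptual sketch of per-frame latency marking as described above.
// The reflex names are stand-ins; the shipping Reflex SDK defines the real API.
#include <cstdint>

namespace reflex {
    enum Marker { kSimulationStart, kSimulationEnd,
                  kRenderSubmitStart, kRenderSubmitEnd,
                  kPresentStart, kPresentEnd };
    void setMarker(Marker, std::uint64_t /*frameId*/) {} // stub stand-in
}

void gameLoopIteration(std::uint64_t frameId)
{
    reflex::setMarker(reflex::kSimulationStart, frameId);
    // ... game and input simulation for this frame ...
    reflex::setMarker(reflex::kSimulationEnd, frameId);

    reflex::setMarker(reflex::kRenderSubmitStart, frameId);
    // ... record and submit rendering work ...
    reflex::setMarker(reflex::kRenderSubmitEnd, frameId);

    reflex::setMarker(reflex::kPresentStart, frameId);
    // ... present the frame ...
    reflex::setMarker(reflex::kPresentEnd, frameId);
    // With every stage marked, per-frame PC latency can be reported,
    // and tools such as Reflex Analyzer can consume the same data.
}
```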

We’ve also updated our NVIDIA RTX Technology Showcase, an interactive demo, with Deep Learning Super Sampling (DLSS), Deep Learning Anti-Aliasing, and NVIDIA Image Scaling. Toggle different technologies to see the benefits AI can bring to your projects. For developers who have benefited from DLSS, we’ve updated the SDK to include a new and simpler way to integrate the latest NVIDIA technologies into their pipelines.

Latest tools and SDKs for XR development

Truly immersive XR games require advanced graphics, best-in-class lighting, and performance optimizations to run smoothly on all-in-one VR headsets. Cutting-edge XR solutions like NVIDIA CloudXR streaming and NVIDIA VRWorks are making it easier and faster for game developers to create amazing AR and VR experiences.

Developers and early access users can accurately capture and replay VR sessions for performance testing, scene troubleshooting, and more with NVIDIA Virtual Reality Capture and Replay (VCR).

Figure 7. NVIDIA CloudXR streaming on an HTC VIVE.

Attempting to repeat a user’s VR experience exactly is time-consuming and nearly impossible. VCR makes replaying VR sessions both accurate and painless. The tool records time-stamped HMD and controller inputs during an immersive VR session. Users can then replay the recording, without an HMD attached, to reproduce the session. It’s also possible to filter the recorded session through an optional processing step, cleaning up the data and removing excessive camera motion.
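As a rough illustration of that record-and-replay idea, the sketch below defines a time-stamped pose sample, an optional smoothing pass over the camera path, and a replay loop that feeds the samples back in order. The structures and functions are hypothetical and do not reflect the VCR tool’s actual file format or API.

```cpp
// Illustrative sketch of a VCR-style record/replay data flow: time-stamped HMD
// and controller samples, replayed in order, with an optional filter pass that
// smooths excessive camera motion. Not the actual VCR format or API.
#include <algorithm>
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

struct PoseSample {
    double               timestampSec;       // when the sample was captured
    std::array<float, 3> hmdPosition;        // head-mounted display position
    std::array<float, 4> hmdOrientation;     // quaternion
    std::array<float, 3> controllerPos[2];   // left/right controller positions
    std::uint32_t        buttonMask;         // controller button state
};

// Optional processing step: simple moving-average smoothing of HMD position
// to reduce excessive camera motion before replay.
std::vector<PoseSample> smooth(const std::vector<PoseSample>& in, std::size_t window)
{
    std::vector<PoseSample> out = in;
    for (std::size_t i = 0; i < in.size(); ++i) {
        std::array<float, 3> sum{0.0f, 0.0f, 0.0f};
        std::size_t lo = (i >= window) ? i - window : 0;
        std::size_t hi = std::min(in.size() - 1, i + window);
        for (std::size_t j = lo; j <= hi; ++j)
            for (int k = 0; k < 3; ++k) sum[k] += in[j].hmdPosition[k];
        for (int k = 0; k < 3; ++k)
            out[i].hmdPosition[k] = sum[k] / static_cast<float>(hi - lo + 1);
    }
    return out;
}

// Replay: feed each recorded sample back to the runtime in timestamp order,
// with no physical HMD attached.
void replay(const std::vector<PoseSample>& session,
            void (*applyPose)(const PoseSample&))
{
    for (const PoseSample& sample : session)
        applyPose(sample); // a real tool would pace this by timestampSec
}
```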

Learn more about NVIDIA XR developer solutions.

Boost performance with NVIDIA Nsight tools 

The latest Nsight Graphics 2022.2 features the brand-new Ray Tracing Acceleration Structure Instance Heatmap and Shader Timing Heatmap. The GPU Trace Profiler has been improved to help developers see when GPU activity belonging to other processes may be interfering with profiling data, ensuring that the profiling information developers get is accurate and reliable.

Figure 8. The latest version of Nsight Graphics is now available and includes numerous bug fixes, improvements and useful new features.

NVIDIA Nsight Systems is a triage and performance-analysis tool designed to track GPU workloads to their CPU origins within a system-wide view. The new Nsight Systems 2022.2 release includes support for the Vulkan graphics pipeline library as well as Vulkan memory operations and warnings. Enhancements to the multi-report view further improve the ability to compare and debug issues.

Figure 9. Building the next great game with NVIDIA Nsight tools.

Game developers can find additional free resources to recreate fully ray-traced and AI-driven virtual worlds on the NVIDIA Game Development resources page. Check out NVIDIA GTC 2022 game development sessions from this past week.
