NVIDIA RTX Advances with Neural Rendering and Digital Human Technologies at GDC 2025

A Zorah sample still showing a partially shaded courtyard with ornate pillars and plants growing everywhere.

AI is transforming how we experience our favorite games. It is unlocking new levels of visuals, performance, and gameplay possibilities with neural rendering and generative AI-powered characters. With game development becoming more complex, AI is also playing a role in helping artists and engineers realize their creative visions.

At GDC 2025, NVIDIA is building upon NVIDIA RTX Kit, our suite of neural rendering technologies; expanding NVIDIA ACE, a toolkit for AI-powered game characters; and showcasing how generative AI amplifies the capabilities of game developers, helping you create bigger, more amazing worlds for gamers to enjoy.

Neural shading support coming to DirectX in April 

At CES, NVIDIA introduced RTX Neural Shaders, bringing small neural networks into programmable shaders to improve image quality, boost performance, and reduce system resources. The applications are broad—from textures and materials to lighting.

Today, NVIDIA and Microsoft announced that neural shading support will be coming to DirectX 12 through the Agility SDK Preview in April 2025. The DirectX update will enable you to access RTX Tensor Cores from within shaders to achieve incredible image quality and performance gains. It will help enable AI workloads by optimizing matrix-vector operations, which are crucial for AI training, fine-tuning, and inference. 

“Microsoft is adding Cooperative Vectors support to DirectX and HLSL, starting with a preview this April. This will advance the future of graphics programming by enabling neural rendering across the gaming industry,” said Shawn Hargreaves, Direct3D developer manager at Microsoft. “Unlocking Tensor Cores on NVIDIA RTX will enable developers to fully leverage RTX Neural Shading for richer, more immersive experiences on Windows.” 
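
To make that workload concrete, the following plain C++ sketch evaluates the kind of tiny two-layer MLP a neural shader runs per pixel; cooperative vectors exist to map exactly this matrix-vector pattern onto Tensor Cores. This is an illustration, not the DirectX API, and the layer sizes and weights are assumptions.

```cpp
// Minimal sketch (plain C++, not the DirectX API): the per-pixel work of a
// tiny two-layer MLP, the matrix-vector pattern that cooperative vectors
// let shaders hand off to Tensor Cores.
#include <algorithm>
#include <array>
#include <cstdio>

constexpr int IN = 4, HID = 8, OUT = 3;

// Hypothetical weights; a real neural shader would load trained values.
float w1[HID][IN], b1[HID], w2[OUT][HID], b2[OUT];

std::array<float, OUT> evalNeuralShader(const std::array<float, IN>& x) {
    float h[HID];
    for (int i = 0; i < HID; ++i) {               // hidden layer: W1*x + b1
        float acc = b1[i];
        for (int j = 0; j < IN; ++j) acc += w1[i][j] * x[j];
        h[i] = std::max(acc, 0.0f);               // ReLU activation
    }
    std::array<float, OUT> y{};
    for (int i = 0; i < OUT; ++i) {               // output layer: W2*h + b2
        float acc = b2[i];
        for (int j = 0; j < HID; ++j) acc += w2[i][j] * h[j];
        y[i] = acc;
    }
    return y;                                     // e.g., an RGB value
}

int main() {
    std::array<float, IN> uvAndView{0.25f, 0.75f, 0.1f, 0.9f};
    auto rgb = evalNeuralShader(uvAndView);
    std::printf("%f %f %f\n", rgb[0], rgb[1], rgb[2]);
}
```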

NVIDIA RTX technologies made more accessible for game developers

Figure 1. A still of the Zorah demo from the NVIDIA RTX Branch of Unreal Engine

NVIDIA RTX Kit, a suite of neural rendering technologies, has added a new RTX Texture Streaming SDK and improvements to RTX Texture Filtering. 

RTX Texture Streaming divides textures into smaller tiles and efficiently manages and loads them based on need. It minimizes memory overhead while ensuring that high-resolution textures are delivered exactly when and where they’re needed, optimizing both performance and visual quality. 

This new SDK makes it possible for RTX Neural Texture Compression to decompress and cache only the portions of textures actually accessed by the GPU, reducing unnecessary memory usage.
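
The residency pattern is straightforward to picture in code. Below is a hypothetical C++ sketch, not the RTX Texture Streaming API: tiles are decompressed on first access, cached, and evicted in least-recently-used order under a fixed budget. All type names are illustrative.

```cpp
// Hypothetical sketch of on-demand tile residency: only tiles the renderer
// actually samples are decompressed and cached; least-recently-used tiles
// are evicted under a memory budget. Not the RTX Texture Streaming API.
#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>
#include <utility>
#include <vector>

struct TileKey {
    uint32_t mip, x, y;
    bool operator==(const TileKey& o) const { return mip == o.mip && x == o.x && y == o.y; }
};
struct TileKeyHash {
    size_t operator()(const TileKey& k) const { return (size_t(k.mip) << 40) ^ (size_t(k.x) << 20) ^ k.y; }
};

class TileCache {
    size_t budget_;                                   // max resident tiles
    std::list<TileKey> lru_;                          // front = most recent
    std::unordered_map<TileKey,
        std::pair<std::vector<uint8_t>, std::list<TileKey>::iterator>,
        TileKeyHash> tiles_;

    std::vector<uint8_t> decompress(const TileKey&) { // stand-in for real decode
        return std::vector<uint8_t>(64 * 64 * 4);     // one 64x64 RGBA tile
    }
public:
    explicit TileCache(size_t budget) : budget_(budget) {}

    const std::vector<uint8_t>& fetch(const TileKey& k) {
        auto it = tiles_.find(k);
        if (it != tiles_.end()) {                     // hit: refresh LRU order
            lru_.splice(lru_.begin(), lru_, it->second.second);
            return it->second.first;
        }
        if (tiles_.size() >= budget_) {               // miss at capacity: evict
            tiles_.erase(lru_.back());
            lru_.pop_back();
        }
        lru_.push_front(k);
        return tiles_.emplace(k, std::make_pair(decompress(k), lru_.begin()))
                     .first->second.first;
    }
};

int main() {
    TileCache cache(256);
    cache.fetch({0, 3, 7});                           // loaded on first access
    cache.fetch({0, 3, 7});                           // served from cache
    std::printf("tile resident\n");
}
```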

RTX Texture Filtering has expanded its DLSS support and added a new rasterization mode with deferred lighting.

Lastly, the RTX Path Tracing code sample has been updated with full DLSS 4 integration and the transformer model for dramatic improvements to image quality in real-time, path-traced sample scenes.

For Unreal Engine 5 developers, RTX Mega Geometry is now available in the NVIDIA RTX Experimental Branch of Unreal Engine 5 (NvRTX). RTX Mega Geometry is the NVIDIA technology for path-tracing massive worlds of dense geometry.

“Unreal Engine empowers developers to create the most detailed, immersive worlds imaginable,” said Brian Karis, graphics engineering fellow at Epic Games. “RTX Mega Geometry enables ray tracing with extreme detail and geometric complexity in a way that wasn’t possible before.”

Many of these technologies can be seen in the NVIDIA neural rendering technology demo, Zorah, built using the NVIDIA RTX Branch of Unreal Engine.

Video 1. Zorah | Neural Rendering, Powered by GeForce RTX 50 Series and AI

Zorah has been updated with new scenes to demonstrate the latest real-time path tracing capabilities. The Zorah sample features the following technologies:

  • ReSTIR PT
  • ReSTIR DI
  • RTX Mega Geometry

This sample demonstrates how you can harness these technologies to render incredibly detailed scenes with millions of triangles and cinematic lighting in real time. 
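
The core building block shared by the ReSTIR variants is weighted reservoir sampling: each pixel streams many candidate samples (light samples, for instance) but keeps only one, chosen with probability proportional to its weight. Here is a minimal C++ sketch of that update, not NVIDIA’s implementation:

```cpp
// Minimal sketch of the weighted reservoir update at the heart of ReSTIR:
// stream candidate samples and keep one, chosen with probability
// proportional to its weight. Not NVIDIA's implementation.
#include <cstdio>
#include <random>

struct Reservoir {
    int   sample    = -1;    // index of the surviving candidate
    float weightSum = 0.0f;  // sum of all candidate weights seen
    int   count     = 0;     // number of candidates streamed

    // Returns true if the new candidate replaced the current sample.
    bool update(int candidate, float weight, std::mt19937& rng) {
        weightSum += weight;
        ++count;
        std::uniform_real_distribution<float> u(0.0f, 1.0f);
        if (weightSum > 0.0f && u(rng) < weight / weightSum) {
            sample = candidate;
            return true;
        }
        return false;
    }
};

int main() {
    std::mt19937 rng(42);
    Reservoir r;
    // Stream 32 hypothetical light candidates with arbitrary weights.
    std::uniform_real_distribution<float> w(0.0f, 1.0f);
    for (int i = 0; i < 32; ++i) r.update(i, w(rng), rng);
    std::printf("kept light %d of %d (weight sum %.2f)\n",
                r.sample, r.count, r.weightSum);
}
```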

Get started with the latest NVIDIA RTX Kit updates now.

DLSS 4 now available in over 100 games and applications

Video 2. Lost Soul Aside | Launching with DLSS 4 and Ray Tracing

NVIDIA DLSS 4 has become the gold standard for AI super resolution and frame generation, boosting frame rates by up to 8x over native rendering. DLSS 4 Multi Frame Generation is now available in over 100 titles, including Marvel Rivals, Indiana Jones and the Great Circle, and FragPunk. This makes DLSS the most rapidly adopted NVIDIA game technology of all time. For more information about DLSS 4 from our Advanced Deep Learning and Research team, see DLSS 4: Transforming Real-Time Graphics with AI.

To experience the full suite of DLSS 4 technologies, see the NVIDIA Amusement Park Demo. It is an interactive sample built on NvRTX 5.4. NVIDIA provides both the source project, which can be opened against the latest NvRTX 5.4 engine code, and a prebuilt binary that doesn’t require any compiling.

For Unreal Engine 5 developers, access DLSS 4 plugins today. 

NVIDIA RTX Remix officially released

Video 3. NVIDIA RTX Remix | Remaster the Classics with DLSS 4 and Neural Rendering

RTX Remix is an open-source platform that enables modders to create stunning NVIDIA RTX remasters of classic games. Since the beta in January 2024, over 30K modders have used Remix across hundreds of games, with over a million gamers playing RTX Remix mods.

At GDC 2025, RTX Remix is now officially released with a host of state-of-the-art graphical technologies and an upgraded neural renderer. RTX Remix is an easy way for game developers and hobbyists to explore and preview some of our latest technologies. 

With its full release, RTX Remix features DLSS 4. In addition, every RTX Remix mod will also now include the world’s first neural shader, RTX Neural Radiance Cache (NRC), which uses neural networks to estimate indirect light with greater accuracy and performance. NRC trains as you play, learning during your game session to build the most accurate indirect lighting profile for your specific content.
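
NRC’s query/update loop can be sketched without the neural network. The stand-in below swaps the network for a hash-keyed running average, which is not how NRC works internally, but it shows the same access pattern: query the cache to terminate paths early, and nudge cached values toward freshly traced results as the session progresses.

```cpp
// Stand-in sketch of a radiance cache's query/update loop. NRC itself is a
// small neural network trained online during play; this swaps the network
// for a hash-keyed running average to show the same access pattern.
#include <cstdint>
#include <cstdio>
#include <unordered_map>

// Quantized query location: world-space cell plus a coarse direction bin.
struct Key {
    int32_t x, y, z, dir;
    bool operator==(const Key& o) const { return x == o.x && y == o.y && z == o.z && dir == o.dir; }
};
struct KeyHash {
    size_t operator()(const Key& k) const {
        return size_t(k.x) * 73856093u ^ size_t(k.y) * 19349663u
             ^ size_t(k.z) * 83492791u ^ size_t(k.dir);
    }
};

class RadianceCache {
    std::unordered_map<Key, float, KeyHash> cache_;
    float alpha_ = 0.1f;                       // plays the role of a learning rate
public:
    // Query: return the cached indirect-light estimate (0 if never seen),
    // letting the renderer terminate a path early instead of tracing on.
    float query(const Key& k) const {
        auto it = cache_.find(k);
        return it == cache_.end() ? 0.0f : it->second;
    }
    // Update: nudge the cached value toward a freshly traced estimate,
    // loosely analogous to NRC's per-frame training on live gameplay samples.
    void update(const Key& k, float tracedRadiance) {
        float& v = cache_[k];
        v += alpha_ * (tracedRadiance - v);
    }
};

int main() {
    RadianceCache cache;
    Key k{12, 4, -7, 3};
    for (int frame = 0; frame < 10; ++frame)
        cache.update(k, 1.5f);                 // converges toward the estimate
    std::printf("cached radiance: %.3f\n", cache.query(k));
}
```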

RTX Remix also brings RTX Skin, one of the first implementations of sub-surface scattering in ray-traced gaming. Light can now transmit and propagate through materials and characters, grounding them in a realism previously unattainable. 

You can also punctuate the look of your mods with the first availability of RTX Volumetrics, the new NVIDIA technique for rendering highly defined volumetrics. RTX Volumetrics uses a volume-based Reservoir Spatio-Temporal Importance Resampling (ReSTIR) algorithm to create high-contrast volumetric light and shadows, rendering crystal clear beams of light that look great and can help create epic moments during gameplay.

All the new graphical and neural rendering technology can be experienced in the public demo of Half-Life 2 RTX, releasing March 18 from Orbifold Studios, a community team of over 100 mod artists and developers. Half-Life 2 RTX is fully path-traced, with nearly every asset and particle remade in high quality and with PBR materials that interact realistically with the lighting.

Video 4. Half-Life 2 RTX | Demo with Full Ray Tracing and DLSS 4 Announce

New NVIDIA ACE models drive more human-like autonomous game characters

The future of games will be living worlds powered by AI. From non-playable characters (NPCs) that you can converse with in natural language to AI assistants that analyze your gameplay in real time to help you improve, the possibilities for gameplay-changing experiences are countless.

Making these living worlds a reality requires a massive stack of AI models that enable game characters to simulate human decision-making. 

NVIDIA ACE is a suite of AI-powered digital human technologies, including language, speech, animation, and vision models. The suite continues to grow, adding new language and perception AI models, upgrading facial animation with a new diffusion architecture, and updating ACE plugins for Maya and Unreal Engine (coming soon). Now it’s even easier to enhance games using your favorite digital content creation tools and game engines. For more information about building with ACE, see NVIDIA ACE for Games.

Today, NVIDIA made available a new NVIDIA ACE-fused vision language model (VLM), Nemovision-4B Instruct, that can describe on-screen information while taking human input to give more realistic and accurate responses. Deployment of Nemovision-4B Instruct is simplified for cloud or on-device use through the NVIDIA In-Game Inferencing SDK (NVIGI).

There are many benefits to a fused VLM within a gaming pipeline:

  • It can capture visual effects not hard-coded into the game engine, especially in multiplayer experiences that have more dynamic elements. 
  • It can pull real-time visual data rather than reading directly from game state, providing a more believable, human-like experience instead of one with artificial or superhuman response times (see the sketch after this list). 
  • It can help you prototype new concepts effectively to quickly assess their value before full production integration. 
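
As a rough illustration of where a fused VLM sits in a game loop, consider the hypothetical C++ sketch below. The VlmClient type and its describe call are illustrative stand-ins, not the NVIGI SDK API:

```cpp
// Hypothetical sketch of where a fused VLM sits in a game loop: capture the
// rendered frame, pair it with the player's question, and ask the model.
// VlmClient and describe() are illustrative stand-ins, not the NVIGI SDK.
#include <cstdint>
#include <cstdio>
#include <string>
#include <vector>

struct Frame { int width, height; std::vector<uint8_t> rgba; };

struct VlmClient {
    // Stub: a real client would run Nemovision-4B Instruct on device or in
    // the cloud; here we return canned text so the sketch compiles and runs.
    std::string describe(const Frame&, const std::string&) {
        return "Two opponents are flanking you from the ridge on the left.";
    }
};

int main() {
    VlmClient vlm;
    Frame frame{1920, 1080, std::vector<uint8_t>(1920 * 1080 * 4)};

    // Throttle VLM queries: reading pixels per query keeps answers tied to
    // what the player actually sees, rather than to raw game state.
    const int queryInterval = 60;                  // about once a second
    for (int tick = 0; tick < 180; ++tick) {
        if (tick % queryInterval == 0) {
            std::string answer = vlm.describe(frame, "What should I watch out for?");
            std::printf("[tick %d] assistant: %s\n", tick, answer.c_str());
        }
    }
}
```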

inZOI shipping first NVIDIA ACE autonomous game characters

Video 5. NVIDIA ACE | inZOI – Create Simulated Cities with Co-Playable Characters

KRAFTON’s inZOI is a life simulation game that enables players to customize their family, home, work, and game world to create unique stories. It is one of the top five most wishlisted games on Steam. 

Players can transform the city’s NPCs (Zois) into AI-powered autonomous game characters by activating a Smart Zoi setting.

You can give Smart Zois life goals through a text prompt; a language model then generates corresponding actions. For instance, a Smart Zoi can be prompted to do acts of kindness, and the model generates much more dynamic actions, such as offering food to a hungry stranger. At the end of each day, the language model helps adjust each Zoi’s schedule of activities based on your customization and the day’s experiences.

An ACE half-billion-parameter small language model (SLM), Mistral-Nemo-Minitron Instruct, powers Smart Zois. NVIDIA’s unique distillation methods provide high-quality outputs at low latency in a compact footprint. Smart Zoi is GPU-accelerated exclusively on NVIDIA RTX GPUs, and the ACE technology will arrive with the release of inZOI on March 28.
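
The prompt-to-action flow described above might look something like this hypothetical C++ sketch, with the SLM call stubbed out; none of these names come from KRAFTON’s code.

```cpp
// Hypothetical sketch of the prompt-to-action flow: a life goal becomes a
// prompt, a language model (stubbed here; inZOI uses the Mistral-Nemo-
// Minitron Instruct SLM) returns candidate actions, and the game schedules
// them. All names are illustrative, not KRAFTON's implementation.
#include <cstdio>
#include <sstream>
#include <string>
#include <vector>

// Stub standing in for an on-device SLM call.
std::string runSlm(const std::string&) {
    return "offer food to a hungry stranger\n"
           "compliment a coworker\n"
           "help carry groceries";
}

std::vector<std::string> planActions(const std::string& lifeGoal) {
    std::string prompt =
        "You control a life-sim character whose goal is: " + lifeGoal +
        "\nList three concrete actions for today, one per line.";
    std::istringstream lines(runSlm(prompt));
    std::vector<std::string> actions;
    for (std::string line; std::getline(lines, line); )
        if (!line.empty()) actions.push_back(line);   // one action per line
    return actions;
}

int main() {
    for (const auto& action : planActions("do acts of kindness"))
        std::printf("scheduled: %s\n", action.c_str());
}
```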

AI amplifies game developers

While neural rendering and digital human technologies are unlocking new experiences for gamers, these technologies are also enabling you to build games faster and focus on what matters most: creating fun and memorable experiences.

At GDC 2025, NVIDIA has two days of sessions that detail how AI can amplify game development abilities, giving you tools to offload common tasks and ultimately focus on the most important aspects of the game. NVIDIA experts will preview the latest AI technology innovations:

  • AI Body Motion: Takes text input and pose targets and produces high-quality 3D skeletal animations in seconds through motion-generation technology currently in research.
  • Audio2Face-3D for authoring: Enables you to feed sound inputs in multiple languages to get highly accurate lip sync and scale localization efforts to more territories.
  • AI QA agents: Detects and summarizes vast amounts of data and logs common bugs so that teams can prioritize key issues.

Win a GeForce RTX GPU signed by Jensen Huang

Join us at GDC 2025 as we continue to push the boundaries of what’s possible in real-time graphics and AI-powered digital humans. Be sure to attend our opening session, Advances in RTX, presented by John Spitzer, VP of Developer and Performance Technology, on Wednesday, March 19, at 9 AM. Attendees will have the chance to win a GeForce RTX GPU signed by NVIDIA CEO Jensen Huang.

For more information, see the Game Development resource hub and explore an extensive library of tools, demos, and samples.
