GDC19 show guide from NVIDIA
Ray Tracing - Guidance
NVIDIA's RTX GPUs now make ray tracing in video games a reality - something difficult to imagine just a year ago. This talk discusses the evolution of RTX technology, showcasing its power in recent and upcoming games, and presents NVIDIA's vision for the future of ray tracing and computer graphics.
Speakers: Martin Stich
This session has two parts. The first dives into what it takes to integrate ray tracing into an existing engine/pipeline from an engineering perspective, covering common pitfalls and not-so-obvious gotchas - essential listening for anyone considering the addition of a ray tracing back-end. The second part is a summary of everything you ever wanted to know about ray tracing performance but were too afraid to ask - the culmination of more than a year of working on ray tracing games and demos.
Speakers: Alex Dunn, Principal Devtech Engineer; Pawel Kozlowski, Devtech Engineer
The session provides an introduction to Microsoft’s DirectX Raytracing, including an overview of its new shaders, additions to the host API, and real world integration tips based on adding ray tracing into Remedy’s Control. The first half focuses on ray tracing basics, an overview of the new shaders, and walking through a series of open-source shader tutorials. The second half highlights important changes to the DirectX API for developers integrating ray tracing into existing raster applications. We end by discussing important low-level details to help you achieve optimal ray tracing performance, including insights for optimal acceleration structures, instancing, and reduced shader permutations.
Speakers: Chris Wyman, Adam Marrs, Juha Sjoholm
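As a taste of the ray tracing basics the first half walks through, here is a generic, minimal ray-sphere intersection in Python - an illustration of the underlying math, not code from the session's shader tutorials:

```python
import math

def ray_sphere_intersect(origin, direction, center, radius):
    """Return the nearest positive hit distance t along the ray, or None.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    `direction` is assumed to be normalized (so the quadratic's a-term is 1).
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None  # ray misses the sphere entirely
    sqrt_disc = math.sqrt(disc)
    # Prefer the nearer root, but skip hits behind (or at) the ray origin.
    for t in ((-b - sqrt_disc) / 2.0, (-b + sqrt_disc) / 2.0):
        if t > 1e-6:
            return t
    return None
```

In DXR, this math runs in hardware/driver code; the application supplies ray generation, hit, and miss shaders around it.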
Ray Tracing - Examples
This session will cover how Nixxes and NVIDIA approached ray traced shadows for directional, spot, point and area light sources in Shadow of the Tomb Raider. We’ll cover what went well and what didn’t in all aspects of the work, from BVH construction to implications on content, and finally the tracing and denoising of the results.
Speakers: Michiel Roza, Jon Story, Holger Gruen
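Not taken from the Nixxes implementation, but as a sketch of the general idea behind ray traced area-light shadows: cast several shadow rays toward random points on the light and average the visibility. The `occluded` callback stands in for the actual ray trace against the BVH:

```python
import math, random

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def disk_light_shadow_factor(shade_point, light_center, light_normal,
                             light_radius, occluded, num_rays, rng=random):
    """Estimate visibility of a disk area light by casting `num_rays` shadow
    rays toward uniformly sampled points on the disk. `occluded(origin, target)`
    is a caller-supplied visibility query (the actual ray trace). Returns a
    value in [0, 1]: 0 = fully shadowed, 1 = fully lit."""
    # Build an orthonormal basis (u, v) spanning the disk's plane.
    n = light_normal
    a = (1.0, 0.0, 0.0) if abs(n[0]) < 0.9 else (0.0, 1.0, 0.0)
    u = normalize(cross(n, a))
    v = cross(n, u)
    visible = 0
    for _ in range(num_rays):
        # Uniform point on the disk: sqrt of a uniform number for the radius.
        r = light_radius * math.sqrt(rng.random())
        phi = 2.0 * math.pi * rng.random()
        sample = tuple(c + r * math.cos(phi) * ui + r * math.sin(phi) * vi
                       for c, ui, vi in zip(light_center, u, v))
        if not occluded(shade_point, sample):
            visible += 1
    return visible / num_rays
```

The noisy per-pixel result of such stochastic sampling is what the denoising pass discussed in the session then cleans up.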
In this session we are going to discuss ray tracing in Metro Exodus in general and in detail. The following topics will be covered:
- Stochastic effects are your friends
- The non-RT rendering pipeline in Metro. How can it be improved with ray tracing?
- DXR integration without doing massive changes in the engine
- Deferred lighting for hit positions
- Global Illumination: the Holy Grail of computer graphics in less than 1 ray per pixel? Is it possible?
- More rays versus more denoising. What is better?
- Denoising - how to denoise at full resolution and full performance? Voodoo magic, tips & tricks
- Other ray tracing effects
Speakers: Ben Archard, Dmitry Zhdan, Oles Shyshkovtsov, Sergei Karmalsky
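The "more rays versus more denoising" trade-off rests on accumulating stochastic samples over time. A minimal sketch of temporal accumulation - a per-pixel exponential moving average, illustrating the principle rather than Metro's actual denoiser:

```python
def temporal_accumulate(history, noisy_frame, alpha=0.1):
    """Blend a new noisy frame into the running history with an exponential
    moving average -- the core idea behind making sub-1-ray-per-pixel
    stochastic effects converge over time. `history` and `noisy_frame` are
    equal-length lists of floats; `alpha` is the weight of the new frame
    (lower alpha = smoother but slower to react to change)."""
    return [(1.0 - alpha) * h + alpha * s
            for h, s in zip(history, noisy_frame)]
```

A real implementation reprojects the history with motion vectors and rejects stale samples; this sketch omits both.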
This session covers how NetEase and NVIDIA approached ray traced reflections, ray traced shadows, ray traced caustics, and Deep Learning Super Sampling (DLSS) in the Justice Technology demo. We will give a general introduction to the RTX technology features, the production pipeline, learnings from DX upgrade, and performance optimizations. We will also cover what went well and didn’t in all aspects of work.
Speakers: Haiyong Qian, Xueqing Yang
We’ll talk about Q2VKPT, the Vulkan-based renderer for Quake 2 that uses hardware accelerated path tracing and advanced spatiotemporal denoising. We’ll cover some of the implementation details, including things like importance sampling, lighting, materials, and the denoising filters. In addition, we’ll discuss the challenges of using a physically based renderer with the assets from a game released over two decades ago.
Speakers: Christoph Schied, Alexey Panteleev
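As a generic illustration of the importance sampling the talk mentions (not Q2VKPT's code): cosine-weighted hemisphere sampling, the standard choice for diffuse bounces in a path tracer because the sample density cancels the cosine term of the rendering equation:

```python
import math

def cosine_sample_hemisphere(u1, u2):
    """Map two uniform random numbers in [0, 1) to a unit direction on the +Z
    hemisphere with probability density proportional to cos(theta). Sampling
    this way concentrates rays where a diffuse surface gathers the most light,
    reducing variance compared to uniform hemisphere sampling."""
    r = math.sqrt(u1)                     # radius on the unit disk
    phi = 2.0 * math.pi * u2              # angle on the unit disk
    x = r * math.cos(phi)
    y = r * math.sin(phi)
    z = math.sqrt(max(0.0, 1.0 - u1))     # project disk point up to hemisphere
    return (x, y, z)
```

For a cosine-weighted distribution the expected value of z (= cos theta) is 2/3, which makes the sampler easy to sanity-check statistically.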
NVIDIA’s RTX leverages 10+ years of research into accelerated ray tracing on GPUs. In this talk, we’ll explore our API for exposing RTX through Vulkan. We will discuss how ray tracing fits in with a low-level rasterization API and cover some details on our Vulkan ray tracing extension.
Ray Tracing - Tech
The new book "Ray Tracing Gems" (http://raytracinggems.com, free electronically) is a collection of 32 articles by experts in the field. Chapters range from basic algorithms and techniques to descriptions of full-blown commercial global illumination systems. To give you a preview and let you decide what you want to read, the editors briefly describe each chapter and what it offers, at an average of 90 seconds per chapter.
Speakers: Eric Haines, Distinguished Engineer
We discuss a pragmatic approach to real-time supersampling that extends common temporal antialiasing techniques with adaptive ray tracing. We have integrated our solution into Unreal Engine 4, and demonstrate how it removes the blurring and ghosting artifacts associated with standard temporal antialiasing, achieves quality approaching 16x supersampling, and operates within a 16ms frame budget.
Speakers: Adam Marrs, Senior Graphics Engineer
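A sketch of the adaptive idea (not the UE4 integration itself): pixels whose TAA history falls outside the current frame's local neighborhood bounds - exactly the cases where standard clamping would blur or ghost - are flagged to receive supersampling rays instead:

```python
def adaptive_ray_mask(history, neighborhood_min, neighborhood_max):
    """Flag pixels for ray traced supersampling. Inputs are equal-length
    lists of scalar luminance values: the reprojected TAA history and the
    per-pixel min/max of the current frame's local neighborhood. A history
    value outside those bounds means temporal reuse has failed for that
    pixel, so it is marked True (shoot rays there instead of clamping)."""
    return [not (lo <= h <= hi)
            for h, lo, hi in zip(history, neighborhood_min, neighborhood_max)]
```

Because such failures are usually sparse (disocclusions, shading changes), the ray budget stays small enough for a real-time frame.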
This talk shows how DXR ray tracing helps overcome the quality and performance constraints inherent to the caustics rendering methods used by most game engines. Starting from a fully dynamic, simulated water surface, the presentation covers two methods for rendering surface caustics, shows how to create volumetric caustics including volumetric shadows, and discusses multiple bounces of light. The techniques presented are not limited to water caustics and can be carried over to other transparent interfaces.
Speakers: Holger Gruen
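Caustics rendering begins by refracting rays through the water surface. A generic Snell's-law refraction helper (illustrative background math, not the presented method):

```python
import math

def refract(incident, normal, eta):
    """Refract a unit `incident` direction through a surface with unit
    `normal` (pointing against the incident ray) using Snell's law.
    `eta` is n1/n2, e.g. roughly 1/1.33 when entering water from air.
    Returns the refracted unit vector, or None on total internal reflection."""
    cos_i = -sum(i * n for i, n in zip(incident, normal))
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no transmitted ray
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * i + (eta * cos_i - cos_t) * n
                 for i, n in zip(incident, normal))
```

Where these refracted rays converge on receiving surfaces is where the bright caustic patterns form.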
We show how to combine glossy reflection ray tracing with diffuse ray-traced irradiance field probes. This creates a complete dynamic global illumination solution that scales across all DXR GPUs and minimizes light leaking without requiring manual per-probe or lightmap work by artists. It fits into modern game engines by directly replacing screen-space ray tracing and baked irradiance probes. The key breakthroughs are a fast way of ray tracing and updating irradiance instead of denoising individual samples or convolving radiance probes, and a new moment-based depth scheme for preventing light and shadow from leaking through walls.
Speakers: Morgan McGuire, Distinguished Research Scientist
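The moment-based depth scheme is in the spirit of variance shadow maps: store the mean and mean-squared depth seen from a probe, then use Chebyshev's inequality to estimate visibility. A sketch under that assumption (not the session's exact formulation):

```python
def chebyshev_visibility(mean_depth, mean_depth_sq, surface_depth,
                         min_variance=1e-4):
    """Probe-to-surface visibility from two stored depth moments. If the
    shaded surface is nearer than the recorded mean depth, it is fully
    visible. Otherwise Chebyshev's inequality gives an upper bound on the
    probability that the recorded geometry lies beyond the surface, which
    smoothly suppresses light (and shadow) leaking through walls."""
    if surface_depth <= mean_depth:
        return 1.0  # surface is in front of the recorded geometry
    variance = max(min_variance, mean_depth_sq - mean_depth * mean_depth)
    d = surface_depth - mean_depth
    return variance / (variance + d * d)
```

The `min_variance` floor avoids division blow-ups on perfectly flat depth distributions.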
Tools & Debugging
Are you struggling to understand why your graphics application isn't performing the way you think it should? Have you added new features only to see new rendering bugs and your frame rate fall? We'll discuss how NVIDIA Nsight Graphics provides tools to solve some of the most puzzling graphics rendering and performance problems. We'll walk you through real-world examples that show how to use NVIDIA's developer tools to solve these issues and more. Learn how to use the Frame Debugger to find those pesky rendering anomalies, and how to dig into the GPU performance metrics provided by the Range Profiler to decode your performance issues. We'll also cover how to harness GPU Trace to better understand how your application is utilizing the vast parallel computing resources of the shader units. Come ready to unleash your inner GPU Ninja!
Speakers: Jeff Kiel
The NVIDIA tools team shows how developers have identified and fixed DXR performance pitfalls using their latest tools. They will cover Nsight profilers and visualizers, the GPU Trace occupancy visualizer, and Aftermath crash debugging. Topics include RTAS structural pitfalls, RTAS building strategies, and how to get the most out of hit shaders.
Speakers: Russ Kerschner, Jeff Kiel
This talk shows how Nsight GPU Trace can be used to determine the performance limiters of any DX12 workload on NVIDIA Turing GPUs and to improve performance by applying architecture-aware optimizations. Because the tool captures all of its metrics in a single pass (no frame replay), it can be used on DX12 frames that use asynchronous compute or copy queues. After recapping the Peak-Performance-Percentage Method, the talk shows how it can be applied to unlock speedups on various workloads, including compute shaders with large thread-group sizes, pixel shaders with out-of-order completion, ray-tracing BVH updates, and ray-tracing denoisers.
Speakers: Louis Bavoil, Principal Developer Technology Engineer
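The first step of a speed-of-light style analysis can be sketched in a few lines: compare each hardware unit's achieved throughput against its peak and look at the highest percentages, which indicate the limiting units. Unit names and numbers below are invented for illustration:

```python
def top_throughput_units(achieved, peak, top_n=2):
    """Compute per-unit percentage-of-peak (achieved / peak * 100) and
    return the top-N units sorted descending. `achieved` and `peak` are
    dicts keyed by unit name, with values in the same throughput units.
    The units closest to 100% are the likeliest performance limiters."""
    sol = {unit: 100.0 * achieved[unit] / peak[unit] for unit in achieved}
    return sorted(sol.items(), key=lambda kv: kv[1], reverse=True)[:top_n]
```

In practice the profiler reports these percentages directly; the optimization work is in deciding whether the top unit is doing useful work or wasted work.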
Deep Learning for Games
True next-generation games and graphics that integrate ray tracing and deep learning are upon us. In this session, we look at how deep learning is changing game production, game operations and now, even how games are rendered. Learn how to leverage the latest developments in DirectML and CUDA for your titles. NVIDIA also details its new technology Deep Learning Super Sampling (DLSS). DLSS is made possible by NVIDIA's TensorCores found in the Turing architecture and is the first use of real-time deep learning in a rendering pipeline.
NVIDIA is working with partners to integrate DLSS into a wide range of games. Hear first-hand from programmers about a recent game integration covering in detail: how the integration was undertaken in the engine; debugging issues with a neural network; and considerations for different GPUs and resolutions.
Speakers: Andrew Edelsten; Anjul Patney; Paula Jukarainen, Developer Technology Engineer
Latest Rendering and Simulation Technology
This talk discusses the variable rate shading feature in Turing, which allows applications to vary the number of pixel shader invocations and dynamically adapt the shading rate per screen-space tile. Coarse shading has been implemented to improve performance in forward-rendered lighting passes. We will give an overview of different algorithms that have been adapted to use this feature, including NVIDIA adaptive shading, and discuss API support. We will use the recent integration into "Wolfenstein II: The New Colossus" as a specific example. The following topics will be covered:
- Introduction to variable shading rate
- API support
- Color and motion adaptation methods
- Implementation with minimal overhead
- Stabilizing shading rates
- Integration details - tips & tricks
- Trade-off between performance and quality
Speakers: Lei Yang, Dmitry Zhdan, Matthew Johnson
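A toy sketch of color- and motion-adaptive rate selection for a screen tile (the thresholds are invented placeholders, not values from the Wolfenstein II integration):

```python
def select_shading_rate(luma_variance, motion_pixels,
                        var_threshold=0.002, motion_threshold=4.0):
    """Pick a coarse shading rate for one screen-space tile. Low-detail or
    fast-moving tiles can be shaded at 2x2 (one pixel shader invocation per
    four pixels) with little visible loss, while detailed, slowly moving
    tiles keep the full 1x1 rate. `luma_variance` is the tile's luminance
    variance; `motion_pixels` is its screen-space motion per frame."""
    if motion_pixels > motion_threshold:
        return "2x2"  # fast motion hides coarse shading
    if luma_variance < var_threshold:
        return "2x2"  # flat region: few details to lose
    return "1x1"
```

Hysteresis on these decisions (one of the "stabilizing shading rates" topics above) prevents tiles from flickering between rates frame to frame.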
In this talk, we will discuss recent progress in deep learning-based image-to-image translation research, which concerns designing deep neural networks that map an image in one domain to a corresponding image in a different domain. We will cover the fundamentals and discuss several representative research works. We will showcase usage examples, including rendering a 3D environment without using textures and converting hand-drawn images to photorealistic images. We will share our predictions on how this technology will advance and its impact on the content creation industry.
Speakers: Ming-Yu Liu, Ting-Chun Wang
In this talk, we will discuss the latest features of PhysX 4, NVIDIA’s latest open-sourced PhysX version. This talk will focus on which new techniques are available, how to use them and provide details about the performance and accuracy trade-offs of these techniques and applicability to game developers. We will provide examples of how these techniques can be used to improve not only simulation quality, but also performance in a wide range of gaming applications. We'll also discuss the feature set and simulation integrity of the PhysX Vehicles SDK.
Speakers: Kier Storey, Michelle Lu
We discuss the practical implementation of "in-pipe" GPU culling and level-of-detail algorithms with Turing's new mesh shading technology. Using the DX12 Asteroids demo as context, we demonstrate how programmable shading over geometry topology will revolutionize the rasterization pipeline.
Speakers: Manuel Kraemer
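One example of the in-pipe culling a task/mesh shader pair can perform per meshlet is a conservative bounding-sphere frustum test. A CPU-side sketch of the math (illustrative only, not the demo's code):

```python
def sphere_in_frustum(center, radius, planes):
    """Conservative bounding-sphere test against frustum planes. Each plane
    is (a, b, c, d) with an inward-facing normal, so a*x + b*y + c*z + d >= 0
    holds for points inside. A meshlet whose sphere lies entirely on the
    outside of any plane can be rejected before any of its triangles reach
    rasterization."""
    for a, b, c, d in planes:
        if a * center[0] + b * center[1] + c * center[2] + d < -radius:
            return False  # fully outside this plane: cull
    return True  # intersecting or inside: keep (conservative)
```

On Turing, running tests like this in a task shader lets the GPU skip entire meshlets without any CPU round trip.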