The recent webinar shows how to use Unity’s High Definition Render Pipeline (HDRP) wizard to enable ray tracing in your Unity project with just a few clicks. At the end of the webinar, we hosted a Q&A session with our guest speakers from Unity, Pierre Yves Donzallaz and Anis Benyoub. Below are the top 11 questions asked.
Q: What are the specifications of the computer used for the webinar?
Pierre: The computer hosts an NVIDIA RTX 2070 Super, with a 16-thread CPU and 32 GB of system memory. It is a typical midrange configuration for a gaming machine nowadays. Performance is great (60+ fps) when using ray-traced shadows and ambient occlusion, or only ray-traced reflections in Performance mode. Nevertheless, the frame rate may dip under 30 fps at 1080p when combining high-quality ray-traced global illumination, reflections, ambient occlusion, and shadows. This is expected, and in line with other games that use many ray tracing effects at once.
Q: Can ray tracing be used for very large open architectural scenes, where rays need to travel hundreds of meters?
Anis: Yes. The latest version of Unity’s HDRP does not limit ray length, so you can tune the ray length of each effect you want to use, as well as increase the range of the Light Cluster, a structure that lists the local lights and reflection probes active in each of its cells.
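For reference, here is a minimal C# sketch of how those two settings could be driven from a script through HDRP volume overrides. The property names used here (`ScreenSpaceReflection.rayLength` for the ray-traced reflection distance and `LightCluster.cameraClusterRange` for the cluster range) are assumptions based on recent HDRP versions, so verify them against the scripting API of the version you use.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

// Sketch: extend the reach of ray-traced reflections and the Light Cluster
// through a volume profile. Parameter names are assumptions and may differ
// between HDRP versions.
public class RayLengthTuning : MonoBehaviour
{
    [SerializeField] Volume volume;            // Volume holding the ray tracing overrides
    [SerializeField] float rayLength = 300f;   // Maximum ray distance, in meters
    [SerializeField] float clusterRange = 300f;

    void Start()
    {
        // Ray-traced reflections are configured on the Screen Space Reflection override.
        if (volume.profile.TryGet(out ScreenSpaceReflection reflections))
        {
            reflections.rayLength.overrideState = true;   // assumed parameter name
            reflections.rayLength.value = rayLength;
        }

        // The Light Cluster override controls how far around the camera
        // local lights and reflection probes are gathered.
        if (volume.profile.TryGet(out LightCluster lightCluster))
        {
            lightCluster.cameraClusterRange.overrideState = true;  // assumed parameter name
            lightCluster.cameraClusterRange.value = clusterRange;
        }
    }
}
```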
Q: What are the minimum requirements to achieve good performance with ray tracing?
Pierre: It entirely depends on the number of ray tracing effects you want to enable and their quality (e.g. number of bounces and samples). For instance, for game scenarios with ray traced shadows and ray traced ambient occlusion only, an NVIDIA RTX 2060 Super could be an acceptable solution. However, it greatly depends on the optimization of the project itself. For high-end gaming, with all effects enabled, an RTX 2080 or better is highly recommended. Finally, for visualization and “interactive frame rate” scenarios, an RTX 3000 series or better will certainly offer the most comfortable working experience.
Q: Does the scene have to be pre-baked in any way to use ray tracing and/or path tracing?
Pierre: The path tracer does not require any pre-baking, thanks to its more brute-force (and slower) approach to lighting. In some cases, ray tracing can take advantage of a better lighting baseline, for example local reflection probes, notably in darker interiors where falling back to the skybox’s reflection probe is unwanted. For a bright outdoor scene with a lot of sun and sky contribution, no baking at all is required to take advantage of ray tracing, as the fallback to the ambient probe and the reflection probe derived automatically from the sky offers a good starting point.
Q: Are single-sided objects problematic with ray tracing and/or path tracing, for example, a floor or a ceiling plane?
Anis: Ray tracing and path tracing in Unity have the same visibility requirements as the rasterized techniques: proper shadow casting is expected, whether from a watertight mesh, a shadow proxy like in our new HDRP Scene Template, double-sided materials, or two-sided shadows.
Q: Is it possible to tune the ray tracing settings on a per scene or per location basis?
Pierre: Certainly. Unity’s HDRP uses a Volume system to precisely control which settings apply spatially, depending on the camera position. The shape of a volume can be a box, a sphere, a custom (concave) mesh, or it can be infinitely large (global). Multiple volumes can be nested thanks to the priority system, and smooth transitions can be achieved easily by tweaking the blend distances or the influence of each volume. Of course, with Unity you can also adjust the volume settings programmatically if needed.
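As a rough illustration of that last point, the sketch below adjusts a local volume’s priority, blend distance, and weight at runtime, and toggles one of its overrides through `VolumeProfile.TryGet`. The class and method names here come from the core Volume API; the `EnterInterior` method and its values are just hypothetical examples.

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.HighDefinition;

// Sketch: drive local volume settings from code, for example when the
// player enters a specific area of the level.
public class LocationVolumeController : MonoBehaviour
{
    [SerializeField] Volume interiorVolume;   // local (non-global) volume placed in the scene

    public void EnterInterior()
    {
        // Make this volume win over the global one and fade in over 5 meters.
        interiorVolume.priority = 10f;
        interiorVolume.blendDistance = 5f;
        interiorVolume.weight = 1f;

        // Toggle an individual override in the volume's profile; the
        // Screen Space Reflection override also hosts the ray-traced
        // reflection settings.
        if (interiorVolume.profile.TryGet(out ScreenSpaceReflection reflections))
        {
            reflections.active = true;
        }
    }
}
```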
Q: Do I have to use Temporal Anti-Aliasing (TAA) to use ray tracing?
Pierre: TAA can help dramatically, so it is highly recommended. However, it isn’t mandatory. The temporal accumulation helps reduce high-frequency noise whenever the denoiser is unable to provide good results due to a lack of history, typically with moving objects, such as trees moving in the wind.
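If you want to switch a camera to TAA from a script rather than through the camera inspector, a minimal sketch could look like the following, assuming the `HDAdditionalCameraData.antialiasing` field exposed by recent HDRP versions.

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Sketch: enable TAA on an HDRP camera. In the editor, the same option
// lives in the camera's Anti-aliasing dropdown.
public class EnableTemporalAA : MonoBehaviour
{
    void Start()
    {
        // HDRP stores its per-camera settings on HDAdditionalCameraData.
        var hdCamera = GetComponent<HDAdditionalCameraData>();
        if (hdCamera != null)
        {
            hdCamera.antialiasing = HDAdditionalCameraData.AntialiasingMode.TemporalAntialiasing;
        }
    }
}
```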
Q: Does the image converge faster on more powerful hardware?
Anis: To obtain an image with a minimal amount of noise, ray-traced effects need a few frames of accumulation to stabilize. Therefore, with a more powerful GPU and the resulting higher, uncapped frame rate, samples accumulate faster and you get a stable image more rapidly.
Q: During the development of this demo, what was the most exciting feature you worked on or experienced?
Pierre: Personally, playing with mirrors and pushing the number of bounces for the ray-traced reflections beyond a handful was particularly impressive. Of course, the use cases for multi-bounce reflections are fairly limited in common projects, and two or three bounces will suffice in most realistic scenes. Nonetheless, it is always fun to watch recursive reflections!
Q: How much further can this technology go? It looks almost as close to real life as possible based on the examples shown in the webinar.
Anis: Real-time path tracing is certainly the goal in the distant future. However, it will require further research on real-time path tracing denoisers. In the meantime, a hybrid pipeline like the one offered by Unity’s HDRP, with a mix of rasterized and ray-traced effects, can already offer a great compromise on current hardware.
Q: Can we achieve “physically accurate” results in terms of lumen levels, for example for technical lighting analysis?
Pierre: Unity’s HDRP provides physical light units (lux, lumen, candela, EV, and nits) and physically based light attenuation. Therefore, results can be somewhat close to reality when using the GPU Lightmapper, the path tracer, or ray-traced global illumination with a sufficient number of samples. However, keep in mind that all these features are mere approximations of real-life physics and should be taken with a grain of salt. For technical lighting analysis, you will also certainly be interested in our improved exposure debug views, which offer, for example, an EV100 visualization of the scene.
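As an example of working in physical units, the sketch below sets a sun and a lamp directly in lux and lumen. The `SetIntensity(value, unit)` call and the `LightUnit` enum are assumed from the HDRP light API and may differ between versions, so treat this as a starting point rather than a reference.

```csharp
using UnityEngine;
using UnityEngine.Rendering.HighDefinition;

// Sketch: author lights in physical units for a lighting study.
// SetIntensity(value, unit) and LightUnit are assumptions based on the
// HDRP light scripting API; check the version you use.
public class PhysicalLightSetup : MonoBehaviour
{
    [SerializeField] HDAdditionalLightData sun;        // directional light
    [SerializeField] HDAdditionalLightData deskLamp;   // point or spot light

    void Start()
    {
        // Clear midday sunlight is on the order of 100,000 lux.
        sun.SetIntensity(100000f, LightUnit.Lux);

        // A typical household bulb emits roughly 800 lumens.
        deskLamp.SetIntensity(800f, LightUnit.Lumen);
    }
}
```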
To learn more, visit the developer ray tracing resources page where you can find videos, blogs, webinars, and more to help you get started.
And don’t miss out on the latest ray tracing news: GTC starts April 12, 2021, and registration is free. Register now to be the first to know about the latest announcements.