We’ve already started with teaser posts about HDR on this blog both here and here. Now that we’ve gotten through the madness of launching a new product, it is time to dig into the how, why, and what of HDR in more detail. This post kicks off a whole series of blog posts on HDR and color evolutions for real-time graphics. Additionally, we’ve got some new sample code and a white paper covering the topics in more depth. You can find all of this information on our HDR Development page.
Why you care
From my perspective, HDR display technology really is the biggest leap forward in the quality of pixels in over 20 years. We’re finally at the point where the screen can engage a wider portion of human perception. Colors can be both richer and brighter. It really can be a whole new experience. I’ve personally encountered scenes in games that give me chills when I see them in HDR for the first time. Since your game likely already renders the data for it, failing to let all of it reach the display seems like a waste.
Now that you’re sold on how great your content can look in HDR, here is the quick list of what you need to get things working:
- NVIDIA GPU from the Maxwell or Pascal families (GTX 960, GTX 980, GTX 1070, GTX 1080, etc)
- HDR display supporting SMPTE ST 2084, often labeled as HDR10 (most likely a new UHD TV, but you can expect future support from monitors as well)
- Windows 7 or later
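To make the “SMPTE 2084 / HDR10” item above a bit more concrete: HDR10 signals encode absolute luminance with the ST 2084 “PQ” transfer function rather than a gamma curve. Here is a minimal sketch of the encode side, using the constants from the ST 2084 specification; the function names are my own for illustration and are not taken from the SDK sample.

```python
# SMPTE ST 2084 (PQ) encode: absolute luminance in nits -> normalized signal.
# Constants come from the ST 2084 specification.

M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> float:
    """Map luminance in cd/m^2 (0..10000) to a normalized PQ signal (0..1)."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    yp = y ** M1
    return ((C1 + C2 * yp) / (1.0 + C3 * yp)) ** M2

def pq_code_10bit(nits: float) -> int:
    """Quantize the normalized PQ signal to a full-range 10-bit code value."""
    return round(pq_encode(nits) * 1023)
```

Note how the curve spends its code values: SDR reference white (100 nits) already lands around half of the signal range, leaving the upper half for highlights all the way up to 10,000 nits. That perceptual allocation is what lets 10 bits cover such a huge luminance range without visible banding.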
If you already have all of these, you can grab our SDK app here and start looking at HDR images from real game engine content right now. There are a couple of things you may want to double-check, though. First, I recommend using a High Speed HDMI Cable for hooking up the display; the higher bandwidth enables driving full 4K without compromises. Second, consult the display’s manual to see which HDMI ports offer HDMI 2.0 and/or HDR support.
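The cable and port advice comes down to simple arithmetic. A rough back-of-the-envelope sketch of why 4K60 needs HDMI 2.0-class bandwidth (the frame timing is the standard CTA-861 4K60 total frame size; the rest is just multiplication):

```python
# Back-of-the-envelope HDMI bandwidth estimate (illustrative).
# 3840x2160 active pixels plus blanking gives a 4400x2250 total frame,
# which at 60 Hz is the familiar 594 MHz 4K60 pixel clock.

H_TOTAL, V_TOTAL, REFRESH = 4400, 2250, 60
pixel_clock_hz = H_TOTAL * V_TOTAL * REFRESH  # 594 MHz

def tmds_gbps(pixel_clock_hz: float, bits_per_component: int) -> float:
    # 3 TMDS channels, 10 bits on the wire per 8 bits of data (8b/10b),
    # and the TMDS clock scales with deep-color depth (x1.25 for 10 bpc).
    clock = pixel_clock_hz * bits_per_component / 8
    return clock * 3 * 10 / 1e9

rgb8  = tmds_gbps(pixel_clock_hz, 8)   # ~17.8 Gbps: near HDMI 2.0's 18 Gbps cap
rgb10 = tmds_gbps(pixel_clock_hz, 10)  # ~22.3 Gbps: over the cap
```

Since 10-bit RGB at 4K60 exceeds the 18 Gbps HDMI 2.0 link, HDR10 at that resolution and refresh rate is carried with chroma subsampling (4:2:2 or 4:2:0) instead, which is another reason to confirm which ports and modes your display actually supports.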
For those who are disappointed that they don’t have access to an HDR display, there is an alternative that lets you start experimenting. Many high-quality displays sold today offer what we’ve been calling “Enhanced Dynamic Range”. These displays can give you a limited HDR-like effect. The best results come from a 10-bit display with 300-350 nits of brightness. Turning the brightness (and possibly contrast) to the maximum level on the display makes it compatible with the “EDR” setting in the SDK application.
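To illustrate the idea behind an EDR-style path: the renderer still produces a wide luminance range, but highlights have to be rolled off into the display’s modest peak rather than clipped. The sketch below is purely hypothetical and is not the SDK’s actual mapping; it uses a simple Reinhard-style curve with an assumed 300-nit peak just to show the shape of the problem.

```python
# Hypothetical EDR-style highlight roll-off (NOT the SDK's mapping).
# Dark values stay roughly linear; bright values compress toward the
# display's peak instead of clipping hard.

def edr_tonemap(scene_nits: float, display_peak: float = 300.0) -> float:
    """Map scene luminance (nits) into a display with a limited peak."""
    x = scene_nits / display_peak
    return display_peak * (x / (1.0 + x))
```

The curve never quite reaches the peak, so a 1,000-nit highlight and a 10,000-nit one remain distinguishable on a 300-nit panel, which is what produces the limited HDR-like effect described above.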
Testing & Evaluating HDR
I have a few tips for experimenting with HDR. First, I recommend doing evaluations in a dim or dark environment. Human vision adapts to the surroundings, and if the room is bright enough, it can wash out any HDR display. HDR should still look better, but you’ll get the best experience when things are dimmer. Second, be mindful of options like vivid mode and ‘enhancements’ that a display might offer. Most HDR output paths do a pretty good job of faithfully reproducing the content today, while the default standard dynamic range (SDR) mode on the same device might be boosting contrast, crushing blacks, and stretching colors to stand out on a showroom floor. It is a good idea to calibrate the device to something reasonably neutral so that you are making apples-to-apples comparisons; cinematic mode often seems pretty true. Finally, I highly recommend adding side-by-side modes to your application, as I have in the SDK sample. While they aren’t perfect, they give you a quick idea of what you’ve really been missing.
I highly recommend our white paper here. It covers a lot of what game developers should know about color with this big transition happening. Additionally, we’ll keep the blog posts rolling with directed bits of information.