High Dynamic Range (HDR) refers to the range of luminance in an image. While definitions vary somewhat, it typically means a luminance range of 5 or more orders of magnitude. While HDR rendering has been around for over a decade, displays capable of directly reproducing HDR are only now becoming commonly available. Conventional displays are considered Standard Dynamic Range (SDR) and handle roughly 2-3 orders of magnitude of difference in luminance.
Industry groups such as the Society of Motion Picture and Television Engineers (SMPTE) and the HDMI Forum have created standards for the next generation of displays. In particular, these standards go far beyond the dynamic range and color gamut described by sRGB and Rec. 709, which have driven the industry for the past two decades. They enable the encoding of brighter and richer colors than displays have traditionally been able to reproduce. Additionally, an industry group called the UHD Alliance (UHDA) has established a certification program defining the capabilities a display must have to produce a high-quality HDR experience today.
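One of the key pieces of these standards is SMPTE ST 2084, the Perceptual Quantizer (PQ) transfer function, which encodes absolute luminance up to 10,000 nits into a normalized signal. As a minimal sketch of the idea (the constants below are the exact rationals defined in ST 2084; the function name is just for illustration):

```python
# SMPTE ST 2084 (PQ) constants, defined as exact rationals in the standard.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance in cd/m^2 (0..10000) to a PQ signal in [0, 1]."""
    y = max(nits, 0.0) / 10000.0  # normalize to the 10,000-nit PQ ceiling
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2
```

Note how strongly the curve favors dark values: a typical 100-nit SDR white lands around the middle of the PQ code range, leaving the upper half for the highlights that HDR displays can now reproduce.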
All NVIDIA GPUs from the 900 and 1000 series support HDR display output. The presence of HDMI 2.0 provides the bandwidth necessary for the higher-quality signal HDR requires. The 1000 series adds support for HDR over future DisplayPort revisions.
Many of the latest premium TVs from LG, Samsung, Sony, and others support the capabilities necessary to display high-quality HDR images. HDR-capable desktop monitors are expected to follow at a later date.
At a high level, utilizing an HDR display involves only a few steps.
Additionally, NVIDIA has published a white paper explaining the details more completely, as well as an SDK sample with all the code needed to get started. Finally, this page links to additional blog posts with supplementary information and guidance on upgrading an application to support HDR displays.