
Hacking NVIDIA Ansel to Slash VR Rendering Times

Warrior9 VR team members started working on The PhoenIX—a sci-fi animated series in virtual reality (VR)—two years ago. The tools and technology used to create CG imagery have evolved rapidly during those two years, and our production process needed to evolve with them. VR rendering techniques can be complex and often hurt performance. We discovered a novel use for the NVIDIA Ansel in-game photography engine to substantially speed up VR rendering.

The beginning

We decided to create a 3-minute teaser for The PhoenIX in VR at the tail end of 2015. The project had initially been pitched as a traditional 2D animation, so we started to research the best way to approach it, bearing in mind that we had a limited budget and a small team.

We found that real-time game engines and the GPUs that powered them were hitting their stride in terms of advanced rendering techniques and image quality. At the same time, VR began making a big splash.

Video 1. The PhoenIX teaser

Why choose VR?

We chose to make The PhoenIX into a VR series because of the young medium’s potential. VR looked like—and remains—an exciting field to work in. We saw the opportunity to be on the ground floor of a potentially transformative new medium where the rules had yet to be written and everyone was essentially on a level playing field.

What VR is and isn’t

The discussion around what constitutes “real” VR has often been heated. We prefer the term immersive content, which encompasses an entire spectrum of media.

Our aim in creating The PhoenIX was to avoid limiting our potential viewer base. At that time, high-end interactive VR content still required an expensive computer with an expensive headset. The barrier for entry for VR users was much higher than it is now.

On the flip side, video streaming sites such as YouTube had begun to offer 3D stereoscopic playback, which meant that anyone with a decent Internet connection and smartphone could potentially view our content.

We also made a conscious decision to make The PhoenIX a passive narrative experience. This meant the viewer would be an observer of the story rather than affecting its outcome. These factors led us to ultimately choose stereoscopic panoramic video as the format for The PhoenIX.

Choosing Unreal Engine

Render times proved to be the initial issue that pushed us towards using a game engine to produce The PhoenIX. We had to review our shots quickly and make changes on the fly, and the what-you-see-is-what-you-get (WYSIWYG) nature of game engines made a real-time engine a natural choice for us.

I had begun experimenting with Unreal Engine in my spare time before joining Warrior9 VR and had been learning it with the help of Epic’s robust YouTube library of free video tutorials, which fast-tracked the learning experience.

Though I grew up as a gamer, I can honestly say that I’m more interested in the technology and the creative process behind the games than in actually playing them. I studied Visual Effects in school because I wanted to be able to bring the worlds in my imagination to life. Unreal Engine 4 is the ultimate sandbox that allowed me to do that.

The teaser

An experimental mindset pervaded our thinking when we began working on the teaser for The PhoenIX. Everyone involved at Warrior9 VR took on new roles and learned new skills. The first step was seeing if it was even possible to create stereoscopic content using Unreal Engine.

When we started production, we were using UE 4.9, the version before Epic implemented the Sequencer tool. Besides giving us only rudimentary control over our scenes and shots, that version had no built-in way of exporting spherical panoramas, so we had to get creative with the tools that were available inside Unreal Engine.

With the help of some members of the UE development forum, we began to build a camera rig inside of Unreal that was essentially a virtual GoPro 360 rig (Figure 1). We created a blueprint actor containing six cameras, one facing in each principal direction (forward, back, left, right, up, and down). As we animated the rig through the scene, the camera views were streamed into a render target that handled the image blending and gave us one full, roughly stitched spherical panoramic image.

Figure 1. Our virtual 360 camera rig

We had the render target for the virtual surround camera function as a live texture on a plane hidden in the level, with an orthographic camera placed to match the aspect ratio exactly. We then rendered out our image sequence from that camera. It was certainly a hacky workaround, but it worked well enough to convince us to keep going.
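The blending itself happened inside an Unreal render target material, which is hard to reproduce in an article, but the underlying idea is standard cube-to-equirectangular mapping: every pixel of the panorama corresponds to a direction in space, and that direction falls inside exactly one of the six camera views. The Python sketch below is only an illustration of that math, not our actual material; the face and function names are invented for the example.

```python
import math

# Simplified sketch of cube-to-equirectangular mapping: for each output pixel
# of the panorama, work out which of the six cameras sees that direction.
# Illustrative only; the real blending happened in an Unreal render target
# material, not in Python.

def direction_for_pixel(u, v, width, height):
    """Convert an equirectangular pixel (u, v) to a unit direction vector."""
    lon = (u / width) * 2.0 * math.pi - math.pi      # -pi .. +pi around the rig
    lat = math.pi / 2.0 - (v / height) * math.pi     # +pi/2 (up) .. -pi/2 (down)
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return x, y, z

def face_for_direction(x, y, z):
    """Pick the cube face (camera) whose axis dominates the direction."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if az >= ax and az >= ay:
        return "front" if z > 0 else "back"
    if ax >= ay:
        return "right" if x > 0 else "left"
    return "up" if y > 0 else "down"

if __name__ == "__main__":
    width, height = 4096, 2048
    # Sample a few output pixels and report which camera view each one samples.
    for (u, v) in [(0, height // 2), (width // 2, height // 2), (width // 2, 0)]:
        d = direction_for_pixel(u, v, width, height)
        print((u, v), "->", face_for_direction(*d))
```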

You can check out the final version of the teaser on The PhoenIX website.

The first episode

Epic released some major UE4 updates not long after we finished our teaser. The addition of the Sequencer tool in UE 4.11 gave us the ability to control our cameras and scenes with much greater finesse. Then came the addition of the experimental spherical panorama export tool courtesy of Kite and Lightning (K+L). While this meant we no longer had to rely on our cobbled-together virtual GoPro rig, it presented new issues.

The plugin assembles the imagery slice by slice. When capturing stereoscopic images, only the middle strip of each render represents accurate stereoscopy. To assemble a complete stereoscopic panorama, the game engine therefore renders the full scene many times, discards everything but a narrow strip in the middle of each render, and then stitches the strips together. The width of the strip depends on the render quality you're after; at higher quality settings, the computer may render thousands of images to generate a single output frame.

The total number of renders to generate a single, 360-degree stereoscopic image is as follows:

(360 / HorizontalAngularIncrement) * (180 / VerticalAngularIncrement) * 2

The final factor of 2 is there because you must render everything twice: VR requires a slightly different image for each eye.

At typical quality settings, that can work out to 2,520 renders to generate just one frame of stereoscopic 360 capture! To put that in perspective: 2,520 frames of a 60 Hz game would normally produce 42 seconds of gameplay, but here they produce just 1/60th of a second of output when capturing a 60 fps movie.
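To get a concrete feel for how quickly the render count grows, here is a small Python sketch of that formula. The angular increments in it are hypothetical example values rather than the plugin's actual defaults.

```python
def renders_per_stereo_frame(horizontal_increment_deg, vertical_increment_deg):
    """Total full-scene renders needed for one stereoscopic 360-degree frame.

    One render per horizontal slice, times the number of vertical slices,
    times two because each eye needs its own slightly offset image.
    """
    horizontal_slices = 360 / horizontal_increment_deg
    vertical_slices = 180 / vertical_increment_deg
    return int(horizontal_slices * vertical_slices * 2)

# Hypothetical example settings: smaller increments mean better quality
# but far more renders per output frame.
for h_inc, v_inc in [(10, 30), (5, 15), (2, 10)]:
    n = renders_per_stereo_frame(h_inc, v_inc)
    print(f"{h_inc} deg x {v_inc} deg increments -> {n} renders per frame")
```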

Figure 2 shows the impact of what’s actually rendered versus what’s discarded.

Figure 2. Much of the rendering effort is discarded to create a single 360-degree frame. (Source: Unreal Engine blog)

The plugin's design also meant that many camera-specific and post-processing effects simply did not get picked up in the final frames. In addition, not everything obeyed the game's internal clock: GPU particle effects, for instance, often had an adverse impact on the output, creating frustrating artifacts in the render. The final exported frames came out dark, and the high-end effects we depended on to give The PhoenIX its signature look did not carry through.

On top of all of these issues, the K+L plugin did not take rotational data from the in-game camera, so any turning of the view had to be done in post-production in a program like Adobe After Effects. While the K+L plugin was a good primer on how rendering stereoscopic panoramas works, it wasn't the right solution for our rendering needs.

On to Episode 2

We expanded our team in 2017 and started work on Episode 2. This proved to be an important time in our development. Our teaser ran just 3.5 minutes, whereas the first and second episodes each ended up at almost 10 minutes. The second episode was also far more complex, featuring multiple characters and more detailed locations.

Render tests using the built-in K+L panoramic plugin took up to 7 minutes per frame in certain spots, effectively negating the benefit of using a real-time engine. We tried different approaches, like spreading the render across multiple machines, but in the end that only created more issues.

Enter Ansel

We have been strong proponents of NVIDIA technology and were aware of the capabilities of the NVIDIA Ansel plugin. Ansel takes impressive in-game screenshots in a variety of formats and has been making its way into more applications. Unreal Engine includes support for Ansel as a plugin, provided you have a supported GPU, and it lets you capture stereoscopic panoramic images at up to 8K resolution.

We were initially drawn to Ansel as a potential solution because of its ability to render frames with all of the effects and lighting as they appeared in the engine. Ansel also combined both eyes' views into one image. It seemed to offer exactly what we needed. The only problem was that Ansel renders just a single frame at a time when you use the in-game GUI.

We solved this problem by using an old-school workaround to automate navigating the Ansel GUI: the macro.

“A macro in computer science is a rule or pattern that specifies how a certain input sequence should be mapped to a replacement output sequence” (Source: Wikipedia)

Using Ansel in your Unreal project requires you to run the project in its own process. You also need to set up a level blueprint for your scene that pauses the game on every frame so it can be captured. We created a simple macro script that activated the Ansel plugin, set the image output parameters to our requirements, and then advanced the project playback by one frame before cycling through again. Some incredibly helpful members of the Unreal Engine forums assisted us in figuring this out.
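As an illustration of the idea rather than our actual script, a macro along these lines could be written in Python with the pyautogui library. Every hotkey, delay, and screen coordinate below is a placeholder assumption, and it presumes the capture type and resolution were already chosen once in the Ansel overlay.

```python
import time
import pyautogui  # pip install pyautogui

# Rough sketch of the kind of desktop macro described above. All hotkeys,
# delays, and screen coordinates are placeholders; adjust them for your setup.

NUM_FRAMES = 900            # frames to capture (e.g., 30 seconds at 30 fps)
CAPTURE_WAIT = 40           # seconds to wait while Ansel writes out one panorama
SNAP_BUTTON = (150, 800)    # placeholder screen position of Ansel's Snap button
ADVANCE_KEY = "f9"          # placeholder key the level blueprint listens for

for frame in range(NUM_FRAMES):
    pyautogui.hotkey("alt", "f2")      # open the Ansel overlay (default hotkey)
    time.sleep(2)                      # give the overlay time to appear
    pyautogui.click(*SNAP_BUTTON)      # trigger the stereo 360 capture
    time.sleep(CAPTURE_WAIT)           # wait for the capture to finish writing
    pyautogui.hotkey("alt", "f2")      # close the Ansel overlay
    time.sleep(1)
    pyautogui.press(ADVANCE_KEY)       # tell the level blueprint to advance one frame
    time.sleep(1)
    print(f"captured frame {frame + 1}/{NUM_FRAMES}")
```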

Figure 3 shows the level blueprint to control the frame-by-frame progression in Unreal.

Figure 3. Frame progression Unreal engine blueprint

This simple blueprint script works as follows (a code sketch of the same logic appears after the list):

  • The game starts and the level sequence is loaded and then immediately paused.
  • The blueprint watches the playback status. When the sequence unpauses, it advances by one frame (1/30th of a second, roughly 0.033 s, since we render at 30 fps) before being paused again.
  • The frame number updates and then displays on-screen to allow you to follow the progress. This sequence repeats for as many frames as necessary.
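Blueprints are visual, so the graph in Figure 3 is hard to reproduce in text, but its control flow boils down to something like the following Python sketch. The MockSequencePlayer class is just a stand-in for the Sequencer player nodes; this is not Unreal code.

```python
# Runnable stand-in for the level blueprint's control flow. In Unreal this
# logic lives in a visual Blueprint driving a level sequence player.

FPS = 30
FRAME_STEP = 1.0 / FPS          # advance exactly one frame of a 30 fps sequence

class MockSequencePlayer:
    """Minimal stand-in for a level sequence player."""
    def __init__(self):
        self.time = 0.0
        self.paused = True          # the sequence starts loaded and paused

    def unpause(self):
        self.paused = False         # in our pipeline the capture macro triggers this

    def advance_and_pause(self, step):
        self.time += step
        self.paused = True          # hold still for the next Ansel capture

def capture_loop(player, total_frames):
    for frame in range(1, total_frames + 1):
        player.unpause()                        # simulate the macro unpausing playback
        if not player.paused:
            player.advance_and_pause(FRAME_STEP)
            # the real blueprint prints this as on-screen debug text
            print(f"frame {frame}/{total_frames} at t={player.time:.3f}s")

if __name__ == "__main__":
    capture_loop(MockSequencePlayer(), total_frames=5)
```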

After experimenting with this, we had a solid rendering pipeline working. Our initial results looked promising, taking our render time down to roughly 45 seconds per frame, compared to 120+ seconds per frame with the K+L stereo pano plugin.

Further testing revealed that the resolution of the game process had a major impact on the time needed to render our final images. Ansel needed to acquire many more tiles to create the full panorama when the application ran in a smaller window. By increasing the resolution of our game process to 4K, we took our render times from roughly 45 seconds per frame down to just six seconds.
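A rough back-of-the-envelope calculation shows why the window size matters so much. If, as a simplification, each tile Ansel grabs is roughly the size of the game window, then covering a panorama of a given resolution takes far fewer grabs from a 4K window than from a 1080p one. The sketch below only illustrates that relationship; it is not a description of Ansel's actual tiling scheme, and the panorama dimensions are assumed.

```python
import math

# Back-of-the-envelope estimate: if each tile is roughly the size of the game
# window, how many tiles does one panorama take? A simplification for intuition
# only, not a description of Ansel's real tiling scheme.

PANO_W, PANO_H = 8192, 4096   # assumed panorama dimensions (per eye)

def tiles_needed(window_w, window_h):
    return math.ceil(PANO_W / window_w) * math.ceil(PANO_H / window_h)

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name} window: ~{tiles_needed(w, h)} tiles per panorama")
```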

To recap, the render process for our teaser and our first episode took several weeks to complete. Our first attempts with Ansel cut that down to several days. With our most recent tweaks, we can render an entire sequence in a fraction of that time. Running the game at a higher resolution takes more video RAM, but people who are doing a lot of rendering should have the hardware to support it.

The following video shows a bit of the final result.

Video 2. The PhoenIX: Spaceport Prime

Final thoughts

The NVIDIA screenshot tool, Ansel, provides powerful capabilities for capturing impressive in-game screenshots. With a bit of creative thinking, we learned to use it in a new way, dramatically boosting our output. This in turn enabled us to solve many other problems intrinsic to the intensive process of rendering spherical panoramas.

To learn more about Ansel, see the Coffee Break series of video tutorials or the NVIDIA Ansel page.

The experience of collaborating with other artists, experimenting with new techniques, and employing computer technology on the cutting edge has been one of the most rewarding experiences that I have ever had. You can follow the progress of The PhoenIX by signing up for Warrior9 VR’s newsletter on the website and following its page on Facebook.
