Simulation / Modeling / Design

Interview with the Team Behind Metro Exodus

We recently caught up with Oles Shyshkovtsov, Ben Archard, and Sergei Karmalsy from 4A Games, the team behind Metro Exodus, about the unique features of the PC version of the game.

Metro Exodus finds beauty in the darkest possible places. As players, we see fetid sewers, terrifying cannibal camps, and ruined compounds. Yet even as we feel revulsion, we’re impressed by the artistry on display. What challenges came with the juxtaposition of making horrific beasts and environments look beautiful?

Figure 1. Performance event frame example from Metro Exodus

The decay of the world, the pleasant imagery of thriving nature, nostalgic references, architecture and environments with a unique style, people turned into monsters and those who remain human – these are the separate pillars of our setting. Each can look decent on its own if done properly in style; that's just a question of technically correct work.

But it's the switch from one area to another, the actual meeting point of extremes, that brings out the emotions. The challenge was to deliver these switches and meeting points at a controlled pace while preserving the freedom of open areas.

Ray tracing is a standout feature in the PC version of Metro Exodus. Global Illumination adds a layer of depth that makes digital sets feel almost like they are real environments, captured on 35mm film. How does ray tracing give you different artistic choices? How do you most effectively take advantage of these new paint brushes?

We always used the possibilities of the PC as a tool for graphical exploration, establishing our new goals, and finally as our team motivation. Integrating RTX matched these practices very well.

Metro’s frame rate performance while running ray tracing on “high” is very impressive. How did you optimize for this result? How did DLSS help you reach your performance goals?

DLSS and RTX are very much separate technologies. Any game, with or without RTX, could potentially support DLSS and see frame rate gains by rendering at a lower resolution, then upscaling and having the neural network fill in the information skipped during rendering.
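To make the savings concrete, here is a minimal sketch (not 4A's implementation) of the idea behind lower-resolution rendering plus upscaling. The resolutions and render scale are hypothetical, and a naive nearest-neighbor upscale stands in for the trained network that DLSS actually uses to reconstruct detail:

```python
import numpy as np

def shading_cost(width, height):
    """Number of pixels that must be shaded per frame."""
    return width * height

# Hypothetical 4K target with a 0.5x render scale per axis:
# shade fewer pixels, then reconstruct the full-resolution frame.
native = shading_cost(3840, 2160)
scaled = shading_cost(1920, 1080)
print(native // scaled)  # 4x fewer shaded pixels

def naive_upscale(img, factor):
    """Nearest-neighbor stand-in for the learned reconstruction step."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

low_res = np.random.rand(1080 // 4, 1920 // 4, 3)  # small demo buffer
full = naive_upscale(low_res, 4)
print(full.shape)  # (1080, 1920, 3)
```

The gap between this nearest-neighbor result and the native image is exactly the information the neural network is trained to fill in.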

RTX on the High setting runs in checkerboard mode, so it is effectively half the resolution of the Ultra setting. We made a great many optimizations behind the scenes that go into running RTX at all, but that is the main difference between the render modes we offer in-game.
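As a rough illustration of how checkerboarding halves the per-frame cost (the details of 4A's actual scheme are not public), a checkerboard mode traces rays for only alternating pixels each frame and flips the pattern on the next frame, so every pixel is refreshed over two frames:

```python
import numpy as np

def checkerboard_mask(width, height, frame_index):
    """Boolean mask of pixels to trace this frame: half the pixels,
    with the checkerboard parity alternating per frame."""
    x = np.arange(width)[None, :]
    y = np.arange(height)[:, None]
    return (x + y + frame_index) % 2 == 0

mask_even = checkerboard_mask(8, 8, 0)
mask_odd = checkerboard_mask(8, 8, 1)
print(mask_even.sum())               # 32: half of the 64 pixels traced
print((mask_even ^ mask_odd).all())  # True: two frames cover every pixel
```

The untraced half of each frame is then filled from the previous frame's reprojected result, which is why checkerboarding pairs naturally with temporal accumulation.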

Aside from that, optimizations mainly come from targeting the requirements of the feature itself and from being able to remove the elements of the pipeline that it replaces, such as the old GI.

Can you share what you learned about the denoising process throughout production? Any tips for getting good performance out of shadows, reflections, and AO denoisers?

GI (global illumination) denoising is the toughest problem of all the cases you mention, because we frequently have only one bright pixel among thousands of black pixels around it. So, not even comparable. 😊

As for GI:

  • Go multipass; that's the way to handle low sample counts in spatial filtering.
  • Adapt to variance, hit distances, lighting, basically everything you can.
  • Temporally accumulate as much as you can.
  • Do temporal accumulation before denoising to blur out reprojection artifacts.
  • Convert your signal to spherical harmonics and denoise it there. That allows you to reconstruct normal-map details.
  • Don’t trash caches.
  • Optimize your math (yes, denoising can be ALU bound on mighty Turing).

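The temporal accumulation step in the list above can be sketched very simply. This is a toy model, not 4A's denoiser: it blends reprojected history with the current noisy frame using an exponential moving average, assumes a static camera (so reprojection is an identity), and the blend factor `alpha` is a hypothetical choice:

```python
import numpy as np

def temporal_accumulate(history, current, alpha=0.1):
    """Exponential moving average: keep most of the accumulated history
    and blend in a small fraction of the new noisy samples.
    Reprojecting `history` into the current frame is assumed done."""
    return (1.0 - alpha) * history + alpha * current

# Simulate a noisy GI signal converging toward its true value (0.7)
rng = np.random.default_rng(0)
true_value = 0.7
accum = np.zeros((4, 4))  # tiny stand-in for a GI buffer
for _ in range(100):
    noisy = true_value + rng.normal(0.0, 0.5, size=(4, 4))
    accum = temporal_accumulate(accum, noisy)
print(abs(accum.mean() - true_value) < 0.2)  # True: converges near the signal
```

A production denoiser would adapt `alpha` to variance and disocclusion (the "adapt to everything you can" point above), and run its spatial passes after this accumulation so reprojection artifacts are already blurred out.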
What unique challenges might a team face when choosing to add ray tracing to a custom engine?

First, I want to demystify RTX a bit and say that the class of challenges that others will face are challenges that they have faced before, lots of times. If I went back in time a bit to ask what challenges a team would have implementing image-based lighting, or shadow maps, or tessellation, then the answers would be pretty similar: There will be a chunk of time implementing and optimizing the feature, a transition period where artists get their hands on it and figure out how to work with it, some time getting to know the limits of the new feature, some balancing, and some tweaking.

I am hard pressed to call much of the experience of transitioning “unique” because all of it is reminiscent of when engines moved over to physically based lighting. The unique experience of the implementation will be more along the lines of engine-specific concerns:

  • Where to put it in the pipeline?
  • How do we set it up to take advantage of existing assets?
  • Which RTX features best suit our project?

Implementing RTX is generally not something to regard as a huge, scary, monolithic shift in the way your engine works. The API mirrors the way data is already set up in a game engine, and the actual process of integrating it is no worse than any GI implementation and less painful than some.

But if I had to pick one standout for RTX, it's that it will take your filtering and denoising systems to a whole new level. That is the part that will make or break the feature, so be prepared to put a lot of work into it.

Did you ever have the temptation to place environmental lights in different spots in a room – and emit light at different intensities – for the ray-traced version of the game?

Our traditional renderer with RTX off of course includes a vast number of environmental lights to create the illusion of sunlight bouncing around a room. With RTX on, we only need the sun as a single light source and the technology takes care of the rest.

However, when releasing Metro Exodus there were certain very rare, specific, and localized situations where we needed to add something to temporarily fix a bug, as well as edge cases that were not yet supported. We aim to resolve as many of these as possible as we continue to support the game with patches.

How would you encourage a fellow developer to use HairWorks? How has its integration in Metro Exodus made a difference?

If your game needs dynamic hair, go for HairWorks. The time you spend integrating it and creating assets is far less than the time you would spend developing your own solution.

What does Advanced PhysX add to the game’s environmental interaction?

We kept this tech integrated from our previous titles and rely on it primarily for the massive amounts of dynamic particles interacting with the physical world.

Some of the most striking advances from prior Metro games are the facial animations in Exodus. Anna in particular is so expressive. How has your approach to facial animation evolved over time? Is there anything special that you need to consider when lighting human skin, hair, and facial features, to keep from evoking the “uncanny valley” effect… particularly in the ray tracing era?

New face/head tech meant a full rebuild of many technologies and processes from the ground up. Part of this focus was also making sure that our characters supported physically based systems as with the rest of the environmental assets. Doing this set us up perfectly for RTX to “just work” with all our assets.

What segments in Metro Exodus would you recommend people experiment with turning RTX “on” and “off”, to see the difference?

Wake up, go outside, and then go back inside again. The illumination with RTX on comes from the sunlight and from that light bouncing off of nearby objects. Although the improvements to ambient occlusion can be seen deep inside the tunnel and interior sections with careful scrutiny, the most striking changes are clearly visible in the natural lighting and soft, complex shadows that you get at the transition between inside and out.

My favorite area types are overcast outdoors, structures with no roofs and indoors with big openings for sun rays and skylight.

In applying Highlights, how did you choose which events should trigger an automated recording? How has Highlights helped your community share their personal stories?

We decided to split those events into a few basic categories: one for when the player engages in combat, a second related to exploring the game world, and a third for important story moments. At the same time, we tried to follow the NVIDIA guidelines to fit them into the internal ID categories, and it came out naturally.

As for sharing personal stories, the best part is that capturing all the epic and funny situations throughout the game is fully automated, so players won't miss any of those precious moments.


About the contributors

Oles Shyshkovtsov

Oles is CTO of 4A-Games with an emphasis on R&D and graphics. He's worked on AAA titles for almost 20 years, on several generations of consoles and PCs. He's best known for the S.T.A.L.K.E.R. and Metro series of games. He is passionate about doing great-looking things at a great framerate.

Ben Archard

For the past 12 years, Ben has worked for numerous companies in both games and visual effects industries. He has worked with a wide range of platforms including PC; multiple generations of consoles; Oculus, Vive, and mobile VR platforms; and Magic Leap and HoloLens AR platforms. Currently, he works at 4A-Games as a rendering programmer focused on both rendering systems and game visual effects. He is driven by the challenges of an industry in which there is always something new to discover, tinker with, master, and use to make something awesome.

Sergei Karmalsy

Breaking apart and altering DOS games as a kid in the 90s, Sergei turned technical artist with a passion for both next-gen visuals and grand environments in the 00s. Within 4A Games, Sergei became an art director because he deeply respects technology and the never-ending R&D process. Sergei has worked on Metro Exodus, Metro Last Light, Crimecraft, You Are Empty, and Stalker.
