
How Painkiller RTX Uses Generative AI to Modernize Game Assets at Scale

A behind-the-scenes look at rebuilding a classic game with RTX Remix and AI-assisted workflows

Painkiller RTX sets a new standard for how small teams can balance massive visual ambition with limited resources by integrating generative AI. By upscaling thousands of legacy textures into high-quality Physically Based Rendering (PBR) materials—a process that would have traditionally taken years—the team dramatically reduced the burden of repetitive work.

This approach was especially impactful for contributors without traditional modding backgrounds, freeing them to focus on creative decisions: refining materials and ensuring the game’s iconic atmosphere responds correctly to ray-traced lighting. Learn how the team architected a production pipeline that blends automation with artistic judgment across 35 unique levels.

To explore the motivations, solutions, and lessons behind these technical challenges, we spoke with McGillacutty (environment reconstruction and material lead), Quinn Baddams (team lead and founder of Merry Pencil Studios), and NightRaven (creator of PBRFusion).

What’s your professional background and current role?

McGillacutty: My background spans architectural design, technical art, and game analysis, with a focus on real-time environments. I currently work independently, combining teaching and technical client work with development on RTX Remix projects like Painkiller RTX. My role centers on environment reconstruction, material authoring, and building AI-assisted asset pipelines.

Quinn Baddams: My career has focused on building and optimizing complex systems—first in business strategy and digital infrastructure, and more recently in computer graphics. I’m currently studying computer science with a focus on AI and machine learning, which directly informs my work as team lead on Painkiller RTX and founder of Merry Pencil Studios. I apply systems thinking to architect our production pipeline and integrate generative AI as a practical solution to problems of scale.

NightRaven: I am currently a system engineer handling everything from full-stack automation to administering VMware and cloud environments.

What made you want to become an RTX Remix modder, and what brought you to Painkiller?

McGillacutty: I came to RTX Remix a year ago from a visual and architectural perspective, without any modding background. When Quinn showed me Painkiller’s towering gothic interiors, I immediately saw how well they would lend themselves to ray-traced lighting—stained glass, stone, metal, and deep interior spaces. RTX Remix offered a way to renovate those environments by rebuilding the materials so the lighting could finally behave realistically, which pulled me straight into the project.

Quinn Baddams: I’ve been interested in computer graphics and technical art since the early days of 3D accelerator cards like Voodoo and TNT. At the time, real-time ray tracing felt like something we might see far in the future, but advances in denoising and technologies like NVIDIA DLSS made it viable much sooner than expected.

RTX Remix naturally pulled me in. I’ve always found physically based rendering principles satisfying, and path tracing fits that mindset well. After experimenting with several games with varying levels of compatibility, Painkiller stood out. It has solid mod support, an active community, and it was also one of my favorite games back in the GeForce 2 GTS era.

You’re among the early adopters to use generative AI to rebuild textures and materials at scale. How did you use models like PBRFusion to convert low-resolution assets into high-quality PBR materials?

McGillacutty: With minimal texture reuse across 35 levels, manually rebuilding thousands of materials simply wasn’t feasible for a small team. PBRFusion became the backbone of our pipeline, allowing us to batch-convert large sets of legacy textures into a usable PBR baseline at unprecedented scale.

The model automatically generated base color, normal, roughness, and height maps, which let us bring entire levels into a physically based context in a fraction of the time. Coming into modding without a traditional background, this AI-driven approach was critical—it removed the friction of repetitive work and let me focus on creative decisions, like refining materials, preserving the game’s iconic atmosphere, and ensuring everything responded correctly to ray-traced lighting.
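
As a rough sketch of what that batch step looks like, the Python below walks a folder of legacy textures and writes out the four PBR maps with consistent suffixes. The directory layout and the generate_pbr_maps function are assumptions standing in for PBRFusion's actual interface, which isn't documented here; the placeholder simply copies the input so the script runs end to end.

```python
# Hypothetical batch-conversion sketch. The point is the pipeline shape:
# walk the legacy textures once, emit base color / normal / roughness /
# height maps with consistent suffixes that downstream tools can rely on.
from pathlib import Path
from PIL import Image, ImageOps

LEGACY_DIR = Path("textures/legacy")        # assumed layout
OUTPUT_DIR = Path("textures/pbr_baseline")  # assumed layout
MAP_SUFFIXES = ("basecolor", "normal", "roughness", "height")

def generate_pbr_maps(image):
    """Stand-in for the AI inference step (PBRFusion or similar).

    Returns a dict of map name -> PIL image. Here it just derives trivial
    placeholders from the input so the script runs; the real step would
    call the model.
    """
    gray = ImageOps.grayscale(image).convert("RGB")
    return {
        "basecolor": image.copy(),
        "normal": image.copy(),
        "roughness": gray,
        "height": gray,
    }

def convert_all():
    OUTPUT_DIR.mkdir(parents=True, exist_ok=True)
    for src in sorted(LEGACY_DIR.glob("*.png")):
        legacy = Image.open(src).convert("RGB")
        maps = generate_pbr_maps(legacy)
        for suffix in MAP_SUFFIXES:
            maps[suffix].save(OUTPUT_DIR / f"{src.stem}_{suffix}.png")

if __name__ == "__main__":
    convert_all()
```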

Quinn Baddams: PBRFusion makes it possible to batch-upscale an entire project’s textures and quickly generate normal, roughness, and height maps, which is an excellent starting point. That said, convincing results still require material-by-material judgment.

Many surfaces don’t benefit from height maps at all, while others, especially metals, require much more careful treatment. Most metallic materials in Painkiller RTX were hand-crafted. Glass, transparent surfaces, and skin also needed custom values and maps, particularly for subsurface scattering.

Hero materials received additional attention using a mix of techniques, including blending CC0 PBR materials, AI-assisted generation, and procedural workflows in tools like InstaMAT Studio. AI provided the baseline, but traditional material authoring was essential for achieving quality and control.

What got you interested in generative AI, and what motivated you to fine-tune a model for RTX Remix?

McGillacutty: Scale was the primary driver. With thousands of textures spread across 35 levels, rebuilding materials by hand would have been impractical for a small team. I was already using generative AI for rapid iteration and visual exploration in other design contexts, so adapting it for RTX Remix felt like a natural extension.

Fine-tuning a model gave us a way to process large volumes of stylized legacy textures efficiently while maintaining cohesion across levels. Instead of treating each asset as an isolated problem, AI helped establish a consistent material baseline that we could then refine artistically.

Quinn Baddams: My interest came from a practical, production-focused curiosity. While experimenting with asset pipelines, I noticed a clear technical gap: there were no generative AI models tailored to the specific challenges of game development, particularly removing baked lighting and shadows from legacy textures, which is a major obstacle when converting assets to PBR.

That problem overlapped directly with my academic focus on AI and machine learning. RTX Remix provided a real-world production environment where I could bridge that gap by fine-tuning models to solve an actual pipeline bottleneck, turning research into something that directly addressed Painkiller’s scale.

NightRaven: RTX Remix was my entry point into generative AI. It was exciting to see older games brought back to life with modern rendering, and while learning how to mod with Remix, it quickly became clear that high-quality PBR materials are one of the biggest factors in making path tracing work.

I started using the available PBR generation tools, but I wasn’t satisfied with the results. Despite having no formal background in AI, I decided to build my own solution, which became PBRFusion. It went through three major iterations and more than a thousand hours of work to reach version 3—the version used in Painkiller RTX. One of my goals was also to lower the barrier to entry for RTX Remix, making it easier for more creators to experiment and contribute.

Why was it important for your texture pipeline to blend AI-generated outputs with traditional hand-crafted work, rather than relying on a single approach?

McGillacutty: It comes down to scalable quality. AI-generated outputs were essential for handling the sheer volume of assets and establishing a consistent visual baseline across the project, but they’re not a substitute for artistic judgment. The manual refinement phase is where we pushed quality further and preserved Painkiller’s distinct character.

That’s where we reinterpreted ambiguous source textures, corrected materials that broke under physically accurate lighting, and made intentional creative decisions. This hybrid approach allowed us to automate roughly 80% of the repetitive work, so we could focus human effort on the 20% that ultimately defines the project’s quality and vision.

Quinn Baddams: AI-generated roughness, normal, and height maps provide a strong starting point, but they often require adjustment to achieve physically accurate results. Correct values can be very specific, and many materials need manual tweaks or custom painting informed by real-world PBR references.

Painkiller also relies heavily on texture atlases, which can confuse AI models when a single texture contains multiple unrelated surfaces. Blending AI automation with hand-crafted work let us remove most of the repetitive busywork while maintaining precise control over both artistic intent and physical accuracy.
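
One common mitigation for the atlas problem is to run the AI pass on sub-regions so the model never sees unrelated surfaces in a single frame, then stitch the results back together. The sketch below is an illustration of that idea, not the team's tooling; a fixed grid stands in for tile bounds that would really come from the game's UV layout.

```python
# Process an atlas in tiles so each inference call sees one coherent surface.
from PIL import Image

def process_atlas(atlas, tile_size, enhance):
    """Run `enhance` on each sub-region independently, then stitch the results.

    `enhance` is assumed to return a region of the same size (e.g., a
    de-light or roughness pass rather than an upscale).
    """
    out = Image.new(atlas.mode, atlas.size)
    for top in range(0, atlas.height, tile_size):
        for left in range(0, atlas.width, tile_size):
            box = (left, top,
                   min(left + tile_size, atlas.width),
                   min(top + tile_size, atlas.height))
            out.paste(enhance(atlas.crop(box)), box)
    return out
```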

NightRaven: PBRFusion was always intended to be a tool, not a drop-in replacement for material creation. I’m glad the Painkiller team approached it that way—using the tool to accelerate their workflow rather than treating it as a crutch.

Because the model isn’t perfectly accurate, especially for roughness generation, it will get things wrong. Human verification and adjustment are essential to ensure materials behave correctly under physically based and path-traced lighting.

How did you maintain a consistent style and quality bar across more than 35 levels while integrating AI-generated content?

McGillacutty: Consistency at that scale required defining constraints early and treating AI output as a baseline system rather than as individual, isolated assets. PBRFusion’s content-consistent super-resolution produced cohesive results across large material sets, which helped establish a shared visual language for the project.

We regularly evaluated materials in context using in-engine captures, then iterated so that both AI-generated materials and hand-crafted hero assets reinforced the same style and quality bar.

Quinn Baddams: We set a small number of core guidelines early on. Small or distant textures weren’t upscaled unnecessarily, height maps were limited to large, flat surfaces, and roughness maps were treated as a primary driver of perceived material quality.

We referenced real-world PBR materials to validate roughness values and paid close attention to how albedo maps behave in a physically based workflow. In practice, consistency was achieved largely by reviewing and adjusting roughness maps to ensure materials behaved as intended under lighting.
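
Guidelines like these are easiest to keep consistent when they are encoded as simple pipeline rules rather than remembered per asset. The sketch below shows one possible encoding; the size cutoff and the large_flat_surface flag are illustrative assumptions, not the team's actual values.

```python
# Encode per-texture rules so every asset is planned the same way.
from dataclasses import dataclass

@dataclass
class TextureInfo:
    name: str
    width: int
    height: int
    large_flat_surface: bool  # e.g., floors and walls flagged during review

MIN_UPSCALE_SIZE = 256  # assumed cutoff; small or distant textures are left alone

def plan_maps(tex: TextureInfo) -> dict:
    """Decide which maps to generate for a texture under the project rules."""
    return {
        "upscale":   max(tex.width, tex.height) >= MIN_UPSCALE_SIZE,
        "height":    tex.large_flat_surface,  # height maps only on large, flat surfaces
        "roughness": True,                    # always generated, always hand-reviewed
        "normal":    True,
    }

# Example: a 128x128 decal gets no upscale and no height map.
print(plan_maps(TextureInfo("decal_blood_01", 128, 128, False)))
```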

The materials and textures now react much more realistically to light. How did you rethink your material, texture, and lighting workflows to achieve that result across so many environments?

McGillacutty: Introducing physically based lighting into Painkiller meant rethinking the entire relationship between materials and light. The game’s environments are abstract and otherworldly, designed around dramatic contrast rather than realism, so simply adding realistic lighting wasn’t enough.

We started by stripping baked lighting information from the original textures, then rebuilt contrast intentionally through material definition—using grime, surface variation, and physically meaningful roughness values. That way, the drama came from correct interactions with light rather than painted-in shadows.

All lighting in Painkiller RTX was hand-tuned at the scene level, which allowed us to carefully shape mood and composition across each environment while still preserving the game’s signature atmosphere.

Quinn Baddams: We took an iterative approach and learned early on that incorrect lighting responses couldn’t be fixed by simply adjusting texture brightness. The original game relied heavily on baked shadows, which added contrast that no longer made sense in a PBR workflow.

After removing that baked lighting, we reintroduced contrast through roughness variation, stronger normal maps, and controlled self-shadowing. Standardizing physically plausible light values across scenes was also critical to achieving consistent, believable results.
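
The team's de-lighting is AI-driven, but a crude classical approximation makes the goal concrete: estimate the low-frequency shading baked into a texture (here, a heavy blur of its luminance) and divide it out so painted-in light and shadow stop fighting the path tracer. This is only an illustration of the idea, not the production approach.

```python
# Crude, classical stand-in for de-lighting a legacy texture.
import numpy as np
from PIL import Image, ImageFilter

def flatten_baked_shading(path, blur_radius=32):
    """Divide out a blurred luminance estimate to suppress baked light and shadow."""
    img = Image.open(path).convert("RGB")
    rgb = np.asarray(img, dtype=np.float32) / 255.0
    # Low-frequency shading estimate: heavily blurred luminance.
    luminance = rgb @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)
    blurred = Image.fromarray((luminance * 255).astype(np.uint8)).filter(
        ImageFilter.GaussianBlur(blur_radius)
    )
    shading = np.asarray(blurred, dtype=np.float32) / 255.0
    # Dividing by the shading flattens painted-in lighting; clip to avoid blowups.
    albedo = rgb / np.clip(shading[..., None], 0.05, 1.0)
    return Image.fromarray((np.clip(albedo, 0.0, 1.0) * 255).astype(np.uint8))
```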

Full-scene path tracing, volumetric lighting, and advanced techniques all work together in Painkiller RTX. How did you combine these systems to shape the game, and what did each contribute that you couldn’t get from more traditional rendering?

McGillacutty: Full-scene path tracing and volumetric lighting fundamentally changed how materials behaved, which meant material work had to be developed in close alignment with lighting. While lighting and volumetrics were handled by the team lead, my role was to ensure materials responded correctly once those systems were in place.

Path tracing exposed properties like roughness, reflectivity, and wetness far more clearly than traditional rendering ever could. In areas with rain or fog, I adjusted materials to include puddles and surface ripples so they would interact believably with volumetrics and moving light.
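
As a generic illustration of that kind of adjustment (not the team's exact values), wet areas typically darken the base color slightly and drop roughness sharply, which is what makes puddles read correctly under path-traced light. The sketch below blends those changes in with a 0-to-1 wetness mask.

```python
# Generic wetness blend over existing PBR maps; all constants are illustrative.
import numpy as np

def apply_wetness(base_color, roughness, wet_mask, darken=0.85, wet_roughness=0.05):
    """Blend a wet look into existing maps using a 0..1 wetness mask."""
    c = np.asarray(base_color, dtype=np.float32)  # H x W x 3
    r = np.asarray(roughness, dtype=np.float32)   # H x W
    m = np.asarray(wet_mask, dtype=np.float32)    # H x W, 0 = dry, 1 = soaked
    new_color = c * (1.0 - m[..., None] * (1.0 - darken))  # slight darkening when wet
    new_rough = r * (1.0 - m) + wet_roughness * m           # near mirror-like where soaked
    return new_color, new_rough
```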

A great example of this is RTX Skin, particularly on characters and semi-translucent surfaces like marble. For assets such as the nun or the lab fish, RTX Skin allows light to genuinely scatter through the surface. Whether it’s haggard skin or gelatinous flesh, that subsurface scattering creates a sense of depth that simple surface highlights can’t achieve.

RTX Skin has been an extremely helpful tool. It’s allowed me to make these characters feel like tangible, physical parts of the ray-traced world we’re building. It’s especially rewarding to see a game from 2004 transformed to such an extent.

Quinn Baddams: Full-scene path tracing fundamentally changed how lighting and materials interacted, exposing inaccuracies that would have been hidden in traditional rendering. Volumetric lighting added depth and atmosphere, particularly in large interior spaces. While traditional techniques can approximate these effects, path tracing and volumetrics allow light to behave consistently across the entire scene.

RTX Skin was a major part of making all of this work together. For a project rebuilding a classic game, it solved two important problems. First, it allowed us to get far more out of our low-detail character models. The mesh geometry is exactly the same, but RTX Skin makes it appear significantly more detailed. A lot of that comes from the normal maps generated through PBRFusion, while RTX Skin itself helps smooth sharp edges, making low-poly geometry appear denser and less jagged.

Second, and more importantly, it gave us true artistic control over subsurface scattering for the first time in a real-time pipeline. You can define exactly how much light scatters through a surface and how its color changes as it does. We used this on the wings of the demon Alastor, where the internal veins are only visible because of RTX Skin—an effect we didn’t consider possible before.

To my knowledge, this level of ray-traced subsurface scattering hasn’t been available to game developers in a practical, real-time way. It was previously limited to offline rendering. Having it available through RTX Skin is fantastic—not just as a technical leap, but because it’s genuinely enjoyable to work with. We’re only scratching the surface of what’s possible.
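
RTX Skin's internal model isn't reproduced here, but the color shift Quinn describes follows from the classic Beer-Lambert relationship: each channel is absorbed at a different rate, so transmitted light changes color with thickness, and thin regions such as ears or wing membranes glow warmer than thick ones. The snippet below is a minimal illustration of that relationship only, with made-up absorption values.

```python
# Minimal Beer-Lambert illustration of thickness-dependent color shift.
import numpy as np

def transmitted_color(light_rgb, absorption_per_mm, thickness_mm):
    """Per-channel transmission: T = exp(-sigma_a * d)."""
    light = np.asarray(light_rgb, dtype=np.float64)
    sigma = np.asarray(absorption_per_mm, dtype=np.float64)
    return light * np.exp(-sigma * thickness_mm)

# Illustrative absorption values only: red passes through most easily,
# so thin, backlit regions take on a warm tint.
print(transmitted_color([1.0, 1.0, 1.0], [0.2, 0.7, 0.9], 2.0))   # thin membrane
print(transmitted_color([1.0, 1.0, 1.0], [0.2, 0.7, 0.9], 10.0))  # thick flesh
```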

For developers inspired by Painkiller RTX who want to take a first step toward similar visuals, which features or workflows would you recommend experimenting with first?

McGillacutty: My advice is to start with a simple, focused artistic goal. Don’t try to rebuild an entire level. Instead, capture a single, iconic scene and concentrate on the relationship between a few key materials and the lighting.

Use the RTX Remix Toolkit to replace the original textures with basic PBR materials, then iterate using path tracing and lighting tools. Once you understand that core dialogue between materials and light, you can introduce AI tools like PBRFusion. Used this way, AI becomes a rapid iteration engine—letting you test different visual hypotheses within the same scene.

Quinn Baddams: Start with the RTX Remix Toolkit itself. Capture a scene, apply basic materials, and begin experimenting with lighting and path tracing to understand how they interact.

The RTX Remix community is also an important resource, with shared tools, scripts, and active support. Most importantly, experiment freely—hands-on iteration is the fastest way to build intuition for these workflows.

How do you think generative AI has changed modding and game development, and what tools are you looking forward to next?

McGillacutty: As someone relatively new to modding, the biggest change I’ve seen is accessibility. Generative AI dramatically reduces the time and technical overhead required to experiment, iterate, and ship meaningful work. This opens development to creators from a wider range of backgrounds. 

For my next project, I’m looking forward to more advanced material and geometry tools and AI-assisted workflow scripting.

Quinn Baddams: Generative AI represents a paradigm shift—away from memorizing systems toward creating with a support layer that understands them. AI acts as both a tutor and a problem-solving partner. 

I’m particularly interested in further advances in AI-assisted asset cleanup and using retrieval-augmented generation to work with undocumented legacy codebases.

NightRaven: I’m already nearly finished with the next version of PBRFusion, which I hope will be a great benefit to the modding community.

Join us at GDC

Join us at GDC to explore how NVIDIA RTX neural rendering and AI are shaping the next era of gaming. Get a glimpse into the future of game development with John Spitzer, Vice President of Developer and Performance Technology at NVIDIA, as he unveils the latest innovations in path tracing and generative AI workflows.

Then, join Bryan Catanzaro, Vice President of Applied Deep Learning Research at NVIDIA, for an interactive “Ask Me Anything” session covering the latest trends in AI. Along with two full days of additional sessions, these events offer a front-row seat to the technologies enabling new kinds of player experiences.

Resources for game developers

See our full list of game developer resources here and follow us to stay up to date with the latest NVIDIA game development news.
