NVIDIA DLSS 5 is the biggest graphics leap since ray tracing



Jensen Huang called DLSS 5 “the GPT moment for graphics.” That’s a massive claim. But when you look at what this technology actually does, the comparison starts to make sense.

NVIDIA announced DLSS 5 on March 16, 2026, and it’s genuinely different from anything we’ve seen in real-time games before. Previous versions of DLSS were about performance: generating synthetic pixels to boost frame rates. DLSS 5 goes somewhere else entirely. It uses a neural rendering model, running in real time, to add photoreal lighting and materials to game scenes that were never rendered that way to begin with.

A side-by-side game screenshot comparison showing DLSS OFF vs DLSS 5 ON enhanced visuals, with dramatically improved subsurface skin scattering and realistic lighting on a character’s face

How the AI model actually works

DLSS 5 takes two things from every game frame: color data and motion vectors. From those two inputs, the AI model reads the entire scene. It determines whether a character is front-lit or backlit, how skin should scatter light from within, how hair catches a glint, how fabric holds a sheen. Then it generates new pixels, grounded in the game’s original 3D world and consistent from frame to frame.
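NVIDIA hasn’t published an API for this, so here’s a toy sketch of the per-frame data flow the article describes: a color buffer and motion vectors go in, and temporal reprojection of the previous output keeps results stable frame to frame. Every function and variable name here is an assumption for illustration; the placeholder blend stands in for the actual neural model.

```python
import numpy as np

H, W = 4, 4  # tiny frame for demonstration

def enhance_frame(color, motion, prev_enhanced):
    """Toy stand-in for the neural pass: reproject the previous enhanced
    frame along motion vectors, then blend with the current color buffer.
    A real model would generate new lighting/material detail instead."""
    ys, xs = np.mgrid[0:H, 0:W]
    # Reproject: look up where each pixel came from in the previous frame.
    src_y = np.clip(ys - motion[..., 1].astype(int), 0, H - 1)
    src_x = np.clip(xs - motion[..., 0].astype(int), 0, W - 1)
    reprojected = prev_enhanced[src_y, src_x]
    # Heavy weighting toward history is what keeps the output stable
    # across frames instead of flickering like a raw generative model.
    return 0.9 * reprojected + 0.1 * color

color = np.full((H, W, 3), 0.5)   # current frame's color buffer
motion = np.zeros((H, W, 2))      # per-pixel motion vectors (static scene)
prev = np.full((H, W, 3), 0.5)    # previous enhanced frame

out = enhance_frame(color, motion, prev)
print(out.shape)  # (4, 4, 3)
```

The point of the sketch is the input contract, not the model: only color and motion vectors cross the boundary, which is why the output stays anchored to the game’s source content.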

That last part matters more than people realize. Video AI models can generate photoreal images, but they’re unpredictable. You can’t ship a game where the lighting looks different every time you look at a wall. DLSS 5 is deterministic. Its outputs are anchored to the game’s source content and stay stable across frames. It runs at up to 4K resolution, fast enough for interactive gameplay.

For context: DLSS 4.5, released at CES earlier this year, already used AI to draw 23 out of every 24 pixels you see on screen. That was about quantity. DLSS 5 is about quality, specifically closing the gap between a 16-millisecond game frame and a Hollywood VFX frame that can take hours to render on a render farm.
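A quick back-of-the-envelope check of that 23-in-24 figure, assuming the ratio applies uniformly to a 4K frame:

```python
# "AI draws 23 of every 24 pixels" => 1/24 are traditionally rendered.
total_4k = 3840 * 2160            # 8,294,400 pixels per 4K frame
rendered = total_4k // 24         # pixels the GPU renders conventionally
generated = total_4k - rendered   # pixels drawn by the AI

print(rendered)   # 345600 -- roughly a 720x480 image's worth
print(generated)  # 7948800
```

In other words, the conventionally rendered share of a 4K frame under DLSS 4.5 is about the size of a standard-definition image; everything else is already synthetic.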


Which games are getting it

DLSS 5 arrives Fall 2026. The confirmed lineup includes Assassin’s Creed Shadows, Hogwarts Legacy, Starfield, Resident Evil Requiem, The Elder Scrolls IV: Oblivion Remastered, Phantom Blade Zero, Delta Force, and more. Publisher support is wide: Bethesda, CAPCOM, Ubisoft, Warner Bros. Games, Tencent, and NetEase are all confirmed partners.

Todd Howard said of getting DLSS 5 running in Starfield that it was “amazing how it brought it to life.” Ubisoft’s Charlie Guillemot said it changed “what we can promise to players” for Assassin’s Creed Shadows. Studios don’t reach for quotes like that for features they’re lukewarm about. Something real is happening here.

Assassin’s Creed Shadows with DLSS 5 OFF (left) vs ON (right). The difference in micro-shadowing, light scattering, and ambient occlusion is visible even at a glance

The catch, and why it still matters

DLSS 5 is RTX hardware only. If you’re on an older card, none of this applies to you right now. The first public preview is happening at GTC this week, with the full release still months away.

That said, NVIDIA gave developers real control over the output: intensity sliders, color grading tools, and masking options so artists can decide exactly where the AI applies its effects. That’s the right design. It’s why major studios are integrating it rather than keeping their distance from a black box they can’t steer.
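Here’s a minimal sketch of how the controls the article mentions could compose: a global intensity slider plus a per-pixel artist mask gating where the AI’s output lands. None of these names come from NVIDIA’s SDK; the blend formula is an assumption for illustration.

```python
import numpy as np

def apply_controls(base, model_out, intensity, mask):
    """Blend the neural output into the base render, scaled by a global
    intensity slider (0..1) and a per-pixel artist mask (0..1)."""
    weight = intensity * mask[..., None]   # broadcast mask over RGB
    return base + weight * (model_out - base)

base = np.zeros((2, 2, 3))        # original render
model_out = np.ones((2, 2, 3))    # fully "enhanced" output
mask = np.array([[1.0, 0.0],
                 [0.5, 1.0]])     # artist masks out the top-right pixel

result = apply_controls(base, model_out, intensity=0.8, mask=mask)
print(result[0, 0, 0])  # 0.8  (full mask, 80% intensity)
print(result[0, 1, 0])  # 0.0  (masked out entirely)
```

Whatever the real API looks like, this is the design property that matters: the AI’s contribution is a continuous blend artists can dial down to zero per region, not an all-or-nothing switch.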

DLSS has shipped in over 750 games since 2018. If DLSS 5 sees even a fraction of that adoption, the average PC game in 2027 will look noticeably different from what we have today. Not because GPUs got faster, but because AI learned to render what hardware alone never could.
