The RTX 5090 is the best graphics card I've ever owned — but its big new feature disappoints
Nvidia’s AI-powered frame generation tech brings shame and pain

Just as Terminator 2: Judgment Day predicted back in ye olden days of 1991, the future belongs to AI. That could be a problem for Nvidia RTX 50 GPUs, even when it comes to the best consumer graphics card money can buy.
I was ‘fortunate’ enough to pick up an Nvidia GeForce RTX 5090 a couple of months ago. I use those semi-joking quote marks because I merely had to pay $650 over MSRP for the new overlord of GPUs. Lucky me.
Before you factor in the 5090’s frame-generating AI voodoo (which I’ll get to), it’s important to give credit to Team Green for assembling an utterly beastly piece of silicon. Around 30% more powerful than the RTX 4090 — the previous graphics card champ — it’s an astonishing piece of kit, and there’s no denying it.
Whether you’re gaming on one of the best TVs at 120 FPS or one of the best gaming monitors at 240 FPS and above, the RTX 5090 has been designed for the most ludicrously committed hardcore gamers. And wouldn't you know it? I just happen to fall into this aforementioned, horribly clichéd category.
Frame game
The main selling point of Nvidia’s latest flagship product is DLSS 4’s Multi Frame Generation tech. Taking advantage of sophisticated AI features, Nvidia’s RTX 50 cards are capable of serving up blistering frame rates that simply can’t be achieved through brute force hardware horsepower.
Multi Frame Generation — and I promise that’s the last time I capitalize Team Green’s latest buzz phrase — feels like the biggest (and most contentious) development to hit the PC gaming scene in ages. The tech has only been out for a few months and there are already over 100 titles that support Nvidia’s ambitious AI wizardry.
How does it work? Depending on the setting you choose, an additional 1-3 AI-driven frames of gameplay will be rendered for every native frame your GPU draws. This can lead to colossal onscreen FPS counts, even in the most demanding games.
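If you want to put rough numbers on that, the sketch below is illustrative arithmetic only, not Nvidia’s actual rendering pipeline: it simply shows how the multiplier inflates the on-screen FPS counter. The catch, which matters later, is that only the natively rendered frames reflect fresh input; the AI frames are generated in between them.

```python
# Illustrative arithmetic only: how a frame-gen multiplier inflates the
# on-screen FPS counter. This is not Nvidia's pipeline, just the basic math
# implied by "1-3 generated frames per native frame".

def displayed_fps(native_fps: float, multiplier: int) -> float:
    """multiplier: 1 = frame gen off, 2 = x2, 3 = x3, 4 = x4."""
    return native_fps * multiplier

def native_frame_time_ms(native_fps: float) -> float:
    """Time between natively rendered frames, which are the only ones that
    can react to a fresh button press."""
    return 1000.0 / native_fps

if __name__ == "__main__":
    native = 30.0  # e.g. a heavy path-traced scene the GPU renders at 30 FPS
    for mode in (1, 2, 3, 4):
        print(f"x{mode}: {displayed_fps(native, mode):.0f} FPS on screen, "
              f"a real frame every ~{native_frame_time_ms(native):.0f} ms")
```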
Doom: The Dark Ages, Hogwarts Legacy, Microsoft Flight Simulator 2024, Cyberpunk 2077 — some of the most graphically intense titles around can now be played at incredibly high frame rates with full ray tracing engaged. That’s mainly thanks to multi frame generation.
Just how high are we talking? On my RTX 5090, I can comfortably hit a locked 120 FPS at 4K with max settings, providing Nvidia DLSS is enabled. That figure is limited by my LG G3 OLED’s max 120Hz refresh rate. When I hook my rig up to my 240Hz Samsung Odyssey G9 OLED super ultrawide monitor, some of the games above can be played at over 200 FPS.
There is a catch, though. And said stumbling block is as sizable as a certain silver screen ape that clambered to the top of the Empire State Building. That ended well, right?
Yes, the scarcely believable frame rates my third-party RTX 5090 is able to achieve are a lot cheerier than the finale of King Kong. Yet that doesn’t mean the best graphics card in the world doesn’t have to face its own version of pesky biplanes.
A mixed lag
Despite my frame rate counter showing seriously impressive numbers, the in-game experiences often don’t feel as smooth as you’d expect. It’s not unfair to expect 120 FPS gameplay to be super slick, and when all your frames are rendered natively by your GPU, it normally is. Sadly, that’s not quite the case with multi frame generation.
As much as I’ve tried to resist, I’ve become increasingly obsessed with the excellent Nvidia app (and, more specifically, its statistics overlay) while messing around with multi frame gen of late. These stats let you monitor FPS, GPU and CPU usage, and, most crucially for me, latency.
Also known as input lag, latency measures, in milliseconds, the time it takes a game to register the press of a button on one of the best PC game controllers or the click of a key or mouse. If your latency is high, movement is going to feel sluggish, regardless of how lofty your frame rate is. Generally speaking, I find input lag of 70 ms and above pretty hard to stomach.
I’ve mostly been playing around with Team Green’s multi frame gen features in Doom: The Dark Ages, Indiana Jones and the Great Circle, Cyberpunk 2077: Phantom Liberty and the recent, extra demanding Half-Life 2 RTX demo. To say the results have been mixed would be akin to describing Godzilla as “above average height”.
Cyberpunk 2077 actually fares pretty well when it comes to balancing input lag and big frame rate numbers. At the maximum x4 multi frame gen setting, I generally float around 64-78 ms of latency in 4K (3840 x 2160) at 120 FPS with all settings maxed out and full path tracing enabled — more on that shortly.
For a game that hardly requires lightning reactions, those latency measurements feel just about acceptable to me. Knock multi frame generation down to x3 and input lag drops to around 55-65 ms while cruising around Night City, still hitting a locked 120 FPS, which feels reasonably responsive. At x2 frame gen, latency of around 50 ms feels even better, albeit with the big caveat that I drop down to 90 FPS. And with frame generation turned off completely? You’re looking at 40 ms of lag with a nosedive to 50 FPS.
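That pattern tracks with some simple back-of-the-envelope math (my own rough estimate, not Nvidia’s figures): at a locked display rate, a higher multiplier means fewer real frames are being rendered underneath, so the game has fewer chances per second to react to your inputs.

```python
# Rough estimate, not measured data: how many natively rendered frames sit
# underneath each of my Cyberpunk 2077 results above. Actual latency also
# includes game, render-queue and display lag, which this ignores.

results = [("x4", 120, 4), ("x3", 120, 3), ("x2", 90, 2), ("off", 50, 1)]

for label, shown_fps, multiplier in results:
    native_fps = shown_fps / multiplier
    print(f"{label}: {shown_fps} FPS displayed -> ~{native_fps:.0f} native FPS "
          f"({1000 / native_fps:.0f} ms between real frames)")
```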
In the case of Cyberpunk, I’d say x3 frame gen hits the sweet spot between responsiveness and in-game smoothness. It’s not a fast-paced shooter, so a little added latency is worth sacrificing for a locked 4K/120 FPS experience.
Speaking of games that do require more nimble reactions, Doom: The Dark Ages can produce multi frame generation results that feel downright awful. Despite being well optimized overall and even with Nvidia Reflex low latency mode turned on, controlling the Doom Slayer during his medieval murder quest can feel like wading through a sea of space soup.
At the x4 and x3 multi frame gen settings, the action is outright ghastly. With Nvidia’s AI tech maxed out, latency never once dips below an unplayable 110 ms on my rig. Turn frame gen off, though, and a card like the 5090 can still hand in 4K/120 FPS, with latency dropping to a slick and responsive 20 ms. The higher frame generation presets may look smooth in motion, yet they feel massively heavy with a controller in your hands.
Next up is Indy’s latest adventure. The Great Circle might be a breezy, enjoyable action-adventure, but it’s definitely not the best poster boy for multi frame generation. At the amusingly stupid ‘Very Ultra’ preset in 4K with every setting maxed out and path tracing cranked up, latency lands at a super sluggish 100 ms and above with x4 frame gen enabled.
If you own one of the best gaming PCs and want to enjoy a rich ray traced experience with acceptable input lag at responsive frame rates, I suggest going for the x2 frame gen setting. At this level, I find latency hovers between the mid-30s and low-40s of milliseconds, which feels as snappy as one of the explorer’s legendary whip lashes.
Even though it's over 20 years old, it's Gordon Freeman’s path traced Half-Life 2 RTX demo that produces the worst results on my gaming PC. Movement feels utterly shocking with multi frame gen set to either the x4 or x3 setting. I’m talking ‘150 ms of latency’ levels of shocking. Even cutting through Headcrabs in the shooter’s legendary Ravenholm level at 120 FPS using one of the best gaming mice is horribly sluggish.
It’s only by turning Nvidia’s latest tech off entirely that torpedoing zombies with buzzsaws fired from Gordon’s gravity gun feels playable again. With frame gen disabled, my 5090-powered PC was able to achieve just 30 ms of latency consistently as frame rates fluctuated between 60-75 FPS. And if all of the inconsistent frame rates above are making you queasy, I can assure you they never bothered me thanks to the combination of my display’s FPS-smoothing G-Sync and VRR (Variable Refresh Rate) features.
Path life
You’d probably think the big takeaway from my multi frame generation experiments would be ‘disable multi frame gen’ at this point, am I right? In the here and now, most definitely. Yet in the future, the most graphically exciting development in PC gaming for years will most likely demand you use DLSS 4’s x4 or x3 AI frame-generating settings to maintain high frame rates.
That feature is the aforementioned path tracing. Essentially the ‘pro level’ form of ray tracing, this lighting algorithm can produce in-game scenes that look staggeringly authentic. The two best current examples of the technology being deployed to eye-arousing effect I’ve come across are Cyberpunk and Doctor Jones’ enjoyable romp.
I wasn’t surprised that path tracing floored me in CD Projekt Red’s seedy yet sensational open-world — I was messing with its path traced photo mode long before DLSS 4 arrived. The quality of the effect cranked to the max in The Great Circle knocked my socks off, though.
That stunning screenshot a few paragraphs above is from the game’s second level, set in Indy’s Marshall College. During a segment where Jones and his vexed buddy Marcus search for clues following a robbery, path tracing gets to really flex its muscles in a sun-dappled room full of antiquities. Dropping down to the highest form of more traditional ray tracing, I was genuinely shocked at just how much more convincing the path traced equivalent looked.
It kinda pains me to think I’m probably going to have to lean on multi frame generation going forward if I’m to maintain 4K, high frame rate experiences in games that support path tracing. As the technology matures, I really hope Nvidia finds ways to reduce latency without massively compromising on the speedy FPS performance its latest AI experiment targets.
Seeing as the launch of the RTX 50 range has gone as smoothly as a dinner party for chickens organised by The Fantastic Mr Fox, I have no problem stating that if you own a 40 series GPU (especially an RTX 4080 or 4090), you should stick with your current card. Even if you’ve been hyped for multi frame generation, know that it’s nowhere near effective enough at the moment to be worth upgrading your GPU for.
The most damning aspect of DLSS 4’s multi frame gen performance is that it’s actually producing worse in-game experiences than you get with DLSS 3’s x2 frame gen setting. Based on my time with the titles I’ve mentioned, the lowest level of this frame-boosting tech hits the best balance between reasonable latency and stutter-free gameplay. Considering Nvidia first launched the DLSS 3 version back in October 2022, and you can enjoy it on last-gen GPUs, it’s not a great advert for DLSS 4 and its latest AI ace in the hole.
The iconic computing company’s new artificial intelligence model might be 40% faster than the previous iteration, but that doesn’t mean multi frame generation feels satisfying in motion in its current state.
I don’t want to end on a total downer though, so I’ll give DLSS 4 credit where it’s due. Multi frame gen undeniably reeks of the Emperor’s New Clothes at present and that’s disappointing. However, Nvidia’s latest form of supersampling and its new Transformer model deliver considerably better anti-aliasing while being less power-hungry than the existing Legacy edition.
My fondness for the RTX 5090 is only matched by Hannibal Lecter’s delight in chowing down on human livers. Probably. If you’re on the fence about the latest wave of Nvidia GPUs though, don’t let multi frame generation sway a potential purchasing decision.

Dave is a computing editor at Tom’s Guide and covers everything from cutting edge laptops to ultrawide monitors. When he’s not worrying about dead pixels, Dave enjoys regularly rebuilding his PC for absolutely no reason at all. In a previous life, he worked as a video game journalist for 15 years, with bylines across GamesRadar+, PC Gamer and TechRadar. Despite owning a graphics card that costs roughly the same as your average used car, he still enjoys gaming on the go and is regularly glued to his Switch. Away from tech, most of Dave’s time is taken up by walking his husky, buying new TVs at an embarrassing rate and obsessing over his beloved Arsenal.