Friday, January 17th 2025
NVIDIA GeForce RTX 5090 Performance in Cyberpunk 2077 With and Without DLSS 4 Detailed
It is no secret that NVIDIA's RTX 50-series launch was met with a mixed reception. On one hand, DLSS 4 with Multi-Frame Generation allows for enormous jumps in performance; on the other, purists would rather do away with the AI-powered wizardry altogether. A recent YouTube video has detailed what the RTX 5090 is capable of in Cyberpunk 2077 with Path Tracing at 4K, both with and without the controversial AI features. With DLSS set to Performance mode and 4x frame generation (three generated frames per rendered frame), the RTX 5090 managed around 280 FPS. Pretty good, especially considering the perfectly acceptable latency of around 52 ms, albeit with occasional spikes.
Turning DLSS to Quality, the frame rate drops to around 230 FPS, with latency continuing to hover around 50 ms. Interestingly, with frame generation set to 3x or even 2x, the difference in latency between the two was borderline negligible, right around 44 ms or so. However, the FPS takes a massive nosedive when frame generation is turned off entirely. With DLSS set to Quality mode and FG turned off, the RTX 5090 barely managed around 70 FPS. Taking things a step further, the presenter turned off DLSS as well, resulting in the RTX 5090 struggling to hit 30 FPS, with latency spiking to 70 ms. Clearly, DLSS 4 and MFG allow for an incredible performance uplift with minimal artifacting, at least in Cyberpunk 2077, unless one really looks for it.
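As a rough sanity check on these figures, an Nx MFG setting displays one rendered frame plus N−1 generated frames, so dividing the displayed frame rate by the multiplier approximates the GPU's real render rate. A minimal sketch of that arithmetic (the function name is illustrative, and the input figures are the video's approximate numbers, not official benchmarks):

```python
# Back-of-the-envelope math for Multi-Frame Generation (MFG):
# an "Nx" setting shows one rendered frame plus (N - 1) generated
# frames, so the underlying render rate is roughly displayed / N.

def rendered_fps(displayed_fps: float, mfg_multiplier: int) -> float:
    """Estimate the real render rate hiding behind an MFG multiplier."""
    return displayed_fps / mfg_multiplier

# ~280 FPS displayed at DLSS Performance with 4x MFG (3 generated frames)
print(round(rendered_fps(280, 4)))  # -> 70 rendered frames per second
```

Notably, this lines up with the ~70 FPS the card produces at DLSS Quality once frame generation is disabled, which is why latency barely changes across FG multipliers: the underlying render cadence is similar.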
Source:
PC Centric
Would be interesting to see how frame gen handles cut-scene changes in a scenario where the graphics load is so high that the real FPS is only around 10...
The more I hear about all this frame generation AI bullshit, the less I like it. The 9070XT is sounding better and better, especially if the price is right.
(I've also found that DLSS makes any game look like a horrible, fuzzy, blurry, washed-out mess)
It's actually more annoying that the resolution display scaling on the GPU isn't as clever / good as FSR in that respect.
Anyway... let's see what they bring
Uhm, at 4K DLSS Quality you're basically getting native picture quality while running full path tracing at 70 FPS, without the added latency from frame gen. That's rather incredible. FSR is trash, to this day. In fact, on my handheld, if I have any other options like TAA or XeSS, I'll choose those. It's incredible how bad FSR is.
I'm more interested in a DLSS4 Performance vs DLSS3 Quality comparison, both sans FG, in terms of image quality and stability. At a pedestrian resolution of 1440p, preferably. If the former isn't significantly worse than the latter, we have really good news for everyone.
5070 against the 4070 Super
(My expectation is the 5070 will be approx 5% slower than the 4070 Super)
5070 Ti against the 4070 Ti Super
(My expectation is the 5070 Ti will be approx 6% quicker than the 4070 Ti Super)
5080 against the 4080 Super
(My expectation is the 5080 will be approx 10% quicker than the 4080 Super)
5090 against the 4090
(My expectation is the 5090 will be approx 27% quicker than the 4090)
I just hate that claim/sentence. It's not performance, it's doubling frames from an actual rendered frame, it's frame interpolation, it's Adobe Flash tweening. It's nothing new or special, and it's not what I would call "performance."
Now, I am not hating on it; I'm all for providing the user with a better experience, and if a frame in between makes it feel smoother, go for it.
But stop hyping it up like that, and stop pretending that frame-generated FPS is the same as "real" FPS, because it isn't. Again, we could push 20 frames in there if we wanted: "wow, look at that, 2000 FPS, crazy"... yeah... great.
I find upscaling to be fine in quality mode for 1440/2160p final resolution.
Even FSR looks nice when game developers take some time to implement it right.
For example FSR3 sucks big time on CP2077 and FSR2.1 looks much better with minimal shimmering. Someone got lazy there and it’s not AMD.
Anyway…
RT is here to stay, no doubt.
We (I) can live with “simple” RT, with or without upscaling. The DLSS upscaler is fine already outside of 1080p native/final resolution, and FSR is catching up. Even FSR3.1 quality mode looks great and very close to DLSS 3.5 when comparing them at 4K, as long as it’s done right. Those estimates could be right, but only on native raster; with RT and upscaling, things look a bit better for the latter. That leaves out FG/MFG, which is not quite there yet.
RayTracing outside of PathTracing can become the new standard really quickly with the latest hardware from all GPU manufacturers.
This shows yet again that professional workloads are driving the gaming market. The professional/prosumer space no longer calls for raw raster performance, but for AI/ML throughput.
They just worked around it and found a way to “satisfy” gamers too with frame generation, or so they’re thinking.
Introducing Path Tracing pushes/forces this notion further, and human psychology is susceptible to the latest-and-greatest must-haves. They've even started calling path tracing “full ray tracing,” so the “simplest” form of it feels incomplete or broken.
Fascinating how some can manipulate the masses.
They could have made the 5080 with 20 or 24 GB for users like you, even if we accept that games don’t need that frame buffer. IMO it should have been 20 GB.
nVidia: Here we have hardware for everyone on various price points.
But no… you want this? Get your hand elbow deep in your pocket.
I mean, if someone can't accept that they are doing this on purpose, then green is all you see.
I'm actually looking forward to DLSS comparisons between similar 40 and 50 series cards.