
Cyberpunk 2077: Phantom Liberty Benchmark Performance

If anyone's interested in DLSS 3.5 results, here are some benchmarks:
purepc.pl
It appears that 1440p (and above) with PT on is still playable. Gains reach over 300% (~98 fps on a 4080).
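As an aside on how a "300% gain" figure like that is computed: a minimal sketch of the uplift arithmetic, where the 24.5 fps baseline is purely illustrative and not a measured figure from the linked benchmarks.

```python
def percent_gain(baseline_fps, boosted_fps):
    """Percentage uplift of boosted_fps over baseline_fps."""
    return (boosted_fps / baseline_fps - 1) * 100

# A >300% gain landing at ~98 fps implies a native-PT baseline
# somewhere under ~24.5 fps: 98 / 24.5 - 1 = 3.0 -> 300%
print(round(percent_gain(24.5, 98)))  # 300
```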
Yep it is.

Nice results.

Can't wait to play this game on QD-OLED with RT :D
 
Be careful with QD-OLED, the panels are much more susceptible to burn-in than WOLED. I would wait until it matures or proper heatsinks are included.
 
PT Overdrive is a 4090-exclusive feature.
I disagree. I'm pretty confident a 4080 at 1440p with DLSS Q + FG will hit 130+ fps with PT. So if you have a 1440p monitor, the game is perfectly fine on a 4080, a 4070 Ti, and probably even a 4070.
 
I disagree. I'm pretty confident a 4080 at 1440p with DLSS Q + FG will hit 130+ fps with PT. So if you have a 1440p monitor, the game is perfectly fine on a 4080, a 4070 Ti, and probably even a 4070.
Your confidence is misplaced, and there's no need to guess:


80-105 fps, and the reflections look pretty terrible because they're being rendered at 960p. Yes, you can play - but with FG we're looking at a 40-50 fps "feel" because of latency, and with crummy reflections. What's the point of buying a 4080 and a high-refresh 1440p monitor, then having the fancy reflections look like crap and having it feel like a 40 fps experience? Where do you draw the line, exactly? An RX 6600 will run Overdrive ray tracing in CP2077 at some terrible framerate. Turning on eye candy is only worth it if the image quality and visual experience actually go up.
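For context on the 960p figure: DLSS Quality renders internally at roughly two-thirds of the output resolution per axis, so a 1440p target is traced at about 1707×960 before upscaling. A minimal sketch of that arithmetic (the scale factors are the commonly cited ones, not numbers from this thread):

```python
# Internal render resolution for common DLSS modes. The per-axis scale
# factors below are the commonly documented ones; treat them as approximate.
DLSS_SCALES = {
    "Quality": 2 / 3,          # ~0.667x per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width, height, mode):
    """Resolution the game (including its RT reflections) is actually rendered at."""
    s = DLSS_SCALES[mode]
    return round(width * s), round(height * s)

# 1440p output with DLSS Quality -> ~960p internal render, which is
# why the reflections in that scenario end up traced at 960p.
print(internal_resolution(2560, 1440, "Quality"))  # (1707, 960)
```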
 
An RX 6600 will run Overdrive ray tracing in CP2077
Wrong. It will crawl. And eventually crash. I already got crashes on an RX 6700 XT, which has 50% more VRAM. The 6600 is just a full-on "I will use ray tracing in a different life" device.
 
Your confidence is misplaced, and there's no need to guess: 80-105 fps, and the reflections look pretty terrible because they're being rendered at 960p. [...]
100 fps seems very playable to me. DLSS Q looks great, better than native actually, so I'm not sure what your issue is.
 
Your confidence is misplaced, and there's no need to guess: 80-105 fps, and the reflections look pretty terrible because they're being rendered at 960p. [...]
Ray reconstruction is supposed to fix those low-res reflections from upscaling, right?
 
It's great to see that even at "Low" settings, games nowadays look good enough ...
More like companies don't even bother to make lower-quality assets that perform better, and instead just disable effects, lower texture quality...

That's also why many games have unnecessarily high requirements, and there's not much performance gain from lowering the settings in many instances.

StarCraft 2 might have the best low-quality mode I've seen: the game replaces 3D assets with 2D ones. Performs great, looks great.
 
[Attached chart: path tracing performance, 1920×1080]



Cat's out of the bag about Cyberpunk's path tracing mode (and honestly about CDPR's role in Nvidia's marketing machine). It has little to do with raytracing performance. The Arc A770 has "raw" raytracing performance equivalent to the RTX 3060 Ti, or the RTX 3070 in some cases. Here it shows less than a third of that performance, sitting significantly below the RTX 3050.

Cyberpunk's raytracing modes have basically become an ad for an IHV, and it's a shame that CDPR let themselves become this.



Where are all the articles asking CDPR and Nvidia if they're intentionally blocking performance on competing GPUs on path tracing? Where's all the commotion over suspicions of malpractice?
 
Cat's out of the bag about Cyberpunk's path tracing mode (and honestly about CDPR's role in Nvidia's marketing machine). [...] Where are all the articles asking CDPR and Nvidia if they're intentionally blocking performance on competing GPUs on path tracing?
CP, both in raster and RT, performs similarly to every other non-Nvidia-sponsored game. Don't believe me? Go disable every RT effect except reflections and test. AMD cards will perform very similarly to their Nvidia counterparts, exactly like it happens in AMD-sponsored RT games. The issue is, CP doesn't have just one RT effect like the AMD-sponsored games (because they are trying not to expose their incompetence at RT) but lots of them. When you enable all of them, it's a massacre, just like in any other game.

Take Resident Evil Village as an example. It only has RT shadows, and to add insult to injury, they are rendered at a quarter resolution, which makes them look like crap. Of course it's AMD-sponsored, and they added RT to the game just to pretend their cards are close to Nvidia in RT performance, which is not true.
 
CP, both in raster and RT, performs similarly to every other non-Nvidia-sponsored game. [...]

Nice try evading the Arc 770 reference, though.
 
Nice try evading the Arc 770 reference, though.
I have no idea about Arc's performance in RT, so...? What do you expect me to say about it?

But nice try evading a whole paragraph... Just face it already: AMD is a couple of generations behind in RT. Pretending it isn't will not make it untrue, so let's stop it?

Btw, in raster, the 7800 XT is 25% faster in Cyberpunk than the 4070. Because unlike in AMD-sponsored games, Nvidia doesn't gimp performance on other cards.
 
I have no idea about Arc's performance in RT, so...? What do you expect me to say about it?

But nice try evading a whole paragraph... Just face it already: AMD is a couple of generations behind in RT. Pretending it isn't will not make it untrue, so let's stop it?

Btw, in raster, the 7800 XT is 25% faster in Cyberpunk than the 4070. Because unlike in AMD-sponsored games, Nvidia doesn't gimp performance on other cards.
Hah. I remember Crysis 2's tessellation using the technology in places where it gave no visual upgrade, but it sure tanked performance. At that point Nvidia dominated in tessellation performance, and obviously they worked together with Crytek to boost their GPU sales. It was clear that Nvidia was the card to buy if you wanted the best Crysis 2 visuals.
AMD worked with Crytek on Far Cry 1, where you supposedly needed a 64-bit CPU for the extra visuals, but someone made it work on 32-bit hardware, where it ran just fine.

These companies are not your friends.
 
Hah. I remember Crysis 2's tessellation using the technology in places where it gave no visual upgrade, but it sure tanked performance. At that point Nvidia dominated in tessellation performance, and obviously they worked together with Crytek to boost their GPU sales. It was clear that Nvidia was the card to buy if you wanted the best Crysis 2 visuals.
AMD worked with Crytek on Far Cry 1, where you supposedly needed a 64-bit CPU for the extra visuals, but someone made it work on 32-bit hardware, where it ran just fine.

These companies are not your friends.
I know they are not our friends, but in the last few years Nvidia has been way more consumer-friendly than AMD. AMD blocks technologies altogether, gimps performance on competitors' GPUs, etc.
 
It's just the new sigma-male student who ended up being bullied by a smarter alpha and a beefier beta kid. Intel's drivers are in desperate need of maturing, and Intel will be number three for the foreseeable future. A half dozen years at least.

Explanation:
• Nvidia have already launched their Game Ready drivers for Phantom Liberty. This is why their GPUs, especially the latest ones, are performing very well in this game.
• AMD GPUs are not ideal, yet they handle these RT effects relatively fine (considering their overall poor RT performance) thanks to overall driver maturity. There is not much AMD can improve in their drivers besides stability in certain areas at certain graphics settings.
• Intel Arc GPUs are relatively new, and thus Intel don't have much experience in AAA-oriented gaming GPU development. This is why their drivers don't have enough "what to do if they throw this at you" instructions for their GPUs. Whilst the hardware itself is physically at least on par with the AMD competition, and most probably can compete with or even outperform Nvidia's Ampere GPUs in RT, these cards are utterly handicapped by the driver side of the equation.

This doesn't mean anything else. Intel isn't bad; they just need time to mature. Nvidia isn't bad; they just use whatever leverage they have because no one can stop them. AMD? They are just AMD.
 
Well, both RT modes need some refinement...
 
100 fps seems very playable to me. DLSS Q looks great, better than native actually, so I'm not sure what your issue is.
My issue?

You disagreed with my comment, and your reason was your claim that the framerate is 130+ FPS.
I linked you evidence showing the framerate is nowhere near 130+ FPS, but much closer to 85 fps, invalidating your argument because you were out by more than 50%.
Then you ask what my issue is?

My issue is clearly your reading comprehension and inability to apply basic reasoning to a discussion.

Ray reconstruction is supposed to fix those low-res reflections from upscaling, right?
I hope so, but whilst W1zzard's article this evening paints DLSS 3.5 in a pretty good light, it's far from perfect, and other, more video-heavy articles clearly point out the new problems it introduces. We seem to be trading one problem for another by fixing low-res reflections with ray reconstruction...
 
I know they are not our friends, but in the last few years Nvidia has been way more consumer-friendly than AMD. AMD blocks technologies altogether, gimps performance on competitors' GPUs, etc.
Have you seen what Nvidia has done to GPU pricing? There is no way that is good. Please provide proof of your allegation, and if the game has a console release, factor in the weight of Sony and Xbox. Part of the narrative is that Sony gives better PC feature support... after the game has made enough sales on the PS5, maybe. What you're saying is contrary to what AMD's actions in this space have been, but I guess the TV and monitor makers are happier to pay for G-Sync modules than use FreeSync.
 
Wrong. It will crawl. And eventually crash. I already got crashes on an RX 6700 XT, which has 50% more VRAM. The 6600 is just a full-on "I will use ray tracing in a different life" device.
I said "terrible framerate", by which I meant 3 fps, i.e. "crawl", as you put it. I don't think I was hinting at playable framerates in the slightest.
 
If you're referring to Starfield, that's been debunked. Nvidia cards perform normally; AMD is just better than usual.

A 6950 XT matching a 4080? Yeah, okay, sure.

Have you seen what Nvidia has done to GPU pricing?
What? Their prices are similar to AMD's while they offer much better RT performance and software support. If anything, I'd say AMD GPUs are overpriced.

Take for example the 7900 XT vs the 4070 Ti. The latter not only has much better RT performance, DLSS, FG, and much lower power draw, but for god's sake it also had better raster performance per dollar, at least with EU pricing. But yeah, Nvidia and their prices, lol.

But anyhow, prices are irrelevant; I never mentioned them. I said they are more consumer-friendly. That means they don't block YOU out of features in Nvidia-sponsored games just because you happened to buy an AMD card.
 