
Cyberpunk 2077: Phantom Liberty Benchmark Performance

Looking forward to playing this; there are reports of solid gains for system specs like mine compared to pre-2.0 PT performance. This game, although it barely appeals to me in a gameplay sense, is a technical showcase powerhouse, and I'm very impressed by what my hardware has been able to crunch through so far.
 
Different resolution
It's not the resolution, it's the setting, but still, SO? I said the 6950 XT matches the 4080, you said it doesn't, I gave you a graph, and you said it's because of the resolution. How does the resolution turn the 6950 XT into a 4080?
 
Looking forward to playing this; there are reports of solid gains for system specs like mine compared to pre-2.0 PT performance. This game, although it barely appeals to me in a gameplay sense, is a technical showcase powerhouse, and I'm very impressed by what my hardware has been able to crunch through so far.
I gained around 5-7 FPS with the exact same settings (path tracing) going from 1.63 to 2.0. The only addition was ray reconstruction.
After a few runs, though, it crashed and couldn't start the game again. My old JSON settings and cache were the culprit, and I had to set them all again manually.
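For anyone hitting the same post-2.0 crash, the usual fix is to move the stale settings file and cache aside so the game regenerates defaults on the next launch. A minimal Python sketch of the idea; the directory and "cache" folder name below are the typical Windows locations but are assumptions, so verify them on your own install:

```python
# Back up (rather than delete) Cyberpunk 2077's user settings and cache so
# the game regenerates fresh defaults on the next launch.
# NOTE: the settings directory and the "cache" folder name are assumed
# typical Windows locations; verify them on your own system first.
import os
from pathlib import Path

settings_dir = Path(os.environ["LOCALAPPDATA"]) / "CD Projekt Red" / "Cyberpunk 2077"

for name in ("UserSettings.json", "cache"):
    src = settings_dir / name
    if src.exists():
        src.rename(src.with_name(name + ".bak"))  # keep a backup, don't delete
        print(f"Moved {src} -> {name}.bak")
```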
 
Hah. I remember Crysis 2 tessellation using the technology in places where it gave no visual upgrade, but it sure tanked performance. At the time, Nvidia dominated in tessellation performance, and obviously they worked together with Crytek to boost their GPU sales. It was clear that Nvidia was the card to buy if you wanted the best Crysis 2 visuals.
AMD worked with Crytek on Far Cry 1, where you supposedly needed a 64-bit CPU for the extra visuals, but someone got it running on 32-bit hardware, where it worked just fine.

These companies are not your friends
I remember AMD sponsoring Shadow of Mordor and getting the devs to release an uncompressed texture pack that delivered no benefit at all (it looked identical), but made many Nvidia cards (and even some AMD cards) run out of VRAM. Just google the comparisons. They only released it to gimp competing cards. Even with enough VRAM it was pretty much pointless; more like a waste of storage.

Both AMD and Nvidia do this :D

However, AMD has done it a lot recently, and strangely enough many AMD-sponsored games don't have DLSS, while most Nvidia-sponsored games have FSR and even XeSS. Nvidia doesn't care about comparisons, but AMD doesn't really like them, because FSR is so bad in comparison.

FSR2 looks like garbage in Starfield compared to modded DLSS, yet I am using DLAA to get peak visuals instead of more performance, because the game is unoptimized crap, really.

DLSS Quality mode beats FSR2 with ease, though. Going DLAA makes me lose only 5-10% FPS (compared to FSR2 and DLSS Quality). Running LOW and ULTRA delivers almost the same performance. Very weird game... the CPU is not really stressed and almost idles in most locations.
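For context on why DLAA costs more: DLAA renders at full native resolution, while the DLSS/FSR2 quality presets render internally at a fraction of it and upscale. A quick Python sketch using the commonly published per-axis scale factors (assumed standard values; individual games can override them):

```python
# Internal render resolution per upscaler preset. The per-axis scale
# factors are the commonly published DLSS/FSR2 defaults (assumptions;
# games can override them).
PRESET_SCALE = {
    "Native/DLAA": 1.0,
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, preset: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and preset."""
    scale = PRESET_SCALE[preset]
    return round(width * scale), round(height * scale)

for preset in PRESET_SCALE:
    w, h = internal_resolution(3840, 2160, preset)
    print(f"{preset:>17}: {w}x{h}")
```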
 
It's not the resolution, it's the setting, but still, SO? I said the 6950 XT matches the 4080, you said it doesn't, I gave you a graph, and you said it's because of the resolution. How does the resolution turn the 6950 XT into a 4080?
No, it literally is the resolution. You posted 1440p Ultra, but the timestamp I referenced in the video was looking at 4K Ultra.

This is the 4K result from the very same place you got your 1440p result from.
 
And what's the difference? In your link, the difference between the two is negligible; that's what, 10%? Come on, man...

I think we are talking about GPU power, not CPU. 4K is the way to create a GPU bottleneck and thus test raw GPU power.
 
I think we are talking about GPU power, not CPU. 4K is the way to create a GPU bottleneck and thus test raw GPU power.
And still the 6950 XT is very close to the 4080. How is that normal?
And no, Starfield is GPU-bound even at 1080p...
 
And still the 6950 XT is very close to the 4080. How is that normal?
And no, Starfield is GPU-bound even at 1080p...

If Starfield were fully GPU-bound at 1080p, 1440p and 4K,
then the GPU ranking should remain the same across those resolutions.
However, it doesn't.
From Hardware Unboxed's tests we can clearly see Nvidia cards climbing the charts as the resolution increases.
This is classic CPU-bottleneck behaviour, since Nvidia's driver has higher CPU overhead.
With increasing resolution, more and more CPU time is freed up to handle the GPU driver load, which 'releases' more of the Nvidia GPUs' potential.
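To make the mechanism concrete, here's a toy model in Python (all numbers are made up purely for illustration): observed FPS is roughly the minimum of the CPU-limited and GPU-limited frame rates, and a driver with higher CPU overhead lowers only the CPU ceiling, so it costs the most at low resolutions.

```python
# Toy bottleneck model: observed FPS ~= min(CPU-limited FPS, GPU-limited FPS).
# All numbers below are hypothetical, chosen only to show the mechanism.

def observed_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

# Hypothetical GPU-limited FPS by resolution: (nominally faster card, slower card).
gpu_fps = {"1080p": (160, 150), "1440p": (110, 104), "4K": (60, 57)}

# Hypothetical CPU ceilings: the faster card's driver costs more CPU time.
cpu_fps_high_overhead = 120
cpu_fps_low_overhead = 145

for res, (fast_gpu, slow_gpu) in gpu_fps.items():
    a = observed_fps(cpu_fps_high_overhead, fast_gpu)
    b = observed_fps(cpu_fps_low_overhead, slow_gpu)
    print(f"{res}: high-overhead card {a} fps vs low-overhead card {b} fps")

# At 1080p the nominally faster card loses (120 vs 145 fps) because the CPU
# ceiling binds; at 4K both cards are GPU-bound and the ranking flips back.
```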
 
Hah. I remember Crysis 2 tessellation using the technology in places where it gave no visual upgrade, but it sure tanked performance. At the time, Nvidia dominated in tessellation performance, and obviously they worked together with Crytek to boost their GPU sales. It was clear that Nvidia was the card to buy if you wanted the best Crysis 2 visuals.
AMD worked with Crytek on Far Cry 1, where you supposedly needed a 64-bit CPU for the extra visuals, but someone got it running on 32-bit hardware, where it worked just fine.

These companies are not your friends
IIRC AMD added a limiter in their driver to stop subdividing polygons more than 64 times when tessellating, just because Nvidia TWIMTBP titles would tessellate hundreds of times, with ridiculous diminishing returns, just to make games run worse on AMD hardware. It hurt performance for everyone, but it hurt AMD cards more.

That limiter is still in the AMD driver to this day, and it's a sad legacy of Nvidia foul play going back more than a decade.
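Conceptually, the driver override just clamps whatever tessellation factor the application asks for. A rough sketch of the idea in Python (not actual driver code; the 64 ceiling is the D3D11 hardware/API maximum):

```python
# Sketch of a driver-side tessellation override: clamp the factor a game
# requests to a driver-configured cap. Not actual driver code, just the
# idea behind AMD's "Tessellation Mode" setting.
D3D11_MAX_TESS_FACTOR = 64  # hardware/API ceiling

def effective_tess_factor(app_requested: float, driver_cap: float | None) -> float:
    factor = min(app_requested, D3D11_MAX_TESS_FACTOR)
    if driver_cap is not None:  # e.g. the user picks "x16" in the driver UI
        factor = min(factor, driver_cap)
    return factor

print(effective_tess_factor(64, None))  # -> 64 (application-controlled)
print(effective_tess_factor(64, 16))   # -> 16 (driver override engaged)
```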
 
If Starfield were fully GPU-bound at 1080p, 1440p and 4K,
then the GPU ranking should remain the same across those resolutions.
However, it doesn't.
From Hardware Unboxed's tests we can clearly see Nvidia cards climbing the charts as the resolution increases.
This is classic CPU-bottleneck behaviour, since Nvidia's driver has higher CPU overhead.
With increasing resolution, more and more CPU time is freed up to handle the GPU driver load, which 'releases' more of the Nvidia GPUs' potential.
I have the game and an Nvidia card. Don't you think I know more about it than you do? My 4090 is the bottleneck even at 1080p with 50% resolution scale.
 
What? Their prices are similar to AMD's while they offer much better RT performance and software support. If anything, I'd say AMD GPUs are overpriced.

Take, for example, the 7900 XT vs the 4070 Ti. The latter not only has much better RT performance, DLSS, FG and much lower power draw, but for god's sake it also had better raster performance per dollar, at least at EU pricing. But yeah, Nvidia and their prices, lol.

But anyhow, prices are irrelevant; I never mentioned them. I said they are more consumer-friendly. That means they don't block YOU out of features in Nvidia-sponsored games just because you happened to buy an AMD card.
Maybe where you live. I would not buy a card with 12 GB of VRAM and a fast chip for accoutrements vs a card with 20 GB of VRAM and a fast chip. How the hell can the 4070 Ti have more raster per dollar than a card that competes with the 4080?



As far as I am concerned, you would be a fool to spend the same amount to get not just less VRAM but 8 GB less, and there is no way pricing is similar. This is the cheapest 4090


At the end of the day, $1,000 is not the value of DLSS, FG or anything else vs raw performance. Just like how there are not 10 pages flaming Nvidia for the latest shenanigans in CP2077, versus the s%$%storm that the lack of DLSS (at launch) created for Starfield. As far as I am concerned, a $1,000 GPU getting 6 FPS in any game while cards at half the price get better frame rates smells very foul to me, regardless of how you may think Nvidia has become some archetype of corporate governance, especially now.

BTW, the 7800 XT is 1/3 the price of the 4090 and has better power-to-performance numbers as well. I play CP2077 and enjoy it, but I don't need these "features", as ray tracing has existed since the first time we saw 3D images on a screen, and I don't mean FMV either. If you are going to bring up upscaling and tell me that it is better than native, I would tell you: if you know the difference, play a game at 4K for 20 minutes, then switch to 1440p and see how quickly you notice the difference.

With modern monitors, you also need to understand that the picture the monitor produces is the most important feature, with FreeSync (high refresh) next. Once those are covered, you are good. I can't explain to you what a difference that makes. It's like getting proper speakers: I don't mean anything Logitech, but at the base level going to Creative.
 
If you are going to bring up upscaling and tell me that it is better than native
It is, if configured properly, and especially if the native TAA implementation sucks balls.

I spent a week configuring FSR (affecting only the Cyberpunk 2077 experience) and I'm now completely satisfied with the image quality at 1440p @ Performance. Previously, even Quality was too soapy for me.

If a game dev also allows native-resolution TAA replacement without any upscaling, both FSR and DLSS are marvellous. I see an out-of-the-box FSR TAA implementation as the most appropriate future for AAA gaming.

the 7800 XT is 1/3 the price of the 4090 and has better power-to-performance numbers as well
It doesn't. RDNA3 is inferior to Ada in terms of bang per watt, and always will be, completely regardless of tiering and cherry-picking. Yet I can't disagree that RDNA3 is the better bang per buck.
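Purely as arithmetic, "bang per watt" is just average FPS divided by average board power. A tiny Python sketch with placeholder numbers (hypothetical, not measurements; substitute figures from an actual review):

```python
# Performance-per-watt: average FPS divided by average board power (W).
# The card names and numbers below are placeholders, not measurements;
# plug in figures from a real review.

def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

cards = {
    "Card A": (95.0, 300.0),   # hypothetical: 95 fps at 300 W
    "Card B": (100.0, 260.0),  # hypothetical: 100 fps at 260 W
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {perf_per_watt(fps, watts):.3f} fps/W")
```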
 
Maybe where you live.
It was the case throughout the EU, not just where I live.

How the hell can the 4070 Ti have more raster per dollar than a card that competes with the 4080?
The 7900 XT does not compete with the 4080, lol.
As far as I am concerned, you would be a fool to spend the same amount to get not just less VRAM but 8 GB less, and there is no way pricing is similar.
And that's just... your opinion. I could argue you'd have to be a fool to buy a card whose RT performance is a couple of generations behind.

BTW, the 7800 XT is 1/3 the price of the 4090 and has better power-to-performance numbers as well.
You owe me a coffee, I spilled mine reading this. LOL man...


But go ahead, we can test this.

At the end of the day, $1,000 is not the value of DLSS, FG or anything else vs raw performance
But the 7900 XT doesn't have any raw-performance advantage. It has very similar raster performance to the 4070 Ti, a card that was actually cheaper. So what raw performance are you talking about, man? lol :banghead:

Even at 4K, where the 7900 XT is at its best, it's closer to the 6950 XT than it is to the 4080, according to the TechSpot review. Saying it's a 4080 competitor is just completely delusional.

[attached chart: 4K results]
 
It was the case throughout the EU, not just where I live.


The 7900 XT does not compete with the 4080, lol.

And that's just... your opinion. I could argue you'd have to be a fool to buy a card whose RT performance is a couple of generations behind.


You owe me a coffee, I spilled mine reading this. LOL man...


But go ahead, we can test this.


But the 7900 XT doesn't have any raw-performance advantage. It has very similar raster performance to the 4070 Ti, a card that was actually cheaper. So what raw performance are you talking about, man? lol :banghead:

Even at 4K, where the 7900 XT is at its best, it's closer to the 6950 XT than it is to the 4080, according to the TechSpot review. Saying it's a 4080 competitor is just completely delusional.

[attached chart: 4K results]
Does this review use the same RAM, CPU and Storage as my PC?
 
@W1zzard, sorry for the ping, mate, but I have a somewhat relevant question: do you plan on hosting the DLSS Ray Reconstruction DLLs as well?

The Super Sampling (nvngx_dlss.dll) and Frame Generation (nvngx_dlssg.dll) DLLs were unfortunately not updated from 3.1.13 in Cyberpunk's latest patch, but it does include the first public release of the Ray Reconstruction DLL (nvngx_dlssd.dll, version 3.5.0.0). I noticed that it was still missing from the TPU download section.
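In the meantime, anyone curious which DLSS-family DLLs their install ships can check the file versions directly. A small Python sketch, assuming Windows with pywin32 installed; the game directory below is a hypothetical example path, so adjust it to your install:

```python
# List the DLSS-family DLLs in a Cyberpunk 2077 install and print their
# file versions. Requires pywin32 (pip install pywin32) on Windows.
# GAME_DIR is a hypothetical example path; point it at your own install.
from pathlib import Path
import win32api

GAME_DIR = Path(r"C:\Games\Cyberpunk 2077\bin\x64")  # hypothetical path

for dll in sorted(GAME_DIR.glob("nvngx_dlss*.dll")):
    info = win32api.GetFileVersionInfo(str(dll), "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    version = f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"
    print(f"{dll.name}: {version}")
```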
 
Does this review use the same RAM, CPU and Storage as my PC?
Yes

What kind of silly question is that? Does that review use the same RAM, CPU and storage as the PC of the guy with the 4070 Ti?

Are you suggesting that RAM, CPU and storage will make a difference at 4K? I mean, considering that a few days ago you were trying to convince me you are getting 130 FPS with a 7900 XT in Cyberpunk at 4K with PT, god knows...
 
you are getting 130 FPS with a 7900 XT in Cyberpunk at 4K with PT
Sounds doable.
PT: On.
FSR: Ultra Performance.
Every setting: lowest possible.
RX 7900 XT: extremely overclocked.
LSD: taken.
 
Sounds doable.
PT: On.
FSR: Ultra Performance.
Every setting: lowest possible.
RX 7900 XT: extremely overclocked.
LSD: taken.
You got me. I was skipping the last part, so I couldn't figure out how to hit those numbers. Now it makes sense!
 
Yes

What kind of silly question is that? Does that review use the same RAM, CPU and storage as the PC of the guy with the 4070 Ti?

Are you suggesting that RAM, CPU and storage will make a difference at 4K? I mean, considering that a few days ago you were trying to convince me you are getting 130 FPS with a 7900 XT in Cyberpunk at 4K with PT, god knows...
Yes, they do, especially at 4K. You might think it is 2017, but it is 2023, and "when you have an AMD CPU paired with a 7900 XT it doesn't matter" is the narrative, but I can tell you with confidence that the 7900X3D is the fastest CPU I have ever owned. Before you get in a fit, understand that I have used every generation of Ryzen, including the 5900X and 5950X. Do yourself a favour and read some of my posts to understand that I am not a noob. I thought the Sapphire Vapor 7950X was going to be my favourite GPU of all time, and I even had the 7900 XTX to compare. So what is my verdict on what I am going to spend my money on? I will tell you:

Asus X670E Gaming
7900X3D 12/24 CPU
G.Skill 5200 MT/s DDR5 30-30-etc.
Sapphire Pulse 7900 XT in an Alphacool waterblock
22.5 TB of NAND storage, including one 4 TB RAID 0 drive and a 4 TB RAID 0 NVMe array
Software is strictly Windows 11 and gaming platforms
AMD software handles everything, including system specs, OC, UV and the accoutrements AMD provides.

This is the fastest PC I have ever owned, with high FPS in everything in my 800+ game library, and yes, every game runs at 4K with high FPS, so no matter how you may feel, it is a great PC to me. I couldn't care less about the 4070 Ti, 7800 XT or anything else. I have my killer PC, like what used to be featured in EGM and other magazines, and I am happy that my games look better than magazine images (that was my QNIX 1440p 10 years ago).
 
It's in threads like this that I wonder how much money actually goes into astroturfing, and how much of it is really just ignorance and mindless bias towards trillion-dollar corporations.
 
Yes, they do, especially at 4K. You might think it is 2017, but it is 2023, and "when you have an AMD CPU paired with a 7900 XT it doesn't matter" is the narrative, but I can tell you with confidence that the 7900X3D is the fastest CPU I have ever owned. Before you get in a fit, understand that I have used every generation of Ryzen, including the 5900X and 5950X. Do yourself a favour and read some of my posts to understand that I am not a noob. I thought the Sapphire Vapor 7950X was going to be my favourite GPU of all time, and I even had the 7900 XTX to compare. So what is my verdict on what I am going to spend my money on? I will tell you:

Asus X670E Gaming
7900X3D 12/24 CPU
G.Skill 5200 MT/s DDR5 30-30-etc.
Sapphire Pulse 7900 XT in an Alphacool waterblock
22.5 TB of NAND storage, including one 4 TB RAID 0 drive and a 4 TB RAID 0 NVMe array
Software is strictly Windows 11 and gaming platforms
AMD software handles everything, including system specs, OC, UV and the accoutrements AMD provides.

This is the fastest PC I have ever owned, with high FPS in everything in my 800+ game library, and yes, every game runs at 4K with high FPS, so no matter how you may feel, it is a great PC to me. I couldn't care less about the 4070 Ti, 7800 XT or anything else. I have my killer PC, like what used to be featured in EGM and other magazines, and I am happy that my games look better than magazine images (that was my QNIX 1440p 10 years ago).
And how is any of that relevant to the 7900 XT not being a 4080 competitor? LOL

And I'm pretty sure the review numbers are better than what you are getting with your 5200 MT/s RAM, so...
 
I have the game and an Nvidia card. Don't you think I know more about it than you do?
I guess you didn't pay attention at all.

So I DO have the game and I DO have the card.
Does that automatically invalidate everything you said now?
If it doesn't,
what makes you think that 'having the game and the card' is a valid argument in the first place?
 
And how is any of that relevant to the 7900 XT not being a 4080 competitor? LOL

And I'm pretty sure the review numbers are better than what you are getting with your 5200 MT/s RAM, so...
Who cares? It's an Nvidia-sponsored title; in Starfield the reverse is true. Does that mean a lot?! All those cards you deride and debate mean little to 99.9% of people, and the performance you deride is fine for others. Are you really that into your PC? Go play CP2077; I do have to wonder if you bothered, or have the time.
I'd argue Horizon does real cyberpunk better.
And GTA V with mods looks as real, all in. :P
 
It is inadvisable to judge these videogame imaging technologies without watching actual gameplay. Still images aren't the actual experience and many of these graphical tricks work better or worse in motion.

One consistently repeated comment is that Nvidia DLSS 2 is superior to TAA at native resolution. Sometimes there are weird shimmering/banding issues that don't appear in still screenshots, because the effect is caused by motion.

Remember that when playing a videogame you aren't staring at some wire in the background. Your primary focus will be on the character(s) and action in front of you.

As game developers get more proficient at using these technologies (which are temporally oriented), players will see greater benefit during actual gameplay, not still images.

All graphics cards can generate fine images at native resolutions using conventional rasterization techniques. The point is to use other technology to speed up rendering so smooth framerates can be enjoyed during actual gameplay.

Well, I've been noticing more jaggies since the update, where before I would see them very rarely, but now it's much more common, like the AA has been turned down.

Noticing way more bugs now, too.
 