
Cyberpunk 2077: Phantom Liberty Benchmark Performance

No. Of course not. It would be unreasonable to do so. You are basically asking me if a stranger on the internet (you) is as reliable as my senses are. Nope. You simply might be, for example, lying.
This statement completely invalidates your own position,
since you are a random stranger on the internet to all the rest of us.
If you find it unreasonable to respect the opinion of a random stranger,
then that is exactly how others should treat you.


After all, you don't even honor your own statement when it is turned against you:
I have the game and an Nvidia card. Don't you think I know more about it than you do?
 
Of course, I'm not in disagreement, lol. If your personal experience doesn't align with what I'm telling you, then obviously you'll value your personal experience more than my opinion. But obviously that works for me too: I experience a GPU bottleneck even at 1080p, so you telling me it doesn't happen doesn't really matter to me.
 
It's not just CP2077 where AMD cards fall flat on their face.

With ridiculous performance deltas, it's basically just CP2077 (a three-year-old game with a new DLC on a dead-end engine), Control (an even older game, from 2019), a couple of tech demos, and a handful of other games.


What actually matters is how the graphics cards compare in games that were the most sold and played this year. That means Baldur's Gate 3, Starfield, Jedi Survivor, Call of Duty MW2, Resident Evil 4, etc. Feel free to check benchmarks on those.


Cyberpunk's path tracing mode makes for nice videos made by Digital Foundry (who don't have to pay for their graphics cards) that no one will remember 2 months from now, but its performance isn't really that relevant for the PC GPU market.
99% of the people can't even really run it and most of the 1% that can aren't playing Cyberpunk.
 
Dying Light, Ratchet & Clank, CP2077, Chernobylite, Hogwarts Legacy, and so on.

If you are telling me these games aren't successful, you are crazy.

If RT weren't relevant to the PC GPU market, then Nvidia wouldn't have something like 80% market share, man. You are the minority here.
 
That's because AMD has 3% market share. If they made these features exclusive, nobody would use them, lol.
Why do you make up such stupid numbers that are so trivial to disprove? I'm genuinely curious how an individual can be so inaccurate and have so little grounding in reality.

AMD has 17.5% of the market right now, down from 20% this time last year. They have never had less than 10% of the market, and the last time things were that bad was 2002.
source

That also ignores the fact that game developers focus largely on console releases first, where AMD has 100% of the market, because Nvidia didn't want to play in a market they can't control or manipulate.
 
Funny. Very funny. You claim I make up stupid numbers that are trivial to disprove, and then claim that AMD has 100% of the console market. Have you heard of the Nintendo Switch? You know it sold more than the PS5 and Xbox combined, right? But who cares about any of that? What is your point?

Aren't you people bored already of sh**ting on Nvidia and Intel every time you have the chance (or not even that) and then patting each other on the back for a job well done? Jesus Christ, man, they are multi-billion-dollar faceless corporations; they really don't care about you. Stop being so passionate.
 
How old is the Switch? I could bring up the Game Boy or the PS2, but games we play, like CP2077, would not work on the Switch. Can you play any Nintendo exclusive on PC? Is the Switch even comparable to the PS5? And wait: there are handhelds that have been sold over the last year that blow the Switch away, and do you know what chips they use? You seem to think that people are ragging on Nvidia or Intel, but the truth is we are just responding to your narrative, if only because it is flawed. It almost feels like you are stuck in 2016 and have not seen what Lisa Su has done. It's not just me saying that, though: AMD's stock price means they must be doing something right, with a jump from $3.50 to today's price of $96.20.

Passionate? Are you not the one putting out lies like AMD has 3% of the GPU market on a thread that has nothing to do with it? Are you not the one telling me how my PC performs without context? Are you not the one refuting what people say about the truth with your narrative? That sounds pretty passionate to me.

AMD is a better company than Nvidia to me for one reason: I buy my card and know that AMD will give me new features for free, where Nvidia makes you buy a new card for them. There is also the fact that AMD has been a champion of open source since the 90s.
 
If you have the game, then you know that even at 1080p you run into a GPU bottleneck.

EG1. You must be from a different universe. In this universe, the majority of consoles sold have an Nvidia chip.
No console with an Nvidia chip can even play this game natively; that's why, in this conversation, the Switch is beyond consideration.

So, is the Switch good at FG, DLSS, RT, etc.?

The rest of us have to read this. I will move on; you won't. It's "Nvidia vs. AMD 4eva" from you, wtaf. You have a 4090. And, well done.

Reeeeeallly. You went there.
 
How old is the Switch?
That matters how?
Can you play any Nintendo exclusive on PC?
Yes, all of them.
You seem to think that people are ragging on Nvidia or Intel but the truth is we are just responding to your narrative.
No, you are not responding to any narrative; you are creating a narrative where your 7900 XT is a 4080 competitor and gets 130 fps in Cyberpunk with PT. In other words, you are making stuff up.

AMD is a better company than Nvidia to me for one reason: I buy my card and know that AMD will give me new features for free, where Nvidia makes you buy a new card for them.
Absolutely. Greedy Nvidia only gave FG to the latest gen. AMD, on the other hand... oh wait, AMD has given it to 0 cards as of right now. Nvidia is a better company because it allows me to play games TODAY that AMD will manage 3 to 5 years from now. They are really, really, really far behind. When AMD makes better products, I buy them (that's how I ended up with 4 all-AMD laptops). You just buy them no matter what, even when they are like 5 years behind in GPUs.

No point trying to reason with you. You are completely gone, my man.
 
Aren't you people bored already of sh**ting on Nvidia and Intel every time you have the chance (or not even that) and then patting each other on the back for a job well done?
Nobody here is "sh**ting" on any corporations except you, seemingly, with some indefensible anti-AMD rhetoric that has drawn multiple comments from multiple people about your bias, possible astroturfing, and other generally disagreeable things while we try to have an open and balanced discussion.

The person I'm personally "sh**ting" on is you, for spouting easily disprovable nonsense. You just make stuff up that's clearly FUD, we prove you wrong with links to articles refuting you by wide margins, and then you ignore the accusation and pick some new nonsense to get wrong in your seemingly obsessive one-man crusade against AMD.

We get it, the 4090 is better than the 7900 XTX and DLSS is better than FSR, but that's why, at every price point AMD has cards in, it undercuts Nvidia on pricing. If you have between $200 and $1000 to spend, you are going to get better gaming for your money on AMD because they simply offer more performance per dollar, and not everyone cares about or needs the Nvidia-exclusive features that command such a premium.
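For what it's worth, the "performance per dollar" argument is just a ratio, and anyone can sanity-check it themselves. A minimal sketch below; the card names, prices, and fps figures are purely hypothetical placeholders, not real benchmark data, so plug in actual street prices and averaged review fps before drawing conclusions:

```python
# Toy performance-per-dollar comparison. All numbers are HYPOTHETICAL
# placeholders for illustration only; substitute real prices and
# averaged benchmark fps from reviews before comparing real cards.
cards = {
    "Card A": {"price_usd": 500, "avg_fps": 90},
    "Card B": {"price_usd": 600, "avg_fps": 100},
}

def fps_per_dollar(price_usd: float, avg_fps: float) -> float:
    """Frames per second delivered per dollar spent."""
    return avg_fps / price_usd

for name, c in cards.items():
    print(f"{name}: {fps_per_dollar(c['price_usd'], c['avg_fps']):.3f} fps/$")
```

Note that the cheaper card can win on fps/$ while still losing on absolute performance, which is exactly the trade-off being argued about here.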
 
Claiming you are disproving me doesn't mean you are. I can claim I'm disproving you.

Through the last pages I've learned that the 7900 XT is a 4080 competitor and plays Cyberpunk with PT at 4K at 130 fps. Facts!

AMD doesn't undercut anything. It has the RT performance of Turing while charging Ada prices. Apparently everyone cares about those exclusive features; that's why Nvidia is outselling AMD 5 to 1.
 
You are very misguided on the reality of what a PC means. Why do you care so much?
 
Of course, I'm not in disagreement, lol. If your personal experience doesn't align with what I'm telling you, then obviously you'll value your personal experience more than my opinion. But obviously that works for me too: I experience a GPU bottleneck even at 1080p, so you telling me it doesn't happen doesn't really matter to me.

You are still trying to evade my question.
My point is simple.

You don't have to agree with me.
You don't even agree with your own words:
I have the game and an Nvidia card. Don't you think I know more about it than you do?

There is no point wasting time talking to someone who doesn't honor their own statements.
Farewell.
 
The game is so popular they even made a whole 1 hr 15 min documentary for German TV. :wtf: Which, like, never ever happens for any game.

Inside the Game - Cyberpunk 2077: Phantom Liberty (English Version)
https://www.ardmediathek.de/video/i...jMjk1MDktMmZjOC00NWNhLTg2NWUtMmY3NDNmNTJhN2Vj
The documentary is fully packed with dev interviews and behind the scenes. Very insightful & personal. If it's blocked in your country just use a VPN.
 
I'm not evading at all. I'm saying: yes, if you have the game and the card, then for you what I'm saying is irrelevant. If your experience is not the same as mine, then for all you know I'm lying to you. What more of an answer do you require, lol.

You are very misguided on the reality of what a PC means. Why do you care so much?
Care about what? You 3 are spam quoting me like crazy.
 
Funny that you mention all AMD-sponsored titles and games that generally run well on AMD, while hating on Cyberpunk because it runs and looks best on Nvidia, like usual :laugh:

Sadly for you, most games play best on Nvidia hardware because most developers optimize for Nvidia (80% of PC users own Nvidia), and 9 out of 10 new games have RTX features.

But yeah, Starfield plays great on AMD GPUs with no sun present. I play Starfield with DLAA and a sun :D Overrated game, though.

Also, AMD won't even release an RDNA4 high-end GPU. Meanwhile, the RTX 5090 hits in 2025. AMD is done for in the high-end market, really.
 
I think developers are getting lazy with texture compression. The industry basically uses BC7 across the board, and that eats memory gluttonously (in an open world full of detail, for instance). The industry today lacks innovation; in the 90s, when hardware was the limit, brilliant technologies popped up from January to January, and now I think the industry lacks hunger and motivation. Developers are unconcerned about it because they (obviously) have the best machines, but I think they assume everyone else does too!
And the way frame interpolation is being used is pitiful to me. I have seen multiple games that simply don't prune non-visible textures, and as far as I know, GPU culling doesn't discard the underground or the empty rooms in many games.
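For context on the memory point: BC7 is a fixed-rate format that stores each 4x4 texel block in 128 bits, i.e. 1 byte per texel, versus 4 bytes per texel for uncompressed RGBA8. A quick back-of-the-envelope sketch of the footprint of one texture (sizes only, no engine or overhead involved):

```python
# Rough VRAM footprint of a square texture, uncompressed vs. BC7.
# BC7: 16 bytes per 4x4 block = 1 byte/texel; RGBA8: 4 bytes/texel.
def rgba8_bytes(side: int) -> int:
    return side * side * 4

def bc7_bytes(side: int) -> int:
    blocks = ((side + 3) // 4) ** 2  # number of 4x4 blocks, rounded up
    return blocks * 16

def with_mips(base_bytes: int) -> float:
    # A full mip chain adds roughly 1/3 on top of the base level.
    return base_bytes * 4 / 3

side = 4096
print(f"RGBA8:      {rgba8_bytes(side) / 2**20:.1f} MiB")   # 64.0 MiB
print(f"BC7:        {bc7_bytes(side) / 2**20:.1f} MiB")     # 16.0 MiB
print(f"BC7 + mips: {with_mips(bc7_bytes(side)) / 2**20:.1f} MiB")
```

So BC7 is already a 4:1 saving over raw RGBA8; the complaint above is less that BC7 itself is wasteful and more that relying on one 8-bits-per-texel format for everything, with huge texture counts and no pruning, still adds up fast in an open world.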
 
Path tracing actually does make a noticeable difference on the comparison slider against Ultra with RT on or off. However, Ultra doesn't seem to look any different with RT on or off.
Is there something wrong with Ampere drivers? The performance hit vs. 1.x is pretty big, when barely anything changed for RDNA 2.
(attached: two benchmark charts, attachments 314439 and 314440)
You're right, but it's not just the Ampere cards doing worse than RDNA2 because both Ampere and Turing cards suffer a much larger percentage drop than either RDNA1 or RDNA2. It was a bit challenging because only 4 cards from each side appear on both charts, but I did a little math with them. One of the results is mind-boggling:

GeForce Cards:

RTX 2080 Ti 11GB: 55.9->38.8 (-31%) - Turing
RTX 3060 12GB: 35.4->23.9 (-32%) - Ampere
RTX 3070 8GB: 57.0->38.4 (-33%) - Ampere
RTX 3080 10GB: 68.5->49.2 (-28%) - Ampere

Radeon Cards:

RX 5700 XT 8GB: 36.0->29.5 (-18%) - RDNA1
RX 6600 XT 8GB: 34.0->36.3 (+7%) - RDNA2 o_O:twitch::eek:
RX 6700 XT 12GB: 49.9->45.4 (-9%) - RDNA2
RX 6800 XT 16GB: 72.8->66.3 (-9%) - RDNA2

Initially, I compared the RTX 3080 with the RX 6800 XT, because these two cards have been natural performance rivals, within low single digits of each other percentage-wise. Seeing a 28% drop in frames for the RTX 3080 but only a 9% drop for the RX 6800 XT was something I didn't think should be possible. My first thought was that maybe it's because of the difference in VRAM, so I checked the RTX 3060 vs. the RX 6700 XT, since they both have 12GB. Clearly, I was wrong, because the 6700 XT lost the same 9% as the 6800 XT while the 3060 lost a whopping 32% despite having more VRAM than the 3080. Then, just to blow my mind (I had to check that I did the numbers in the correct order), the RX 6600 XT actually gains frames while every other card loses them.
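The per-card deltas above can be recomputed directly from the two fps readings quoted from the charts; a few lines suffice, taking the percentage change relative to the earlier reading:

```python
# Percent change in average fps between the two chart readings,
# computed relative to the earlier (pre-update) value.
results = {
    "RTX 2080 Ti": (55.9, 38.8),
    "RTX 3060":    (35.4, 23.9),
    "RTX 3070":    (57.0, 38.4),
    "RTX 3080":    (68.5, 49.2),
    "RX 5700 XT":  (36.0, 29.5),
    "RX 6600 XT":  (34.0, 36.3),
    "RX 6700 XT":  (49.9, 45.4),
    "RX 6800 XT":  (72.8, 66.3),
}

def pct_change(before: float, after: float) -> float:
    return (after - before) / before * 100

for card, (before, after) in results.items():
    print(f"{card}: {pct_change(before, after):+.0f}%")
```

Running this reproduces the list above, including the RX 6600 XT being the only card that comes out positive.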

I find CP2077 to be a pretty ironic game with respect to GPU performance, because it's Nvidia's biggest RT poster child and crushes Radeons with its RT more than any other game out there. It's always the Nvidia-favouring outlier when RT is turned on (especially when maxed out). What's ironic is that when RT isn't enabled, the game actually prefers Radeon cards. IIRC, Control was similar in this fashion, although it no longer crushes Radeons like it once did (probably a new Radeon driver improved performance dramatically).

That's both insane and funny at the same time. :kookoo: :laugh:
 
Nvidia-sponsored games are neutral in terms of performance. AMD is just not good at RT, and it plummets because the game uses a lot of effects.
 
Uhh, path tracing looks really bad. Funny that, in theory, it's a fuller version of ray tracing, yet it looks even worse, unnatural; even basic light effects are failing... :D
There are definitely serious problems with the implementation right now; it should look better than ray tracing, or at least equal.

Btw, as I see it, AMD GPUs are like dead weight in complex ray tracing... another reason to avoid them. I hope Intel's next generation will be even better at ray tracing than the current one, so there will be some competition against Nvidia. By that time, I hope Intel really ups their driver game too...
 
Nvidia-sponsored games are neutral in terms of performance. AMD is just not good at RT, and it plummets because the game uses a lot of effects.
Since when?
 
Can you name any recent Nvidia-sponsored games that run like crap on AMD? Because I can name a few AMD-sponsored ones that run like crap on Nvidia.
Control, Witcher 3, CP2077, Batman: Arkham, Tomb Raider: all when you turn on Nvidia features.
 
Like fish in a barrel this is. I truly applaud your diligence for all of us.
 
Also, AMD won't even release an RDNA4 high-end GPU. Meanwhile, the RTX 5090 hits in 2025. AMD is done for in the high-end market, really.
You know this because you obviously work for AMD.
 