
Cyberpunk 2077: Phantom Liberty Benchmark Performance

Who cares? It's an Nvidia-sponsored title; in Starfield the reverse is true. Does that mean much? All those cards you deride and debate mean little to 99.9% of people, and the performance you deride is fine for others. Are you really that into your PC? Go play CP2077. I do have to wonder if you bothered, or have the time.
I'd argue Horizon does real cyberpunk better.
And GTA V with mods looks just as real, all in. :p
The person I'm talking to obviously cares, else he wouldn't be quoting me. If you don't care, then move on.
 
The rest of us have to read it. I will move on; you won't. "Nvidia vs AMD 4eva" from you? WTAF, you have a 4090. And, well done.
 
I know they are not our friends, but over the last few years Nvidia has been way more consumer-friendly than AMD. AMD is blocking technologies altogether, gimps performance on competitors' GPUs, etc.
This is nonsense. The 3070 had little VRAM, and the 4070 is the same thing, with a mere 12 GB. Not our friends.
FSR works with everything; DLSS does not. When Nvidia bought PhysX, it was not available to everyone; they certainly want you to buy Nvidia cards just to have their technology. I still can't enable some Nvidia-only Arkham Knight effects on my current AMD GPU. Not possible.

Adaptive sync was for everyone else too; Nvidia's G-Sync was not.
 
That's because AMD has 3% market share. If they made these features exclusive, nobody would use them, lol.
 
In which parallel world does AMD have 3% of the GPU market? Looking at real-world data, I see about 20% in the worst-case scenario. Speaking with large retailers confirms the data.

AMD has tended towards open technologies, even when its market share was larger. They created HBM and decided to leave it free of patents; Nvidia itself is using something that AMD created. They also control the hardware of the consoles, yet they did not include any ultra-proprietary technology that could only run on their hardware to make the market difficult for Nvidia. Note that games are primarily created with consoles in mind, not the other way around.
 
Have to point out that

sadly will be the only one, unless an actual miracle happens.

Yeah, it seems like if there were more CP2077, it would be on the Unreal engine, sadly.
 
In which parallel world does AMD have 3% of the GPU market? Looking at real-world data, I see about 20% in the worst-case scenario. Speaking with large retailers confirms the data.

AMD has tended towards open technologies, even when its market share was larger. They created HBM and decided to leave it free of patents; Nvidia itself is using something that AMD created. They also control the hardware of the consoles, yet they did not include any ultra-proprietary technology that could only run on their hardware to make the market difficult for Nvidia. Note that games are primarily created with consoles in mind, not the other way around.
You realize that not even AMD themselves are using it, right?

The majority of consoles are running nvidia chips btw, not amd.
 
AMD, Nvidia, Intel, and the entire AI industry are developing around HBM, a much larger market than gaming.

The consoles that dictate the PC gaming market are the PS5 and Xbox; Nintendo is irrelevant in this discussion. Even if their games were ported to PC, they would run on any recent iGPU.
 
Nintendo is irrelevant, but the AI industry is relevant to gaming. Okay
 
It's in threads like this that I wonder how much money actually goes into astroturfing, and how much of it is really just ignorance and mindless bias towards trillion dollar corporations.
I don't think it's astroturfing, if you're talking about who I think you're talking about. Astroturfers never raise their heads that far over the parapets, because their work is only valuable to their employer if it's subtle, not so openly biased that it's obvious to everyone.
 
The majority of consoles are running nvidia chips btw, not amd.
Man, you must be from an alternate universe or something.
In this universe, for the current gen, only the Nintendo Switch runs Nvidia hardware; everything else runs AMD.

And it would be great if you could have a look and answer my humble question in #148.
 
If you have the game then you know that even at 1080p you run into a GPU bottleneck.

EG1. You must be from a different universe. In this universe, the majority of consoles sold have an Nvidia chip.
 
That's because AMD has 3% market share. If they made these features exclusive, nobody would use them, lol.
Well, even with the market share, PhysX for instance is dead. Perhaps it would make more sense to make sure everyone can actually use it.
Creative made the same mistake with EAX: its own implementation was superb and really enhanced games like Half-Life 1, System Shock 2, and Thief 1+2. But Creative only shared the EAX technology with other manufacturers in a subpar way; they could merely emulate it, and did so poorly. So that technology died too.
 
I don't know where you get your information from, but you need to come out of your shell and read more objective information. Your blind dedication to Nvidia speaks volumes about your inability to discern propaganda. Your examples of 3% market share for AMD and modern consoles having Nvidia chips speak volumes.
 
Dude, you think your card is a 4080 competitor, and I'm the blindly dedicated one? Come on, you can't even make this up :banghead: :banghead:
 

So if I check the original TPU review, will it show the same performance?

Path tracing seems at this point to be just a technology demonstration, only hitting 60 fps on a 4090 at 1080p.

For mainstream hardware, RT is too big a hit to use.

Ironically looking at the TPU screenshots, I think the RT/PT off pics look the nicest to me.

Well, I checked: the original was faster and used less VRAM, but only by a small amount. I am surprised. :)

 
Dude you think your card is a 4080 competitor and im the blindly dedicated. Come on, you can't even make this up :banghead: :banghead:
Why don't you look at the relative performance charts from TPU? Yes, it says 87%, but those are from day-one reviews, and if you don't think AMD cards get better with age, that would prove you have never owned one. I keep telling you that I am not a noob and have been PC gaming since I was 9 years old. As of right now, these are the most powerful GPUs we have ever seen from all 3 vendors. I know you have a 4090 and are proud of it, but why do you try to deny me my experience with my card? It is probably because you are a victim of propaganda, but don't feel bad; it is rife in today's society. For some context, do yourself a favour and take a look at the 7000 Owners Club, as we are all pretty much happy with our cards, and with the content available, gaming is once again a joy to behold and experience.

I seriously do worry about people like you, as you seem happy to regurgitate the narrative without seeing that it is just that. In reality, no one can tell the difference between 120 and 140 FPS, and as I have said before, FreeSync is the reason; it is objectively more important for PC gaming than any Nvidia accoutrements, as most of them are a detriment to performance. Just look at this path tracing. And now that AI is the focus, how long do you think gaming will really be a focus while they make money hand over fist in the commercial sector?

As an example, look at how only now does Nvidia support 2000 series cards with DLSS 3.5, while AMD has been open source since the start of FSR. I hope you are ready for the next iteration of FSR, as AMD has been great at dispelling the narrative.
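A side note on the 120-vs-140 FPS comparison above: whatever one thinks about perceptibility, the per-frame time gap is objectively small. A quick back-of-the-envelope sketch (the helper function is my own, not from any library):

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

t120 = frame_time_ms(120)   # ~8.33 ms per frame
t140 = frame_time_ms(140)   # ~7.14 ms per frame
delta = t120 - t140         # ~1.19 ms per frame

print(f"120 FPS: {t120:.2f} ms/frame")
print(f"140 FPS: {t140:.2f} ms/frame")
print(f"difference: {delta:.2f} ms/frame")
```

So the gap between 120 and 140 FPS is a little over one millisecond per frame, which is part of why opinions differ on whether it is noticeable.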
 
Denial

Why are you writing walls of text? I'm not trying to deny you anything; I'm just stating the facts: the 7900 XT is a 4070 Ti competitor in raster and a 3080 competitor in RT.
 
If you have the game then you know that even at 1080p you run into a GPU bottleneck.

EG1. You must be from a different universe. In this universe, the majority of consoles sold have an nvidia chip.
Playing blind with me, eh?

This is what you said at #136
I have the game and an nvidia card. Don't you think I know more about it than you do?

At #148 I asked you one thing.
So I DO have the game and I DO have the card.
Does it automatically invalidate everything you said now?
If it doesn't,
What makes you think that 'having the game and the card' is a valid argument in the first place?

Please, answer my question.
I DO have the game and the card, so I know at least AS MUCH AS YOU DO.

Do you agree?

(If you agree, then this invalidates everything you've said before, because my words hold the same 'weight' as yours.)
(If you disagree, then this invalidates everything you've said before, because 'having the game and the card' doesn't grant me any privileges; you are the same.)

Come on, answer me.

 
Am I reading this right, though? AMD GPUs now have lower raster performance than Nvidia with this 2.0? How can that be?
 
I don't think it's astroturfing, if you're talking about who I think you're talking about. Astroturfers never raise their heads that far over the parapets, because their work is only valid to their employer if it's subtle and not openly biased so clearly that it's obvious to everyone.
I don't know; all the talking points are there. The "AMD's incompetence in RT" message gets repeated ad nauseam, and CP2077's path tracing performance is mentioned as if it were the single most important metric for the near future.

CP2077's path tracing performance isn't the future. It's not the future even for CDPR. This DLC was RED Engine's last gasp, as CDPR moves to Unreal Engine 5.


 
Well, you know, since 2077 became THE tech demo, it's natural for biased people to latch onto the single obvious advantage they found in their preferred product while pretending the other 99% of the game doesn't matter.
 
Playing blind with me, eh?

This is what you said at #136


At #148 I asked you one thing.


Please, answer my question.
I DO have the game and the card, so I know at least AS MUCH AS YOU DO.

Do you agree?
No, of course not. It would be unreasonable to do so. You are basically asking me whether a stranger on the internet (you) is as reliable as my own senses. Nope. You might simply be lying, for example.

I don't know; all the talking points are there. The "AMD's incompetence in RT" message gets repeated ad nauseam, and CP2077's path tracing performance is mentioned as if it were the single most important metric for the near future.

CP2077's path tracing performance isn't the future. It's not the future even for CDPR. This DLC was RED Engine's last gasp, as CDPR moves to Unreal Engine 5.


It's not just CP2077 where AMD cards fall flat on their face. In anything that actually has RT, and not just the single RT effect running at 1/8th the render resolution as in AMD-sponsored games, AMD competes with last gen, if even that.

And that's fine; if you don't care about RT, good for you. But people trying to convince everyone else that AMD and Nvidia are close in RT is just insanity.
 