
AMD Radeon RX 7900 XTX

Is there any logical reason to measure performance per watt based on CP2077 and not the overall average?
Power is measured on a different machine than the normal performance runs. Testing all games would take forever, more than perf/watt is worth. I picked CP2077 because it's highly GPU-bound, and everybody definitely has their drivers well-optimized for that title.
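As a rough sketch of that methodology (the FPS and board-power numbers below are hypothetical placeholders, not the review's data), perf/watt is just the ratio of frame rate to average board power in a single GPU-bound title:

```python
# Minimal sketch of the perf/watt metric described above. The FPS and
# board-power readings are hypothetical placeholders, not TPU's data.

# Hypothetical Cyberpunk 2077 4K results: (average FPS, average board power in W)
results = {
    "RX 7900 XTX": (62.0, 356.0),
    "RTX 4080":    (60.0, 304.0),
    "RTX 4090":    (80.0, 411.0),
}

for card, (fps, watts) in results.items():
    print(f"{card}: {fps / watts:.3f} FPS per watt")
```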
 
I guess those rumors about AMD hardware bugs were right. In the AMD slides, from AMD themselves, I swear it said this card would push 3 GHz. If it actually could have made it that high on frequency, I think it could have been a competitor to the 4090. Right now the card is kind of all over the place in FPS, frame times and power consumption. The card was clearly not ready, but they pushed it out anyway. Let's see what they can do to salvage this in the next couple of months with drivers.
I dunno, the overclocking results don't seem to scale so well with frequency.
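For reference, the scaling check implied here, with hypothetical stock/OC numbers (perfect scaling would mean the FPS gain matches the clock gain):

```python
# Sketch of the frequency-scaling check above, with hypothetical numbers.
stock_clock, oc_clock = 2600.0, 2900.0   # MHz (hypothetical)
stock_fps,   oc_fps   = 100.0,  104.0    # FPS (hypothetical)

clock_gain = oc_clock / stock_clock - 1  # ~+11.5%
fps_gain   = oc_fps   / stock_fps   - 1  # ~+4.0%

print(f"scaling efficiency: {fps_gain / clock_gain:.2f}")
# ~0.35: only about a third of the extra clock shows up as FPS.
```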
 
It did, but there are no games using SER by default yet...

Sad to say, but RDNA2 was more competitive with Ampere than RDNA3 is with Ada (probably just because of TSMC).
Nvidia's cards use the 4nm process, while AMD is using the 5nm and 6nm processes, but Nvidia is paying more, and thus we see the $1600 4090 and $1200 4080, while AMD is able to offer slightly cheaper cards.

I don't think Nvidia can reduce the price of either of the 4000-series GPUs by much, as they would likely be facing selling at cost if they lowered the RTX 4080 to, say, $900; anything less than that and I think they would likely be selling at a loss.

I think the RTX 4090 can be sold for quite a bit less before it's essentially selling at cost, but Nvidia is not going to do that! I believe they could lower the RTX 4090 to $1100 and sell it at cost.

Again, people also need to consider R&D, marketing, testing, driver and feature development, etc., when accounting for the cost of these GPUs. If it were purely the cost of producing them, plus packaging, plus shipping, then you could easily go down to something like $400 for the RTX 4090, but you've got to understand that they need to leave leeway for AIB partners to make money, and you have to account for the cost of developing these cards in the first place!

I think AMD is in a slightly better position to enter into a price war with Nvidia, but they are also limited in how "cheap" they can go.

To me the 7900 XTX is extremely competitive with the RTX 4080: on average about 5% faster and $200 cheaper! We also see that there are several games in which the 7900 series clearly has some performance issues, and AMD is going to have to address these in upcoming driver releases. I'm certain these cards will gain at least 5% more performance in the next few months as AMD irons out issues and optimizes some games through drivers. We have clearly seen in every past generation that their GPUs gain significant performance over time as AMD optimizes performance in future driver releases!

Nvidia on the other hand pretty much never gain performance in subsequent driver releases and in some cases actually lose performance as Nvidia fix stability issues in various games!
 
Probably, if you normalise for the change in shader count and frequency.
Look at how close the 7900 XT is to the 6950 XT :twitch: 126 vs 113 is essentially the same performance once you normalise for shaders and clocks.

AMD Radeon RX 7900 XTX review - The Witcher III Wild Hunt (guru3d.com)
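A rough way to sanity-check that normalisation (shader counts are the cards' native SP counts; the clocks are approximate boost figures, so treat this as a ballpark):

```python
# Rough normalisation of the two results quoted above. Clocks are
# approximate boost figures, so the output is a ballpark only.
cards = {
    # name:       (fps, shaders, approx. boost MHz)
    "RX 7900 XT": (126, 5376, 2400),
    "RX 6950 XT": (113, 5120, 2310),
}

for name, (fps, shaders, mhz) in cards.items():
    # FPS per shader-GHz: a crude per-ALU throughput metric
    print(f"{name}: {fps / (shaders * mhz / 1000):.4f} FPS per shader-GHz")
# The two values land within ~2% of each other: the 7900 XT's lead is
# almost entirely more shaders at higher clocks, not more per-ALU work.
```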

To be honest, this is where I figured the cards would rank on the average metrics, give or take a bit: the 7900 XTX comfortably sitting between the 4080 and 4090, with the 7900 XT close to the 4080 on average.

What it shows to me is that AMD is not extracting all the available ILP at the moment, so some games see poor gains while others are pretty damn good. I am sure that if the clocks were as high as AMD wanted to begin with, then even titles like The Witcher 3, where ILP is low and the dual-issue shaders aren't being used well, would see raster performance above the 4080. And in cases where AMD has extracted the ILP their shader design needs, you get situations where the XTX matches or even beats the 4090.
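To illustrate the dual-issue constraint in toy form (this is a conceptual model, not RDNA3's actual scheduler): a pair of instructions can only issue together when the second doesn't consume the first one's result, so low-ILP code falls back to single issue.

```python
# Toy model of dual-issue utilisation; NOT RDNA3's actual scheduler.
# Each instruction is (dest, src1, src2). A pair can dual-issue only if
# the second instruction doesn't read the first one's destination.

def issue_cycles(instrs):
    cycles, i = 0, 0
    while i < len(instrs):
        if i + 1 < len(instrs) and instrs[i][0] not in instrs[i + 1][1:]:
            i += 2          # independent pair: dual-issue in one cycle
        else:
            i += 1          # dependent: single issue
        cycles += 1
    return cycles

# Low-ILP chain: every op feeds the next, so nothing can pair up
dependent   = [("a","x","y"), ("b","a","y"), ("c","b","y"), ("d","c","y")]
# High-ILP code: all ops independent, so every pair dual-issues
independent = [("a","x","y"), ("b","x","z"), ("c","w","y"), ("d","w","z")]

print(issue_cycles(dependent))    # 4 cycles
print(issue_cycles(independent))  # 2 cycles: twice the throughput
```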
 
Where is the performance increase? There is none; it's within the margin of error.
I feel like my margin of error is considerably lower than that. I think those 2-3% improvements are real.
 
Bulldozer was in some cases slower than the previous-gen Phenoms. That was the majority of the disappointment behind Bulldozer. Is RDNA3 slower in some cases than RDNA2?
Dual ALUs in Bulldozer, dual ALUs in RDNA3 (again, if I am describing it wrong, please feel free to correct me).

That's why I was thinking of Bulldozer.
As for performance, it's not much faster. It's not as fast as AMD was promising in its slides. In fact, if we use the 12288 number for the SPs, performance is nowhere near where we should be expecting it. So calling RDNA3 faster is technically correct, but not in a meaningful way.
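A back-of-the-envelope version of that argument (the 12288 figure counts each dual-issue ALU twice; the clocks and the observed uplift below are rough, illustrative numbers):

```python
# Back-of-the-envelope check of the scaling argument above. The 12288
# SP figure counts each dual-issue ALU twice; clocks are approximate
# and the observed uplift is a rough 4K average, so this is illustrative.

sp_new, clk_new = 12288, 2.5    # 7900 XTX marketed SPs, ~GHz
sp_old, clk_old = 5120,  2.31   # 6950 XT SPs, ~GHz

theoretical = (sp_new * clk_new) / (sp_old * clk_old)
observed    = 1.40              # roughly +40% at 4K (illustrative)

print(f"theoretical: {theoretical:.2f}x, observed: {observed:.2f}x")
# ~2.6x theoretical vs ~1.4x observed: the dual-issue ALUs sit idle
# whenever the compiler can't find independent instruction pairs.
```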
 
I'm surprised at just how big the differences between CPUs are at 1080p compared to 4K. Makes me wonder how low my scores would be with this GPU, as I have a Ryzen 5 2600X.

You should be able to extrapolate from this data and this review. But these cards are simply not made for 1080p; they are super CPU-bound at that resolution. You can see it even with the 13900K, where results bunch up against an invisible wall (like Borderlands 3 at 1080p).
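One simple way to frame that extrapolation (all throughput numbers below are hypothetical): delivered FPS is roughly the minimum of what the CPU and GPU can each sustain.

```python
# Minimal CPU/GPU bottleneck model for the extrapolation above.
# All throughput caps are hypothetical placeholders.

def delivered_fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

gpu_caps = {"1080p": 280.0, "1440p": 200.0, "4K": 105.0}   # per-resolution GPU limits
cpu_caps = {"13900K": 250.0, "Ryzen 5 2600X": 120.0}       # roughly resolution-independent

for cpu, c in cpu_caps.items():
    row = ", ".join(f"{res}: ~{delivered_fps(c, g):.0f}" for res, g in gpu_caps.items())
    print(f"{cpu} -> {row}")
# The slower CPU flattens 1080p and 1440p at ~120 FPS (the "invisible
# wall"), while the 4K result is barely affected.
```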
 
Well, it's... okay, I guess? However, with these numbers Huang can either simply continue giga-milking mode or deliver a fatality just by dropping the 4080 to $1000. Unfortunate for everyone on the consumer side. GG to AMD for enabling better margins for themselves, if they're really saving by going with chiplets.
From my experience, people don't really care about the details. The only thing that seems to move product is brand perception. For example, I was at a Micro Center about 3 weeks ago and saw two different individuals walk out with $700 RTX 3070 Ti cards (one ASUS Strix and one MSI Suprim) when they could have got a 6900 XT at the same store for only $650. The salespeople tried to steer them in the right direction, but they were not having it. Same trend everywhere: Newegg sold out of $800 RTX 3080s long before selling out of their $700-$750 RX 6900 XTs.

Perception trumps all else, and in light of that I don't see Huang taking his foot off the "moar money" gas pedal. As long as they have the halo product (RTX 4090), people will just mindlessly flock to them. And I'm not an Nvidia hater by any stretch of the imagination; I bought a 3090 on release day myself.
 
To be honest, this is where I figured the cards would rank on the average metrics, give or take a bit.
I think Guru3D is testing the built-in benchmark, not the actual game. Maybe AMD optimized their shader compiler for dual issue only for the integrated benchmarks?
 
Well, I have to say I'm somewhat happy with its performance, as it is better in the way most people are going to use it. But I'm disappointed that it's not further above the 4080, though I had a feeling it wouldn't be, based on the way AMD was advertising it. Memory overclocking was interesting, since regular overclocking barely helps, if at all. I wonder if the cards are being held back by relying on the Infinity Cache instead of GDDR6X memory. I get that they're doing it because of the power consumption of said memory, but the performance jump from overclocking the memory was pretty significant, so I wonder if that's the limiting factor.
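A toy model of the trade-off being wondered about here (the hit rates and the cache bandwidth figure are illustrative guesses, not AMD's numbers): effective bandwidth blends cache hits with raw VRAM accesses, so as the hit rate drops at higher resolutions the raw memory speed matters more.

```python
# Toy model of the Infinity Cache vs. raw VRAM bandwidth trade-off.
# Hit rates and the cache bandwidth figure are illustrative guesses.

def effective_bw(hit_rate, cache_bw, vram_bw):
    return hit_rate * cache_bw + (1 - hit_rate) * vram_bw

vram_bw  = 960.0    # GB/s, 7900 XTX's GDDR6 (20 Gbps x 384-bit)
cache_bw = 3500.0   # GB/s, on-die cache (illustrative)

for res, hit in [("1080p", 0.70), ("1440p", 0.58), ("4K", 0.42)]:
    print(f"{res}: ~{effective_bw(hit, cache_bw, vram_bw):.0f} GB/s effective")
# As the hit rate falls at 4K, effective bandwidth slides toward the raw
# VRAM number, which is one reason a memory OC can pay off so visibly.
```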

As for everyone complaining about ray tracing performance: in all seriousness, I want to know whether people are actually using the tech in the small list of games that support it. I get that it's something to compare and talk about when lining up all the cards, but let's be frank: the performance drop, even with DLSS enabled, is still huge to this day. The only card that delivers reasonable performance with it all enabled is the 4090, and even it struggles in many of the games.
 
The CPU bottlenecks very hard in some games, so the ~45% uplift vs the 6900 XT at 4K is pessimistic. For any game utilising the GPU properly it gets closer to 60%, and in some games even 80% over the 6900 XT (in other online reviews). So the "up to 54% higher efficiency" claim holds up.
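A quick sanity check of that efficiency claim, using the official board powers (6900 XT: 300 W, 7900 XTX: 355 W) and the uplift figures mentioned above:

```python
# Sanity check of the perf/watt claim above. Board powers are the
# official TBPs; the uplift figures are the ones quoted in the post.

tbp_old, tbp_new = 300.0, 355.0        # 6900 XT, 7900 XTX
power_ratio = tbp_new / tbp_old        # ~1.18x

for uplift in (1.45, 1.60, 1.80):
    print(f"+{uplift - 1:.0%} perf -> {uplift / power_ratio - 1:+.0%} perf/W")
# +45% -> +23%, +60% -> +35%, +80% -> +52%: the "up to 54%" figure only
# holds in the best-case, fully GPU-bound games.
```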


Well, they stepped up this time to a Raptor Lake i9 (so the 4K results should be pretty GPU-limited).

Maybe swap 1080p for 5K, just for the top-end cards. Also, another new AMD-smackdown RT title is missing from this test: Portal RTX!
 
I dunno, the overclocking results don't seem to scale so well with frequency.
That's why I think there is something wrong with the hardware; the slides don't add up. Where's the performance per watt, and where's the raster performance? It just seems like something went wrong and AMD did not hit the marks they were aiming for in the slides. Either that, or the drivers and AMD's marketing are really bad this time around.
 
From my experience, people don't really care about the details. The only thing that seems to move product is brand perception. For example, I was at a Micro Center about 3 weeks ago and saw two different individuals walk out with $700 RTX 3070 Ti cards (one ASUS Strix and one MSI Suprim) when they could have got a 6900 XT at the same store for only $650. The salespeople tried to steer them in the right direction, but they were not having it. Same trend everywhere: Newegg sold out of $800 RTX 3080s long before selling out of their $700-$750 RX 6900 XTs.

Perception trumps all else, and in light of that I don't see Huang taking his foot off the "moar money" gas pedal. As long as they have the halo product (RTX 4090), people will just mindlessly flock to them. And I'm not an Nvidia hater by any stretch of the imagination; I bought a 3090 on release day myself.

People and reviewers keep throwing that at buyers. How many people has AMD burned in the last couple of years? It's not exactly like that "perception" came out of thin air.
 
What it shows to me is that AMD are not extracting all the ILP available at the moment so some games see crap gains and others are pretty damn good. I am sure if the clocks were as high as AMD wanted to begin with even the likes of Witcher 3 where the ILP is pretty low and not making good use of the dual issue shaders the raster performance would be above that of the 4080 and then in cases where AMD have extracted ILP for their shader design you get situations where the XTX matches or even beats the 4090.
Vega 64 all over again. Not even joking. I'm sick of AMD repeating the same mistakes over and over again. Have they not learned?
 
By the way, regarding RT performance: checking RDNA3's architecture slides, there's only material about efficiency and algorithm optimizations for RT. So I suppose AMD's RT cores are still not doing BVH traversal and final ray shading, offloading them to the standard shader cores just like RDNA2 did; hence the performance drop in games is more or less the same. Maybe once game devs wrap their heads around the architectural improvements they can squeeze out a bit more, but if the cores are still not as "complete" as Nvidia's and Intel's, there are no surprises here.
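In very rough terms, the difference being described looks like this (a conceptual model only, not either vendor's actual pipeline; the per-step costs are hypothetical):

```python
# Conceptual model of hybrid vs. fully hardware-accelerated RT; NOT
# either vendor's actual pipeline. Costs are hypothetical ALU cycles.

HW_INTERSECT = 1   # hardware ray/box and ray/triangle tests (both designs)
SW_TRAVERSAL = 4   # BVH traversal bookkeeping done on the shader cores

def hybrid_cost(bvh_depth):
    # Shaders run the traversal loop themselves, stealing cycles from
    # shading work; only the intersection tests are accelerated.
    return bvh_depth * (SW_TRAVERSAL + HW_INTERSECT)

def full_hw_cost(bvh_depth):
    # A dedicated RT unit walks the tree; shaders only see the hit.
    return bvh_depth * HW_INTERSECT

depth = 20  # hypothetical BVH depth
print(f"hybrid: {hybrid_cost(depth)} cycles, full HW: {full_hw_cost(depth)} cycles")
```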

From my experience, people don't really care about the details. The only thing that seems to move product is brand perception. For example, I was at a Micro Center about 3 weeks ago and saw two different individuals walk out with $700 RTX 3070 Ti cards (one ASUS Strix and one MSI Suprim) when they could have got a 6900 XT at the same store for only $650. The salespeople tried to steer them in the right direction, but they were not having it. Same trend everywhere: Newegg sold out of $800 RTX 3080s long before selling out of their $700-$750 RX 6900 XTs.

Perception trumps all else, and in light of that I don't see Huang taking his foot off the "moar money" gas pedal. As long as they have the halo product (RTX 4090), people will just mindlessly flock to them. And I'm not an Nvidia hater by any stretch of the imagination; I bought a 3090 on release day myself.

Lamentable! Also, I can't see AMD ever beating Nvidia's halo product; I don't think they have the R&D budget. Plus, the 4090 isn't even a full die.
 
Well, they stepped up this time to a Raptor Lake i9 (so the 4K results should be pretty GPU-limited).
Just to clarify, I've retested every single card on the new rig; the 6900 XT numbers were tested last week on the 22.11.2 drivers.
 
That can be fixed easily: stop giving Nvidia money, regardless of what they have to offer. That's the thing, though: people want AMD to produce a 4090 killer but offer it at US$500, just so Nvidia is "forced" to cut prices and the rabid followers can still give Nvidia their money.
I'm gonna leave this here
Top Nvidia shareholders

Vanguard Group Inc. representing 7.7% of total shares outstanding

BlackRock Inc. representing 7.2% of total shares outstanding

Top AMD shareholders

Vanguard Group Inc. representing 8.28% of total shares outstanding

BlackRock Inc. representing 7.21% of total shares outstanding

 
Some people here sound like AMD has personally insulted them by releasing this abysmal disgrace of a product. If you don't like it, don't buy it. It's that simple.

Here in Germany the cheapest RTX 4080 is 1350€; if AMD actually manages to have XTX cards in stock tomorrow for 1150€, that would be 200€, or ~15%, cheaper for the same raster performance.

I think I'm finally going to replace my Vega 64; I just need a waterblock to go along with the card. Does anybody know when Alphacool releases theirs?
 
I'm gonna leave this here
Top Nvidia shareholders

Vanguard Group Inc. representing 7.7% of total shares outstanding

BlackRock Inc. representing 7.2% of total shares outstanding

Top AMD shareholders

Vanguard Group Inc. representing 8.28% of total shares outstanding

BlackRock Inc. representing 7.21% of total shares outstanding

hahaha, OK, the prices now make total sense
 
Nvidia on the other hand pretty much never gain performance in subsequent driver releases and in some cases actually lose performance as Nvidia fix stability issues in various games!
categorically false
 
Maybe I'm weird, but I just don't understand how any of these $900-1200 cards make sense to purchase. If I'm going to spend four figures (or just shy of it) on a graphics card, I want the best. And none of these are the best; the 4090 trounces them.

I either want the best, or I want some value, and these cards exist in a no-man's land for me. Where are the great new GPUs between $500 and $700? Nowhere in sight. Both AMD and Nvidia are refusing to move the needle on performance in what was historically the high-end pricing tier.
 
Some people here sound like AMD has personally insulted them by releasing this abysmal disgrace of a product. If you don't like it, don't buy it. It's that simple.

Here in Germany the cheapest RTX 4080 is 1350€; if AMD actually manages to have XTX cards in stock tomorrow for 1150€, that would be 200€, or ~15%, cheaper for the same raster performance.

I think I'm finally going to replace my Vega 64; I just need a waterblock to go along with the card. Does anybody know when Alphacool releases theirs?
For me the card would be fine if it were consistent, but it is not; it's all over the place. Some people have been burned where AMD never fixed the issues. With the 6000 series AMD did fix the issues, but why can't they get it right from the start? Every generation AMD has issues; at $1,000 we should at least get a consistent card.
 
I'm gonna leave this here
Top Nvidia shareholders

Vanguard Group Inc. representing 7.7% of total shares outstanding

BlackRock Inc. representing 7.2% of total shares outstanding

Top AMD shareholders

Vanguard Group Inc. representing 8.28% of total shares outstanding

BlackRock Inc. representing 7.21% of total shares outstanding

Interesting, but they only account for less than 20% of the shares, so who has the other 80%, and are they also in cahoots?
 
People and reviewers keep throwing that at buyers. How many people has AMD burned in the last couple of years? It's not exactly like that "perception" came out of thin air.
Every time someone points that out, you start hearing the cope brigade: "oh well, Nvidia has the same problems (lol, no)", "oh, Nvidia is evil", "gamers just don't care about AMD", etc.

Few are willing to admit that AMD dug that hole for themselves, starting in 2006 when they overpaid by over $2 billion for ATI, and the subsequent decade of crap drivers and poorly supported releases earned AMD the reputation of the budget brand. Even recently we had the RDNA downclocking bug and the first-gen APUs relying on OEMs for drivers, both of which required pressure from the tech media to fix.

Also, on a side note: did you know AMD still sells Bulldozer chips for Chromebooks? They suck, and they have that nice red AMD sticker to draw your attention to who made the slow garbage. That's really helping the public image. Like you said, perception is everything, and AMD's optics have not been all that great, historically speaking.
Maybe I'm weird, but I just don't understand how any of these $900-1200 cards make sense to purchase. If I'm going to spend four figures (or just shy of it) on a graphics card, I want the best. And none of these are the best; the 4090 trounces them.

I either want the best, or I want some value, and these cards exist in a no-man's land for me. Where are the great new GPUs between $500 and $700? Nowhere in sight. Both AMD and Nvidia are refusing to move the needle on performance in what was historically the high-end pricing tier.
What you consider the best, or good value, is subjective. For those who chase high frame rates but don't use RT, there are multiple high-end GPUs to choose from; those who want 4K are going to need more power than 1440p users, etc. There's also the angle of high-end GPUs lasting a long time, whereas mid-range GPUs will need to be replaced more often to handle newer games at settings higher than poverty tier. Those 480s and 1060s are not holding up as well as the 1080 Ti in modern games, for example.
Interesting, but they only account for less than 20% of the shares, so who has the other 80%, and are they also in cahoots?
Didn't you know? Without BlackRock or Vanguard, AMD would be selling the 7900 XTX for $250 to compete with the $300 4090, and all CPUs would have a billion cores!
 
£630, all UK taxes included, for an MSI 6900 XT Trio X seems like a great deal now. It spends most of its time at 2500 MHz, and I'm playing CP2077 at Ultra with ultra ray tracing at 40-100 FPS (1080p). I wish I'd bought two, as the wife "needs" something similar for the Witcher 3 remaster. I'd love a 4090 if someone gave me one, because prices are stupid, and it now looks like the stupid is set to continue a bit longer.
 