Tuesday, November 15th 2022
AMD Confirms Radeon RX 7900 Series Clocks, Direct Competition with RTX 4080
AMD in its technical presentation confirmed the reference clock speeds of the Radeon RX 7900 XTX and RX 7900 XT RDNA3 graphics cards. The company also made its first reference to a GeForce RTX 40-series "Ada" product, the RTX 4080 (16 GB), which is going to launch later today. The RX 7900 XTX maxes out the "Navi 31" silicon, featuring all 96 RDNA3 compute units, or 6,144 stream processors; the RX 7900 XT is configured with 84 compute units, or 5,376 stream processors. The two cards also differ in memory configuration. While the RX 7900 XTX gets 24 GB of 20 Gbps GDDR6 across a 384-bit memory interface (960 GB/s), the RX 7900 XT gets 20 GB of 20 Gbps GDDR6 across a 320-bit interface (800 GB/s).
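The bandwidth figures above follow directly from the per-pin data rate and bus width. A minimal sketch of that arithmetic:

```python
# GDDR6 effective bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8 bits-per-byte
def gddr6_bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    """Returns effective memory bandwidth in GB/s."""
    return data_rate_gbps * bus_width_bits / 8

print(gddr6_bandwidth_gbs(20, 384))  # RX 7900 XTX -> 960.0 GB/s
print(gddr6_bandwidth_gbs(20, 320))  # RX 7900 XT  -> 800.0 GB/s
```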
The RX 7900 XTX comes with a Game Clocks frequency of 2300 MHz and a 2500 MHz boost clock, whereas the RX 7900 XT comes with a 2000 MHz Game Clocks frequency and a 2400 MHz boost clock. The Game Clocks frequency is the more relevant of the two. AMD achieves 20 GB of memory on the RX 7900 XT by using ten 16 Gbit GDDR6 memory chips across a 320-bit wide memory bus, created by disabling one of the six 64-bit MCDs. This also subtracts 16 MB from the GPU's 96 MB Infinity Cache, leaving the RX 7900 XT with 80 MB of it. The slide describing the specs of the two cards compares them to the GeForce RTX 4080, which is the card the two are positioned to compete against, especially given their pricing. The RX 7900 XTX is 16% cheaper than the RTX 4080, and the RX 7900 XT is 25% cheaper.
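The configuration and pricing claims above can be sanity-checked with some quick arithmetic. Note the dollar figures are the launch MSRPs ($999 XTX, $899 XT, $1,199 RTX 4080), which the article itself does not state:

```python
# RX 7900 XT memory configuration math, per the description above
chips = 10                       # ten 16 Gbit GDDR6 chips
capacity_gb = chips * 16 / 8     # 16 Gbit = 2 GB per chip -> 20.0 GB
bus_bits = chips * 32            # each GDDR6 chip is 32 bits wide -> 320-bit
cache_mb = 96 - 16               # one disabled 64-bit MCD removes 16 MB -> 80 MB
print(capacity_gb, bus_bits, cache_mb)  # 20.0 320 80

# Price deltas vs. the RTX 4080 (the article rounds these to 16% and 25%)
for name, price in (("RX 7900 XTX", 999), ("RX 7900 XT", 899)):
    print(f"{name}: {(1 - price / 1199) * 100:.1f}% cheaper")
```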
166 Comments on AMD Confirms Radeon RX 7900 Series Clocks, Direct Competition with RTX 4080
I have been waiting for years for realistic audio in games. I'm probably the only one, as industry interest is next to zero.
Meanwhile, locking voltage/freq is a dick move, especially on a top-of-the-line GPU, where enthusiasts (who are more likely to buy these products) like to overclock.
Enthusiasts should realise that overclocking is a thing of the past - we live in an age when GPUs and CPUs come delivering their fullest right out of the box. If you want some personalisation, you should underclock/undervolt more than anything, imo.
Also, some entries on this chart are pure comedy - the Intel Arc cards especially.
XTX is just a better chip.
Maybe they use a pretty low target for the bin on XT so that they can keep the price relatively low for both.
It's also a new type of product wrt the chips they use. I think they're being conservative to keep yields up, so again, binning.
If it's all about size, then why do they do the same with their lower-end chips, like the GA104?
If the RT performance gap is not wider than previous gen, I think this card will be a good choice vs. the 4080.
I'm also curious to see what FSR 3.0 will bring, and I am thankful for the power requirements of this 7900 series. Having said that, it's generally very upsetting to see both vendors normalizing $1k~$2k for high-end gaming GPUs. This used to be a whole system budget not too long ago.
The reasoning: if the chip is already cut down, it's already not 'perfect', so it makes sense that the rest of that chip is also less likely to be optimal. This doesn't translate to practice for most, but the emotion is what it is.
Another emotional aspect: 'I'm paying this much, it better be the whole thing.' They have volume on each SKU; they move enough units to do the process of gradual improvement on each one. And it moves both ways - remember the re-purposed 104s on lower-end products.
I remember when BF Bad Company 2 came out and how there was a focus on the audio and everyone talked about it.
I thought it was a turning point but....alas.....
We all know how important audio is to the experience, yet the budget and attention it gets is zero.
Hell, I remember AMD back in the day also had something that was meant to focus on and increase the quality of audio in games - I think it was related to what eventually became Vulkan.
That said, being a fan of Digital Foundry's work, I appreciate RT and what it can do/does.
We get it, lots of you don't care about RT - you've only been shouting that from the rooftops since Turing's debut. Believe me, we hear you loud and clear.
Would y'all please be mindful that there is a subset of enthusiasts who do want good RT performance in future purchases, and for it not to be an afterthought or a checkbox to tick.
If it's not for you, awesome, but I tire of hearing nObOdY cArEs ABouT rAy TrACinG when clearly people do, so please, stop (silly to even ask I know, this will probably make the vocal among you double down and write me an essay on why RT is a gimmick or that 'most' people don't care, good on you!).
Personally I'd love to be able to very strongly consider AMD GPUs, but a prerequisite of that is for them to take RT more seriously and lessen the hit. They can certainly swing even more buyers their way if they deliver on that, so I eagerly wait to see if the top product has made significant strides.
In this price category, absolute performance is the bar - not price/performance or even power.
There might be a small group of users who will stretch to the $1,000 range but no more, but most can just drop an extra $700-1,000 without real problem - gaming is their hobby, and thus it is a legitimate cost.
To be clear, I'm not one of those - quite the opposite (see my current GPU) - but it's the reality today. Many are willing to pay whatever extra for the ultimate quality/performance.
The cheapest 4090 in Germany is €2,300+. That is €1,860+ if you strip VAT.
Also, cough:
www.tomshardware.com/news/igors-lab-evga-decision-leaving-gpus-is-its-fault
This would mean that in stock configurations, the 7800 XT would be better value than the 7900 XT, and the 7900 XTX would also be better value than the 7900 XT. However, depending on what is failing to land parts in the 7900 XT bin, it is possible that overclocking headroom is rather strong on the card. So even though stock performance is not that great from a value perspective, overclocked performance could be enough that some enthusiast buyers who like to tinker see value in the 7900 XT even at $900, ensuring there is a market for it - albeit a small, low-volume market that allows AMD to focus more on the XTX SKU.
Then there is the cut N32 die. Will AMD bother with a vanilla 7800, or would they just cut N32 down to about 5K shaders, pair it with 3 MCDs, and call it a 7700 XT? I personally think the latter, and with a ~200 mm² die you are looking at stupid numbers per wafer, so supply of the 7700 XT and 7800 XT should be far better than supply of the 6800 XT and 6700 XT was. Even if the 7700 XT has to use perfectly good dies, AMD will have calculated for that.
If you look closer at reviews, the 4090 is basically double the performance of a 3080 10 GB. The 3080's MSRP was set at $700 at launch, which we all know was damn high at the time - not to mention the enormous street prices. You may argue the 4090 is faster than the 3090 and 3090 Ti and is cheaper, or has a better performance-per-dollar ratio; the problem is those 3090 and 3090 Ti cards had crazy pricing anyway. Also, the 4090 is the fastest, thus the stupid pricing, and obviously you have to pay a premium for it. There is one problem though: it is not the fastest it could be, because it is not fully unlocked. You buy a crippled GPU for an astronomical price, which will be replaced by a fully unlocked GPU at an even more ridiculous price.
The 4090 so far, despite its performance, has disappointed on all other fronts. The 4080 16 GB? Same thing, considering its price. That goes for basically every GPU NV is planning to release, going by the current information we have. Hopefully something will change, but I doubt it.
Affordability is kinda relative - $1,600 is kinda pocket change for some people :)
RT performance is absolutely an important aspect of the value of Nvidia's GPUs - the question is how important. For me, it's ... idk, maybe 5:1 raster-v-RT? Rasterization is a lot more important overall, and for the foreseeable lifetime of this product and its contemporaries, I don't see the delta between them as that meaningful long term. When my 6900 XT performs between a 3070 and 3080 in RT depending on the title, and the 3090 Ti is maybe 20% faster than those on average, that means they'll all go obsolete for RT at roughly the same time. There are absolutely differences, but I don't see them as big enough to dismiss AMD outright.
All of the above can be considered handicaps, which in my book are silly to even talk about. Your argument belongs in the same category.