Monday, December 2nd 2024

AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

Apparently, AMD's next-generation gaming graphics card is closer to launch than anyone in the media expected, with mass production of the so-called Radeon RX 8800 XT poised to begin later this month, if sources on ChipHell are to be believed. The RX 8800 XT will be the fastest product in AMD's next-generation lineup and will sit in the performance segment, succeeding the current RX 7800 XT. There will not be an enthusiast-segment product in this generation, as AMD looks to consolidate in the key market segments with the most sales. The RX 8800 XT will be powered by AMD's next-generation RDNA 4 graphics architecture.

Some spicy claims are being made about the RX 8800 XT. Apparently, the card will rival the current GeForce RTX 4080 or RTX 4080 SUPER in ray tracing performance, which would mean a massive 45% increase in RT performance over even the current flagship RX 7900 XTX. Meanwhile, the power and thermal footprint of the GPU is expected to shrink with the switch to a newer foundry process, with the RX 8800 XT expected to have 25% lower board power than the RX 7900 XTX. Unlike the "Navi 31" and "Navi 32" powering the RX 7900 series and RX 7800 XT, respectively, the "Navi 48" driving the RX 8800 XT is expected to be a monolithic chip built entirely on a new process node. If we were to guess, this could very well be TSMC N4P, a node AMD is using for everything from its "Zen 5" chiplets to its "Strix Point" mobile processors.
Sources: ChipHell, Wccftech, VideoCardz

182 Comments on AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

#51
Craptacular
john_They probably had many bugs with RDNA3, and not just with the high-end models. The high power consumption in video playback is one example.
I am expecting them to remain out of the high-end market as long as they see that consumers are unwilling to pay for their cards.
Seeing as the lower-end models were monolithic dies, I'm willing to bet those bugs, such as high power consumption in video playback, are more software than hardware, and seeing as higher power consumption in video playback isn't much of a concern for a desktop GPU, it was probably a lower priority to fix. The high-end parts were chiplet-based and underperformed expectations even at AMD. Chiplets in a GPU are relatively new, so there is a larger chance of a severe hardware design flaw being uncovered at release.
_roman_The 7800XT was not an upgrade. I have a 7800XT and I check the performance charts quite often for my card. The 6800XT was very often better than the 7800XT. Personally I see the 7800XT 5-10% behind the 6800XT. That's why some called the card a renamed 7700XT.

Some other aspects were more important to me than just the performance difference between the 7800XT and the 6800XT / 6950XT.
If you look at the MSRP and adjust it for inflation, the 7800 XT was the upgrade to the 6700 XT, over which it was around 45% faster (rough numbers sketched after the list below).

7900 xtx = 6900 xt
7900 xt = 6800 xt
7900 gre = 6800
7800 xt = 6700 xt
7700 xt = 6700
7600 xt = 6650 xt
7600 = 6600 xt
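A quick sanity check on the inflation point (a rough sketch: the launch MSRPs are as I recall them, and the ~15% cumulative US inflation between the two launches is an assumption, not an official figure):

```python
# Rough inflation-adjusted MSRP comparison (illustrative figures, not official data).
msrp_6700xt = 479            # RX 6700 XT launch MSRP, March 2021 (as recalled)
msrp_7800xt = 499            # RX 7800 XT launch MSRP, September 2023 (as recalled)
cumulative_inflation = 0.15  # assumed ~15% US CPI change between those launches

adjusted_6700xt = msrp_6700xt * (1 + cumulative_inflation)
print(f"6700 XT MSRP in 2023 dollars: ~${adjusted_6700xt:.0f} vs 7800 XT at ${msrp_7800xt}")
# ~$551 vs $499: same nominal tier, cheaper in real terms, and ~45% faster per the post above.
```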
#52
LabRat 891
I will be pleasantly surprised if AMD manages to even match the 7900XTX in raster.
Hoping for another 2900 -> 3870 deal, but I'm doubtful.
#53
kanecvr
john_You don't need to convince me on this. The RTX 3050 sells 5-10 times better than the RX 6600.
But at least AMD's cards will look more competitive, which will force the tech press to be less of a promoter of Nvidia hardware, which could be the first step toward a mentality change in the market.
I noticed this and it baffles me. The 3050 is so much slower than the 6600, but it still outsells it. I believe OEMs are mostly to blame for this, since as a consumer, it's not hard to google 6600 vs 3050.
#54
john_
kanecvrI noticed this and it baffles me. The 3050 is so much slower than the 6600, but it still outsells it. I believe OEMs are mostly to blame for this, since as a consumer, it's not hard to google 6600 vs 3050.
The consumer will google that and end up with many posts saying "AMD drivers are trash, FSR is trash, only buy Nvidia cards" and will go and buy the Nvidia card.
#55
3valatzy
CraptacularIf you look at the MSRP and adjust it for inflation, the 7800 XT was the upgrade to the 6700 XT, over which it was around 45% faster.

7900 xtx = 6900 xt
7900 xt = 6800 xt
7900 gre = 6800
7800 xt = 6700 xt
7700 xt = 6700
7600 xt = 6650 xt
7600 = 6600 xt
Performance upgrade:

7900 xtx = 6950 xt 36%
7900 xtx = 6900 xt 47%
7900 xt = 6800 xt 36%
7900 gre = 6800 31%
7800 xt = 6750 xt 40%
7800 xt = 6700 xt 48%
7700 xt = 6700 42%

These four are rebrands of one and the same thing:
7600 xt = 6650 xt 4%
7600 = 6600 xt 10%
#56
lilhasselhoffer
So... lots of Nvidia and AMD hate in this thread. I support neither company; I buy a GPU that I can support.

That said, if the 8800 is equivalent to the 7900, it seems like the normal incremental improvement: +1 generation ~= +1 performance segment. That's usually coupled with more power draw... so putting out a lower-power-draw (and presumably cooler) version of the card would be great. The RT performance doesn't exactly concern me. As far as I'm concerned it's another TressFX. You remember that, don't you? The newest thing that was going to make hair rendering super realistic. The thing that it seems like nobody actually remembers about the Tomb Raider game...

Seriously though, it's stupid to support a brand. I bought a 5070, a 3080, and haven't been able to justify any new GPU purchase since the prices went from high to utterly silly. 3060s selling in 2024 for $300 is utterly silly, and as long as the market tolerates that, Nvidia will continue to sell goods at eye-watering prices. I don't support AMD with bad products... and I didn't support Intel with their current crop of driver-crippled GPUs... but telling Nvidia that they can spoon-feed you slop and charge for filet mignon is frustrating. I hope the 8000 series helps to rectify that... but I see too many people who sprang for the 4060 to believe that we'll see the blatant price gouging stop any time soon.
#58
_roman_
3valatzy7800 xt = 6750 xt 40%
May I ask for the line for 7800XT = 6800XT ?
#60
Random_User
TomorrowLow compared to what? 3090 Ti RT perf was not good enough improvement from AMD?
I'll also remind you that Nvidia themselves did not massively increase RT performance from 30 series.


Why would a 256-bit, 16GB GDDR6 card with 7900XT raster and 4080 RT perf be priced above 750?
It would make zero sense. I can already buy a 4070 Ti Super for less, which equals the 7900XT in raster and beats it in RT.

For the 8800XT to have a chance it must not be more expensive than the 7800XT was when new. Meaning around 500. 550 at most.
The less it costs, the better deal it will be. 450 would be good. 400 would be amazing. I doubt it will be less than 400.
I know that I'll get beaten up for what I'm about to write here. But still...

It's logically correct. But the truth is, AMD is not interested in such generosity of "giving away" VGAs for any less than nVidia. No matter how much "slower" RDNA4 is going to be compared to nVidia GPUs, AMD for the last five years (or even more) have shown by all their actions that they will price their (top) card similarly to nVidia's (top) "counterparts", even if there's an entire performance gulf between them. This is sad, but it feels like AMD is going to extract every last penny, much like nVidia.

This happened with the 5700XT, when AMD tried to mark up their raw and unfinished architecture and was simply forced to bring down the prices when the public outrage exploded. They did that with X570/B550 motherboards. They did that with Zen3, and with RDNA3 as well. Like pricing the RX7700XT (which is an RX7600XT in reality) at a whopping $449. A 192-bit card for almost half a grand. The hubris and arrogance have no limits.
Vayra863090ti RT was not enough, no.

4090 RT isn't clearly enough either.

This whole thing is a farce.
Exactly! This was an nVidia game, only to inflate the price of graphics cards.

I still think that unless GPU vendors start making the RTRT hardware as separate AICs that scale akin to GPUs, there's no way GPUs will be able to push RT to any reasonable levels. GPUs simply have no room for RT to scale. It's a jack of all trades, master of none, since both the RT and raster parts share the silicon space and thermal envelope, and neither can "breathe".
If GPUs were raster-only, they would have a much smaller footprint, both in size and in power. And everyone who wants to tinker with, or enjoy the masochistic pleasure of, limited RT capabilities would be able to add the RTRT cards.
#61
mrnagant
25% lower board power than the 7900XTX works out to about 266W. That puts it in line with the 7800XT/7900GRE.
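For reference, a quick sketch of that arithmetic (the board-power figures are the typical TBP numbers as I recall them, so treat them as approximate):

```python
# Implied board power from the "25% lower than 7900 XTX" rumor (figures from memory).
tbp_7900xtx = 355                     # W, RX 7900 XTX total board power
implied_8800xt = tbp_7900xtx * 0.75   # 25% lower board power per the article
print(f"Implied RX 8800 XT board power: ~{implied_8800xt:.0f} W")  # ~266 W
print("Compare: RX 7800 XT ~263 W, RX 7900 GRE ~260 W")            # same ballpark
```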
#62
Tomorrow
Macro DeviceThe whole RDNA2 line-up failed to outperform their MSRP-sakes from NVIDIA. By a significant (15+ %) margin at least. Also no DLSS, CUDA, non-existent RT performance on top of that + insane power spikes.
Also, 6500 XT. Not as bad as GT 1630 but still ridiculous.
"At least"?. TPU shows 6900XT losing only 10% to 3090 while costing 500 less. That's an even better deal than 7900XTX is today compared to 4090.
6950XT was arguably an even better deal by increasing performance over 6900XT by further 7% while costing 100 more.
3090 Ti increased by 8% over 3090 but added another 500 and thus extending the price cap to 900.

We're also talking about 2020/2021 here. DLSS had just gotten to the point it was actually worth using but availability was still very limited and thus the fact that AMD did not have an answer at the time did not matter much. As for CUDA - well if Nvidia made it open source then AMD cards could run it no problem. You're also calling out non-existent RT perf. The same non-existent perf that applied to 3090 Ti...
Macro DeviceRDNA3 is trickier:
7900 XTX looked somewhat attractive in comparison with 4080, however at this price point, a gamer expects more than just raw raster performance. They want to enable everything. You can't do that on 7900 XTX. Thus, it had to be launched more significantly below 1200. $850 tops.
7900 GRE is just an abomination and a half. At 600 dollars, it's just a meager 10 to 20 % boost over 4070 at the cost of being worse in power draw and scenarios that aren't gaming pure raster titles.
7800 XT is the same story as 7900 XTX. NOT CHEAP ENOUGH TO CONVINCE. 4070 is more feature rich and performance difference is only visible with FPS graphs enabled. $100 premium is low cost enough.
7700 XT is also an abomination.
7600... Don't even get me started, it's awful.
And can 4080 users enable everything and still enjoy high-refresh-rate gaming at 1200? Or would a sane person look at the 4080's price and conclude that if they're already ready to spend 1200, why not jump to the 4090?
AMD, unlike Nvidia, did not increase their top card's price. The 7900XTX launched at the same MSRP as the 6900XT.
The 7900 GRE was and is an odd release. Probably for dumping defective N31 dies.
The 7800 XT and 7700 XT were the most popular RDNA3 cards, I believe.
The 7600 may have been an awful 8GB card, but at least, unlike Nvidia's, it was not priced at 400 with another 100 charged for the clamshell 16GB version.
Not to mention Nvidia not even releasing a successor to the 3050 (a truly awful card that does not even have 8GB).
Macro DeviceThat's why I'm not delusional. I'm just strongly pessimistic because AMD seem to live in the fairy tale where nothing NVIDIA better than 2080 Ti exists.
A strongly pessimistic person expects 550, or 600 at most. Not over 750. You realize that if the 8800XT really ended up costing 750+, AMD would not be able to sell any, because the 7900XT and 7900XTX would be so much better deals?
DavenThe Radeon 5000 and Geforce GTX 1000 series were priced just fine. The Geforce RTX 2000 series introduced us to ray tracing where pricing started to get out of hand. Pricing went insane with the Geforce RTX3000, Geforce RTX4000, Radeon 6000 and Radeon 7000 series.
You're confusing something here. Yes, the 20 series was a massive price hike for very little substance, but the 30 series was very well priced thanks to a cheaper node. The 6000 and 7000 series had roughly the same prices with some outliers. The 40 series was again a price hike.
3valatzyGTX 480 was 250W, and it was not called "efficient". It was a disaster.
You did not find anything older than a 15-year-old card?
Nvidia also had the 250W 780, 780Ti, 980Ti and 1080Ti. The 980Ti was praised for its power efficiency and the 1080Ti is legendary.

Also, you do not account for the fact that the 480 was a single-fan blower card and its performance was underwhelming.
Cooling 270W today is a far cry from cooling 250W fifteen years ago. The coolers are much bigger and can easily handle it.
Not to mention tolerable noise levels now vs then.
john_Forget what is playable. This is marketing. Someone pays $2000 for an RTX 4090, someone pays $1000 for the RX 7900XTX, and one gets 60fps and the other one 15fps (I don't exactly remember the framerates, but I think path tracing on those cards is like that). You know what you have? Not a playable game, but the first "proof" for the buyer of the RTX 4090 that their money was spent well. It's marketing, and Nvidia is selling cards because of RT and DLSS.
A playable framerate is not marketing. It is essential. A person buying a 7900XTX is not buying it for a 60fps tech demo.
Playing one tech demo at a barely playable framerate (these days I expect a high-refresh-rate experience at 90+) is not what I call "money spent well".
john_They might, then what is AMD going to do? Lower the price to $500? Then to $450 and then to $400? Then in their financial results the gaming department will be more in the red than Intel's. From a gamer/consumer perspective we all love low prices. But with Nvidia having all the support of the world, with countless people out there educated to love Nvidia products and hate AMD products, with countless people out there willing to spend more money to get a worse Nvidia product than a better AMD product, aggressive pricing could end up a financial disaster for AMD. So they need to be careful. Now, if RDNA4 is a marvel of an architecture that they know Nvidia can't counter, and if we assume that they have secured enough wafers to cover the high demand that we could expect from a positive reaction from consumers, then and only then will AMD price their products aggressively. Putting an MSRP of $400 and failing to cover demand, or scalpers driving the price to $600, will do no good to AMD, only bad.
Nvidia lowering prices while manufacturing costs go up and the new GDDR7 is also more expensive? Never gonna happen. The best we can expect is the same price, and that's assuming they're feeling generous and cut into their margins.
AMD won't start a price war with Nvidia because they don't have the money coffers and capacity.
Nvidia won't start a price war with AMD because they want to increase their money coffers.
OnasiTime — yeah, potentially RT can be faster since you don’t have to manually set up lighting.
That's patently false. It's actually double the work for devs now, since they still have to do manual lighting and RT on top of that.
Only games that fully rely on RT, where it can't be disabled, can claim a workload reduction.
OnasiAbsolutely. I find it amusing how people now look at cards that are nearly double the TDP and it’s apparently fine, no problem there.
But it *IS* fine, because we have much better coolers that don't sound like fighter jets on afterburner.
Vya DomusEven if this thing will indeed have 45% better RT performance or whatever, it won't make a difference to the market share situation.
And people like john_ will still complain that AMD "only" manages 4080S RT performance. Nothing new here.
Conveniently ignoring the fact that Nvidia themselves do not give you 4090 RT performance for 1/4th the price.

Nvidia does not really care about RT availability or market penetration. They only care how much more they can charge for it on their top cards.
If they truly cared (like they claim), they would do everything in their power to produce cheap mainstream cards with good RT perf.

Ironically, it's AMD who has managed to bring RT to the masses, even on consoles. TBH I did not think consoles would get RT so soon and at this level of performance.
TumbleGeorge8800 XT, 220 watt card?
Nasty! /s.
Random_UserThis happened with the 5700XT, when AMD tried to mark up their raw and unfinished architecture and was simply forced to bring down the prices when the public outrage exploded. They did that with X570/B550 motherboards. They did that with Zen3, and with RDNA3 as well. Like pricing the RX7700XT (which is an RX7600XT in reality) at a whopping $449. A 192-bit card for almost half a grand. The hubris and arrogance have no limits.
X570 was justified because it had a Gen4-capable chipset in 2019. Something Intel introduced a whole two years later (with fewer lanes).
Today's AM5 prices, regardless of the chipset, are way more arrogant.

Zen 3 also had a massive performance increase. RDNA3 had some bad examples, but the top card did not increase in price.
The 7700XT may have been that, but at least it was 12GB. Meanwhile Nvidia asked 400 for an 8GB card and a whopping 500 for 16GB, despite AMD proving with the 7600XT that going from 8GB to 16GB does not add 100 to the price. Not to mention that I remember the 7700XT being out of stock because people bought it up compared to the 7800XT.
Random_UserI still think that unless GPU vendors start making the RTRT hardware as separate AICs that scale akin to GPUs, there's no way GPUs will be able to push RT to any reasonable levels.
I agree, but practically I don't see this happening. The overhead of moving data over PCIe is so large that for real-time rendering this would introduce a whole host of problems that were prevalent in the SLI/CF days, including the dreaded micro-stutter. Maybe future Gen6 or similar speeds can mitigate this issue somewhat, but that still leaves the extra-slot problem: most motherboards do not have an extra x16 electrical (not just physical) slot to plug that RT card into.
#63
Vya Domus
Another thing to consider is that UE5 seems to be becoming an industry standard, for better or worse (mostly worse), and most games are going to feature software RT in the form of Lumen, with hardware RT being either optional or not really bringing much to the table. This will further muddy the waters.

There is a serious, non-zero chance that dedicated RT hardware will fade away in favor of more general-purpose performance; historically, this happens very often in computing.
#64
kapone32
DavenThis post eerily describes me as well right down to the 7900 XT, my current GPU.
I also have a 7900 XT and feel the exact same way.
Vya DomusAnother thing to consider is that UE5 seems to be becoming an industry standard, for better or worse (mostly worse), and most games are going to feature software RT in the form of Lumen, with hardware RT being either optional or not really bringing much to the table. This will further muddy the waters.

There is a serious, non-zero chance that dedicated RT hardware will fade away in favor of more general-purpose performance; historically, this happens very often in computing.
Check FreeSync vs G-Sync.
#65
Onasi
TomorrowThat's patently false. It's actually double the work for devs now, since they still have to do manual lighting and RT on top of that.
Only games that fully rely on RT, where it can't be disabled, can claim a workload reduction.
…do you struggle with the word "potentially"? Nothing I said is false, "patently" or not.
And no, even with the current limited RT usage it is not even close to being double the work, especially with the industry converging on UE5, where the RT implementation is baked into the pipeline.
TomorrowBut it *IS* fine, because we have much better coolers that don't sound like fighter jets on afterburner.
A better cooler doesn't change the fact that more power equals more heat dumped into your case, out of it, and into one's room. I already said that I don't care one way or the other; my personal preferences are just that. If people are willing to accept 500-watt GPUs - more power to them.

Oh hey, that's actually a good joke.
#66
Vya Domus
Epic made a really bizarre choice: instead of hardware Lumen simply being a faster version of software Lumen, it also features more complex ray tracing and runs worse than the software version. This will lead to the hilarious situation where consumers have to choose between ray tracing and slower ray tracing; at a glance the two will look similar, but people will be bewildered by the fact that one runs worse.
#67
Tomorrow
Onasi…do you struggle with the word "potentially"? Nothing I said is false, "patently" or not.
And no, even with the current limited RT usage it is not even close to being double the work, especially with the industry converging on UE5, where the RT implementation is baked into the pipeline.
So 1.5 times the work and it's somehow better?
OnasiA better cooler doesn't change the fact that more power equals more heat dumped into your case, out of it, and into one's room. I already said that I don't care one way or the other; my personal preferences are just that. If people are willing to accept 500-watt GPUs - more power to them.
270W of heat is not uncomfortable or unmanageable, even during summer heatwaves. I have 375W, and that is approaching uncomfortable levels in summer.
#68
3valatzy
TumbleGeorge8800 XT, 220 watt card?
With those specs, maybe +5-10% over the aging RX 6800 XT (year 2020). The only thing that could save AMD's face is the price - if it is $399, it will sell, if it is $499-599, it won't.
#69
EatingDirt
TomorrowLow compared to what? 3090 Ti RT perf was not good enough improvement from AMD?
I'll also remind you that Nvidia themselves did not massively increase RT performance from 30 series.
The problem with AMD's 7000 series RT capability is that its RT-versus-pure-raster performance, compared to the 6000 series, was only around 2-4% better in games with more than superficial RT (see Control). They said that RDNA3 was better at ray tracing, but that 'better' was basically a meaningless improvement. The only real difference was the pure rasterization capability, which allowed the cards better performance overall.

Would be nice to see the 8000 series get a more significant boost to ray tracing than the 7000 series did. RDNA4 really needs to see no more than around a -48% regression (which would still put it behind Nvidia) in ray tracing vs pure raster to make it worth considering playing even older titles with ray tracing at anything more than 1080p.
#70
Vya Domus
EatingDirtThe problem with AMD's 7000 series RT capability is that its RT-versus-pure-raster performance, compared to the 6000 series, was only around 2-4% better in games with more than superficial RT (see Control). They said that RDNA3 was better at ray tracing, but that 'better' was basically a meaningless improvement.
You need to brush up on your math; that's not how those percentages work. Those are performance regressions for each card with RT on, so you can't just subtract the percentages like that. If you do the math correctly, RDNA3 is about 8% better, and that's about the same as Nvidia achieved this generation as well.

Suppose 100 is the baseline: -61% leaves 39, -58% leaves 42, and 42/39 ~= 1.08, i.e. roughly 8%.
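A minimal sketch of that arithmetic, using the illustrative -61% / -58% regressions quoted above against a common raster baseline of 100:

```python
# Relative RT capability from per-card RT-on regressions (illustrative numbers only).
baseline = 100.0                       # raster performance, assumed equal for both cards

rdna2_rt = baseline * (1 - 0.61)       # -61% with RT on -> 39
rdna3_rt = baseline * (1 - 0.58)       # -58% with RT on -> 42

improvement = rdna3_rt / rdna2_rt - 1  # compare what remains, don't subtract percentages
print(f"RDNA3 retains {rdna3_rt:.0f} vs RDNA2's {rdna2_rt:.0f}: ~{improvement:.1%} better")
# ~7.7%, i.e. roughly the 8% quoted above -- not 61 - 58 = 3%.
```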
#71
Tomorrow
3valatzyWith those specs, maybe +5-10% over the aging RX 6800 XT (year 2020). The only thing that could save AMD's face is the price - if it is $399, it will sell, if it is $499-599, it won't.
Faster than that. It's reported to have 7900 XT (raster) performance, which is 36% faster than the 6800 XT, and way faster than the 6800 XT in RT (4080S-level perf).
It will sell fine even at 499, unless the 7900 XT drops to 499 in clearance sales (which it won't).
599 is a much tougher sell. Personally I don't think it will be either 399 or 599. Both are unrealistic. I'm betting on 499.
#72
EatingDirt
Vya DomusYou need to brush up on your math; that's not how those percentages work. Those are performance regressions for each card with RT on, so you can't just subtract the percentages like that. If you do the math correctly, RDNA3 is about 8% better, and that's about the same as Nvidia achieved this generation as well.

Suppose 100 is the baseline: -61% leaves 39, -58% leaves 42, and 42/39 ~= 1.08, i.e. roughly 8%.
Math wrong, noted.

That being said, an 8% improvement is not enough when you're already ~33% behind.
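To put a rough number on "not enough" (a sketch using the figures from this exchange, assuming the gap and the per-generation gain both compound and the competition stands still):

```python
import math

gap_behind = 0.33      # ~33% behind means the other side delivers ~1/(1-0.33) ≈ 1.49x
per_gen_gain = 0.08    # ~8% relative RT improvement per generation

target_ratio = 1 / (1 - gap_behind)
generations = math.log(target_ratio) / math.log(1 + per_gen_gain)
print(f"~{generations:.1f} generations of +8% just to pull level")  # ~5.2
```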
#73
Onasi
TomorrowSo 1,5 times the work and it's somehow better?
Are you arguing with ghosts now? You do realize that I said in this very thread this:
OnasiReal-time RT as it is in GAMES today is little more than a gimmick.
I am not saying that RT based engines aren’t the way forward - they inevitably are as essentially THE holy grail of real-time rendering. But the push started way, waaaaay too early.
Do I need to further elaborate on my position or nah?
Tomorrow270W of heat is not uncomfortable or unmanageable, even during summer heatwaves. I have 375W, and that is approaching uncomfortable levels in summer.
Okay? Your point being? Are you trying to explain why me preferring lower-TDP cards as a personal choice is wrong somehow, or…?
#74
Dr. Dro
john_When RX 7000 came out I was screaming about the low RT performance. I was called an Nvidia fanboy back then.
A few years later, and probably with SONY pushing AMD in that direction, the rumors talk about a new RX 8000 series that mostly increases performance in ray tracing.
Better late than never....
Don't worry, it's never been about RT itself but more about "Radeons are bad at X and Y, so X and Y do not matter, because it exposes a deficiency in my favorite brand".

Better late than never; sadly, the RTX 50 series will walk all over it.
kanecvrI noticed this and it baffles me. The 3050 is so much slower than the 6600, but it still outsells it. I believe OEMs are mostly to blame for this, since as a consumer, it's not hard to google 6600 vs 3050.
Raw performance is half the story. By giving even the lowly RTX 3050 full access to the entire RTX Studio suite, heavily investing in day-one Game Ready drivers, providing years of updates, etc., NV captures this value-sensitive market. Besides, a 3050 will run eSports and most F2P phenomenon games just fine at very high settings and good frame rates.
#75
Vya Domus
Better late than never; sadly, the RTX 50 series will walk all over it.
@john_

See, this is what I meant when I said it doesn't matter. That's how a consumer looks at it: it wouldn't matter if AMD offered 4080 RT performance for, say, $500; the consumer only knows that "Nvidia walks all over it", and that's the end of the story.