Monday, December 2nd 2024
AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor
Apparently, AMD's next-generation gaming graphics card is closer to launch than anyone in the media expected, with mass-production of the so-called Radeon RX 8800 XT poised to begin later this month, if sources on ChipHell are to be believed. The RX 8800 XT will be the fastest product from AMD's next generation and will sit in the performance segment, succeeding the current RX 7800 XT. There will not be an enthusiast-segment product in this generation, as AMD looks to consolidate in the key market segments with the most sales. The RX 8800 XT will be powered by AMD's next-generation RDNA 4 graphics architecture.
There are some spicy claims being made about the RX 8800 XT. Apparently, the card will rival the current GeForce RTX 4080 or RTX 4080 SUPER in ray tracing performance, which would mean a massive 45% increase in RT performance over even the current flagship RX 7900 XTX. Meanwhile, the power and thermal footprint of the GPU is expected to shrink with the switch to a newer foundry process, with the RX 8800 XT expected to have 25% lower board power than the RX 7900 XTX. Unlike the "Navi 31" and "Navi 32" powering the RX 7900 series and RX 7800 XT, respectively, the "Navi 48" driving the RX 8800 XT is expected to be a monolithic chip built entirely on a new process node. If we were to guess, this could very well be TSMC N4P, a node AMD is using for everything from its "Zen 5" chiplets to its "Strix Point" mobile processors.
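To put the rumored percentages in rough absolute terms, here is a back-of-envelope sketch; it assumes the RX 7900 XTX's 355 W reference total board power and simply applies the rumored deltas, so none of the outputs are measured figures.

```python
# What the rumored figures would imply, taking the RX 7900 XTX as the reference point.
# 355 W is the 7900 XTX's reference total board power; the 45% and 25% figures are
# the rumored deltas from the report, not measured data.
xtx_board_power_w = 355
xtx_rt_index = 100                           # normalize 7900 XTX ray tracing perf to 100

rx8800xt_rt_index = xtx_rt_index * 1.45      # ~145, i.e. roughly RTX 4080-class RT
rx8800xt_power_w = xtx_board_power_w * 0.75  # ~266 W board power

print(f"Implied RT performance index: {rx8800xt_rt_index:.0f} (7900 XTX = 100)")
print(f"Implied board power: {rx8800xt_power_w:.0f} W")
```

In other words, if the rumor holds, the card would deliver roughly RTX 4080-class ray tracing at around 266 W of board power.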
Sources:
ChipHell, Wccftech, VideoCardz
182 Comments on AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor
7900 xtx = 6900 xt
7900 xt = 6800 xt
7900 gre = 6800
7800 xt = 6700 xt
7700 xt = 6700
7600 xt = 6650 xt
7600 = 6600 xt
Hoping for another 2900->3870 deal, but I'm doubtful
7900 xtx = 6950 xt 36%
7900 xtx = 6900 xt 47%
7900 xt = 6800 xt 36%
7900 gre = 6800 31%
7800 xt = 6750 xt 40%
7800 xt = 6700 xt 48%
7700 xt = 6700 42%
These four are rebrands of one and the same thing:
7600 xt = 6650 xt 4%
7600 = 6600 xt 10%
That said, if the 8800 is equivalent to the 7900, it seems like the normal incremental improvement: +1 generation ~= +1 performance segment. That's usually coupled with more power draw... so putting out a lower-power (and presumably cooler) version of the card would be great. The RT performance doesn't exactly concern me. As far as I'm concerned it's another TressFX. You remember that, don't you? The newest thing that was going to make hair rendering super realistic. The thing that it seems nobody actually remembers about the Tomb Raider game...
Seriously though, it's stupid to support a brand. I bought a 5070, a 3080, and haven't been able to justify any new GPU purchase since prices went from high to utterly silly. 3060s selling in 2024 for $300 is absurd, and as long as the market tolerates that, Nvidia will continue to sell goods at eye-watering prices. I don't support AMD with bad products... and I didn't support Intel with their current crop of driver-crippled GPUs... but telling Nvidia that they can spoon-feed you slop and charge for filet mignon is frustrating. I hope the 8000 series helps to rectify that... but I see too many people who sprung for the 4060 to believe that the blatant price gouging will stop any time soon.
www.guru3d.com/review/radeon-rx-7800-xt-reference-review/page-29/#performance
It's logically correct. But the truth is, AMD is not interested in such generosity, or in "giving away" VGAs for any less than nVidia. No matter how much "slower" RDNA4 is going to be compared to nVidia GPUs, AMD has shown by its actions over the last five years (or even more) that it will price its (top) card similarly to nVidia's (top) "counterparts", even if there's an entire performance gulf between them. It's sad, but it feels like AMD is going to extract every last penny, much like nVidia.
This happened with the 5700XT, when AMD tried to mark up a raw and unfinished architecture and was simply forced to bring prices down once the public outrage exploded. They did the same with X570/B550 motherboards, with Zen3, and with RDNA3 as well. For instance, they priced the RX7700XT (which is an RX7600XT in reality) at a whopping $449. A 192-bit card for almost half a grand. The hubris and arrogance have no limits. Exactly! This was an nVidia game, only to inflate the price of graphics cards.
I still think that unless GPU vendors start making RTRT hardware as separate AICs that can scale the way GPUs do, there's no way GPUs will be able to push RT to any reasonable level. GPUs simply have no room for RT to scale; they are jacks of all trades, masters of none, since the RT and raster parts share the same silicon area and thermal envelope, and neither can "breathe".
If GPUs were raster-only, they would have a much smaller footprint, both in size and in power. And everyone who wants to tinker with, or enjoy the masochistic pleasure of, limited RT capabilities would be able to add an RTRT card.
The 6950XT was arguably an even better deal, increasing performance over the 6900XT by a further 7% while costing $100 more.
The 3090 Ti increased performance by 8% over the 3090 but added another $500, thus extending the price gap to $900.
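For context, here is that arithmetic spelled out, using the launch MSRPs of those cards; the 7% and 8% uplift figures are the ones quoted above, not independently measured.

```python
# Halo-card refreshes: small performance bumps for large price bumps.
# Price deltas come from launch MSRPs; uplift percentages are from the comment above.
cards = {
    # pair: (price delta in USD, % faster than the card it refreshes)
    "RX 6950 XT vs RX 6900 XT": (1099 - 999, 7),
    "RTX 3090 Ti vs RTX 3090":  (1999 - 1499, 8),
}

for pair, (price_delta, uplift_pct) in cards.items():
    print(f"{pair}: +${price_delta} for +{uplift_pct}% "
          f"(~${price_delta / uplift_pct:.0f} per extra percent)")

# Resulting gap between the two vendors' top cards: $1999 - $1099 = $900
print("Top-card price gap: $", 1999 - 1099)
```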
We're also talking about 2020/2021 here. DLSS had just gotten to the point where it was actually worth using, but availability was still very limited, so the fact that AMD did not have an answer at the time did not matter much. As for CUDA - well, if Nvidia made it open source, then AMD cards could run it no problem. You're also calling out non-existent RT perf. The same non-existent perf that applied to the 3090 Ti... And can 4080 users enable everything and still enjoy high-refresh-rate gaming at $1200, or would a sane person look at the 4080's price and conclude that if they're already prepared to spend $1200, why not jump to the 4090?
AMD, unlike Nvidia, did not increase their top card's price. The 7900XTX launched at the same MSRP as the 6900XT.
The 7900 GRE was and is an odd release, probably for dumping defective N31 dies.
The 7800 XT and 7700 XT were the most popular RDNA3 cards, I believe.
The 7600 may have been an awful 8GB card, but at least, unlike Nvidia, it was not priced at $400 with another $100 charged for the clamshell 16GB version.
Not to mention Nvidia not even releasing a successor to the 3050 (a truly awful card that does not even have 8GB). A strongly pessimistic person expects $550, or $600 at most. Not over $750. You realize that if the 8800XT really ended up costing $750+, AMD would not be able to sell any, because the 7900XT and 7900XTX would be much better deals? You're confusing something here. Yes, the 20 series was a massive price hike for very little substance, but the 30 series was very well priced thanks to a cheaper node. The 6000 and 7000 series had roughly the same prices, with some outliers. The 40 series was again a price hike. You could not find anything older than a 15-year-old card?
Nvidia also had the 250W 780, 780Ti, 980Ti, and 1080Ti. The 980Ti was praised for its power efficiency, and the 1080Ti is legendary.
Also, you do not account for the fact that the 480 was a single-fan blower card and its performance was underwhelming.
Cooling 270W today is a far cry from cooling 250W fifteen years ago. The coolers are much bigger and can easily handle it.
Not to mention tolerable noise levels now vs. then. A playable framerate is not marketing; it is essential. A person buying a 7900XTX is not buying it for a 60fps tech demo.
Playing one tech demo at a barely playable framerate (these days I expect a high-refresh-rate experience at 90+) is not what I call money well spent. Nvidia lowering prices while manufacturing costs go up and the new GDDR7 is also more expensive? Never gonna happen. The best we can expect is the same price, and that's assuming they're feeling generous and cut into their margins.
AMD won't start a price war with Nvidia because they don't have the money coffers and capacity.
Nvidia won't start a price war with AMD because they want to increase their money coffers. That's patently false. It's actually double the work for devs now, since they still have to do manual lighting and RT on top of that.
Only games that fully rely on RT, where it can't be disabled, can claim a workload reduction. But it *IS* fine, because we have much better coolers that don't sound like fighter jets on afterburner. And people like john_ will still complain that AMD "only" manages 4080S RT performance. Nothing new here.
Conveniently ignoring the fact that Nvidia themselves do not give 4090 RT performance for 1/4th the price.
Nvidia does not really care about RT availability or market penetration. They only care how much more they can charge for this on their top cards.
If they truly cared (like they claim) they would do everything in their power to produce cheap mainstream cards with good RT perf.
Ironically, it's AMD who has managed to bring RT to the masses, even on consoles. TBH, I did not think consoles would get RT so soon and at this level of performance. Nasty! /s. X570 was justified because it had a Gen4-capable chipset in 2019, something Intel introduced a whole two years later (with fewer lanes).
Today's AM5 prices, regardless of the chipset, are way more arrogant.
Zen 3 also had a massive performance increase. RDNA3 had some bad examples, but the top card did not increase in price.
The 7700XT may have been that, but at least it was 12GB. Meanwhile, Nvidia asked $400 for an 8GB card and a whopping $500 for 16GB, despite AMD proving with the 7600XT that going from 8GB to 16GB does not add $100 to the price. Not to mention that I remember the 7700XT being out of stock because people bought it up, compared to the 7800XT. I agree, but practically I don't see this happening. The overhead of moving data over PCIe is so large that for real-time rendering this would introduce a whole host of problems that were prevalent in the SLI/CF days, including the dreaded micro-stutter. Maybe future Gen6 or similar speeds can mitigate this somewhat, but that still leaves the slot problem: most motherboards do not have an extra x16 electrical (not just physical) slot to plug that RT card into.
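A rough way to see why the PCIe link becomes the bottleneck for a hypothetical RT add-in card is to compare per-frame transfer budgets against link bandwidth. The sketch below is a back-of-envelope estimate only: the bandwidth figures are approximate theoretical x16 peaks, the 90 fps target echoes the refresh-rate expectation mentioned above, and the 1 GB per-frame payload is an assumed, illustrative number rather than a measurement.

```python
# Back-of-envelope: how much data could a hypothetical RT add-in card
# exchange with the GPU per frame over a PCIe x16 link?
# Approximate one-direction theoretical peaks, protocol overhead ignored:
PCIE_X16_GBPS = {"Gen4": 32, "Gen5": 64, "Gen6": 128}  # GB/s

TARGET_FPS = 90                    # high-refresh-rate target from the comment above
frame_time_s = 1 / TARGET_FPS      # ~11.1 ms per frame

# Assumed per-frame payload (G-buffer, BVH updates, ray results) - illustrative only
payload_gb = 1.0

for gen, gbps in PCIE_X16_GBPS.items():
    budget_gb = gbps * frame_time_s            # data movable within one frame time
    transfer_ms = payload_gb / gbps * 1000     # time to move the assumed payload
    print(f"{gen} x16: ~{budget_gb:.2f} GB/frame budget, "
          f"{transfer_ms:.1f} ms to move {payload_gb} GB "
          f"(of {frame_time_s * 1000:.1f} ms frame time)")
```

With the assumed 1 GB payload, only the Gen6-class link even fits the transfer inside a 90 fps frame time, which illustrates the latency and micro-stutter concern raised above.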
There is a serious, non-zero chance that dedicated RT hardware will fade away in favor of more general-purpose performance; historically, this has happened very often in computing.
And no, even with the current limited RT usage it is not even close to being double the work, especially with the industry converging on UE5, where the RT implementation is baked into the pipeline. A better cooler doesn't change the fact that more power equals more heat dumped into your case, out of it, and into one's room. I already said that I don't care one way or the other; my personal preferences are just that. If people are willing to accept 500-watt GPUs - more power to them.
oh hey that’s actually a good joke
Would be nice to see the 8000 series get a more significant boost to ray tracing than the 7000 series. RDNA4 really needs to see nothing more than around a -48% regression (which would still put it behind Nvidia) in ray tracing vs. pure raster to make it worth considering playing even older titles with ray tracing at anything more than 1080p.
Suppose 100 is the baseline: -61% is 39, -58% is 42, and 42/39 ~= 1.08, i.e. about an 8% improvement.
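As a minimal sketch of that arithmetic (the -61% and -58% RT penalties are the figures used in the comment above, not verified benchmark numbers):

```python
# Compare two cards when each is expressed as a percentage drop in RT
# from its own raster baseline of 100.
def rt_score(raster_baseline: float, rt_penalty_pct: float) -> float:
    """Remaining performance after applying an RT penalty (e.g. -61%)."""
    return raster_baseline * (1 + rt_penalty_pct / 100)

old = rt_score(100, -61)   # 39
new = rt_score(100, -58)   # 42
gain = new / old - 1       # ~0.077 -> roughly 8%
print(f"{old:.0f} -> {new:.0f}: about {gain:.0%} relative RT improvement")
```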
It will sell fine even at $499, unless the 7900 XT drops to $499 in clearance sales (which it won't).
$599 is a much tougher sell. Personally, I don't think it will be either $399 or $599. Both are unrealistic. I'm betting on $499.
That being said, an 8% improvement is not enough when you're already ~33% behind.
Better late than never; sadly, the RTX 50 series will walk all over it. Raw performance is half the story. By giving even the lowly RTX 3050 full access to the entire RTX Studio suite, heavily investing in day-one game-ready drivers, providing years of updates, etc., NV captures this value-sensitive market. Besides, a 3050 will run eSports and most F2P phenomenon games just fine at very high settings and good frame rates.
See, this is what I meant when I said it doesn't matter. That's how a consumer looks at it: it wouldn't matter if AMD offered 4080 RT performance for, say, $500; the consumer only knows that "Nvidia walks all over it", and that's the end of the story.