Monday, December 2nd 2024

AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

Apparently, AMD's next-generation gaming graphics card is closer to launch than anyone in the media expected, with mass-production of the so-called Radeon RX 8800 XT poised to begin later this month, if sources on ChipHell are to be believed. The RX 8800 XT will be the fastest product of AMD's next generation and will be part of the performance segment, succeeding the current RX 7800 XT. There will not be an enthusiast-segment product in this generation, as AMD looks to consolidate in the key market segments with the most sales. The RX 8800 XT will be powered by AMD's next-generation RDNA 4 graphics architecture.

Some spicy claims are being made about the RX 8800 XT. Apparently, the card will rival the current GeForce RTX 4080 or RTX 4080 SUPER in ray tracing performance, which would mean a massive 45% increase in RT performance over even the current flagship RX 7900 XTX. Meanwhile, the power and thermal footprint of the GPU is expected to shrink with the switch to a newer foundry process, with the RX 8800 XT expected to have 25% lower board power than the RX 7900 XTX. Unlike the "Navi 31" and "Navi 32" powering the RX 7900 series and RX 7800 XT, respectively, the "Navi 48" driving the RX 8800 XT is expected to be a monolithic chip built entirely on a new process node. If we were to guess, this could very well be TSMC N4P, a node AMD is using for everything from its "Zen 5" chiplets to its "Strix Point" mobile processors.
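The rumored percentages are easy to sanity-check against the RX 7900 XTX's published 355 W total board power: a 25% cut lands around 266 W, and matching the RTX 4080 in RT implies roughly 1.45x the 7900 XTX's RT performance. A quick back-of-the-envelope sketch (the 355 W TBP is the 7900 XTX's official spec; the percentages come straight from the rumor, so treat the outputs as nothing more than the rumor's own implications):

```python
# Sanity check of the rumored RX 8800 XT figures against the RX 7900 XTX.
XTX_TBP_W = 355.0        # RX 7900 XTX total board power (official spec)
POWER_REDUCTION = 0.25   # rumored 25% lower board power
RT_UPLIFT = 0.45         # rumored 45% RT gain over the 7900 XTX

implied_tbp = XTX_TBP_W * (1 - POWER_REDUCTION)
print(f"Implied RX 8800 XT board power: ~{implied_tbp:.0f} W")  # ~266 W

xtx_rt = 100.0  # normalize 7900 XTX RT performance to 100
print(f"Implied RT performance vs. 7900 XTX: {xtx_rt * (1 + RT_UPLIFT):.0f}")  # 145
```

This says nothing about real-world performance, of course; it only shows the rumor's numbers are internally consistent (the ~266 W result matches the ~270 W figure circulating alongside the leak).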
Sources: ChipHell, Wccftech, VideoCardz

182 Comments on AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

#26
ObscureAngelPT
@Onasi I have to agree with you 100%, but honestly, I think there is a bigger reason for this push.
This push exists this early because it saves developers money and time, which is very important given the current state of the gaming market.
Posted on Reply
#27
Daven
DristunThe 3080 launched for $699 and the 6800 XT was super-competitive at $649; then shit got out of hand thanks to miners. Then lockdown consumers spent their disposable incomes chasing insane prices, everyone out there saw how much people are willing to part with, and then we got another multiplier in inflation. These "new tech bad!" takes are getting out of hand.
Crypto, the pandemic and finally AI have really screwed up our beloved GPU market. Hopefully, Battlemage and RDNA4 will send the market down a better, more consumer-friendly path. While Nvidia has good products, I see them leaving the client market if the margins get too low.
#28
Neo_Morpheus
john_RT performance
Useless gimmick.
OnasiReal-time RT as it is in GAMES today is little more than a gimmick.
Bingo.
OnasiBut the push started way, waaaaay too early
I've said the same thing over and over: when entry- to mid-level GPUs can do RT at 4K, then RT can be considered more than a gimmick.
TomorrowHow is 270W still high and nasty?

Nvidia used to produce 250W flagships for generations and they were called efficient. Now that we have 450W flagships, 270W is suddenly high?
270W is a very respectable number, easily cooled by a two-slot, two-fan cooling solution with a low noise level.
It's the typical negative BS bias that AMD "enjoys".

When it's Intel or Ngreedia, power consumption doesn't matter, but when it's AMD... oh boy.
TomorrowThe only example I can think of is the 7900 XT, which cost $900 when it launched. That was too high. $999 for the 7900 XTX (same price as the 6900 XT) was OK, considering the 4080 cost $200 more.
Agreed.

And I will add, since the bribed influencers (formerly known as tech reviewers) love to compare the 7900 XTX to the 4090: in many instances, the 7900 XTX can be anywhere from 15% slower to (on some rare occasions) faster than the 4090, while priced at around 50% less.

But hey, all AMD GPUs are garbage, hence why, no matter what, everyone simply buys Ngreedia.
#29
3valatzy
TomorrowHow is 270W still high and nasty?

Nvidia used to produce 250W flagships for generations and they were called efficient. Now that we have 450W flagships, 270W is suddenly high?
270W is a very respectable number, easily cooled by a two-slot, two-fan cooling solution with a low noise level.
I think you have a mistaken understanding of the reality around you.

The GTX 480 was 250W, and it was not called "efficient". It was a disaster.

#30
Daven
Neo_MorpheusUseless gimmick.

Bingo.

I've said the same thing over and over: when entry- to mid-level GPUs can do RT at 4K, then RT can be considered more than a gimmick.

It's the typical negative BS bias that AMD "enjoys".

When it's Intel or Ngreedia, power consumption doesn't matter, but when it's AMD... oh boy.

Agreed.

And I will add, since the bribed influencers (formerly known as tech reviewers) love to compare the 7900 XTX to the 4090: in many instances, the 7900 XTX can be anywhere from 15% slower to (on some rare occasions) faster than the 4090, while priced at around 50% less.

But hey, all AMD GPUs are garbage, hence why, no matter what, everyone simply buys Ngreedia.
Just wait for the Nvidia brand loyalists to start writing posts about how they hope AMD and Intel force Nvidia to decrease prices so they can afford to buy Nvidia. Of course, for this to work, AMD and Intel would have to be the better buys and sell more cards than Nvidia. But in the end it won't matter; these Nvidia fans are the reason prices are high in the first place. Nvidia won't decrease pricing if no one will buy anything other than Nvidia, no matter what.
#31
john_
TomorrowIf you can call "achieving" barely playable 60 fps on a card that for most of its shelf life has cost nearly $2,000 as looking two generations ahead, then I don't know what to say. More like two generations behind. I remember the days when I bought a flagship card (that cost less than half as much), cranked every setting to maximum, and enjoyed a buttery smooth experience.
Forget what is playable. This is marketing. Someone pays $2,000 for an RTX 4090, someone pays $1,000 for an RX 7900 XTX, and one gets 60 fps while the other gets 15 fps (I don't remember the exact framerates, but I think path tracing on those cards is like that). You know what you have? Not a playable game, but the first "proof" for the buyer of the RTX 4090 that their money was well spent. It's marketing, and Nvidia is selling cards because of RT and DLSS.
The 4070 already came out at $600. You really think Nvidia would bother to lower the 5070 to $550 when AMD is not a threat to them?
Nvidia will outsell AMD regardless of whether AMD prices their card at $500 or whatever.
They might, and then what is AMD going to do? Lower the price to $500? Then to $450, and then to $400? Then, in their financial results, the gaming department will be deeper in the red than Intel's. From a gamer/consumer perspective, we all love low prices. But with Nvidia having all the support in the world, with countless people out there educated to love Nvidia products and hate AMD products, and with countless people willing to spend more money on a worse Nvidia product than on a better AMD product, aggressive pricing could end up a financial disaster for AMD. So they need to be careful. Now, if RDNA 4 is a marvel of an architecture that they know Nvidia can't counter, and if we assume they have secured enough wafers to cover the high demand we could expect from a positive consumer reaction, then and only then will AMD price their products aggressively. Putting an MSRP of $400 on it and failing to cover demand, or scalpers driving the price to $600, will do AMD no good, only harm.
Intel's problems were not due to price; it was because it was a 1st-gen (retail) product with major driver problems.
Battlemage is almost here. Let's see if it will be a success.
#32
ThomasK
The performance impact vs visual improvement of ray tracing is honestly TRASH.

Next argument, please.
#33
Onasi
ObscureAngelPTThis push exist this early because it saves money and time to developers which is something very important in the actual state that the gaming market is.
*makes a so-so gesture* Somewhat. Time — yeah, potentially RT can be faster since you don't have to manually set up lighting. Money — eh, hard to say; in terms of graphics, my understanding is that the main big expense is actually the assets themselves, which RT can't really help with. The real factor for the push is simply hitting diminishing returns on raster performance increases and trying to find new ways and features to sell cards on. It's cynical, but it works.
3valatzyGTX 480 was 250W, and it was not called "efficient". It was a disaster.
Absolutely. I find it amusing how people now look at cards that are nearly double the TDP and it’s apparently fine, no problem there. Not judging one way or the other, but funny how things change. I personally wouldn’t touch anything above 250W, but I suppose ultimately more performance requires more watts pumped into the chip. At least, say, the 4090 is EFFICIENT in the sense that its power envelope is actually justified by its performance.
#34
AnotherReader
DavenAs a next-gen replacement for the 7800 XT, we should expect around a 20-30% increase in gen-on-gen rasterization. That would place the chip around the 7900 XT performance level. If priced around $399, as past rumors suggested, then we might finally have a killer perf/$ and perf/W graphics product.
Given that the 7900 XT is selling above $500 even on Black Friday, a $399 MSRP for the 8800 XT is rather optimistic. I would expect $549, as they need to clear out 7900 XT stock.
#35
Vya Domus
john_When RX 7000 came out I was screaming about the low RT performance. I was called an Nvidia fanboy back then.
Even if this thing does indeed have 45% better RT performance or whatever, it won't make a difference to the market-share situation.

You'll see.
#36
Neo_Morpheus
DavenJust wait for the Nvidia brand loyalists to start writing posts about how they hope AMD and Intel force Nvidia to decrease prices so they can afford to buy Nvidia. Of course for this to work that would mean AMD and Intel are the better buys and selling more cards than Nvidia. But in the end it won't matter, these Nvidia fans are the reason the prices are high in the first place. Nvidia won't decrease pricing if no one will buy anything other than Nvidia no matter what.
I have also said the same thing many times and, of course, it's conveniently ignored.

But yes, that's what all of them want.
#37
john_
Neo_MorpheusUseless gimmick.
SONY asked for that gimmick, Nvidia used it to make the competition look awful, the tech press promoted it, and users rejected the idea of buying a high-end AMD card because of it. But yeah, it's a gimmick. Which phase is this in psychology? Are we still in the first one, resistance? Well, AMD has already moved to the third, acceptance. And it's good for them. If RDNA 4 is a success, they might even reach the phase of embracing it.
#38
Neo_Morpheus
john_SONY asked for that gimmick, Nvidia used it to make the competition look awful, the tech press promoted it, and users rejected the idea of buying a high-end AMD card because of it. But yeah, it's a gimmick. Which phase is this in psychology? Are we still in the first one, resistance? Well, AMD has already moved to the third, acceptance. And it's good for them. If RDNA 4 is a success, they might even reach the phase of embracing it.
And for us gamers, what does this thing add besides pretty reflections in puddles?

Because gameplay-wise, it adds nothing.

And about those companies adding it: they saw the sheep being misled by the influencers and willing to throw their money away, so they simply added that option for them.

Funny enough, nobody bothers repeating or posting comments from even Ngreedia owners who said they don't care about RT or can't justify the performance hit that comes with it.
#39
john_
Vya DomusEven if this thing does indeed have 45% better RT performance or whatever, it won't make a difference to the market-share situation.

You'll see.
You don't need to convince me on this. The RTX 3050 sells 5-10 times better than the RX 6600.
But at least AMD's cards will look more competitive, which will force the tech press to be less of a promoter of Nvidia hardware, and that could be the first step toward a mentality change in the market.
#40
HD64G
$550-600 should be its pricing range, with close to 7900 XTX raster and 4080 RT performance. My 5c. Not bad at all in value-for-money progression, I think.
#41
john_
Neo_MorpheusAnd for us players, what does this thing add besides pretty reflections in puddles?

Because gameplay-wise, it adds nothing.

And about those companies adding it: they saw the sheep being misled by the influencers and willing to throw their money away, so they simply added that option for them.

Funny enough, nobody bothers repeating or posting comments from even Ngreedia owners who said they don't care about RT or can't justify the performance hit that comes with it.
YOU :p players keep promoting Nvidia like there is no tomorrow, so removing some of YOUR arguments against choosing an AMD graphics card (and lower RT performance was one) would be a GOOD thing.
OK, that YOU isn't exactly you (you do have a Radeon logo), but you did post as if you were representing ALL gamers. And looking at Nvidia's market share, your point of view is in the minority.

Gameplay-wise, Pac-Man and Tetris are better than 50% of the "games" out there.

The sheep pays. If the sheep is misled into paying for Feature A, you come out with the best Feature A on the market or... you don't sell.
#42
Vya Domus
john_SONY asked for that gimmick
You've seen the hilarious PS5/PS5 Pro comparison screenshots; the PS5 Pro has better ray tracing only on paper. In reality, most games still won't have RT, and in the ones that do, the differences will come down to "which reflection looks slightly less noisy and blurry, let's zoom in".

And that's what you don't understand about this: "who has the better RT" is a race that cannot be properly won, which is why Nvidia campaigns so hard about it. The end-user experience is still the same: you enable RT, your FPS craters, and the game looks kind of the same, maybe? It doesn't matter if it's AMD or Nvidia; performance is shit regardless, and there is no sign this will ever change.

"With our product the performance loss is 40% instead of 50%" is a proposition you can never really use to sway consumers away from your competitor, because it's a shit proposition regardless. Nvidia wants AMD to embark on this idiotic race because they know it doesn't matter how good their RT performance gets; marketing it properly to consumers is nearly impossible. And once again, despite what people in our bubble claim, I have to point out that your average consumer still has no clue what any of this shit even does; another reason why focusing on RT is a total waste of time.
#43
Craptacular
john_Sales and the fact that AMD chose to retreat from the high end,
Considering that the high end for RDNA 3 was a chiplet design, and that for the first two to three months of its life AMD was only releasing driver updates for RDNA 3 cards and not for RDNA 2 and RDNA 1 cards, it's suggestive that something was wrong with the chiplet design and that they tried to fix its performance issues with drivers but couldn't. That, to me, suggests that fixing the performance issues of chiplet-based GPUs would require a hardware design change. New architectures take around 4-5 years to develop; RDNA 4 was most likely already two years into its development when RDNA 3 was released, meaning it was too far along. So they scrapped the high-end (chiplet) RDNA 4 and applied the hardware design change for a chiplet-based high-end GPU to RDNA 5, which is now UDNA.

If I were a betting man, AMD is going to come out with a high-end GPU that is chiplet-based for the UDNA architecture.
#44
john_
Vya DomusYou've seen the hilarious PS5/PS5 Pro comparison screenshots; the PS5 Pro has better ray tracing only on paper. In reality, most games still won't have RT, and in the ones that do, the differences will come down to "which reflection looks slightly less noisy and blurry, let's zoom in".

And that's what you don't understand about this: "who has the better RT" is a race that cannot be properly won, which is why Nvidia campaigns so hard about it. The end-user experience is still the same: you enable RT, your FPS craters, and it doesn't matter if it's AMD or Nvidia; performance is shit regardless, and there is no sign this will ever change.

"With our product the performance loss is 40% instead of 50%" is a proposition you can never really use to sway consumers away from your competitor, because it's a shit proposition regardless. Nvidia wants AMD to embark on this idiotic race because they know it doesn't matter how good their RT performance gets; marketing it properly to consumers is nearly impossible. And once again, despite what people in our bubble claim, I have to point out that your average consumer still has no clue what any of this shit even does; another reason why focusing on RT is a total waste of time.
M A R K E T I N G
It's not that I don't understand; it's you who doesn't understand. And you are wrong. Nvidia is winning that race easily, and the tech press, and even gamers who spent money on their cards and need approval for their choice to spend more to get RT, are there to remind everybody that RT is more important than anything in a game. "You can't immerse yourself in a game without RT, you can't enjoy a game without RT" and BS like this ALL OVER THE INTERNET.
It's not what the user gets with RT; it's what the user THINKS they will get with RT. They will pay. They might be impressed, they might not; if they are not, they will just pay for the faster card that will offer them better RT, so they can enjoy RT, only to have to pay again and again and again. The same goes for raster graphics. For the last 25 years I have kept paying for a faster card because the new AAA game with the better raster graphics will offer me a better gaming experience. The same goes for RT. People will keep paying for it, only to truly enjoy it 20 years later. But the question is: who will get all that money over that 20-year period?

You know, 3dfx's failure was also partly because their latest cards were losing more performance than ATi's and Nvidia's when going from 16-bit color to 32-bit color. And 32-bit color wasn't offering much with the graphics we had back then. But 32-bit performance was a reason to avoid a 3dfx card.
#45
Neo_Morpheus
For the ones claiming that AMD simply abandoned the top end:

#46
john_
CraptacularConsidering that the high end for RDNA 3 was a chiplet design, and that for the first two to three months of its life AMD was only releasing driver updates for RDNA 3 cards and not for RDNA 2 and RDNA 1 cards, it's suggestive that something was wrong with the chiplet design and that they tried to fix its performance issues with drivers but couldn't. That, to me, suggests that fixing the performance issues of chiplet-based GPUs would require a hardware design change. New architectures take around 4-5 years to develop; RDNA 4 was most likely already two years into its development when RDNA 3 was released, meaning it was too far along. So they scrapped the high-end (chiplet) RDNA 4 and applied the hardware design change for a chiplet-based high-end GPU to RDNA 5, which is now UDNA.

If I were a betting man, AMD is going to come out with a high-end GPU that is chiplet-based for the UDNA architecture.
They probably had many bugs with RDNA 3, and not just with the high-end models; the high power consumption during video playback is one example.
I expect them to stay out of the high-end market for as long as they see that consumers are unwilling to pay for their cards.
#47
Neo_Morpheus
john_M A R K E T I N G
In so many words, that's what we have been saying: it's marketing, FOMO, whatever, but the fact remains, RT is a gimmick driven by marketing.
CraptacularConsidering that the high end for RDNA 3 was a chiplet design, and that for the first two to three months of its life AMD was only releasing driver updates for RDNA 3 cards and not for RDNA 2 and RDNA 1 cards, it's suggestive that something was wrong with the chiplet design and that they tried to fix its performance issues with drivers but couldn't. That, to me, suggests that fixing the performance issues of chiplet-based GPUs would require a hardware design change. New architectures take around 4-5 years to develop; RDNA 4 was most likely already two years into its development when RDNA 3 was released, meaning it was too far along. So they scrapped the high-end (chiplet) RDNA 4 and applied the hardware design change for a chiplet-based high-end GPU to RDNA 5, which is now UDNA.

If I were a betting man, AMD is going to come out with a high-end GPU that is chiplet-based for the UDNA architecture.
I recall reading that RDNA 4 was really a cleanup of RDNA 3's bugs, because RDNA 3 was a new design.

Then RDNA 5 was going to be a properly new or better design.

Now we are talking about UDNA, and that brings a whole lot of new stuff that I think will end up helping both them and us gamers.

But they do need to do something about marketing, because being honest doesn't work.
#48
john_
Neo_MorpheusIn so many words, that's what we have been saying: it's marketing, FOMO, whatever, but the fact remains, RT is a gimmick driven by marketing.
That gimmick sells cards and makes even the RTX 3050 look like a better choice than the RX 6600. And the RTX 3050, I think, is worse than the RX 6600 even in RT.
Do I have to paint a picture? Text doesn't seem to work. (OK, I am being rude here. :D )
Let's agree to disagree. :)
#49
RandallFlagg
Hopefully there's more to this than what's been published. The 7800 XT was too small an incremental upgrade over the 6800 XT, literally +2-3%. And the spec leaks so far for the 8800 XT look underwhelming.

Also, as has been mentioned, N4P is not a 'new node'. If it were an Intel node, it would be called N5+. That implies better power efficiency, but likely nothing much on the performance front. And to that point, the rumored stream processor count is only 2-3% higher than the 7800 XT's.

Really thinking a discounted 7800 XT is the way to go.
#50
_roman_
The 7800 XT was not an upgrade. I have a 7800 XT and I check the performance charts for my card quite often. The 6800 XT was very often better than the 7800 XT; personally, I see the 7800 XT as 5-10% behind the 6800 XT. That's why some called it a renamed 7700 XT.

Some other aspects were more important to me than just the performance difference between the 7800 XT and the 6800 XT / 6950 XT.