Monday, December 2nd 2024

AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

Apparently, AMD's next-generation gaming graphics card is closer to launch than anyone in the media expected, with mass-production of the so-called Radeon RX 8800 XT poised to begin later this month, if sources on ChipHell are to be believed. The RX 8800 XT will be the fastest product of AMD's next generation, and will sit in the performance segment, succeeding the current RX 7800 XT. There will not be an enthusiast-segment product in this generation, as AMD looks to consolidate in the key market segments with the most sales. The RX 8800 XT will be powered by AMD's next-generation RDNA 4 graphics architecture.

There are some spicy claims being made about the RX 8800 XT. Apparently, the card will rival the current GeForce RTX 4080 or RTX 4080 SUPER in ray tracing performance, which would mean a massive 45% increase in RT performance over even the current flagship RX 7900 XTX. Meanwhile, the power and thermal footprint of the GPU are expected to shrink with the switch to a newer foundry process, with the RX 8800 XT expected to have 25% lower board power than the RX 7900 XTX. Unlike the "Navi 31" and "Navi 32" powering the RX 7900 series and RX 7800 XT, respectively, the "Navi 48" driving the RX 8800 XT is expected to be a monolithic chip built entirely on a new process node. If we were to guess, this could very well be TSMC N4P, a node AMD is using for everything from its "Zen 5" chiplets to its "Strix Point" mobile processors.
Sources: ChipHell, Wccftech, VideoCardz
Add your own comment

213 Comments on AMD Radeon RX 8800 XT RDNA 4 Enters Mass-production This Month: Rumor

#76
3valatzy
TomorrowFaster than that. Reported to be 7900 XT (raster) performance. This is 36% faster than 6800 XT.
I will believe it when I see it. Because it is also reported that RDNA 4 will be simply a bug fix of RDNA 3.
4096 shaders of RX 8800 XT vs. 4608 shaders of RX 6800 XT.
Unless its clocks are well beyond 3 GHz, and it has some other secret specs, it won't be that fast.
Posted on Reply
#77
Dr. Dro
Vya Domus@john_

See, this is what I meant when I said it doesn't matter. That's how a consumer looks at it, it wouldn't matter if AMD offered 4080 RT performance for say 500$ the consumer only knows that "Nvidia walks all over it" and that's the end of the story.
You see, 4080 performance is nice and all, but the 4080 came out in 2022 and it's the second-tier chip. If an equivalent to a three-year-old second-tier chip is all they can deliver by 2025, it doesn't take a genius to figure out why it's not impressive anymore. And there's the ecosystem gap to close.

500 is... about what this level of performance coupled with AMD's software is worth. At 500 they won't repeat Polaris, but it will be a reasonable level of performance for an attractive price, at least from our current perspective. It's just not impressive for someone like me, who's had a 4080 for that long.
Posted on Reply
#78
Vya Domus
Dr. Dro500 is... about what this level of performance coupled with AMD's software is worth.
Bear in mind the 4080/S is currently 1000$.

Nvidia's mindshare is legendary.
Posted on Reply
#79
Hecate91
Vya Domus@john_

See, this is what I meant when I said it doesn't matter. That's how a consumer looks at it, it wouldn't matter if AMD offered 4080 RT performance for say 500$ the consumer only knows that "Nvidia walks all over it" and that's the end of the story.
The mindshare and brand favoritism (refusing to try anything but a favorite brand) work very well even with enthusiasts. I see so many still saying AMD drivers are bad, while Nvidia is still behind the times with their outdated control panel and "game ready" drivers needed to play newer games. Or AMD has to be nearly giving cards away, but of course most consumers only want that in hopes Nvidia will lower their prices.
Even if AMD offered better than 4080 RT performance, people will still come up with some excuse like "FSR is useless" because a reviewer shows a game screenshot 10X zoomed in, or the power consumption is too high. I don't expect AMD to sell an 8800XT for $500 though, the midrange market is more in the range of $550-600.
Posted on Reply
#80
Vya Domus
I don't even know if AMD would price such a card at 500$; they probably wouldn't. I just gave that figure to make a point of how ridiculous the thought process of your typical Nvidia consumer is, which is basically every consumer if you look at the market share.

The reality is there is no way in hell Nvidia themselves are going to give you 4080S performance for 500$ in the next generation, but somehow that would just barely be considered acceptable from AMD.
Posted on Reply
#81
Dr. Dro
Vya DomusBear in mind the 4080/S is currently 1000$.

Nvidia's mindshare is legendary.
It is. And it's, in my opinion, currently some $250 overpriced. That'd be how much I value Nvidia's driver support and ecosystem features, $250.

Ada is on its way out and Blackwell on its way in. This should both cause Ada cards to devalue and new products to slot in the current market positions.

If AMD plays its cards right we might see the first price war in many years.
Hecate91The mindshare and brand favoritism (refusing to try anything but a favorite brand) work very well even with enthusiasts. I see so many still saying AMD drivers are bad, while Nvidia is still behind the times with their outdated control panel and "game ready" drivers needed to play newer games. Or AMD has to be nearly giving cards away, but of course most consumers only want that in hopes Nvidia will lower their prices.
Even if AMD offered better than 4080 RT performance, people will still come up with some excuse like "FSR is useless" because a reviewer shows a game screenshot 10X zoomed in, or the power consumption is too high. I don't expect AMD to sell an 8800XT for $500 though, the midrange market is more in the range of $550-600.
Pretty control panels don't make a quality KMD. Much less a feature complete one. Has AMD already implemented DX11 driver command lists?
Posted on Reply
#82
Vya Domus
Dr. DroHas AMD already implemented DX11 driver command lists
Don't know, but I am not sure this is a game you want to play within this discussion; Nvidia has far worse driver overhead these days.
Posted on Reply
#83
Hecate91
Vya DomusI don't even know if AMD would price such a card at 500$; they probably wouldn't. I just gave that figure to make a point of how ridiculous the thought process of your typical Nvidia consumer is, which is basically every consumer if you look at the market share.

The reality is there is no way in hell Nvidia themselves are going to give you 4080S performance for 500$ in the next generation, but somehow that would just barely be considered acceptable from AMD.
I blame the reviewers just as much as I do consumers for what the market share is now; it's only recently that reviewers and the tech media have started to complain about Nvidia having a near monopoly over the graphics card market.
And Nvidia's refusal to bring high-end RT performance to their midrange cards is why I don't care about RT performance. I don't see it as more than a gimmick for high-end enthusiast users until midrange cards can reasonably run RT without a massive performance hit.
Dr. DroPretty control panels don't make a quality KMD. Much less a feature complete one. Has AMD already implemented DX11 driver command lists?
I'm not sure about this; it's not something I care to argue about anyway, as the driver issues are massively overblown. I don't constantly update graphics drivers, and I don't buy games on launch because I don't want to be a paying beta tester; if the driver works, I leave it alone.
Having a nice control panel is just an example, but I found it weird that when Nvidia announced a control panel app, people went nuts and claimed the old control panel was just fine, even though you have to dig through several menus to find settings while waiting for things to load, or need MSI Afterburner to adjust fan settings.
Posted on Reply
#84
freeagent
Who cares about Nvidia's control panel lol.. like really? How often do you use it?

AMD only has itself to blame for where it is in the market. They should have just left it ATi, betcha performance would be stronger :)
Posted on Reply
#85
Hecate91
freeagentWho cares about Nvidia's control panel lol.. like really? How often do you use it?

AMD only has itself to blame for where it is in the market. They should have just left it ATi, betcha performance would be stronger :)
I expect a better control panel when the leather jacket man charges a premium for it, Nvidia claims they're a software company after all.
AMD has its share of screw-ups; however, the mindshare works when reviewers are always finding reasons to criticize AMD.
Posted on Reply
#86
freeagent
Hecate91I expect a better control panel when the leather jacket man charges a premium for it, Nvidia claims they're a software company after all
It's been the same for so long that I don't even care. If anything, I would probably be annoyed if they changed it lol.
Hecate91AMD has its share of screw-ups; however, the mindshare works when reviewers are always finding reasons to criticize AMD.
They are finding reasons to criticize because there is reason to criticize.. but that is not the topic of the conversation.

And let's be real here.. obviously Nvidia's flagship will be a monster, they all are. But AMD will get there too. I wouldn't be surprised if they get some help.
Posted on Reply
#87
Hecate91
To each their own; I don't expect everyone to have the same standards for a control panel. I just expect better, is all.
And well, my point is the criticisms are exaggerated, and then the consumer reads them as Nvidia being the only option to buy. Others have explained that topic, so I'm not going to argue on it.
I'd like to see AMD come back with a high-end card with UDNA, although I think midrange is a better segment to focus on.
Posted on Reply
#88
freeagent
Hecate91I just expect better, is all.
Ahh.. I see where you are coming from now.. because you spent XXX on their shiny hardware, you want the shiny software to match.

I have owned many of their flagship cards over the years, mostly before I had a family..

I do miss Coolbits though, was pretty handy..
Posted on Reply
#89
wolf
Better Than Native
Neo_MorpheusWhen we get GPUS that can do RT at 4K on entry to medium level GPUS then RT can be considered more than a gimmick.
Entry-level GPUs can't even max the majority of raster-only games at 4K and get 60+ fps; what a bar to set for your personal taste of whether RT is a gimmick or not lol.

I would say entry-level GPUs being able to use RT and get 60+ fps at 1080p is a good bar, and we're a lot closer to that. I think some people forget just how tall an order 4K really is.

-------------------------------------------------------------------------------------------------

Personally, having seen some very appreciable differences between software RT (e.g. Lumen), with all its caveats, shortcomings and immersion-breaking faults, and the same scenes using hardware RT, I don't see software overcoming what hardware can achieve any time soon, although I'd welcome any advancements there and would be pleasantly surprised. This video demonstrates it pretty well, although I wouldn't expect that to convince the inconvincible; such is their fault.

I'm very keen to see how impressive (or not) the RDNA 4 products end up being, as well as Battlemage; however, I doubt Battlemage's ability to offer a compelling upgrade to a 3080. RDNA 4 should be able to pull that off if its RT and upscaling are on point.
Posted on Reply
#90
kapone32
wolfEntry-level GPUs can't even max the majority of raster-only games at 4K and get 60+ fps; what a bar to set for your personal taste of whether RT is a gimmick or not lol.

I would say entry-level GPUs being able to use RT and get 60+ fps at 1080p is a good bar, and we're a lot closer to that. I think some people forget just how tall an order 4K really is.

-------------------------------------------------------------------------------------------------

Personally, having seen some very appreciable differences between software RT (e.g. Lumen), with all its caveats, shortcomings and immersion-breaking faults, and the same scenes using hardware RT, I don't see software overcoming what hardware can achieve any time soon, although I'd welcome any advancements there and would be pleasantly surprised. This video demonstrates it pretty well, although I wouldn't expect that to convince the inconvincible; such is their fault.

I'm very keen to see how impressive (or not) the RDNA 4 products end up being, as well as Battlemage; however, I doubt Battlemage's ability to offer a compelling upgrade to a 3080. RDNA 4 should be able to pull that off if its RT and upscaling are on point.
In the TPU review of the 7900 XT, was the 3080 faster than it? RT is so overblown that it is funny to me when people hang their hat on it. It is really sad how mindshare works. Last week I was watching a KitGuru video and they were using a 9800X3D for a build. I commented, "Why don't you use AMD GPUs so that people can see their software package? It is a real shame." KitGuru first loved my comment and then responded with "We have very good support from our Nvidia partners".

It does not matter what AMD does; the YT community and sites like TPU that have staff who love to deride Radeon are also part of the problem. Case in point: at the CES where the 40 series launched, Adam asked the guy from Tom's Hardware if there were any choices other than the 7900 XTX, and he said the 7900 XT. The look on Adam's face made him add, "What? It is only 7% slower than the XTX."

Then, because of mindshare, people make statements like "a 3050 can do ray tracing better than any AMD card," or the one that said 7900 GPUs are only 2% faster in RT and 5% faster in raster than the 6800 XT. Well, that is why I am not even interested, and it seems AMD knows its market. I can only assume that I am not the only enthusiast who has been sated with the performance of the 7900 series GPUs.

Just remember, the next time you load up anything other than a "AAA" game, that raster is not just in those but in every other game we play on PC. One thing that proves the mindshare for me: so many of the 4090s Nvidia produced were sold into China that the US government banned them from export there, and what did Nvidia do? Make the 4090D. People can tell me that is conjecture until they remember the TPU post about Nvidia already producing the 5090D.
Posted on Reply
#91
neatfeatguy
Folks are thinking too much about nothing right now.

Last I remember hearing was AMD was looking to stay out of the enthusiast end of GPUs and focus on bringing 4090 type performance in the $500 range. Rumors are just that, rumors. Hopefully though, they can hit that mark with their 8800XT. I won't hold my breath waiting, but it's nice to think about.
Posted on Reply
#92
Dr. Dro
neatfeatguyFolks are thinking too much about nothing right now.

Last I remember hearing was AMD was looking to stay out of the enthusiast end of GPUs and focus on bringing 4090 type performance in the $500 range. Rumors are just that, rumors. Hopefully though, they can hit that mark with their 8800XT. I won't hold my breath waiting, but it's nice to think about.
4080. 4090 performance at $500 would be an extreme generational increase with a very significant simultaneous reduction in cost. If they pulled that off, they'd actually dethrone Nvidia. The 4090 will remain uncontested by AMD and Intel both, even into the 50 series' lifetime.
Posted on Reply
#93
wolf
Better Than Native
kapone32In the TPU review of the 7900Xt was the 3080 faster than it?
It's faster overall in the TPU review because of the AMD-sponsored "RT Lite" titles (read: RT not worth enabling), and even then by such an insignificant amount that it made virtually zero sense to upgrade from a 3080 to the 7900 XT. That would have been almost entirely a waste of money for the way I play my PC games, bearing in mind I had already owned the 3080 for about 2 years at that point.



The 3080 is at 89% overall at both 1080p and 1440p.

But here, let's see games where RT is far more impactful; all of a sudden it's not an upgrade at all, in fact it's slower at RT.

kapone32RT is so overblown that it is funny to me when people hang their hat on it ............ It is really sad how mindshare works
If you've seen me around the forums you know that I get a lot of enjoyment from games from the immersion they give, and I happen to find lighting to be a very compelling improvement in a lot of games where RT is well executed. If I'm going to pay $500++ USD for a video card, you bet I want it to be a feature rich, envelope pushing option.

People can proclaim it's a mindshare issue and that it's sad as much as they want (and it may well be for others), but I have zero allegiance to Nvidia; they just happen to currently make the most compelling products for my use case. If and when AMD can do that (again, noting I have owned and continue to own AMD video cards), they'll always be considered when I want to upgrade.

AMD's worst enemy has been themselves, in products, marketing and PR - but I am confident and hopeful they're at least trying to turn that around, re: the focus of this news topic and FSR4 going ML.

I will gladly part with my hard-earned money when AMD makes a product that suits my use case; mindshare and tech press coverage (opinion) have essentially nothing to do with it. I look at the numbers, features, etc., and make up my own mind.

I certainly don't expect anyone else to buy a video card for the reasons I do, and I find it equally daft that anyone would expect the inverse, or claim to know why I make the choices I make without asking me first.
Posted on Reply
#94
Neo_Morpheus
wolfEntry level GPU's can't even max the majority of raster only games at 4k and get 60+ fps
Selective reading; you "missed" the "to medium level GPUs" part.

That said, there are way too many variables, like game optimization, unnecessarily bloated textures, etc., that can tank the FPS in any game.

We already have games doing over 300 fps at 1080p on low-to-medium-range GPUs because they are either optimized or simply old but good-looking enough.

But fine, let's concede a bit that 4K is not easy.
wolfI would say entry level GPU's should be able to use RT and get 60+ fps at 1080p being a good bar
Without frame generation and other shenanigans, that's simply not happening any time soon.
wolfand we're a lot closer to that
See above.
wolfI think some people forget just how tall an order 4k really is.
Again, depends on many factors.
kapone32It does not matter what AMD does the YT community and sites like TPU that have staff that love to deride Radeon are also a part of the problem. Case in point at the CES where the 40 series launched Adam asked the guy from Tom's hardware if there were any choices other than the 7900XTX and he said the 7900XT. The look on Adam's face made him say "What? it is only 7% slower than the XTX.
Indeed!

Youtuber after Youtuber video is the same nonsense.

Many of them only mention AMD to trash them, then it's back to the usual "here is my 4090".

Hell, I remember watching a car repair video where the influencer had to mention how good his 4090 was, when it wasn't related to the video at all!

It's so bad that even TPU staff members present here will openly trash AMD GPUs.
Dr. DroIf they pulled that off, they'd actually dethrone Nvidia.
The bribed influencers and the fanbois in high places (reviewers who blindly worship Ngreedia even without being bribed) would not allow that.

They won't push or state that an AMD GPU is better or that it's a better purchase.

AMD needs to somehow change that mindshare, even if it involves bribing these individuals.
Dr. DroThe 4090 will remain uncontested by AMD and Intel both, even into the 50 series' lifetime.
That right there is mindshare speaking.

Per TPU itself, a 7900 XTX is between 6 and 20% slower, but the price difference can be as high as 50%.





I call that not far enough behind to be uncontested.
wolfIf you've seen me around the forums you know that I get a lot of enjoyment from games from the immersion they give, and I happen to find lighting to be a very compelling improvement in a lot of games where RT is well executed. If I'm going to pay $500++ USD for a video card, you bet I want it to be a feature rich, envelope pushing option.
Now that makes sense, you are one of those that will stop to admire the reflections in puddles.



Hey, it is your money, so I'm glad that you get so much enjoyment over such things.
Posted on Reply
#95
Dr. Dro
Neo_MorpheusSelective reading; you "missed" the "to medium level GPUs" part.

That said, there are way too many variables, like game optimization, unnecessarily bloated textures, etc., that can tank the FPS in any game.

We already have games doing over 300 fps at 1080p on low-to-medium-range GPUs because they are either optimized or simply old but good-looking enough.

But fine, let's concede a bit that 4K is not easy.


Without frame generation and other shenanigans, that's simply not happening any time soon.

See above.

Again, depends on many factors.

Indeed!

Youtuber after Youtuber video is the same nonsense.

Many of them only mention AMD to trash them, then it's back to the usual "here is my 4090".

Hell, I remember watching a car repair video where the influencer had to mention how good his 4090 was, when it wasn't related to the video at all!

It's so bad that even TPU staff members present here will openly trash AMD GPUs.





The bribed influencers and the fanbois in high places (reviewers who blindly worship Ngreedia even without being bribed) would not allow that.

They won't push or state that an AMD GPU is better or that it's a better purchase.

AMD needs to somehow change that mindshare, even if it involves bribing these individuals.



That right there is mindshare speaking.

Per TPU itself, a 7900 XTX is between 6 and 20% slower, but the price difference can be as high as 50%.





I call that not far enough behind to be uncontested.
I would, because this is cherry-picked benchmark data from an aftermarket 450 W+ card across a limited range of tested software, and you are still entirely disregarding the software situation brought up beforehand. The RTX 4090 is always faster than the 7900 XTX no matter the scenario, with very few edge cases where they clash. While the XTX was clearly designed and architected to contest it, that is something it failed to achieve from the very start, which is why it was repositioned against the 4080 alongside pre-launch price cuts.

Blackwell vs. RDNA 4 is going to be a bloodbath complete with a ritual sacrifice.
Posted on Reply
#96
wolf
Better Than Native
Neo_MorpheusSelective reading "missed" to medium levels GPUs.
Sure you did say entry to medium - that's on me, it was not my intention to misrepresent your words.
Neo_MorpheusBut fine, lets concede a bit that 4K is not easy.
It's a massive ask from my experience of running it for 2.5 years now.
Neo_MorpheusWithout frame generation and other shenanigans, thats simply nowhere near soon.
See above.
Again, depends on many factors.
I think we're not all that far off at all; AMD is further, mind you, but an entry-level 4060 without upscaling or FG is already over 60 in many titles and not far behind in others. Path tracing is a different story, but it's not required for the difference to be visually transformative.

I'll boil this one down to different interpretations and what either of us consider to be close/soon, or acceptable performance.
Neo_MorpheusNow that makes sense, you are one of those that will stop to admire the reflections in puddles.
To some degree, sure, but more so the transformative and immersive experience RTGI can give, as well as shadows and reflections. It should go without saying that it's not limited to lighting; geometric density, character models, animations and so on all contribute. I just find that excellent lighting overall helps bring it all together.
Posted on Reply
#97
Neo_Morpheus
Dr. DroBlackwell vs. RDNA 4 is going to be a bloodbath complete with a ritual sacrifice.
Let me guess, you are already comparing a $500 GPU against a $2,000 one, correct? :roll:
Posted on Reply
#98
Dr. Dro
wolfI'll boil this one down to different interpretations and what either of us consider to be close/soon, or acceptable performance.
End of the day this is what it's all about.
Neo_MorpheusLet me guess, you are already comparing a 500 bucks GPU against a 2000 one, correct? :roll:
I do not buy $500 GPUs unless it's for like, secondary builds. And how are we completely sure it's gonna be just $500?
Posted on Reply
#99
Neo_Morpheus
Dr. DroI do not buy $500 GPUs
Then why does this article matter to you?

Unless you are one of those who want AMD to release competitive products in the hope that Ngreedia cuts their prices, just so you can keep giving Ngreedia more money, because you never had any intention of buying an AMD GPU.
Dr. DroAnd how are we completely sure it's gonna be just $500?
Well, that's the persistent rumor.
Posted on Reply
#100
Dr. Dro
Neo_MorpheusThen why does this article matter to you?

Unless you are one of those who want AMD to release competitive products in the hope that Ngreedia cuts their prices, just so you can keep giving Ngreedia more money, because you never had any intention of buying an AMD GPU.

Well, that's the persistent rumor.
I don't want AMD to release competitive products so NVIDIA lowers their prices. I want NVIDIA to lower their prices because they're backed into a corner by AMD releasing a stellar product; only then will consumers win.

I will reserve final judgment until they are released and widely benchmarked, with their auxiliary features put to the test (encoders, driver stability, etc.)
Posted on Reply