Tuesday, December 3rd 2024

AMD Radeon RX 8800 XT Reportedly Features 220 W TDP, RDNA 4 Efficiency

AMD's upcoming Radeon RX 8000 series GPUs based on the RDNA 4 architecture are just around the corner, with rumors pointing to a CES unveiling. Today, courtesy of a listing in the Seasonic wattage calculator, we are learning that the Radeon RX 8800 XT will feature a 220 W TDP, down from the 263 W TDP of its Radeon RX 7800 XT predecessor. While we expect RDNA 4 to be made on a better node, the efficiency gains reportedly stem primarily from the improved microarchitectural design of the new RDNA generation. The RX 8800 XT is expected to deliver better performance while cutting power consumption by roughly 16%. No concrete official figures are known about RDNA 4 performance targets compared to RDNA 3, but with NVIDIA "Blackwell" on the way and, as of today, Intel in the mid-range with Arc "Battlemage," team red must put up a good fight to remain competitive.
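For reference, the roughly 16% figure falls straight out of the two reported TDPs; below is a minimal sketch of that arithmetic in Python, using only the rumored wattages quoted above.

# Claimed power reduction, using the TDP figures reported above
rx_7800_xt_tdp = 263  # W, current card
rx_8800_xt_tdp = 220  # W, rumored successor
reduction = (rx_7800_xt_tdp - rx_8800_xt_tdp) / rx_7800_xt_tdp
print(f"TDP reduction: {reduction:.1%}")  # prints "TDP reduction: 16.3%"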

We previously reported on the AMD Radeon RX 8800 XT entering mass production this month, with a silicon design that notably departs from previous generations. The RX 8800 XT will reportedly utilize a monolithic chip dubbed "Navi 48," moving away from the chiplet-based approach seen in the current "Navi 31" and "Navi 32" GPUs. Perhaps most intriguing are claims about the card's ray tracing capabilities. Sources suggest the RX 8800 XT will match the NVIDIA GeForce RTX 4080/4080 SUPER in raster performance while delivering a remarkable 45% improvement over the current flagship RX 7900 XTX in ray tracing. However, these claims must first be backed by independent testing, as performance gains depend heavily on the specific workload; games optimized for either AMD or NVIDIA tend to yield better results on the favored vendor's graphics cards.
Sources: Seasonic Wattage Calculator, via Tom's Hardware

122 Comments on AMD Radeon RX 8800 XT Reportedly Features 220 W TDP, RDNA 4 Efficiency

#76
Blaeza
AusWolfI'll get one. My 6750 XT is getting ready for an early retirement, and I don't want Nvidia because 1. it's overpriced, and 2. AMD works better on Linux thanks to the open-source driver integrated into the kernel.

I still wouldn't mind a decent price tag, though.
Two things put me off Nvidia. Stupid connectors that melt and a lack of Vram. Otherwise I would probably have a 4070 Ti Super
Posted on Reply
#77
AusWolf
BlaezaTwo things put me off Nvidia. Stupid connectors that melt and a lack of Vram. Otherwise I would probably have a 4070 Ti Super
Oh yes, with those, I have 4 reasons, not 2.

Edit: Here's the 5th one - the early sale of cut-down chips only to release the good stuff a bit cheaper half a year later ($1200 4080 vs $1000 4080 Super). That's just dirty.
Posted on Reply
#78
Blaeza
AusWolfOh yes, with those, I have 4 reasons, not 2.
I'm faithful to AMD but if a 20gb vram great price to performance Nvidia card arrives I might have to put the issues aside and just get one. I'm worried the 8800XT will not be a big enough % boost in raster to warrant an upgrade. Might just stick with the GRE until UDNA. or not. Oh, I dunno! :cry:
Posted on Reply
#79
AusWolf
BlaezaI'm faithful to AMD but if a 20gb vram great price to performance Nvidia card arrives I might have to put the issues aside and just get one. I'm worried the 8800XT will not be a big enough % boost in raster to warrant an upgrade. Might just stick with the GRE until UDNA. or not. Oh, I dunno! :cry:
I'm not faithful to any company, but I've grown sick of Microsoft's bullshit, so AMD it is thanks to their better Linux support. Not that I mind, I'm not entirely happy with Nvidia's recent attitude, to be fair.

If I had a GRE, I'd probably keep it until UDNA. The 8800 XT sounds more like a side-grade as of now (let's see the reviews).
Posted on Reply
#80
kapone32
BlaezaI'm faithful to AMD but if a 20gb vram great price to performance Nvidia card arrives I might have to put the issues aside and just get one. I'm worried the 8800XT will not be a big enough % boost in raster to warrant an upgrade. Might just stick with the GRE until UDNA. or not. Oh, I dunno! :cry:
Does any Game you play make you feel like an upgrade? At that point we can turn on FSR anyway. UDNA for me as I have argued for 7900 owners.
Posted on Reply
#81
Blaeza
AusWolfI'm not faithful to any company, but I've grown sick of Microsoft's bullshit, so AMD it is thanks to their better Linux support. Not that I mind, I'm not entirely happy with Nvidia's recent attitude, to be fair.

If I had a GRE, I'd probably keep it until UDNA. The 8800 XT sounds more like a side-grade as of now (let's see the reviews).
kapone32Does any Game you play make you feel like an upgrade? At that point we can turn on FSR anyway. UDNA for me as I have argued for 7900 owners.
The only reason I've got the upgrade bug is that going from a 6900XT to my GRE was also a side grade, but I had no choice when the 6900XT died.

Reviews will tell us all but damn they seem so far away.

EDIT: COD at 1440p ultra has to have AFMF/FSR to achieve 165 fps. I know I could lower the settings, but COD looks pants otherwise.
Posted on Reply
#82
AusWolf
kapone32Does any Game you play make you feel like an upgrade? At that point we can turn on FSR anyway. UDNA for me as I have argued for 7900 owners.
I'm looking for an upgrade exactly because I have to turn on FSR sometimes. Whatever people say, upscaling below 4K looks like crap, and is only meant to be a last resort before an upgrade.
Posted on Reply
#83
Krit
I'll also probably go from my 7800 to AMD again, but why? Because new-gen Nvidia these days is always expensive - it doesn't even matter whether it's new or used, it's expensive either way, and price/performance for used RTX 4000 GPUs literally does not exist at all. People usually sell "used" RTX 4000 series GPUs for new-GPU prices or something like -5% off; if it's something like -10% off, it's a miracle and a very rare phenomenon. In the good old days, people would laugh at and trash those who tried to sell a used GPU for the new price or just 5-10% off. It's amazing how naive people are these days and how they can't see beyond their noses. And yeah, all that RTX 4000 series lower/mid-range GPU renaming, downgrading cards one tier lower - they really were testing the waters, but sadly it turns out the average consumer is too dumb and will pay whatever Nvidia wants.

Posted on Reply
#84
AnotherReader
BlaezaI'm faithful to AMD but if a 20gb vram great price to performance Nvidia card arrives I might have to put the issues aside and just get one. I'm worried the 8800XT will not be a big enough % boost in raster to warrant an upgrade. Might just stick with the GRE until UDNA. or not. Oh, I dunno! :cry:
If you have a GRE, then like the 7900 XT before it, the 8800 XT is unlikely to be more than 20% faster than it in rasterization. Ray tracing will hopefully see far more significant gains, but if that's important to you, you would probably choose Nvidia.
Posted on Reply
#85
3valatzy
Now would be the perfect time for AMD to pull an HD 4870 (go back to summer 2008, just a few weeks after Nvidia launched their GTX 280 for $649). AMD decided to market the HD 4870, which was about 85% as fast as the GTX 280, at less than half the price ($300)! It was one of the best-selling video cards in a long time, with 800 shaders compared to only 320 for the previous-generation HD 3870. AMD made sure to have plenty in stock, saving them up for months and then deciding the exact clock speed/voltage at the last minute for AIBs to set in their BIOS, to steal the thunder from Nvidia's launch. A lot of Nvidia customers were pissed, so Nvidia started issuing $150 make-up rebates to those who bought GTX 280 cards at launch and dropped the price to $499. The GTX 280 chip was monstrous - nearly twice as big, with a 512-bit bus - compared to the HD 4870, which was much smaller, cooler, and doing great with only a 256-bit bus, though with only 512 MB of VRAM at first.

It was ATI/AMD's biggest turn-around moment since the 9700 Pro launch. This is what kept AMD alive and kicking while they were greatly suffering with their 65 nm and 45 nm Phenom X4 CPUs, as AMD kept the momentum going with the HD 5870 and 5970 against the infamous GTX 480 Fermi debacle during 2009-2010.
Posted on Reply
#86
GodisanAtheist
Honestly more interested in what the 8600/XT do than the 8800XT.

I doubt it'll get me to swap out my 6800XT, but the 980Ti in my Steambox is having a rough go of even 2017 era games at 1080P (like Horizon Zero Dawn non-remaster if I crank up the settings).

It's a spare machine built out of old parts, so I'm not in any rush to drop serious coin on it.

I hope we either get 7600 performance at x5xx prices (~$150) or ~4060Ti level performance for 7600 prices (~$270). That would either make the N44 a good buy or put downward pressure on used parts that are fit for purpose.
Posted on Reply
#87
AcE
BlaezaTwo things put me off Nvidia. Stupid connectors that melt and a lack of Vram.
The connector thing is a myth; it was user error caused by not plugging the connector in properly, so some people failed the idiot-check (yes, the connector is not idiot-proof - at least it wasn't, they improved it later). Lack of VRAM, well, "just buy the more expensive cards" /jk.
BlaezaMight just stick with the GRE until UDNA. or not. Oh, I dunno! :cry:
Always just buy if you *need* more performance, not if you want to buy something for fun.
BlaezaThe only reason I've got the upgrade bug is that going from a 6900XT to my GRE was also a side grade, but I had no choice when the 6900XT died.
How did it die?
AusWolfWhatever people say, upscaling below 4K looks like crap, and is only meant to be a last resort before an upgrade.
DLSS also looks good at 1440p, I tried this myself. Reviewers even say 1080p DLSS is still good. FSR needs to improve.
3valatzyIt was ATI/AMD's biggest turn-around moment since the 9700 Pro launch. This is what kept AMD alive and kicking while they were greatly suffering with their 65 nm and 45 nm Phenom X4 CPUs, as AMD kept the momentum going with the HD 5870 and 5970
Yes, but that was all ATI, and those were simpler times. The fact is Nvidia always learns and improves; Jensen is a perfectionist, and he has no qualms about driving his opponents to the brink of extinction either - he's relentless.
GodisanAtheistdoubt it'll get me to swap out my 6800XT, but the 980Ti in my Steambox is having a rough go of even 2017 era games
Why not just connect a HDMI 2.1 cable (fibre optic) from the main pc and use that?
Posted on Reply
#88
Blaeza
AcEHow did it die?

1 of 2 things. Severe cannabis inhalation or I ESD'd it whilst building my AM5 rig. Not sure. I will get it fixed eventually.
Posted on Reply
#89
AcE
@Blaeza
1 of 2 things. Severe cannabis inhalation or I ESD'd it whilst building my AM5 rig. Not sure. I will get it fixed eventually.
Funny. Aren't most parts of tech today protected against ESD? I'm pretty sure at least mainboards are. Well, unlucky, but good luck with the fix
Posted on Reply
#90
AusWolf
AcEDLSS also looks good at 1440p, I tried this myself. Reviewers even say 1080p DLSS is still good. FSR needs to improve.
I tried DLSS on 1080p with Control and Cyberpunk, and didn't like it.

Upscaling doesn't need to improve. Games need to require a sensible amount of GPU power at native.
Posted on Reply
#91
AcE
AusWolfI tried DLSS on 1080p with Control
That's a very old version of DLSS, I wouldn't count that anymore. But DLSS in 1080p is "okay", nobody said it's flawless.
AusWolfUpscaling doesn't need to improve. Games need to require a sensible amount of GPU power at native.
That ship has long sailed and there's no going back. Lazy devs doing their part + RT making it entirely impossible even if the games are super optimised, just like CP2077 is. They just use that headroom to make the RT part even more extreme.
Posted on Reply
#92
3valatzy
AusWolfUpscaling doesn't need to improve. Games need to require a sensible amount of GPU power at native.
AcEThat ship has long sailed and there's no going back. Lazy devs doing their part + RT making it entirely impossible even if the games are super optimised, just like CP2077 is. They just use that headroom to make the RT part even more extreme.
The games are not optimised at all. CP2077 with its ancient and ugly graphics environment must run literally on potato PCs. But instead, it is made as if the game is really demanding, so that Nvidia is allowed to continue selling its overpriced junk.
See Minecraft or Fortnite - they barely have any graphics at all, and they still don't run at 1000 FPS, as they should in a normal world.
Posted on Reply
#93
AusWolf
AcEThat's a very old version of DLSS, I wouldn't count that anymore. But DLSS in 1080p is "okay", nobody said it's flawless.
"Okay" isn't my target when I'm looking for a GPU.
AcEThat ship has long sailed and there's no going back. Lazy devs doing their part + RT making it entirely impossible even if the games are super optimised, just like CP2077 is. They just use that headroom to make the RT part even more extreme.
Unless you're willing to sacrifice RT and ultra graphics, which I am.
Posted on Reply
#94
AcE
3valatzyThe games are not optimised at all. CP2077 with its ancient and ugly graphics environment must run literally on potato PCs. But instead, it is made as if the game is really demanding, so that Nvidia is allowed to continue selling its overpriced junk.
Games in general, maybe not - but CP2077 is. It's supremely optimised, even; this is not a run-of-the-mill game, it's one of the biggest games ever, they did their best to optimise it, and it runs way better than comparable games on the UE5 engine.
3valatzyCP2077 with its ancient and ugly graphics environment must run literally on potato PCs. But instead, it is made as if the game is really demanding, so that Nvidia is allowed to continue selling its overpriced junk.
Either 1) you never played it, or 2) you're not talking about CP2077. Also, insinuating that CDPR was paid by Nvidia to make it an "Nvidia game" is kinda baseless and doesn't make much sense. More likely they put RT in that game to make it the best game technically / graphically possible.
AusWolf"Okay" isn't my target when I'm looking for a GPU.
Anyone who uses DLSS at 1080p has to accept "okay". And "okay" here still means closer to "good".

PS. "Laughing" at my post, talking nonsense and having no proper arguments, is truly laughable. :)
Posted on Reply
#95
3valatzy
AcEinsinuating that CDPR was paid by Nvidia to make it an "Nvidia game" is kinda baseless and doesn't make much sense
:nutkick:


Games/comments/bzgppb
Posted on Reply
#96
AcE
3valatzy:nutkick:


Games/comments/bzgppb
And? A partnership is nothing special and still does not make it a game "paid by Nvidia to feature RT to sell graphics cards". :laugh::roll: It seems you can't differentiate between a "partnership" and "being paid by company X to make a game a huge ad". The game runs great on Radeon GPUs until you activate RT, and it's not really CDPR's fault that Radeon cards have bad RT / no proper RT cores, as was already discussed in this thread.
Posted on Reply
#97
kapone32
AcEAnd? A partnership is nothing special and still does not make it a game "paid by Nvidia to feature RT to sell graphics cards". :laugh::roll: It seems you can't differentiate between a "partnership" and "being paid by company X to make a game a huge ad". The game runs great on Radeon GPUs until you activate RT, and it's not really CDPR's fault that Radeon cards have bad RT / no proper RT cores, as was already discussed in this thread.
You do realize that this has been happening for years, even Decades. It is just that this is a new age so people defend Nvidia even when they are told something that has been the truth for a long time. This is no different than Hairworks but today RT has become the narrative. Do you know what else was the narrative? People wanting USB 4 instead of PCIe. If you don't think that CDPR was given money and aid by Nvidia you are clearly mistaken. Money BTW could be something like an Nvidia engineer working for CDPR to implement RT on Nvidia's dime. The secret is that these companies do not pay for raster but features. Just like how CP2077 is about the only Game that supports Path Tracing. Just like how Tomb Raider was the only Game that supported Hairworks. For that exclusivity, something has to be given. If you have played Total War Games, you would have seen Intel, AMD and Nvidia splash screens while the Game is loading. Do you think those are there just because?
Posted on Reply
#98
GodisanAtheist
AcEWhy not just connect a HDMI 2.1 cable (fibre optic) from the main pc and use that?
- Cuz I don't want to.
Posted on Reply
#99
AcE
kapone32You do realize that this has been happening for years, even Decades. It is just that this is a new age so people defend Nvidia
In this case I defend CDPR and not Nvidia. CP2077 is a game that was in the works for over a decade, so thinking the game was "made for Nvidia", like the delusional take of the other guy, is just that: delusional. And it makes sense for a generational game to have the best tech possible, which again underlines that the issues have nothing to do with Nvidia but with the RT cores of AMD being too weak to handle the game in Path Tracing mode. The PT mode is also not a gimmick; it looks meaningfully better than the regular RT mode.
kapone32If you don't think that CDPR was given money and aid by Nvidia you are clearly mistaken.
Give a factual source that they have been given money; your take is laughable. Especially because CDPR isn't one of the small companies with no money - no, they actually have a lot of money compared to most companies, hence why CP2077 is one of the most expensive games ever made. "Aid", on the other hand, is nothing special; AMD almost certainly aided them with technical advice too.
kapone32This is no different than Hairworks but today RT has become the narrative.
Nonsense. Ray tracing has been around for decades; the only thing that changed is that Nvidia brought it to GPUs for real-time ray tracing. You should be thankful that Nvidia improves technology; instead you're making up weird conspiracy theories and comparing decades-old tech with proprietary tech like "Hairworks", which is the most nonsensical comparison ever.
kapone32Money BTW could be something like an Nvidia engineer working for CDPR to implement RT on Nvidia's dime.
Yea, give a source for that, or I'll again call that complete nonsense. Nvidia is not a game developer, and they will not work on games for companies, they will give some technical advice and that's it. So far, this is your worst of the worst take, and very delusional, like, you don't know much about game development at all.
kapone32Just like how CP2077 is about the only Game that supports Path Tracing.
Wrong, there are multiple, and Alan Wake 2 actually runs pretty well on Radeon; I checked it yesterday to be sure that Nvidia has nothing to do with RT performance in games. It is more or less only CP2077 where Radeon cards have issues, because the big city is simply too much tracing for them to handle. The game needs proper RT cores, whereas many other games run fine with Radeon RT - even Alan Wake 2, which uses Path Tracing and is also Nvidia "sponsored", which easily contradicts your conspiracy theories. The funny thing is, even Intel has fewer problems than Radeon in CP2077 with PT. Why? Again, because they have proper RT cores; AMD simply skimped too much on RT - and this is also why Radeon will soon have proper RT cores with RDNA 4, not just "Ray Accelerators" (a small part of the TMUs) anymore.


Here we can see that the Intel card, comparable to the RTX 30 series (the same generation), "only" loses about 57% performance and the comparable RTX 3060 loses 56%, whereas comparable RDNA 2 and even RDNA 3 cards lose over 68% and struggle greatly. This proves that the game needs proper RT cores and then runs way better; it proves that it is *not* Nvidia-optimised, because it runs comparably well on Intel Arc. Given that RT is an optional extra in the game and the game runs extremely well on Radeon outside of ray tracing, you *cannot* make the argument that the game is "Nvidia optimised". You can only make the argument that the game likes proper RT cores, and that's it.
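(For clarity, those "loses X%" figures are just the relative drop from raster frame rates to RT/PT frame rates; here's a minimal sketch of that arithmetic - the labels and FPS values below are hypothetical placeholders, not measurements from the chart:)

# Relative performance loss when enabling heavy RT/PT; numbers are made-up placeholders
def rt_loss(raster_fps: float, rt_fps: float) -> float:
    return 1 - rt_fps / raster_fps

print(f"Card with proper RT cores:  {rt_loss(60, 26):.0%} loss")  # ~57%
print(f"Card with weak RT hardware: {rt_loss(60, 19):.0%} loss")  # ~68%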
kapone32If you have played Total War Games, you would have seen Intel, AMD and Nvidia splash screens while the Game is loading.
Splash screens don't mean much; a lot of games with AMD or Nvidia splash screens run excellently on both companies' cards. Again, this is a somewhat delusional take from you - someone who knows games well would actually know this, yet I have to explain this fact to you.

If a game runs badly on one company's cards today, it is coincidental; we don't live in Crysis 2 times anymore, where Nvidia would pay the dev to implement invisible tessellation so that Radeon cards tank (and ATI did similar things back in the day). With social media and the internet, stuff like that would spread like wildfire and ruin Nvidia's reputation in seconds.
Posted on Reply
#100
kapone32
AcEIn this case I defend CDPR and not Nvidia. CP2077 is a game that was in the works for over a decade, so thinking the game was "made for Nvidia", like the delusional take of the other guy, is just that: delusional.
Before we go on, can we please remember that this is an AMD GPU power thread. Claiming that my post was full of conspiracy theories shows me that you have spent too much time subscribing to the narrative. You don't appreciate that PC Games are built using hardware from all of those vendors, and those vendors will invest time and money into optimizing a Game for their specific platform. You don't seem to understand that raster is the foundation of 3D PC Gaming, so the features you champion are moot in this thread. You see, the 220 W argument in this thread has everything to do with raster performance. Intel cards? What are we trying to turn this into? A let's-bash-AMD-for-10-pages thread? Thank god I read Spy vs Spy as a kid and understand that this has nothing to do with the truth of this thread. BTW, Hairworks was a feature just like RT is today.
Posted on Reply