Tuesday, December 3rd 2024

AMD Radeon RX 8800 XT Reportedly Features 220 W TDP, RDNA 4 Efficiency

AMD's upcoming Radeon RX 8000 series GPUs based on the RDNA 4 architecture are just around the corner, with rumors pointing to a CES unveiling. Today, we are learning from the Seasonic wattage calculator that the Radeon RX 8800 XT GPU will feature a 220 W TDP, down from the 263 W TDP of its Radeon RX 7800 XT predecessor. While we expect better nodes to be used for making RDNA 4, the efficiency gains stem primarily from the improved microarchitectural design of the new RDNA generation. The RX 8800 XT will bring better performance while lowering power consumption by 16%. While no concrete official figures are known about RDNA 4 performance targets compared to RDNA 3, with the mid-range landscape contested by NVIDIA "Blackwell" and, as of today, Intel with Arc "Battlemage," team red must put up a good fight to remain competitive.
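As a quick sanity check on the quoted 16% figure, the reduction follows directly from the two rumored TDP numbers:

```python
# Rumored TDP figures from the report above (not official AMD specs)
rx7800xt_tdp = 263  # W, Radeon RX 7800 XT
rx8800xt_tdp = 220  # W, Radeon RX 8800 XT (rumored)

# Relative reduction versus the previous generation
reduction = (rx7800xt_tdp - rx8800xt_tdp) / rx7800xt_tdp
print(f"{reduction:.1%}")  # → 16.3%
```

Rounded down, that matches the "16% lower power" claim in the report.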

We reported on the AMD Radeon RX 8800 XT entering mass production this month, with a silicon design that marks a notable departure from previous generations. The RX 8800 XT will reportedly utilize a monolithic chip dubbed "Navi 48," moving away from the chiplet-based approach seen in the current "Navi 31" and "Navi 32" GPUs. Perhaps most intriguing are claims about the card's ray tracing capabilities. Sources suggest the RX 8800 XT will match the NVIDIA GeForce RTX 4080/4080 SUPER in raster performance while delivering a remarkable 45% improvement over the current flagship RX 7900 XTX in ray tracing. However, these claims must first be verified by independent testing, as performance gains vary case by case; games optimized for AMD or NVIDIA tend to yield better results on the favored vendor's hardware.
Sources: Seasonic Wattage Calculator, via Tom's Hardware

122 Comments on AMD Radeon RX 8800 XT Reportedly Features 220 W TDP, RDNA 4 Efficiency

#101
AcE
kapone32Before we go on can we please remember that is a AMD GPU power thread.
And AMD was always a point in my arguments, yet you don't seem to have carefully read the news post as it is about much more than just "power". You don't make much sense here.
kapone32Now as far as claiming that my post was full of conspiracy theories shows me that you spent too much time subscribing to the narrative.
There is no "narrative" here, you cling to raster technology and you are not willing to accept the fact that the world is moving on and even one day raster will be dead and the whole game will be ray traced, god willing, giving us perfect graphics not rastered estimations, emulations, which we have now.
kapone32You don't appreciate that PC Games are being built using hardware from all of those vendors and as such will invest time and money into optimizing a Game for their specific platform.
And as such most games will be optimised for Nvidia and a lot also for AMD because of the console domination. In the end it balances itself out, something which you don't seem to understand, instead clinging to false narratives. Generally you're unable to argue, instead throwing around words like "you don't appreciate XY" - nonsense.
kapone32You don't seem to understand that raster is the foundation of 3D PC Gaming so the features you champion are moot in this thread.
You don't seem to understand that technology is ever evolving and your mindset is set in stone, so unless you want to become a dinosaur, it is a You problem here, whereas I'm trying to appreciate everything and understand everything.
kapone32You see the 220W argument in this thread has everything to do with raster performance.
Then go and read the whole thread and not just a small part of it, the Ray Tracing argument was one of the main talking points in this thread and even in the OP, you didn't even read the news entry itself, it seems. Very weird.
kapone32Intel cards? What are we trying to turn this into? A let's bash AMD for 10 pages thread?
Instead of reading my post and trying to get into useful counter arguments, or arguments, you're making up nonsense like this. Where exactly was AMD bashed? Saying AMD has suboptimal Ray Tracing, is nothing special, as everyone knows this, saying further more that they perform only bad in one "Nvidia game" (CP2077), proves the point that this is coincidental and that CP2077 isn't a "Nvidia game" per se, just a game with heavy ray tracing which will need proper RT cores and not a compromise that consists of bumped up TMUs with some ray tracing abilities. And if you read carefully I said where AMD RT is good and where it is not, which then hardly can be called "bashing".
kapone32BTW Hairworks are a feature just like how RT is today.
Wrong, Hairworks is a proprietary graphical tool to make hair look "better" or more realistic whereas ray tracing is a decades old technology that is well known and not invented by Nvidia, you couldn't in fact be more wrong. IF RT were like Hairworks, it would not be part of DirectX 12, btw. - which means it is something general, as I explained and not some proprietary tool or feature.
kapone32Thank god I read Spy vs Spy as a kid and understand that this has nothing to do with the truth of this thread.
Next time try to properly argue against my points instead of deflecting with nonsense such as this.
#102
AusWolf
AcEAnyone who uses DLSS in 1080p has to accept "okay". And "okay" here means rather "good" still.
I'd rather not rely on DLSS at 1080p because 1. It's crap and 2. 1080p doesn't require a nuclear reactor to run your games anyway.

Upscaling is for 4K and for people running on an ultra low budget.
#103
AcE
AusWolfI'd rather not rely on DLSS at 1080p because 1. It's crap and 2. 1080p doesn't require a nuclear reactor to run your games anyway.
1. It's not, you're just living in the past and are intolerant 2. depends which game and what settings, if RT is on or not - so your opinion isn't necessarily true. Also depends on the GPU; if it's an older GPU it may well be in need of upscaling to make the game playable - oh and 3. upscaling is often times better than native... because TAA sucks, plainly. You're just so wrong.
AusWolfUpscaling is for 4K and for people running on an ultra low budget.
That is nonsense as well, the whole sentence. First of all upscaling isn't just for 4K, mainly it was introduced by Nvidia to make RT "playable" with enough fps, say 60+. Second, I'm on a high end GPU and I still use upscaling, so you're just wrong. Again, depends on the game, settings and GPU - and also FPS target. If your FPS targets are high enough upscaling is ALWAYS needed, no matter how strong the GPU is. So you're 100% wrong, in other words.
#104
AusWolf
AcE1. It's not, you're just living in the past and are intolerant
I'm intolerant to crappy-looking blurry graphics, yes.
AcE2. depends which game and what settings, if RT on or not - so your opinion isn't necessarily true. Also depends on the GPU, if it's a older GPU it will well be in need of upscaling to make the game playable
Like I said, upscaling is for you if you target 4K or if you're on a budget.
AcE- oh and 3. upscaling is often times better than native... because TAA sucks, plainly. You're just so wrong.
That's a matter of opinion. Mine differs from yours. It doesn't make it wrong.
AcEThat is nonsense as well, the whole sentence. First of all upscaling isn't just for 4K, mainly it was introduced by Nvidia to make RT "playable" with enough fps, say 60+. Second, I'm on a high end GPU and I still use upscaling, so you're just wrong. Again, depends on the game, settings and GPU - and also FPS target. If your FPS targets are high enough upscaling is ALWAYS needed, no matter how strong the GPU is. So you're 100% wrong, in other words.
Keep repeating how wrong I am, it won't make you any more right. We're discussing opinions here. You seem to like upscaling. I don't. End of.
#105
AcE
AusWolfI'm intolerant to crappy-looking blurry graphics, yes.
Which has nothing to do with this discussion, plus learn to argue.
AusWolfLike I said, upscaling is for you if you target 4K or if you're on a budget.
Which is nonsense as I have already explained in the other post, and as you have brought no argument to support your claim, you got no point, accept the L.
AusWolfThat's a matter of opinion. Mine differs from yours. It doesn't make me wrong.
Which is nonsense as I have already explained in the other post, and as you have brought no argument to support your claim, you got no point, accept the L.
AusWolfKeep repeating how wrong I am, it won't make you any more right. We're discussing opinions here. You seem to like upscaling. I don't. End of.
Too bad that I didn't just say "you're wrong", I delivered reasons *why* you're wrong. And as you are unable to counter them, you are just that, wrong, and you can accept the L and move on; you will not achieve anything else here.

I won't waste time typing up much for a person who just can't accept the L, got no arguments and is just talking nonsense. Accept the L and move on, you got no points and no arguments. A 12 year old would win this argument against you and easily so. Empty words never impressed anyone, and pupils learn to argue early in the school, something which you seemed to have missed.
#106
AusWolf
AcEWhich has nothing to do with this discussion, plus learn to argue.

Which is nonsense, and as you have brought no argument to support your claim, you got no point, accept the L.

Which is nonsense, and as you have brought no argument to support your claim, you got no point, accept the L.

Which is nonsense, and as you have brought no argument to support your claim, you got no point, accept the L.

I won't waste time typing up much for a person who just can't accept the L, got no arguments and is just talking nonsense. Accept the L and move on, you got no points and no arguments. A 12 year old would win this argument against you.
See my posts above for argument. In short, upscaling looks bad at 1080p. It needs a high input resolution to look acceptable to me. If you disagree, that's you. There's no L or W here so stop being a child.
#107
AcE
AusWolfSee my posts above for argument. In short, upscaling looks bad at 1080p. It needs a high input resolution to look acceptable to me. If you disagree, that's you.
When reputable websites say that DLSS looks good at 1080p, that's probably a objective opinion, doesn't matter what "opinion" you have personally, as you seem to be generally biased against it. And I have already refuted all your other "arguments" (where there were any at all). In short, you're still absolutely wrong with your "opinion". DLSS 1080p (Quality), generally achieves "near native levels", which makes your opinion objectively wrong.
#108
AusWolf
AcEWhen reputable websites say that DLSS looks good at 1080p, that's probably a objective opinion, doesn't matter what "opinion" you have personally, as you seem to be generally biased against it. And I have already refuted all your other "arguments" (where there were any at all). In short, you're still absolutely wrong with your "opinion". DLSS 1080p (Quality), generally achieves "near native levels", which makes your opinion objectively wrong.
There is no such thing as an objective opinion. And I don't need "reputable websites" to tell me what I see on my own monitor. I can decide what I like and don't like on my own, thank you very much. If you're a child, and you constantly need to be told what to think because you're incapable of doing that for yourself, that isn't my problem.

If you said that you liked upscaling at 1080p based on your own experience with it, that would have been a totally different story. That is your opinion and I can accept that. But no, you had to bring up "reputable websites". Maybe that makes you the intolerant one here. Think about it.
#109
AusWolf
AcECongrats, it was a objective fact then, you still got no point. Semantics.
Again: we're not discussing facts, but opinions. It's not semantics.
AcEI'm not the manchild here who probably has never used the tech, is a dinosaur, lives in the past, is constantly talking nonsense and is a professional deflector. And keep deflecting, as you still got no arguments, you still got no point.
I used the tech, and formed my opinion that I didn't like it. I'm not gonna explain myself any further as there is no need. Unlike you, I don't need validation from "reputable websites" or anyone else. I know what I'm seeing on my own damn screen, and I'm perfectly capable of deciding whether I like it or not.
AcEI have very much already said that, maybe go back and reread? I used those reputable websites to confirm my own opinion of it, yes. And this doesn't make me "intolerant" at all, I just don't "tolerate" people who talk nonsense out of their *** and got zero experience with the tech, such as yourself. As is obvious you're a opponent of the tech, a rather extreme one, so it's very unlikely that your opinion has any merit, hence I see no reason to respect your opinion. Aside from the bunch of things like "it's just for 4K", you said, that were complete nonsense, which make you even less believable.
I have experience with the tech. I used it in a few games when I was still on Nvidia and 1080p. If you don't respect my opinion then why should I respect yours?

I believe this conversation says a lot more about you than it does about me, so maybe we should stop here. You think what you want, and so do I. Cool?
#110
Ahhzz
Keep it civil, or it will be shut down and points assigned.
#111
kapone32
AcEAnd AMD was always a point in my arguments, yet you don't seem to have carefully read the news post as it is about much more than just "power". You don't make much sense here.

There is no "narrative" here, you cling to raster technology and you are not willing to accept the fact that the world is moving on and even one day raster will be dead and the whole game will be ray traced, god willing, giving us perfect graphics not rastered estimations, emulations, which we have now.

And as such most games will be optimised for Nvidia and a lot also for AMD because of the console domination. In the end it balances itself out, something which you don't seem to understand, instead clinging to false narratives. Generally you're unable to argue, instead throwing around words like "you don't appreciate XY" - nonsense.

You don't seem to understand that technology is ever evolving and your mindset is set in stone, so unless you want to become a dinosaur, it is a You problem here, whereas I'm trying to appreciate everything and understand everything.

Then go and read the whole thread and not just a small part of it, the Ray Tracing argument was one of the main talking points in this thread and even in the OP, you didn't even read the news entry itself, it seems. Very weird.

Instead of reading my post and trying to get into useful counter arguments, or arguments, you're making up nonsense like this. Where exactly was AMD bashed? Saying AMD has suboptimal Ray Tracing, is nothing special, as everyone knows this, saying further more that they perform only bad in one "Nvidia game" (CP2077), proves the point that this is coincidental and that CP2077 isn't a "Nvidia game" per se, just a game with heavy ray tracing which will need proper RT cores and not a compromise that consists of bumped up TMUs with some ray tracing abilities. And if you read carefully I said where AMD RT is good and where it is not, which then hardly can be called "bashing".

Wrong, Hairworks is a proprietary graphical tool to make hair look "better" or more realistic whereas ray tracing is a decades old technology that is well known and not invented by Nvidia, you couldn't in fact be more wrong. IF RT were like Hairworks, it would not be part of DirectX 12, btw. - which means it is something general, as I explained and not some proprietary tool or feature.

Next time try to properly argue against my points instead of deflecting with nonsense such as this.
What are you saying? In what way does this reference the 220 Watt TDP? Do you even know why GPUs are parallel processors? There is no Gaming in 3D without raster. I just don't understand how you cannot seem to fathom that Companies in every space do things to get the advantage. It is called Branding. It has been a part of PC Gaming since the dawn. When we had way more vendors you had to be careful. GPUs were worse than Sound cards in terms of support. Today there is only AMD, Intel and Nvidia. Nvidia has always used "features" in conjunction with raster but once AMD caught them they pushed hard on "Features".

Now we have individuals that talk about "reputable" websites. They are all just talking heads giving opinions and repeating talking points. What do I mean? The 7900X3D is the most Jekyll and Hyde CPU ever. The latency argument was used, and that was in nanoseconds; do you know how much 100 nanoseconds is? Or what the tangible difference is between 82 ns and 60 ns? The tech media (even though it wasn't sampled) opined on how slow the chip was, and even Windows got into it with scheduling. But say something crazy like a 5900X being better in any way and the argument dies. That is not what I mean though. Some people will just go on. Go on Amazon or Newegg and read the reviews from people who paid their own money to buy it; the overwhelming lean is very positive.

That is also what you don't understand: Hairworks is no different than DLSS or RT. If you don't think I know about Ray Tracing, that would mean you don't think I know who Trip Hawkins was. Nvidia's implementation of RT is part of DX12, but the way they make it work is not optimal for the group. They have tied it to hardware, like they always do, while AMD has turned on the hooks for RT in DX12 (or should I say Vulkan) via software, and the narrative is not something to follow, because if you put the average user in front of a screen using FSR, DLSS or XeSS, I am willing to bet they would not know the difference.

The narrative has given us USB 4 on X870E for 40GB/s USB C connections but that would benefit the Steam Deck and budget boards more. Someone buying X870E does not think about USB4 as a reason to buy the board. The vendors are not stupid though as the X670E boards that have great expansion have not moved an inch in price. Just like how someone buying a 8800XT is looking at Raster as the first peg and not RT. The tech media has sold the world on RT.


You bring up an interesting point for AMD. Most Games are made on console, and the PC-exclusive Games that don't run on consoles are usually Strategy, Space Sim and demanding ARPGs. In that space raster is still the most desirable thing you want, as raw FPS is for those that just want to plug and play. If you want to tinker, AMD software has a ton of features that most users who are/were on Nvidia do not realize. Just the other day a user posted about changing his display specs and even commented that he could not find it using AMD software. Well, I posted a screenshot of the Window that IS in AMD software and the user was happy. That is one of the main things TPU is supposed to be for, helping others out to enjoy the PC experience.

I know what you mean about CP2077, it runs fine on my 7900XT at 4K and looks awesome too. Until I turn on Ray Tracing (which is the last thing you think about in Combat), which tanks the frame rates, and then if I turn on Path Tracing it is like Molasses. You would almost think that those features Nvidia inserted into the Game were free. Do you think that the extra programming came free? The thing is, CP2077 is not City Skylines 2 where you can sit and admire your Game, so it really does not matter. Just to prove the point, I started this post at 7 AM and then got into City Skylines 2. It is now 1 PM and this paragraph is the last thing I have added. Do you know what that means? People actually enjoy Gaming with the 7900XT, and if the 8800XT can match that in raster and beat it in RT it will be a success. People will know that their RT experience will only get better without needing to splurge on another card in 2 years' time, because AMD has fine wine down to a tee. There are still plenty of users on Polaris, and the 7000 user club gets updated at least once a week. We are now putting Water blocks on our 7900 series GPUs for even better performance, so go on believing your narrative that the World is out to wax negative on Nvidia; they are doing plenty themselves, but this is not the place to have that conversation. Please don't get personal, as the hardware could not care less what the user's opinion is.
#112
Kyan
AcE3. upscaling is often times better than native... because TAA sucks, plainly. You're just so wrong.
I can partly agree on this one because TAA was the worst AA tech ever created; at least DLSS/FSR/XeSS are a bit better and have a similar fps hit. But there's a lot of better AA tech out there, and I would rather use a "downscale-based" AA than an "upscale-based" AA because, objectively, you lose original details, similar to how MP3 loses details to compress the size of the original audio.
#113
AcE
KyanI would rather use a "downscale base" AA than "upscale base" AA because objectively, you lose origninal details similar to how mp3 lose details to compress the size of the original audio.
That's not how machine / AI upscaling works. The goal is to not lose any details. And I have extensively used the tech, it works pretty well. The literal goal of upscaling is to give you native or better than native quality (at best) and reduce the performance impact of higher resolutions. DLAA on the other hand upscales the resolution, which is nothing special, but it also doesn't give you extra fps, it will lose you a bit - maybe useful for low resolutions, and not needed if you already start with 1440p or higher. A completely different matter and a different topic.

At the same time you brought a bad analogy - the same literal goal of MP3 is to only cut away details of the music which you can not even hear. And this is enough for the vast majority of people but audiophiles who mostly imagine they can hear better quality from "lossless" audio like FLAC or WAV and other options.

But this has nothing to do with MP3, as MP3 isn't machine learning or AI based and has a completely different approach.
#114
AusWolf
AcEThat's not how machine / AI upscaling works. The goal is to not lose any details. And I have extensively used the tech, it works pretty well. The literal goal of upscaling is to give you native or better than native quality (at best) and reduce the performance impact of higher resolutions. DLAA on the other hand upscales the resolution, which is nothing special, but it also doesn't give you extra fps, it will lose you a bit - maybe useful for low resolutions, and not needed if you already start with 1440p or higher. A completely different matter and a different topic.
DLSS takes a lower render resolution and upscales it to your chosen resolution. That's why it's called upscaling, not downscaling. For example, if you game at 1080p with DLSS Quality, then your game is actually rendered at 720p and upscaled to fit your monitor while using AI to "fill in the gaps" to minimise the loss in image quality. The only way it produces better than native image quality is if you consider TAA native and the game has a very bad TAA implementation in it. Then, it is better than TAA, you could say. Otherwise, with no TAA, a lower render resolution will always be inferior, no matter how much AI you throw at it.
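For the curious, the render-resolution relationship described here can be sketched in a few lines; the scale factors below are the commonly published DLSS preset values (an assumption on my part, not figures from this thread):

```python
# Commonly cited DLSS internal render scale factors (assumed, may vary per title)
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def render_resolution(width, height, mode):
    """Return the internal resolution the game renders at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(render_resolution(1920, 1080, "Quality"))      # → (1280, 720)
print(render_resolution(3840, 2160, "Performance"))  # → (1920, 1080)
```

This is why 1080p DLSS Quality means a 720p internal render, while 4K Performance still starts from a full 1080p image, leaving the upscaler far more detail to work with.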

Source: IGN and videocardz.com

DLAA is an AI-based anti-aliasing tech designed to produce better-than-native image quality. It's the opposite of DLSS, if you will.

Source: nvidia.com

This is regardless of how much one likes or dislikes the tech. These are just facts, provided purely for learning, not as a basis for further discussion (especially if said discussion only involves spewing "bullshit" and "you're wrong" out of every orifice).
#115
AcE
AusWolfDLSS takes a lower render resolution and upscales it to your chosen resolution. That's why it's called upscaling, not downscaling. For example, if you game at 1080p with DLSS Quality, then your game is actually rendered at 720p and upscaled to fit your monitor while using AI to "fill in the gaps" to minimise the loss in image quality. The only way it produces better than native image quality is if you consider TAA native and the game has a very bad TAA implementation in it. Then, it is better than TAA, you could say. Otherwise, with no TAA, a lower render resolution will always be inferior, no matter how much AI you throw at it.

Source: IGN and videocardz.com

DLAA is an AI-based anti-aliasing tech designed to produce better-than-native image quality. It's the opposite of DLSS, if you will.

Source: nvidia.com

This is regardless of how much one likes or dislikes the tech. These are just facts, provided purely for learning, not as a basis for further discussion (especially if said discussion only involves spewing "bullshit" and "you're wrong" out of every orifice).
That’s a lot of time wasted. I’m not bothering to waste more time to read this nonsense and explain this stuff again to the guy who runs crying to the mods when he’s losing arguments and generally is beyond ignorant. I’m gonna take you on block as well now, that’s just what I should’ve done from the beginning.

Notice: all the guys who disagree with me are Radeon users, it’s just denial and cope, that they have to talk down DLSS and RT. Pretty much expected biased behavior. Doesn’t change the fact that DLSS works great and RT looks good, stay jealous I guess? :) Maybe splurge a bit more money next time, that’s a lot of energy you guys waste to cope and discuss, just because you missed out on good tech. You get what you pay for, discussing this to death in a tech forum, won’t make your GPUs better than they are.

Suffice to say, the reviewer of TPU also agrees with me on the merits of DLSS, it’s just funny how these guys seem to live in their own bubbles.
#116
AusWolf
AcEThat’s a lot of time wasted. I’m not bothering to waste more time to read this nonsense and explain this stuff again to the guy who runs crying to the mods when he’s losing arguments and generally is beyond ignorant. I’m gonna take you on block as well now, that’s just what I should’ve done from the beginning.

Notice: all the guys who disagree with me are Radeon users, it’s just denial and cope, that they have to talk down DLSS and RT. Pretty much expected biased behavior. Doesn’t change the fact that DLSS works great and RT looks good, stay jealous I guess? :) Maybe splurge a bit more money next time, that’s a lot of energy you guys waste to cope and discuss, just because you missed out on good tech. You get what you pay for, discussing this to death in a tech forum, won’t make your GPUs better than they are.

Suffice to say, the reviewer of TPU also agrees with me on the merits of DLSS, it’s just funny how these guys seem to live in their own bubbles.
I gave you technical details with none of my own opinion in it to get your facts right, to make sure you don't confuse DLSS with DLAA which you obviously do, quoting from IGN, Videocardz and Nvidia themselves. Or do you suddenly not care about "reputable sites" anymore? :rolleyes:

I never ever "run crying to the mods" about anyone for being an idiot (as I find it amusing), and I will not make an exception with you, either. So please, block me. You'll be doing me a favour, believe that. ;)
#117
the54thvoid
Super Intoxicated Moderator
AcEThat’s a lot of time wasted. I’m not bothering to waste more time to read this nonsense and explain this stuff again to the guy who runs crying to the mods when he’s losing arguments and generally is beyond ignorant. I’m gonna take you on block as well now, that’s just what I should’ve done from the beginning.
Stop shitposting trash. Your own actions are getting our attention. Auswolf certainly did not report you.

Everyone else, move on.
#118
Kyan
AcENotice: all the guys who disagree with me are Radeon users, it’s just denial and cope, that they have to talk down DLSS and RT. Pretty much expected biased behavior. Doesn’t change the fact that DLSS works great and RT looks good, stay jealous I guess? :) Maybe splurge a bit more money next time, that’s a lot of energy you guys waste to cope and discuss, just because you missed out on good tech. You get what you pay for, discussing this to death in a tech forum, won’t make your GPUs better than they are.
The only use I would have for an Nvidia card is DLAA and other downscaling technology, because I just prioritize a sharp image over high fps (not in every game), Ultra++ settings, ray tracing or anything else. The majority of games these days look great at medium or high, but aliasing will always be there, depending on your pixel density and distance from your screen of course.
Personal preference and different needs push people toward different buying decisions, not only for graphics cards.

PS: you can laugh at my post, but note that I did like one of yours which was pertinent. I'm not closed-minded; I tend to be the opposite.