Tuesday, December 3rd 2024
AMD Radeon RX 8800 XT Reportedly Features 220 W TDP, RDNA 4 Efficiency
AMD's upcoming Radeon RX 8000 series GPUs based on the RDNA 4 architecture are just around the corner, with rumors pointing to a CES unveiling. Today, courtesy of the Seasonic wattage calculator, we are learning that the Radeon RX 8800 XT will feature a 220 W TDP, down from the 263 W TDP of its Radeon RX 7800 XT predecessor. While we expect RDNA 4 to be made on a better node, the efficiency gains stem primarily from the improved microarchitectural design of the new RDNA generation. The RX 8800 XT should deliver better performance while lowering power consumption by around 16%. While no concrete official figures are known about RDNA 4 performance targets compared to RDNA 3, if AMD plans to stay competitive in the mid-range against NVIDIA "Blackwell" and, as of today, Intel with Arc "Battlemage," team red must put up a good fight.
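For reference, the ~16% figure follows directly from the two reported TDP values; a quick arithmetic sketch (the wattages come from the report, everything else is just math):

```python
# Back-of-the-envelope check of the reported TDP reduction.
RX_7800_XT_TDP = 263  # watts, current generation
RX_8800_XT_TDP = 220  # watts, reported for RDNA 4

reduction = (RX_7800_XT_TDP - RX_8800_XT_TDP) / RX_7800_XT_TDP
print(f"TDP reduction: {reduction:.1%}")  # -> 16.3%, matching the ~16% claim
```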
We reported on the AMD Radeon RX 8800 XT entering mass production this month, with a silicon design that notably departs from previous generations. The RX 8800 XT will reportedly utilize a monolithic chip dubbed "Navi 48," moving away from the chiplet-based approach seen in the current "Navi 31" and "Navi 32" GPUs. Perhaps most intriguing are claims about the card's ray tracing capabilities. Sources suggest the RX 8800 XT will match the NVIDIA GeForce RTX 4080/4080 SUPER in raster performance while delivering a remarkable 45% improvement over the current flagship RX 7900 XTX in ray tracing. However, these claims must first be backed by independent testing, as performance gains vary case by case; for example, games optimized for either AMD or NVIDIA tend to yield better results on the favored vendor's cards.
Sources:
Seasonic Wattage Calculator, via Tom's Hardware
122 Comments on AMD Radeon RX 8800 XT Reportedly Features 220 W TDP, RDNA 4 Efficiency
Edit: Here's the 5th one - the early sale of cut-down chips only to release the good stuff a bit cheaper half a year later ($1200 4080 vs $1000 4080 Super). That's just dirty.
If I had a GRE, I'd probably keep it until UDNA. The 8800 XT sounds more like a side-grade as of now (let's see the reviews).
Reviews will tell us all but damn they seem so far away.
EDIT. COD@1440p ultra has to have AFMF/FSR to achieve 165 fps. I know I could lower the settings, but COD looks pants otherwise.
It was the biggest turn-around moment for ATI/AMD since the 9700 Pro launch years earlier. This is what kept AMD alive and kicking when it was greatly suffering with its 65 nm and 45 nm Phenom X4 CPUs, as AMD kept the momentum going with the HD 5870 and 5970 against the infamous GTX 480 Fermi debacle during 2009-2010.
I doubt it'll get me to swap out my 6800XT, but the 980Ti in my Steambox is having a rough go of even 2017 era games at 1080P (like Horizon Zero Dawn non-remaster if I crank up the settings).
It's a spare machine built out of old parts, so I'm not in any rush to drop serious coin on it.
I hope we either get 7600 performance at x5xx prices (~$150) or ~4060Ti level performance for 7600 prices (~$270). That would either make the N44 a good buy or put downward pressure on used parts that are fit for purpose.
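To put rough numbers on that trade-off, here's a perf-per-dollar sketch; the relative performance values are placeholder assumptions for illustration (RX 7600 = 100, 4060 Ti assumed ~25% faster), not benchmarks:

```python
# Hypothetical value comparison of the two hoped-for N44 scenarios.
# Performance figures are rough placeholders, not measured data.
scenarios = {
    "RX 7600-class perf at ~$150":    (100, 150),
    "RTX 4060 Ti-class perf at ~$270": (125, 270),
}

for name, (perf, price) in scenarios.items():
    print(f"{name}: {perf / price:.3f} perf per dollar")
```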
Upscaling doesn't need to improve. Games need to require sensible amount of GPU power at native.
See Minecraft or Fortnite - they barely have any graphics at all, and they still don't run at 1000 FPS, as they should in a normal world.
PS. "Laughing" at my post, talking nonsense and having no proper arguments, is truly laughable. :)
Games/comments/bzgppb
Here we can see that the Intel card, comparable to the RTX 30 series (the same gen), "only" loses about 57% performance, the comparable 3060 loses 56%, whereas comparable RDNA 2 and even RDNA 3 cards lose over 68% here and struggle greatly. This shows the game wants proper RT cores and runs much better with them; it also shows it is *not* Nvidia optimised, because it runs comparably well on Intel Arc. Given that RT is an optional extra in the game, and the game runs extremely well on Radeon outside of ray tracing, one *cannot* argue that the game is "Nvidia optimised". You can only argue that the game likes proper RT cores, and that's it. Splash screens don't mean much; plenty of games with AMD or Nvidia splash screens run excellently on both companies' cards. Again, this is a somewhat delusional take from you; someone who knows games well would actually know this, yet I have to explain this fact to you.
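For anyone unsure how those loss percentages are derived, it's just (raster FPS − RT FPS) / raster FPS; a minimal sketch with placeholder FPS values chosen to reproduce the quoted figures, not the actual benchmark numbers:

```python
# How the "performance loss with RT on" percentages are computed.
# FPS values are illustrative placeholders, not real measurements.
cards = {
    "Arc (RTX 30-class)": {"raster_fps": 100, "rt_fps": 43},
    "RTX 3060":           {"raster_fps": 100, "rt_fps": 44},
    "RDNA 2/3 card":      {"raster_fps": 100, "rt_fps": 31},
}

for name, fps in cards.items():
    loss = (fps["raster_fps"] - fps["rt_fps"]) / fps["raster_fps"]
    print(f"{name}: loses {loss:.0%} with ray tracing enabled")
```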
If a game runs badly on one company's cards today, it is coincidental; we don't live in Crysis 2 times anymore, where Nvidia would pay the dev to implement invisible tessellation so that Radeon cards tank (and ATI likewise pulled similar tricks back in the day). With social media and the internet, stuff like that would spread like wildfire and ruin Nvidia's reputation in seconds.