Thursday, August 3rd 2023

PowerColor AMD Radeon RX 7800 XT Pictured, Confirmed Based on "Navi 32"

PowerColor inadvertently released the first pictures of its AMD Radeon RX 7800 XT Red Devil graphics card. These pictures confirm that the RX 7800 XT is based on a maxed-out version of the "Navi 32" GPU, and not the compact "Navi 31" powering the limited-edition RX 7900 GRE. The "Navi 32" is a chiplet-based GPU, just like the "Navi 31," albeit smaller. Its 5 nm GCD (graphics compute die) physically features 60 RDNA3 compute units, which work out to 3,840 stream processors, 120 AI accelerators, 60 Ray accelerators, 240 TMUs, and possibly 128 ROPs. This GCD is surrounded by four 6 nm MCDs (memory cache dies), each of which holds a 16 MB segment of the GPU's 64 MB Infinity Cache and a 64-bit slice of its 256-bit GDDR6 memory interface.

The spec sheet put out by PowerColor confirms that the RX 7800 XT maxes out the "Navi 32," enabling all 60 CUs as well as the chip's full 256-bit memory interface, which drives 16 GB of memory. The RX 7800 XT runs its memory at 18 Gbps, and hence has 576 GB/s of memory bandwidth at its disposal. The PowerColor RX 7800 XT Red Devil has dual BIOS, and assuming the "standard/silent" BIOS runs the card at AMD reference clock speeds, we're looking at a 2210 MHz Game clock and a 2565 MHz boost clock. The Red Devil draws power from a dual 8-pin PCIe power connector setup (375 W max); the cooler is visibly smaller than the one on the company's RX 7900 series Red Devil cards. A 16+2 phase VRM powers the card. With pictures of the card out, we expect a global product launch within the next 30 days.
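The quoted bandwidth follows directly from the memory speed and bus width; as a quick sanity check, here is the arithmetic as a minimal Python sketch (the 18 Gbps and 256-bit figures are the article's, the calculation itself is generic):

  # Memory bandwidth = per-pin data rate x bus width, converted from bits to bytes.
  data_rate_gbps = 18                  # Gbps per pin (18 Gbps GDDR6)
  bus_width_bits = 256                 # Navi 32's full memory interface
  bandwidth_gb_s = data_rate_gbps * bus_width_bits / 8
  print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 576 GB/s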
Source: VideoCardz

91 Comments on PowerColor AMD Radeon RX 7800 XT Pictured, Confirmed Based on "Navi 32"

#76
Beginner Micro Device
ARFYou mean that now there are one or two more demanding games released after 2021 which would need some settings adjustments in order to run 4K with the card?
I mean the RX 6800 XT had at least one game with weak 4K performance from the start (Cyberpunk 2077; you needed to lower everything to mid-low to have a stable 60 FPS). After three years, we have even more such games. This means the card has ultimately dropped off the 4K GPU list. It's still a no-nonsense powerful GPU, but if you want limitless, or at least not crucifictional, 4K you should get yourself something faster, a 7900 XTX at least.
dalekdukesboyAll I care about is….will there be a 7950xtx and will it compete with 4090+ Nvidia?
No. Almost a year ago, AMD clearly stated they are not interested in competing with the 4090 anytime soon. We will see an AMD GPU that outperforms the RTX 4090 no sooner than 2026.
Posted on Reply
#77
Lew Zealand
Beginner Micro DeviceI mean the RX 6800 XT had at least one game with weak 4K performance from the start (Cyberpunk 2077; you needed to lower everything to mid-low to have a stable 60 FPS). After three years, we have even more such games. This means the card has ultimately dropped off the 4K GPU list. It's still a no-nonsense powerful GPU, but if you want limitless, or at least not crucifictional, 4K you should get yourself something faster, a 7900 XTX at least.
Implying that a card isn't a 4K card because it's unable to run one or even 2 or 3 games at 4K Ultra 60 fps stretches the definition a bit, as that would mean there were zero 4K video cards available when Cyberpunk came out, and there continued to be zero 4K cards for almost 2 years until the 4090 and 7900 XTX arrived. Being able to run the majority, but not all, of current AAA titles at 4K Ultra/60 is a reasonable standard.

That implies that the 4070, 6800 XT and 3080 are 4K cards by the graph below, but it's skewed by very high FPS in Doom Eternal as well as Battlefield and a few others. So I've generally targeted 75 fps on a mix of games like this as a reasonable target to account for that, which makes the 3090, 4070 Ti and 7900 XT the "real" 4K cards. If the 7800 XT somehow slots in above the 6900 XT, then IMO it qualifies.

Posted on Reply
#78
Beginner Micro Device
Lew Zealandsomehow slots in above the 6900 XT
Only if you OC everything outta it. The 7800 XT is very weak on CU count, having only 60 of them. The IPC of RDNA3 is about 3 percent higher (if it's higher at all) than that of RDNA2, so it's essentially just an overclocked RX 6800. The non-XT one. This GPU will have a very hard time catching up to the 6800 XT, not to mention how far behind the 6900 XT it is.
Lew ZealandImplying that being unable to run one or even 2 or 3 games at 4K Ultra 60fps is stretching the exclusional definition of 4K card a bit as that means there were zero 4K video cards available when Cyberpunk came out and continued to be zero 4K cards for almost 2 years until the 4090 and 7900 XTX arrived
Almost, yes. There have been no ultimate 4K GPUs. The RX 6800 series was an entry-level 4K line-up, whereas the 6900 XT was a good 4K GPU. Now, 3 years later, the 6800 series is great at 1440p and fairly acceptable at 4K, but not exactly impressive, to say the least. TLOU, Harry Potter, Cyberpunk, Jedi Survivor... We already have four games which are unplayable on such GPUs at 4K@Ultra. And the list will grow.
Lew Zealandby the graph below
Average FPS isn't everything. I'd rather measure my minimum FPS. Whatever card has 60+ there is a real 4K GPU. So yes, if you wanna have REAL 4K you should've gotten yourself at least a 6900 XT.
Posted on Reply
#79
dalekdukesboy
Beginner Micro DeviceI mean the RX 6800 XT had at least one game with weak 4K performance from the start (Cyberpunk 2077; you needed to lower everything to mid-low to have a stable 60 FPS). After three years, we have even more such games. This means the card has ultimately dropped off the 4K GPU list. It's still a no-nonsense powerful GPU, but if you want limitless, or at least not crucifictional, 4K you should get yourself something faster, a 7900 XTX at least.


Beginner Micro DeviceNo. Almost a year ago, AMD clearly stated they are not interested in competing with the 4090 anytime soon. We will see an AMD GPU that outperforms the RTX 4090 no sooner than 2026.
Lame, truly lame. AMD said just before the 7900 XTX launched that it was very competitive with the 4090… now they just threw their hands up and, as you put it, are not "interested" in competing with the 4090. Like they just gave up!!
Posted on Reply
#80
Beginner Micro Device
dalekdukesboyAMD said just before 7900xtx launched it was very competitive with 4090
Source? I probably was too late for that party, since I only heard them stating their flagship 7900 XTX is the 4080's competition, which is fair enough.
Posted on Reply
#81
Lew Zealand
Beginner Micro DeviceAverage FPS isn't everything. I'd rather measure my minimum FPS. Whatever card has 60+ there is a real 4K GPU. So yes, if you wanna have REAL 4K you should've gotten yourself at least a 6900 XT.
100% agreed on average FPS, and even that 62 FPS minimum for the 6900XT IMO isn't enough as that means about half the games can't maintain 60 FPS minimums (keeping in mind how Doom Eternal is skewing all the numbers). I generally shoot for 75 FPS all-game avg. minimums as that means there'll be a few games that still dip below 60 but judicious reductions in some settings will get those toughest to 60 all the time. Which sets the 7900 XT, 3090 Ti and 4080 as the minimum true 4K GPUs for today's games for me.

I use a 6800 XT at 1440p and while the average Minimum at 97 FPS is quite a bit higher than my arbitrary 75, there's one game (Atomic Heart) that already doesn't get 60 FPS min. But it's just one and I can reduce a setting to reach it if needed.
Posted on Reply
#82
AnotherReader
Chrispy_Yeah, I'm not sure MCDs use a lot of power TBH. The 7900GRE is most likely lower power because it's clocked so slow relative to the 7900XT.

I hope I'm wrong; I'm expecting just shy of 6800 XT performance, and my own 6800 XT is undervolted and downclocked to 2133 MHz to run at 200 W (reported for core only), so closer to 235 W actual. That level of performance would be fine if it was priced at the $450+ mark and came with a 230 W stock TDP before tuning. I'd like to think that such a TDP could be tuned down to 170 W or so...
The MCDs themselves are unlikely to consume much power, but the attached memory also consumes power. According to Micron, GDDR6's average power consumption is 7.5 pJ per bit. This means that for a 7900 XTX, the total off-chip power consumption may be as high as about 99 W:
  1. 7.5 pJ/bit * 20 Gbps * 384 bits = 57.6 W
  2. 0.4 pJ/bit * 5.3 TB/s * 8 bits/byte = 17 W
  3. 2 W per device * 12 devices = 24 W
Subtracting that from the 7900 XTX's 355 W board power yields a GCD power consumption of about 257 W. The actual figures are probably a little higher for the GCD and lower for the off-chip interface, as the memory interface is unlikely to see 100% utilization because of the 96 MB last level cache. Now, even though there will be parts of Navi 32 that consume about the same power as in Navi 31, e.g. the front-end, we don't know their power consumption. Therefore, we can estimate Navi 32 power consumption as the sum of the following:
  1. 7.5 pJ/bit * 18 Gbps * 256 bits = 34.6 W
  2. 0.4 pJ/bit * 5.3 TB/s * (2/3) * 8 bits/byte = 11.3 W
  3. 2 W per device * 8 devices = 16 W
  4. (60/96) * 257.4 W = 160.9 W for the GCD
This comes to about 223 W for Navi 32 if the clocks are the same as Navi 31.
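The same estimate, written out as a small Python sketch. Assumptions carried over from above: 355 W reference board power for the 7900 XTX, Micron's 7.5 pJ/bit GDDR6 figure, roughly 0.4 pJ/bit for the fan-out links, 5.3 TB/s peak GCD-to-MCD bandwidth, and about 2 W of background power per DRAM device.

  # Rough Navi 31 -> Navi 32 power scaling, following the estimate above.
  PJ = 1e-12

  def offchip_power(mem_gbps, bus_bits, link_tb_s, devices):
      gddr6 = 7.5 * PJ * mem_gbps * 1e9 * bus_bits  # GDDR6 interface power
      links = 0.4 * PJ * link_tb_s * 1e12 * 8       # GCD<->MCD fan-out links (bytes -> bits)
      dram = 2.0 * devices                          # per-device background power
      return gddr6 + links + dram

  navi31_offchip = offchip_power(20, 384, 5.3, 12)         # ~98.6 W
  navi31_gcd = 355 - navi31_offchip                        # ~256 W left for the GCD
  navi32_gcd = (60 / 96) * navi31_gcd                      # scale GCD power by CU count
  navi32_offchip = offchip_power(18, 256, 5.3 * 2 / 3, 8)  # ~62 W
  print(round(navi31_gcd), round(navi32_gcd + navi32_offchip))  # ~256 W and ~222 W

That lands within a watt or two of the 223 W figure above; the difference is just rounding.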
Posted on Reply
#83
Chrispy_
AnotherReaderThe MCDs themselves are unlikely to consume much power, but the attached memory also consumes power. According to Micron, GDDR6's average power consumption is 7.5 pJ per bit. This means that for a 7900 XTX, the total off-chip power consumption may be as high as about 99 W:
  1. 7.5 pJ/bit * 20 Gbps * 384 bits = 57.6 W
  2. 0.4 pJ/bit * 5.3 TB/s * 8 bits/byte = 17 W
  3. 2 W per device * 12 devices = 24 W
Subtracting that from the 7900 XTX's 355 W board power yields a GCD power consumption of about 257 W. The actual figures are probably a little higher for the GCD and lower for the off-chip interface, as the memory interface is unlikely to see 100% utilization because of the 96 MB last level cache. Now, even though there will be parts of Navi 32 that consume about the same power as in Navi 31, e.g. the front-end, we don't know their power consumption. Therefore, we can estimate Navi 32 power consumption as the sum of the following:
  1. 7.5 pJ/bit * 18 Gbps * 256 bits = 34.6 W
  2. 0.4 pJ/bit * 5.3 TB/s * (2/3) * 8 bits/byte = 11.3 W
  3. 2 W per device * 8 devices = 16 W
  4. (60/96) * 257.4 W = 160.9 W for the GCD
This comes to about 223 W for Navi 32 if the clocks are the same as Navi 31.
The memory clocks will likely be the same, but the GCD core clocks are unlikely to be, which makes guessing the overall TDP nothing more than a crapshoot until we have final clocks. Your math looks sound though.
Posted on Reply
#84
dalekdukesboy
Beginner Micro DeviceSource? I probably was too late for that party since I only heard them stating their flagship 7900 XTX is a 4080's competition which is fair enough.
Frig. I saw the article on here about the 7900 XTX taking a swing at the 4090, but of course now I can't find it. It was just before the 7900 XTX released, either November or December of 2022.

www.techpowerup.com/311978/amd-confirms-new-enthusiast-class-radeon-7000-series-graphics-cards-this-quarter

This is the article recently somewhat confirming, or at least speculating, that the 7950 will exist… Still looking for the older article on here that talked about the 7900 XTX swinging for the 4090.
Posted on Reply
#85
80-watt Hamster
Darmok N JaladAnd you can get a 6650 XT for as low as $249 right now.

I do wonder if we're in a spot where games are outpacing hardware. For example, the RX 560 was promoted as a 1080p gamer, as were the 5500 XT and the 6500 XT. It seems we're sliding up a tier to get "1080p gaming." It takes an x7x0-series card to get something billed as a 1440p card.
Games are absolutely outpacing hardware. Every AAA release is trying to be Crysis. It's why I keep going on in almost every thread like this about expectations, product naming and power envelopes.

Until Ampere, >$1000 halo cards would get released and most of us would go, "Cool. Anyway..." Now there's this tacit expectation of 4k60U for anything branded x7xx or higher. But now that x9xx performance has been normalized, big new releases keep pushing the envelope so the halo buyers feel like they got their money's worth. Which means the high end buyers don't get 4k60U anymore, and stuff even lower down the stack gets slagged off as trash when it doesn't meet similarly inflated performance targets. An x7xx card shouldn't sell a single unit at $600 (but witness the 4070) or pull over 200W (hello 3070). Yet here we are, because gamers and eye candy are like kids in a candy store with a no-limit gift card.
Posted on Reply
#86
ratirt
80-watt HamsterGames are absolutely outpacing hardware. Every AAA release is trying to be Crysis. It's why I keep going on in almost every thread like this about expectations, product naming and power envelopes.

Until Ampere, >$1000 halo cards would get released and most of us would go, "Cool. Anyway..." Now there's this tacit expectation of 4k60U for anything branded x7xx or higher. But now that x9xx performance has been normalized, big new releases keep pushing the envelope so the halo buyers feel like they got their money's worth. Which means the high end buyers don't get 4k60U anymore, and stuff even lower down the stack gets slagged off as trash when it doesn't meet similarly inflated performance targets. An x7xx card shouldn't sell a single unit at $600 (but witness the 4070) or pull over 200W (hello 3070). Yet here we are, because gamers and eye candy are like kids in a candy store with a no-limit gift card.
I agree with you to a degree. I agree that people have been having higher expectations of the supposedly mid-tier cards, like the 3070 or 4070, etc., but I think it is for a reason. In my eyes the reason is the price. Mid-tier cards did not cost as much as these do now. Maybe that is why people started expecting more from these cards, since they cost a lot more. I get your point, but I'm thinking about mid-tier cards reaching a price mark of $800-$900. That is a lot of money for something that is not even maxing out games in raster, wouldn't you say? Then you have the cards above that, which can tackle RT in one way or another, which you either want to experience or you don't (like me; I am in the "I don't care about RT" category).
Posted on Reply
#87
thepath
This thing has the same CU count as the RX 6800 and fewer than the 6800 XT.

It will probably perform like a slightly overclocked RX 6800 at best. I doubt that it will match the 6800 XT.

Remember the RX 7600 has the same CU count as the 6600 XT/6650 XT... and the 6650 XT performs almost the same as the 7600...
Posted on Reply
#88
AnotherReader
thepathThis thing has the same CU count as the RX 6800 and fewer than the 6800 XT.

It will probably perform like a slightly overclocked RX 6800 at best. I doubt that it will match the 6800 XT.

Remember the RX 7600 has the same CU count as the 6600 XT/6650 XT... and the 6650 XT performs almost the same as the 7600...
The RX 7600 is clocked lower than the 6650 XT while this should be clocked substantially higher than the RX 6800. I expect this to at least match the 6800 XT and maybe even the 6900 XT. Its reception will depend on the price.
Posted on Reply
#89
Eskimonster
I'm not gonna guess what's what, but I want a new GPU, hoping this is cheap.
Specs look nice, "albeit" I should buy a Geefarce to complement my screen's G-Sync, but I can ignore that if the price is low 'nuff.
Posted on Reply
#90
thepath
AnotherReaderThe RX 7600 is clocked lower than the 6650 XT while this should be clocked substantially higher than the RX 6800. I expect this to at least match the 6800 XT and maybe even the 6900 XT. Its reception will depend on the price.
The 7600 has a slightly higher boost frequency and memory speed than the 6650 XT.

Also, the 7800 XT has a Game clock of 2210 MHz and a 2565 MHz boost according to the article, which does not look high enough to make up for the CU difference (the 6800 XT has 20% more cores).

It will barely match the 6800 XT...
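For a rough sense of scale, here's the naive CU-times-clock comparison in Python; the 2250 MHz boost for the 6800 XT is AMD's reference figure and the 7800 XT clocks are the ones from the article, so treat this as a back-of-the-envelope sketch rather than a performance prediction:

  # Naive raw shader throughput ratio: compute units x boost clock.
  cu_7800xt, boost_7800xt = 60, 2565   # MHz, from the article's spec sheet
  cu_6800xt, boost_6800xt = 72, 2250   # MHz, AMD reference boost for the 6800 XT
  ratio = (cu_7800xt * boost_7800xt) / (cu_6800xt * boost_6800xt)
  print(f"{ratio:.2f}x")               # -> 0.95x, i.e. slightly behind on paper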
Posted on Reply
#91
AnotherReader
thepathThe 7600 has a slightly higher boost frequency and memory speed than the 6650 XT.

Also, the 7800 XT has a Game clock of 2210 MHz and a 2565 MHz boost according to the article, which does not look high enough to make up for the CU difference (the 6800 XT has 20% more cores).

It will barely match the 6800 XT...
Don't go by the AMD specs. Look at actual reviews. TPU only reviewed one 6650 XT, and that one is clocked substantially higher than an RX 7600: 2699 MHz in Cyberpunk vs. 2525 MHz for the newer card.
Posted on Reply