
PowerColor AMD Radeon RX 7800 XT Pictured, Confirmed Based on "Navi 32"

Joined
Apr 14, 2022
Messages
672 (0.86/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Asus XG35VQ
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Software Windows 11 64bit
Joined
Feb 24, 2023
Messages
2,301 (4.96/day)
Location
Russian Wild West
System Name DLSS / YOLO-PC
Processor i5-12400F / 10600KF
Motherboard Gigabyte B760M DS3H / Z490 Vision D
Cooling Laminar RM1 / Gammaxx 400
Memory 32 GB DDR4-3200 / 16 GB DDR4-3333
Video Card(s) RX 6700 XT / RX 480 8 GB
Storage A couple SSDs, m.2 NVMe included / 240 GB CX1 + 1 TB WD HDD
Display(s) Compit HA2704 / Viewsonic VX3276-MHD-2
Case Matrexx 55 / Junkyard special
Audio Device(s) Want loud, use headphones. Want quiet, use satellites.
Power Supply Thermaltake 1000 W / FSP Epsilon 700 W / Corsair CX650M [backup]
Mouse Don't disturb, cheese eating in progress...
Keyboard Makes some noise. Probably onto something.
VR HMD I live in real reality and don't need a virtual one.
Software Windows 10 and 11
You mean that now there are one or two more demanding games, released after 2021, which need some settings adjustments in order to run at 4K on the card?
I mean the RX 6800 XT had at least one game with weak 4K performance from the start (Cyberpunk 2077; you needed to lower everything to mid-low to get a stable 60 FPS). After three years, we have even more such games. This means the card has ultimately dropped off the 4K GPU list. It's still a no-nonsense, powerful GPU, but if you want limitless, or at least not crucifictional, 4K, you should get yourself something faster, a 7900 XTX at least.

All I care about is... will there be a 7950 XTX, and will it compete with the 4090 and above from Nvidia?
No. Almost a year ago, AMD clearly stated they are not interested in competing with the 4090 anytime soon. We will see an AMD GPU which outperforms the RTX 4090 no sooner than 2026.
 
Joined
Jul 20, 2020
Messages
868 (0.61/day)
System Name Gamey #1 / #2
Processor Ryzen 7 5800X3D / Core i7-9700F
Motherboard Asrock B450M P4 / Asrock B360M P4
Cooling IDCool SE-226-XT / CM Hyper 212
Memory 32GB 3200 CL16 / 32GB 2666 CL14
Video Card(s) PC 6800 XT / Soyo RTX 2060 Super
Storage 4TB Team MP34 / 512G Tosh RD400+2TB WD3Dblu
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / CM N200
Audio Device(s) Dragonfly Black
Power Supply EVGA 650 G3 / Corsair CX550M
Mouse JSCO JNL-101k Noiseless
Keyboard Steelseries Apex 3 TKL
Software Win 10, Throttlestop
I mean the RX 6800 XT had at least one game with weak 4K performance from the start (Cyberpunk 2077; you needed to lower everything to mid-low to get a stable 60 FPS). After three years, we have even more such games. This means the card has ultimately dropped off the 4K GPU list. It's still a no-nonsense, powerful GPU, but if you want limitless, or at least not crucifictional, 4K, you should get yourself something faster, a 7900 XTX at least.

Implying that being unable to run one or even two or three games at 4K Ultra 60 FPS is stretching the exclusionary definition of a 4K card a bit, as that would mean there were zero 4K video cards available when Cyberpunk came out, and still zero for almost two years until the 4090 and 7900 XTX arrived. Being able to run the majority, but not all, of current AAA titles at 4K Ultra/60 is a reasonable standard.

That implies that the 4070, 6800 XT and 3080 are 4K cards by the graph below, but it's skewed by very high FPS in Doom Eternal as well as Battlefield and a few others. So I've generally targeted 75 FPS on a mix of games like this as a reasonable target to account for that, which makes the 3090, 4070 Ti and 7900 XT the "real" 4K cards. If the 7800 XT somehow slots in above the 6900 XT, then IMO it qualifies.

 
somehow slots in above the 6900 XT
Only if you OC everything outta it. The 7800 XT is very weak on CU count, having only 60 of them. The IPC of RDNA3 is about 3 percent higher than RDNA2's (if it's higher at all), so it's essentially an overclocked RX 6800, the non-XT one. This GPU will have a very hard time catching up to the 6800 XT, never mind how far behind the 6900 XT it is.
Implying that being unable to run one or even 2 or 3 games at 4K Ultra 60fps is stretching the exclusional definition of 4K card a bit as that means there were zero 4K video cards available when Cyberpunk came out and continued to be zero 4K cards for almost 2 years until the 4090 and 7900 XTX arrived
Almost, yes. There have been no ultimate 4K GPUs. The RX 6800 series was an entry-level 4K line-up, whereas the 6900 XT was a good 4K GPU. Now, three years later, the 6800 series is great at 1440p and acceptable at 4K, but far from impressive, to say the least. TLOU, Hogwarts Legacy, Cyberpunk, Jedi: Survivor... We already have four games which are unplayable on such GPUs at 4K Ultra. And the list will grow.

by the graph below
Average FPS isn't everything. I'd rather measure minimum FPS: whatever card holds 60+ there is a real 4K GPU. So yes, if you want REAL 4K, you should've got yourself at least a 6900 XT.
 
Joined
Mar 7, 2007
Messages
1,418 (0.23/day)
Processor E5-1680 V2
Motherboard Rampage IV black
Video Card(s) Asrock 7900 xtx
Storage 500 gb sd
Software windows 10 64 bit
Benchmark Scores 29,433 3dmark06 score
I mean the RX 6800 XT had at least one game with weak 4K performance from the start (Cyberpunk 2077; you needed to lower everything to mid-low to get a stable 60 FPS). After three years, we have even more such games. This means the card has ultimately dropped off the 4K GPU list. It's still a no-nonsense, powerful GPU, but if you want limitless, or at least not crucifictional, 4K, you should get yourself something faster, a 7900 XTX at least.


No. Almost a year ago, AMD clearly stated they are not interested in competing with 4090 anytime soon. We will see an AMD GPU which outperforms RTX 4090 no sooner than in 2026.
Lame, truly lame. AMD said just before the 7900 XTX launched that it was very competitive with the 4090… now they've just thrown their hands up and, as you put it, are not "interested" in competing with the 4090. Like they just gave up!!
 
AMD said just before 7900xtx launched it was very competitive with 4090
Source? I was probably too late for that party, since I only heard them stating their flagship 7900 XTX competes with the 4080, which is fair enough.
 
Average FPS isn't everything. I'd rather measure minimum FPS: whatever card holds 60+ there is a real 4K GPU. So yes, if you want REAL 4K, you should've got yourself at least a 6900 XT.

100% agreed on average FPS, and even that 62 FPS minimum for the 6900 XT IMO isn't enough, as it means about half the games can't maintain 60 FPS minimums (keeping in mind how Doom Eternal is skewing all the numbers). I generally shoot for a 75 FPS all-game average of minimums, as that means there'll be a few games that still dip below 60, but judicious reductions in some settings will get those toughest cases to 60 all the time. Which sets the 7900 XT, 3090 Ti and 4080 as the minimum true 4K GPUs for today's games, for me.

I use a 6800 XT at 1440p, and while its average minimum of 97 FPS is quite a bit higher than my arbitrary 75, there's one game (Atomic Heart) that already doesn't hit a 60 FPS minimum. But it's just one, and I can reduce a setting to reach it if needed.
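The rule of thumb above, requiring roughly a 75 FPS average of per-game minimums so that only a couple of titles dip under 60, is easy to apply to any set of review numbers. A minimal sketch, where every FPS figure is invented purely for illustration (not TPU data):

```python
# Hypothetical per-game minimum-FPS results for one card at 4K.
# The numbers are made up for illustration, not taken from any review.
mins = {
    "Game A": 110, "Game B": 95, "Game C": 72,
    "Game D": 58, "Game E": 88, "Game F": 54,
}

avg_min = sum(mins.values()) / len(mins)
below_60 = [game for game, fps in mins.items() if fps < 60]

# The post's heuristic: average of minimums >= 75 qualifies as a
# "true 4K" card, accepting a few outliers that need settings tweaks.
is_4k_card = avg_min >= 75

print(f"avg of minimums: {avg_min:.1f}, dips below 60: {below_60}")
```

With this made-up data the card averages 79.5 FPS and passes the bar, while two titles would still need a setting or two reduced to hold 60.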
 
Joined
Nov 26, 2021
Messages
1,372 (1.49/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Yeah, I'm not sure the MCDs use a lot of power, TBH. The 7900 GRE most likely draws less power because it's clocked so low relative to the 7900 XT.

I hope I'm wrong; I'm expecting just shy of 6800 XT performance, and my own 6800 XT is undervolted and downclocked to 2133 MHz to run at 200 W (reported for the core only), so closer to 235 W actual. That level of performance would be fine if it were priced in the $450 range and came with a 230 W stock TDP before tuning. I'd like to think that such a TDP could be tuned down to 170 W or so...
The MCDs themselves are unlikely to consume much power, but the attached memory also consumes power. According to Micron, GDDR6's average power consumption is 7.5 pJ per bit. This means that for a 7900 XTX, the total off-chip power consumption may be as high as about 99 W:
  1. 7.5 pJ/bit * 20 Gbps * 384 bits = 57.6 W
  2. 0.4 pJ/bit * 5.3 TB/s * 8 bits/byte = 17 W
  3. 2 W per device * 12 devices = 24 W
Subtracting this from the 7900 XTX's 355 W board power yields a GCD power consumption of about 257 W. The actual figures are probably a little higher for the GCD and lower for the off-chip interface, as the memory interface is unlikely to see 100% utilization because of the 96 MB last-level cache. Now, even though there will be parts of Navi 32 that consume about the same power as in Navi 31, e.g. the front-end, we don't know their power consumption. Therefore, we can estimate Navi 32 power consumption as the sum of the following:
  1. 7.5 pJ/bit * 18 Gbps * 256 bits = 34.6 W
  2. 0.4 pJ/bit * 5.3 TB/s * (2/3) * 8 bits/byte = 11.3 W
  3. 2 W per device * 8 devices = 16 W
  4. (60/96) * 257.4 = 160.9 W for the GCD
This comes to about 223 W for Navi 32 if the clocks are the same as Navi 31.
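The arithmetic above can be sketched as a small script. The per-bit energies, link bandwidth, and per-device figures are the assumptions stated in the post; the 355 W total board power for the 7900 XTX, used to back out the GCD's share, is the official spec:

```python
# Rough board-power decomposition for Navi 31 / Navi 32, following the
# estimate above. All pJ/bit and W/device rates are the assumed figures
# from the post; 355 W is the 7900 XTX's official total board power.

def dram_io_w(pj_per_bit, gbps_per_pin, bus_width_bits):
    # pJ/bit * (Gb/s per pin) * pins -> watts (1e-12 * 1e9 = 1e-3)
    return pj_per_bit * gbps_per_pin * bus_width_bits * 1e-3

def link_w(pj_per_bit, tb_per_s, scale=1.0):
    # GCD<->MCD fan-out links: pJ/bit * TB/s * 8 bits/byte -> watts
    return pj_per_bit * tb_per_s * scale * 8  # 1e-12 * 1e12 cancels

# Navi 31 (7900 XTX): 20 Gbps GDDR6 on a 384-bit bus, 12 DRAM devices
n31_offchip = dram_io_w(7.5, 20, 384) + link_w(0.4, 5.3) + 2 * 12
n31_gcd = 355 - n31_offchip          # remainder attributed to the GCD

# Navi 32: 18 Gbps on 256-bit, 8 devices, 60/96 of the CUs, and link
# bandwidth scaled by 2/3 (four MCDs instead of six)
n32 = (dram_io_w(7.5, 18, 256) + link_w(0.4, 5.3, 2 / 3)
       + 2 * 8 + (60 / 96) * n31_gcd)

print(f"Navi 31 off-chip: {n31_offchip:.1f} W, GCD: {n31_gcd:.1f} W")
print(f"Navi 32 estimate: {n32:.1f} W")
```

Running it reproduces the figures above to within a watt (~99 W off-chip, ~256 W GCD, ~222 W for Navi 32 at equal clocks), with the small differences down to rounding in the intermediate steps.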
 
Joined
Feb 20, 2019
Messages
7,487 (3.88/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
The MCDs themselves are unlikely to consume much power, but the attached memory also consumes power. According to Micron, GDDR6's average power consumption is 7.5 pJ per bit. This means that for a 7900 XTX, the total off-chip power consumption may be as high as about 99 W.
  1. 7.5 pJ/bit *20 Gbps * 384 bits = 57.6 W
  2. 0.4 pJ/bit *5.3 TB/s * 8 bits/byte = 17 W
  3. 2 W per device* 12 devices = 24 W
This yields a GCD power consumption of about 257 W. The actual figures are probably a little higher for the GCD and lower for the off-chip interface as the memory interface is unlikely to see 100% utilization because of the 96 MB last level cache. Now, even though there will be parts of Navi 32 that consume about the same power as in Navi 31, e.g. the front-end, we don't know their power consumption. Therefore, we can estimate Navi 32 power consumption as the sum of the following:
  1. 7.5 pJ/bit *18 Gbps * 256 bits = 34.6 W
  2. 0.4 pJ/bit *5.3 TB/s * (2/3)*8 bits/byte = 11.3 W
  3. 2 W per device* 8 devices = 16 W
  4. (60/96)*257.4 = 160.9 W for the GCD
This comes to about 223 W for Navi 32 if the clocks are the same as Navi 31.
The memory clocks will likely be the same, but the GCD core clocks are unlikely to be, which makes guessing the overall TDP nothing more than a crapshoot until we have final clocks. Your math looks sound though.
 
Source? I probably was too late for that party since I only heard them stating their flagship 7900 XTX is a 4080's competition which is fair enough.
Frig. I saw the article on here about the 7900 XTX taking a swing at the 4090, but of course now I can't find it. It was just before the 7900 XTX released, either November or December of 2022.


This is the article recently somewhat confirming, or speculating, that the 7950 will exist… Still looking for the older article on here that talked about the 7900 XTX swinging for the 4090.
 
Joined
Aug 21, 2015
Messages
1,680 (0.52/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marthon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
And you can get a 6650 XT for as low as $249 right now.

I do wonder if we're in a spot where games are outpacing hardware. For example, the RX 560 was promoted as a 1080p gamer; so were the 5500 XT and the 6500 XT. It seems we're sliding up a tier to get "1080p gaming." It takes an x7x0-series card to get something billed as a 1440p card.

Games are absolutely outpacing hardware. Every AAA release is trying to be Crysis. It's why I keep going on in almost every thread like this about expectations, product naming and power envelopes.

Until Ampere, >$1000 halo cards would get released and most of us would go, "Cool. Anyway..." Now there's this tacit expectation of 4k60U for anything branded x7xx or higher. But now that x9xx performance has been normalized, big new releases keep pushing the envelope so the halo buyers feel like they got their money's worth. Which means the high end buyers don't get 4k60U anymore, and stuff even lower down the stack gets slagged off as trash when it doesn't meet similarly inflated performance targets. An x7xx card shouldn't sell a single unit at $600 (but witness the 4070) or pull over 200W (hello 3070). Yet here we are, because gamers and eye candy are like kids in a candy store with a no-limit gift card.
 
Joined
May 31, 2016
Messages
4,340 (1.48/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 16GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500MB/ Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtec 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Games are absolutely outpacing hardware. Every AAA release is trying to be Crysis. It's why I keep going on in almost every thread like this about expectations, product naming and power envelopes.

Until Ampere, >$1000 halo cards would get released and most of us would go, "Cool. Anyway..." Now there's this tacit expectation of 4k60U for anything branded x7xx or higher. But now that x9xx performance has been normalized, big new releases keep pushing the envelope so the halo buyers feel like they got their money's worth. Which means the high end buyers don't get 4k60U anymore, and stuff even lower down the stack gets slagged off as trash when it doesn't meet similarly inflated performance targets. An x7xx card shouldn't sell a single unit at $600 (but witness the 4070) or pull over 200W (hello 3070). Yet here we are, because gamers and eye candy are like kids in a candy store with a no-limit gift card.
I agree with you to a degree: people have had higher expectations of the supposedly mid-tier cards, like the 3070 or 4070, but I think there's a reason for it. In my eyes, the reason is the price. Mid-tier cards did not cost as much as these do now; maybe that is why people started expecting more from them. I get your point, but I'm thinking about mid-tier cards reaching a price mark of $800-$900. That is a lot of money for something that can't even max out raster games, wouldn't you say? Then you have the cards above, which can tackle RT in one way or another, which you either want to experience or you don't (like me; I am in the "I don't care about RT" category).
 
Joined
Jan 13, 2020
Messages
25 (0.02/day)
This thing has the same CU count as the RX 6800, and fewer than the 6800 XT.

It will probably perform like a slightly overclocked RX 6800 at best. I doubt that it will match the 6800 XT.

Remember, the RX 7600 has the same CU count as the 6600 XT/6650 XT... and the 6650 XT performs almost the same as the 7600...
 
This thing has same CU as RT 6800 and less than 6800XT

It will probably perform like slightly overclocked RX 6800 at best. I doubt that it will match 6800XT

Remember RX 7600 has same CU as 6600XT/6650XT.... and 6650XT perform almost same as 7600....
The RX 7600 is clocked lower than the 6650 XT while this should be clocked substantially higher than the RX 6800. I expect this to at least match the 6800 XT and maybe even the 6900 XT. Its reception will depend on the price.
 
Joined
Mar 24, 2019
Messages
620 (0.33/day)
Location
Denmark - Aarhus
System Name Iglo
Processor 5800X3D
Motherboard TUF GAMING B550-PLUS WIFI II
Cooling Arctic Liquid Freezer II 360
Memory 32 gigs - 3600hz
Video Card(s) EVGA GeForce GTX 1080 SC2 GAMING
Storage NvmE x2 + SSD + spinning rust
Display(s) BenQ XL2420Z - lenovo both 27" and 1080p 144/60
Case Fractal Design Meshify C TG Black
Audio Device(s) Logitech Z-2300 2.1 200w Speaker /w 8 inch subwoofer
Power Supply Seasonic Prime Ultra Platinum 550w
Mouse Logitech G900
Keyboard Corsair k100 Air Wireless RGB Cherry MX
Software win 10
Benchmark Scores Super-PI 1M T: 7,993 s :CinebR20: 5755 point GeekB: 2097 S-11398-M 3D :TS 7674/12260
I'm not gonna guess what's what, but I want a new GPU, hoping this is cheap.
Specs look nice, "albeit" I should buy a Geefarce to complement my screen's G-Sync, but I can ignore that if the price is low 'nuff.
 
The RX 7600 is clocked lower than the 6650 XT while this should be clocked substantially higher than the RX 6800. I expect this to at least match the 6800 XT and maybe even the 6900 XT. Its reception will depend on the price.

The 7600 has a slightly higher boost frequency and memory speed than the 6650 XT.

Also, the 7800 XT has a game clock of 2210 MHz and a 2565 MHz boost according to the article, which does not look high enough to make up for the CU difference (the 6800 XT has 20% more cores).

It will barely match the 6800 XT...
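A crude way to check this claim is a CUs-times-clock proxy. The 7800 XT figures are the article's leaked clocks; the 6800 XT's 2250 MHz boost is its official spec. This deliberately ignores IPC, bandwidth, and power limits, so treat it as a back-of-the-envelope sketch only:

```python
# Naive compute proxy: CU count x boost clock (MHz). 7800 XT clocks
# are the leaked figures from the article; 6800 XT boost is official.
# Ignores IPC differences, memory bandwidth, and power limits.
cards = {
    "7800 XT (leaked)": (60, 2565),
    "6800 XT":          (72, 2250),
}

proxy = {name: cu * mhz for name, (cu, mhz) in cards.items()}
ratio = proxy["7800 XT (leaked)"] / proxy["6800 XT"]

print(f"7800 XT / 6800 XT compute proxy: {ratio:.2f}")
```

The proxy lands at 0.95, i.e. about 5% short of the 6800 XT on raw shader throughput, which is consistent with "barely match" unless RDNA3's IPC or bandwidth makes up the gap.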
 
7600 has slightly higher boost frequency and memory speed than 6650XT

ALso, 7800XT has Game clocks of 2210 MHz, and 2565 MHz boost according to article which does not look high enough to make up for CU difference (6800XT has 20% more cores)

It will barely match 6800XT....
Don't go by the AMD specs; look at actual reviews. TPU has only reviewed one 6650 XT, and that one is clocked substantially higher than an RX 7600: 2699 MHz in Cyberpunk vs. 2525 MHz for the newer card.
 