
AMD Radeon RX 8800 XT Reportedly Features 220 W TDP, RDNA 4 Efficiency

Joined
Sep 17, 2014
Messages
22,499 (6.03/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Well, that is your opinion. I enjoyed CrossFire support so much that most of the games I bought at the time supported CrossFire. Multi-GPU is not the same thing as CrossFire and has no impact on games; Ashes of the Singularity is the only game I know of that supports multi-GPU natively. The thing with Polaris was that CrossFire worked at the driver level, so if the game supported it, it worked, and if not, the other card would basically be turned off.
That 'thing' was bog standard for every CrossFire- and SLI-capable GPU, which meant most of the time you would clearly notice that you were actually running on one card, and if you didn't, it was clearly audible too because of all the extra heat and noise :)

Nvidia's driver even lets me pick SFR or AFR. Not that it matters though; no game support = you are looking at something arcane that doesn't work or literally renders half the screen.
 
Joined
Oct 5, 2018
Messages
28 (0.01/day)
Processor Ryzen 7 5800X3D
Motherboard Gigabyte B550 AORUS ELITE V2 rev 1.2
Cooling be quiet! Silent Loop 2
Memory 2x16GB 3200 A-DATA DDR4
Video Card(s) Asus 6700 XT
Storage OCZ Vertex 4 120GB + Samsung 2TB 980 Pro
Display(s) Asus ROG Strix XG309CM
Case be quiet! Silent Base 601
Power Supply be quiet! Straight Power 11 750W (BN307)
Mouse Logitech G403 Hero
Keyboard Logitech G710+
Rumors about AMD GPUs never pan out, so this vision is more a dream card than a real one. Even the rumor about dual 8-pin connectors doesn't fit this vision, since that could end up as a GPU with a TGP around 300 W... better to wait instead of overhyping this GPU like the many previous miracle AMD GPUs that never materialized.
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
Can the TPU team review the image quality of videos encoded in AV1 with the new VGAs from Intel, AMD and Nvidia? Please...

Is it possible to encode videos in 2 passes using the GPU? If so, which app does it?
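Not an answer to the review request, but on the two-pass question: as far as I know, classic two-pass rate control in ffmpeg is a software-encoder feature (libx264 shown below; the AV1 software encoders work similarly), while GPU encoders such as NVENC only offer single-pass rate control with an optional multi-pass lookahead mode rather than a true second pass over the whole file. A minimal sketch, assuming hypothetical file names and a 6 Mbps target:

    import subprocess

    SRC, OUT, BITRATE = "input.mp4", "output.mp4", "6M"  # hypothetical example values

    # Pass 1: analysis only - discard the video output, write the stats file.
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264", "-b:v", BITRATE,
                    "-pass", "1", "-an", "-f", "null", "/dev/null"], check=True)

    # Pass 2: encode for real, using the stats gathered in pass 1.
    subprocess.run(["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264", "-b:v", BITRATE,
                    "-pass", "2", "-c:a", "copy", OUT], check=True)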

That, but also standard image-quality testing, to see which brand of cards cheats with textures and which does not (yeah, I'm looking at you, Nvidia :D)


Rumors about AMD GPUs never pan out, so this vision is more a dream card than a real one. Even the rumor about dual 8-pin connectors doesn't fit this vision, since that could end up as a GPU with a TGP around 300 W... better to wait instead of overhyping this GPU like the many previous miracle AMD GPUs that never materialized.

It is about smart engineering and AI. I know there is no smart engineering at AMD, but maybe this will be the first time they implement it.
It's called undervolting; it is pretty simple and straightforward, and can easily be done at the factory. The trade-off: you lose about 2% of performance, but the card's TDP drops from 300 W to a sane 180 W.
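For what it's worth, here is a back-of-the-envelope check of that trade-off under the usual dynamic-power approximation P ≈ f·V² (leakage and fixed board power ignored, so it flatters the undervolt); the wattage figures are the ones from the claim above:

    # Rough sketch: what a 300 W -> 180 W drop at -2% clocks would imply
    # under the P ~ f * V^2 approximation (optimistic - ignores leakage).
    P_STOCK, P_TARGET = 300.0, 180.0   # watts, from the claim above
    F_SCALE = 0.98                     # "lose 2% of performance" read as 2% lower clocks

    v_scale = (P_TARGET / (P_STOCK * F_SCALE)) ** 0.5
    print(f"required voltage scale: {v_scale:.3f}")   # ~0.78, i.e. roughly a 22% undervolt

A roughly 22% voltage cut is far more headroom than typical silicon has, so take the 300 W to 180 W figure with a grain of salt.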
 
Joined
Nov 26, 2021
Messages
1,670 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
Rumors about AMD GPUs never pan out, so this vision is more a dream card than a real one. Even the rumor about dual 8-pin connectors doesn't fit this vision, since that could end up as a GPU with a TGP around 300 W... better to wait instead of overhyping this GPU like the many previous miracle AMD GPUs that never materialized.
Two 8-pin connectors are used on the 7700 XT as well, which isn't even a 250 W card.
 
Joined
Jun 2, 2017
Messages
9,252 (3.37/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
That 'thing' was bog standard for every CrossFire- and SLI-capable GPU, which meant most of the time you would clearly notice that you were actually running on one card, and if you didn't, it was clearly audible too because of all the extra heat and noise :)

Nvidia's driver even lets me pick SFR or AFR. Not that it matters though; no game support = you are looking at something arcane that doesn't work or literally renders half the screen.
There were about four settings for CrossFire. I used it for the life of Total War, from Medieval 2 to Three Kingdoms, when they changed the engine. You could still enable CrossFire in the script, but it had no effect on the new engine. At that point I started giving up on multi-GPU and started focusing on other things to put in my other PCIe slots. I guess the heat you are talking about is true if you don't have a case that is up to snuff, but we are talking about two RX 570/580 combos that might have pulled 150 watts each. Plus, they were inexpensive and popular.
 
Joined
Jun 2, 2017
Messages
9,252 (3.37/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
I have a 7600 XT from ASRock. It is not a power-hungry card, but it comes with two 8-pin connectors. I do believe that is going to be the standard for everyone else.
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
I have a 7600 XT from ASRock. It is not a power-hungry card, but it comes with two 8-pin connectors. I do believe that is going to be the standard for everyone else.

RX 7600 is either single 6-pin or single 8-pin.
 
Joined
Sep 26, 2022
Messages
2,086 (2.60/day)
Location
Brazil
System Name G-Station 1.17 FINAL
Processor AMD Ryzen 7 5700X3D
Motherboard Gigabyte X470 Aorus Gaming 7 WiFi
Cooling DeepCool AK620 Digital
Memory Asgard Bragi DDR4-3600CL14 2x16GB
Video Card(s) Sapphire PULSE RX 7900 XTX
Storage 240GB Samsung 840 Evo, 1TB Asgard AN2, 2TB Hiksemi FUTURE-LITE, 320GB+1TB 7200RPM HDD
Display(s) Samsung 34" Odyssey OLED G8
Case Thermaltake Level 20 MT
Audio Device(s) Astro A40 TR + MixAmp
Power Supply Cougar GEX X2 1000W
Mouse Razer Viper Ultimate
Keyboard Razer Huntsman Elite (Red)
Software Windows 11 Pro
That is a design mistake, because there is no reason to put 375 watts' worth of connectors (2 × 150 W, plus 75 W from the PCIe slot) on a 245 W card.
Why is it a mistake? It doesn't make much difference in space utilization compared to an 8+6-pin combination, and it makes inventory and assembly way easier for them, not having to keep tabs on two different parts.
 
Joined
Nov 26, 2021
Messages
1,670 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
That is a design mistake, because there is no reason to put 375 watts' worth of connectors (2 × 150 W, plus 75 W from the PCIe slot) on a 245 W card.
Surely not every AIB miscalculated. It's simple: 228 W would require 153 W from the 8-pin connector and 75 W from the PCIe slot. In practice, many GPUs draw minuscule amounts from the PCIe slot, and given how many people use third-rate PSUs, it's prudent to avoid drawing more than 150 W from a single 8-pin connector. Two 8-pin connectors make sense when you look at it from that perspective.
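A quick way to see the budget argument in numbers (nominal PCI-SIG limits assumed: ~75 W from the x16 slot, 150 W per 8-pin connector):

    SLOT_W, PIN8_W = 75, 150   # nominal spec limits in watts

    def headroom(board_power_w, n_8pin):
        """Spare watts versus the nominal budget; negative means over budget."""
        return SLOT_W + n_8pin * PIN8_W - board_power_w

    print(headroom(228, n_8pin=1))   # -3  -> one 8-pin plus the slot is already short
    print(headroom(228, n_8pin=2))   # 147 -> two 8-pins leave plenty of margin

And that is before leaving any margin for transient spikes or for drawing little from the slot.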
 
Joined
Jan 27, 2024
Messages
227 (0.73/day)
Processor AMD
Motherboard AMD chipset
Cooling Cool
Memory Fast
Video Card(s) AMD/ATi Radeon | Matrox Ultra high quality
Storage Lexar
Display(s) 4K
Case Transparent left side window
Audio Device(s) Yes
Power Supply Deepcool Gold 750W
Mouse Yes
Keyboard Yes
VR HMD No
Software Windows 10
Benchmark Scores Yes
Why is it a mistake? It doesn't make much difference in space utilization compared to an 8+6-pin combination, and it makes inventory and assembly way easier for them, not having to keep tabs on two different parts.

It is a tough PSU requirement; it should be dual 6-pin instead. Not all PSUs have that many PCIe power connectors, which are ugly, space-consuming, and can't be hidden inside the case.
 
Joined
Nov 27, 2023
Messages
2,402 (6.42/day)
System Name The Workhorse
Processor AMD Ryzen R9 5900X
Motherboard Gigabyte Aorus B550 Pro
Cooling CPU - Noctua NH-D15S Case - 3 Noctua NF-A14 PWM at the bottom, 2 Fractal Design 180mm at the front
Memory GSkill Trident Z 3200CL14
Video Card(s) NVidia GTX 1070 MSI QuickSilver
Storage Adata SX8200Pro
Display(s) LG 32GK850G
Case Fractal Design Torrent (Solid)
Audio Device(s) FiiO E-10K DAC/Amp, Samson Meteorite USB Microphone
Power Supply Corsair RMx850 (2018)
Mouse Razer Viper (Original) on a X-Raypad Equate Plus V2
Keyboard Cooler Master QuickFire Rapid TKL keyboard (Cherry MX Black)
Software Windows 11 Pro (24H2)
@3valatzy
Pretty much every decent-ish PSU will come with at least two 6+2 pin PCI-e cables. Not really an issue.
I mean, if you want easy, simple, one solution to use on any card, well, 12V-2x6 is there to solve that, but I thought nobody liked it because it burns down your dog and kicks your house or something. Even though the revised connector is totally fine.
 
Joined
Aug 21, 2013
Messages
1,908 (0.46/day)
Given the rumoured specifications, 4080 performance is very unlikely. Going by the numbers in the latest GPU review, the 4080 is 42% faster than the 7800 XT at 1440p and 49% faster at 4K. That is too great a gap to be overcome by a 6.7% increase in Compute Units.
I even have doubts about whether it can reach 4070 Ti Super / 7900 XT level raster, because if it only has 64 CUs (TPU's placeholder page even says 56), it will be difficult to close the gap to an 84-CU card, which means surpassing the 7800 XT by ~30% (the difference between the 7800 XT and the 7900 XT).
RT is harder to pin down, as AMD could reap the low-hanging fruit here and massively increase RT performance without increasing the number of RT cores (same count as CUs). Here I can actually believe 4080S-level performance.
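To put numbers on that, here is a naive scaling check (assuming raster scales roughly linearly with CU count, which real GPUs only approximate) of how much faster each CU would need to become for a 64-CU part to erase the 42-49% gap to the 4080 quoted above:

    BASE_CUS, RUMOURED_CUS = 60, 64   # 7800 XT vs rumoured RX 8800 XT

    def required_per_cu_uplift(gap, cus=RUMOURED_CUS):
        """Per-CU (architecture x clock) gain needed to close a given gap."""
        return (1 + gap) / (cus / BASE_CUS)

    print(f"{required_per_cu_uplift(0.42):.2f}x")   # ~1.33x needed to match the 4080 at 1440p
    print(f"{required_per_cu_uplift(0.49):.2f}x")   # ~1.40x needed at 4K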
If it's 45% faster in RT than the 7900 XTX, that makes it basically a 4080/4080S. Since the raster is also similar, I'm calling it: $799 MSRP.
Raster is not similar. Raster is rumoured at ~4070 Ti Super level, though the reported specs don't support even that.
I agree. This is how the hype train starts rolling, and then the inevitable derailment leads to bashing of the actual product, even if it's undeserved. The Compute Unit count and rumoured clock speeds suggest performance in the ballpark of the 7900 XT, not the 4080, and certainly not the 7900 XTX, which is 20% faster than the 7900 XT at 4K.
Glad someone gets it. Already I see people starting to make unrealistic claims. Let's temper our expectations.
AMD hasn't won against Nvidia in over 15 years. The only other small "win" they had was with the R9 290X, and that was very temporary: they were a bit faster than the 780 and Titan, and Nvidia's answer, the 780 Ti, came quickly. I don't count that very temporary win as a W for AMD.
They didn't, and the 290X was temporary?
You need to check your timeline and prices.

Yes, the 290X launched in October 2013, and while Nvidia did release the 780 Ti a month later (the first Titan was already out), those cards were more expensive while not being a whole lot faster. The Titan was only a minuscule 3% faster while costing an obscene (for a gaming card at the time) $999, while the 780 Ti was a more reasonable $699 but still only 4% faster.

The 290X at $549 remained the bang-for-buck choice until Nvidia released the GTX 980 in September 2014, also at $549, which beat the 290X by a more convincing 13%.
It wasn't until the middle of 2015, when Nvidia released the 980 Ti for $649, that the 290X was convincingly beaten, by 28% (and the 390X by 21%), at much lower power consumption. So essentially the 290X had at least 12 months of being the best-value high-end card.
 
Joined
Nov 26, 2021
Messages
1,670 (1.51/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
I even have doubts about whether it can reach 4070 Ti Super / 7900 XT level raster, because if it only has 64 CUs (TPU's placeholder page even says 56), it will be difficult to close the gap to an 84-CU card, which means surpassing the 7800 XT by ~30% (the difference between the 7800 XT and the 7900 XT).
RT is harder to pin down, as AMD could reap the low-hanging fruit here and massively increase RT performance without increasing the number of RT cores (same count as CUs). Here I can actually believe 4080S-level performance.

Raster is not similar. Raster is rumoured at ~4070 Ti Super level, though the reported specs don't support even that.

Glad someone gets it. Already I see people starting to make unrealistic claims. Let's temper our expectations.

They didn't, and the 290X was temporary?
You need to check your timeline and prices.

Yes, the 290X launched in October 2013, and while Nvidia did release the 780 Ti a month later (the first Titan was already out), those cards were more expensive while not being a whole lot faster. The Titan was only a minuscule 3% faster while costing an obscene (for a gaming card at the time) $999, while the 780 Ti was a more reasonable $699 but still only 4% faster.

The 290X at $549 remained the bang-for-buck choice until Nvidia released the GTX 980 in September 2014, also at $549, which beat the 290X by a more convincing 13%.
It wasn't until the middle of 2015, when Nvidia released the 980 Ti for $649, that the 290X was convincingly beaten, by 28% (and the 390X by 21%), at much lower power consumption. So essentially the 290X had at least 12 months of being the best-value high-end card.
Yes, matching the 7900 XT's rasterization performance, in the absence of any increase in per-compute-unit performance, would require high clocks: 3 GHz would be enough, but that's rather unlikely at a 220 W TDP. We know that RDNA 3.5 doubled the number of texture samplers per compute unit, and that may allow a greater-than-expected performance increase in some cases. In any case, at least the rumours of 7900 XTX level rasterization performance seem ridiculous. I'm also uncertain whether they can match Nvidia's ray tracing performance after being behind for so long; the most likely outcome is a big improvement over RDNA 3, but still a gap to Ada, just a smaller one.
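A naive scaling check of that "3 GHz would be enough" estimate (assuming raster scales with CU count × clock, which real GPUs only approximate; the 7800 XT baseline of 60 CUs at ~2.43 GHz boost is rounded, and the ~30% uplift to the 7900 XT is the figure used earlier in the thread):

    BASE_CUS, BASE_CLK_GHZ = 60, 2.43   # 7800 XT, approximate boost clock
    TARGET_UPLIFT = 1.30                # ~7900 XT over the 7800 XT

    def required_clock(cus):
        """Clock (GHz) a cus-CU part needs to hit TARGET_UPLIFT x the 7800 XT."""
        return BASE_CLK_GHZ * TARGET_UPLIFT * BASE_CUS / cus

    print(f"{required_clock(64):.2f} GHz")   # ~2.96 GHz with 64 CUs
    print(f"{required_clock(56):.2f} GHz")   # ~3.38 GHz with the 56-CU placeholder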

As for the 290X, it was leading the 780 Ti in TPU's suite before the sun had set on 28 nm being the latest node for GPUs.
 
Joined
Apr 30, 2011
Messages
2,709 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
That is a design mistake, because there is no reason to put 375 watts' worth of connectors (2 × 150 W, plus 75 W from the PCIe slot) on a 245 W card.
An OC can push those GPUs past 300 W, so this configuration is the safest for current delivery.
 

AcE

New Member
Joined
Dec 3, 2024
Messages
23 (11.50/day)
I even have doubts about whether it can reach 4070 Ti Super / 7900 XT level raster, because if it only has 64 CUs (TPU's placeholder page even says 56), it will be difficult to close the gap to an 84-CU card, which means surpassing the 7800 XT by ~30% (the difference between the 7800 XT and the 7900 XT).
It will have at least 64 CUs, maybe more; the 7800 XT already has 60 CUs. Stay realistic.
Yes, the 290X launched in October 2013, and while Nvidia did release the 780 Ti a month later (the first Titan was already out), those cards were more expensive while not being a whole lot faster. The Titan was only a minuscule 3% faster while costing an obscene (for a gaming card at the time) $999, while the 780 Ti was a more reasonable $699 but still only 4% faster.
4% faster is 4% faster; that's far from a W for AMD. If you want a W, you must be clearly faster, not 4% slower. The 780 Ti was released solely to beat the 290X, which it did, and prices never matter for enthusiast cards, we all know that: 500, 700, tomato, tomahto. Most people bought the 780 Ti over it. Also, the 290 vanilla easily outsold the 290X; AMD usually undercut themselves back then by releasing a card for $100 less with just 256 shaders shaved off. They made some weird decisions back then, which they only stopped doing around the RX 7000 generation. The 6800 XT had only 512 fewer shaders than the 6900 XT and a $300 lower MSRP, another mistake by AMD. But the 7900 XT is 20% slower than the XTX because they also cut the bus by 64 bits and perhaps reduced clocks as well. That's how long AMD needed to learn proper "product segmentation", and even then the 7900 XT was overpriced at launch and needed months of price corrections.
The 290X at $549 remained the bang-for-buck choice until Nvidia released the GTX 980 in September 2014, also at $549, which beat the 290X by a more convincing 13%.
Not for the vast majority of people: due to Nvidia's mindshare, most people still bought the 780 Ti over it, and even cards like the 780 vanilla, which was slower and had less VRAM. Lastly, the 290X didn't even compete well with its own brother, the 290 vanilla, which had nearly the same performance for $100 less.
It wasn't until the middle of 2015, when Nvidia released the 980 Ti for $649, that the 290X was convincingly beaten, by 28% (and the 390X by 21%), at much lower power consumption. So essentially the 290X had at least 12 months of being the best-value high-end card.
No, the 980 Ti was released to compete with the Fury X; that is already a different generation and doesn't have much to do with the 290X.
 