
Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
That's more opinion and perspective than a conclusion about proper functionality. The point being that AMD's recent GPUs, and the drivers for them, have been exceptionally stable. Whether or not the GPU uses a given amount of power in a specific situation is not something that will affect stability in desktop, media viewing or gaming.


Thus...
Obviously we are exchanging opinions. And as Harry Callahan has said... :p

I don't know what a GPU needs to do to keep feeding a high refresh rate monitor or a dual monitor setup. But close to 100W power consumption in video playback is unacceptable in 2023. While that is an opinion, I also believe it is a logical one. The smartphone in your pocket or on your desk can probably play back the same video files at 5-10W. In fact, even AMD's own APUs play video while keeping power consumption for the whole SoC below 15W. So something has gone wrong with RDNA 3 here. And even if we consider this still an opinion, there is this chart
[attached chart: video playback power consumption from the RX 7900 XTX review]

where a 6900XT is at half the power consumption.
AMD Radeon RX 7900 XTX Review - Disrupting the RTX 4080 - Power Consumption | TechPowerUp
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Obviously we are exchanging opinions. And as Harry Callahan has said... :p

I don't know what a GPU needs to do to keep feeding a high refresh rate monitor or a dual monitor setup. But close to 100W power consumption in video playback is unacceptable in 2023. While that is an opinion, I also believe it is a logical one. The smartphone in your pocket or on your desk can probably play back the same video files at 5-10W. In fact, even AMD's own APUs play video while keeping power consumption for the whole SoC below 15W. So something has gone wrong with RDNA 3 here. And even if we consider this still an opinion, there is this chart
View attachment 302130
where a 6900XT is at half the power consumption.
AMD Radeon RX 7900 XTX Review - Disrupting the RTX 4080 - Power Consumption | TechPowerUp
It's high. But there are other examples with strange gaps. Look at the 3090 Ti compared to a 3090: a 13W difference at the same VRAM capacity. Mkay? The APU/CPU/phone comparisons are pretty much nonsense. Yes, parts built for efficiency, and even specifically for efficient video playback, rather than performance, can run said content efficiently. I mean... water is wet.

We're simply looking at a power state issue here, one that has plagued every other gen of cards in either camp historically, and it's always multi monitor or desktop light usage being the culprit. Saying RDNA3 'went wrong' is jumping to conclusions. The 3090 Ti didn't go wrong either, it just has a different set of power states and voltages to eke out that 5% gain over its sibling.
 
Joined
Dec 25, 2020
Messages
6,710 (4.70/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
It's obviously not for you if you're sitting on a 3090 that is already within spitting distance :) I think you're exaggerating a little. AMD's GPU stack was in a far worse position not too long ago, arguably from (post-) Hawaii XT up to and including RDNA1. They have had a more competitive stack since RDNA2, not a worse one. The RDNA3 features that are missing are mostly in the RT performance/DLSS camp, and even despite that they run everything fine. It's really not relevant that a 4090 exists with a much higher perf cap; everything below it is samey in almost every way between these two camps.

I'm not exaggerating; the fact that it's not for me speaks volumes... it's been a refresh wave and a full generation since I purchased my 33-month-old graphics card. It should be ancient at this point. If you look at it objectively, 3 years is more than half of the R9 Fury X's entire lifetime as a supported product(!), but no, AMD can't even release a product that will decisively beat it!

Well, if AMD keeps dipping into the red as they did last quarter, heads will roll. Let's hope it's this guy first LOL
View attachment 302129

Scott Herkelman, Frank Azor and Sasa Marinkovic work in marketing; if they look like morons doing their job, it's because they're literally scraping the barrel to try and market these products. There's next to nothing redeeming about owning a Radeon today. You're missing out on all of the cool things if you buy one. That's not me speaking, it's the market share doing so.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
I'm not exaggerating; the fact that it's not for me speaks volumes... it's been a refresh wave and a full generation since I purchased my 33-month-old graphics card. It should be ancient at this point. If you look at it objectively, 3 years is more than half of the R9 Fury X's entire lifetime as a supported product(!), but no, AMD can't even release a product that will decisively beat it!



Scott Herkelman, Frank Azor and Sasa Marinkovic work in marketing; if they look like morons doing their job, it's because they're literally scraping the barrel to try and market these products. There's next to nothing redeeming about owning a Radeon today. You're missing out on all of the cool things if you buy one. That's not me speaking, it's the market share doing so.
Dude, the entire Ampere > Ada performance gap is abysmal just the same. The age of your graphics card isn't relevant either, as both camps have spaced out their release cadence to every two years for some time now. Ada isn't for you either, unless you are willing to up the spending cap to 1.5x what you used to pay for x90.

Get real. Seriously. Or simply adjust your expectations. This shit has been happening since Turing and it's the new norm: no, you don't need to upgrade every gen, it was never a good idea, and it certainly isn't now.

It's funny you mention the Fury X in this context btw :D What's happening in that head of yours, I wonder... are you gearing up to get a 4090 after all, or just expressing general disappointment?
 
Joined
Feb 20, 2019
Messages
8,275 (3.93/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
I just want a release date for Navi32, and some updates on whether they've fixed the silicon issues that caused disappointing clocks in Navi31 and forced the entire AMD driver team to stop what they were doing for 4 months and try to fix Navi31 clock stability in software.
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
It's high. But there are other examples with strange gaps. Look at the 3090 Ti compared to a 3090: a 13W difference at the same VRAM capacity. Mkay? The APU/CPU/phone comparisons are pretty much nonsense. Yes, parts built for efficiency, and even specifically for efficient video playback, rather than performance, can run said content efficiently. I mean... water is wet.

We're simply looking at a power state issue here, one that has plagued every other gen of cards in either camp historically, and it's always multi monitor or desktop light usage being the culprit. Saying RDNA3 'went wrong' is jumping to conclusions. The 3090 Ti didn't go wrong either, it just has a different set of power states and voltages to eke out that 5% gain over its sibling.
My opinion, strictly an opinion here, and let's say it's totally wrong, is that you choose to ignore the facts. But that's my opinion, probably wrong, and because it is borderline insulting I apologize in advance for it.
 
Joined
Dec 25, 2020
Messages
6,710 (4.70/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Dude, the entire Ampere > Ada performance gap is abysmal just the same. The age of your graphics card isn't relevant either, as both camps have spaced out their release cadence to every two years for some time now. Ada isn't for you either, unless you are willing to up the spending cap to 1.5x what you used to pay for x90.

Get real. Seriously. Or simply adjust your expectations. This shit has been happening since Turing and it's the new norm: no, you don't need to upgrade every gen, it was never a good idea, and it certainly isn't now.

It's funny you mention the Fury X in this context btw :D What's happening in that head of yours, I wonder... are you gearing up to get a 4090 after all, or just expressing general disappointment?

The 4090 can do +80% over the 3090 in many situations, so it's not the same. Cut-down as it is, too.

Regarding Fury X... just another chapter of my prolonged love-hate relationship with AMD. It'll take quite some time for me to stop resenting them for EOL'ing it with high severity bugs I've reported going unfixed.
 
Joined
Sep 21, 2020
Messages
1,638 (1.07/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
as someone with a high-end Ampere card, it's not for me either. Its marginal gains in raster
Would you call a 33% difference "marginal"?

[attached chart: relative performance comparison]


close to 100W power consumption in video playback
This has long been fixed. It's still high, but nowhere near 100 W. Same with idle:

[attached charts: video playback and idle power consumption]
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
This has long been fixed. It's still high, but nowhere near 100 W.
Ah! Thanks. Down to RDNA2 levels. Not exactly fixed, still high, but much, much better. Hope they drop it lower in future GPUs.
 
Joined
Jul 5, 2013
Messages
27,725 (6.67/day)
I don't know what a GPU needs to do to keep feeding a high refresh rate monitor or a dual monitor setup. But close to 100W power consumption in video playback is unacceptable in 2023. While that is an opinion, I also believe it is a logical one. The smartphone in your pocket or on your desk can probably play back the same video files at 5-10W. In fact, even AMD's own APUs play video while keeping power consumption for the whole SoC below 15W. So something has gone wrong with RDNA 3 here. And even if we consider this still an opinion, there is this chart
[attached chart: video playback power consumption from the RX 7900 XTX review]

where a 6900XT is at half the power consumption.
Likely an architectural thing. However, it's still not a serious problem. It's 88w, not 800w, and that's during use. Not a real problem. And as others have stated, that situation was fixed swiftly. People need to stop complaining about non-issues.

My opinion, strictly an opinion here, and let's say it's totally wrong, is that you choose to ignore the facts. But that's my opinion, probably wrong, and because it is borderline insulting I apologize in advance for it.
There's nothing wrong with opinions and the expression thereof. It's when those opinions fly in the face of reason and logic that a problem arises.
 
Joined
Sep 17, 2014
Messages
22,437 (6.03/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
My opinion, strictly an opinion here, and let's say it's totally wrong, is that you choose to ignore the facts. But that's my opinion, probably wrong, and because it is borderline insulting I apologize in advance for it.
No, no, I'm not saying you are wrong, I'm putting things in perspective. Some sanity checks, I feel, are necessary here. Things are pulled way out of proportion. Yes, there are gaps in RDNA3. No, it's not optimal as a release, as a stack, or in how it was priced. At the same time, neither was Ada, and it still commands a higher net price, also per frame, for a somewhat expanded featureset and a few fewer gaps. But it ALSO has its gaps - notably in VRAM - and those aren't fixable.

As for insults... don't even worry, none seen or taken. We're discussing a thing; people shouldn't have to walk a tightrope doing so.

The 4090 can do +80% over the 3090 in many situations, so it's not the same. Cut-down as it is, too.

Regarding Fury X... just another chapter of my prolonged love-hate relationship with AMD. It'll take quite some time for me to stop resenting them for EOL'ing it with high severity bugs I've reported going unfixed.
Fury X was arguably the worst GPU ever in terms of support... yeah. Total fail and we can blame Raja "HBM" Koduri.
 
Joined
Dec 25, 2020
Messages
6,710 (4.70/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Would you call a 33% difference "marginal"?

View attachment 302132

For what it's worth, at that rate it's 10% a year - when compared to a bone stock 350W 3090 using review data from back then. Reality is closer to it being 1:1 with the 7900 XT in general. I can't say I'm impressed, especially considering its last-generation RT performance, multitude of quirks and inferior software support.

If the 7900 XTX hadn't failed, it would have been matching the 4090, just as the 6900 XT once matched the 3090.
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
Keeping the GCD and easy scaling based on MCD is only, like, the entire point behind bringing RDNA3 to chiplets, what took them so long?? :confused:

Also, "answer to the AD103" is an interesting statement considering it's hoping for parity with 4070 Ti at best? 4070 Ti is AD104.



So I guess that precludes you from ever commenting on 12VHPWR's woes because you never touched a 40 series card? We're all armchair generals here :D

Though I am curious what these clock "bugs" are; even when it comes to problems, it seems like Navi31 has more relevant concerns that need to be solved first.
Like what?

Having owned Navi31 without issue since launch I often wonder where y'all get your issues from.

A bug that marginally dropped performance was rumoured but never proven to have happened. I still ended up with a card that runs flawlessly, smooth and fast in all games.


All GPU designs end up with faults in the design and errata lists, same as with CPUs; it's just that only one company gets called out for it by some here, which isn't realistic.
 
Joined
Sep 21, 2020
Messages
1,638 (1.07/day)
Processor 5800X3D -30 CO
Motherboard MSI B550 Tomahawk
Cooling DeepCool Assassin III
Memory 32GB G.SKILL Ripjaws V @ 3800 CL14
Video Card(s) ASRock MBA 7900XTX
Storage 1TB WD SN850X + 1TB ADATA SX8200 Pro
Display(s) Dell S2721QS 4K60
Case Cooler Master CM690 II Advanced USB 3.0
Audio Device(s) Audiotrak Prodigy Cube Black (JRC MUSES 8820D) + CAL (recabled)
Power Supply Seasonic Prime TX-750
Mouse Logitech Cordless Desktop Wave
Keyboard Logitech Cordless Desktop Wave
Software Windows 10 Pro
compared to a bone stock 350W 3090 using review data from back then. Reality is closer to it being 1:1 with the 7900 XT
This is from the latest GPU review. All cards have been re-tested with the current game suite. At no resolution is the RTX 3090 on par with the 7900 XT in raster. The upgrade may not make sense for you (and reasonably so), but saying these two are about equal is a long stretch:

[attached charts: relative performance at 1920x1080, 2560x1440 and 3840x2160]
 
Joined
Sep 8, 2009
Messages
1,077 (0.19/day)
Location
Porto
Processor Ryzen 9 5900X
Motherboard Gigabyte X570 Aorus Pro
Cooling AiO 240mm
Memory 2x 32GB Kingston Fury Beast 3600MHz CL18
Video Card(s) Radeon RX 6900XT Reference (amd.com)
Storage O.S.: 256GB SATA | 2x 1TB SanDisk SSD SATA Data | Games: 1TB Samsung 970 Evo
Display(s) LG 34" UWQHD
Audio Device(s) X-Fi XtremeMusic + Gigaworks SB750 7.1 THX
Power Supply XFX 850W
Mouse Logitech G502 Wireless
VR HMD Lenovo Explorer
Software Windows 10 64bit
I have to admit it's a bit disheartening that AMD has to use an N31 chip to make sure the 7800XT is faster than the 6800XT. This is not the performance upgrade one would have expected 2 years after RDNA2's release.

N5 should have brought a 20% performance (i.e. clock) increase over N7; instead, the 7900XTX usually clocks just as high as the 6900XT. And the N6 RX7600 clocks just as high as the N5 7900XTX.

My guess is AMD really had planned for the N5 RDNA3 chips to clock 20% higher, putting the N31 7900XTX and the N32 7800XT in the >2.7GHz range, which would get the latter closer to the 6800XT in compute throughput (72 CUs on N21 @ 2250MHz ~= 60 CUs on N32 @ 2700MHz).
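As a quick sanity check of that CU-versus-clock trade-off, here is a minimal sketch; the 64 shaders per CU and 2 FLOPs per clock are my own assumptions for plain single-issue FP32, and RDNA3's dual-issue capability is deliberately ignored since games rarely exploit it:

```python
# Back-of-the-envelope FP32 throughput comparison (assumptions: 64 shaders per CU,
# 2 FLOPs per shader per clock, RDNA3 dual-issue ignored).

def fp32_tflops(cus: int, clock_mhz: float, shaders_per_cu: int = 64) -> float:
    """Peak single-issue FP32 throughput in TFLOPS."""
    return cus * shaders_per_cu * 2 * clock_mhz * 1e6 / 1e12

print(f"N21 (6800 XT-class), 72 CUs @ 2250 MHz: {fp32_tflops(72, 2250):.1f} TFLOPS")
print(f"N32, 60 CUs @ 2700 MHz (hypothetical):  {fp32_tflops(60, 2700):.1f} TFLOPS")
# Both come out to ~20.7 TFLOPS, which is why the two configurations are treated
# as roughly equal above.
```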

Instead, the new process brought them nearly zero clock advantage, and AMD now has to use the bigger N31 chip to bring some advantage over the 6800XT.


The big question now is whether RDNA4 solves these issues, since it's now clear that AMD hasn't been able to solve them on N32.




And the most ironic thing is how we do have RDNA3 GPUs clocking at >2700MHz at a low(ish) power usage inside the Phoenix SoC, but those are hopelessly bandwidth starved because AMD decided against putting L3 cache on it.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,145 (2.37/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
Ah! Thanks. Down to RDNA2 levels. Not exactly fixed, still high, but much, much better. Hope they drop it lower in future GPUs.

No review can properly encapsulate what multi-monitor and video playback power figures are. And it's not like Nvidia GPUs are immune to increasing power consumption the more monitors/resolutions you add, but they have more intermediate VRAM clock steps available to them to lessen the blow. If you can't make 10W, you might still make 20W. If you can't make 20W, you might still make 40W.

"Normal" RDNA video playback of ~40W isn't even remotely a problem. Yes, it's high compared to GeForce, but even the dinky 7900XT MBA cooler can manage to stay fanless at that level, most of the time.

When you see 80W video playback, that's not video playback power, that's AMD's perpetual lack of intermediate VRAM clocks overshadowing everything else and turning everything into 80-90W because VRAM is stuck at 2500MHz.

Time and time again an Adrenalin release is accompanied by reports that the multi monitor problem is "fixed". A quick trip to Reddit confirms plenty of run-of-the-mill 2-monitor, high-refresh setups stay unchanged at 80-90W. All the Radeon team seems to be doing is vetting specific display setups over time: if there are no major artifacting issues, allow the VRAM clock to drop. Which would actually eventually solve the problem, if AMD didn't have a habit of repeatedly regressing on the multi monitor problem and undoing all the progress they'd made every couple of years.

Back to the point that GeForce cards aren't immune to variance, I get about half the idle power figures that w1zz had in reviews. But that's a difference of 10W vs. 20W, not 20W vs 80W. Wondrous what VRAM can do when it has access to more than just two modes.
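For anyone who wants to check this on their own machine, here is a minimal sketch that polls the memory clock and board power on a GeForce card; it assumes a single GPU and that nvidia-smi is installed and on PATH (on a Radeon you would read the same numbers from the driver's metrics overlay or a tool like HWiNFO instead):

```python
# Minimal sketch: poll VRAM clock and board power once a second via nvidia-smi.
# Assumes a single GeForce GPU and nvidia-smi on PATH; clocks.mem and power.draw
# are standard nvidia-smi query fields.
import subprocess
import time

def sample() -> tuple[str, str]:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=clocks.mem,power.draw",
         "--format=csv,noheader,nounits"],
        text=True,
    ).strip()
    mem_clock_mhz, power_w = (v.strip() for v in out.splitlines()[0].split(","))
    return mem_clock_mhz, power_w

if __name__ == "__main__":
    # If the memory clock sits pinned at its maximum with two high-refresh monitors
    # attached, you are seeing the behaviour described above.
    for _ in range(10):
        clk, pwr = sample()
        print(f"VRAM clock: {clk} MHz, board power: {pwr} W")
        time.sleep(1)
```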

Like what?

Having owned Navi31 without issue since launch I often wonder where y'all get your issues from.

A bug that marginally dropped performance was rumoured but never proven to have happened. I still ended up with a card that runs flawlessly, smooth and fast in all games.


All GPU designs end up with faults in the design and errata lists, same as with CPUs; it's just that only one company gets called out for it by some here, which isn't realistic.

A shocking number of them come from AMD's driver relationship with multi monitor setups. If you only have one screen, of course it runs like a top; it's the ideal scenario.

No, I don't count "not hitting 3GHz" as a "bug", it just diminishes the stuff that actually matters. The performance is there, it's everything else. You can't just slap an engine on frame rails and sell it as a car, no matter how good that motor is.
 
Joined
Feb 24, 2021
Messages
143 (0.10/day)
System Name Upgraded CyberpowerPC Ultra 5 Elite Gaming PC
Processor AMD Ryzen 7 5800X3D
Motherboard MSI B450M Pro-VDH Plus
Cooling Thermalright Peerless Assassin 120 SE
Memory CM4X8GD3000C16K4D (OC to CL14)
Video Card(s) XFX Speedster MERC RX 7800 XT
Storage TCSunbow X3 1TB, ADATA SU630 240GB, Seagate BarraCuda ST2000DM008 2TB
Display(s) AOC Agon AG241QX 1440p 144Hz
Case Cooler Master MasterBox MB520 (CyberpowerPC variant)
Power Supply 600W Cooler Master
You keep complaining about Bug fixes yet you have a 4090.
Progress benefits all of us, even if we already have a good enough GPU.
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Likely an architectural thing. However, it's still not a serious problem. It's 88w, not 800w, and that's during use. Not a real problem. And as others have stated, that situation was fixed swiftly. People need to stop complaining about non-issues.
It's high. 42-48W is high (not 88W; the updated numbers from @QuietBob's post show 42-48W). It's not the 300W we can expect in gaming, but it's still high. When other hardware, and I mean AMD's own hardware, can do the same task at a fraction of the power the 7900 XT/XTX needs, then there is something wrong in the design. A decision to maybe simplify the design, or cut costs? I don't know. But with AMD searching for advantages and not disadvantages, we should be looking at much lower power consumption in every new GPU series. Not higher, and after a fix, the same as the previous gen (which unfortunately makes RDNA3 look more and more like RDNA2 again, even in video playback). Imagine AMD's cards from the lowest to the highest needing single-digit power to play back a video. That would have been a clear advantage, because gamers don't use their PCs 100% of the time to game. They could be playing videos or watching movies and series. And that's hours of PC usage. So 10W (6600 XT) vs 42-48W (7900 XT/XTX) is a serious difference. I repeat my perspective here: AMD should be searching for advantages where it can. Intel, for example, with its pathetic iGPUs before Arc, was at least trying to find advantages where it could. And it did have an advantage in the media engine, if I am not mistaken.
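To put a rough number on that "hours of PC usage" point, here is a back-of-the-envelope sketch; the three hours of daily playback and the electricity price are purely my own illustrative assumptions:

```python
# Back-of-the-envelope: extra energy used by ~45 W video playback vs ~10 W,
# assuming 3 hours of video per day, 365 days per year (illustrative numbers only).
HIGH_W = 45           # roughly the 7900 XT/XTX playback figure cited above
LOW_W = 10            # roughly the 6600 XT playback figure cited above
HOURS_PER_DAY = 3     # assumed daily viewing time
PRICE_PER_KWH = 0.25  # assumed electricity price, local currency per kWh

extra_kwh = (HIGH_W - LOW_W) * HOURS_PER_DAY * 365 / 1000
print(f"Extra energy per year: {extra_kwh:.0f} kWh")              # ~38 kWh
print(f"Extra cost per year:   {extra_kwh * PRICE_PER_KWH:.2f}")  # ~9.6 at 0.25/kWh
```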

No, no I'm not saying you are wrong, I'm putting things in perspective. Some sanity checks, I feel, are necessary here. Things are pulled way out of proportion. Yes, there are gaps in RDNA3. No, its not optimal as a release, as a stack, and how it was priced. At the same time, neither was Ada, and it still commands a higher net price, also per frame, for a somewhat expanded featureset and a few less gaps. But it ALSO has its gaps - notably in VRAM - and those aren't fixable.

As for insults... don't even worry, none seen or taken, we're discussing a thing, people shouldn't have to walk a tight rope doing so.
Thanks.

There are serious gaps in RDNA3. The fact that I keep calling RDNA3 a failure and nothing more than RDNA2, and no one has yet come out and thrown charts at me showing how wrong I am, does say something. I hope they manage to take advantage of the new architectural features of RDNA3 with RDNA 3.5 or 4. Probably they were too occupied with making this first generation of chiplets work this time.

As for Nvidia: it has been using VRAM as a kill switch for at least 15-20 years. All their architectures are probably built that way.
Many, many years ago I did some simple benchmarks, and I have kept a few. For example, a 9800GT with 512MB of VRAM. Look what happens in the fourth test, when VRAM usage goes over 512MB:

[attached screenshot: Unigine Tropics results on the 9800GT 512MB]

It becomes a slide show. I mean a TRUE slideshow. While performance drops from test 1 to test 2 to test 3 the way you would expect, it tanks in the last test. I don't seem to have saved tests with an ATi/AMD card - might be somewhere - but I remember that ATi/AMD cards were not dropping dead when going over VRAM capacity.

15-20 years later we see the same thing, with the only difference being that this time AMD suffers from it too (but at least it usually offers more VRAM in the same price range):
[attached chart: modern cards tanking when exceeding VRAM capacity]
 
Joined
Mar 10, 2010
Messages
11,878 (2.21/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506
No review can properly encapsulate what multi-monitor and video playback power figures are. And it's not like Nvidia GPUs are immune to increasing power consumption the more monitors/resolutions you add, but they have more intermediate VRAM clock steps available to them to lessen the blow. If you can't make 10W, you might still make 20W. If you can't make 20W, you might still make 40W.

"Normal" RDNA video playback of ~40W isn't even remotely a problem. Yes, it's high compared to GeForce, but even the dinky 7900XT MBA cooler can manage to stay fanless at that level, most of the time.

When you see 80W video playback, that's not video playback power, that's AMD's perpetual lack of intermediate VRAM clocks overshadowing everything else and turning everything into 80-90W because VRAM is stuck at 2500MHz.

Time and time again an Adrenalin release is accompanied by reports that the multi monitor problem is "fixed". A quick trip to Reddit confirms plenty of run-of-the-mill 2-monitor, high-refresh setups stay unchanged at 80-90W. All the Radeon team seems to be doing is vetting specific display setups over time: if there are no major artifacting issues, allow the VRAM clock to drop. Which would actually eventually solve the problem, if AMD didn't have a habit of repeatedly regressing on the multi monitor problem and undoing all the progress they'd made every couple of years.

Back to the point that GeForce cards aren't immune to variance, I get about half the idle power figures that w1zz had in reviews. But that's a difference of 10W vs. 20W, not 20W vs 80W. Wondrous what VRAM can do when it has access to more than just two modes.



A shocking number of them come from AMD's driver relationship with multi monitor setups. If you only have one screen, of course it runs like a top; it's the ideal scenario.

No, I don't count "not hitting 3GHz" as a "bug", it just diminishes the stuff that actually matters. The performance is there, it's everything else. You can't just slap an engine on frame rails and sell it as a car, no matter how good that motor is.
Yes, I have two monitors. A shocking number?

I know of one issue: high power draw. The end.

So I would be interested in knowing about a shocking number of other bugs, with proof.
 

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,145 (2.37/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
Yes, I have two monitors. A shocking number?

I know of one issue: high power draw. The end.

So I would be interested in knowing about a shocking number of other bugs, with proof.

My dude, I said my piece months ago when I had the card. If you only have 60Hz screens and/or a single monitor, there's nothing to note. I'm not repeating the whole writeup again. Go look back in the owners club or my project log.
 
Joined
Dec 25, 2020
Messages
6,710 (4.70/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
It's high. 42-48W is high (not 88W; the updated numbers from @QuietBob's post show 42-48W). It's not the 300W we can expect in gaming, but it's still high. When other hardware, and I mean AMD's own hardware, can do the same task at a fraction of the power the 7900 XT/XTX needs, then there is something wrong in the design. A decision to maybe simplify the design, or cut costs? I don't know. But with AMD searching for advantages and not disadvantages, we should be looking at much lower power consumption in every new GPU series. Not higher, and after a fix, the same as the previous gen (which unfortunately makes RDNA3 look more and more like RDNA2 again, even in video playback). Imagine AMD's cards from the lowest to the highest needing single-digit power to play back a video. That would have been a clear advantage, because gamers don't use their PCs 100% of the time to game. They could be playing videos or watching movies and series. And that's hours of PC usage. So 10W (6600 XT) vs 42-48W (7900 XT/XTX) is a serious difference. I repeat my perspective here: AMD should be searching for advantages where it can. Intel, for example, with its pathetic iGPUs before Arc, was at least trying to find advantages where it could. And it did have an advantage in the media engine, if I am not mistaken.


Thanks.

There are serious gaps in RDNA3. The fact that I keep calling RDNA3 a failure and nothing more than RDNA2, and no one has yet come out and thrown charts at me showing how wrong I am, does say something. I hope they manage to take advantage of the new architectural features of RDNA3 with RDNA 3.5 or 4. Probably they were too occupied with making this first generation of chiplets work this time.

As for Nvidia: it has been using VRAM as a kill switch for at least 15-20 years. All their architectures are probably built that way.
Many, many years ago I did some simple benchmarks, and I have kept a few. For example, a 9800GT with 512MB of VRAM. Look what happens in the fourth test, when VRAM usage goes over 512MB:

View attachment 302140
It becomes a slide show. I mean a TRUE slideshow. While performance drops from test 1 to test 2 to test 3 the way you would expect, it tanks in the last test. I don't seem to have saved tests with an ATi/AMD card - might be somewhere - but I remember that ATi/AMD cards were not dropping dead when going over VRAM capacity.

15-20 years later we see the same thing, with the only difference being that this time AMD suffers from it too (but at least it usually offers more VRAM in the same price range):
View attachment 302142

Oh, the old Tropics demo. Love it, I still have it in my benching suite :D

I actually ran it on my current build - of course it's not running on the UHD 770

Tropics 1080 8xAA.png

I agree with you too, and that is also what I mean to say when I sound so harsh towards these RX 7000 series cards. So would any chart, from any reviewer. In the best cases, the 7600 does barely 10% over the 6600 XT, and is actually anywhere from dead even to 4% faster than the 6650 XT. There is something wrong big time with RDNA 3: either it was an architectural gamble that AMD hoped would pay off but didn't, the drivers are hilariously and absolutely rotten, or there is severe hardware errata. It's got to be one of these three, and either way, it doesn't look good.
 
Joined
Sep 6, 2013
Messages
3,329 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Quibbling over less than 50w of power shouldn't be one of them. Nor should it be something people get unpleasant about.
Look, we see things differently. And it's not about 50W of power in gaming, for example, or in Furmark. It's in video playback. I mean, why waste an extra 30W of power while watching a movie? I am not 15 years old anymore, playing games all day. Today I spend more time watching YouTube videos and movies than gaming. So power consumption in video playback is important for me. Does AMD have the luxury to tell me "If you want low power consumption in video playback, go and buy a competing card"? As I said, AMD should start looking for advantages that it CAN achieve, not trying to play catch-up with Nvidia while Nvidia dictates the rules. Where is FreeSync 3.0 with Frame Generation? I'm throwing that out as an example. AMD should be looking at improving its hardware in various ways, not just following where Nvidia wants to drive the market.
 
Joined
May 17, 2021
Messages
3,005 (2.34/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
On par with the 4070 Ti, so 10% slower than the 7900 XT. How many cards do they plan on releasing to stack them every 10%?!

I guess no 7800 XTX, or it would just be hilarious :roll:
 