
4080 vs 7900XTX power consumption - Optimum Tech

Joined
Sep 10, 2018
Messages
6,925 (3.05/day)
Location
California
System Name His & Hers
Processor R7 5800X/ R7 7950X3D Stock
Motherboard X670E Aorus Pro X/ROG Crosshair VIII Hero
Cooling Corsair h150 elite/ Corsair h115i Platinum
Memory Trident Z5 Neo 6000/ 32 GB 3200 CL14 @3800 CL16 Team T Force Nighthawk
Video Card(s) Evga FTW 3 Ultra 3080ti/ Gigabyte Gaming OC 4090
Storage lots of SSD.
Display(s) A whole bunch OLED, VA, IPS.....
Case 011 Dynamic XL/ Phanteks Evolv X
Audio Device(s) Arctis Pro + gaming Dac/ Corsair sp 2500/ Logitech G560/Samsung Q990B
Power Supply Seasonic Ultra Prime Titanium 1000w/850w
Mouse Logitech G502 Lightspeed/ Logitech G Pro Hero.
Keyboard Logitech - G915 LIGHTSPEED / Logitech G Pro
So back then I was really into COD, and at that time it was World at War... Everything had a weird white outline around it, and the game looked very cartoony. I tried multiple Windows installs, drivers... nothing fixed it. It didn't help that it was pretty much the only game I played at the time. It sucked compared to my 295 that was in RMA with XFX :D

And then the overclocking. It did 1 GHz core no problem, but if you so much as breathed on the memory, even by 1 MHz, the screen would start blinking like mad.

I actually gave that card away to a kid who lived in a van at OC Forums :kookoo:

I only played COD on consoles back then... actually, in the 2000s I really only played ARPGs and RTS games on PC, and thankfully I don't remember having many issues. I did play Crysis and Crysis 2 and remember some weird stuff with vsync etc., but that's about it in general.
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,178 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Arctic P12's
Benchmark Scores 4k120 OLED Gsync bliss
It's disappointing that somehow there are notions of whataboutism, openly attacking a staff member, and managing to crap on Nvidia buyers??? Reading between the lines there is hilariously disturbing, and so off topic, but some people can't help themselves... :shadedshu:

Probably time to close this one off Mods, unless anyone can think of something on topic and constructive to be gained from keeping it open. All the rational takes have already been stated.
 
Joined
Dec 25, 2020
Messages
6,789 (4.73/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
If you go further back in history, AMD was more efficient. That title goes back and forth.

The last time I remember AMD having an uncontested lead in performance per watt was with the HD 5000 series, over 14 years ago. They beat Nvidia to the punch with DX11 too; GeForce wouldn't support it at all for the first 6 months of Windows 7's lifetime, and DX10.1 support was pretty much an AMD exclusive through all of its (ir)relevancy, as few developers ever bothered: Nvidia hardware couldn't do it at all, and by the time it was widely available, the cards supported the much better DX11 codepath anyway. This early implementation has implications that remain true even today, such as their driver still not supporting command lists/deferred contexts, as these were considered optional by Microsoft and AMD opted not to implement them.

Even then, Cypress had very poor tessellation performance caused by low triangle throughput compared to Fermi, and the drivers were a mess through most of its lifecycle. Owners of the HD 5970 such as myself had to deal with pure asinine garbage such as negative scaling and bugs, because some genius at AMD decided that forcing CrossFire on in the drivers with no way to switch it off was a great idea. By the time they allowed this without a direct registry edit, it was too late. Rookie as I was at the time, registry tweaks were simply not something I dared touch, because I just didn't understand them.

Lazy game developers, and high-end games such as Crysis 2 targeting Fermi hardware primarily (the GTX 580 being much better than the rest of the cards of its time) without bothering to handle occlusion well, since Fermi could manage the geometry, prompted more than a few accusations of sabotage by Nvidia at the time. Even the original GF100 could handle twice as much geometry as Cypress back then. That isn't to say AMD are lazy or utterly incompetent buffoons, because they aren't - they're pioneers, even. Their problem is of a managerial nature: for example, the RV770 used in the HD 4870 already had a programmable tessellator, but as it wasn't conformant with the DX11 shader model 5.0 spec, it went largely unused at the time. I honestly and sincerely believe that the problem with Radeon is of a corporate nature; it's the company culture and the boomers at the helm who just can't keep up with the industry any longer.

This would repeat again with the infamous "Gimpworks" from Nvidia, which could largely be attributed to the AMD driver's low instancing performance, as it was restricted to the immediate context and the CPUs of the time simply didn't have the IPC and frequency to muscle through 200,000+ individual hair strands on top of the game's usual requirements. When Nvidia ran into the instancing wall with Fermi hardware, they simply disabled it on Fermi and removed the hardware command scheduler altogether beginning with Kepler, which meant the driver handled it - and in fact, it still does handle it with deferred contexts, and can even defer commands from an originally immediate context, sacrificing what is now ample CPU power to maximize GPU performance. This is the true reason why AMD cards have their long-standing (and founded) reputation of performing better with low-end processors with few threads available.
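The deferred-context idea described above can be sketched as a toy model - this is illustrative Python, not real D3D11 code, and the names (`record_command_list`, `immediate_context`) are made up for the example. The point is only the shape of the technique: recording happens on many threads, playback stays serialized on one.

```python
# Toy model of deferred contexts / command lists: worker threads record
# command lists in parallel; the "immediate context" replays them in order
# on a single thread, trading spare CPU threads for GPU feed rate.
from concurrent.futures import ThreadPoolExecutor

def record_command_list(object_id):
    # Recording is cheap CPU-side work that can happen on any thread.
    return [f"bind:{object_id}", f"draw:{object_id}"]

with ThreadPoolExecutor(max_workers=4) as pool:
    # pool.map preserves submission order, like executing command lists in sequence.
    command_lists = list(pool.map(record_command_list, range(8)))

immediate_context = []  # stands in for the single submission queue
for cl in command_lists:  # playback is serialized, so draw order is preserved
    immediate_context.extend(cl)

print(len(immediate_context), "commands submitted in order")
```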

With the longevity of the current generations, software innovations will make or break the user experience, particularly as hardware has grown quite powerful and games have not seen an extreme increase in fidelity since the PS4/Xbox One days; most of the power of these new consoles is spent on sugar-coating the graphics with fancy ray-traced illumination and high-resolution textures. The latter has never been a problem for a powerful desktop GPU, and you can disable the former in most cases. Nvidia understands this, and that is why they zealously and jealously guard and segment their star features such as DLSS 3 as an incentive for people to upgrade. AMD, on the other hand, is still struggling with high idle power, TDRs while playing videos, and the occasional completely botched driver release... and FSR 3 is delayed/still a no-show.
 

Outback Bronze

Super Moderator
Staff member
Joined
Aug 3, 2011
Messages
2,030 (0.42/day)
Location
Walkabout Creek
System Name Raptor Baked
Processor 14900k w.c.
Motherboard Z790 Hero
Cooling w.c.
Memory 48GB G.Skill 7200
Video Card(s) Zotac 4080 w.c.
Storage 2TB Kingston kc3k
Display(s) Samsung 34" G8
Case Corsair 460X
Audio Device(s) Onboard
Power Supply PCIe5 850w
Mouse Asus
Keyboard Corsair
Software Win 11
Benchmark Scores Cool n Quiet.
Joined
Aug 9, 2019
Messages
1,695 (0.87/day)
Processor 7800X3D 2x16GB CO
Motherboard Asrock B650m HDV
Cooling Peerless Assassin SE
Memory 2x16GB DR A-die@6000c30 tuned
Video Card(s) Asus 4070 dual OC 2610@915mv
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores Superposition 8k 5267 Aida64 58.5ns
If you go further back in history, AMD was more efficient. That title goes back and forth.
Yeah, but I was talking about efficiency with a frame cap, v-sync, etc. It seems Nvidia is better at running cards efficiently at partial load vs AMD, and has been for several generations now.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,035 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
Yeah, but I was talking about efficiency with a frame cap, v-sync, etc. It seems Nvidia is better at running cards efficiently at partial load vs AMD, and has been for several generations now.
Yeah this is what I'm interested in. I spoke with W1z about adding more Vsync/framecap testing to future GPU reviews. It's possible we could add 120 FPS and maybe even 240 FPS in the Vsync testing.

This is greatly relevant to me, as I cap frames at 237 FPS for input latency, efficiency, and lower stress on hardware; the diminishing returns in overall frame latency from rendering more frames, some of which aren't displayed, aren't worth it. I know a lot of other people run at a set FPS as well, so knowing how well architectures scale when neither idle nor at full load is relevant data.

Maybe competitor X is 90% as efficient at full load as competitor Y, but if that drops to 60%, for example, at less than full load - say 70% load - then that's a major difference. Especially as many game engines/resolutions will not take full advantage of a GPU's horsepower, or perhaps the user is CPU limited.
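To make that arithmetic concrete, here's a quick sketch in Python. All the numbers are made up to match the hypothetical above - none of them come from actual testing:

```python
# Efficiency (FPS per watt) can diverge far more at partial load than
# full-load reviews suggest. Hypothetical figures only.
def perf_per_watt(fps, watts):
    return fps / watts

# Full load: card Y does 100 FPS at 300 W; card X is "90% as efficient".
y_full = perf_per_watt(100, 300)
x_full = 0.90 * y_full

# Capped at 70 FPS: Y idles down to 120 W, X only drops to 175 W (hypothetical).
y_capped = perf_per_watt(70, 120)
x_capped = perf_per_watt(70, 175)

print(f"relative efficiency at full load:   {x_full / y_full:.0%}")
print(f"relative efficiency with frame cap: {x_capped / y_capped:.0%}")
```

The gap between the two printed percentages is exactly the kind of difference that 120/240 FPS cap testing in reviews would surface.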

 

Space Lynx

Astronaut
Joined
Oct 17, 2014
Messages
17,269 (4.67/day)
Location
Kepler-186f
Processor 7800X3D -25 all core ($196)
Motherboard B650 Steel Legend ($179)
Cooling Frost Commander 140 ($42)
Memory 32gb ddr5 (2x16) cl 30 6000 ($80)
Video Card(s) Merc 310 7900 XT @3100 core $(705)
Display(s) Agon 27" QD-OLED Glossy 240hz 1440p ($399)
Case NZXT H710 (Red/Black) ($60)
I always cap my fps 5 fps below the monitor refresh rate. There is literally no reason to run it past the refresh rate... smoothness can only equate to the frames the monitor can deliver. I have never understood people who leave frames uncapped, maybe e-sports I guess, but that's not my thing.
 

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,035 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
I always cap my fps 5 fps below the monitor refresh rate. There is literally no reason to run it past the refresh rate... smoothness can only equate to the frames the monitor can deliver. I have never understood people who leave frames uncapped, maybe e-sports I guess, but that's not my thing.
There are some minor frame-latency improvements to running uncapped, but you run the risk of input-latency issues, especially as you'll further stress the CPU; and at esports/competitive-grade framerates you'll want to be running a 4 kHz or 8 kHz mouse too, which is more stress on the CPU.
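For illustration, the frame-time arithmetic behind capping a few FPS under the refresh rate - a minimal sketch, using the 237 FPS / 240 Hz figures already mentioned in this thread. The usual rationale is that keeping the frame rate just inside the monitor's refresh window lets VRR (G-Sync/FreeSync) stay engaged instead of hitting the V-Sync ceiling:

```python
# Frame-time budget when capping slightly below the refresh rate.
def frame_time_ms(fps):
    return 1000.0 / fps

refresh_hz = 240
cap = 237  # a few FPS under refresh, per the posts above

print(f"refresh window: {frame_time_ms(refresh_hz):.3f} ms")
print(f"capped frame:   {frame_time_ms(cap):.3f} ms")
print(f"headroom:       {frame_time_ms(cap) - frame_time_ms(refresh_hz):.3f} ms")
```

Each capped frame takes slightly longer than the scanout window, so the GPU never outruns the display and frames are never queued up waiting.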
 
Joined
Jun 2, 2017
Messages
9,184 (3.36/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
The last time I remember AMD having an uncontested lead in performance per watt was with the HD 5000 series, over 14 years ago. They beat Nvidia to the punch with DX11 too; GeForce wouldn't support it at all for the first 6 months of Windows 7's lifetime, and DX10.1 support was pretty much an AMD exclusive through all of its (ir)relevancy, as few developers ever bothered: Nvidia hardware couldn't do it at all, and by the time it was widely available, the cards supported the much better DX11 codepath anyway. This early implementation has implications that remain true even today, such as their driver still not supporting command lists/deferred contexts, as these were considered optional by Microsoft and AMD opted not to implement them.

Even then, Cypress had very poor tessellation performance caused by low triangle throughput compared to Fermi, and the drivers were a mess through most of its lifecycle. Owners of the HD 5970 such as myself had to deal with pure asinine garbage such as negative scaling and bugs, because some genius at AMD decided that forcing CrossFire on in the drivers with no way to switch it off was a great idea. By the time they allowed this without a direct registry edit, it was too late. Rookie as I was at the time, registry tweaks were simply not something I dared touch, because I just didn't understand them.

Lazy game developers, and high-end games such as Crysis 2 targeting Fermi hardware primarily (the GTX 580 being much better than the rest of the cards of its time) without bothering to handle occlusion well, since Fermi could manage the geometry, prompted more than a few accusations of sabotage by Nvidia at the time. Even the original GF100 could handle twice as much geometry as Cypress back then. That isn't to say AMD are lazy or utterly incompetent buffoons, because they aren't - they're pioneers, even. Their problem is of a managerial nature: for example, the RV770 used in the HD 4870 already had a programmable tessellator, but as it wasn't conformant with the DX11 shader model 5.0 spec, it went largely unused at the time. I honestly and sincerely believe that the problem with Radeon is of a corporate nature; it's the company culture and the boomers at the helm who just can't keep up with the industry any longer.

This would repeat again with the infamous "Gimpworks" from Nvidia, which could largely be attributed to the AMD driver's low instancing performance, as it was restricted to the immediate context and the CPUs of the time simply didn't have the IPC and frequency to muscle through 200,000+ individual hair strands on top of the game's usual requirements. When Nvidia ran into the instancing wall with Fermi hardware, they simply disabled it on Fermi and removed the hardware command scheduler altogether beginning with Kepler, which meant the driver handled it - and in fact, it still does handle it with deferred contexts, and can even defer commands from an originally immediate context, sacrificing what is now ample CPU power to maximize GPU performance. This is the true reason why AMD cards have their long-standing (and founded) reputation of performing better with low-end processors with few threads available.

With the longevity of the current generations, software innovations will make or break the user experience, particularly as hardware has grown quite powerful and games have not seen an extreme increase in fidelity since the PS4/Xbox One days; most of the power of these new consoles is spent on sugar-coating the graphics with fancy ray-traced illumination and high-resolution textures. The latter has never been a problem for a powerful desktop GPU, and you can disable the former in most cases. Nvidia understands this, and that is why they zealously and jealously guard and segment their star features such as DLSS 3 as an incentive for people to upgrade. AMD, on the other hand, is still struggling with high idle power, TDRs while playing videos, and the occasional completely botched driver release... and FSR 3 is delayed/still a no-show.
You had me until the High idle power comment. I have not seen a botched driver in about 4 years and I have been using AMD since the original 6800 1 GB card. I do feel that FSR 3 will come sooner than we think.
 
Joined
Nov 26, 2021
Messages
1,648 (1.50/day)
Location
Mississauga, Canada
Processor Ryzen 7 5700X
Motherboard ASUS TUF Gaming X570-PRO (WiFi 6)
Cooling Noctua NH-C14S (two fans)
Memory 2x16GB DDR4 3200
Video Card(s) Reference Vega 64
Storage Intel 665p 1TB, WD Black SN850X 2TB, Crucial MX300 1TB SATA, Samsung 830 256 GB SATA
Display(s) Nixeus NX-EDG27, and Samsung S23A700
Case Fractal Design R5
Power Supply Seasonic PRIME TITANIUM 850W
Mouse Logitech
VR HMD Oculus Rift
Software Windows 11 Pro, and Ubuntu 20.04
So back then I was really into COD, and at that time it was World at War... Everything had a weird white outline around it, and the game looked very cartoony. I tried multiple Windows installs, drivers... nothing fixed it. It didn't help that it was pretty much the only game I played at the time. It sucked compared to my 295 that was in RMA with XFX :D

And then the overclocking. It did 1 GHz core no problem, but if you so much as breathed on the memory, even by 1 MHz, the screen would start blinking like mad.

I actually gave that card away to a kid who lived in a van at OC Forums :kookoo:
Out of curiosity, did you try to RMA that 4890?
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
8,567 (3.78/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,997 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Oculus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317
You had me until the High idle power comment. I have not seen a botched driver in about 4 years and I have been using AMD since the original 6800 1 GB card. I do feel that FSR 3 will come sooner than we think.

He is not wrong though; what he likely means about the drivers being "botched" is mostly down to new hardware (and major software changes) at their respective timeframes. The 5700 XT in 2019 was not a fun experience (@INSTG8R knows this from my bug reports lol), but that's because they had just switched to Adrenalin, and the 5700 XT's RDNA1 was a new architecture at the time. I felt it was 100% good around April 2020 (they fixed the high memory clocks at idle, and the control panel stopped freezing randomly). The 6000 series release was not that bad, but the initial driver (December 2020) had the idle clocks issue again; it was worked on in the next driver and addressed in the one after (I remember it being good around March 2021). The 7000 series, as you can see in reviews, also had the idle clocks issue, which was only resolved recently (along with the six-month-old VR issue).

But with the RX 480 in 2016 (along with the RX 580/590, though I have not used those GPUs myself) and the Radeon VII in 2019, I consider these their best-balanced cards: not only was the VII a proper prosumer card with working compute, but the RX 480/580 were $200 monsters that could compete with the GTX 1060 and provided good 1080p value compared to the GTX 1070. The drivers were not bad (as in, no noticeable issues) during their time too.
 
Joined
Jun 2, 2017
Messages
9,184 (3.36/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitech Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
He is not wrong though; what he likely means about the drivers being "botched" is mostly down to new hardware (and major software changes) at their respective timeframes. The 5700 XT in 2019 was not a fun experience (@INSTG8R knows this from my bug reports lol), but that's because they had just switched to Adrenalin, and the 5700 XT's RDNA1 was a new architecture at the time. I felt it was 100% good around April 2020 (they fixed the high memory clocks at idle, and the control panel stopped freezing randomly). The 6000 series release was not that bad, but the initial driver (December 2020) had the idle clocks issue again; it was worked on in the next driver and addressed in the one after (I remember it being good around March 2021). The 7000 series, as you can see in reviews, also had the idle clocks issue, which was only resolved recently (along with the six-month-old VR issue).

But with the RX 480 in 2016 (along with the RX 580/590, though I have not used those GPUs myself) and the Radeon VII in 2019, I consider these their best-balanced cards: not only was the VII a proper prosumer card with working compute, but the RX 480/580 were $200 monsters that could compete with the GTX 1060 and provided good 1080p value compared to the GTX 1070. The drivers were not bad (as in, no noticeable issues) during their time too.
Thanks for some real context. There is no such thing as a 100% foolproof GPU. I totally enjoyed the 470/580 cards, as CrossFire was at the driver level so it just worked flawlessly, and you are right about the pricing. For me, where AMD lost was on price. When Vega launched, mining pushed the prices into high-end territory; even though it was a good card, it was not worth $1,000. Then the 5000 series was meh for me, as the only thing better than Vega was power draw, though I guess by the time I bought one (5600 XT) those problems were resolved. The thing with the 6000 series was that the 6800 XT was (for me) as fast as two Vega 64s in CrossFire. I certainly wanted a Radeon VII, but I held out, as you can see. I had never experienced monthly drivers from AMD like I did with the 6000 series. Now the 7000 series is here, and I can ray trace as fast as a 3090 and blow it away at regular gaming.

I was watching a Level One video this morning and it proved something to me. It was about running programs designed to run CUDA exclusively, and how AMD Instinct cards are able to run them without any CUDA cores to the tune of 85% of the performance. That is not the whole point, though: just like with CPUs, AMD does not ignore the negative narrative but actively works to defeat it. The thread is about power consumption, though, so I will use the 6800 XT as an example. When it launched it could pull up to 330 W, but by the time the driver stack had matured, the 6800 XT did not consume more than 255 W under normal use. If I were really concerned about power draw and wanted to game, a 4080 or 7900 XTX would make absolutely no sense to buy; you could get a 4060 or a 6700 XT and not have to buy a 1000 W PSU. Guess what? I bought a 1000 W PSU. It is even the kind with no pigtails on the PCIe plugs, so every 8-pin connects directly to the power supply. To be honest, I bought that (HX1200i, RIP) for the 7000 series, as I have no issue with my GPU pulling 342 W from the wall for high-refresh 4K gaming on a 144 Hz panel. It can only be butter, and you can call me a fanboy all you want, but AMD is all I am going to use.
 
Joined
Dec 10, 2022
Messages
486 (0.68/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
I know Sapphire is a good brand, but their current Pulse and Nitro+ variants are too big for my use case; the Pulse can fit, but it has that extra PCIe power connector, which is disappointing since I'm trying to minimize cable jank. I would've stayed with the PowerColor Hellhound, but the coil whine was actually noticeable, and this was from two cards (one newly bought and another new replacement, also due to coil whine).
I think that you just got unlucky, because I hadn't previously heard of any coil whine issues specific to the Hellhound (or any other card, for that matter), and PowerColor sells thousands of those cards. To be perfectly honest, coil whine is completely unpredictable. It can happen to any card, regardless of team or brand, without any pattern that I've ever been able to follow. To make things even more frustrating, sometimes a card won't have it but will develop it over time, and sometimes a card will have it but then one day it won't anymore. Then the one that had it go away might get it back, and the one that had it appear might have it go away...

It's literally a random roll of the dice as to whether or not you get coil whine and/or whether or not it stays or goes away or comes back again.

Yeah, it sounds insane but that's how it is. The only thing that you can be sure of is that you can limit it by using Radeon Chill or VSync.
 
Joined
Jun 27, 2011
Messages
6,765 (1.38/day)
Processor 7800x3d
Motherboard Gigabyte B650 Aorus Elite AX
Cooling Custom Water
Memory GSKILL 2x16gb 6000mhz Cas 30 with custom timings
Video Card(s) MSI RX 6750 XT MECH 2X 12G OC
Storage Adata SX8200 1tb with Windows, Samsung 990 Pro 2tb with games
Display(s) HP Omen 27q QHD 165hz
Case ThermalTake P3
Power Supply SuperFlower Leadex Titanium
Software Windows 11 64 Bit
Benchmark Scores CB23: 1811 / 19424 CB24: 1136 / 7687
Yeah, but I was talking about efficiency with a frame cap, v-sync, etc. It seems Nvidia is better at running cards efficiently at partial load vs AMD, and has been for several generations now.
At this moment in time, that is true. I don't think anyone disputes that for the current and most recent generations. I was simply pointing out it has not and will not always be true.
 
Joined
Dec 10, 2022
Messages
486 (0.68/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
The last time I remember AMD having an uncontested lead in performance per watt was with the HD 5000 series over 14 years ago.
Umm, I don't know how to say this without making you look a bit ridiculous but....

It's not huge, but it's definitely an uncontested lead and it's also pretty recent. No biggie though, I had the same attitude towards it then that I do today.... "Who the hell cares?"

If people truly cared about efficiency and weren't just paying it empty lip-service, then nobody would've bought any of those insane Raptor Lake CPUs, but they did. There's nothing more ludicrous than someone complaining about the power use of a Radeon card while their rig is sporting an i9-13900K/S or i7-13700K with a 360 mm AIO. :roll:
 
Joined
Dec 25, 2020
Messages
6,789 (4.73/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS Special Edition
Motherboard ASUS ROG MAXIMUS Z790 APEX ENCORE
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) ASUS ROG Strix GeForce RTX™ 4080 16GB GDDR6X White OC Edition
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic Intellimouse
Keyboard Generic PS/2
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores I pulled a Qiqi~
Umm, I don't know how to say this without making you look a bit ridiculous but....

It's not huge, but it's definitely an uncontested lead and it's also pretty recent. No biggie though, I had the same attitude towards it then that I do today.... "Who the hell cares?"

If people truly cared about efficiency and weren't just paying it empty lip-service, then nobody would've bought any of those insane Raptor Lake CPUs, but they did. There's nothing more ludicrous than someone complaining about the power use of a Radeon card while their rig is sporting an i9-13900K/S or i7-13700K with a 360mm AIO. :roll:

I wasn't complaining, and that's a single game (sample size of one) on a GPU that simply isn't as powerful as the ones you're comparing it to. That's a small, situational lead, due to a more stringent power limit in a game known to make very good use of its architecture. Break out the 6950 XT and let's see that grand accomplishment (not) evaporate.

IMHO you measure efficiency from a normalized performance standpoint across a large sample size. A decisive lead? The Radeon HD 5970 (dual fully enabled Cypress XT, with two cores roughly equivalent to HD 5870s) used less power on average than a single GTX 480. That's what I mean by "decisive lead".
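As a sketch of what "normalized across a large sample" means in practice: compute frames per watt per game, normalize one card against the other, then aggregate with a geometric mean so no single title dominates. Every FPS and wattage number below is invented purely for illustration:

```python
from math import prod

# Hypothetical per-game results as (average FPS, average board power in watts).
# All values are made up for illustration only.
results = {
    "card_a": [(120, 300), (90, 280), (60, 310)],
    "card_b": [(110, 250), (95, 260), (55, 240)],
}

def geomean(xs):
    # Geometric mean: the right aggregate for ratios across many games.
    return prod(xs) ** (1 / len(xs))

# Frames per watt for each game, per card.
fpw = {card: [f / w for f, w in games] for card, games in results.items()}

# Normalize card_b's per-game efficiency against card_a, then aggregate,
# so one outlier title can't skew the overall figure.
ratios = [b / a for a, b in zip(fpw["card_a"], fpw["card_b"])]
print(f"card_b efficiency vs card_a: {geomean(ratios):.2f}x")
```

With these made-up numbers it reports roughly a 1.14x efficiency advantage for card_b; the point is the method, not the values.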
 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,178 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
...that's a single game (sample size one) on a GPU that simply isn't as powerful as the ones you're comparing them to..

...you measure efficiency from a normalized performance standpoint across a large sample size....
Quoted for truth. I never liked HUB's power testing for this reason: a sample size of one game. For RDNA2 vs Ampere it's so close I'd call it a tie, which makes it plainly obvious that Ampere was an efficient architecture on an inefficient node.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,247 (6.64/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
I think that you just got unlucky; I hadn't previously heard of any coil whine issues that are specific to the Hellhound (or any other card, for that matter), and PowerColor sells thousands of those cards. To be perfectly honest, coil whine is completely unpredictable. It can happen to any card, regardless of team or brand, without any pattern that I've ever been able to follow. To make things even more frustrating, sometimes a card won't have it at first but will develop it over time, and sometimes a card will have it but then one day it won't anymore. Then the one that had it go away might get it back, and the one that developed it might have it go away...

It's literally a random roll of the dice as to whether or not you get coil whine and/or whether or not it stays or goes away or comes back again.

Yeah, it sounds insane but that's how it is. The only thing that you can be sure of is that you can limit it by using Radeon Chill or VSync.

There is a thread about fixing inductor resonance (coil whine) on this forum

Umm, I don't know how to say this without making you look a bit ridiculous but....

It's not huge, but it's definitely an uncontested lead and it's also pretty recent. No biggie though, I had the same attitude towards it then that I do today.... "Who the hell cares?"

If people truly cared about efficiency and weren't just paying it empty lip-service, then nobody would've bought any of those insane Raptor Lake CPUs, but they did. There's nothing more ludicrous than someone complaining about the power use of a Radeon card while their rig is sporting an i9-13900K/S or i7-13700K with a 360mm AIO. :roll:
Hypocrites they are
 
Joined
Feb 11, 2009
Messages
5,556 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
I always love seeing people who pay over a thousand dollars for a GPU pretending to care about power consumption.

I understand what you mean to say, but IMO power consumption is an indicator of the quality of a product.
You want more performance per watt than the previous generation; it has to be better.

Personally, I would want GPUs to be limited to at most 300 watts and then have developers work within those constraints to make the most of it (like Formula 1 racing, for example);
whether the cards are 1,000 bucks at the high end or not is not relevant there.
 
Joined
Nov 11, 2016
Messages
3,417 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Umm, I don't know how to say this without making you look a bit ridiculous but....

It's not huge, but it's definitely an uncontested lead and it's also pretty recent. No biggie though, I had the same attitude towards it then that I do today.... "Who the hell cares?"

If people truly cared about efficiency and weren't just paying it empty lip-service, then nobody would've bought any of those insane Raptor Lake CPUs, but they did. There's nothing more ludicrous than someone complaining about the power use of a Radeon card while their rig is sporting an i9-13900K/S or i7-13700K with a 36mm AIO. :roll:

So your counter to a performance-per-watt argument is just a power consumption chart? Shouldn't you include the performance chart too?

Cause here it is
[attached chart: Doom 4K average FPS]


The 3090 is 20% faster than the 6800 XT, yet its total power consumption is only 11% higher; doesn't that mean the 3090 is better in performance per watt?

3090: 0.39 frames per watt
6800 XT: 0.36 frames per watt
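For what it's worth, here's the quick arithmetic behind those two figures. The FPS and wattage values are illustrative round numbers consistent with the quoted ratios, not exact readings from the chart:

```python
# Frames-per-watt arithmetic for the comparison above.
# FPS and wattage are assumed round numbers, not exact chart values.
cards = {
    "RTX 3090": {"fps": 139, "watts": 356},
    "RX 6800 XT": {"fps": 116, "watts": 322},
}

for name, d in cards.items():
    print(f"{name}: {d['fps'] / d['watts']:.2f} frames per watt")

# Roughly 20% faster for roughly 11% more power:
speedup = cards["RTX 3090"]["fps"] / cards["RX 6800 XT"]["fps"]
power_delta = cards["RTX 3090"]["watts"] / cards["RX 6800 XT"]["watts"]
print(f"relative speed: {speedup:.0%}, relative power: {power_delta:.0%}")
```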
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,997 (0.34/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASRock X870 Taichi Lite
Cooling Thermalright Phantom Spirit 120 EVO CPU
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA) / NVIDIA RTX 4090 Founder's Edition
Storage Crucial T500 2TB x 3
Display(s) LG 32GS95UE-B, ASUS ROG Swift OLED (PG27AQDP), LG C4 42" (OLED42C4PUA)
Case HYTE Hakos Baelz Y60
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight 2 (White), G303 Shroud Edition
Keyboard Wooting 60HE+ / 8BitDo Retro Mechanical Keyboard (N Edition) / NuPhy Air75 v2
VR HMD Occulus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.4317
I think that you just got unlucky; I hadn't previously heard of any coil whine issues that are specific to the Hellhound (or any other card, for that matter), and PowerColor sells thousands of those cards. To be perfectly honest, coil whine is completely unpredictable. It can happen to any card, regardless of team or brand, without any pattern that I've ever been able to follow. To make things even more frustrating, sometimes a card won't have it at first but will develop it over time, and sometimes a card will have it but then one day it won't anymore. Then the one that had it go away might get it back, and the one that developed it might have it go away...

It's literally a random roll of the dice as to whether or not you get coil whine and/or whether or not it stays or goes away or comes back again.

Yeah, it sounds insane but that's how it is. The only thing that you can be sure of is that you can limit it by using Radeon Chill or VSync.
I completely understand. It's unfortunate that two of PowerColor's 7900 XTX Hellhounds had coil whine, and it seems less like an isolated case and more like a wider issue. I know it's not my PSU, as my RM850x has run the Pulse XTX, my current ASRock XTX MBA, a previous PowerColor Red Devil 6950 XT and an RTX 3090 FE without issue. On top of that, I tried the cards on three other PSUs (HX850, SF1000 and V850 SFX) during my ITX fittings, with the same coil whine from both. I'll just chalk it up to PowerColor's years of varying QA. This sucked, because it had a proper white PCB.

I prefer using RTSS over Adrenalin's Radeon Chill for framerate limiting, as I don't want it based on screen movement. Also, I only use Anti-Lag, since I mostly play competitive games. All it needs is a proper "boost" feature (like Reflex Boost) to tell the GPU to push higher clocks when I'm CPU limited, but this is low priority, as the XTX handles those games quite fine.
 

wolf

Better Than Native
So your counter to a performance-per-watt argument is just a power consumption chart? Shouldn't you include the performance chart too?
Yet sadly it gets reacted to unironically too. Ain't that the beauty of it, I suppose: cherry-picking arguments to suit your preference when it suits you, and rejecting the counterarguments when it doesn't.

It's clear that AMD has the inferior architecture, but if they're priced out of greed and not compellingly, who cares? If I'm playing Jedi: Survivor at 4K with the highest textures and flawless gameplay, do you think I give a rat's posterior about whether or not my video card has mountains of unused VRAM? The answer is no, I honestly couldn't care less, and neither would most people. Also, don't forget: a card could have the most VRAM in the world, but if someone paid extra for it and the card is hamstrung because it doesn't have the GPU horsepower to make use of it, the buyer is a fool, plain and simple.

It's been clear for well over a decade that lots of fools buy AMD.
 
Joined
Jan 8, 2017
Messages
9,438 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
You want more performance per watt than the previous generation; it has to be better.
If you check you'll see that actually all of these GPUs have better efficiency than the previous generation.
the cards being 1000 bucks at the high end or not is not relevant there.
Of course it is: if you're concerned about the price of electricity, clearly you can afford an extra 10 bucks a month, and if you're concerned about heat, you can use the AC.
 
Joined
Nov 11, 2016
Messages
3,417 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11
Yet sadly it's reacted to unironically too, ain't that the beauty of it I suppose, cherry picking arguments to suit your preference when and if it suits you and rejecting the counter arguments when they don't.

It's clear that AMD has the inferior architecture, but if they're priced out of greed and not compellingly, who cares? If I'm playing Jedi: Survivor at 4K with the highest textures and flawless gameplay, do you think I give a rat's posterior about whether or not my video card has mountains of unused VRAM? The answer is no, I honestly couldn't care less, and neither would most people. Also, don't forget: a card could have the most VRAM in the world, but if someone paid extra for it and the card is hamstrung because it doesn't have the GPU horsepower to make use of it, the buyer is a fool, plain and simple.

It's been clear for well over a decade that lots of fools buy AMD.

The more tech outlets appear over the years (tech tubers and such), the worse Radeon's market share becomes, so I guess people are getting smarter?

Radeon is reaching an all-time low market share now.
[attached chart: GPU market share]
 
Top