
4080 vs 7900XTX power consumption - Optimum Tech

dgianstefani

TPU Proofreader
Staff member
Joined
Dec 29, 2017
Messages
5,029 (1.99/day)
Location
Swansea, Wales
System Name Silent
Processor Ryzen 7800X3D @ 5.15ghz BCLK OC, TG AM5 High Performance Heatspreader
Motherboard ASUS ROG Strix X670E-I, chipset fans replaced with Noctua A14x25 G2
Cooling Optimus Block, HWLabs Copper 240/40 + 240/30, D5/Res, 4x Noctua A12x25, 1x A14G2, Mayhems Ultra Pure
Memory 32 GB Dominator Platinum 6150 MT 26-36-36-48, 56.6ns AIDA, 2050 FCLK, 160 ns tRFC, active cooled
Video Card(s) RTX 3080 Ti Founders Edition, Conductonaut Extreme, 18 W/mK MinusPad Extreme, Corsair XG7 Waterblock
Storage Intel Optane DC P1600X 118 GB, Samsung 990 Pro 2 TB
Display(s) 32" 240 Hz 1440p Samsung G7, 31.5" 165 Hz 1440p LG NanoIPS Ultragear, MX900 dual gas VESA mount
Case Sliger SM570 CNC Aluminium 13-Litre, 3D printed feet, custom front, LINKUP Ultra PCIe 4.0 x16 white
Audio Device(s) Audeze Maxwell Ultraviolet w/upgrade pads & LCD headband, Galaxy Buds 3 Pro, Razer Nommo Pro
Power Supply SF750 Plat, full transparent custom cables, Sentinel Pro 1500 Online Double Conversion UPS w/Noctua
Mouse Razer Viper Pro V2 8 KHz Mercury White w/Tiger Ice Skates & Pulsar Supergrip tape
Keyboard Wooting 60HE+ module, TOFU-R CNC Alu/Brass, SS Prismcaps W+Jellykey, LekkerV2 mod, TLabs Leath/Suede
Software Windows 11 IoT Enterprise LTSC 24H2
Benchmark Scores Legendary
It's clear some folks choose to hyper-focus on the dollar cost of higher power draw, maybe in an attempt to argue it's irrelevant, but they ignore the other major reason more efficient cards are desirable: they dump less heat into your system.

Every chip is power-limited in one way or another, so efficiency = performance. It's not hard to understand; even AMD's marketing pushes its 50% gen-over-gen efficiency gains, although the real figure isn't quite that high.
 
Joined
Dec 10, 2022
Messages
486 (0.68/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Not that I'm a reference or anything, but I do tend to buy components with lower power draw. They're easier to keep cool and quiet (yeah, I know) using cheaper fans and heatsinks.
That's why I believe that CPU power draw is a much bigger deal than GPU power draw: the cost of the coolers can be crazy. I've always just used stock air coolers because they work just fine and cost nothing. :laugh:
My systems are silent enough that in a few cases I've had friends go "oh, you turned it on already?"
Yeah, my main rig is in my living room so I'm always sitting about 20' away and any noise is drowned out by the game I'm playing, video I'm watching, music I'm playing, etc.
Also worth noting: in the past, the highest TDP on a GPU was much lower. Today, if you're looking at a 100 W difference, that's two-thirds of what a mid-range card used to draw. Or like 20 LED bulbs turned on at the same time...
Yeah, but you're not going to see a 100 W difference between two GPUs in the same generation and performance tier. Even if I had a card that consumed 50 W more than another at the same performance tier, it would cost me less than CAD $10 extra per year. That's a nothingburger to me if the more efficient GPU costs an extra $50-$100, because it would take longer than the useful life of the card to break even. And since the primary sources of electricity in my region are nuclear, hydro and wind (91% combined), it won't increase my carbon footprint either.
 
Joined
Feb 11, 2009
Messages
5,550 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
If you check you'll see that actually all of these GPUs have better efficiency than the previous generation.

Of course it is. If you're concerned about the price of electricity, clearly you can afford an extra 10 bucks a month; if you're concerned about heat, you can use the AC.

Yes... I never contested that?

You are the one suggesting it doesn't matter because "if you can afford to spend 1,000 dollars on a GPU, the power bill will not be relevant".
I'm trying to explain how that isn't the point of the argument: it's not about the price of electricity or heat, it's about the quality of a product.

Just because you can afford it does not make it any less wasteful, or any less poor engineering.


That's why I mentioned Formula 1. I hate this trend of GPUs performing better per watt consumed... while also constantly consuming more power.

The flagship card of over 15 years ago, the 8800 Ultra, used 2x 6-pin connectors, and that was crazy for the time. That's a combined 150 W plus the PCIe slot's 75 W: 225 W for the most high-end card...

And now they need 600+ watts, while harping on how efficiency has improved...

Again, I want GPUs to be limited by some governing body and then have the designers get creative and innovate with what they have: 300 watts max, make do with it.
 
Joined
Feb 18, 2005
Messages
5,847 (0.81/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Razer Pro Type Ultra
Software Windows 10 Professional x64
Roughly $75 more per year in power costs isn't a factor for people buying a GPU in this price range. It's an afterthought; it's not relevant.

Resorting to insults is low.
FFS... it's not about the cost, it's about the principle. Or are you going to tell me that that isn't relevant either?
 
Joined
Jan 8, 2017
Messages
9,436 (3.28/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
I'm trying to explain how that isn't the point of the argument: it's not about the price of electricity or heat, it's about the quality of a product.
I don't know what you could possibly mean by "better quality" in this context. Like I said, all of these new GPUs have better efficiency than previous generations, and they're similar to each other anyway.

I don't think anyone ever picks a 4080 over a 7900 XTX because it's 10% more efficient or whatever and is therefore a "better quality product". Sorry, I don't believe it and I won't ever believe it. Real consumers simply do not care about that; these are made-up reasons proposed by forum dwellers to argue over which brand is better.
 
Last edited:
Joined
Feb 11, 2009
I don't know what you could possibly mean by "better quality" in this context. Like I said, all of these new GPUs have better efficiency than previous generations, and they're similar to each other anyway.

I don't think anyone ever picks a 4080 over a 7900 XTX because it's 10% more efficient or whatever and is therefore a "better quality product". Sorry, I don't believe it and I won't ever believe it. Real consumers simply do not care about that; these are made-up reasons proposed by forum dwellers to argue over which brand is better.

And again, I never claimed otherwise; not once have I suggested that anyone buys a 4080 over a 7900 XTX because of efficiency.
But it is, objectively, a better product in that sense.
 
Joined
Jan 8, 2017
And again, I never claimed otherwise; not once have I suggested that anyone buys a 4080 over a 7900 XTX because of efficiency.
So what even is your argument, then? If you agree people don't care, why even debate this?
 
Joined
Feb 18, 2005
I don't think anyone ever picks a 4080 over a 7900 XTX because it's 10% more efficient or whatever and is therefore a "better quality product". Sorry, I don't believe it and I won't ever believe it. Real consumers simply do not care about that; these are made-up reasons proposed by forum dwellers to argue over which brand is better.
I was intending to pick up a new GPU this year around Black Friday, and AMD is effectively my only option because it's the only company making GPUs with USB-C ports (but only on the reference models, because AIBs suck d**k). Except that I have chosen not to buy their product, because, as an engineer, their s**tty attitude to efficiency pisses me off.

If the 7700/7800 series have USB-C and are reasonably priced, then maybe I'll be able to overcome my repugnance and buy one, but absolutely not for a product that's close to a thousand bucks. For that kind of money AMD simply needs to do better.
 

dgianstefani

I was intending to pick up a new GPU this year around Black Friday, and AMD is effectively my only option because it's the only company making GPUs with USB-C ports (but only on the reference models, because AIBs suck d**k). Except that I have chosen not to buy their product, because, as an engineer, their s**tty attitude to efficiency pisses me off.

If the 7700/7800 series have USB-C and are reasonably priced, then maybe I'll be able to overcome my repugnance and buy one, but absolutely not for a product that's close to a thousand bucks. For that kind of money AMD simply needs to do better.
Just pass through the signal and use the motherboard USB-C or a monitor with the connection.
 
Joined
Dec 10, 2022
If you check you'll see that actually all of these GPUs have better efficiency than the previous generation.

Of course it is. If you're concerned about the price of electricity, clearly you can afford an extra 10 bucks a month; if you're concerned about heat, you can use the AC.
It's not an extra $10 per month; it's more like an extra $10 per year. Unless you live in the UK, where electricity is stupidly expensive, in which case it's an extra £22 per year. This assumes an extra 50 W for a GPU that's maxed out (like running 3DMark) for 4 hours a day, 365 days per year.

Here in Canada, I'm looking at less than $10 per year. Since I don't run 3DMark for 4 hours a day, for me it would probably be more like an extra $5-$7 per year, because gaming won't show the maximum wattage difference the way running 3DMark would. It's a joke to me that anyone is willing to whine about it, but it's clear we don't have a dearth of know-nothing-know-it-alls.
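For anyone who wants to rerun the maths with their local rates, here's a quick sketch. The per-kWh prices are my assumptions (roughly £0.30 for the UK and CAD $0.10 for cheap hydro/nuclear regions), not figures from the post:

```python
def annual_cost(extra_watts, hours_per_day, price_per_kwh, days=365):
    """Extra yearly cost of a GPU drawing `extra_watts` more while under load."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * days
    return kwh_per_year * price_per_kwh

# 50 W extra, maxed out 4 hours a day, every day = 73 kWh/year
print(f"UK:     £{annual_cost(50, 4, 0.30):.2f} per year")   # ~£21.90
print(f"Canada: ${annual_cost(50, 4, 0.10):.2f} per year")   # ~$7.30
```

Swap in your own hours and tariff; the conclusion barely moves unless you run the card flat out most of the day.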

We're never going to purchase them because they don't satisfy our criteria for being power-efficient. Why is this so difficult for you cretins to understand? Why do you continue to attempt to disregard this argument as if it's irrelevant? Is it because you're intellectually stunted and only capable of fanboyism?
It appears that most people didn't understand your humour. I thought that it was quite clever myself! ;)
 
Joined
Jun 7, 2023
Messages
170 (0.32/day)
Location
Holy Roman Empire
System Name Shadowchaser /// Shadowchaser Jr.
Processor AMD Ryzen 7 7700 /// AMD Ryzen 9 5900X
Motherboard Asus ROG Strix X670E-E Gaming /// Asus ROG Strix B550-E Gaming
Cooling Noctua NH-D15S chromax.black /// Thermalright Venomous X
Memory Kingston Fury Beast RGB 6000C30 2x16GB /// G.Skill Trident Z RGB 3200C14 4x8GB
Video Card(s) Zotac GeForce RTX 4080 Trinity 16GB /// Gigabyte G1 GTX 980Ti
Storage Kingston KC3000 2TB + Samsung 970EVO 2TB + 2x WD Red 4TB /// Samsung 970EVO 1TB + WD Red 4TB
Display(s) Gigabyte M27Q /// Dell U2715H
Case Fractal Define 7 Black /// Fractal Define R5 Black
Power Supply Seasonic SS-750KM3 /// Seasonic SS-660KM
Mouse Logitech MX Master 2S /// Logitech MX Master
Software Linux Debian 12 x64 + MS Windows 10 Pro x64
Yep, I couldn't care less about the power usage of my PC during gaming. It's not like it's in game mode 24/7. In my country, with my gaming routine, it costs let's say maybe €15 per year.
But what really troubles me is the heat that comes from the GPU. So I never look at power draw in reviews, only heat.
 

dgianstefani

But what really troubles me is the heat that comes from the GPU. So I never look at power draw in reviews, only heat.
Lol.
You do realise the issue with that sentence, right?
 
Joined
Jun 7, 2023
Lol.
You do realise the issue with that sentence, right?
Sorry, English is not my native language. I know that more power = more heat; what I meant to say is that I don't care about the electric bill at the end of the month :)
 
Joined
Apr 14, 2022
Messages
745 (0.78/day)
Location
London, UK
Processor AMD Ryzen 7 5800X3D
Motherboard ASUS B550M-Plus WiFi II
Cooling Noctua U12A chromax.black
Memory Corsair Vengeance 32GB 3600Mhz
Video Card(s) Palit RTX 4080 GameRock OC
Storage Samsung 970 Evo Plus 1TB + 980 Pro 2TB
Display(s) Acer Nitro XV271UM3B IPS 180Hz
Case Asus Prime AP201
Audio Device(s) Creative Gigaworks - Razer Blackshark V2 Pro
Power Supply Corsair SF750
Mouse Razer Viper
Keyboard Asus ROG Falchion
Software Windows 11 64bit
Interesting that in less demanding games, the % difference is much higher.

I knew that the 4080 was a bit more efficient in general, but I wasn't aware of how much the efficiency gap changes when utilisation isn't 100%.

View attachment 304225
View attachment 304226


Weren't the NVIDIA GPUs the ones that had the driver-overhead issue? Meaning that they tend to need more CPU resources in order to perform at their best, while the Radeons were capable of delivering most of their top performance without relying on CPU speed.

So, in CS:GO, the Radeon delivers 350 fps at 213 W and 67% GPU usage, at unknown CPU usage,
while the 4080 delivers the same performance at 65 W and 32% GPU usage, also at unknown CPU usage.
It's obvious that the 4080 uses the CPU far more than the Radeon. But no matter which CPU, most of them consume far less power than a GPU.
So it makes sense that the total power draw is that much lower on the 4080 system.
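Turning those quoted CS:GO numbers into frames per watt makes the gap concrete (GPU power only, as reported; CPU draw is unknown, so this is only part of the system picture):

```python
def fps_per_watt(fps, watts):
    """Frames delivered per watt of reported GPU power."""
    return fps / watts

radeon = fps_per_watt(350, 213)    # 7900 XTX: ~1.6 fps/W at 67% usage
geforce = fps_per_watt(350, 65)    # 4080:     ~5.4 fps/W at 32% usage
print(round(radeon, 2), round(geforce, 2), round(geforce / radeon, 1))
```

At this partial load the 4080 comes out roughly 3x ahead per GPU watt, which is exactly the low-utilisation behaviour the Optimum Tech video highlights.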

 

wolf

Better Than Native
Joined
May 7, 2007
Messages
8,175 (1.27/day)
System Name MightyX
Processor Ryzen 9800X3D
Motherboard Gigabyte X650I AX
Cooling Scythe Fuma 2
Memory 32GB DDR5 6000 CL30
Video Card(s) Asus TUF RTX3080 Deshrouded
Storage WD Black SN850X 2TB
Display(s) LG 42C2 4K OLED
Case Coolermaster NR200P
Audio Device(s) LG SN5Y / Focal Clear
Power Supply Corsair SF750 Platinum
Mouse Corsair Dark Core RBG Pro SE
Keyboard Glorious GMMK Compact w/pudding
VR HMD Meta Quest 3
Software case populated with Artic P12's
Benchmark Scores 4k120 OLED Gsync bliss
@nguyen It seems like the solution is to plug your ears and ignore evidence to the contrary. Oh well, all the other readers can see the evidence and make up their own minds. Some of the regular crop seem to agree, despite it being factually incorrect, or at the very least debatable.

Then there's the matter of the RTX A2000. Not that it's a consumer card, but it was 'popular' with the SFF crowd as the most powerful low-profile GPU for gaming. It's Ampere, on Samsung 8 nm, and it blows RDNA2 right out of the water in efficiency. I'd be curious to see if RDNA2 (or CDNA2?) has cards that punch as efficiently as this. My 2c: it counts. Although I suspect team Red won't agree.
 
Joined
Jun 14, 2020
Messages
3,460 (2.13/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
It's becoming increasingly hard to take this joker staff member seriously. The bias is flowing freely. The pattern is clear.

In short, it is disgusting.

/thread
That's unwarranted. The guy just posted a review, from a reputable reviewer no less. Ali from Optimum Tech is a very, very good reviewer.

"Why aren't we seeing posts about the colossal difference between Raptor Lake and Zen?"
Because it doesn't exist

Umm, I don't know how to say this without making you look a bit ridiculous but....

It's not huge, but it's definitely an uncontested lead and it's also pretty recent. No biggie though, I had the same attitude towards it then that I do today.... "Who the hell cares?"

If people truly cared about efficiency and weren't just paying it empty lip service, then nobody would've bought any of those insane Raptor Lake CPUs, but they did. There's nothing more ludicrous than someone complaining about the power use of a Radeon card while their rig is sporting an i9-13900K/S or i7-13700K with a 360 mm AIO. :roll:
That graph is power draw, not efficiency. Both the 3080 and the 3090 were actually faster in that game, so it makes sense that they draw more power. Furthermore, he was measuring total system power draw, which means that when the 3090 is delivering 210 fps vs 175 fps on the 6800 XT, the whole system is pulling more power because it's working harder. Even then, the 3090 is at 0.39 fps/W and the 6800 XT at 0.36 fps/W. The 3090 is more efficient, and the difference would have been bigger if he'd measured GPU power draw instead of total system draw.
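Working backwards from those fps-per-watt figures gives the implied total system draw. This is my own derivation from the numbers in the post, not measured values:

```python
# fps and fps/W as stated above (total system power, not GPU-only)
fps_3090, eff_3090 = 210, 0.39
fps_6800xt, eff_6800xt = 175, 0.36

watts_3090 = fps_3090 / eff_3090        # implied ~538 W total system draw
watts_6800xt = fps_6800xt / eff_6800xt  # implied ~486 W total system draw

# the 3090 system delivers ~8% more frames per watt
print(round(watts_3090), round(watts_6800xt), round(eff_3090 / eff_6800xt, 2))
```

Because the rest of the system is a fixed overhead in both measurements, the GPU-only efficiency gap would be larger than that 8%.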
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,571 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
I don't think anyone ever picks a 4080 over a 7900 XTX because it's 10% more efficient or whatever and is therefore a "better quality product". Sorry, I don't believe it and I won't ever believe it. Real consumers simply do not care about that; these are made-up reasons proposed by forum dwellers to argue over which brand is better.

Efficiency, no, but pure power draw (including transients, which aren't as bad on the 7900 as on the 6900) is definitely a thing to consider, because it dictates what PSU you have to have. A decent 650 W unit is much cheaper than a decent 850 W unit. The 6950s are coming down to decent prices, even to the point where I'm tempted, but I'd have to spend like €200 more just on a PSU. Better efficiency means more performance for less total money spent. Sure, if you have a good 1 kW+ unit there are no problems, but most people don't have one.

ANYWAY, I'd love for w1z to do V-sync 60 Hz power draw testing at 1440p and 4K in addition to 1080p, especially for the higher-end cards.
 
Joined
Feb 14, 2012
Messages
1,846 (0.40/day)
Location
Romania
ANYWAY, I'd love for w1z to do Vsync 60Hz power draw testing
That would be a cool article. Maybe even do it as a separate benchmark to the usual hardware review.
but I'd have to spend like €200 more just on a PSU

This was my major turn-off in the 4070 vs 6950 debate. I went for the former because I did not need to replace my 5-year-old 650 W Seasonic. I would have gotten a 6950 to test, but I only have two 8-pin cables. So the 6950 was about $100 more expensive, and I would have had to add at least a $150 850 W PSU to the mix just to get 15% more performance and 4 GB more VRAM. Time will tell if I made the right call with "only" 12 GB of VRAM.
 

Frick

That would be a cool article. Maybe even do it as a separate benchmark to the usual hardware review.

I want it in all reviews. It may not matter much on, say, a 4060 or the upcoming AMD lower-end cards, but for something like a 4080/4090 or a 7900 XT it would definitely be interesting to see. Maybe even 1080p/144 FPS and 1440p/144 FPS.
 

dgianstefani

We already do 60 Hz V-sync testing in GPU reviews.

 

wolf

Joined
Sep 23, 2022
Messages
1,288 (1.62/day)
And just for reference from the 60Hz testing:

V-Sync: If you don't need the highest framerate and want to conserve power, running at 60 FPS is a good option. In this test, we run Cyberpunk 2077 at 1920x1080, capped to 60 FPS. This test is also useful for testing a graphics card's ability to react to situations with only low power requirements. For graphics cards that can't reach 60 FPS at 1080p, we report the power draw at the highest achievable frame rate.
 

dgianstefani

Well, it's what I've been trying to say, lol: these OT results aren't exactly out of line with what we already know; it's just that TPU doesn't test low load at framerates higher than 60 (yet).

What an interesting result: half of bloody Turing is more efficient than RDNA2 here.
More efficient than RDNA3 you mean ;).

I'd also like to point out that people frequently ask me "why do you always recommend the 4060/Ti over the 6700 XT" or say "muh 8 GB VRAM" (how's that working out now that the 16 GB variant shows no performance difference?). This kind of chart is a big reason.

Twice or even three times the energy efficiency in frame-capped games (of which there are many locked to 60 FPS, e.g. Bethesda games and certain console ports), or in other low-load states, isn't irrelevant. People compare the options on 100% load efficiency and raw raster performance, but there are many more states the GPU will be in, often more frequently. Another thing to remember is that TPU tests with maxed-out CPU hardware; most people aren't rocking a 13900K, and I doubt their GPU is pegged at 100% all the time.

It's a whole other factor if you want to start comparing RT efficiency too, due to the dedicated hardware used in NV cards.

The 4070 and 6900 XT are roughly comparable in both price and performance: the 6900 XT is 5-10% faster in raster, but 20% slower in RT.

Yet one peaks at 636 W, 130 W higher than a 4090, while the other peaks at 235 W.
power-spikes.png
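To put rough numbers on why low-load draw matters, here's a back-of-the-envelope sketch; the wattages and electricity price below are placeholder assumptions for illustration, not measured figures for any specific card:

```python
# Back-of-the-envelope: frames-per-watt and yearly energy cost at a 60 FPS cap.
# All wattages and the kWh price are illustrative placeholders.

def frames_per_watt(fps: float, watts: float) -> float:
    """Efficiency metric at a fixed frame cap."""
    return fps / watts

def yearly_cost(watts: float, hours_per_day: float = 3.0,
                price_per_kwh: float = 0.30) -> float:
    """Energy cost for a year of gaming at a given sustained draw."""
    kwh = watts / 1000 * hours_per_day * 365
    return kwh * price_per_kwh

card_a = 80.0   # hypothetical draw (W) at a 60 FPS cap
card_b = 200.0  # hypothetical draw (W) at the same cap

print(frames_per_watt(60, card_a))                  # 0.75 frames/W
print(frames_per_watt(60, card_b))                  # 0.3 frames/W
print(yearly_cost(card_b) - yearly_cost(card_a))    # 39.42 (currency units/year)
```

The bill difference is modest, but the same 120 W gap is also 120 W of extra heat dumped into the case and the room, every hour of play.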
 
Last edited:
Joined
Nov 11, 2016
Messages
3,412 (1.16/day)
System Name The de-ploughminator Mk-III
Processor 9800X3D
Motherboard Gigabyte X870E Aorus Master
Cooling DeepCool AK620
Memory 2x32GB G.SKill 6400MT Cas32
Video Card(s) Asus RTX4090 TUF
Storage 4TB Samsung 990 Pro
Display(s) 48" LG OLED C4
Case Corsair 5000D Air
Audio Device(s) KEF LSX II LT speakers + KEF KC62 Subwoofer
Power Supply Corsair HX850
Mouse Razor Death Adder v3
Keyboard Razor Huntsman V3 Pro TKL
Software win11

Yup, saving some money on the GPU only to spend more on the PSU and the electricity bill sounds good to some people, I guess ;).

I just built a mini PC for a friend with a 13900K + 4080, and the entire PC draws a maximum of 380 W at the wall (CP2077 and The Last of Us @ 4K Ultra settings). I'm pretty sure my friend wouldn't want to replace his aging AC just so he could use a 7900 XTX in his PC :roll: .
20230607_190749.jpg
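Worth remembering that essentially every watt a PC draws at the wall ends up as heat in the room, which is what the AC quip is about. A quick conversion to BTU/h, the unit air conditioners are rated in (the 380 W figure is from the build above; the 550 W comparison point is an assumed number, not a measured 7900 XTX system draw):

```python
# A PC's wall draw is, in steady state, its heat output into the room.
# Convert watts to BTU/h for comparison against AC capacity ratings.
W_TO_BTU_PER_H = 3.412  # 1 W is approximately 3.412 BTU/h

def heat_load_btu(watts: float) -> float:
    """Room heat load (BTU/h) from a PC drawing `watts` at the wall."""
    return watts * W_TO_BTU_PER_H

print(round(heat_load_btu(380)))  # 1297 BTU/h
print(round(heat_load_btu(550)))  # 1877 BTU/h (assumed higher-draw build)
```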
 
Joined
May 17, 2021
Messages
3,005 (2.33/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Perless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200Mhz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
I honestly don't think Radeons have been that bad over the last 2 gens. Seems like a bit of a cliché market to me.

As someone who owned an RX 5700, I laughed at this; it actually got much worse than the Polaris generation. The hardware is solid, the software is shit. This won't change, however much some people would like it to. You can save money going AMD, I did, but stop fooling yourselves and others.
 