
4080 vs 7900XTX power consumption - Optimum Tech

The more tech outlets appear over the years (tech tubers and the like), the worse Radeon's market share gets, so I guess people are getting smarter?

Radeon is reaching an all-time low in market share now.
View attachment 305420

Yes, this is quite sad.

It's a damn shame to see graphs like that when
I honestly don't think Radeons have been that bad over the last two gens. Seems like a bit more of a cliché market to me.

AMD have been kicking great goals in their CPU division. I hope the same thing can be said about their GPU division going forward.
 
Interesting that in less demanding games, the % difference is much higher.

I knew that the 4080 was a bit more efficient in general, but wasn't aware of how the efficiency changes when utilization isn't 100%.

View attachment 304225
View attachment 304226
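One way to frame that observation: at a frame cap or in a lightly loaded game, both cards deliver the same frames, so the comparison collapses to energy per frame, and any baseline power a card can't shed at partial load shows up directly in the percentage gap. A minimal sketch of that arithmetic in Python, with made-up wattages rather than figures from the video:

```python
# Joules per frame at a fixed frame cap: board power divided by frames per second.
# The wattage figures below are illustrative placeholders, not numbers from the video.
def joules_per_frame(board_power_w: float, fps: float) -> float:
    return board_power_w / fps

capped_fps = 120  # same frame cap on both cards
card_a = joules_per_frame(90.0, capped_fps)   # hypothetical card A: 90 W at the cap
card_b = joules_per_frame(140.0, capped_fps)  # hypothetical card B: 140 W at the cap

print(f"Card A: {card_a:.2f} J/frame, Card B: {card_b:.2f} J/frame")
print(f"Card B uses {(card_b / card_a - 1) * 100:.0f}% more energy per frame at the same cap")
```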

This entirely comes off like the difference you get in the Nvidia control panel with "Prefer maximum performance" turned on.


Even if the testing is valid, these charts are really misleading because they'll be shared thousands of times online with no mention of which GPUs they actually are, as comments on page 1 mentioned.
For all we know, this custom OC'd card is the cause of the higher wattage behaviour as a design choice exclusive to it.
 
This entirely comes off like the difference you get in the Nvidia control panel with "Prefer maximum performance" turned on.


Even if the testing is valid, these charts are really misleading because they'll be shared thousands of times online with no mention of which GPUs they actually are, as comments on page 1 mentioned.
For all we know, this custom OC'd card is the cause of the higher wattage behaviour as a design choice exclusive to it.

Using 7900XTX MBA would be worse because no one is selling those anymore.

Results would be exactly the same with an Asus TUF 4080 vs an Asus TUF 7900XTX.
 
The last time I remember AMD having an uncontested lead in performance per watt was with the HD 5000 series over 14 years ago.

Weren't RDNA1 and 2 both better performance per watt than 2000 and 3000 series?
 
Weren't RDNA1 and 2 both better performance per watt than 2000 and 3000 series?
They were lower wattage, but also lower performance, so not really.

From the latest GPU review, tested with the 2023.2 bench.

1689765549889.png


The 6800 is in a good spot since it's lower clocked, but the rest are the same as or worse than the equivalent NVIDIA cards.

It's more impressive when you consider that RDNA2 was on TSMC 7 nm and Ampere was on Samsung 8 (10 nm).
 
It's more impressive when you consider that RDNA2 was on TSMC 7 nm and Ampere was on Samsung 8 (10 nm).
It was so close between models of the two archs (RDNA2 vs Ampere), trading blows to the point that it amounts to a tie. And yeah, absolutely, it shows what I said before: Ampere was an efficient arch on an inferior node. The side effects of that are biting consumers now, as the outright perf leap from Ampere to Ada has Nvidia rubbing their hands so hard that they bumped several SKU names down to a die one tier lower, i.e. a 4050 masquerading as a 4060, but I digress.
 
I wasn't complaining, and that's a single game (sample size one) on a GPU that simply isn't as powerful as the ones you're comparing them to. That's a situational small lead due to a more stringent power limit on a game known to make very good utilization of its architecture. Break out the 6950 XT and let's see that grand accomplishment (not) evaporate.
I wasn't talking about you. Hell, I don't know what CPU you have. You could have a Ryzen 5 3600X for all I know! :laugh:

Seriously though, I was just speaking figuratively so that anyone who is complaining and has a Raptor Lake CPU might stop and give their own heads a shake.
IMHO you measure efficiency from a normalized performance standpoint across a large sample size. A decisive lead? Radeon HD 5970 (dual fully enabled Cypress XT, with two cores equivalent to HD 5870s) used less power than one GTX 480 on average. That's what I mean by "decisive lead".
Hey, I wasn't trying to be insulting, I was just pointing out that RDNA2 was more efficient than Ampere. The word you used was "uncontested" and, as I said, the lead wasn't big, but it was uncontested. I was trying to figure out how to point out RDNA2's advantage over Ampere without coming across as arrogant or condescending, because you're a pretty decent human being. I posted the chart because, you know me, I always back up statements and never expect people to just take my word for things. That's why I said I didn't know how to do it without making you look bad... that wasn't what I was trying to do, but...

I agree with you about Evergreen vs. Fermi (although Fermi was the very definition of a hot mess). I still have an HD 5870. ;)
Weren't RDNA1 and 2 both better performance per watt than 2000 and 3000 series?
I know that RDNA2 was better than Ampere but I don't remember if RDNA was better than Turing.
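For what it's worth, the "normalized performance across a large sample size" approach described above can be sketched like this. Everything here is illustrative (made-up per-game numbers, generic card names); the idea is just geometric-mean relative performance divided by relative average power:

```python
from statistics import geometric_mean, mean

# Illustrative per-game data: (fps_card_a, fps_card_b, watts_card_a, watts_card_b).
# These numbers are placeholders, not review data.
games = [
    (120, 132, 230, 310),
    (95, 110, 225, 320),
    (60, 66, 240, 330),
]

# Normalize performance per game (card B relative to card A), then take the
# geometric mean so no single title dominates the average.
rel_perf = geometric_mean(fps_b / fps_a for fps_a, fps_b, _, _ in games)
rel_power = mean(w_b for *_, w_b in games) / mean(w_a for _, _, w_a, _ in games)

print(f"Card B: {rel_perf:.2f}x the performance at {rel_power:.2f}x the power")
print(f"Relative perf/W: {rel_perf / rel_power:.2f}x")
```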
 
Oh well.

Riled up emotions I see.

:D

Would be fun to see a member with both a 7900XTX and a 4080 do similar testing.

Apparently sharing some results and commenting is a crime though, so maybe not.
It's not a crime per se, you just have to be careful not to paint AMD in a bad light, that's all ;)

It's ok to call Nvidia "Ngreedia", Apple customers "sheep" (guilty as charged, a few times), burn Intel, Microsoft or Google at the stake for being monopolies. But AMD? AMD is the Internet's darling, you're not supposed to go there.

(And just to be clear, I'm not saying the original post should be taken at face value (quite the opposite), I'm just pointing out the "my preciousss" reactions it sparked in a knee-jerk manner.)
 
I was just pointing out that RDNA2 was more efficient than Ampere.
And what do you make of the several posts since yours with additional data that is at odds with your assertion?
 
So your counter to a performance-per-watt argument is just a power consumption chart? Shouldn't you include the performance chart too?
What are you talking about? I was reminding Dr. Dro that RDNA2 was, on average, more efficient than Ampere because he said that he didn't remember Radeon being more efficient than GeForce since Evergreen vs. Fermi.
Cause here it is
View attachment 305401

The 3090 is 20% faster than the 6800XT, yet the total power consumption is only 11% more. Doesn't that mean the 3090 is better in performance per watt?

3090: 0.39 frames per watt
6800XT: 0.36 frames per watt
Nice cherry pick, but smart people don't post the power consumption in single games because, just like performance, numbers vary from game to game. They post the system average power consumption across ALL applications.

I really don't understand the point of your post. You posted something ridiculous because... you just felt like arguing? In any case, you haven't accomplished anything except to demonstrate that you don't know how this works.
 
There is evidently a disconnect here where some people think "wow that power consumption looks less than ideal and will add cost to the life of the card" vs the people actually buying these high-end cards, who have their electric bill on autopay and don't even think twice about it. It's great to have diversity of thought from users all around the world and from all walks of life, but when users are looking at products just to compare them, and will never actually purchase them, it creates a disconnect between real usage and what looks good to someone who will never own that product.
 
What are you talking about? I was reminding Dr. Dro that RDNA2 was, on average, more efficient than Ampere because he said that he didn't remember Radeon being more efficient than GeForce since Evergreen vs. Fermi.

Nice cherry pick, but smart people don't post the power consumption in single games because, just like performance, numbers vary from game to game. They post the system average power consumption across ALL applications.

I really don't understand the point of your post. You posted something ridiculous because... you just felt like arguing? In any case, you haven't accomplished anything except to demonstrate that you don't know how this works.

Efficiency is measured by perf/watt; what is the point of showing power consumption without the performance?

And if you had actually read the chart you provided, it says total power consumption running Doom Eternal at 4K, not average power consumption across all applications :roll:. Here, let me mark it for you:
Power.png
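For reference, the frames-per-watt arithmetic being argued over can be checked from the two ratios quoted above alone; a minimal sketch (the 20% and 11% figures come from the post being quoted, nothing else is assumed):

```python
# Relative perf/W from the two ratios quoted in the thread:
# "20% faster" and "11% more total power" (Doom Eternal 4K, a single game).
perf_ratio = 1.20   # 3090 performance relative to the 6800 XT
power_ratio = 1.11  # 3090 total power relative to the 6800 XT

rel_efficiency = perf_ratio / power_ratio
print(f"3090 perf/W relative to 6800 XT: {rel_efficiency:.2f}x")
# ~1.08x, consistent with the 0.39 vs 0.36 frames-per-watt figures above,
# but still a single-game data point, which is the whole disagreement here.
```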
 
It's not a crime per se, you just have to be careful not to paint AMD in a bad light, that's all ;)

It's ok to call Nvidia "Ngreedia", Apple customers "sheep" (guilty as charged, a few times), burn Intel, Microsoft or Google at the stake for being monopolies. But AMD? AMD is the Internet's darling, you're not supposed to go there.

(And just to be clear, I'm not saying the original post should be taken at face value (quite the opposite), I'm just pointing out the "my preciousss" reactions it sparked in a knee-jerk manner.)
I personally had no problem with the content of the post, I just didn't understand the point of going over it. It's like, yeah, the Radeons use more juice... we already know that, so what is the point of this?

It would be like if he had posted something about the power consumption of Raptor Lake; I would have said the same thing... "Yeah, we know, who cares?" And the efficiency difference between Raptor Lake and Zen4 is far larger.

It just feels like fanboy propaganda, and there's no place for that in a reputable tech forum like TechPowerUp. It has nothing to do with the fact that it's targeting AMD; I'd feel the same way about Raptor Lake.

The only reason I recommend Radeons to people is that they're less expensive, and the only nVidia cards that aren't complete rip-offs are the 3000-series cards. But the only card below the RTX 3080 with more than 8GB of VRAM is the RTX 3060, which makes the RTX 3060 Ti / RTX 3070 / RTX 3070 Ti impossible to recommend. When someone has a budget below the price of the RTX 3080 but wants to do content creation in Adobe Premiere, I always recommend the RTX 3060 for its 12GB of VRAM. If they want to do content creation in DaVinci Resolve, I recommend the RX 6700 XT for the same reason. Right now, buying nVidia is a mistake unless you're buying an RTX 4090 or you're in an area with limited options. That's nVidia's fault, nobody else's. Now, sure, the Ada Lovelace cards are more power efficient, and that is not debatable, but their pricing is so bad that it doesn't matter. Again, that's nVidia's fault and nobody else's.

Personally, I only use Radeons because I saw the kind of company nVidia is when I was on the other side of the counter at Tiger Direct, and they've never presented me with a compelling enough reason to pay extra for their hardware. The thing is, if someone says they specifically want nVidia, I present them with their best nVidia option and mention that if they're only gaming, a Radeon card will be a better deal for them. I don't say it because I want to push an agenda; I say it because it's objectively true in most cases.

If nVidia was offering a better deal, I'd recommend them because I care about individual people far more than I could ever care about multi-billion-dollar corporations. The problem is that they're not offering a better deal and that's not just my opinion, it's the opinion of people like Steve Burke, Steve Walton, Jarred Walton, Paul('s Hardware) and Daniel Owen.

Their opinions count for a lot more than those of some person in a tech forum whom I've never heard of, especially when they all agree on something.
 
They even cover exactly that; yeah, it's a wash.

I finished Doom Eternal with RT on too (at the highest difficulty, no less); that performance-per-watt chart would look very different with RT on :laugh:.

But HUB would never ever test efficiency with RT enabled ;)
 
wow that power consumption looks less than ideal and will add cost to the life of the card
I don't know what you mean by that; whether it's 50W more or less, it's not going to make any difference to the lifespan of these things.
 
I don't know what you mean by that; whether it's 50W more or less, it's not going to make any difference to the lifespan of these things.
Lifespan does not equal cost over the life of the card.

Cute pic. I may have to adopt an anime girl pfp lol
 
Lifespan does not equal cost over the life of the card.
Oh you meant added electricity cost.

Yeah, obviously the people who buy these things are not gonna care about the difference 50W or whatever is going to make to the electricity bill. Or anyone, really.
 
Oh you meant added electricity cost.

Yeah, obviously the people who buy these things are not gonna care about the difference 50W or whatever is going to make to the electricity bill. Or anyone, really.
A lot of people not buying the cards really care for some odd reason; case in point, this thread :laugh:
 
A lot of people not buying the cards really care for some odd reason; case in point, this thread :laugh:

Sure, let's say "made up reasons just so they can claim X is so much better than Y".
 
when users are looking at products just to compare them, and will never actually purchase them
We're never going to purchase them because they don't satisfy our criteria for being power-efficient. Why is this so difficult for you cretins to understand? Why do you continue to attempt to disregard this argument as if it's irrelevant? Is it because you're intellectually stunted and only capable of fanboyism?
 
Oh you meant added electricity cost.

Yeah, obviously the people who buy these things are not gonna care about the difference 50W or whatever is going to make to the electricity bill. Or anyone, really.
Not that I'm a reference or anything, but I do tend to buy components with lower power draw. They're easier to keep cool and quiet (yeah, I know) using cheaper fans and heatsinks.
My systems are silent enough that in a few cases I've had friends go "oh, you turned it on already?"

Also worth noting: in the past, the highest TDP on a GPU was much lower. Today, if you're looking at a 100W difference, that's two thirds of what a mid-range card used to draw. Or like 20 LED bulbs turned on at the same time...
 
We're never going to purchase them because they don't satisfy our criteria for being power-efficient. Why is this so difficult for you cretins to understand? Why do you continue to attempt to disregard this argument as if it's irrelevant? Is it because you're intellectually stunted and only capable of fanboyism?
Roughly $75 more per year in power costs isn't a factor for people buying a GPU in this price range. It's an afterthought; it is not relevant.

Resorting to insults is low.
 
Paying thousands to play computer games but scrounging for pennies when paying the electricity bill.

Clearly some people got their priorities wrong.
 
Oh you meant added electricity cost.

Yeah, obviously the people who buy these things are not gonna care about the difference 50W or whatever is going to make to the electricity bill. Or anyone, really.
I took a look at the Wattage Calculator at sust-it.net and chose the UK price cap for July 2023 (because I know that prices in the UK are insane and this is the most current). I set it to 50W for 4 hours per day x 365 days (1,460 hours) and got back £21.90. So, if the card is maxed out for 4 hours per day, every single day of the year, that is the cost increase over the year.

Here in Canada (at least for the province in which I live), depending on the time of day used, that extra 50W would cost between $5.99 and $12.41 per year for the same number of hours.

Again, this is assuming that the card is doing something that maxes it out, like running benchmarks for 1,460 hours per year. For gaming, it would probably be between fifty and eighty percent of the prices shown, assuming that people were gaming for 4 hours every single day (which is also unrealistic).
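The arithmetic behind those figures is straightforward: extra watts times hours gives kWh, and kWh times the tariff gives the cost. The £21.90 result implies a rate of roughly £0.30/kWh (21.90 over 73 kWh), which lines up with the July 2023 UK cap mentioned above. A minimal sketch with the tariff left as an input:

```python
def annual_energy_cost(extra_watts: float, hours_per_day: float, price_per_kwh: float) -> float:
    """Yearly cost of an extra power draw sustained for the given hours every day."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 50 W extra for 4 h/day at ~0.30 GBP/kWh (the rate implied by the figure above).
print(f"UK example: GBP {annual_energy_cost(50, 4, 0.30):.2f} per year")  # -> GBP 21.90
```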

This is why I say "Who cares?" when it comes to GPU power consumption.

Now, CPU power consumption is a different story, but not because of electricity prices. GPUs sit on video cards that come with coolers designed for them, but a hot-running CPU can require a high-performance cooling solution that easily costs ~$100 on top of the CPU itself, increasing the total cost dramatically. Some people use this to justify buying an AIO, but that can be a whole other can of worms in itself.
 