
NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watt for Mobile/Laptop

I know it's about power consumption; you seem to be missing my point. It's not about modifying a graphics card for lower power, it's about making yourself look better by appearing to have a lower-power card when you don't. Do you not get that?

Makes no difference; chances are the people on the forum who have a 3090/Ti will buy a 4090/Ti and use whatever means to justify it while making out they care about high power use.
I see the opposite being a bit more likely: teenagers bragging about their new 3090 when daddy won't even buy them a 3050.
 
I know it's about power consumption; you seem to be missing my point. It's not about modifying a graphics card for lower power, it's about making yourself look better by appearing to have a lower-power card when you don't. Do you not get that?
I did understand you, and what you are saying is silly. It's not about putting false data in your specs or description so other people think you're an environmentalist. For me it's about tweaking a card that costs so damn much to lower its power consumption and performance at the same time because it's a power hog, even though you could have gone with a lower-tier card, saved a lot of cash, and been more environmentally friendly.
 
I'm not sold on upgrading my 3090 to a 4090 yet, but I do run a 1000 W PSU in my rig, my current case has room for a second PSU, and I happen to have a second 1000 W PSU new in the box sitting on a shelf. I should be safe.
 
I did understand you, and what you are saying is silly. It's not about putting false data in your specs or description so other people think you're an environmentalist. For me it's about tweaking a card that costs so damn much to lower its power consumption and performance at the same time because it's a power hog, even though you could have gone with a lower-tier card, saved a lot of cash, and been more environmentally friendly.

Well, to be fair, our hobby is not a really environmentally friendly one, is it? Think about all the power used to make our CPUs and GPUs; imagine how much power all the rigs on TPU would use combined. You can buy a high-power GPU and make it use less power, but why bother? If you are so bothered about using too much power, just don't use your PC at all and save lots. People are just hypocrites: buying high-end parts, then making them use less to justify it. Not me. My PC will use as much as it uses. Fuck the environment; we have fucked it anyway, and now we are trying to save the scraps of what is left before it's too late.

I don't drive or use trains, which both pollute. I bet 90% of people on TPU have a car, which pollutes far more than your PC. Maybe you should try not using that instead of crippling your GPU to make yourself feel better about buying such a power-guzzling GPU in the first place.

/unwatch
 
Well, to be fair, our hobby is not a really environmentally friendly one, is it? Think about all the power used to make our CPUs and GPUs; imagine how much power all the rigs on TPU would use combined. You can buy a high-power GPU and make it use less power, but why bother? If you are so bothered about using too much power, just don't use your PC at all and save lots. People are just hypocrites: buying high-end parts, then making them use less to justify it. Not me. My PC will use as much as it uses. Fuck the environment; we have fucked it anyway, and now we are trying to save the scraps of what is left before it's too late.

I don't drive or use trains, which both pollute. I bet 90% of people on TPU have a car, which pollutes far more than your PC. Maybe you should try not using that instead of crippling your GPU to make yourself feel better about buying such a power-guzzling GPU in the first place.

/unwatch
Nothing is nowadays, but that doesn't mean we should use excuses to drop the subject. If making CPUs has such an impact on the environment, we should at least hope their usage doesn't impact it as much. Just because you chose to be arrogant doesn't mean everyone should be. You pay for something and you should be able to use it as it is, without interfering with the product. Power has become a very delicate subject, surrounded by a variety of problems, and wasting it is unjustified; it can and already does cause a lot of problems and suffering. Considering your arrogance and the arguments you bring up about pollution, I'm not surprised you use this kind of wording: "don't buy it", "don't use it". So it's not worth saving the "scraps"?
If you wanted to buy a GPU but there were none left and never would be again, yet you could have a 'scrap' used one that would let you play games a bit more in your lifetime, wouldn't you buy it?
I disagree about trains and cars to some extent. In Norway these are electric, so they don't emit pollution directly, although that depends on where the electricity comes from: coal, or a renewable source like wind turbines or hydro, or fusion.
Chip a bit off the pollution emitted by every industry and you get a huge positive impact on the environment. Just because it is hard to accomplish, or some companies put excessive emphasis on a product's performance, doesn't mean we shouldn't try to educate those who say it's all lost at this point so we might as well not care at all.
 
I'm not sold on upgrading my 3090 to a 4090 yet, but I do run a 1000 W PSU in my rig, my current case has room for a second PSU, and I happen to have a second 1000 W PSU new in the box sitting on a shelf. I should be safe.
I run a 6500 XT with a 750 W PSU because the PSU doesn't spin its fan until it hits 300 W of consumption, so it stays passive within my system's (almost) entire performance range. Call me a silence freak. :roll:
 
I run a 6500 XT with a 750 W PSU because the PSU doesn't spin its fan until it hits 300 W of consumption, so it stays passive within my system's (almost) entire performance range. Call me a silence freak. :roll:
I run a 6900 XT with a 760 W PSU and never had a problem, but I get your point :D
 
Nothing is nowadays, but that doesn't mean we should use excuses to drop the subject. If making CPUs has such an impact on the environment, we should at least hope their usage doesn't impact it as much. Just because you chose to be arrogant doesn't mean everyone should be. You pay for something and you should be able to use it as it is, without interfering with the product. Power has become a very delicate subject, surrounded by a variety of problems, and wasting it is unjustified; it can and already does cause a lot of problems and suffering. Considering your arrogance and the arguments you bring up about pollution, I'm not surprised you use this kind of wording: "don't buy it", "don't use it". So it's not worth saving the "scraps"?
If you wanted to buy a GPU but there were none left and never would be again, yet you could have a 'scrap' used one that would let you play games a bit more in your lifetime, wouldn't you buy it?
I disagree about trains and cars to some extent. In Norway these are electric, so they don't emit pollution directly, although that depends on where the electricity comes from: coal, or a renewable source like wind turbines or hydro, or fusion.
Chip a bit off the pollution emitted by every industry and you get a huge positive impact on the environment. Just because it is hard to accomplish, or some companies put excessive emphasis on a product's performance, doesn't mean we shouldn't try to educate those who say it's all lost at this point so we might as well not care at all.
He is right about one thing, though: buying an expensive graphics card and then limiting its performance is a waste of money. ;)

I consider the subject of electric cars pure hypocrisy (you have to make those huge, polluting batteries from something, and then replace them every X years), but this is a subject worthy of another thread.
 
He is right about one thing, though: buying an expensive graphics card and then limiting its performance is a waste of money. ;)
Well, that is my takeaway from this subject, and I don't think it was his. It was him who stated 'just buy a 3090 and limit the power'. Why would you do that? Just buy a less power-hungry card.
The notion here is that companies tend to increase power consumption to satisfy performance hunger no matter the cost. That is what I disagree with, not with who buys what.
I consider the subject of electric cars pure hypocrisy (you have to make those huge, polluting batteries from something, and then replace them every X years), but this is a subject worthy of another thread.
True, but that is just an example; don't take it the wrong way.
The same goes for the disposal of GPU PCBs, but if the GPU consumes as much power as the rest of your home equipment, that is even worse.
 
"give it more juice" "give it more juice".

Anyone remember how, when Maxwell was out, Nvidia fans constantly bragged about power efficiency, especially with respect to AMD's Hawaii GPUs? Now, somehow, magically, efficiency is never even uttered by them, and some even argue it's not important at all.

AMD has already claimed that the efficiency of the new 7000 series is increased over the 6000 series, which is a good step, but with Nvidia cranking up the power consumption, it's very possible AMD may have to do the same to ensure a respectable level of competition, which is a shame. It's analogous to sports: if a sizable number of players are using performance-enhancing drugs, an athlete who isn't using them, and normally wouldn't, is highly incentivized to start. With the state of the world being what it is, we should be striving to use less power, but sadly, it's a well-documented phenomenon that technological gains in efficiency never coincide with an overall decrease in power consumption; it's called the "rebound effect".
Well, they could market the benefits of lower power; performance isn't the only metric, especially if the differential is less than 10%.

As an example these 2 choices assuming same price point.

Nvidia 4080 600W TDP 12 gigs VRAM 10% FPS advantage
AMD 7800XT 300W TDP 16 gigs VRAM

Guess which I would pick; the AMD is an absolute no-brainer. With Nvidia, it feels like they're designing their GPUs for the 500 FPS crowd, with maxed-out rendering and underspecced VRAM (remember their 500 fps/Hz video).
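Putting the hypothetical comparison above into figures (these numbers are invented for the sake of argument in the post, not real specs of any card):

```python
# Hypothetical cards from the comparison above -- invented numbers, not real specs.
nvidia_fps, nvidia_w = 110, 600   # "10% FPS advantage" over the AMD card, at 600 W
amd_fps,    amd_w    = 100, 300

nvidia_eff = nvidia_fps / nvidia_w   # ~0.18 FPS per watt
amd_eff    = amd_fps / amd_w         # ~0.33 FPS per watt
print(amd_eff / nvidia_eff)          # the hypothetical AMD card is ~1.8x as efficient
```

So a 10% frame-rate lead bought with double the power budget would cost nearly half the efficiency, which is the trade-off being argued here.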

AMD slogan "With us there is no need to replace your PSU".

The only reason I got my 3080 is that Nvidia is selling FE cards at MSRP in my country whilst AMD is not. It was a decision based on ££.

I think on the Intel thread someone said reviewers are to blame. I kind of agree, as most reviewers always treat performance as the deciding factor, and even treat performance differences of a few % as enough reason to buy a different product.
 
Well, that is my takeaway from this subject, and I don't think it was his. It was him who stated 'just buy a 3090 and limit the power'. Why would you do that? Just buy a less power-hungry card.
The notion here is that companies tend to increase power consumption to satisfy performance hunger no matter the cost. That is what I disagree with, not with who buys what.

True, but that is just an example; don't take it the wrong way.
The same goes for the disposal of GPU PCBs, but if the GPU consumes as much power as the rest of your home equipment, that is even worse.
I think @Tigger's point was that you could buy a 3090, run it at full power, but post on forums like this that you have a 3060 just to look good in front of the environmentalist crowd.

I agree with what you said, though. I also despise companies' notion of "increase performance at all costs". I don't even need all the power AMD and Nvidia say I need, because I'm still fine with 1080p and 30+ fps. That's why I downgraded from a 2070 to a 6500 XT. I had all the performance I needed, and more, but the fan noise was killing me.
 
It was him who stated 'just buy a 3090 and limit the power'

Show me where I said that. I have said many times that I run my GPU OC'd; it's others who are running 3090s with power limits, as I definitely would not.

I have said: buy a 3090 Ti, do not modify it in any way, run it at full power, balls out, which is a mile from that.

You totally did not understand my comment in the first place, so I will say it again: you have a high-power card, but you put a lower-powered card in your system specs so people think you are not a power hog and you can make out as if you care about power use. Got it now? I am not gonna say it again.
 
175 W will drain even the maximum-allowed capacity of a 99 Wh battery in about 15 minutes, ignoring the screen, the CPU, and all the inefficiencies created by such a rapid discharge.
If you ignore everything else, it would actually take about 34 minutes: 99 Wh x 60 min/h = 5940 W·min, and 5940 W·min / 175 W ~= 33.94 min. But yeah, taking all the other stuff into account, it would probably be closer to 15 minutes.
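The arithmetic above can be sketched quickly. The 99 Wh capacity and 175 W draw are the figures from the posts; the extra 75 W for CPU and screen is a made-up assumption purely for illustration:

```python
# Ideal (lossless) runtime of a battery at a constant power draw.
def runtime_minutes(capacity_wh: float, draw_w: float) -> float:
    return capacity_wh / draw_w * 60

print(round(runtime_minutes(99, 175), 2))  # GPU alone: 33.94 min
# Adding a hypothetical 75 W for CPU + screen (assumed figure, not from the post):
print(round(runtime_minutes(99, 250), 2))  # whole system: 23.76 min
```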
 
You totally did not understand my comment in the first place, so I will say it again: you have a high-power card, but you put a lower-powered card in your system specs so people think you are not a power hog and you can make out as if you care about power use. Got it now? I am not gonna say it again.
I did understand, but it does not make a difference. I'm talking about the actual card and how much power it needs. The power consumption does not change even when you put something else in your specs. It is irrelevant, and I don't understand why we are talking about this.
Either way, I might have misunderstood what you wrote earlier. My apologies.

I think @Tigger's point was that you could buy a 3090, run it at full power, but post on forums like this that you have a 3060 just to look good in front of the environmentalist crowd.

I agree with what you said, though. I also despise companies' notion of "increase performance at all costs". I don't even need all the power AMD and Nvidia say I need, because I'm still fine with 1080p and 30+ fps. That's why I downgraded from a 2070 to a 6500 XT. I had all the performance I needed, and more, but the fan noise was killing me.
For me that is not an option; I want 4K. I really dig 4K gameplay, and I literally would not want to go down even a notch in resolution. 1080p is simply a no-go for me ever again; 1440p if I have to, though it pains me. If I need some more FPS, I'd rather turn the detail down a bit. With my 6900 XT there are few games requiring a detail drop, and when they do, it is merely a little tweak.
 
If you ignore everything else, it would actually take about 34 minutes: 99 Wh x 60 min/h = 5940 W·min, and 5940 W·min / 175 W ~= 33.94 min. But yeah, taking all the other stuff into account, it would probably be closer to 15 minutes.
It's way more complex than that, and 15 minutes is probably hopelessly optimistic.

Battery capacity is stated at a specific discharge rate, called C, and the discharge efficiency isn't linear - the faster you discharge certain batteries, the more of the energy stored they waste as heat through internal resistance, and the faster the voltage drops to levels that render the laptop useless.

Battery chemistry, cell arrangement, and aggressive cooling can mitigate these issues to produce better discharge rates, but realistically a laptop battery offers none of these things and rapid discharges may yield significantly less than half the stated capacity. The 99Wh rating of a huge laptop battery is typically only achieved if it's trickled out slowly over the course of several hours. If I was forced to guess how many Watt-hours you'd get back from a 99Wh battery at 3C, I'd say maybe 20-30Wh from my experience with building rapid-discharge batteries from 18650 cells. That's a very vague guess though. It could be 10Wh, it could be 60Wh. Only the battery manufacturer would really know for sure without explicitly testing that scenario.
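One rough way to put the capacity derating described above into numbers is Peukert's law. Every figure below (pack voltage, rated discharge time, and especially the Peukert exponent) is an illustrative assumption, not a measured value for any real laptop battery:

```python
# Peukert's law as a rough model of capacity derating at high discharge rates:
#   t = H * (C / (I * H)) ** k
# H = rated discharge time (h), C = rated capacity (Ah), I = actual current (A),
# k = Peukert exponent (1.0 would be an ideal battery; the 1.15 used below is
#     an illustrative guess for a heavily stressed pack, not a datasheet value).
def peukert_runtime_h(capacity_ah, rated_h, current_a, k):
    return rated_h * (capacity_ah / (current_a * rated_h)) ** k

pack_v   = 11.1                  # assumed nominal pack voltage
capacity = 99 / pack_v           # ~8.9 Ah for a 99 Wh pack
current  = 175 / pack_v          # ~15.8 A at a constant 175 W draw
ideal_h  = capacity / current    # ~0.57 h, i.e. the ~34-minute figure above
harsh_h  = peukert_runtime_h(capacity, rated_h=5, current_a=current, k=1.15)
print(ideal_h * 60, harsh_h * 60)  # the derated runtime is noticeably shorter
```

With these assumed numbers the model lands somewhere between the ideal ~34 minutes and the pessimistic ~15-minute guess; the real answer depends entirely on the pack's chemistry and cooling, as the post says.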
 
I think @Tigger's point was that you could buy a 3090, run it at full power, but post on forums like this that you have a 3060 just to look good in front of the environmentalist crowd.

I agree with what you said, though. I also despise companies' notion of "increase performance at all costs". I don't even need all the power AMD and Nvidia say I need, because I'm still fine with 1080p and 30+ fps. That's why I downgraded from a 2070 to a 6500 XT. I had all the performance I needed, and more, but the fan noise was killing me.

Wow, someone actually understood my post.
 
I don't even need all the power AMD and Nvidia say I need, because I'm still fine with 1080p and 30+ fps. That's why I downgraded from a 2070 to a 6500 XT. I had all the performance I needed, and more, but the fan noise was killing me.
I might be in the same boat. Now that I've sold the mining rigs, the noise levels in my living room are very low again. The RTX 3060 I added to my HTPC in late 2021 means I can increase the graphics settings very subtly, but I have to underclock the heck out of it if I don't want to hear the fans.

Meanwhile, the sort of games I actually play on the HTPC are 100% fine on a Radeon RX 5500 8GB I have lying around and that thing is basically passively cooled until 100W and whisper quiet up to its 130W TDP. Perhaps I should swap back to it and sell the 3060 before Lovelace comes out....
 
I might be in the same boat. Now that I've sold the mining rigs, the noise levels in my living room are very low again. The RTX 3060 I added to my HTPC in late 2021 means I can increase the graphics settings very subtly, but I have to underclock the heck out of it if I don't want to hear the fans.

Meanwhile, the sort of games I actually play on the HTPC are 100% fine on a Radeon RX 5500 8GB I have lying around and that thing is basically passively cooled until 100W and whisper quiet up to its 130W TDP. Perhaps I should swap back to it and sell the 3060 before Lovelace comes out....
I'm thinking about selling the 2070 as well. For now, I think I'm keeping it just in case I need more performance before Lovelace / Arc / RDNA3 come out. I know its price will drop fast, but I don't look at PC hardware as an investment anyway. :ohwell:

I used to say to people "buy the fastest you can afford". Now I'm saying "don't buy more than what you need". As for me, I'm okay with a couple lost FPS (within reasonable boundaries), but I can't deal with noise anymore. Both my main PC and my living room HTPC are extremely silent, and my bedroom HTPC is passively cooled except for two very quiet case fans and the PSU. I just can't go back to big, bulky, noisy towers.
 
This sounds like porkies. So you're saying you cut 110 W of power draw from your 3080 Ti yet you only lose 1-5%, or even gain performance, in some titles?

Be honest with yourself. You're probably losing more like 5-10% perf, and in some titles your 250 W 3080 Ti performs closer to a 3080.
Depends on the game; in some titles I can save 70 W, in others 120 W, but the focus was on the same performance with much better consumption/temperatures/noise.
 

Attachments

  • GoW_2022_06_25_01_09_04_426.jpg (3.2 MB)
  • GoW_2022_06_25_01_08_47_737.jpg (2.9 MB)
  • GoW_2022_06_25_01_18_21_617.jpg (4 MB)
  • GoW_2022_06_25_01_18_35_864.jpg (4 MB)
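As a rough illustration of the trade-off being debated (all figures assumed round numbers, not measurements from anyone's card): even losing ~5% performance while shaving 100 W still leaves the undervolted card well ahead on efficiency.

```python
# Assumed round figures for illustration: a stock 3080 Ti vs an undervolted profile.
stock_fps, stock_w = 100, 350   # baseline, normalized to 100 fps
uv_fps,    uv_w    = 95,  250   # ~5% performance lost, 100 W saved

print(uv_fps / uv_w / (stock_fps / stock_w))  # 1.33x better FPS per watt
```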
This is ridiculous. 800 W for a card... maybe it can come with a "vacuum cleaner" and its own air conditioning too? :roll:
 
So... this is kind of fun. Reading through, I get the people whining that this is an environmental travesty, those suggesting that they know undervolting performance results, and the people with enough member berries to make both Nvidiots and AMDelusionals have a pride festival.

It's actually quite fun.


That said, I'll offer the common-sense situation. Right now, GPU miners are flooding the market with 30xx and 60xx series cards, both of which have significant inventories to work through and, despite their production volumes, are still going strong with even recent refreshes. Great.
If the current-gen cards offer 80% of the performance at 60% of the price, with 60% of the power draw, then why are you upgrading? That additional 20% in performance comes at a disproportionate increase in cost. Not ideal. Likewise, the AMDelusionals are claiming the next-gen cards from their team are going to be better and cure cancer: the same promises as every single generation of cards. That's not member berries; it's no longer falling for the hype train.


I subscribe to neither AMD nor Nvidia. I buy the most reasonable product for the price. Right now I can get a card that was used for mining at half the price of a new one. Every day that number seems to be falling, and even the retailers are offering their token "sales" before the official markdowns as they try to cope with the bubble of demand evaporating. It's my hope that when the 40xx series rolls out, AMD will cut costs, the 30xx series will be "old tech", and I can buy a 3070 for less than the cost of a new console. It'll likely drive years of dual-monitor gaming, assuming a 6800 isn't positioned better in the price-to-performance ratio. If it is, let's go AMD. Spending all of this energy concerning ourselves with rumors that evolve daily only serves to drive hype and tension for a new series of GPUs from a company that lied about its dependency on miners. Neither AMD nor Nvidia are angels, so maybe let's treat them like the profit-driven entities they are. Let's buy whatever is best, divorced from irrational loyalties.



Alternatively, let's remember that Sega does what Nintendon't. That little cringe should tell everyone that bifurcating a group along idiotic lines doesn't lead to better things; it leads to the "ugly Sonic" logic that drove Sega into the ground and killed the Dreamcast. It gave us the Wii U, driven by a tablet which couldn't be used to play the games separately (we had to wait for the Switch for that). If you don't get it, let me be the "old man" for a moment: Microsoft figured out the future of Xbox was as a rental outfit, not as a seller of consoles. Nintendo is in the midst of selling you the same game on three separate virtual consoles, retailing for $15 thirty years after it came out. This is business, and when we lose sight of that we can be taken advantage of.
 
If you have an older CPU, anything from an i9-9900K down, your new RTX 4080 will overpower the CPU, so going this route you will have no choice but to downclock your GPU.
I have an RTX 3080 Ti Founders Edition, and I had to downclock the GPU by 10% running an i9-9900K, otherwise the frame rate would be very low (CPU and GPU out of sync).
It will be a problem for people who don't know this when upgrading to the 4080. The only other solution is to get the latest CPU, motherboard, power supply, and memory $$$$$$.
 
If you have an older CPU, anything from an i9-9900K down, your new RTX 4080 will overpower the CPU, so going this route you will have no choice but to downclock your GPU.
I have an RTX 3080 Ti Founders Edition, and I had to downclock the GPU by 10% running an i9-9900K, otherwise the frame rate would be very low (CPU and GPU out of sync).
It will be a problem for people who don't know this when upgrading to the 4080. The only other solution is to get the latest CPU, motherboard, power supply, and memory $$$$$$.
What you say makes no sense, especially since you don't even state your gaming resolution. Most people know that a CPU bottleneck may appear at low res when you want very high FPS. I play with a 9900K and RTX 3080 @ 1440p, and most of the time I'm still GPU-limited since I turn the eye candy up to the max.
 
You are running an RTX 3080, not an RTX 3080 Ti Founders Edition.
As to the resolution, I am running 2 x 27" monitors @ 1440p. Having just upgraded to the new GPU, at stock speed the fps was 30 and the developer mode showed CPU limited.
I reduced the GPU by 10% as stated before, and now it's 60 fps.
I should have said I'm running Microsoft Flight Simulator 2020, all maxed out.
 
You are running an RTX 3080, not an RTX 3080 Ti Founders Edition.
As to the resolution, I am running 2 x 27" monitors @ 1440p. Having just upgraded to the new GPU, at stock speed the fps was 30 and the developer mode showed CPU limited.
I reduced the GPU by 10% as stated before, and now it's 60 fps.
I should have said I'm running Microsoft Flight Simulator 2020, all maxed out.
You say that you lowered the 3080 Ti's capabilities with lower voltage and frequency (I assume that is what you meant by downgrading), and somehow you have more FPS? That makes no sense at all.
 