Monday, June 6th 2022

NVIDIA RTX 4080 Rumored To Feature 420 W TDP

The upcoming generation of NVIDIA graphics cards looks set to feature significantly higher power budgets than its predecessors, according to a recent claim from leaker kopite7kimi. The RTX 4090 has been rumored to feature a TDP above 400 W for some time, and this latest leak indicates that the RTX 4080 may also ship with an increased power requirement of 420 W. This RTX 4080 (PG139-SKU360) would represent an increase of 100 W over the RTX 3080, with power rises also expected for the RTX 4070 and RTX 4060. The RTX 4070 could see a power budget as high as 400 W if NVIDIA chooses to use GDDR6X memory for the card, while the RTX 4060 is rumored to see a 50 W increase to at least 220 W. The preliminary rumors indicate a launch date for these cards in late 2022.
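For a rough sense of scale, the sketch below compares these rumored figures against the official Ampere reference TDPs of the cards they would replace (320 W / 220 W / 170 W for the RTX 3080 / 3070 / 3060). The Ada numbers are unconfirmed leaks and the 400 W RTX 4070 figure is the GDDR6X upper bound, so treat the output as illustrative only:

# Rumored Ada TDPs (leaked, unconfirmed; RTX 4070 figure is the GDDR6X upper bound)
# vs. official Ampere reference TDPs of the cards they would replace.
rumored_ada = {"RTX 4080": 420, "RTX 4070": 400, "RTX 4060": 220}
ampere_ref = {"RTX 4080": 320, "RTX 4070": 220, "RTX 4060": 170}  # RTX 3080 / 3070 / 3060

for card, tdp in rumored_ada.items():
    prev = ampere_ref[card]
    print(f"{card}: {tdp} W rumored, +{tdp - prev} W ({tdp / prev - 1:.0%}) over its Ampere predecessor")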
Sources: @kopite7kimi (via VideoCardz), @kopite7kimi

80 Comments on NVIDIA RTX 4080 Rumored To Feature 420 W TDP

#1
eidairaman1
The Exiled Airman
Probably need this just to stay cool.

Posted on Reply
#2
Prima.Vera
Somebody in government should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...
Posted on Reply
#3
AlwaysHope
The relevance is, of course, the cost of the electricity, not how much the device actually consumes.
Posted on Reply
#4
Minus Infinity
AlwaysHope: The relevance is, of course, the cost of the electricity, not how much the device actually consumes.
So if it's cheap, pollute away, consequences be damned.
Posted on Reply
#5
thesmokingman
It's kind of ironic considering the world is coming to grips with global warming, and they keep turning the wick up on these parts. Hello, we want more performance at higher efficiency, not less.
Posted on Reply
#6
hardcore_gamer
Prima.Vera: Somebody in government should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...
Unfortunately, there is no way around it. Dennard scaling hit a wall while Moore's Law is still going. Transistors per die can still increase significantly, but power per transistor isn't going down at the same rate. If you want more performance, you have to give it more power. The last thing we need is governments regulating high-performance computing.
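As a rough sketch (the idealized textbook scaling, not any particular process node), the Dennard argument is that dynamic power goes as

$$P_{\text{dyn}} \approx \alpha\, C\, V_{dd}^{2}\, f,$$

and under ideal scaling by a factor $\kappa$ you get $C \to C/\kappa$, $V_{dd} \to V_{dd}/\kappa$, $f \to \kappa f$, so per-transistor power falls to $P/\kappa^{2}$ while transistor area also falls by $1/\kappa^{2}$, keeping power density constant. Now that $V_{dd}$ barely scales any more (leakage and threshold limits), that $V_{dd}^{2}$ saving is mostly gone, so packing in $\kappa^{2}\times$ more transistors at similar clocks pushes total chip power up instead of holding it flat.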
Posted on Reply
#7
AlwaysHope
Minus Infinity: So if it's cheap, pollute away, consequences be damned.
That's an assumption if ever there was one...
Where I live, all domestic electricity is hydro generated. :D
Posted on Reply
#8
Chaitanya
Prima.Vera: Somebody in government should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...
Why just GPUs? Even Intel CPUs are getting out of hand, and with the rumoured return of HEDT, workstations might need at least a pair of kW-class PSUs just to power up.

On another note, that leaker seems to be a Kimi Räikkönen fan.
Posted on Reply
#10
thesmokingman
Chaitanya: Why just GPUs? Even Intel CPUs are getting out of hand, and with the rumoured return of HEDT, workstations might need at least a pair of kW-class PSUs just to power up.

On another note, that leaker seems to be a Kimi Räikkönen fan.
Don't forget the optional hidden under-table chiller, which would go great with NVIDIA's 900 W GPU. You'll need to order two chillers lmao.
Posted on Reply
#11
Bwaze
It looks more and more like I'll skip this generation. Going from a GTX 1080 Ti to an RTX 3080 was quite a leap in consumption and generated heat, and custom water cooling just means the heat is transferred to my room more quietly. I don't really want to add another 100 W of heat...
Posted on Reply
#12
Sipu
AMD be like "we can do that now?" Funny how high power consumption was always an issue (OK, performance was shit) in the 2000s, but now both Intel and NVIDIA are going crazy. Considering how efficient Zen and RDNA are, it will be interesting to see if they can scale that up to ridiculous power with the new generations.
Posted on Reply
#13
Pumper
AlwaysHope: That's an assumption if ever there was one...
Where I live, all domestic electricity is hydro generated. :D
Energy = heat. So even if you are using 100% green energy, you are still contributing to global warming if you are using more of it than necessary.
Posted on Reply
#14
Yrd
warrior420: Nice.
Roll eyes with the most rollingest of eyes.
Posted on Reply
#15
lilhasselhoffer
Pumper: Energy = heat. So even if you are using 100% green energy, you are still contributing to global warming if you are using more of it than necessary.
? I can't tell whether this is a troll answer. By definition, global climate change (because warming is inaccurate) is not influenced by the consumption of energy...and that's trivial to demonstrate. Nuclear power is taking unstable atoms, concentrating them until they create a controlled chain reaction, and then using their heat to drive a turbine. This process has literally been going on since the earth formed...and contributes no additional greenhouse gases. If you'd like to argue, there are millennia-old caves in Mexico that demonstrate this process is literally older than man. (Naica Crystal Caves)

By this logic, the only option would be to define what you want as necessary, then do a calculation on total power draw rather than peak draw. Theoretically then, a 4080 shouldn't exist because the 3060 exists...and it can do the same calculations much slower, but overall more efficiently. Likewise, a 3060 should not exist because there are lower-spec and more efficient GPUs...let alone the system you've got being less efficient for using CISC processors rather than RISC. If you don't get it yet, there's a point where this argument of efficiency is a joke race to the bottom...because technically humans have pencils. Our biological processes are more efficient than any computer...so it'd be more efficient to physically color and illustrate a game than to use a computer of any kind...which hopefully highlights how silly this point is by proxy.


With regards to the thread topic, a 4080 using 420 watts of power is...whatever it is. I'm honestly looking forward to the market cards that target 1080p resolution gaming...which in this generation might be the 4060 or 4070. If they release as absolutely energy hungry messes, then it's going to be a strong case to break out the 30x0 cards as a much better value for the money. It'll also hopefully be an opportunity for AMD to get its crap together and release cards right. That is to say they release a competitive product that forces Nvidia to be more price competitive (with drivers and BIOSes that aren't an utter joke).

Personally, I've dealt with the AMD 5700 XT lineup...and having to wait months for a properly tuned BIOS that didn't make the cards stutter more than was anywhere near acceptable was...frustrating. NVIDIA pushing an idiotic level of power consumption would be that opportunity in a nutshell. I may be a bit more optimistic than rational, though.
Posted on Reply
#16
Bwaze
lilhasselhoffer: If they release as absolutely energy hungry messes, then it's going to be a strong case to break out the 30x0 cards as a much better value for the money.
When we're talking about resolutions... I know that VR isn't even remotely relevant in the market, but it could really do with a doubling or tripling of graphics card performance.

And about going with 30x0 cards if the 40x0 series proves inefficient - remember the 20x0 release. In spring 2018 we had a crypto market collapse, and used GTX 1080 Ti cards were plentiful in the summer (but prices in stores remained high; the market was still drunk on the success of the crypto boom). When the RTX 2080 was released in September 2018, the prices of used 1080 Ti cards actually shot up - because the new generation didn't bring any price/performance increase; you paid to be a guinea pig for new technologies like RT and DLSS, which were only very slowly getting released.
Posted on Reply
#17
Luminescent
Regardless of power consumption, GPU prices are still very high in Europe; we'll have to wait a couple of years for prices to settle down, if they ever do.
Posted on Reply
#18
watzupken
The question now is, how much more performance are we getting from the increase in power requirements? 420 W represents a 100 W jump from a reference RTX 3080, which is slightly over a 30% increase, and I would expect in return that the card performs at least 50% faster. I guess we will know towards the end of the year.
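For reference, a quick back-of-the-envelope using the 320 W reference RTX 3080 as the baseline and the hoped-for +50% performance:

$$\frac{420\ \text{W}}{320\ \text{W}} \approx 1.31, \qquad \frac{1.50}{1.31} \approx 1.14,$$

so even a card 50% faster at 420 W would only be roughly 14% better in performance per watt than the RTX 3080.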
Posted on Reply
#19
bug
Ffs, I have been buying midrange because the power draw was moderate and easy to manage...
Posted on Reply
#20
Rob94hawk
Looking forward to miners dumping their 3080s on eBay for $300 after crypto circles the drain some more.
Posted on Reply
#21
Vayra86
thesmokingman: Don't forget the optional hidden under-table chiller, which would go great with NVIDIA's 900 W GPU. You'll need to order two chillers lmao.
I think you figured it all out now. That's where SLI went! Double cooling for a single card!
Pumper: Energy = heat. So even if you are using 100% green energy, you are still contributing to global warming if you are using more of it than necessary.
That's not how it works. Excess energy is converted to heat, but heat on its own does not contribute to global warming ;) Read a bit on the greenhouse effect (rough numbers at the end of this post).
bug: Ffs, I have been buying midrange because the power draw was moderate and easy to manage...
Nvidia is just helping you and me buy even lower in the stack. Which is a futile exercise, because are we really gaining any meaningful performance gen-to-gen then?

I think the GPU is slowly reaching a dead end. Turing was the writing on the wall. It's either going to have to go MCM, or it will stall. There is no other low-hanging fruit either, much the same as on the CPU side.
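Rough numbers on the waste-heat point above, assuming global primary energy use of roughly 19 TW spread over Earth's surface area of about $5.1 \times 10^{14}\ \text{m}^2$:

$$\frac{1.9 \times 10^{13}\ \text{W}}{5.1 \times 10^{14}\ \text{m}^2} \approx 0.04\ \text{W/m}^2,$$

which is a percent-level effect next to the roughly $2$-$3\ \text{W/m}^2$ of anthropogenic greenhouse forcing - it's the emissions from generating the electricity that matter, not the heat itself.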
Posted on Reply
#22
ARF
lilhasselhoffer: ? I can't tell whether this is a troll answer. By definition, global climate change (because warming is inaccurate) is not influenced by the consumption of energy...and that's trivial to demonstrate. Nuclear power is taking unstable atoms, concentrating them until they create a controlled chain reaction, and then using their heat to drive a turbine. This process has literally been going on since the earth formed...and contributes no additional greenhouse gases. If you'd like to argue, there are millennia-old caves in Mexico that demonstrate this process is literally older than man. (Naica Crystal Caves)

By this logic, the only option would be to define what you want as necessary, then do a calculation on total power draw rather than peak draw. Theoretically then, a 4080 shouldn't exist because the 3060 exists...and it can do the same calculations much slower, but overall more efficiently. Likewise, a 3060 should not exist because there are lower-spec and more efficient GPUs...let alone the system you've got being less efficient for using CISC processors rather than RISC. If you don't get it yet, there's a point where this argument of efficiency is a joke race to the bottom...because technically humans have pencils. Our biological processes are more efficient than any computer...so it'd be more efficient to physically color and illustrate a game than to use a computer of any kind...which hopefully highlights how silly this point is by proxy.

With regards to the thread topic, a 4080 using 420 watts of power is...whatever it is. I'm honestly looking forward to the market cards that target 1080p resolution gaming...which in this generation might be the 4060 or 4070. If they release as absolutely energy hungry messes, then it's going to be a strong case to break out the 30x0 cards as a much better value for the money. It'll also hopefully be an opportunity for AMD to get its crap together and release cards right. That is to say they release a competitive product that forces Nvidia to be more price competitive (with drivers and BIOSes that aren't an utter joke).

Personally, I've dealt with the AMD 5700 XT lineup...and having to wait months for a properly tuned BIOS that didn't make the cards stutter more than was anywhere near acceptable was...frustrating. NVIDIA pushing an idiotic level of power consumption would be that opportunity in a nutshell. I may be a bit more optimistic than rational, though.
I don't agree with anything in this long post. Everything you just posted is wrong.

420 watts for a mid-range card is unacceptable and an ugly move by NVIDIA.
The 4080 won't be the high end. There will be a 4090, a 4090 Ti, and allegedly a Titan-class flagship.

Yes, climate change is indeed influenced by energy use - transport, industry, etc., which burn polluting coal, oil, and fossil gas.


Let's hope AMD's new 7000-series cards are OK on power consumption (look at the thread above, which states 750 W and 650 W PSUs are recommended), and that NVIDIA loses market share with this generation.

And no, 1080p gaming is crap; move on to 2160p - better and nicer.
Posted on Reply
#23
napata
thesmokingman: It's kind of ironic considering the world is coming to grips with global warming, and they keep turning the wick up on these parts. Hello, we want more performance at higher efficiency, not less.
Whatever power they draw, they'll be much more efficient. Efficiency and higher absolute power draw are not mutually exclusive.
Posted on Reply
#25
ZoneDymo
AlwaysHope: That's an assumption if ever there was one...
Where I live, all domestic electricity is hydro generated. :D
I'm sorry, but this requires a reality check.

1. We are talking about GLOBAL warming; if the 400+ W parts were only sold in your area, then sure, but they are sold worldwide.
2. I would love to know where you even live, because I kind of doubt it is all hydro generated, and even if that limit hasn't been reached yet, not pushing back against this ever-growing power demand means that at some point it will be overwhelmed.

3. And this is perhaps the more painful part: The Grand Tour, Season 4, episode 1.
They went to Cambodia for a river trip, only to find the rivers bare and the houses on stilts looking mighty silly with no water running underneath them.
Where did the water go? Did it evaporate as a result of global warming? Well, no.
China is where that same river flows first, and they built...you guessed it, a nice big fat hydro dam, so they can generate all the power for themselves and absolutely destroy all life downstream.
The people in Cambodia are completely dependent on that river, and James May mentioned that the damming China is doing has results pretty much on par with nuclear war...

So yeah, ain't all that easy or free from consequences.
Prima.Vera: Somebody in government should regulate this nonsense. This is getting ridiculous. A GPU that consumes more than a standard PSU from a couple of years ago...
I fully agree with this; I think we could see lovely developments coming from constraints.
Force GPUs to not consume more than 300 watts (or perhaps even less), save perhaps for absolute (low-quantity) halo products that are meant for OC records, etc.
Posted on Reply