Monday, June 20th 2022

NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watts for Mobile/Laptop

Rumors of NVIDIA's upcoming Ada Lovelace graphics cards keep appearing, and with every new update the projected power consumption seems to climb. Today we are getting information about different SKUs, including mobile and desktop variants. According to the well-known leaker kopite7kimi, we now have the power limits of the upcoming GPUs. The RTX 40 series will launch with a few initial SKUs: AD102, AD103, AD104, and AD106. Every SKU except the top AD102 will be available in a mobile variant as well. The first in line, AD102, is the most power-hungry SKU, with a maximum power limit of 800 Watts. It will require multiple power connectors and a very beefy cooling solution to keep it running.

Going down the stack, we have an AD103 SKU limited to 450 Watts on desktop and 175 Watts on mobile. The AD104 chip is limited to 400 Watts on desktop, while the mobile version is still 175 Watts. Additionally, the AD106 SKU is limited to 260 Watts on desktop and 140 Watts on mobile.
It is essential to distinguish between a power limit and TGP. While Total Graphics Power is what NVIDIA rates its GPUs to run at, the power limit represents the maximum amount of power that board partners and overclocking attempts can apply. A power limit does not necessarily translate into TGP, meaning the final TGP values should be significantly lower.
Sources: @kopite7kimi (Twitter), via VideoCardz

133 Comments on NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watts for Mobile/Laptop

#51
FeelinFroggy
laszlolol: 800 W is insane considering we're going green, but it seems the green team completely misunderstood it; 5 hrs of gaming ≈ 5 kWh/day, more than any other appliance in the house
I don't think it will sustain 800 watts while gaming. The 800 watts is the peak. It would only hit the peak under certain circumstances, and even then for only a very short period of time.

Your 1,000-watt microwave will still pull much more power (although for much less time, if you game 5 hours per day).
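For a rough sense of scale, here is a back-of-the-envelope daily energy comparison. The duty-cycle figures (a GPU averaging well below its 800 W peak while gaming, a microwave running about 15 minutes a day) are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope daily energy comparison (illustrative assumptions).
GPU_AVG_W = 450         # assumed typical gaming draw, well below the 800 W peak
GAMING_HOURS = 5

MICROWAVE_W = 1000
MICROWAVE_HOURS = 0.25  # assumed ~15 minutes of use per day

gpu_kwh = GPU_AVG_W * GAMING_HOURS / 1000
microwave_kwh = MICROWAVE_W * MICROWAVE_HOURS / 1000

print(f"GPU:       {gpu_kwh:.2f} kWh/day")        # 2.25 kWh/day
print(f"Microwave: {microwave_kwh:.2f} kWh/day")  # 0.25 kWh/day
```

So even at a conservative average draw, the GPU dominates daily energy use; the microwave's higher instantaneous power matters far less than its short runtime.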
#52
Punkenjoy
The thing is, there are diminishing returns with increased wattage, but performance still increases. Nvidia wants to make sure it has the best GPU around, and they probably underestimated RDNA 3.

If they were not so afraid of AMD, they could reduce the power consumption and produce cheaper boards (meaning higher margins). A graphics card that consumes that much power isn't free to build.

Undervolting could be an option, but I'm not sure I would buy something just to undervolt it. Maybe in summer; in winter here I have to heat anyway, so it wouldn't be that bad. But if there are GPUs that consume way less, it might be a good option to consider them instead.

But people vote with their wallets. If nvidia still sells plenty of high-power GPUs, they will continue to push the power envelope.
#53
GreiverBlade
Oh, so I did not go overkill going from a 750 W PSU to a 1000 W Proton BDF-1000C :laugh:

Oh well, my next GPU is still undecided at the moment, but I know it will not be green. Literally, as in not Nvidia? Or figuratively, as in extremely power hungry? :roll:
Suspense, suspense, ahahah
#54
Octopuss
lol, this is some crazy shit, count me out even if the real consumption is half that.
I'm so glad I didn't invest in a 4K monitor and quite possibly never will. Still rocking 1200p and waiting for the 1440p holy grail that likely doesn't exist...
#55
ThrashZone
Hi,
Guessing Nvidia hasn't gotten the Greenpeace mailers yet :slap:
#56
Chris34
I already set up an appointment with my plumber to connect my watercooling setup to the hot water pipes of the house. Can't wait!
#57
Imaamdfanboy
Madness. Why do we have to put up with this nonsense of every generation being double the performance? Rather keep the power envelope in check and bring out new cards every 3 to 4 years with better-than-before efficiency. We need lower-powered GPUs with more performance.
#58
Oasis
If mobile/laptop is only 175 W, it will be interesting to see how they make the RTX 4080 Mobile actually competitive with the RTX 4070, etc.

:eek:
#59
mechtech
If you can afford a top-end card, then you can afford the electricity for it.
#60
Veseleil
jesdals: Hmmm, Gen 4 riser cables and a mount outside the case? Perhaps that's the new 2022 look. An 800-watt toaster inside my Fractal Design Meshify C would be a challenge. Wonder how high AMD's next gen is going to be?
Glad I'm not the only one who cut the vent holes on the Q500L like that. :laugh:
#61
ARF
ThrashZone: Hi,
Guessing Nvidia hasn't gotten the Greenpeace mailers yet :slap:
Funny. I guess nvidia works against any ideas coming from the Greenpeace org :banghead:
#62
kapone32
SIGSEGV: [image] so, it's confirmed then. Thanks.
Is that a real card?
#63
BlaezaLite
kapone32: Is that a real card?
Got to be, mate. Shame it's not the Ti though, that's got 6 fans.
#64
kapone32
JKRsega: Got to be, mate. Shame it's not the Ti though, that's got 6 fans.
April Fools
#65
BlaezaLite
Aw, really? I could've sworn this was real... :rolleyes:
#66
Shatun_Bear
HABO: You have so much misleading stuff in your comments, man. During the undervolting process, your main focus is to be more efficient with your hardware. Losing performance with lower power is not linear. For example, I have a 3080 Ti at 0.75 V on the core with overclocked memory, and my consumption dropped from 360 W to 250 W while performance dropped like 1-5% max; in some titles it's even better than default performance. You are talking trash from the internet without experience with these things.
This sounds like porkies. So you're saying you cut 110 W of power draw from your 3080 Ti, yet you only lose 1-5%, or even gain performance, in some titles?

Be honest with yourself. You're probably losing more like 5-10% perf. And in some titles, your 250 W 3080 Ti performs closer to a 3080.
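For what it's worth, numbers like HABO's are at least physically plausible: dynamic power scales roughly with clock speed times voltage squared, so dropping voltage cuts power much faster than it cuts performance. A minimal sanity-check sketch, where the ~0.9 V stock voltage and the sustained-clock ratio are illustrative assumptions:

```python
# Plausibility check for undervolting gains: dynamic power ~ f * V^2.
# Stock voltage (~0.9 V) and clock ratio are assumptions for illustration.
V_STOCK, V_UNDERVOLT = 0.9, 0.75
CLOCK_RATIO = 0.95  # assume ~5% lower sustained clocks when undervolted

power_ratio = CLOCK_RATIO * (V_UNDERVOLT / V_STOCK) ** 2
print(f"Power ratio: {power_ratio:.2f}")                         # ~0.66
print(f"360 W * {power_ratio:.2f} = {360 * power_ratio:.0f} W")  # ~238 W
```

That lands close to the claimed 250 W with only a few percent of clock speed given up, though real-world results vary with silicon quality and workload.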
#67
Hofnaerrchen
ARF: Your example is misleading because you are assuming that will be true for the new cards. What if it is not true?
It will be true. The core count of GPUs has risen significantly over the years; clock speeds have not, only recently passing 2 GHz. Raising clock speeds is much more power-demanding. This was already true with the Pentium 4 (supposed to reach 5 GHz on a single core, which never happened because of the power required), and you can see it with Zen 4: clock speeds will go up, but AMD also increased the TDP for its next generation.

GPUs have much higher core counts - and benefit from them, because the tasks they are used for can be parallelized massively.
Daven: Has anyone downclocked a 450 W 3090 Ti to 300 W or below? If so, how much performance was lost on this $2,000 GPU?
Igor from IgorsLAB did... www.igorslab.de/en/cooler-breaker-station-fusion-reactor-when-the-geforce-rtx-3090-ti-with-300-watt-choke-sets-the-efficiency-list-on-its-head-and-beats-the-radeons/
#69
Lycanwolfen
Nvidia just keeps going in the wrong direction. At 800 watts it's getting so bad you're going to need a separate circuit breaker for your PC to run it. A 15 A circuit can handle only 1,725 watts; 800 just for a video card, then you have the motherboard, CPU, RAM, and other things, so that's around 1,200 watts minimum at least. Crazy, that's where Nvidia has gone, to crazy town.
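As a rough check on that math (assuming a 115 V North American circuit and the common 80% rule for continuous loads):

```python
# Circuit headroom sketch (115 V / 15 A assumed, 80% rule for continuous loads).
VOLTS, AMPS = 115, 15
circuit_w = VOLTS * AMPS        # 1725 W total capacity
continuous_w = 0.8 * circuit_w  # 1380 W recommended continuous limit

system_w = 800 + 250 + 150      # GPU + CPU + rest of the system (assumed split)
print(f"Circuit capacity:      {circuit_w} W")
print(f"Continuous-load limit: {continuous_w:.0f} W")
print(f"Assumed system draw:   {system_w} W")  # 1200 W, uncomfortably close
```

A 1,200 W PC on a shared 15 A circuit leaves little room for a monitor, speakers, or anything else on the same breaker.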
#70
simlife
Umm, no one needs 12K; our human eyes can't use it... Keep in mind the 3090 (already old, and bad hardware 6 years from now) can do 8K in Doom. The current consoles are 40% or so weaker than it, yet more powerful than the average PC (per the Steam hardware survey), and those target 4K down to 1200p. Heck, to be honest, I cry foul at people liking the Switch, a 2017 device weaker than the 2013 PS4 (and still THE most popular tech item). So umm, don't worry about 800 watts; that's clickbait, or someone who doesn't know tech.
Lycanwolfen: Nvidia just keeps going in the wrong direction. At 800 watts it's getting so bad you're going to need a separate circuit breaker for your PC to run it. A 15 A circuit can handle only 1,725 watts; 800 just for a video card, then you have the motherboard, CPU, RAM, and other things, so that's around 1,200 watts minimum at least. Crazy, that's where Nvidia has gone, to crazy town.
Yeah, you believing it is crazy... You know the 3090 is like 5 times the 2020 consoles and 20x the most popular tech item in the world, the Switch, with its 4 GB of combined video and system RAM (versus a low-end PC, where 8 GB of RAM and 4 GB of VRAM still exist, so the 3090's 24 GB of just VRAM is 3x more than an entire PC's?). Unless you are rich, or hate the planet or money, 4K is still dumb, but the 3090 can do 8K. Keep in mind that would be like 50 DVDs, or about 14 Blu-rays, at high quality.
#71
Carlyle2020hs
A dude once said:
"There are no secrets to success. It is the result of preparation, hard work, and learning from failure."

So, as for how to cool 800 watts: I tested some stuff, and with my present gear I came up with this novel "dual tower backplate" idea:


FYI: air-cooled 3080/90 cards in this particular position did not fare well


And if anybody knows a really good thermal putty, one that keeps its consistency after extreme temperature cycles (about 100 °C to -4 °C), and where to get it, kindly give me a shout and I will test the TEC again.

#72
Minus Infinity
TheDeeGee: If the RTX 4060 is 320 watts, I will simply downclock it so it only uses around 220, as that's my limit in terms of noise and heat.
Or you buy a 7700 XT, don't need to do anything, and get much faster performance at the same power.

I'm abandoning Nvidia this gen.
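For what it's worth, capping power on NVIDIA cards doesn't require manual downclocking; the driver exposes a software power limit through nvidia-smi on supported cards. A minimal sketch (the 220 W figure is TheDeeGee's target; setting the limit needs admin rights, and the value must fall within the card's allowed range):

```python
# Minimal sketch: query and cap the board power limit via nvidia-smi.
# Requires admin/root; the requested limit must be within the card's allowed range.
import subprocess

# Show the current, default, min, and max power limits.
subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], check=True)

# Cap the board at 220 W (TheDeeGee's noise/heat target).
subprocess.run(["nvidia-smi", "-pl", "220"], check=True)
```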
#73
Jism
Ah, people. There is a power limit of up to 800 W. It doesn't mean the card is constantly using 800 W.

Gaming scenes just differ; one person has VSync on, another has an FPS cap at 144 Hz. I think only in the most extreme situations are you looking at 800 W, not on average or as "TGP", etc.

I agree that power draw is getting quite high now. If you run an 800 W video card and approximately 250 W for your CPU, then you need a beefy 1,200 to 1,500 W PSU for that.

But it's the other side of the coin. I think both camps are preparing to hit 8K gaming at 144 Hz or so, and that just requires lots of power.
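A quick sketch of that PSU sizing. The 20% headroom figure is a rule of thumb rather than a spec, and transient spikes on recent high-end cards argue for even more margin:

```python
# PSU sizing sketch: steady-state draw plus headroom (assumed figures).
GPU_W, CPU_W, REST_W = 800, 250, 150  # GPU, CPU, and the rest of the system
steady_w = GPU_W + CPU_W + REST_W     # 1200 W steady-state worst case

HEADROOM = 1.2                        # ~20% margin, a common rule of thumb
print(f"Suggested PSU: {steady_w * HEADROOM:.0f} W")  # ~1440 W, so a 1500 W unit
```

That lines up with the 1,200 to 1,500 W range mentioned above.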
#74
AusWolf
Does this mean that even the 4060 will be out of my heat/noise target range?
#75
TheUn4seen
I love all those "could" and "up to" speculations. When I lived in the UK, I had an "up to" 20 Mb/s Internet connection, which in practice meant 200 kb/s on a good day. We'll see when actual products reach the market and get reviewed. Either way, it will be too much for some people and not enough for others. Personally, I'll probably steer clear; I find myself interested in no more than one major game per year - as in, I bought a 3090 specifically to play Cyberpunk 2077, which turned out to be a great investment thanks to all the price shenanigans and ETH mining of the last two years - but most of the time such an approach is just nonsensical. I think I grew out of playing games, and it's time to accept the "grumpy old man" phase of life. My current 3080 will most likely be the last dedicated GPU I ever buy.
Also, the overall push toward higher power draw is just disappointing. There's no innovation in offering twice the performance at twice the power draw; that's just brute-forcing a way up the charts, like Intel did in the Pentium 4 and Pentium D days. Feels like we're back in Smithfield times.