Monday, June 20th 2022

NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watts for Mobile/Laptop

Rumors about NVIDIA's upcoming Ada Lovelace graphics cards keep appearing, and with every new update the expected power consumption climbs higher. Today we are getting information about different SKUs, including mobile and desktop variants. According to the well-known leaker kopite7kimi, we now have the power limits of the upcoming GPUs. The RTX 40 series will launch with a few initial SKUs: AD102, AD103, AD104, and AD106. Every SKU except the top AD102 will also be available in a mobile variant. The first in line, AD102, is the most power-hungry SKU, with a maximum power limit rating of 800 Watts. It will require multiple power connectors and a very beefy cooling solution to keep it running.

Going down the stack, we have an AD103 SKU limited to 450 Watts on desktop and 175 Watts on mobile. The AD104 chip is limited to 400 Watts on desktop, while the mobile version is still 175 Watts. Additionally, the AD106 SKU is limited to 260 Watts on desktop and 140 Watts on mobile.
It is essential to distinguish between a power limit and TGP. Total Graphics Power is the figure NVIDIA rates its GPUs to run at, while the power limit represents the maximum amount of power that board partners and overclocking attempts are allowed to apply. The power limit does not necessarily translate into TGP, meaning the final TGP values should be significantly lower.
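On shipping cards, that distinction can be inspected directly: NVIDIA's nvidia-smi tool reports a default power limit (the TGP) alongside the minimum and maximum limits the board will accept. Below is a minimal read-only sketch, assuming a system with the NVIDIA driver installed and nvidia-smi on the PATH; the power_limits helper is just an illustrative wrapper around the standard query fields.

```python
# Read GPU 0's default TGP, currently enforced limit, and the maximum
# power limit the board accepts, via nvidia-smi's query interface.
import subprocess

def power_limits(gpu_index: int = 0) -> dict:
    fields = "power.default_limit,power.limit,power.max_limit"
    out = subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index),
         f"--query-gpu={fields}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    default_w, enforced_w, max_w = (float(v) for v in out.split(","))
    return {"default_tgp_w": default_w, "enforced_limit_w": enforced_w, "max_limit_w": max_w}

if __name__ == "__main__":
    print(power_limits())  # all values in watts
```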
Sources: @kopite7kimi (Twitter), via VideoCardz

133 Comments on NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watts for Mobile/Laptop

#1
TheDeeGee
If the RTX 4060 is 320 Watts, I will simply downclock it so it only uses around 220, as that's my limit in terms of noise and heat.
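For what it's worth, a cap like that doesn't require touching clocks at all; the driver can enforce it directly. A rough sketch of the idea, assuming nvidia-smi is available and the script is run with admin/root rights (the set_power_limit helper is illustrative, and the driver rejects values outside the board's supported range):

```python
# Cap GPU 0 at 220 W through the driver's power limiter.
# Needs admin/root rights; nvidia-smi rejects values outside the
# board's supported min/max power-limit range.
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

set_power_limit(220)
```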
Posted on Reply
#2
AlwaysHope
Glad I'm picking up a new 1kw PSU tomorrow in preparation for upcoming high power draw dGPUs from either AMD or Nvidia... :)
Posted on Reply
#3
ARF
TheDeeGee: If the RTX 4060 is 320 Watts, I will simply downclock it so it only uses around 220, as that's my limit in terms of noise and heat.
If you decrease the wattage by 50%, the performance will also fall correspondingly, in which case the whole exercise of buying a brand new, shiny 40-series simply loses its point.
Posted on Reply
#4
Gungar
ARF: If you decrease the wattage by 50%, the performance will also fall correspondingly, in which case the whole exercise of buying a brand new, shiny 40-series simply loses its point.
That's not how hardware works :laugh:
Posted on Reply
#5
ARF
Gungar: That's not how hardware works :laugh:
Hope dies last :D
Posted on Reply
#7
Garrus
ARF: Hope dies last :D
No, he is right. First of all, you lose less performance than power, because many other things don't change (RAM speed, for example). It will also automatically use lower voltages, and you can even set an undervolt manually on top of that. You might be able to save half the power but only lose 20 percent of the performance, for example. Even better, figure out how much power you can save with a 10 percent performance drop; you might be surprised.

I'd like to limit to 300W, so I might buy a 350W card and undervolt it.
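One way to "figure out how much power you can save", as suggested here, is to sweep the driver's power limit downward and benchmark at each step. A rough sketch, assuming nvidia-smi is available and the script runs with admin/root rights; the benchmark itself and the FPS bookkeeping are left to the reader:

```python
# Step GPU 0's power limit from 350 W down to 250 W in 25 W steps,
# pausing at each step so a benchmark can be run, then sample the
# instantaneous power draw reported by the driver.
import subprocess

def nvsmi(*args: str) -> str:
    return subprocess.run(["nvidia-smi", "-i", "0", *args],
                          capture_output=True, text=True, check=True).stdout

for limit in range(350, 225, -25):  # 350, 325, 300, 275, 250
    nvsmi("-pl", str(limit))
    input(f"Power limit set to {limit} W. Run your benchmark, then press Enter...")
    draw = float(nvsmi("--query-gpu=power.draw", "--format=csv,noheader,nounits").strip())
    print(f"{limit} W cap -> {draw:.0f} W being drawn at the moment of sampling")
```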
Posted on Reply
#8
BlaezaLite
May as well be 1.21 gigawatts. Nvidia have lost their marbles. I'll settle for a "low end" 3070.
Posted on Reply
#9
Bwaze
I remember when the 30 series came out, a lot of people pointed at the high power draw and theorized that Samsung's inefficient process was to blame, and that the next generation would be the efficient one.

Well, they have never been more wrong in their lives.
Posted on Reply
#10
TheDeeGee
ARF: If you decrease the wattage by 50%, the performance will also fall correspondingly, in which case the whole exercise of buying a brand new, shiny 40-series simply loses its point.
Expected.

But really, let's first see what it will all look like; the 4060 won't come out until 2023 anyway.

I'm still on 1200p, but I want to get ready for 1440p. And I still plan on sticking to 60 FPS, so maybe a 4050 Ti will cut it.
Posted on Reply
#11
ARF
Instead of buying a 320-watt graphics card and undervolting it to 220 watts, while losing 30-40% of the performance in the process, better to buy a lower performance tier card, save some cash and call it a day.

Instead of an RTX 4060 @ 320 W limited to 220 W for X$, better an RTX 4050 @ 200 W limited to 180 W for (X-Y)$.
Posted on Reply
#12
AnarchoPrimitiv
Anyone remember how, when Maxwell was out, Nvidia fans constantly bragged about power efficiency, especially with respect to AMD's Hawaii GPUs? Now, somehow, efficiency is never even mentioned by them, and in some cases they even argue that it's not important at all.

AMD has already claimed that the efficiency of the new 7000 series is increased over the 6000 series, which is a good step, but with Nvidia cranking up the power consumption, it's very possible AMD may have to do the same to ensure a respectable level of competition, which is a shame. It's analogous to sports: if a sizable number of players are using performance-enhancing drugs, an athlete who isn't using them and normally wouldn't is highly incentivized to start. With the state of the world being what it is, we should be striving to use less power, but sadly it's a well-documented phenomenon that technological gains in efficiency never coincide with an overall decrease in power consumption; it's called the "rebound effect".
Posted on Reply
#13
Pumper
AnarchoPrimitiv: AMD has already claimed that the efficiency of the new 7000 series is increased over the 6000 series, which is a good step, but with Nvidia cranking up the power consumption, it's very possible AMD may have to do the same to ensure a respectable level of competition, which is a shame.
More likely, AMD having superior performance is the reason why Nvidia has to crank up the TDP in order to compete.
Posted on Reply
#14
Vya Domus
TheDeeGee: If the RTX 4060 is 320 Watts, I will simply downclock it so it only uses around 220, as that's my limit in terms of noise and heat.
Then what's the point? If they have to increase the TDP that much, it means there are zero efficiency improvements; if you limit it to 220 W, you might as well buy a 3060 Ti because it's going to perform the same.
Posted on Reply
#15
Hofnaerrchen
ARF: Instead of buying a 320-watt graphics card and undervolting it to 220 watts, while losing 30-40% of the performance in the process, better to buy a lower performance tier card, save some cash and call it a day.

Instead of an RTX 4060 @ 320 W limited to 220 W for X$, better an RTX 4050 @ 200 W limited to 180 W for (X-Y)$.
Actually, it's better to buy a higher-end GPU if you want to undervolt and use the card for a longer period of time. Lower-end GPUs need their full power limit to reach or get close to their potential (higher clock speeds need more power), while higher-end GPUs are much more flexible; having many more cores running at lower speed/voltage is much more efficient. Igor from igorsLAB did some testing of the 3090 Ti at 313 Watts (66% of the base 466 W) and 4K performance plunged to an abysmal ;) 90%, making it one of the most efficient GPUs in the current generation. And with execution units going up massively in the next generation, this trend will become even more obvious. I am no friend of high power consumption and I would like to see companies limit it on their products right from the start, but until politics or the majority of the market demands lower power consumption, I will have to do it manually. I'd rather pay more for the product, use it for 3-4 years, manually reduce the power limit to something I consider acceptable (max. 200 W +/- 10%), and keep my power bill down while performance still goes up quite substantially, coming from a 175 W RTX 2070.
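Taking the figures quoted from that test at face value, the perf-per-watt math works out like this (numbers are the post's, not independently verified):

```python
# Relative perf/W implied by the figures above: ~90% of stock 4K
# performance at 313 W versus the 466 W baseline cited in the post.
baseline_power_w = 466
limited_power_w = 313
relative_performance = 0.90

relative_power = limited_power_w / baseline_power_w       # ~0.67
perf_per_watt_gain = relative_performance / relative_power

print(f"power drops to {relative_power:.0%} of baseline")
print(f"perf/W improves by ~{perf_per_watt_gain - 1:.0%}")  # ~34%
```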

The question is: are you willing and able to pay a higher price for the GPU to save money in the long term (power bill)? The alternative is using your current GPU longer, which is also an option thanks to FSR/DLSS, etc.
Posted on Reply
#16
Chrispy_
At what point does a laptop stop being a laptop? 175 W will drain even the maximum-allowed 99 Wh battery in roughly half an hour, and that is before accounting for the screen, the CPU, and all the inefficiencies created by such a rapid discharge.
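The back-of-the-envelope math behind that figure, under the same idealised assumptions (GPU alone, no conversion losses):

```python
# Ideal-case runtime of the maximum-allowed 99 Wh battery feeding a
# 175 W GPU alone, ignoring screen, CPU and discharge losses.
battery_wh = 99
gpu_power_w = 175

runtime_minutes = battery_wh / gpu_power_w * 60
print(f"~{runtime_minutes:.0f} minutes")  # roughly half an hour
```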

I'm waiting for a 6800U laptop that will game acceptably at 25 W with the 12-CU variant of the RDNA2 IGP. No, it won't run AAA games at ultra settings, but laptop gaming has always been, and always will be, a compromise on keyboard, screen, and performance. A good game is still a good game even if you have to run it at lower graphics settings.

On desktop, there are very few AAA games that don't look great on a sub-200 W GPU like the 3060 Ti or 6700 XT. 800 W is ludicrous, and until a game comes along that genuinely benefits from 4K120 Ultra, there's no need for the overwhelming majority of people to spend that much money and power on a GPU. Game developers sure as hell aren't catering their art assets and level design to that market yet.
Posted on Reply
#17
Dimitriman
Seems like Nvidia will be saying goodbye to SFF this next round. At most, the 4060 series will be compatible with the PSU and cooling capabilities of small ITX cases.
Posted on Reply
#19
Space Lynx
Astronaut
AlwaysHope: Glad I'm picking up a new 1kw PSU tomorrow in preparation for upcoming high power draw dGPUs from either AMD or Nvidia... :)
I think you will want a 1200 W PSU if you have an 800 W GPU.
Posted on Reply
#20
Broken Processor
CallandorWoT: I think you will want a 1200 W PSU if you have an 800 W GPU.
It's certainly looking that way, and possibly more by the time partner cards come out, especially if you run a lot of hard disks and the like. I don't think my water loop will even cover that in a quiet manner.
Time will tell.
Posted on Reply
#21
jesdals
Hmmm, Gen 4 riser cables and a mount outside the case? Perhaps that's the new 2022 look. An 800 Watt toaster inside my Fractal Design Meshify C would be a challenge. I wonder how high AMD's next gen is going to go?
Posted on Reply
#22
Gungar
Vya Domus: Then what's the point? If they have to increase the TDP that much, it means there are zero efficiency improvements; if you limit it to 220 W, you might as well buy a 3060 Ti because it's going to perform the same.
It depends on the number of CUDA cores: more cores at a lower maximum frequency is a lot more energy efficient. If you take a 3080, for example, and reduce its power consumption by 40%, you get 3070 performance for 3060 power consumption, or even less than that.
Posted on Reply
#23
Colddecked
Bwaze: I remember when the 30 series came out, a lot of people pointed at the high power draw and theorized that Samsung's inefficient process was to blame, and that the next generation would be the efficient one.

Well, they have never been more wrong in their lives.
I mean, aren't the high-end cards going to be MCM, so ~2x the transistors out of the gate? 800 W sounds like an atypical use case, probably something like an XOC BIOS under water.
Dimitriman: Seems like Nvidia will be saying goodbye to SFF this next round. At most, the 4060 series will be compatible with the PSU and cooling capabilities of small ITX cases.
If you have a really new PSU with the new connector, it'll be a lot easier. I don't think the 4070 is going to be that crazy, considering there are now 80, 90, and Ti models in between.
Posted on Reply
#24
ratirt
800 Watts. Damn, that is a lot and a bit more.
Posted on Reply
#25
Daven
TheDeeGee: If the RTX 4060 is 320 Watts, I will simply downclock it so it only uses around 220, as that's my limit in terms of noise and heat.
But what about the 600 W+ cards? If you downclock those to under 300 W so they stay cool in most existing cases, you would lose too much performance.

Most here are thinking about downclocking cards of 350 W and under, as that has been the usual upper limit. This won't work the way you think it will for insanely high power cards.

Has anyone downclocked a 450 W 3090 Ti to 300 W or below? If so, how much performance was lost on this $2,000 GPU?
Posted on Reply