
NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watt for Mobile/Laptop

AleksandarK
News Editor, Staff member
Rumors of NVIDIA's upcoming Ada Lovelace graphics cards keep appearing. With every new update, the total power consumption seems to climb higher, and today we have information about different SKUs, including mobile and desktop variants. According to the well-known leaker kopite7kimi, we now have the power limits of the upcoming GPUs. The RTX 40 series will launch with a few initial SKUs: AD102, AD103, AD104, and AD106. Every SKU except the top AD102 will be available in a mobile variant as well. The first in line, AD102, is the most power-hungry SKU, with a maximum power limit of 800 Watts. It will require multiple power connectors and a very beefy cooling solution to keep it running.

Going down the stack, we have an AD103 SKU limited to 450 Watts on desktop and 175 Watts on mobile. The AD104 chip is limited to 400 Watts on desktop, while the mobile version is still 175 Watts. Additionally, the AD106 SKU is limited to 260 Watts on desktop and 140 Watts on mobile.



It is essential to distinguish between a power limit and TGP. While the Total Graphics Power is what NVIDIA rates its GPUs to run at, the power limit represents the maximum amount of power that board partners and overclocking attempts can apply. It does not necessarily translate into TGP, meaning the final TGP values should be significantly lower.

View at TechPowerUp Main Site | Source
 
If the RTX 4060 is going to be 320 Watts, I will simply downclock it so it only uses around 220, as that's my limit in terms of noise and heat.
 
Glad I'm picking up a new 1kw PSU tomorrow in preparation for upcoming high power draw dGPUs from either AMD or Nvidia... :)
 
If the RTX 4060 is going to be 320 Watts, I will simply downclock it so it only uses around 220, as that's my limit in terms of noise and heat.

If you decrease the wattage by 50%, the performance will also fall correspondingly, in which case the whole point of buying a brand-new, shiny 40-series card simply diminishes.
 
If you decrease the wattage by 50%, the performance will also fall correspondingly, in which case the whole point of buying a brand-new, shiny 40-series card simply diminishes.

That's not how hardware works :laugh:
 
Hope dies last :D
No, he is right. First of all, you lose less performance than power, because many other things don't change (RAM speed, for example). It will also automatically use lower voltages, and you can even set an undervolt manually. You might be able to save half the power but only lose 20 percent performance, for example; or, even better, figure out how much power you can save with only a 10 percent performance drop. You might be surprised.

I'd like to limit to 300W, so I might buy a 350W card and undervolt it.
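As a rough illustration of the scaling argument in the post above, here is a minimal back-of-the-envelope sketch in Python. It assumes the usual simplification that dynamic power scales with f·V² and that voltage roughly tracks frequency across the DVFS range, so power goes with f³ while performance goes with f; real cards (and their memory subsystems) will deviate from this, so treat the numbers as ballpark only.

def perf_at_power_fraction(power_fraction):
    # Power ~ f^3, performance ~ f, so relative performance is the
    # cube root of the relative power limit.
    return power_fraction ** (1.0 / 3.0)

for frac in (1.0, 0.8, 0.66, 0.5):
    print(f"power limit {frac:.0%} -> ~{perf_at_power_fraction(frac):.0%} of stock performance")

Under this model, halving the power budget lands at roughly 79% of stock performance, which is in the same ballpark as the "save half the power, lose about 20 percent" figure above.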
 
May as well be 1.21 gigawatts. Nvidia have lost their marbles. I'll settle for a "low end" 3070.
 
If you decrease the wattage by 50%, the performance will also fall correspondingly, in which case the whole point of buying a brand-new, shiny 40-series card simply diminishes.
Expected.

But really, let's first see what it will all look like, the 4060 won't come out until 2023 anyways.

I'm still on 1200p, but I want to get ready for 1440p. And I still plan on sticking to 60 FPS, so maybe a 4050 Ti will cut it.
 
Instead of buying a 320-watt graphics card and undervolting it to 220 watts, while losing 30-40% performance in the process, better to buy a lower-performance-tier card, save some cash, and call it a day.

Instead of an RTX 4060 @320W, undervolted to 220W, for X$, better an RTX 4050 @200W, undervolted to 180W, for (X-Y)$.
 
Anyone remember how, when Maxwell was out, Nvidia fans constantly bragged about power efficiency, especially with respect to AMD's Hawaii GPUs? Now, somehow magically, efficiency is never even mentioned by them, and some even argue that it's not important at all in some cases.

AMD has already claimed that the efficiency of the new 7000 series is increased over the 6000 series, which is a good step, but with Nvidia cranking up the power consumption, it's very possible AMD may have to do the same to ensure a respectable level of competition, which is a shame. It's analogous to how, in sports, if a sizable number of players are using performance-enhancing drugs, an athlete who isn't using them and normally wouldn't is highly incentivized to start using them. With the state of the world being what it is, we should be striving to use less power, but sadly, it's a well-documented phenomenon that technological gains in efficiency rarely coincide with an overall decrease in power consumption; it's called the "Rebound Effect".
 
AMD has already claimed that the efficiency of the new 7000 series is increased over the 6000 series, which is a good step, but with Nvidia cranking up the power consumption, it's very possible AMD may have to do the same to ensure a respectable level of competition, which is a shame.
More likely AMD having superior performance is the reason why nvidia has to crank up the TDP in order to compete.
 
If the RTX 4060 is going to be 320 Watts, I will simply downclock it so it only uses around 220, as that's my limit in terms of noise and heat.

Then what's the point? If they have to increase the TDP so much, that means there are zero efficiency improvements; if you limit it to 220 W, you might as well buy a 3060 Ti because it's going to perform the same.
 
Instead of buying a 320-watt graphics card and undervolting it to 220 watts, while losing 30-40% performance in the process, better to buy a lower-performance-tier card, save some cash, and call it a day.

Instead of an RTX 4060 @320W, undervolted to 220W, for X$, better an RTX 4050 @200W, undervolted to 180W, for (X-Y)$.

Actually, it's better to buy a higher-end GPU if you want to undervolt and use the card for a longer period of time. Lower-end GPUs need their full power limit to reach, or get close to, their full potential (higher clock speeds need more power), while higher-end GPUs are much more flexible. Having many more cores running at lower speed/voltage is much more efficient. Igor from igor'sLAB did some testing of the 3090 Ti at 313 Watts (66% of the base 466 W), and 4K performance plunged to an abysmal ;) 90%, making it one of the most efficient GPUs in the current generation. And with execution units going up massively in the next generation, this trend will become even more obvious.

I am no friend of high power consumption, and I would like to see companies limit power consumption on their products right from the start, but until politics or the majority of the market demands lower power consumption, I will have to do this manually. I would rather pay more for the product, use it for 3-4 years, manually reduce the power limit to something I consider acceptable (max. 200 W +/- 10%), and keep my power bill down while performance still goes up quite substantially, coming from a 175 W RTX 2070.

The question is: are you willing and able to pay a higher price for the GPU to save money in the long term (power bill)? The alternative is using your current GPU longer, which is also an option thanks to FSR/DLSS, etc.
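For what it's worth, the perf-per-watt arithmetic behind the 3090 Ti example above works out like this (a minimal sketch; the ~66% power and ~90% performance figures are taken from the post as given, not re-measured):

# Normalized figures from the post above: ~90% of stock 4K performance
# at ~66% of the stock power limit.
limited_perf, limited_power = 0.90, 0.66
efficiency_gain = (limited_perf / limited_power) - 1.0
print(f"perf per watt improves by ~{efficiency_gain:.0%}")  # roughly +36%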
 
At what point does a laptop stop being a laptop? 175 W will drain even the maximum-allowed capacity of a 99 Wh battery in a little over half an hour, ignoring the screen, the CPU, and all the inefficiencies created by such a rapid discharge.

I'm waiting for a 6800U laptop that will game acceptably at 25 W with the 12 CU variant of the RDNA2 IGP. No, it won't run AAA games at ultra settings, but at the same time, laptop gaming always has been and always will be a compromise on keyboard, screen, and performance. A good game is still a good game even if you have to run it at lower graphics settings.

On desktop, there are very few AAA games that don't look great on a sub-200 W GPU like the 3060 Ti or 6700 XT. 800 W is ludicrous, and until a game comes along that genuinely benefits from 4K120 Ultra, there's no need for the overwhelming majority of people to spend that much money and power on a GPU. Game developers sure as hell aren't catering their art assets and level design to that market yet.
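A quick sanity check on the battery point at the top of the post (simple arithmetic only; it ignores the screen, CPU, and conversion losses, exactly as stated above):

# 99 Wh is the largest battery capacity allowed on flights; 175 W is the
# rumored mobile power limit. GPU-only runtime, no other loads.
battery_wh = 99
gpu_watts = 175
minutes = battery_wh / gpu_watts * 60
print(f"~{minutes:.0f} minutes of GPU-only runtime")  # about 34 minutes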
 
Seems like Nvidia will be saying goodbye to SFF this next round. At most, the 4060 series will be compatible with the PSU and cooling capabilities of small ITX cases.
 
irresponsible and weak
 
Glad I'm picking up a new 1kw PSU tomorrow in preparation for upcoming high power draw dGPUs from either AMD or Nvidia... :)

I think you will want a 1200 W PSU if you have an 800-watt GPU.
 
Hmmm, gen 4 riser cables and a mount outside the case? Perhaps that's the new 2022 look. An 800-watt toaster inside my Fractal Design Meshify C would be a challenge; I wonder how high AMD's next gen is going to be?
 
Then what's the point? If they have to increase the TDP so much, that means there are zero efficiency improvements; if you limit it to 220 W, you might as well buy a 3060 Ti because it's going to perform the same.

It depends on the number of CUDA cores: more cores at a lower maximum frequency is a lot more energy efficient. If you take a 3080, for example, and reduce its power consumption by 40%, you get 3070 performance for 3060 power consumption, or even less than that.
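If anyone wants to try the power-limit route from a script, here is a minimal sketch. It assumes the NVIDIA driver's nvidia-smi tool is on the PATH and that you have admin/root rights; the driver clamps the requested value to the range the board BIOS allows, and the 220 W figure is just the example number from this thread.

import subprocess

def set_power_limit(watts, gpu_index=0):
    # Cap the board power limit (requires elevated privileges).
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)], check=True)

def show_power_info(gpu_index=0):
    # Print the current power readings and limits for the GPU.
    subprocess.run(["nvidia-smi", "-i", str(gpu_index), "-q", "-d", "POWER"], check=True)

if __name__ == "__main__":
    set_power_limit(220)  # e.g. run a 320 W-rated card at 220 W
    show_power_info()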
 
I remember when the 30 series came out, a lot of people pointed at the high power draw and theorized that Samsung's inefficient process was to blame, and that the next generation would be the efficient one.

Well, they have never been more wrong in their lives.

I mean, aren't the high-end cards going to be MCM, so ~2x the transistors out of the gate? 800 W sounds like it's not typical use, probably more like an XOC BIOS under water.

Seems like Nvidia will be saying goodbye to SFF this next round. At most, the 4060 series will be compatible with the PSU and cooling capabilities of small ITX cases.

If you have a really new PSU with the new connector, it'll be a lot easier. I don't think the 4070 is going to be that crazy, considering there are now 80, 90, and Ti models in between.
 
800Watts. Damn that is a lot and a bit more.
 