
NVIDIA RTX 40 Series Could Reach 800 Watts on Desktop, 175 Watt for Mobile/Laptop

I love all those "could" and "up to" speculations. When I lived in the UK, I had an "up to" 20 Mb/s Internet connection, which in practice meant 200 kb/s on a good day. We'll see when actual products reach the market and get reviewed. Either way, it will be too much for some people and not enough for others.

Personally, I'll probably steer clear; I found myself interested in no more than one major game per year. As in, I bought a 3090 specifically to play Cyberpunk 2077, which turned out to be a great investment thanks to all the price shenanigans and ETH mining of the last two years, but most of the time such an approach is just nonsensical. I think I grew out of playing games and it's time to accept the "grumpy old man" phase of life. My current 3080 will most likely be the last dedicated GPU I ever buy.
Also, the overall push towards higher power draw is just disappointing. There's no innovation in offering twice the performance at twice the power draw. That's just brute-forcing a way up the charts, like Intel did in the Pentium 4 and Pentium D days. Feels like we're back in the Smithfield era.
 
Glad I'm picking up a new 1kw PSU tomorrow in preparation for upcoming high power draw dGPUs from either AMD or Nvidia... :)
Why not wait for the upcoming ATX 3.0?
 
Hmmm, Gen 4 riser cables and a mount outside the case? Perhaps that's the new 2022 look. An 800-watt toaster inside my Fractal Design Meshify C would be a challenge. I wonder how high AMD's next-gen power draw is going to be?
You joke but I am already doing this with my 3090 TI.

I wish I was kidding. 450W TDP just needs it even with my blower fans.

There is no sane way AD102 is going in a consumer case. If I had to guess, there will probably be no consumer AD102 GPUs.
 
Why not wait for the upcoming ATX 3.0?
We are speculating here until they are actually on the market, but I don't have plans for more than a 300 W card for my next GPU upgrade: 2x 8-pin connectors, if I can help it. Besides that, I'll bet they will produce adapters for non-ATX 3.0 compliant PSUs. Delivering up to 600 W on a single connector will, I think, be reserved for cards way outside my budget anyway.
Ideally, 50-60% load is the peak efficiency point for a PSU. So at the moment, if my OC rig is drawing around 500 W from the wall during a gaming session, a 1 kW PSU is ideal. And to complicate things further... lol... no one knows for sure what next-gen AMD or Intel CPUs will drink when overclocked, either.
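The sizing rule above can be sketched as a quick calculation. The 50-60% sweet spot and the 500 W wall draw are this post's assumptions, not measured figures:

```python
# Rough PSU sizing check. Many units hit peak efficiency around
# 50-60% load, so a rule of thumb is to size the PSU at roughly
# twice the expected wall draw. Figures below are illustrative.

def psu_load_percent(system_draw_w: float, psu_capacity_w: float) -> float:
    """Return system draw as a percentage of the PSU's rated capacity."""
    return 100.0 * system_draw_w / psu_capacity_w

# A ~500 W gaming-session draw on a 1 kW unit lands in the sweet spot:
load = psu_load_percent(500, 1000)
print(f"PSU load: {load:.0f}%")  # prints "PSU load: 50%"
```

By the same rule, an 800 W card plus the rest of the system would push even a 1 kW unit well past its efficiency peak.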
 
Does this mean that even the 4060 will be out of my heat/noise target range?
Those rumors are probably wrong. It's a new node; they don't need to push the cards that hard to get performance gains. Last-gen GPU rumors were all wrong, btw.
 
I mean, I am not against more power draw to an extent as long as:
A: It can be cooled relatively easily
B: Its performance matches the power output

Though I must say, if this is true and it gets anywhere near that, it's definitely going to be a water-cooled card for me. It's kind of funny, though: I remember the heat output of the triple and quad GPU setups I've done, and I never thought one single GPU would be like the 3x R9 290X setup I used to have.
 
What about SLI as well? Top SKU still has that... :rolleyes:
 
Power up your gaming rig and cause a neighborhood brown out.
 
I suspect EU will kick in and set the limits at some point, this is getting out of hand.
 
I suspect EU will kick in and set the limits at some point, this is getting out of hand.
Nooooo, fuck it! Soon you'll get a mini power station with your GPU!
 
It'll be like SLI but with a single card. Ugh. It'll be great during the winter though, not so much during the summer.

Instead of jacking up power requirements, why doesn't Nvidia look into using HBM over GDDR6X? I had thought HBM used a lot less power for a lot more performance than any flavour of GDDRx?
 
Or you buy a 7700 XT, don't need to do anything, and get much faster performance at the same power.

I'm abandoning Nvidia this gen.
I need Nvidia for whenever I play a retro game and need Sparse Grid Supersampling, D3D 4x4, 16xS and whatnot.

AMD doesn't offer that, and I don't want AMD in my PC ever again after the 5800X fiasco.
 
People who buy the highest-end GPUs don't give a hoot anyway; they just convince people they do.

Just buy a 4090 and put 4060 in specs, problem solved
 
People who buy the highest-end GPUs don't give a hoot anyway; they just convince people they do.
I do. I wouldn't have said I did before, but I am firmly convinced anything over 450 W becomes a thermodynamics problem for your standard ATX case... big time.
 
Just for example, the cut-down AD104 reference card will probably have 16 Gbps GDDR6 at 220 W TBP; anything more would be suicidal from Nvidia at that price range.
Also, I really can't think of a single reason why a TSMC N5 or N4, below-300 mm² (296?, if AD102 is 600 mm²), heavily cut-down part (184 TC) with plain GDDR6 would need more, since a near-full chip (3070) at close to 400 mm² (392) on the subpar 8 nm Samsung node (a 10 nm derivative) has a 220 W TBP on the reference card.
It just does not compute!
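The die-area argument above can be turned into a back-of-the-envelope power-density check. All die areas and TBPs here are the post's own estimates, not confirmed specs:

```python
# Board power per mm² of die, using the poster's estimated numbers.
# A higher density on a smaller, more efficient node would be odd,
# which is the post's point.

def power_density(tbp_w: float, die_mm2: float) -> float:
    """Total board power divided by die area, in W/mm²."""
    return tbp_w / die_mm2

ga104 = power_density(220, 392)  # RTX 3070 reference: ~392 mm², Samsung 8 nm
ad104 = power_density(220, 296)  # hypothetical cut-down AD104 at the same TBP

print(f"GA104: {ga104:.2f} W/mm² vs cut-down AD104: {ad104:.2f} W/mm²")
```

Even at the same 220 W, the smaller die would already run at a noticeably higher power density, so pushing it far beyond that seems implausible for a mid-range part.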
 
Just buy a 4090 and put 4060 in specs, problem solved
Buy a card for $2k and make it run as a $400 card :) That's a brilliant idea, if you don't have a brain, that is.
It does not solve anything. People pay for performance, but with those high TDPs all over, it would be hard to sustain that performance.
 
I do. I wouldn't have said I did before, but I am firmly convinced anything over 450 W becomes a thermodynamics problem for your standard ATX case... big time.
But R-T-B weren't there peeps who ran SLI/Crossfire rigs that drew more than 500 Watts power just for the video cards alone? Even on air?
 
Buy a card for $2k and make it run as a $400 card :) That's a brilliant idea, if you don't have a brain, that is.
It does not solve anything. People pay for performance, but with those high TDPs all over, it would be hard to sustain that performance.

In your system specs <-----
It's not about making a $2k card run like a $400 one at all. Guess you didn't fully understand the comment. It was about having a high-power card but making people think you don't, because you "care about the environment".
 
In your system specs <-----
It's not about making a $2k card run like a $400 one at all. Guess you didn't fully understand the comment. It was about having a high-power card but making people think you don't, because you "care about the environment".
What about my system specs? I got 6900XT so...?
Anything below spec is unacceptable when you have to lower the performance to make it run cooler because your cooling can't keep up in your rig.
It doesn't matter what the reason is. If you pay a lot and then lower power consumption and performance (no matter by how much) to make it work properly in your case, that's unacceptable. Odds are it will happen, considering the power required to run those things. Time will tell.
You don't need to buy this one and lower its spec. Just buy a lower model with less power need. Save money and the environment.
 
But R-T-B weren't there peeps who ran SLI/Crossfire rigs that drew more than 500 Watts power just for the video cards alone? Even on air?
Yes. But the thermal density of two cards is far lower than one.
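A minimal sketch of that point: in a dual-card setup each GPU carries its own heatsink, so one 500 W card must push all the heat through a single sink. The fin-area figure below is made up purely for illustration:

```python
# Why one 500 W GPU is harder to cool than two 250 W GPUs: each
# card brings its own heatsink, so a single card must dissipate
# double the heat through one sink. Fin area is a made-up figure.

def sink_heat_flux(card_power_w: float, fin_area_cm2: float) -> float:
    """Heat each heatsink must dissipate per cm² of fin area."""
    return card_power_w / fin_area_cm2

FIN_AREA = 5000.0  # hypothetical fin area of one cooler, cm²

dual_250w = sink_heat_flux(250, FIN_AREA)   # per-card flux, dual setup
single_500w = sink_heat_flux(500, FIN_AREA)  # same sink, double the heat

print(single_500w / dual_250w)  # prints "2.0"
```

The real picture also depends on airflow and die hotspot density, but the per-sink load alone doubles.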
 
Yes. But the thermal density of two cards is far lower than one.
And the blower-style cards exhausted most of the heat outside the case as well, right? Are blower-style cards obsolete for the high end now due to the high TBP?
 
And the blower-style cards exhausted most of the heat outside the case as well, right? Are blower-style cards obsolete for the high end now due to the high TBP?
I would not be surprised if that is the case.
 
What about my system specs? I got 6900XT so...?
Anything below spec is unacceptable when you have to lower the performance to make it run cooler because your cooling can't keep up in your rig.
It doesn't matter what the reason is. If you pay a lot and then lower power consumption and performance (no matter by how much) to make it work properly in your case, that's unacceptable. Odds are it will happen, considering the power required to run those things. Time will tell.
You don't need to buy this one and lower its spec. Just buy a lower model with less power need. Save money and the environment.

Buy a 3090 Ti, don't modify it in any way, run it at full power, balls out. In your system specs, put a 3060, so if people look at your specs they think you have a lower-power card, while in the meantime you're putting the Vs up to the people whining about high power users. You're not lowering the power consumption, but fooling other forum users into thinking you care by having a lower-power card.
 
Buy a 3090 Ti, don't modify it in any way, run it at full power, balls out. In your system specs, put a 3060, so if people look at your specs they think you have a lower-power card, while in the meantime you're putting the Vs up to the people whining about high power users. You're not lowering the power consumption, but fooling other forum users into thinking you care by having a lower-power card.
Well it is about power consumption though. You are missing the point.
 
Well it is about power consumption though. You are missing the point.

I know it's about power consumption; you're missing my point, it seems. It's not about modifying a graphics card for lower power, it's about making yourself look better by seemingly having a lower-power card when you don't. Do you not get that?

Makes no difference; chances are the people on the forum who have a 3090/Ti will buy a 4090/Ti and use whatever means to justify it while making out they care about high power use.
 