Wednesday, April 27th 2022

NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

With the release of Hopper, NVIDIA's cycle of new architecture launches is not yet over. Later this year, we expect to see the next-generation gaming architecture, codenamed Ada Lovelace. According to @kopite7kimi on Twitter, a well-known leaker of NVIDIA products, the green team is reportedly testing a potent variant of the upcoming AD102 GPU. As the leak indicates, we could see an Ada Lovelace AD102 SKU with a Total Graphics Power (TGP) of 900 Watts. While we don't know where this SKU is supposed to sit in the Ada Lovelace family, it could be the most powerful, Titan-like design making a comeback. Alternatively, it could be a GeForce RTX 4090 Ti SKU. Alongside the monstrous TGP, it reportedly carries 48 GB of GDDR6X memory running at 24 Gbps, and the card is fed by two 16-pin power connectors.
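
For a rough sense of scale, here is a back-of-the-envelope sketch of what the leaked figures imply. The 384-bit memory bus is an assumption (a common figure in AD102 rumors, not part of this leak); the 600 W rating per 16-pin connector comes from the 12VHPWR specification.

# Rough numbers implied by the leak; the bus width is assumed, not confirmed.
memory_speed_gbps = 24   # per-pin GDDR6X data rate, from the leak
bus_width_bits = 384     # assumed AD102 bus width (rumored, not part of the leak)

bandwidth_gb_s = memory_speed_gbps * bus_width_bits / 8
print(f"Theoretical memory bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 1152 GB/s

# Power budget: two 16-pin (12VHPWR) connectors rated up to 600 W each, plus 75 W from the PCIe slot.
max_board_power_w = 2 * 600 + 75
print(f"Maximum deliverable board power: {max_board_power_w} W")   # 1275 W, comfortably above 900 W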

Another confirmation from the leaker is that the upcoming RTX 4080 GPU uses the AD103 chip, while the RTX 4090 uses AD102. For more details, we will have to wait a few more months to see what NVIDIA decides to launch in the upcoming generation of gaming-oriented graphics cards.
Sources: @kopite7kimi (Twitter), via VideoCardz

102 Comments on NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

#1
R0H1T
Typical Nvidia :slap:

#3
john_
Sorry Nvidia. I already have three air conditioners, an air heater, and central heating in my house. I am ready for winter.
#4
Vya Domus
No "G" in the codename means this isn't supposed to be a consumer product most likely, A100 for example was a completely different architecture from the Ampere used in consumer products. That doesn't mean I don't believe that they are capable of introducing cards with such ridiculous TGPs to the masses.
#5
LabRat 891
PapaTaipei: This should be illegal.
While I'm not gonna be looking to put a >600 W part in my rig...
"Do more, with less" scaled up may often appear as simply "more". As long as overall performance per watt is increasing, we are still moving forward.

Think of the steam shovel, jet engines, or even aluminum (yes, simply the material aluminum). All use considerably more resources than what they supplanted but allowed utility well beyond what came before them.
Perhaps another, more personal example is how homes in the US moved from ~50A-<100A residential mains service to 200A being rather common. We have more appliances that save us time, multiply our labors, etc.
(We've also moved from illumination being a large part of our daily power use to appliances/tools and entertainment devices. Increases in efficiency (and overall decreases in consumption) in one area can allow for 'more budget' in another.)
Should a homeowner or business be barred from using a welder, electric oven, etc. just because it uses a lot of current? Should we just ignore the multiplication of labor and utility?

In the history of technology, there has always been an increase in apparent resource consumption as overall efficiency increases and/or ability to do work multiplies.
#6
Rhein7
You had just one job, AMD, just one job... :slap:
#7
R0H1T
LabRat 891: While I'm not gonna be looking to put a >600 W part in my rig...
"Do more, with less" scaled up may often appear as simply "more". As long as overall performance per watt is increasing, we are still moving forward.
It's not increasing that much with these stupidly overclocked cards. Look at Apple ~ go wider or go home! This is why HBM(3) is the way to go; it's a lot more efficient than these monster power-hungry GDDR6X cards.
#8
Crackong
The A100 was 400 W max and we got the 3090 Ti with a 450 W TGP.
So if this AD102 is 900 W TGP,
please expect 450/400 × 900 = 1012.5 W TGP for our next performance champion :toast:
#11
LabRat 891
R0H1T: It's not increasing that much with these stupidly overclocked cards. Look at Apple ~ go wider or go home! This is why HBM(3) is the way to go; it's a lot more efficient than these monster power-hungry GDDR6X cards.
Rhein7: You had just one job, AMD, just one job... :slap:
These are related, though memory does constitute a large part of the power use. Unless bonding techniques have improved greatly, however, HBM is quite spendy.

AMD *currently* has the lead on chiplet/MCM designs. They're already ahead of the curve on addressing the 'issues' with smaller and smaller lithography, as well as the problems with poor perf/W scaling as frequencies increase in monolithic ASICs. Rather than 'going big' or 'going wide' in a single chip, they 'glue on more girth'.
NVIDIA and Intel are not ignoring MCM whatsoever; hence my emphasis on AMD only being 'currently' ahead. I see the Intel-headed multi-company effort to standardize MCM/chiplet design and implementation potentially changing that AMD-held lead.
#12
gmn17
Add to that a 1 kW CPU :)
#13
DeathtoGnomes
I wonder if Nvidia has considered that consumers have to pay for electricity. I could not justify paying for an additional 1 kW of service.
#14
Jism
It's the drawback of going with one big monolithic chip design. AMD, on the other hand, opts for MCM-based GPUs: multiple smaller GPUs combined as one.
#15
Tomgang
900 watt o_O

My electric bill :fear:

That's too much. No matter what it costs, I am not buying a freaking 900 watt card.
#16
R0H1T
gmn17: Add to that a 1 kW CPU :)
And power it through a 5A socket :D
Jism: It's the drawback of going with one big monolithic chip design. AMD, on the other hand, opts for MCM-based GPUs: multiple smaller GPUs combined as one.
True for AMD, just not in the GPU space as yet. The earliest you can see an MCM-based GPU is probably next year.
#17
Unregistered
A 900W GPU is just stupid. Imagine that heat dumped in your case with an air cooler. No doubt EK will be straight on it with a £400 full-cover block for this.
#18
Ruru
S.T.A.R.S.
"it could be the most powerful, Titan-like design making a comeback"

The 3090/3090 Ti is a Titan but branded as a GeForce.
#19
ratirt
Tigger: A 900W GPU is just stupid. Imagine that heat dumped in your case with an air cooler. No doubt EK will be straight on it with a £400 full-cover block for this.
A case? A refrigerator is where you keep those 900W things.
I need to start gaming on my vacuum cleaner since it really uses less power, if only that were possible. Heck, my fridge uses less than this card will.
#20
R0H1T
Hey my AC uses less power than this electronic Hummer :nutkick:
#21
Chrispy_
This flagship/halo nonsense is irrelevant.

There's no point getting a GPU for gaming that's any more than 2-3x the performance of current-gen consoles. Games need to run at 4K on those, because that's what the console manufacturers promised.
What that means for games is that you can crank the settings a bit higher and increase the framerate, but realistically the games aren't going to be much more demanding until the next generation of consoles lands.

If current consoles are running AAA games at 60 FPS and a dynamic resolution of 1200-1800p in 4K mode, on hardware roughly equivalent to a 6700XT (Xbox) or a slightly improved 6600XT (PS5), then there's really not likely to be anything coming from developers that pushes beyond the comfort zone of those equivalent cards. Lovelace flagships with stupid 900W power draw will be superseded by another generation or two for sure by the time they're needed. It would be like buying a TitanXP for $1200 five years ago and having it soundly beaten by newer technology with a new feature set just a couple of years later at $700, or matched by a $329 card four years later at half the power draw.

For the period of time the TitanXP was a flagship, there were no games that pushed it hard enough to justify the spend. Those games arrived about 3 years later when it was no longer a flagship and could no longer justify either its power draw or purchase price.

I'm not saying that stupidly-powerful, wastefully-inefficient, offensively-expensive flagships shouldn't exist, I'm just saying that 99.9+% of the gaming population shouldn't have any reason to care about them, or what the rich choose to throw their money away on.
#22
Flanker
This is a weapon of mass destruction!
#23
Punkenjoy
It's true that more energy-intensive machines now do more. But they do WAY more and are still efficient when you compare the amount of energy in versus the work done.

With these cards, if a card running at 900 W were two times faster than a card running at 300 W, I could say, OK, maybe we are onto something. But it's more likely that this card will be less than 50% faster than the same chip at 300 W, and that's not progress.

I really doubt it will be more than 50% more powerful than the same chip running at 300 W, but I may be wrong. Also, we need to consider that these are gaming cards. They don't build infrastructure or houses. They are just entertainment machines.
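
A rough sketch of that comparison, using the hypothetical +50% figure above:

# Hypothetical illustration only: a 300 W baseline versus a 900 W card that is just 50% faster.
baseline_power_w, baseline_perf = 300, 1.0
flagship_power_w, flagship_perf = 900, 1.5   # assumed +50% performance

efficiency_ratio = (flagship_perf / flagship_power_w) / (baseline_perf / baseline_power_w)
print(f"The 900 W card delivers {efficiency_ratio:.0%} of the baseline's performance per watt")  # 50%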

To me, 250 W is the sweet spot and 300-350 W is really the upper limit of what I can tolerate. I wouldn't want a 900 W heater in my PC. I would have to cool the case, then find a way to cool the room. If you don't have air conditioning in your room, forget it during the summer.


To me, it's just NVIDIA, being slow on chiplets, not wanting to lose the maximum-performance crown next generation and taking crazy decisions to stay on top.
#24
ppn
don't sweat it, VRMs are just overengineered.

Well, you need this halo nonsense for 4K120 in Elden Ring. A 4080 Ti / 90 for sure; the 4080 won't cut it, since +60% is the golden-rule improvement that can be expected and the 3080 delivers only 60 FPS.
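
A quick extrapolation of that claim, assuming the 60 FPS figure and a flat +60% uplift:

# Rough extrapolation using the assumed +60% generational uplift.
rtx_3080_fps = 60    # claimed 4K Elden Ring figure for the 3080
gen_uplift = 1.60    # assumed generation-over-generation improvement

projected_4080_fps = rtx_3080_fps * gen_uplift
print(f"Projected 4080-class frame rate: {projected_4080_fps:.0f} FPS")  # ~96 FPS, short of the 120 FPS target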
#25
mazzilla
Holy crap. 900 W? Just for the card? Jeez, that's small domestic appliance territory. I suppose if you can afford the card you can afford the power use. That's just utterly nuts.