Wednesday, April 27th 2022
NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU
With Hopper now released, NVIDIA's cycle of new architecture launches is not yet over. Later this year, we expect to see the next-generation gaming architecture, codenamed Ada Lovelace. According to @kopite7kimi on Twitter, a well-known leaker of NVIDIA products, the green team is reportedly testing a potent variant of the upcoming AD102 SKU. As the leak indicates, we could see an Ada Lovelace AD102 SKU with a Total Graphics Power (TGP) of 900 Watts. While we don't know where this SKU is supposed to sit in the Ada Lovelace family, it could be the most powerful of the lot, marking a comeback for the Titan-like design. Alternatively, this could be a GeForce RTX 4090 Ti SKU. Alongside the monstrous TGP, the card carries 48 GB of GDDR6X memory running at 24 Gbps. Feeding it are two 16-pin power connectors.
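Some back-of-the-envelope math on what that configuration would imply; the 384-bit memory bus is an assumption based on common AD102 rumors rather than part of this leak, while the power-delivery figures use the 600 W rating of a 16-pin (12VHPWR) connector plus 75 W from the PCIe slot:

```python
# Rough numbers for the rumored AD102 configuration.
# Assumption (not from the leak): 384-bit memory bus.

memory_speed_gbps = 24        # per-pin GDDR6X data rate from the leak
bus_width_bits = 384          # assumed bus width for AD102

# Theoretical memory bandwidth: data rate x bus width / 8 bits per byte
bandwidth_gb_s = memory_speed_gbps * bus_width_bits / 8
print(f"Theoretical bandwidth: {bandwidth_gb_s:.0f} GB/s")  # ~1152 GB/s

tgp_w = 900                       # leaked Total Graphics Power
supply_limit_w = 2 * 600 + 75     # two 16-pin connectors (600 W each) + PCIe slot
print(f"Power-delivery headroom: {supply_limit_w - tgp_w} W")  # 375 W
```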
Another confirmation from the leaker is that the upcoming RTX 4080 GPU uses the AD103 SKU variant, while the RTX 4090 uses AD102. For further information, we have to wait a few more months and see what NVIDIA decides to launch in the upcoming generation of gaming-oriented graphics cards.
Sources:
@kopite7kimi (Twitter), via VideoCardz
102 Comments on NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU
"Do more, with less" scaled up, may often appear as simply "more". As long as overall Performance Per Watt is increasing, we are still moving forward.
Think of the Steam Shovel, Jet Engines, or even Aluminum (yes, simply the material aluminum). All use considerably more resources than what they supplanted but allowed utilities well beyond what came before it.
Perhaps another more personal example, is how homes in the US moved from ~50A-<100A residential mains service to 200A being rather common. We have more appliances that save us time, multiply our labors, etc.
(We've also moved from illumination being a large part of our daily power use to appliances/tools and entertainment devices. Increases in efficiency (and overall decreases in consumption) in one area can allow for 'more budget' in another)
Should a homeowner or business be barred from using a welder, electric oven, etc. just because it uses a lot of current? Should we just ignore the multiplication of labor and utility?
In the history of technology, there has always been an increase in apparent resource consumption as overall efficiency increases and/or ability to do work multiplies.
So if this AD102 is 900 W TGP,
please expect 450/400 × 900 ≈ 1012 W TGP for our next performance champion :toast:
Go Nvidia :peace:
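Spelled out, that extrapolation looks like this; the 450/400 scaling factor is the commenter's own assumption, not something from the leak:

```python
# The commenter's extrapolation, spelled out.
# 450/400 is their assumed scaling factor (a guess, not part of the leak).
rumored_tgp_w = 900
scaling = 450 / 400
print(rumored_tgp_w * scaling)  # 1012.5 -> the "~1012 W" quoted above
```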
AMD *currently* has the lead on chiplet/MCM designs. They're already ahead of the curve on addressing the 'issues' with smaller and smaller lithography, as well as the problems with poor perf/W scaling as frequencies increase in monolithic ASICs. Rather than 'going big' or 'going wide' in a single chip, they 'glue on more girth'.
NVIDIA and Intel are not ignoring MCM whatsoever; hence my emphasis on AMD only being 'currently' ahead. I see the Intel-headed multi-company effort to standardize MCM/chiplet design and implementation potentially changing that AMD-held lead.
My electric bill :fear:
That's too much. No matter what it costs, I am not buying a freaking 900 watt card.
The 3090/3090 Ti is a Titan but branded as a GeForce.
I need to start playing games on my vacuum since it uses less power. If only that were possible. Heck, my fridge uses less than this card will.
There's no point getting a GPU for gaming that's any more than 2-3x the performance of current-gen consoles. Games need to run at 4K on those, because that's what the console manufacturers promised.
What that means for games is that you can crank the settings a bit higher and increase the framerate, but realistically the games aren't going to be much more demanding until the next generation of consoles lands.
If current consoles are running AAA games at 60 fps and a dynamic resolution of 1200-1800p in 4K mode, on hardware that is roughly equivalent to a 6700 XT (Xbox) or a slightly improved 6600 XT (PS5), then there's really not likely to be anything coming from developers that pushes beyond the comfort zone of those equivalent cards. Lovelace flagships with a stupid 900 W power draw will be superseded by another generation or two for sure by the time they're needed. It would be like buying a Titan Xp for $1,200 five years ago and having it soundly beaten by newer technology with a new feature set just a couple of years later at $700, or matched by a $329 card four years later at half the power draw.
For the period the Titan Xp was a flagship, there were no games that pushed it hard enough to justify the spend. Those games arrived about three years later, when it was no longer a flagship and could no longer justify either its power draw or purchase price.
I'm not saying that stupidly-powerful, wastefully-inefficient, offensively-expensive flagships shouldn't exist, I'm just saying that 99.9+% of the gaming population shouldn't have any reason to care about them, or what the rich choose to throw their money away on.
With those cards, if a card running at 900 W were two times faster than a card running at 300 W, I could say, OK, maybe we are onto something. But it's more likely that this card will be less than 50% faster than a 300 W GPU, and that's not progress (rough numbers sketched below).
I really doubt that it will be more than 50% more powerful than the same chip running at 300 W, but I may be wrong. Also, we need to consider that these are gaming cards. They don't build infrastructure or houses. They are just entertainment machines.
To me, 250 W is the sweet spot and 300-350 W is really the upper limit of what I can tolerate. I wouldn't want a 900 W heater in my PC. I would have to cool the case, then find a way to cool the room. If you don't have air conditioning in your room, forget it during the summer.
To me, it's just NVIDIA, being slow on chiplets, not wanting to lose the max-performance crown next generation and taking crazy decisions to stay on top.
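A rough illustration of the perf/W point in the post above, with 1.5× standing in for the commenter's "less than 50% more" guess (illustrative numbers, not measurements):

```python
# Hypothetical perf/W comparison for the scenario described above.
# Numbers are illustrative guesses, not benchmarks.

base_power_w = 300
base_perf = 1.0               # normalized performance of a 300 W card

big_power_w = 900
big_perf = 1.5                # "less than 50% more" -> at best ~1.5x

efficiency_ratio = (big_perf / big_power_w) / (base_perf / base_power_w)
print(f"perf/W vs the 300 W card: {efficiency_ratio:.2f}x")  # 0.50x, i.e. half the efficiency
```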
Well, you need this halo nonsense for 4K120 in Elden Ring. 4080 Ti / 4090 for sure; the 4080 won't cut it, since +60% is the golden-rule improvement that can be expected and the 3080 delivers only 60 FPS.
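Taking the commenter's numbers at face value (a ~60 FPS 3080 baseline and a +60% generational uplift, both their assumptions), the arithmetic goes:

```python
# The commenter's reasoning, spelled out with their own assumed numbers.
rtx3080_fps = 60          # claimed 4K Elden Ring figure for the 3080
gen_uplift = 1.60         # the "+60% golden rule" they cite

rtx4080_fps_est = rtx3080_fps * gen_uplift
print(f"Estimated 4080: {rtx4080_fps_est:.0f} FPS")  # ~96 FPS, short of the 4K120 target
```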