Wednesday, April 27th 2022
NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU
With the release of Hopper, NVIDIA's cycle of new architecture launches is not yet over. Later this year, we expect to see the next-generation gaming architecture, codenamed Ada Lovelace. According to @kopite7kimi, a well-known leaker of NVIDIA hardware on Twitter, the green team is reportedly testing a potent variant of the upcoming AD102 SKU. As the leak indicates, we could see an Ada Lovelace AD102 SKU with a Total Graphics Power (TGP) of 900 Watts. While we don't know where this SKU is supposed to sit in the Ada Lovelace family, it could be the most powerful, Titan-like design making a comeback; alternatively, it could be a GeForce RTX 4090 Ti SKU. Alongside the monstrous TGP, it carries 48 GB of GDDR6X memory running at 24 Gbps. Feeding the card are two 16-pin power connectors.
Another confirmation from the leaker is that the upcoming RTX 4080 GPU uses the AD103 SKU variant, while the RTX 4090 uses AD102. For further information, we have to wait a few more months and see what NVIDIA decides to launch in the upcoming generation of gaming-oriented graphics cards.
Sources:
@kopite7kimi (Twitter), via VideoCardz
102 Comments on NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU
Just give me a 2KW power supply and 2KW solar power on the roof.
We only accepted more powerful cards once they upgraded the shit out of the cooling systems (post-Pascal), and I wouldn't expect anything air-cooled to exceed 450 W in three slots anytime soon! Fermi popularized stock dual-slot coolers, but triples are not going to become commonplace anytime soon!
Especially with graphics cards, you get less additional performance for each Watt of power that's added.
I've graphed it in the past and the sweet spot of maximum efficiency is usually around 2/3rds of the clockspeed and half the power consumption. That's been true for both AMD and Nvidia GPUs since about 2016 so there's no reason to suspect that it won't also apply to Lovelace on the new process node.
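That sweet-spot shape falls out of basic dynamic-power scaling: power goes roughly as f·V², and voltage has to climb with frequency near the top of the curve. A toy model of this (the cubic exponent is an assumption for illustration, not a measured curve for any real GPU):

```python
# Toy dynamic-power model: P ~ f * V^2. Assuming V scales roughly
# linearly with f near the top of the V/F curve gives P ~ f^3.
def relative_power(clock_fraction: float, exponent: float = 3.0) -> float:
    """Power at a given fraction of max clock, relative to max power."""
    return clock_fraction ** exponent

for frac in (1.0, 0.8, 2 / 3):
    print(f"{frac:.2f}x clock -> {relative_power(frac):.2f}x power")
```

In practice static leakage and the minimum stable voltage flatten the low end of the curve, which is why measured efficiency sweet spots tend to land nearer the half-power mark described above than the cubic predicts.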
Compared to its 10nm FinFET process, TSMC's 7nm FinFET features 1.6X the logic density, ~20% speed improvement, and ~40% power reduction. The 10nm process in turn offers 2X the logic density of its 16nm predecessor, along with ~15% faster speed and ~35% less power consumption.
So I don't know exactly how many times faster, or how much total power reduction, that works out to, but since Nvidia skipped 10nm and 7nm, AD102 should be running at 1.20 × 1.20 × 1.15, or about 1.65×. Chips were already running at 2 GHz on 16nm, so we get roughly 3.3 GHz. An AD103 at the same power/size as a 1080 Ti: 0.6 × 0.6 × 0.65 takes 250 W down to about 60 watts when downclocked to 2 GHz.
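The compounding above can be sketched in a few lines. The per-node factors are the TSMC figures quoted in the previous post; treating them as cleanly multiplicative across skipped nodes is an assumption, since real silicon rarely compounds this neatly:

```python
from math import prod

# Per-node factors from the quoted TSMC figures.
speed_gains = [1.20, 1.20, 1.15]  # ~20%, ~20%, ~15% speed improvements
power_cuts = [0.60, 0.60, 0.65]   # ~40%, ~40%, ~35% power reductions

speed_factor = prod(speed_gains)  # compounded speed gain, ~1.66x
power_factor = prod(power_cuts)   # compounded power scaling, ~0.23x

print(f"clock: {2.0 * speed_factor:.2f} GHz (starting from 2 GHz on 16nm)")
print(f"power: {250 * power_factor:.0f} W (starting from 250 W at iso-clock)")
```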
"How can we get people to instantly dislike our products without them even launching yet"
"Leak out some 900W board tests..."
On a more serious note, this will definitely be a xx90 chip. Buyers of those don't give a damn about power consumption or heat. The majority of us, mere mortals, should be focused on xx60 and xx70 cards instead.
Honestly.. they lost me at 400w. I have no interest in the next gen cards from either manufacturer.. don’t care how many times stronger it is compared to my 3070 Ti. If you have no problems with running a 500w+ GPU.. all the power to you no pun intended :D
So much for going greener.. the boys running those companies have been dipping into the green a bit too much if you ask me, and I’m not talking about money either lol..
And so on: 300 W is triple-slot, so 900 watts is nine slots. Or a wall of 8x8x8 fans, like the one captain workplace made fun of.
And the 4070 is such a joke again with its 192-bit bus; I'll have to opt for a 4080 now.
You know, the world is changing towards "eco" modes, electric cars, etc. The world is turning its back on fossil fuels... and this is the elites' revenge: your choice now is to hate electricity and its prices...
I have no other explanation.
This is ridiculous, and if I were in power, I would put Nvidia in the courts and not let them leave until they become humans again..
I wonder if pool companies considered that consumers have to pay for water. I could not justify paying for an additional 100 gallons of service.
Etc., etc. If the power draw of a GPU means your power bill will be a concern, you shouldn't be buying a likely $2500+ video card. Compared to a 400 W GPU like the 3090, it's literally a $5-6 difference over an entire YEAR unless you are playing 8+ hours a day, in which case your streaming will more than make up the difference.
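For anyone who wants to check that claim against their own situation, the arithmetic is a one-liner. The $0.14/kWh rate and the hours-per-day figures below are assumptions for illustration, not numbers from the thread, so plug in your local rate:

```python
def annual_cost(extra_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Yearly cost (USD) of a GPU's *extra* power draw over a baseline card."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# 900 W rumored card vs. a ~400 W RTX 3090: a 500 W delta.
for hours in (1, 2, 8):
    print(f"{hours} h/day -> ${annual_cost(500, hours, 0.14):.2f}/yr")
```

Whether the delta is pocket change or not clearly depends more on play time and local electricity rates than on the card itself.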
Never mind, the marketing material has an answer for that:
The H100 CNX alleviates this problem. With a dedicated path from the network to the GPU, it allows GPUDirect® RDMA to operate at near line speeds. The data transfer also occurs at PCIe Gen5 speeds regardless of host PCIe backplane. Scaling up GPU power in a host can be done in a balanced manner, since the ideal GPU-to-NIC ratio is achieved. A server can also be equipped with more acceleration power, because converged accelerators require fewer PCIe lanes and device slots than discrete cards.