Wednesday, April 27th 2022
NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU
Even with the release of Hopper, NVIDIA's cycle of new architecture releases is not over. Later this year, we expect to see the next-generation gaming architecture, codenamed Ada Lovelace. According to @kopite7kimi on Twitter, a well-known leaker of NVIDIA products, the green team is reportedly testing a potent variant of the upcoming AD102 SKU. As the leak indicates, we could see an Ada Lovelace AD102 SKU with a Total Graphics Power (TGP) of 900 Watts. While we don't know where this SKU is supposed to sit in the Ada Lovelace family, it could be the most powerful, Titan-like design making a comeback. Alternatively, it could be a GeForce RTX 4090 Ti SKU. Alongside the monstrous TGP, it carries 48 GB of GDDR6X memory running at 24 Gbps. Feeding the card are two 16-pin connectors.
Another confirmation from the leaker is that the upcoming RTX 4080 GPU uses the AD103 SKU variant, while the RTX 4090 uses AD102. For further information, we have to wait a few more months and see what NVIDIA decides to launch in the upcoming generation of gaming-oriented graphics cards.
Sources:
@kopite7kimi (Twitter), via VideoCardz
Under your math (4 hours a day at 17c/kWh), a 900W GPU would be 3.6 kWh per day, which is 61.2c per day, or $223.38 per year.
A 250W GPU would be 1 kWh per day, which is 17c per day, or $62.05 per year.
If the $161 difference per year is a big issue, a $2,500 GPU is not for you, hence the Ferrari thing. The guys with the cash for something like this are NOT going to care about a $160-a-year difference. The guy who bought a several-hundred-thousand-dollar Ferrari doesn't care that gas went from $3 to $8, just like the guy dumping $2,500+ on a big GPU doesn't care if his electric bill goes up $100 a year.
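The math above can be sketched in a few lines of Python. The 4 hours/day and $0.17/kWh rate are the figures implied by the per-day numbers in the comment, not anything stated by the article:

```python
# Yearly electricity cost for a GPU at a given power draw.
# Assumptions (inferred from the comment's numbers): 4 hours of
# gaming per day, $0.17 per kWh.

HOURS_PER_DAY = 4
RATE_PER_KWH = 0.17  # USD

def yearly_cost(watts: float) -> float:
    """Return the yearly electricity cost in USD for a given power draw."""
    kwh_per_day = watts / 1000 * HOURS_PER_DAY
    return kwh_per_day * RATE_PER_KWH * 365

print(round(yearly_cost(900), 2))                     # 223.38
print(round(yearly_cost(250), 2))                     # 62.05
print(round(yearly_cost(900) - yearly_cost(250), 2))  # 161.33
```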
I expect some gamers play well over 25 hours on a weekend (Fri-Sun) raiding. I can attest to that, as I still occasionally play that long on weekends on some of the more intensive games. Highly agree with this.
Edit: Autocorrect
Regarding reference cards, we will probably have 220W for the cut-down AD104 with, hopefully, close to 6900 XT performance at QHD, and 350W for the near-full AD103 version with nearly 1.6x the cut-down AD104 at 4K.
I don't care about the AD102 options, too pricey; still, the reference cards are probably going to be between 450-600W.
Yes, IF is efficient as heck for a chiplet. This does not make it beat an otherwise theoretically identical monolithic core, but that's really an academic point when the cost savings are so high.

No, it's dangerous for underrated power supplies. Same as ever. RTFM.

That would be fine. Just don't blatantly outlaw the things. I'm all for consumer awareness.

This is my main concern. As the owner of a 450W TDP GPU: this thing is going to roast your other components. No question.
That's too much. Reminds me of the old GeForce FX 5800 Ultra; remember, that thing sounded like a vacuum cleaner turning on.
NVIDIA really needs to start thinking again about less power and better GPU performance per watt.
GT = Tesla
TU = Turing
GA = Ampere
AD = Ada
I think this is a meme rumor, though; at the very worst, this is an early ES of a ~500W SKU or an SXM2 specialty processor.
Either way, it doesn't really matter, I suppose; it's just an internal oddity/curiosity to account for :)
Alexander Fleming first discovered the properties of Penicillium in, I believe, 1928, but it was only in the middle of the war, around 1942-1943, that it saw culturing and use as the first antibiotic drug.