Wednesday, April 27th 2022

NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

With the release of Hopper, NVIDIA's cycle of new architecture releases is not yet over. Later this year, we expect to see the next-generation gaming architecture, codenamed Ada Lovelace. According to @kopite7kimi on Twitter, a well-known leaker of NVIDIA products, the green team is reportedly testing a potent variant of the upcoming AD102 SKU. As the leak indicates, we could see an Ada Lovelace AD102 SKU with a Total Graphics Power (TGP) of 900 Watts. While we don't know where this SKU is supposed to sit in the Ada Lovelace family, it could be the most powerful one, marking a comeback for the Titan-like design. Alternatively, it could be a GeForce RTX 4090 Ti SKU. Alongside the monstrous TGP, it carries 48 GB of GDDR6X memory running at 24 Gbps. Feeding the card are two 16-pin power connectors.

Another confirmation from the leaker is that the upcoming RTX 4080 GPU uses the AD103 SKU variant, while the RTX 4090 uses AD102. For further information, we will have to wait a few more months to see what NVIDIA decides to launch in the upcoming generation of gaming-oriented graphics cards.
Sources: @kopite7kimi (Twitter), via VideoCardz

102 Comments on NVIDIA Allegedly Testing a 900 Watt TGP Ada Lovelace AD102 GPU

#76
TheinsanegamerN
DeathtoGnomesI think the math might be off here, going by the wiki example (because I have to use a source to prove my point):

So that's 1 kWh per hour × 4 hours, costing 80 cents (someone else's figure); times 365, that's $292, or about $24 per month. Of course, your current system usage is not figured in. So let's assume you use 450 W currently; that works out to a $12 per month increase. Not a huge sum for 4 hours of gameplay. Yeah, I'd be OK with that, but I don't game for just 4 hours usually. Being retired means I have more time to game if I wanted to.

EDIT: I do know that after 17 kWh per day, I incur an additional rate on top of the base rate. Rates also change for peak hours.


The way inflation is increasing, by the time a 900-watt card is bought, electricity will likely cost much more. The Ferrari owner will be paying $8 a gallon (by the time this card comes out), but when he bought the car, gas was $3; $50 might mean something to some people.
Yes, $50 will mean something to some people. But if $50 means a lot to you, then you are likely not buying $2,500 video cards in the first place.

A 900 W GPU, under your math, would be 3.6 kWh per day, which is 61.2c per day, or $223.38 per year.

A 250 W GPU would be 1 kWh per day, which is 17c per day, or $62.05 per year.

If the $161 difference per year is a big issue, a $2,500 GPU is not for you, hence the Ferrari thing. The guys with the cash for something like this are NOT going to care about a $160 a year difference. The guy who bought a several-hundred-thousand-dollar Ferrari doesn't care that gas went from $3 to $8, just like the guy dumping $2,500+ on a big GPU doesn't care if his electric bill goes up $100 a year.
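
For anyone who wants to sanity-check those numbers, here's a quick sketch (assuming a flat 17c/kWh rate and 4 hours of play per day; tiered and peak-hour rates like the ones mentioned above will change the totals):

```python
# Electricity-cost sketch; assumes a flat 17c/kWh rate and 4 h/day of gaming.
RATE_PER_KWH = 0.17  # USD, illustrative flat rate
HOURS_PER_DAY = 4

def yearly_cost(watts: float) -> float:
    """Yearly cost of running a load of `watts` for HOURS_PER_DAY every day."""
    kwh_per_day = watts / 1000 * HOURS_PER_DAY
    return kwh_per_day * RATE_PER_KWH * 365

for watts in (250, 900):
    print(f"{watts} W -> ${yearly_cost(watts):.2f}/year")
# 250 W -> $62.05/year
# 900 W -> $223.38/year
```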
#77
ARF
Is there no normal way to cool that 900-watt GPU?

#78
DeathtoGnomes
TheinsanegamerNYes, $50 will mean something to some people. But if $50 means a lot to you, then you are likely not buying $2,500 video cards in the first place.

A 900 W GPU, under your math, would be 3.6 kWh per day, which is 61.2c per day, or $223.38 per year.

A 250 W GPU would be 1 kWh per day, which is 17c per day, or $62.05 per year.

If the $161 difference per year is a big issue, a $2,500 GPU is not for you, hence the Ferrari thing. The guys with the cash for something like this are NOT going to care about a $160 a year difference. The guy who bought a several-hundred-thousand-dollar Ferrari doesn't care that gas went from $3 to $8, just like the guy dumping $2,500+ on a big GPU doesn't care if his electric bill goes up $100 a year.
Thanks for doing the extended math. :rockout: It should be noted that that is per 4-hour block per day.

I expect some gamers play well over 25 hours on a weekend (Fri-Sun) raiding. I can attest to that, as I still occasionally play that long on weekends on some of the more intensive games.
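
Rerunning the same math for a heavy raiding weekend (same flat 17c/kWh assumption as the sketch above):

```python
# Weekend-marathon variant of the cost math; the flat 17c/kWh rate is an assumption.
RATE_PER_KWH = 0.17  # USD
WEEKEND_HOURS = 25   # Fri-Sun raiding estimate from above

for watts in (250, 900):
    kwh = watts / 1000 * WEEKEND_HOURS
    print(f"{watts} W x {WEEKEND_HOURS} h -> {kwh} kWh, about ${kwh * RATE_PER_KWH:.2f}")
# A 900 W card burns 22.5 kWh in one such weekend, roughly $3.83, and would
# also blow past the 17 kWh/day tier mentioned earlier if compressed into one day.
```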
TheinsanegamerNIf the $161 difference per year is a big issue, a $2,500 GPU is not for you,
Highly agree with this.
#79
chrcoluk
Nvidia trying to get themselves banned in Europe?
#80
ARF
DeathtoGnomesHighly agree with this.
How do you cool a 0.9 kW graphics card? Liquid nitrogen?
#81
progste
I mean... I did say I was expecting this, but I was actually just memeing...
#82
Count von Schwalbe
Nocturnus Moderatus
ARFHow do you cool a 0.9 kW graphics card? Liquid nitrogen?
Liquid nitrogen could be risky for a GPU, as the PCB is much more exposed. I could probably design a wildly expensive and complicated water block with integrated heatpipes, but air cooling? "Look at my E-ATX tower for my GPU and my ITX on top for everything else!"

Edit: Autocorrect
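
For scale, a back-of-the-envelope airflow estimate (textbook air properties; the 15 K air temperature rise is just an assumption):

```python
# Airflow needed to carry 900 W away in air, as a sanity check.
# 1210 J/(m^3*K) is the approximate volumetric heat capacity of air at ~20 C;
# the allowed 15 K temperature rise through the cooler is an assumption.
P_WATTS = 900
DELTA_T_K = 15.0
AIR_VOL_HEAT = 1210.0  # J/(m^3*K)

flow_m3_per_s = P_WATTS / (AIR_VOL_HEAT * DELTA_T_K)
print(f"{flow_m3_per_s * 2118.88:.0f} CFM")  # ~105 CFM through the heatsink alone
```

That is doable with big fans, but all of that heat still ends up inside the case.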
#84
ModEl4
I don't believe all these reports about 900 W, etc.
Regarding reference cards, we will probably have 220 W for the cut-down AD104, hopefully with close to 6900 XT performance at QHD, and 350 W for the near-full AD103 version at nearly 1.6X the cut-down AD104 at 4K.
I don't care about the AD102 options, too pricey; still, the reference cards are probably going to be between 450-600 W.
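
Plugging those rumored numbers into a quick performance-per-watt check (purely illustrative; every figure here is speculation):

```python
# Perf-per-watt check of the rumored figures above; all numbers are speculation.
ad104_cut = {"tgp_w": 220, "perf": 1.0}   # cut-down AD104 as the baseline
ad103_full = {"tgp_w": 350, "perf": 1.6}  # near-full AD103, ~1.6x at 4K

for name, gpu in (("AD104 (cut)", ad104_cut), ("AD103 (near-full)", ad103_full)):
    print(f"{name}: {gpu['perf'] / gpu['tgp_w'] * 100:.3f} perf/W (relative)")
# Both land around 0.46 relative perf/W, i.e. the rumor implies almost flat
# efficiency scaling between the two parts.
```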
#85
R-T-B
R0H1TDesign choice like?
The core design itself?

Yes, IF is efficient as heck for a chiplet interconnect. This does not make it beat an otherwise theoretically identical monolithic core, but that's really an academic point when the cost savings are so high.
RichardsAny GPU above 500 watts is dangerous for power supplies.
No, it's dangerous for underrated power supplies. Same as ever. RTFM.
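
To put numbers on "underrated," a rule-of-thumb sizing sketch (the headroom factor and non-GPU draws are my assumptions, not vendor guidance):

```python
# Rule-of-thumb PSU sizing sketch; all coefficients here are assumptions.
def recommended_psu_watts(gpu_tgp_w, cpu_w=250, rest_w=100, headroom=1.3):
    """Sum worst-case component draws, then add ~30% headroom for
    transient spikes and to keep the PSU in its efficient load range."""
    return (gpu_tgp_w + cpu_w + rest_w) * headroom

for tgp in (250, 450, 900):
    print(f"{tgp} W GPU -> ~{recommended_psu_watts(tgp):.0f} W PSU")
# 250 W GPU -> ~780 W PSU
# 450 W GPU -> ~1040 W PSU
# 900 W GPU -> ~1625 W PSU
```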
ARFThere is the EU energy efficiency labeling, though. It basically tells you which appliances are better. The higher the rating towards A+++, the better.

That would be fine. Just don't blatantly outlaw the things. I'm all for consumer awareness.
ARFHow do you cool a 0.9 kW graphics card? Liquid nitrogen?
This is my main concern. As the owner of a 450 W TDP GPU, I can say this thing is going to roast your other components. No question.
#86
Eskimonster
Feels like a thing that happens over and over again when chipmakers are desperate to squeeze the last bit out of an architecture.
#87
Lycanwolfen
Nvidia is just going in the wrong direction. 900 watts!!!! Might as well sell it with its own power supply unit to plug into the back of it, sorta like the old Voodoo 5 6000 had.

That's too much. Reminds me of the old GeForce FX 5800 Ultra; remember that thing? It sounded like a vacuum cleaner turning on.

Nvidia really needs to start thinking again: less power and better GPU performance per watt.
#88
ghazi
R0H1TDesign choice like?

Moving data through cores, caches, dies or chiplets is arguably the biggest power hog these days & that's where AMD excels with IF; this is especially evident in processors with over 12~16 cores.
This is actually a double-edged sword, though, because you always have that baseline IF power, which scales proportionally to the number of links. This is most of why 8-core EPYC chips are still rated for 180 W, and why AMD can't really compete in the crappy sub-10 W segment. If you look at those core vs. uncore power draw charts for Intel, the numbers are very different.
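
As a toy model of that fixed-fabric overhead (every coefficient below is invented, purely to illustrate the scaling):

```python
# Toy model: package power with a fixed fabric baseline; coefficients invented.
def package_power(cores, links, core_w=2.5, link_w=6.0, misc_w=10.0):
    """Fabric power scales with link count regardless of active cores,
    so low-core-count parts carry a large fixed overhead."""
    return cores * core_w + links * link_w + misc_w

print(package_power(cores=8, links=16))   # 126.0 W -> fabric dominates
print(package_power(cores=64, links=16))  # 266.0 W -> cores amortize the fabric
```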
#89
Berfs1
Surely the title has a typo, you do mean 90 W, right?
#91
Dr. Dro
Vya DomusNo "G" in the codename most likely means this isn't supposed to be a consumer product; A100, for example, was a completely different architecture from the Ampere used in consumer products. That doesn't mean I don't believe that they are capable of introducing cards with such ridiculous TGPs to the masses.
Ada uses AD for the same reason Turing used the TU nomenclature: the usual "G" prefix would have repeated an already existing internal processor name.

GT = Tesla
TU = Turing
GA = Ampere
AD = Ada

I think this is a meme rumor, though; at the very worst, this is an early ES of a ~500 W SKU or an SXM2 specialty processor.
#92
ARF
Dr. DroAda uses AD for the same reason Turing used the TU nomenclature: the usual "G" prefix would have repeated an already existing internal processor name.

GT = Tesla
TU = Turing
GA = Ampere
AD = Ada

I think this is a meme rumor, though; at the very worst, this is an early ES of a ~500 W SKU or an SXM2 specialty processor.
It could have been called GL - GeForce (Ada) Lovelace.
#93
Dr. Dro
ARFIt could have been called GL - GeForce (Ada) Lovelace.
I think it's being called just Ada, though, both internally and externally... GL is also maybe a bad pick because it could cause confusion with the GL subvariants (Quadro/RTX enterprise)?

Either way, it does not really matter, I suppose; it's just an internal oddity/curiosity to account for :)
#94
ARF
Dr. DroI think it's being called just Ada, though, both internally and externally... GL is also maybe a bad pick because it could cause confusion with the GL subvariants (Quadro/RTX enterprise)?

Either way, it does not really matter, I suppose; it's just an internal oddity/curiosity to account for :)
"Ad" from "Ada" literally means hell, and "Ada" means "the hell" in at least one European language. Very weird naming.
#95
Count von Schwalbe
Nocturnus Moderatus
ARF"Ad" from "Ada" literally means hell, and "Ada" means "the hell" in at least one European language. Very weird naming.
Ada King, Countess of Lovelace - aka Ada Lovelace - is widely regarded as having written the first computer program in the mid-1850s.
#96
80251
Count von SchwalbeAda King, Countess of Lovelace - aka Ada Lovelace - is widely regarded as having written the first computer program in the mid-1850s.
Didn't she write it for the aborted and non-functional Turing machine?
#97
Count von Schwalbe
Nocturnus Moderatus
80251Didn't she write it for the aborted and non-functional Turing machine?
The Babbage Analytical Engine, which was only partially completed.
#98
Dr. Dro
Count von SchwalbeAda King, Countess of Lovelace - aka Ada Lovelace - is widely regarded as having written the first computer program in the mid 1850's.
Probably a little earlier; if I remember correctly, she passed at the age of 36 from uterine cancer in 1852. What a way to go... with today's medicine she would probably have lived to 100 and beyond.
#99
80251
Dr. DroProbably a little earlier; if I remember correctly, she passed at the age of 36 from uterine cancer in 1852. What a way to go... with today's medicine she would probably have lived to 100 and beyond.
Could they even treat any cancer besides skin cancer in the 19th century? I once knew a guy who said he would prefer to live in the 19th century -- but only with 20th-century medical technology.
#100
Dr. Dro
80251Could they even treat any cancer besides skin cancer in the 19th century? I once knew a guy who said he would prefer to live in the 19th century -- but only with 20th-century medical technology.
I'm afraid not. Even tuberculosis, which is pretty much a very easily treatable disease nowadays, was a death sentence until after WW2 and the advent of penicillin.

Alexander Fleming first discovered the properties of Penicillium, I believe, in 1928, but it would only be in the middle of the war, around 1942-1943, that it saw culture and usage as the first antibiotic drug.