Thursday, May 5th 2022

NVIDIA GeForce RTX 3090 Ti Gets Custom 890 Watt XOC BIOS

Extreme overclocking is an enthusiast discipline in which overclockers try to push their hardware to its limits. By combining powerful cooling solutions such as liquid nitrogen (LN2), which reaches temperatures far below zero, with modified hardware, the silicon can be driven to output tremendous power. Today, we are looking at a custom XOC (eXtreme OverClocking) BIOS for the NVIDIA GeForce RTX 3090 Ti graphics card that can push the GA102 SKU to an impressive 890 Watts of power, almost a two-fold increase over the stock TDP. Enthusiasts chasing the highest possible frequencies with their RTX 3090 Ti are the likely users of this XOC BIOS; most likely, however, it will be GALAX HOF or EVGA KINGPIN cards with dual 16-pin power connectors that make use of it.

As shown below, MEGAsizeGPU, the creator of this BIOS, managed to push his ASUS GeForce RTX 3090 Ti TUF with the XOC BIOS to 615 Watts, so KINGPIN or HOF designs will be needed to draw the full power the BIOS allows. The XOC BIOS has been uploaded to our VGA BIOS database; however, caution is advised, as flashing it can break your graphics card.
Sources: MEGAsizeGPU, via VideoCardz, XOC BIOS (TechPowerUp VGA DataBase)

63 Comments on NVIDIA GeForce RTX 3090 Ti Gets Custom 890 Watt XOC BIOS

#26
Slizzo
ARFWhy does nvidia use the Ti suffix and not Ultra anymore? Ultra is its legacy naming...
Ti was a thing before Ultra.
#27
wolf
Better Than Native
lexluthermiesterThis is getting silly. 890w?!? NVidia needs a reality check and fast.
I mean, it's not an official BIOS though?
lexluthermiesterNo, but they hold the VBIOS encryption keys & tools and would have approved it. They are also apparently preparing a 900w 4090.
I get this point, but it's still unofficial, not to mention almost entirely pointless. Nvidia did not release this "product".

I'd put money on there being no 900 W 4090. If that tweeted rumor is to be even half believed, my bet is it's two 450 W GPUs being tested together to see what they can get out of them.
#28
ModEl4
lol, I can't understand how we only hear about stories like this (AD102 will be 900 W, etc.), but when some reputable sites do undervolting testing at 300 W and find it matching the 3090, that doesn't make the news at all.
Imagine if Nvidia offered a "green" BIOS option in the Ada series focused on aggressively lower power consumption (with benefits like lower noise, etc.); that would have been an amazing option.
Edit:
Something like the scenario below, with only 5% performance loss or around that range, would be great (a rough way to approximate it today is sketched after the list):
Highest AD102 580 W standard -> 450 W green BIOS
Lowest AD102 450 W standard -> 350 W green BIOS
Highest AD103 350 W standard -> 270 W green BIOS
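Until an official option like that exists, the closest a user can get is simply lowering the board power limit. Below is a minimal sketch using the NVML Python bindings (pynvml); the 300 W target and device index 0 are illustrative assumptions, and the allowed range is card-specific:

# Rough approximation of a "green" power target by lowering the board power
# limit through NVML (pip install nvidia-ml-py). Needs admin/root rights.
# The 300 W target is illustrative, not an official "green BIOS" figure.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

# The driver reports the allowed power-limit range in milliwatts.
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = min(max(300_000, min_mw), max_mw)  # clamp 300 W into that range

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"Power limit set to {target_mw / 1000:.0f} W "
      f"(allowed range {min_mw / 1000:.0f}-{max_mw / 1000:.0f} W)")
pynvml.nvmlShutdown()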
#29
DeathtoGnomes
lexluthermiesterMy concern is not so much the high power usage as the raw wattage. In combination with the other parts in the subject PC, and adding everything else that will be on a typical home power circuit, these high-wattage cards are getting very close to overloading typical household power limits. I live in a home built in the 1980s and it is equipped with 1500 W circuit breakers. I'm not rewiring my house just for a high-performance PC.
This was my argument in the 900 W announcement thread.

Maybe we can power these high-wattage cards with an external power cord long enough to run to another outlet on another circuit. :shadedshu:
#30
watzupken
lexluthermiesterThis is getting silly. 890w?!? NVidia needs a reality check and fast.
I agree. I feel Nvidia is trying to soften the blow of Ada Lovelace's release by allowing a significantly higher power limit on Ampere. But if the rumored power limit on AMD's RDNA3 is true, it will still make Nvidia's Ada Lovelace look really dumb.
#31
the54thvoid
Super Intoxicated Moderator
ModEl4lol, I can't understand how we only hear about stories like this (AD102 will be 900 W, etc.), but when some reputable sites do undervolting testing at 300 W and find it matching the 3090, that doesn't make the news at all.
Imagine if Nvidia offered a "green" BIOS option in the Ada series focused on aggressively lower power consumption (with benefits like lower noise, etc.); that would have been an amazing option.
Edit:
Something like the scenario below, with only 5% performance loss or around that range, would be great:
Highest AD102 580 W standard -> 450 W green BIOS
Lowest AD102 450 W standard -> 350 W green BIOS
Highest AD103 350 W standard -> 270 W green BIOS
I wouldn't imagine a scenario where someone buys a more expensive card to run it as though it were a cheaper card with lower performance. And before you point out undervolting can do the same thing, if the cards were designed to effectively run at far lower wattage with the same performance, Nvidia would have done that themselves.
#32
ModEl4
the54thvoidI wouldn't imagine a scenario where someone buys a more expensive card to run it as though it were a cheaper card with lower performance. And before you point out undervolting can do the same thing, if the cards were designed to effectively run at far lower wattage with the same performance, Nvidia would have done that themselves.
Yes, sure, it's not a realistic scenario, but since the performance loss is minuscule (5-8%) and the power consumption decrease is so great, I wouldn't mind if it came true.
SlizzoTi was a thing before Ultra.
Was it, though?
I don't remember that era very well, but we had a TNT2 Ultra back then; before the TNT2 we had the TNT and RIVA 128. Was there a Ti TNT or RIVA model?
#33
agent_x007
the54thvoidAnd before you point out undervolting can do the same thing, if the cards were designed to effectively run at far lower wattage with the same performance, Nvidia would have done that themselves.
They wouldn't, because less voltage per MHz = fewer dies that pass validation (they are binning for spec MHz, after all, not for the voltage needed to reach it).
It's more economical for them to run the highest voltage possible to get the most out of the lower-quality dies. Yes, it sucks for the consumer's power bill and for the heat that gets dumped into the case, but NV's bottom line doesn't care about those.

To me, NV at this point should just put two vBIOSes on cards: one with 0.7-0.8 V max vGPU, and the other with 1.0x V as the max. The user picks whether they want a heater with max performance or the best FPS/W they can get.
#34
Palladium
900W? lol.

Heck, the 3070 Ti being 5% faster than the non-Ti for 36% more power draw was stupid enough in my book.
#35
watzupken
ModEl4lol, I can't understand how we only hear about stories like this (AD102 will be 900 W, etc.), but when some reputable sites do undervolting testing at 300 W and find it matching the 3090, that doesn't make the news at all.
Imagine if Nvidia offered a "green" BIOS option in the Ada series focused on aggressively lower power consumption (with benefits like lower noise, etc.); that would have been an amazing option.
Edit:
Something like the scenario below, with only 5% performance loss or around that range, would be great:
Highest AD102 580 W standard -> 450 W green BIOS
Lowest AD102 450 W standard -> 350 W green BIOS
Highest AD103 350 W standard -> 270 W green BIOS
It makes little sense to buy an Ada Lovelace GPU only to limit it to run as fast as an RTX 3090. Then you are better off just getting the RTX 3090 and undervolting it in the first place. Undervolting is not some magic that just works on every single card out there; the mileage, just like with overclocking, varies. The reason Nvidia is pushing the GPU so hard, which requires such high power consumption, may be to try and keep up with the competition. When you cannot go as wide, you have to keep up by pushing clock speed, which is not going to be pretty when it comes to power consumption. Rumor has it Navi 31 will have 2.5x the CUs of the existing Navi 21, while Nvidia is not able to double their CUDA cores with Ada Lovelace.

Based on the numbers you provided, I will be surprised if you lose just 5% performance. I don't even know how you derived the "magical" 5% performance loss.
Palladium900W? lol.

Heck, I would pick 3070 non-Ti over the Ti just for 80W less power draw alone.
Very few people will buy a card that draws 900 W of power. Even if I could afford it, I wouldn't, unless I had some specialized need for such a card. Buyers will generally be people who need the CUDA/Tensor cores for their work, or hardcore PC enthusiasts. Not only will the card cost a bomb, but you need some hardcore cooling to keep the card at a manageable temp. And even if you have some custom water-cooling setup for it, you need very powerful air conditioning in your room/enclosed area to avoid the place becoming a sauna. Even with current higher-end cards, I am observing room temps creeping up whenever the GPU is under sustained load.
#36
BlaezaLite
I can understand this power for LN2 overclocking, but the average Joe is never going to push that much wattage through their whole system.
#37
agent_x007
It doesn't matter how much TDP the card has, just set it at a voltage your environment can cope with. There's no point in forcing the card to boost higher than its throttle point (regardless of whether that's TDP- or temperature-based).
#38
R00kie
ModEl4Was it, though?
I don't remember that era very well, but we had a TNT2 Ultra back then; before the TNT2 we had the TNT and RIVA 128. Was there a Ti TNT or RIVA model?
That's correct, the TNT2 Ultra was the first card with that moniker. The first card that had Ti in its name was a GeForce 2.
#39
Batailleuse
King MustardSure, but the RTX 4090 (and 4080) will beat the 6950 XT 16 GB.

They'll continue to leapfrog each other, generally because they release at different periods.
Yeah, but we are talking within a generation.

If the refreshed high-end RX 6000 beats the already-refreshed 3090 Ti... you know who actually has the best product this gen.

Now, that's benchmarks; in games it's a bit different. AMD is good, but usually Nvidia is slightly better in most games (not all).

And at this point AMD is beating Nvidia for cheaper. A 6900 XT can be had for under $1,000; try finding a 3090 for $1,000. I won't even mention the 3090 Ti and its ridiculous $2,000 price tag.

Sure, they'll leapfrog on a per-release basis, but within a gen AMD seems to be winning... and you don't need an 890 W TDP to do that.
#40
Chomiq

Not for regular users, makes sense only when LN2 overclocking.
#42
levima43
SlizzoTi was a thing before Ultra.
The GeForce 2 Ultra came out in September 2000 and the GeForce 2 Ti came out in October 2001.

[EDIT] Forgot about TNT2 Ultra, woops.
#43
ModEl4
watzupkenIt makes little sense to buy an Ada Lovelace GPU only to limit it to run as fast as an RTX 3090. Then you are better off just getting the RTX 3090 and undervolting it in the first place. Undervolting is not some magic that just works on every single card out there; the mileage, just like with overclocking, varies. The reason Nvidia is pushing the GPU so hard, which requires such high power consumption, may be to try and keep up with the competition. When you cannot go as wide, you have to keep up by pushing clock speed, which is not going to be pretty when it comes to power consumption.
You misunderstood; with the comment regarding undervolting testing not making the news, I was referring to sites like Igor's that have already undervolted the 3090 Ti to 300 W (so not Ada, which hasn't released yet...) and found that it nearly matches the reference 3090 (it's -5% vs the 3090 Suprim X and -1.4% vs the 3080 Ti Suprim X). So although the 480 W 3090 Ti Suprim X consumes 60% more power than the card undervolted to 300 W, the performance loss for the 300 W card is only 10%.
watzupkenRumor has it Navi 31 will have 2.5x the CUs of the existing Navi 21, while Nvidia is not able to double their CUDA cores with Ada Lovelace.
The architectures are not going to be the same, so we don't know what performance each company is going to extract based only on the CU count. Let's suppose the leaks are correct about the CU/CUDA core counts, so we know those; but you know what else we know? We know the frequency capabilities of each process according to the foundries (OC on air, premium cards):
16nm TSMC 100% 2GHz
8nm Samsung 100-102.5% 2.05GHz
7nm TSMC 135-140% 2.7-2.8GHz
6nm TSMC 141.5-147% 2.835-2.94GHz
5nm TSMC 155-160% 3.1-3.2GHz
Of course, the architectures must be optimised for high frequency to hit these theoretical differences.
So the jump in frequency for Nvidia probably isn't going to be the same as AMD's, and, more importantly, there are other technical factors: the pixel-fillrate/bandwidth ratio deltas of the new architectures vs the old ones, the pixel-fillrate/texel-fillrate/FP32 TFLOPS ratio (which for AD102 is going to be the same as for GA102, while AMD's ratio isn't staying the same), Nvidia's Infinity Cache-like addition while AMD already had it in Navi 21, etc...
watzupkenBased on the numbers you provided, I will be surprised if you lose just 5% performance. I don't even know how you derived the "magical" 5% performance loss.

Very few people will buy a card that draws 900 W of power. Even if I could afford it, I wouldn't, unless I had some specialized need for such a card. Buyers will generally be people who need the CUDA/Tensor cores for their work, or hardcore PC enthusiasts. Not only will the card cost a bomb, but you need some hardcore cooling to keep the card at a manageable temp. And even if you have some custom water-cooling setup for it, you need very powerful air conditioning in your room/enclosed area to avoid the place becoming a sauna. Even with current higher-end cards, I am observing room temps creeping up whenever the GPU is under sustained load.
To be fair, I said 5% or around that range, and then I clarified in my next post with 5-8%.
When the 300 W 3090 Ti is only 10% slower than the 480 W 3090 Ti and the power difference is +60%, I assumed that with a +29% power difference (270 W -> 350 W, 350 W -> 450 W, 450 W -> 580 W, or 470 W -> 600 W if the TDP ends up at 600 W), the performance deficit is going to be in the 5-8% range.
I didn't think about it too much, but it doesn't sound unreasonable imo.
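A quick back-of-envelope check of that interpolation in Python, assuming (as above) that the performance loss scales roughly linearly with the relative power increase between the two measured points:

# Measured reference from Igor's undervolting test cited above: the 3090 Ti
# at its 480 W stock limit is ~10% faster than the same card held to 300 W.
measured_power_gain = 480 / 300 - 1   # +60% power at stock vs. the 300 W cap
measured_perf_gain = 0.10             # ~10% performance gained for that power

def estimated_perf_loss(green_watts, standard_watts):
    """Estimate performance lost by running at the lower 'green' limit,
    assuming loss scales linearly with the relative power increase."""
    power_gain = standard_watts / green_watts - 1
    return measured_perf_gain * power_gain / measured_power_gain

# The hypothetical standard -> green BIOS pairings from the earlier post:
for standard, green in [(580, 450), (450, 350), (350, 270)]:
    loss = estimated_perf_loss(green, standard)
    print(f"{standard} W -> {green} W: ~{loss:.1%} estimated loss")

# Prints roughly 4.8-4.9% for each pairing, i.e. the low end of the 5-8% range.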
Regarding your comment about power consumption/900 W/buyers of that card etc., I agree 100%. I said in the past (for the 3090 Ti as well) that with so much power consumption, the performance is irrelevant to me (especially if the delta vs the 3090 is so small).
#44
Aquinus
Resident Wat-man
How the hell do they expect to keep a chip cool at 900 W? This has thermal throttling written all over it. It's almost like the crap Intel is doing, where it just keeps feeding the chip more power until you hit thermal design limits, and then the boost algo keeps adjusting power consumption to bump against that limit. Hell, even my Vega 64 at 330 W gets pretty toasty. I couldn't imagine twice that power consumption, let alone almost three times it.
#45
ARF
AquinusHow the hell do they expect to keep a chip cool at 900 W? This has thermal throttling written all over it. It's almost like the crap Intel is doing, where it just keeps feeding the chip more power until you hit thermal design limits, and then the boost algo keeps adjusting power consumption to bump against that limit. Hell, even my Vega 64 at 330 W gets pretty toasty. I couldn't imagine twice that power consumption, let alone almost three times it.
I am afraid the only thing Nvidia thinks about is its reputation and how its cards will fare in the benchmarks.
No one cares that 60% higher power consumption gives only 10% higher performance back.

I think AMD also made that mistake by heavily overvolting Navi 10, aka the Radeon RX 5700 XT, which is a mid-range card marketed as something other than mid-range.
#47
Aquinus
Resident Wat-man
Chomiq
Interesting that he hasn't shown the GPU thermals.
He probably needed phase change or LN2 to cool it. :laugh:
#48
Frick
Fishfaced Nincompoop
wolfI suppose it would make for some fun extreme cooling bench runs gunning for top spots, outside of that.. *yawn*
Isn't that exactly the point?
AquinusHe probably needed phase change or LN2 to cool it. :laugh:
Again, isn't that the point?
#49
ARF
FrickIsn't that exactly the point?


Again, isn't that the point?
No.
For benching purposes - yes, but making news of it as if everyone is obliged to care about it - no.
#50
Aquinus
Resident Wat-man
FrickAgain, isn't that the point?
I thought the point was so nVidia could continue to claim that they have the fastest GPU for gaming. I'm pretty sure that consumers are an afterthought when it comes to a product like this.