Thursday, May 5th 2022
![NVIDIA](https://tpucdn.com/images/news/nvidia-v1739475473466.png)
NVIDIA GeForce RTX 3090 Ti Gets Custom 890 Watt XOC BIOS
Extreme overclocking is an enthusiast discipline in which overclockers push their hardware to its absolute limits. By combining powerful cooling solutions like liquid nitrogen (LN2), which reaches sub-zero temperatures, with modified hardware, the silicon can draw tremendous power. Today, we are looking at a custom XOC (eXtreme OverClocking) BIOS for the NVIDIA GeForce RTX 3090 Ti graphics card that can push the GA102 SKU to an impressive 890 Watts, almost double the stock 450 W TDP. Enthusiasts chasing high frequencies with their RTX 3090 Ti are the likely users of this XOC BIOS; however, we will most likely see it used on GALAX HOF or EVGA KINGPIN cards with dual 16-pin power connectors.
As shown below, MEGAsizeGPU, the creator of this BIOS, managed to push his ASUS GeForce RTX 3090 Ti TUF to 615 Watts with the XOC BIOS, so KINGPIN or HOF designs will be needed to exploit the full power headroom. The XOC BIOS has been uploaded to our VGA BIOS database; however, caution is advised, as flashing it can break your graphics card.
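For readers who want to watch board power while experimenting, NVIDIA's NVML library exposes the live draw and the enforced limit. Below is a minimal Python sketch using the pynvml bindings; it is an illustration only, not part of MEGAsizeGPU's tooling:

```python
# Minimal board-power logger via NVIDIA's NVML (pip install nvidia-ml-py).
# A rough sketch for watching draw vs. the enforced limit during a stress
# run; not affiliated with the XOC BIOS or MEGAsizeGPU's setup.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
name = pynvml.nvmlDeviceGetName(handle)
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0  # mW -> W

print(f"{name}: enforced power limit {limit_w:.0f} W")
try:
    while True:
        draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
        print(f"board power: {draw_w:7.1f} W ({draw_w / limit_w:5.1%} of limit)")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```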
Sources:
MEGAsizeGPU, via VideoCardz, XOC BIOS (TechPowerUp VGA DataBase)
63 Comments on NVIDIA GeForce RTX 3090 Ti Gets Custom 890 Watt XOC BIOS
I'd put money on there being no 900 W 4090. If that tweeted rumor is to be even half believed, my bet is it's two 450 W GPUs being tested together to see what they can get out of them.
Imagine if Nvidia offered a "green" BIOS option in the Ada series focused on aggressively lower power consumption (with benefits like less noise, etc.). That would have been an amazing option.
Edit:
Something like the scenario below, with only ~5% performance loss or around that range, would be great (a rough software approximation is sketched after the list):
Highest AD102: 580 W standard -> 450 W green BIOS
Lowest AD102: 450 W standard -> 350 W green BIOS
Highest AD103: 350 W standard -> 270 W green BIOS
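In the meantime, something similar can be approximated in software by capping the board power limit through NVML. A minimal Python sketch, assuming the hypothetical 350 W figure from the list above (pynvml bindings; changing the limit requires root/admin rights):

```python
# Approximate a "green BIOS" by capping the power limit via NVML
# (pip install nvidia-ml-py; needs root/admin). The 350 W target is
# the hypothetical figure from the list above, not a real SKU spec.
import pynvml

TARGET_W = 350

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)

target_mw = TARGET_W * 1000
if not (min_mw <= target_mw <= max_mw):
    raise SystemExit(f"{TARGET_W} W is outside the allowed range "
                     f"({min_mw // 1000}-{max_mw // 1000} W)")

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"power limit capped at {TARGET_W} W")
pynvml.nvmlShutdown()
```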
Maybe we can power these high-wattage cards with an external power cord long enough to reach another outlet on another circuit. :shadedshu:
I don't remember that era very well, but we had a TNT2 Ultra back then; before the TNT2 we had the TNT and RIVA 128. Was there a Ti model of the TNT or RIVA?
It's more economical for them to run the highest voltage possible to get the most out of the lower-quality dies. Yes, it sucks for consumers' power bills and the heat that gets dumped into the case, but NV's bottom line doesn't care about those.
To me, NV at this point should just put two vBIOSes on cards: one with 0.7-0.8 V max vGPU, and the other with 1.0xy V as the max. The user picks whether he wants a heater with max performance, or the best FPS/W he can get.
Heck, the 3070 Ti being 5% faster than the non-Ti for 36% more power draw was stupid enough in my book.
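For what it's worth, those quoted figures translate directly into an efficiency loss; a quick back-of-the-envelope check in Python:

```python
# Back-of-the-envelope perf/W check using the figures quoted above
# (3070 Ti: +5% performance for +36% power vs. the non-Ti).
perf_ratio = 1.05
power_ratio = 1.36
efficiency = perf_ratio / power_ratio
print(f"perf/W vs non-Ti: {efficiency:.2f}  (~{1 - efficiency:.0%} worse)")
# -> perf/W vs non-Ti: 0.77  (~23% worse)
```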
Based on the numbers you provided, I would be surprised if you lose just 5% performance; I don't even know how you derived that "magical" 5% figure. Very few people will buy a card that draws 900 W of power. Even if I could afford it, I wouldn't, unless I had some specialized need for such a card. Buyers will generally be people who need the CUDA/Tensor cores for their work, or hardcore PC enthusiasts. Not only will the card cost a bomb, but you need some hardcore cooling to keep it at a manageable temperature. And even if you have a custom water-cooling setup for it, you need very powerful air conditioning in your room/enclosed area to avoid the place becoming a sauna. Even with current higher-end cards, I am seeing room temps creep up whenever the GPU is under sustained load.
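To put the sauna point in numbers: essentially all of the board power ends up as heat in the room. A quick conversion, assuming a hypothetical 900 W card plus roughly 300 W for the rest of the system:

```python
# All GPU board power ends up as heat in the room. Quick conversion,
# assuming a hypothetical 900 W card plus ~300 W for CPU/PSU losses/etc.
W_TO_BTU_PER_H = 3.412

gpu_w, rest_w = 900, 300
total_w = gpu_w + rest_w
print(f"{total_w} W -> {total_w * W_TO_BTU_PER_H:,.0f} BTU/h")
# -> 1200 W -> 4,094 BTU/h, not far off a typical 1500 W
#    space heater (~5,100 BTU/h)
```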
If the refreshed high-end RX 6000 beats the already-refreshed 3090 Ti... you know who actually has the best product this gen.
Now, that's benchmarks; in games it's a bit different. AMD is good, but Nvidia is usually slightly better in most games (not all).
And at this point AMD is beating Nvidia for cheaper. A 6900 XT can be had for under $1,000; try finding a 3090 for $1,000. I won't even mention the 3090 Ti and its ridiculous $2,000 price tag.
Sure, they'll leapfrog each other on a per-release basis, but within a generation AMD seems to be winning... and you don't need an 890 W TDP to do that.
Not for regular users; it makes sense only for LN2 overclocking.
[EDIT] Forgot about the TNT2 Ultra, whoops.
| Node | Relative frequency | Example clock |
| --- | --- | --- |
| 16 nm TSMC | 100% | 2.0 GHz |
| 8 nm Samsung | 100-102.5% | 2.05 GHz |
| 7 nm TSMC | 135-140% | 2.7-2.8 GHz |
| 6 nm TSMC | 141.5-147% | 2.835-2.94 GHz |
| 5 nm TSMC | 155-160% | 3.1-3.2 GHz |
Of course, the architectures must be optimised for high frequency to hit these theoretical differences.
So the jump in frequency for Nvidia probably isn't going to be the same as AMD's, and, more importantly, there are further technical factors: the pixel-fillrate/bandwidth ratio deltas of the new architectures vs. the old ones; the pixel-fillrate/texel-fillrate/FP32 TFLOPS ratio, which for AD102 is going to be the same as GA102, while AMD's ratio isn't; Nvidia's Infinity Cache-style addition, while AMD already had it in Navi 21; etc. To be fair, I said 5% or around that range, and then I clarified in my next post with 5-8%.
When the 300 W 3090 Ti is only 10% slower than the 480 W 3090 Ti, and the power difference is +60%, I assumed that with a +29% power difference (270 W -> 350 W, 350 W -> 450 W, 450 W -> 580 W, or 470 W -> 600 W if the TDP ends up at 600 W) the performance deficit is going to be in the 5-8% range.
I didn't think about it too much, but it doesn't sound unreasonable, IMO.
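For what it's worth, that estimate can be made explicit with a crude linear interpolation from the known 3090 Ti data point; this ignores the strongly non-linear voltage/frequency curve, so treat it as a rough lower bound:

```python
# Crude sanity check of the 5-8% estimate above: scale the known
# 3090 Ti data point (+60% power from 300 W to 480 W, with the 300 W
# card ~10% slower) down to a +29% power step. Real V/F curves are
# non-linear, so this is only a rough lower bound.
known_power_delta = 0.60   # 300 W -> 480 W
known_perf_loss   = 0.10   # 300 W card is ~10% slower
green_power_delta = 0.29   # e.g. 350 W -> 450 W from the list above

est_loss = known_perf_loss * (green_power_delta / known_power_delta)
print(f"estimated performance loss: ~{est_loss:.1%}")
# -> estimated performance loss: ~4.8%
```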
Regarding your comment about power consumption/900 W/buyers of that card etc., I agree 100%. I said in the past (for the 3090 Ti too) that with so much power consumption, the performance is irrelevant to me (especially when the delta vs. the 3090 is so small).
No one cares that 60% higher power consumption gives only 10% higher performance back.
I think AMD made the same mistake by heavily overvolting Navi 10, aka the Radeon RX 5700 XT: a mid-range card marketed as something more than mid-range.
Interesting that he hasn't shown the GPU thermals.
For benching purposes - yes, but making news of it as if everyone is obliged to care about it - no.