Friday, March 5th 2021
GALAX GeForce RTX 3090 Hall Of Fame (HOF) Edition GPU Benched with Custom 1000 W vBIOS
GALAX, maker of the popular premium Hall Of Fame (HOF) series of graphics cards, recently announced its GeForce RTX 3090 HOF Edition GPU. Designed for extreme overclocking, the card features a 12-layer PCB, a 26-phase VRM power delivery configuration, and three 8-pin power connectors. Today, the Chinese YouTube channel 二斤自制 has published the first comprehensive review of the card. However, this wasn't just regular testing at factory settings: the channel flashed a 1000 Watt vBIOS to the GPU and ran everything on the air cooler the card ships with.
At the default 420 Watt setting, the card ran at a GPU clock of 1845 MHz and a temperature of 69 degrees Celsius. With the 1000 Watt vBIOS applied, the GPU core ramped up to 2000 MHz and drew as much as 630 W of power. If you were wondering whether the stock cooler could handle it all, the answer is yes, although the card reached a toasty 96°C. While GALAX doesn't offer a BIOS like this, the BIOS ID corresponds to that of a custom 1000 W XOC BIOS for the EVGA Kingpin GeForce RTX 3090, which you can find in our database. When it comes to performance, the gains were minimal at only 2-3%, most likely due to insufficient cooling; the card could have done much better on water or LN2. The Fire Strike Ultra and Fire Strike Extreme results are displayed below.
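As a quick sanity check on those figures, the short sketch below works out what the 1000 W vBIOS actually bought in relative terms. It uses only the numbers quoted above; the percentages are simple arithmetic, not additional measurements.

```python
# Relative gains from the 1000 W vBIOS, using only the figures quoted above.
stock_power_w, xoc_power_w = 420, 630
stock_clock_mhz, xoc_clock_mhz = 1845, 2000

power_increase = (xoc_power_w / stock_power_w - 1) * 100
clock_increase = (xoc_clock_mhz / stock_clock_mhz - 1) * 100

print(f"Power draw:  +{power_increase:.0f}%")   # +50%
print(f"Core clock:  +{clock_increase:.1f}%")   # +8.4%
print("Benchmark scores: +2-3% (as reported)")
```

In other words, roughly 50% more power for about 8% more clock and a 2-3% score increase, which underlines why the reviewer points at cooling as the limiting factor.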
Sources:
二斤自制 (Chinese YouTube Channel), via VideoCardz
36 Comments on GALAX GeForce RTX 3090 Hall Of Fame (HOF) Edition GPU Benched with Custom 1000 W vBIOS
Another fun thing about Nvidia's top-of-the-line GPUs (1080 Ti, 2080 Ti and 3090) is that there are always XOC BIOSes that bypass the thermal and power limits. You can flash one of these onto your GPU, run some fun benchmarks, then flash back to the original BIOS for gaming.
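For anyone curious what that flash-and-restore cycle looks like in practice, here is a minimal sketch that drives NVIDIA's nvflash tool from Python's subprocess module. The ROM file names are placeholders, nvflash is assumed to be on the PATH, and exact switch behavior can vary between nvflash versions, so treat this as an illustration of the workflow rather than a step-by-step guide.

```python
import subprocess
import sys

# Placeholder file names -- substitute your own ROM paths.
BACKUP_ROM = "backup.rom"     # copy of the card's original BIOS
XOC_ROM = "xoc_1000w.rom"     # the XOC BIOS you intend to benchmark with

def run_nvflash(args):
    """Run nvflash with the given arguments and abort on failure."""
    print("> nvflash", " ".join(args))
    result = subprocess.run(["nvflash", *args])
    if result.returncode != 0:
        sys.exit(f"nvflash exited with code {result.returncode}")

# 1. Save the current BIOS so it can be restored later.
run_nvflash(["--save", BACKUP_ROM])

# 2. Flash the XOC BIOS; -6 overrides the PCI subsystem ID mismatch
#    that a cross-vendor BIOS would otherwise trigger (version dependent).
run_nvflash(["-6", XOC_ROM])

# ... reboot, run the benchmarks ...

# 3. Flash the original BIOS back for day-to-day gaming.
run_nvflash(["-6", BACKUP_ROM])
```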
And I would really like to know, when you say degrade super fast, what kind of window are we talking about: 1 year, 2 years, or maybe 6 months?
www.techpowerup.com/forums/threads/6900xt-degrades-below-stock-speeds-within-4-weeks.279308/
This goes for almost anything: CPUs, RAM, etc. In a given generation/process node, there are yields. As an example, take the RX 6000 GPUs as a whole. According to TechPowerUp's GPU database, all RX 6000 GPUs are built on TSMC's 7 nm node; the 6800 and up use Navi 21, while the 6700 XT uses Navi 22. The transistors are the same, AMD just isn't putting as many of them in Navi 22.

The higher you clock a GPU, the lower the yields you get; that is just how engineering works. This process is called binning, and the dies that can handle the higher speeds won't fail as quickly, because while they all degrade at nearly the same rate (within the same generation/process node), there is a spec, and anything that meets or exceeds that spec will be fine. However, if you spec a GPU with a significantly higher clock speed, you raise the silicon quality required to achieve it, and here is where it can get tricky. It goes one way or the other: either the lower-clocked GPU is super stable with almost zero failure rate and the higher-clocked GPU is stable enough, or the lower-clocked GPU is stable enough and the higher-clocked GPU shows noticeable failure rates.

The higher you clock a product, the less stable it will be at a given voltage. Sure, you can go from 2200 MHz to 2300 MHz on the same voltage, but that just means you didn't actually need that much voltage for 2200 MHz in the first place. Higher clock speeds usually mean more voltage and current, and in the end you want more cores, not more clock speed. For example, if AMD made a Navi GPU with 5120 cores at 1500 MHz, it would be many times more reliable than a Navi GPU with 2560 cores at 3000 MHz, because past a certain point there are diminishing returns with higher clocks. There are more factors at play, obviously, like cooling, current draw, etc., but all I was saying was that I expect the 6700 XT to have higher failure rates than the rest of the RX 6000 GPUs (see the rough calculation below).
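To put a rough number on that wide-and-slow versus narrow-and-fast argument, here is a back-of-the-envelope sketch using the standard CMOS dynamic-power relation P ∝ C·V²·f. The core counts come from the hypothetical example above; the voltages are assumed values purely for illustration, since hitting 3000 MHz would realistically need considerably more voltage than 1500 MHz.

```python
# Back-of-the-envelope comparison of the two hypothetical Navi configs above,
# using the CMOS dynamic power relation P ~ C * V^2 * f.
# The vcore values are assumptions for illustration only.

def relative_dynamic_power(cores: int, freq_mhz: float, vcore: float) -> float:
    """Dynamic power in arbitrary units; switching capacitance is taken
    as proportional to the number of cores."""
    return cores * vcore**2 * freq_mhz

wide_and_slow = relative_dynamic_power(cores=5120, freq_mhz=1500, vcore=0.85)
narrow_and_fast = relative_dynamic_power(cores=2560, freq_mhz=3000, vcore=1.15)

print(f"5120 cores @ 1500 MHz: {wide_and_slow:,.0f} (arbitrary units)")
print(f"2560 cores @ 3000 MHz: {narrow_and_fast:,.0f} (arbitrary units)")
print(f"narrow/fast draws ~{narrow_and_fast / wide_and_slow:.2f}x the power "
      "for the same nominal cores x MHz throughput")
```

Both configurations have the same raw cores × MHz, yet in this toy model the high-clock part burns roughly 80% more power because of the assumed higher voltage, and wear mechanisms that depend on voltage and temperature only make that gap worse.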
Also, the 6800 XT and 6900 XT don't have base clocks that high; they're around 1800-1900 MHz base clock, not game clock. Smear campaign? Um... where are you getting this bullshit from? If yields are better on Navi 22, then alrighty, it won't degrade as quickly as I thought it would; I thought it was running on the same Navi 21 GPU. But yeah, generally speaking, higher clock speeds mean higher voltage, and that's why I was saying it would degrade quicker.
Either state something as an opinion and others can respect it, or state a fact with linked evidence. Then stop acting like children over it.