Friday, March 5th 2021

GALAX GeForce RTX 3090 Hall Of Fame (HOF) Edition GPU Benched with Custom 1000 W vBIOS

GALAX, maker of the premium Hall Of Fame (HOF) series of graphics cards, recently announced its GeForce RTX 3090 HOF Edition. Designed for extreme overclocking, the card features a 12-layer PCB, a 26-phase VRM power delivery configuration, and three 8-pin power connectors. Today, the first comprehensive review of the card has arrived, courtesy of the Chinese YouTube channel 二斤自制. However, this wasn't just regular testing of a card at factory settings: the channel applied a 1000 W vBIOS to the GPU and ran everything on the air cooler the card ships with.

At the default 420 W setting, the card ran at a GPU clock of 1845 MHz and a temperature of 69 °C. With the 1000 W vBIOS applied, however, the GPU core managed to ramp up to 2000 MHz and consumed as much as 630 W of power. If you were wondering whether the stock cooler could handle it all, the answer is yes, though the card reached a toasty 96 °C. While GALAX doesn't offer a BIOS like this, its ID corresponds to that of a custom XOC 1000 W BIOS for the EVGA Kingpin GeForce RTX 3090, which you can find in our database. When it comes to performance, the gains were minimal at only 2-3%. That is likely down to insufficient cooling; the card could have done much better on water or LN2. The Fire Strike Ultra and Fire Strike Extreme results are displayed below.
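As a back-of-the-envelope illustration of the efficiency trade-off, the figures above can be compared directly (a sketch using only the numbers reported in the review, not measured benchmark data):

```python
# Rough perf-per-watt comparison: 420 W default vs. the 1000 W XOC vBIOS.
stock_power_w = 420   # default power limit
xoc_power_w = 630     # peak draw observed with the XOC vBIOS
perf_gain = 1.03      # best-case ~3% benchmark uplift

power_ratio = xoc_power_w / stock_power_w      # 1.5x the power...
relative_efficiency = perf_gain / power_ratio  # ...for ~3% more performance
print(f"{power_ratio:.2f}x power for {perf_gain:.2f}x performance")
print(f"relative perf/W vs. stock: {relative_efficiency:.2f}")
```

In other words, roughly 50% more power buys about 3% more performance on air, which is why the review suggests water or LN2 would have done much better.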
Sources: 二斤自制 (Chinese YouTube Channel), via VideoCardz

36 Comments on GALAX GeForce RTX 3090 Hall Of Fame (HOF) Edition GPU Benched with Custom 1000 W vBIOS

#1
Ja.KooLit
So... in other words, no point if increase is very minimal...
#2
nguyen
Yeah I would rather that my 3090 remain in the 300 watts range for best efficiency.
#3
thesmokingman
However, when the 1000 Watt vBIOS was applied to the card, the GPU core has managed to ramp to 2000 MHz and consume as much as 630 W of power. If you were wondering if the stock cooler was able to handle it all, the answer is yes. The card has reached a toasty 96 C temperature.
LMAO, 630W power draw, smh.
#4
Mussels
Freshwater Moderator
And my Galax 3090 locked to 1800 MHz only uses 230 W

These cards are NOT efficient when ramped up
#5
henok.gk
Best for taking over 3DMark leaderboards with LN2 of course.
#7
HenrySomeone
night.fox: So... in other words, no point if increase is very minimal...
XOC also requires appropriate cooling to go with it (as stated) and it's obviously not meant for everyday use.
#8
mrthanhnguyen
Nowadays we'll probably see these babies in a mining rig instead of a gaming PC.
#9
Unregistered
Never liked their HoF cards. Especially in terms of aesthetics. LN2 cooling as a whole is just pointless showmanship to me. I'm more interested in what these cards can reach on their stock air coolers.

Also I got more efficiency out of my Ampere card when undervolting it rather than overvolting it.
#10
Fatalfury
Bragging rights card..
not much practical use to overclock nowadays.
#11
metalfiber
These results are from running AIDA64 GPU stress test and GPU-Z stress at the same time with no overclock...HOF? more like David Hasselhoff.

EVGA 3090 XC3 ULTRA HYBRID
#12
trog100
efficiency doesn't come into it when 3dmark scores are the game.. i would have assumed most tpu enthusiasts knew this but it seems not..

trog
#13
GhostRyder
That's funny, I would love to play with a bios like that. I wonder how long the card would last like that just gaming.
#14
hat
Enthusiast
Mussels: And my Galax 3090 locked to 1800 MHz only uses 230 W

These cards are NOT efficient when ramped up
Is anything ever efficient when ramped up?
#15
Tatty_Two
Gone Fishing
Has my math gone bad (TBH it was never that good), how does a card draw 630w from three 8 pin PCI-E cables and a 75w motherboard slot or am I missing something? I make that 525w max.
#16
Berfs1
^needs AX1600i just for water cooling this motherf--ker^
Tatty_One: Has my math gone bad (TBH it was never that good), how does a card draw 630w from three 8 pin PCI-E cables and a 75w motherboard slot or am I missing something? I make that 525w max.
If the cables are good enough, they can handle more power. It isn't exactly "safe" to do this on a regular basis, but pulling more power than the connectors officially support has been a thing in extreme overclocking for about a decade now. But no, your math isn't wrong: 525 W should be the maximum sustained power consumption for 3x 8-pin + PCIe slot.
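The spec arithmetic in the posts above can be written out as a tiny sketch, using the standard PCI-E power budgets (150 W per 8-pin connector, 75 W from the slot):

```python
def spec_max_power(num_8pin: int, per_8pin_w: int = 150, slot_w: int = 75) -> int:
    """Maximum sustained board power allowed by the PCI-E spec budgets."""
    return num_8pin * per_8pin_w + slot_w

print(spec_max_power(3))        # 3x 8-pin + slot: 525 W
print(630 - spec_max_power(3))  # the 630 W peak exceeds that budget by 105 W
```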
#17
GhostRyder
Tatty_One: Has my math gone bad (TBH it was never that good), how does a card draw 630w from three 8 pin PCI-E cables and a 75w motherboard slot or am I missing something? I make that 525w max.
Maybe it's overvolting from the connectors LOL. I mean, I have done that to a motherboard and damaged one before :P
#18
Ubersonic
Tatty_One: Has my math gone bad (TBH it was never that good), how does a card draw 630w from three 8 pin PCI-E cables and a 75w motherboard slot or am I missing something? I make that 525w max.
You're thinking of what the connectors are rated for in the PCI-E specification, but the physical connectors are actually rated much higher in their design specification (what their designers/manufacturers say is safe); the PCI-E spec errs on the side of caution. It's usual for GPU manufacturers to follow the PCI-E spec (IIRC they can't put the PCI-E logo on the box if they don't), but you do occasionally see AIBs release cards like this that violate the PCI-E spec out of the box (AMD/NVIDIA themselves have done it too, but it's very rare; I think the last time was the reference RX 480).

This is one of the reasons NVIDIA "invented" their 12-pin connector for the RTX 3000 series: by using a single 12-pin micro connector from Molex, they could ignore the PCI-E spec without breaking it. And because it wasn't part of the PCI-E spec, they could claim it was rated for whatever wattage they felt like (as long as it didn't exceed Molex's own specs ofc). Had the 12-pin connector been part of the PCI-E spec, it would only have been rated for 100-150 W.
#19
Tatty_Two
Gone Fishing
Yeah, I suppose, thinking about it, most single 8-pin cards tend to have a max TDP rating (on the box) of around 180-200 W, probably for a safety margin. My 2060 Super is rated at 200 W, but as many of W1z's recent reviews show, they commonly exceed the ratings; my card overclocks so well it throttles even at the slider max of +14% TDP.
#20
erek
The real king is the GALAX Dual 3090 HOF Limited

this example has Dual GA100 chips, not GA102s
#21
lesovers
Fire Strike Extreme and Fire Strike Ultra are not the best benchmarks for showing off the 3090 in overclocking. My stock reference 6900 XT gives better results just by turning the power up:

25523 (27679 Graphics) - Fire Strike Extreme
13879 (13970 Graphics) - Fire Strike Ultra
#22
TheDeeGee
But they're using a riser cable... aren't those 3.0 spec?
#23
Jism
thesmokingman: LMAO, 630W power draw, smh.
As frequency scales up, so does the power requirement. There's a line between best possible efficiency and going beyond it in exchange for higher power consumption.

The thing is, the 3090 has so many little cores in there that a small clock increase can run into hundreds of watts. AMD is right: the future of GPUs isn't one big phat chip anymore, but multiple chips put together.
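The scaling point above matches the standard CMOS dynamic-power relation, P ≈ C·V²·f: raising the clock usually requires raising the voltage too, so power grows much faster than frequency. A sketch with hypothetical voltages (the review does not report them):

```python
# CMOS dynamic-power scaling: P ~ C * V^2 * f (illustrative model only).
def scaled_power(base_power_w: float, base_mhz: float, base_v: float,
                 target_mhz: float, target_v: float) -> float:
    return base_power_w * (target_mhz / base_mhz) * (target_v / base_v) ** 2

# Hypothetical operating points: 1845 MHz @ 0.90 V vs. 2000 MHz @ 1.09 V
print(round(scaled_power(420, 1845, 0.90, 2000, 1.09)))  # ~668 W
```

With these made-up voltages the model lands in the same region as the observed 420 W to 630 W jump for an ~8% clock bump, but the exact figure depends entirely on the card's real voltage/frequency curve.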
#24
HenrySomeone
Hmmm, we saw multiple chips on one board before (up to 4 in fact) and it didn't end that well... Yes, I know the implementation that is in the works now is different, but there are still quite a few honest doubts to be had about how well this will turn out (and when).
#25
Berfs1
Mussels: And my Galax 3090 locked to 1800 MHz only uses 230 W

These cards are NOT efficient when ramped up
just a thought, what if we liquid cooled the graphics cards? Maybe the point of diminishing returns is when under load, the GPU hits mid 40s, and the clocks and voltages needed to hit that would be probably the most optimal point before noticeable diminishing returns. NVIDIA had a really good cooler design, then decided "oh yeah lets just overclock the absofuckingshit out of these cards so that amd doesnt stand a chance", which is exactly why NVIDIA's Ampere cards don't have as good of a performance/watt as they could have.

On an off topic note, AMD's 6700 XT is not going to be a reliable card. That card's base clock is at 2300+ MHz. Yes, base clock. That card will degrade super fast.