
Galaxy Readies Dual-Fermi Graphics Card

btarunr

Editor & Senior Moderator
Staff member
Galaxy is finally breaking ground on a graphics card with two GF100 "Fermi" GPUs from NVIDIA, with the company displaying one such design sample at the ongoing Computex event. The dual-Fermi board uses essentially the same design NVIDIA has used for generations of its dual-GPU cards: an internal SLI between two GPUs, which connect to the system bus via an nForce 200 bridge chip and are Quad SLI capable.

Power conditioning and distribution on this design consists of two sets of 4+1 phase VRM, and the card draws power from two 8-pin PCI-Express power connectors. The GPUs carry the marking "GF100-030-A3", which indicates the configuration of the GeForce GTX 465. Since we count 8 memory chips per GPU, with no traces on the reverse side of the PCB suggesting another two chips per GPU on their own memory channels, it is likely that each GPU has a 256-bit wide memory interface. Galaxy, however, calls the card GTX 470 Dual. Output connectivity includes three DVI-D connectors alongside a small air-vent, so the cooler Galaxy designs will likely dissipate hot air around the graphics card rather than out through the rear panel.
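As a quick sanity check of that bus-width estimate (a sketch, assuming standard 32-bit-per-chip GDDR5, with the chip count taken from the paragraph above):

```python
# Rough bus-width estimate from the visible memory chip count.
# Assumption: each GDDR5 chip exposes a 32-bit interface (standard for GDDR5).
chips_per_gpu = 8        # memory chips counted per GPU on this PCB
bits_per_chip = 32       # GDDR5 interface width per chip

print(f"Estimated bus width per GPU: {chips_per_gpu * bits_per_chip}-bit")  # 256-bit
# For reference: GTX 465 uses 8 chips (256-bit), GTX 470 uses 10 (320-bit),
# GTX 480 uses 12 (384-bit), which is why the chip count points to a GTX 465 config.
```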



View at TechPowerUp Main Site
 
Hmmmm with this setup and the hot air dissipated inside, I could turn my case into a nice little hibernation chamber for all sorts of little furry creatures :)
 
Why don't they just put two of these in a single large copper block? Just like those pen drives in cement.

PS. This card reminds me of a motherboard :o
 
When I saw this, one thing came to mind:
System meltdown! :laugh:
 
nice very nice
 
Message to Nvidia & AMD:

We don't need those expensive heaters; work on lowering power consumption instead!
 
Because NVIDIA cards scale so well in SLI, this would only need to be a dual GTX 465 to compete well against the 5970.
 
CPU-grade phases... one bloody huge card too.

I like how NVIDIA is giving AIBs more freedom... they have the potential to develop better cards.

Just noticed the SLI finger is there O_o

I think they should update the ATX spec to support such huge GPUs that need this much cooling.

This card will cost at least $1,000 USD...
 
Ok, this was my first reaction when I saw it:

wtf-cat.jpg


My jaw dropped on the floor.

That just looks like some very organized circuitry, very well done Galaxy!
How come it only has two 8-pin plugs?? Shouldn't it be...3 or 4?
and +1 on the previous comments about heating...:roll:

EDIT: I bet the cooler for this will be massive! If it's even possible to air-cool it...
 
Are they mad? They can barely cool one, never mind two. Fermi would need a die shrink before it's even reasonable to put two on one card.
 
It would have been impressive if they actually had it up and running in a case without bursting into flames or melting :laugh: I must say that if this actually works, it's a darn good job from Galaxy. I was expecting a dual-GPU card only once GF104 comes out. If this is two GTX 470s, they would have to be seriously downclocked and would probably have to lose some SPUs too, because a single GTX 470 consumes almost as much as a 5970. Two GTX 465s seems more realistic to me. It would be quite a fail, though, if the card ends up slower than or on par with the 5970 while costing a lot more and using more power.
 
Ugh. This card is going to be terrible. Even if they use two 465s, that is 200W TDP each, totalling 400W! The thing has two 8-pin PCI-E plugs, each good for 150W max, and the PCI-E slot provides 75W max. That's 375W total, and even if the TDP figures are accurate (which they never are), this thing will pull over 450W at least in FurMark. I can't see any cooler being good enough to keep this thing under 100°C. Only water cooling would be viable, but even that would be a struggle if it's on the same loop as the CPU.

Seriously, by the time they get the TDP low enough to work, it simply won't be powerful enough to beat a 5970.

EDIT: Just noticed in the picture it says below it 'GTX 470 Dual', so that's 215W x 2 = 430W. FAIL!
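Here's that power-budget arithmetic written out as a quick sketch, assuming the spec limits quoted above (150 W per 8-pin connector, 75 W from the slot) and NVIDIA's published TDP figures:

```python
# Power budget sketch for the rumoured card, using the figures quoted above.
# Assumptions: 150 W per 8-pin PCI-E connector, 75 W from the x16 slot,
# and NVIDIA's published TDPs (GTX 465 = 200 W, GTX 470 = 215 W).
budget = 2 * 150 + 75    # two 8-pin plugs plus the slot = 375 W within spec

for name, tdp in [("dual GTX 465", 2 * 200), ("dual GTX 470", 2 * 215)]:
    print(f"{name}: {tdp} W TDP vs {budget} W budget -> {budget - tdp:+} W headroom")
# dual GTX 465: 400 W TDP vs 375 W budget -> -25 W headroom
# dual GTX 470: 430 W TDP vs 375 W budget -> -55 W headroom
```

Either way the published TDPs overshoot the in-spec budget, which is why a dual card would almost certainly ship downclocked and undervolted.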
 
As long as "GF100-030-A3" is written on the gpus this will only be a dual GTX465 card. Even the memory quantity points to this conclusion.
 
Actually, power consumption on Fermi is very dependent on temperature, remember that. There is a massive drop when temperatures are lower.
 
Ugh. This card is going to be terrible. Even if they use two 465s, that is 200W TDP each, totalling 400W! The thing has two 8-pin PCI-E plugs, each good for 150W max, and the PCI-E slot provides 75W max. That's 375W total, and even if the TDP figures are accurate (which they never are), this thing will pull over 450W at least in FurMark. I can't see any cooler being good enough to keep this thing under 100°C. Only water cooling would be viable, but even that would be a struggle if it's on the same loop as the CPU.

Seriously, by the time they get the TDP low enough to work, it simply won't be powerful enough to beat a 5970.

EDIT: Just noticed in the picture it says below it 'GTX 470 Dual', so that's 215W x 2 = 430W. FAIL!

I thought that PCIe v2 could do 150W, not 75W like PCIe v1. I could be wrong though, and not for the first time :eek:

Like the others, this just seems wrong. To get it working they would need to downclock it too much. SLI scaling is not that far off the 5xxx series; many, many sites have done lots of testing, and while NV is ahead, it is not actually by that much in most modern games. The card running dual 465s would not come close to a stock 5970, never mind the OC settings that most run. It would need to be beating 5870 CrossFire results at a lower price to make it worth the purchase, which is not going to happen. Nice option, though, having all 3 DVI outputs on one card for surround, if they ever release the drivers for it.

Actually, power consumption on Fermi is very dependent on temperature, remember that. There is a massive drop when temperatures are lower.

Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as much, power consumption will be lower. If you have the card on water at 100% GPU use, the GPU will use the same power, minus the fan (which uses A LOT of power for a fan, around 20-ish watts or even more, can't remember from the water block reviews; it is a 1.1+ amp fan), as the card will on air. Yes, you may get a slight reduction in power consumption due to thermal dissipation and not as much of the power being converted into heat. (That last sentence might be better explained by someone who actually knows something about thermodynamics.)
 
Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as much, power consumption will be lower. If you have the card on water at 100% GPU use, the GPU will use the same power, minus the fan (which uses A LOT of power for a fan, around 20-ish watts or even more, can't remember from the water block reviews; it is a 1.1+ amp fan), as the card will on air. Yes, you may get a slight reduction in power consumption due to thermal dissipation and not as much of the power being converted into heat. (That last sentence might be better explained by someone who actually knows something about thermodynamics.)

I think the Doctor was referring to thermal runaway on the chips: the positive feedback in MOSFETs produced by heat.
 
I thought that PCIe v2 could do 150W, not 75W like PCIe v1. I could be wrong though, and not for the first time :eek:

Looking into it, you're right that the specifications state 150W. I didn't know that, but the main problem is the motherboard needs to be able to provide that much power. If not, then you're going to cook it. There are boards out there with additional PCI-E power connectors, but they are still not that common.
 
I think the Doctor was referring to thermal runaway on the chips: the positive feedback in MOSFETs produced by heat.

Yup, higher temperature means higher resistance, leading to even more heat buildup (and eventually lots of fun things like large-scale electron tunneling, which is bye-bye chip function unless it plain ol' melts first!). Unfortunately, how are Galaxy going to keep those chips cool? The very good cooling system on the GTX 480 (which would pin an HD 5870 to around 40-50°C under heavy load) still can't keep it under 90°C.
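As a purely illustrative toy model of that feedback loop (assuming leakage grows linearly with die temperature and the cooler has a fixed thermal resistance; the coefficients are invented, not Fermi measurements):

```python
# Toy thermal-runaway model: leakage rises with temperature, temperature rises
# with power. All coefficients are invented for illustration, not Fermi data.
def settle(base_power_w, r_th_c_per_w, ambient_c=40.0, leak_per_c=0.005, steps=200):
    """Iterate power <-> temperature until it settles, or report runaway."""
    temp = ambient_c
    for _ in range(steps):
        power = base_power_w * (1 + leak_per_c * (temp - ambient_c))  # leakage term
        new_temp = ambient_c + r_th_c_per_w * power                   # cooler response
        if abs(new_temp - temp) < 0.01:
            return round(new_temp, 1), round(power, 1)
        temp = new_temp
    return None  # never settled: thermal runaway

print(settle(200, 0.15))  # strong cooler: settles warm, modest extra watts
print(settle(200, 0.40))  # weak cooler: settles much hotter, burning far more power
print(settle(200, 1.10))  # hopeless cooler: None, i.e. runaway
```

The point being: the worse the cooler, the more extra power the same chip burns, which is what the "massive drop when temperatures are lower" comment above is getting at.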
 
well if this "Abomination " has 2 465 cores why do they call it 470 dual ?
 
As long as "GF100-030-A3" is written on the gpus this will only be a dual GTX465 card. Even the memory quantity points to this conclusion.

Very nice point, thanks.
 
I can't wait to see what the price will be for something like this :laugh:. You would have to convert a small fridge, like college students use in their dorm rooms, into a computer case just to keep your system from frying with these in it.

NVIDIA also needs to take a page out of AMD's book and allow tri-SLI with a dual-GPU plus single-GPU setup.
 
I am not really into it anymore... so, one question: is micro-stuttering still an issue in SLI systems?
 
Also, we should remember that more often than not dual-GPU cards are downclocked and undervolted, so I wouldn't expect to see anything like double the power requirements.
 