Wednesday, June 2nd 2010
Galaxy Readies Dual-Fermi Graphics Card
Galaxy is finally breaking ground on graphics cards with two GF100 "Fermi" GPUs from NVIDIA, with the company displaying one such design sample at the ongoing Computex event. The dual-Fermi board uses essentially the same design NVIDIA has employed for generations of its dual-GPU cards: an internal SLI link between the two GPUs, which connect to the system bus via an nForce 200 bridge chip, with Quad SLI capability.
The power conditioning and distribution on this design consists of two sets of 4+1 phase VRM, and the card draws power from two 8-pin PCI-Express power connectors. The GPUs carry the marking "GF100-030-A3", which indicates they have the configuration of the GeForce GTX 465. Since we count eight memory chips per GPU, with no traces on the reverse side of the PCB indicating two additional chips per GPU sitting on their own memory channels, the GPUs likely have a 256-bit wide memory interface (eight 32-bit GDDR5 chips). Galaxy, however, calls the card GTX 470 Dual. Output connectivity includes three DVI-D connectors, alongside a small air vent. The cooler Galaxy designs will likely dissipate hot air around the graphics card, rather than out through the rear panel.
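As a rough sanity check on that memory estimate, here is a minimal Python sketch of the arithmetic; the 32-bit-per-chip interface and 1 Gbit density are assumptions based on typical GDDR5 parts of this era, not figures confirmed by Galaxy:

# Back-of-the-envelope check of the likely memory configuration per GPU.
# Assumes standard GDDR5 chips: 32-bit interface and 1 Gbit density each.
chips_per_gpu = 8                       # chips counted on the PCB per GPU
bus_width = chips_per_gpu * 32          # 8 x 32-bit = 256-bit interface
memory_gb = chips_per_gpu * 1 / 8       # 8 x 1 Gbit = 1 GB frame buffer
print(f"{bus_width}-bit bus, {memory_gb:.0f} GB per GPU")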
Source: HotHardware
105 Comments on Galaxy Readies Dual-Fermi Graphics Card
PS. This card reminds me of a motherboard :o
System meltdown! :laugh:
We don't need those expensive heaters; work on lowering power consumption instead!
I like how nvidia is giving AIBs more freedom... they have the propensity to develop better cards.
Just noticed the SLI finger is there O_o
I think they should update the ATX spec to support such huge GPUs that need that much cooling.
this card will cost at least $1000USD...
My jaw dropped on the floor.
That just looks like some very organized circuitry, very well done Galaxy!
How come it only has two 8-pin plugs?? Shouldn't it be...3 or 4?
and +1 on the previous comments about heating...:roll:
EDIT: I bet the cooler for this will be massive! If it's even possible to air-cool it...
Seriously, by the time they get the TDP good enough to work, it simply won't be powerful enough to beat a 5970.
EDIT: Just noticed in the picture it says below it 'GTX 470 Dual' so that's 215W x 2 = 430W. FAIL!
Like the others, this just seems wrong. To get it working they would need to downclock it too much. SLI scaling is not that far off the 5xxx series; many sites have done lots of testing, and while NV is ahead, it is not actually by that much in most modern games. A card running dual 465s would not come close to a stock 5970, never mind the OC settings that most run. It would need to be running better than 5870 CrossFire results at a lesser price to make it worth the purchase, which is not going to happen. Nice option, though, having all 3 DVI outputs on one card for surround, if they ever release the drivers for it.

Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as much, power consumption will be lower. But if you have the card on water at 100% GPU use, the GPU will use the same power as it does on air, minus the fan (which uses a LOT of power for a fan, around 20-ish watts or even more, I can't remember exactly from the water block reviews; it is a 1.1+ amp fan). You may get a slight reduction in power consumption due to better thermal dissipation and not as much of the power being converted into heat (that last sentence might be better explained by someone who actually knows something about thermodynamics).
Nvidia also needs to take a page out of AMD's book and allow Tri-SLI with a dual-GPU and single-GPU setup.
These, based on the memory configuration, are definitely GTX465 cores, unless Galaxy hid 2 memory chips on the back of the card.
And yeah, two GTX480s perform the same as two HD5970s, so really two GTX465s would probably scale well enough to match a single HD5970.
And if these are GTX465s, then we are only looking at power consumption in the 300 W range at peak. That shouldn't be too big of an issue considering it is only about 20 W beyond what the HD4870X2 ran at, and they were fine. It might not make any sense to you, but W1z's latest review of the GTX480 proves it. Fermi uses less power when it is cooler. The fan speed didn't affect the card's power consumption; in fact, in his tests, when he lowered the fan speed via software, power consumption went up because the card was running hotter.
Now, we aren't sure if that is because the VRM area is getting hotter and less efficient, or if the GPU itself is less efficient due to voltage leakage, or a combination of both. However, the fact is, temperature and temperature alone is what is causing Fermi to consume so much power. In fact it is almost an exact linear progression: for every 1°C hotter the card runs, it needs 1.2 W more power.
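If that roughly linear relationship holds, it is easy to sketch. Only the 1.2 W/°C slope comes from the observation above; the reference point of 250 W at 70°C is an illustrative assumption, not a measured figure:

# Hypothetical linear model of the temperature/power behaviour described above.
# Only the 1.2 W per degree C slope comes from the comment; the 250 W @ 70 C
# reference point is an assumed baseline for illustration.
def estimated_power(temp_c, ref_power_w=250.0, ref_temp_c=70.0, slope=1.2):
    """Return the estimated board power (W) at a given GPU temperature (C)."""
    return ref_power_w + slope * (temp_c - ref_temp_c)

for t in (70, 80, 90):
    print(f"{t} C -> {estimated_power(t):.0f} W")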