Wednesday, June 2nd 2010

Galaxy Readies Dual-Fermi Graphics Card

Galaxy is finally breaking ground on graphics cards with two NVIDIA GF100 "Fermi" GPUs, with the company displaying one such design sample at the ongoing Computex event. The dual-Fermi board uses essentially the same design NVIDIA has employed for generations of its dual-GPU cards: an internal SLI link between the two GPUs, which connect to the system bus via an nForce 200 bridge chip and are Quad SLI capable.

The power conditioning and distribution on this design consists of two sets of 4+1 phase VRMs; the card draws power from two 8-pin PCI-Express power connectors. The GPUs carry the marking "GF100-030-A3", which indicates the configuration of the GeForce GTX 465. Since we count eight memory chips per GPU, with no traces on the reverse side of the PCB indicating another two chips per GPU on their own memory channels, it is likely that each GPU has a 256-bit wide memory interface. Galaxy, however, calls the card the GTX 470 Dual. Output connectivity includes three DVI-D connectors alongside a small air vent. It's likely that the cooler Galaxy designs will dissipate hot air around the graphics card rather than out through the rear panel.
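As a quick sanity check of that bus-width inference, here is a back-of-the-envelope sketch, assuming one 32-bit GDDR5 channel per memory chip (the usual wiring on GF100-class boards):

# Bus-width inference from visible memory chips: a rough check.
# Assumes one 32-bit GDDR5 channel per chip (typical for GF100-class boards).
CHANNEL_BITS = 32

def bus_width(chips_per_gpu: int) -> int:
    return chips_per_gpu * CHANNEL_BITS

print(bus_width(8))   # 8 chips counted  -> 256-bit, GTX 465-like
print(bus_width(10))  # 10 chips         -> 320-bit, what a real GTX 470 would need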
Source: HotHardware

105 Comments on Galaxy Readies Dual-Fermi Graphics Card

#1
Tatty_Two
Gone Fishing
Hmmmm, with this setup and the hot air dissipated inside, I could turn my case into a nice little hibernation chamber for all sorts of little furry creatures :)
#2
caleb
Why don't they just put two of these in a single large copper block? Just like those pendrives in cement.

PS. This card reminds me of a motherboard :o
#3
Kiji
When I saw this, one thing came to mind:
System meltdown! :laugh:
#5
blobster21
Message to Nvidia & AMD:

We don't need those expensive heaters; work on lowering power consumption instead!
#6
wolf
Better Than Native
Because NVIDIA cards scale so well in SLI, this would only need to be a dual GTX 465 to compete well against the 5970.
#7
tkpenalty
CPU phases... one bloody huge card, too.

I like how NVIDIA is giving AIBs more freedom... they have the propensity to develop better cards.

Just noticed the SLI finger is there O_o

I think they should update the ATX spec to support such huge GPUs that need this much cooling.

This card will cost at least $1,000 USD...
#8
Regeneration
NGOHQ.COM
I don't know why, but the card looks a bit fake to me.
#9
_JP_
Ok, this was my first reaction when I saw it: my jaw dropped to the floor.

That just looks like some very organized circuitry; very well done, Galaxy!
How come it only has two 8-pin plugs? Shouldn't it be... 3 or 4?
And +1 on the previous comments about heating... :roll:

EDIT: I bet the cooler for this will be massive! If it is even possible to air-cool it...
#10
FordGT90Concept
"I go fast!1!11!1!"
Are they mad? They can barely cool one, never mind two. Fermi would need a die shrink before it is even reasonable to put two on one card.
#11
Yellow&Nerdy?
It would have been impressive if they actually had it up and running in a case, without bursting into flames or melting :laugh: I must say that if this actually works, it's a darn good job from Galaxy. I was expecting the first dual-GPU card when GF104 comes out. If this is two GTX 470s, then they would have to be seriously downclocked, and they would probably have to kill some SPUs too, because a GTX 470 consumes almost as much as a 5970. Two GTX 465s seem more realistic to me. It would be quite a fail, though, if the card ends up being slower than or at the same speed as the 5970 but costs a lot more and uses more power.
#12
HillBeast
Ugh. This card is going to be terrible. Even if they use two 465s, that is 200W TDP each, totalling 400W! The thing has two 8-pin PCI-E plugs, which can do a MAX of 150W each, and the PCI-E socket provides a MAX of 75W. That's 375W total, and even if the TDP figures are accurate (which they never are), this thing will pull over 450W in FurMark, at least. I can't see any cooler being good enough to keep this thing under 100°C. Only water cooling would be viable, but even that would be a struggle if it's on the same loop as the CPU.

Seriously, by the time they get the TDP low enough to work, it simply won't be powerful enough to beat a 5970.

EDIT: Just noticed the picture says 'GTX 470 Dual' below it, so that's 215W x 2 = 430W. FAIL!
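For reference, the arithmetic in that comment laid out as a snippet (connector limits as quoted in the comment; the TDP figures are the poster's claims, not measured numbers):

# Power-budget arithmetic from the comment above.
SLOT_W = 75              # PCI-E x16 slot limit, as quoted
EIGHT_PIN_W = 150        # per 8-pin PCI-E connector, as quoted
available = SLOT_W + 2 * EIGHT_PIN_W     # 375 W deliverable in-spec

dual_465 = 2 * 200       # 400 W, using the quoted GTX 465 TDP
dual_470 = 2 * 215       # 430 W, using the quoted GTX 470 TDP
print(available, dual_465 - available, dual_470 - available)  # 375 25 55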
#13
Parad0x
As long as "GF100-030-A3" is written on the GPUs, this will only be a dual GTX 465 card. Even the memory quantity points to this conclusion.
#14
DrPepper
The Doctor is in the house
Actually, power consumption on Fermi is very dependent on temperature; remember that, people. There is a massive drop when temperatures are lower.
#15
poo417
HillBeast: Ugh. This card is going to be terrible. Even if they use two 465s, that is 200W TDP each, totalling 400W! The thing has two 8-pin PCI-E plugs, which can do a MAX of 150W each, and the PCI-E socket provides a MAX of 75W. That's 375W total, and even if the TDP figures are accurate (which they never are), this thing will pull over 450W in FurMark, at least. I can't see any cooler being good enough to keep this thing under 100°C. Only water cooling would be viable, but even that would be a struggle if it's on the same loop as the CPU.

Seriously, by the time they get the TDP low enough to work, it simply won't be powerful enough to beat a 5970.

EDIT: Just noticed the picture says 'GTX 470 Dual' below it, so that's 215W x 2 = 430W. FAIL!
I thought that PCIe v2 could do 150W, not 75W like PCIe v1. I could be wrong though, and not for the first time :eek:

Like the others, this just seems wrong. To get it working they would need to downclock it too much. SLI scaling is not that far off the 5xxx series; many, many sites have done lots of testing, and while NV is ahead, it is not actually by that much in most modern games. The card running dual 465s would not come close to a stock 5970, never mind the OC settings that most run. It would need to be running beyond 5870 CrossFire results at a lesser price to make it worth the purchase, which is not going to happen. Nice option, though, having all 3 DVI outputs on one card for Surround, if they ever release the drivers for it.
DrPepper: Actually, power consumption on Fermi is very dependent on temperature; remember that, people. There is a massive drop when temperatures are lower.
Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as much, power consumption will be lower. But if you have the card on water at 100% GPU use, the GPU will use the same power as it will on air, minus the fan (which uses a lot of power for a fan, around 20-ish watts or even more; I can't remember from the water block reviews, but it is a 1.1+ amp fan). Yes, you may get a slight reduction in power consumption due to thermal dissipation, with not as much of the power being converted into heat. (That last sentence might be better explained by someone who actually knows something about thermodynamics.)
#16
_JP_
poo417: Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as much, power consumption will be lower. But if you have the card on water at 100% GPU use, the GPU will use the same power as it will on air, minus the fan (which uses a lot of power for a fan, around 20-ish watts or even more; I can't remember from the water block reviews, but it is a 1.1+ amp fan). Yes, you may get a slight reduction in power consumption due to thermal dissipation, with not as much of the power being converted into heat. (That last sentence might be better explained by someone who actually knows something about thermodynamics.)
I think the Doctor was referring to thermal runaway on the chips: the positive feedback in MOSFETs produced by heat.
#17
HillBeast
poo417: I thought that PCIe v2 could do 150W, not 75W like PCIe v1. I could be wrong though, and not for the first time :eek:
Looking into it, you're right that the specifications state 150W. I didn't know that, but the main problem is the motherboard needs to be able to provide that much power; if not, then you're going to cook it. There are boards out there with additional PCI-E power connectors, but they are still not that common.
#18
bobzilla2009
_JP_: I think the Doctor was referring to thermal runaway on the chips: the positive feedback in MOSFETs produced by heat.
Yup, higher temperature means higher resistance, leading to even more heat buildup (and eventually lots of fun things like large-scale electron tunneling, which is bye-bye chip function, unless it plain ol' melts first!). Unfortunately, how are Galaxy going to keep those chips cool? The very good cooling system on the GTX 480 (which would pin an HD 5870 to around 40-50°C under heavy load) still can't keep it under 90°C.
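A minimal sketch of that feedback loop; every constant below is an illustrative assumption, not measured Fermi data:

# Thermal runaway as a simple fixed-point iteration.
# All constants are made-up illustrative values.
AMBIENT_C = 25.0   # case air temperature (assumed)
R_TH = 0.25        # cooler thermal resistance, °C per watt (assumed)
P_BASE = 250.0     # power drawn at ambient temperature (assumed)
K_LEAK = 1.2       # extra watts per °C of die temperature (assumed slope)

temp_c, power_w = AMBIENT_C, P_BASE
for _ in range(50):
    temp_c = AMBIENT_C + R_TH * power_w                 # die heats with power
    power_w = P_BASE + K_LEAK * (temp_c - AMBIENT_C)    # leakage grows with heat
print(round(temp_c, 1), round(power_w, 1))              # settles near 114.3 °C, 357.1 W

# The loop converges only while K_LEAK * R_TH < 1; a weaker cooler
# (higher R_TH) pushes it toward runaway.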
#19
TVman
Well, if this "abomination" has two 465 cores, why do they call it a 470 Dual?
#20
btarunr
Editor & Senior Moderator
Parad0x: As long as "GF100-030-A3" is written on the GPUs, this will only be a dual GTX 465 card. Even the memory quantity points to this conclusion.
Very nice point, thanks.
#22
DarthCyclonis
I can't wait to see what the price will be for something like this :laugh:. You would have to convert a small fridge, like college students use in their dorm rooms, into a computer case just to keep your system from frying with these in it.

NVIDIA also needs to take a play out of AMD's book and allow tri-SLI with a dual-GPU and single-GPU setup.
#23
Tannhäuser
I am not really into it anymore... so, one question: is micro-stuttering still an issue in SLI systems?
#24
Tatty_Two
Gone Fishing
Also, we should remember that more often than not dual-GPU cards are downclocked and undervolted, so I wouldn't expect to see anything like double the power requirements.
#25
newtekie1
Semi-Retired Folder
wolf: Because NVIDIA cards scale so well in SLI, this would only need to be a dual GTX 465 to compete well against the 5970.
Parad0x: As long as "GF100-030-A3" is written on the GPUs, this will only be a dual GTX 465 card. Even the memory quantity points to this conclusion.
I was going to come in and say exactly these two points.

Based on the memory configuration, these are definitely GTX 465 cores, unless Galaxy hid two memory chips per GPU on the back of the card.

And yeah, two GTX 480s perform about the same as two HD 5970s, so really two GTX 465s would probably scale well enough to match a single HD 5970.

And if these are GTX 465s, then we are only looking at power consumption in the 300W range at peak. That shouldn't be too big of an issue, considering it is only about 20W beyond what the HD 4870 X2 ran at, and those were fine.
poo417: Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as much, power consumption will be lower. But if you have the card on water at 100% GPU use, the GPU will use the same power as it will on air, minus the fan (which uses a lot of power for a fan, around 20-ish watts or even more; I can't remember from the water block reviews, but it is a 1.1+ amp fan). Yes, you may get a slight reduction in power consumption due to thermal dissipation, with not as much of the power being converted into heat. (That last sentence might be better explained by someone who actually knows something about thermodynamics.)
It might not make any sense to you, but W1z's latest review of the GTX 480 proves it. Fermi uses less power when it is cooler. The fan speed didn't affect the card's power consumption; in fact, in his tests, when he lowered the fan speed via software, power consumption went up because the card was running hotter.

Now, we aren't sure if that is because the VRM area is getting hotter and less efficient, or if the GPU itself is less efficient due to voltage leakage, or a combination of both. However, the fact is, temperature and temperature alone is what is causing Fermi to consume so much power. In fact, it is almost an exact linear progression: for every 1°C hotter the card runs, it needs 1.2W more power.
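That claimed fit, as a one-line model (only the 1.2 W/°C slope comes from the comment; the reference temperature and power below are hypothetical, not figures from the review):

# Linear temperature-to-power model from the claim above.
def fermi_power_w(temp_c: float, base_power_w: float = 250.0,
                  base_temp_c: float = 70.0) -> float:
    # base_power_w and base_temp_c are hypothetical reference points.
    return base_power_w + 1.2 * (temp_c - base_temp_c)

# Dropping the core from 94 °C to 70 °C would save about 28.8 W under this model.
print(round(fermi_power_w(94.0) - fermi_power_w(70.0), 1))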