# Galaxy Readies Dual-Fermi Graphics Card



## btarunr (Jun 2, 2010)

Galaxy is finally breaking ground on graphics cards with two GF100 "Fermi" GPUs from NVIDIA, with the company displaying one such design sample at the ongoing Computex event. The dual-Fermi board uses essentially the same design NVIDIA has been using for generations of its dual-GPU cards, involving an internal SLI between two GPUs, which connect to the system bus via an nForce 200 bridge chip, and are Quad SLI capable. 

Power conditioning and distribution on this design consists of two sets of 4+1 phase VRMs; the card draws power from two 8-pin PCI-Express power connectors. The GPUs carry the marking "GF100-030-A3", which indicates the configuration of the GeForce GTX 465. Since we count 8 memory chips per GPU, with no traces on the reverse side of the PCB suggesting another two memory chips per GPU on their own memory channels, it is likely that the GPUs have a 256-bit wide memory interface. Galaxy, however, calls the card the GTX 470 Dual. Output connectivity includes three DVI-D connectors, with a small air vent. It's likely that the cooler Galaxy designs will dissipate hot air around the graphics card, rather than out through the rear panel.
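The bus-width deduction above is simple arithmetic: each GDDR5 chip contributes a 32-bit channel, so the visible chip count fixes the interface width. A minimal sketch (Python, for illustration):

```python
# Each GDDR5 memory chip has a 32-bit interface, so the total memory
# bus width of a GPU is just (number of chips) x 32.
GDDR5_CHIP_WIDTH_BITS = 32

def bus_width_bits(chips_per_gpu: int) -> int:
    """Memory interface width implied by the visible chip count."""
    return chips_per_gpu * GDDR5_CHIP_WIDTH_BITS

print(bus_width_bits(8))   # 256 -> the GTX 465-like config seen here
print(bus_width_bits(10))  # 320 -> what a full GTX 470 would need
```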



 

 

 



*View at TechPowerUp Main Site*


----------



## Tatty_One (Jun 2, 2010)

Hmmmm, with this setup and the hot air dissipated inside, I could turn my case into a nice little hibernation chamber for all sorts of little furry creatures.


----------



## caleb (Jun 2, 2010)

Why don't they just put two of these in a single large copper block? Just like those pen drives in cement.

PS: This card reminds me of a motherboard.


----------



## Kiji (Jun 2, 2010)

When I saw this, one thing came to mind:
System meltdown!


----------



## Hunt3r (Jun 2, 2010)

Nice, very nice.


----------



## blobster21 (Jun 2, 2010)

Message to Nvidia & AMD:

We don't need those expensive heaters, work on lowering power consumption instead !


----------



## wolf (Jun 2, 2010)

Because NVIDIA cards scale so well in SLI, this would only need to be a dual GTX 465 to compete well against the 5970.


----------



## tkpenalty (Jun 2, 2010)

CPU-class power phases... one bloody huge card, too.

I like how NVIDIA is giving AIBs more freedom... they have the capacity to develop better cards.

Just noticed the SLI finger is there.

I think they should update the ATX spec to support huge GPUs that need this much cooling.

This card will cost at least US $1,000...


----------



## Regeneration (Jun 2, 2010)

I don't know why, but the card looks a bit fake to me.


----------



## _JP_ (Jun 2, 2010)

Ok, this was my first reaction when I saw it:







My jaw dropped on the floor.

That just looks like some very organized circuitry, very well done Galaxy!
How come it only has two 8-pin plugs? Shouldn't it be 3 or 4?
And +1 on the previous comments about heating...

EDIT: I bet the cooler for this will be massive! If it's even possible to air-cool...


----------



## FordGT90Concept (Jun 2, 2010)

Are they mad?  They can barely cool one, nevermind two.  Fermi would need a die shrink before it is even reasonable to put two on one card.


----------



## Yellow&Nerdy? (Jun 2, 2010)

It would have been impressive if they actually had it up and running in a case without bursting into flames or melting. I must say that if this actually works, it's a darn good job from Galaxy. I was expecting the first dual-GPU card only when GF104 comes out. If this is two GTX 470s, then they would have to be seriously downclocked, and they would probably have to disable some SPs too, because a single GTX 470 consumes almost as much power as a 5970. Two GTX 465s seems more realistic to me. It would be quite a fail, though, if the card ends up slower than or on par with the 5970 while costing a lot more and using more power.


----------



## HillBeast (Jun 2, 2010)

Ugh. This card is going to be terrible. Even if they use two 465s, that is 200W TDP each, totalling 400W! The thing has two 8-pin PCI-E plugs, which are rated for a max of 150W each, and the PCI-E slot provides a max of 75W. That's 375W total, and even if the TDP figures are accurate (which they never are), this thing will pull over 450W in FurMark at least. I can't see any air cooler being good enough to keep this thing under 100°C. Only water cooling would be viable, and even that would be a struggle if it's on the same loop as the CPU.

Seriously, by the time they cut the TDP enough for it to work, it simply won't be powerful enough to beat a 5970.

EDIT: Just noticed the picture says 'GTX 470 Dual' below it, so that's 215W x 2 = 430W. FAIL!
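The spec math in this post can be sketched as follows; the 75W and 150W figures are the PCI-Express slot and 8-pin connector ratings, and 215W is the GTX 470's rated TDP:

```python
# In-spec PCI Express power sources for a graphics card (watts).
PCIE_SLOT_W = 75    # the x16 slot itself
PCIE_8PIN_W = 150   # each 8-pin auxiliary connector

def board_power_budget(n_8pin: int) -> int:
    """Maximum in-spec board power with the given number of 8-pin plugs."""
    return PCIE_SLOT_W + n_8pin * PCIE_8PIN_W

budget = board_power_budget(2)
dual_470_tdp = 2 * 215  # two GTX 470 GPUs at their rated TDP

print(budget)                 # 375
print(dual_470_tdp)           # 430
print(dual_470_tdp > budget)  # True -> over the in-spec budget
```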


----------



## Parad0x (Jun 2, 2010)

As long as "GF100-030-A3" is written on the GPUs, this will only be a dual GTX 465 card. Even the memory quantity points to this conclusion.


----------



## DrPepper (Jun 2, 2010)

Actually, power consumption on Fermi is very dependent on temperature, people, remember that. There is a massive drop when temperatures are lower.


----------



## poo417 (Jun 2, 2010)

HillBeast said:


> Ugh. This card is going to be terrible. Even if they use two 465s, that is 200W TDP each, totalling to 400W! The thing has 2 8 pin PCI-E plugs which can do MAX 150W and the PCI-E socket provides MAX 75W. That's 375W total, and even if the TDP figures are accurate (which they never are) this thing will pull over 450W at least in FurMark. I can't see any cooler being good enough to keep this thing under 100C. Only water cooling would be viable but even that would be a struggle if it's on the same loop as the CPU.
> 
> Seriously by the time they get the TDP good enough to work, it simply won't be powerful enough to beat a 5970.
> 
> EDIT: Just noticed in the picture it says below it 'GTX 470 Dual'  so that's 215W x 2 = 430W. FAIL!



I thought PCI-E v2 slots could do 150W, not 75W like PCI-E v1. I could be wrong, though, and not for the first time.

Like the others said, this just seems wrong. To get it working, they would need to downclock it too much. SLI scaling is not that far off the 5xxx series; many sites have done lots of testing, and while NV is ahead, it's not actually by much in most modern games. The card running dual 465s would not come close to a stock 5970, never mind the OC settings most run at. It would need to beat 5870 CrossFire results at a lower price to make it worth the purchase, which is not going to happen. Nice option, though, having all three DVI outputs on one card for Surround, if they ever release the drivers for it.



DrPepper said:


> Actually power consumption on Fermi is very dependant on temperature people remember that. There is a massive drop when temperatures are lower.



Sorry, that makes no sense. Yes, if the GPU load is lower and the fan is not spinning as fast, power consumption will be lower. If you have the card on water at 100% GPU use, the GPU will use the same power, minus the fan (which uses a lot of power for a fan, around 20-odd watts or even more; I can't remember from the water-block reviews, but it's a 1.1+ amp fan), as the card will on air. Yes, you may get a slight reduction in power consumption due to thermal dissipation and not as much of the power being converted into heat (that last sentence might be better explained by someone who actually knows something about thermodynamics).


----------



## _JP_ (Jun 2, 2010)

poo417 said:


> Sorry that makes no sense.  Yes if the gpu load is lower and the fan is not spinning as much power consumption will be lower. ...



I think the Doctor was referring to thermal runaway in the chips, the positive feedback in MOSFETs produced by heat.


----------



## HillBeast (Jun 2, 2010)

poo417 said:


> I thought that pcie v2 could do 150w not  75w like pcie v1.  I could be wrong though and not for the first time



Looking into it, you're right that the specification states 150W. I didn't know that, but the main problem is that the motherboard needs to be able to provide that much power; if not, you're going to cook it. There are boards out there with additional PCI-E power connectors, but they are still not that common.


----------



## bobzilla2009 (Jun 2, 2010)

_JP_ said:


> I think the Doctor was referring to thermal runaway on the chips. The positive feedback on MOSFETs produced by heat.



Yup, higher temperature means higher resistance, leading to even more heat buildup (and eventually lots of fun things like large-scale electron tunneling, which is bye-bye chip function, unless it plain ol' melts first!). Unfortunately, how are Galaxy going to keep those chips cool? The very good cooling system on the GTX 480 (which would pin an HD 5870 to around 40-50°C under heavy load) still can't keep it under 90°C.


----------



## TVman (Jun 2, 2010)

Well, if this "abomination" has two 465 cores, why do they call it the 470 Dual?


----------



## btarunr (Jun 2, 2010)

Parad0x said:


> As long as "GF100-030-A3" is written on the gpus this will only be a dual GTX465 card. Even the memory quantity points to this conclusion.



Very nice point, thanks.


----------



## Roph (Jun 2, 2010)




----------



## DarthCyclonis (Jun 2, 2010)

I can't wait to see what the price will be for something like this. You would have to convert one of those small fridges college students use in their dorm rooms into a computer case just to keep your system from frying with these in it.

NVIDIA also needs to take a page out of AMD's book and allow tri-SLI with a dual-GPU plus single-GPU setup.


----------



## Tannhäuser (Jun 2, 2010)

I am not really into it anymore... so, one question: is micro-stuttering still an issue in SLI systems?


----------



## Tatty_One (Jun 2, 2010)

Also, we should remember that more often than not, dual-GPU cards are downclocked and undervolted, so I wouldn't expect anything like double the power requirements.


----------



## newtekie1 (Jun 2, 2010)

wolf said:


> because nvidia cards scale so well in sli this would only need to be a dual GTX465 to compete well against the 5970





Parad0x said:


> As long as "GF100-030-A3" is written on the gpus this will only be a dual GTX465 card. Even the memory quantity points to this conclusion.



I was going to come in and say exactly these two points.

These, based on the memory configuration, are definitely GTX 465 cores, unless Galaxy hid two memory chips on the back of the card.

And yeah, two GTX 480s perform about the same as two HD 5970s, so two GTX 465s would probably scale well enough to match a single HD 5970.

And if these are GTX 465s, then we are only looking at peak power consumption in the 300W range. That shouldn't be too big of an issue, considering it's only about 20W beyond what the HD 4870 X2 ran at, and those were fine.



poo417 said:


> Sorry that makes no sense.  Yes if the gpu load is lower and the fan is not spinning as much power consumption will be lower. ...



It might not make any sense to you, but W1zz's latest review of the GTX 480 proves it. Fermi uses less power when it is cooler. The fan speed didn't affect the card's power consumption; in fact, in his tests, when he lowered the fan speed via software, power consumption went up because the card was running hotter.

Now, we aren't sure if that is because the VRM area is getting hotter and less efficient, or if the GPU itself is less efficient due to voltage leakage, or a combination of both. However, the fact is, temperature and temperature alone is what is causing Fermi to consume so much power. In fact, it is an almost exactly linear progression: for every 1°C hotter the card runs, it needs about 1.2W more power.
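The linear relationship described above can be sketched as a toy model; the 1.2W/°C slope is the figure from the post, while the base power and temperatures below are hypothetical:

```python
# Toy linear model of the reported behaviour: ~1.2 W of extra draw
# per 1 degC of GPU temperature. Base figures below are hypothetical.
WATTS_PER_DEGC = 1.2

def estimated_draw(base_watts, base_temp_c, temp_c):
    """Linear estimate of card power draw at a given temperature."""
    return base_watts + WATTS_PER_DEGC * (temp_c - base_temp_c)

# A card drawing 250 W at 70 degC would, by this model, draw ~24 W
# more if it heats up to 90 degC:
print(estimated_draw(250, 70, 90))  # 274.0
```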


----------



## kid41212003 (Jun 2, 2010)

Maybe somewhere between a 465 and a 470, because even 465s in SLI can't beat the HD 5970.

I'm expecting something like 416 cores x2, 1GB x2 of memory, and a GPU clock around 650MHz.


----------



## DrPepper (Jun 2, 2010)

poo417 said:


> Sorry that makes no sense.  Yes if the gpu load is lower and the fan is not spinning as much power consumption will be lower. ...



It does make sense, we just don't understand why. I have proof that backs me up.

http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_480_Amp_Edition/27.html


----------



## douglatins (Jun 2, 2010)

If this is a 470-based card, it will break the PCI-E spec. A 400W+ card?



wolf said:


> because nvidia cards scale so well in sli this would only need to be a *dual GTX465* to compete well against the 5970



No, not really. SLI and CF scale pretty much the same; differences show up when one of them is not working properly.



newtekie1 said:


> I was going to come in an say exactly these two points.
> 
> These, based on the memory configuration, are definitely GTX465 cores, unless Galaxy hid 2 memory chips on the back of the card.
> 
> ...



Try getting 40K in Vantage with two 480s, like two 5970s can.

Also, I think the Zotac 480 consumes less partly because of the different fans, and partly because of the heat effect.


----------



## newtekie1 (Jun 2, 2010)

douglatins said:


> Try getting 40K in vantage with 2 480's like 2 5970 can.
> 
> Also i pretty much think that the zotac 480 consumes less because of the different fans, and some of the heat thing



I really don't care about Vantage scores, or any synthetic benchmark.

All you need to look at is:









HD 5970 CrossFire = 12% better than a single HD 5970
GTX 480 SLI = 13% better than a single HD 5970

That's the overall result, which is the important thing, not just one synthetic benchmark.









If you just want to focus on a high resolution (and let's face it, no one buying this configuration is running it at 1024x768):
HD 5970 CrossFire = 19% better than a single HD 5970
GTX 480 SLI = 18% better than a single HD 5970


----------



## DaJMasta (Jun 2, 2010)

All your watts are belong to us.


----------



## Fourstaff (Jun 2, 2010)

I am surprised no one has congratulated Galaxy for undertaking this unprecedented challenge. Good luck, Galaxy, and I hope you come up with something not named "barbecue pit"!


----------



## trt740 (Jun 2, 2010)

Tatty_One said:


> Hmmmm with this setup and the hot air dissipated inside, I could turn my case into a nice little hibernation chamber for all sorts of little furry creatures



Don't be so sure; the GTX 465 doesn't run anywhere near as hot as the GTX 480.


----------



## _JP_ (Jun 2, 2010)

Fourstaff said:


> I am surprised no one have congratulated Galaxy for undertaking this unprecedented challenge. Good luck Galaxy and I hope you can come up with something not named barbecue pit!



I think by now the only thing to cheer for is that they continue development.
I'll give my congratulations when it's done and working.

And it won't be a "barbecue pit" if it gets a water-cooling kit.


----------



## poo417 (Jun 2, 2010)

DrPepper said:


> It does make sense, we just don't understand why. I have proof that backs me up.
> 
> http://www.techpowerup.com/reviews/Zotac/GeForce_GTX_480_Amp_Edition/27.html



Yeah, I know, I read W1zz's review of that. That card is using a different BIOS at the moment, and we still don't know what they have done to reduce power use and temps (magic voodoo dust, I think). Sadly, we can't get current, voltage, and power readings from all the relevant parts of the card to see what has been changed. As I said, it makes no sense that there would be a massive drop, and W1zz's graph shows that. I don't call 20W a lot when we are talking about a 300W card.

I know that any electronic part becomes less efficient the hotter it gets (Intel P4s, anyone?), and I knew from W1zz's review that he had found that with that card. There are other reviews that have found that even running a liquid-cooling system and block on a 480, it still uses fewer amps and watts.

http://www.guru3d.com/article/geforce-gtx-480-liquid-cooling-danger-den-review/9

Hilbert's water-block review shows this too. The fan design on the 400 series may well be pulling a lot of power even when spinning slowly; it depends on how it is used. I have no idea (about many things, it seems).

I think it is fairly simple that two 480s are a good bit faster than a 5970, even one overclocked to 5870 speeds. However, the 495 could struggle at the resolutions that people who buy it would be looking at, 2560x1600 or Eyefinity/Surround, as the 465 and 470 seem to struggle a bit at that type of resolution at the moment. That all may change. I just want to see what the 480s are like at 6000x1080 so I can make my bloody mind up to buy them.

Please feel free to call me an idiot; it has been over 20 years since I have done anything to do with P = IV, or as some smart people say...


----------



## crow1001 (Jun 2, 2010)

Meh, who the f*** is Galaxy? Some poor man's brand trying to make waves in the market with its make-believe dual-Fermi solution. I call bull; the card will never see the light of day. Galaxy is just trying to get some media attention.


----------



## Bjorn_Of_Iceland (Jun 2, 2010)

FordGT90Concept said:


> Are they mad?  They can barely cool one, nevermind two.  Fermi would need a die shrink before it is even reasonable to put two on one card.


Tell that to the people making the 470 single-slot. o_0


----------



## EastCoasthandle (Jun 2, 2010)

crow1001 said:


> Meh who the f*** is galaxy, some poor mans brand trying to make some waves in the market with its make believe dual fermi solution, I call bull, card will never see light of day, galaxy trying to get some media attention.



That's a pretty bold claim there.  I guess we will see soon enough.


----------



## a_ump (Jun 2, 2010)

That is a bold claim. Every market segment has to be filled by someone, and Galaxy fills the role of cheaper products compared to the norm. That doesn't make their products POS, though; grab a Galaxy graphics card and the same card under a different brand (EVGA, XFX, etc.) and performance will be the same. The difference is in the components used to construct the card, which honestly only matters when it comes to overclocking.


----------



## lyndonguitar (Jun 2, 2010)

wow, with this card, your rig could fly!


----------



## tkpenalty (Jun 2, 2010)

I love how Galaxy always puts out cards that make everyone go WTF.

EDIT: 


crow1001 said:


> Meh who the f*** is galaxy, some poor mans brand trying to make some waves in the market with its make believe dual fermi solution, I call bull, card will never see light of day, galaxy trying to get some media attention.



They've been in the market much longer than you've had your head in basic hardware IT. They usually put out much-improved non-reference versions of cards, addressing issues such as VRM cooling, OC editions, etc.

EDIT: Galaxy products are cheaper because they tend to simplify the circuitry, saving power and money at no cost to performance, usually with some innovative thing like this: http://www.galaxytech.com/en/productview.aspx?id=278


----------



## newtekie1 (Jun 2, 2010)

poo417 said:


> Yeah I know I read Wizz's review of that.  That card is using a different bios at the moment and we still don't know what they have done to reduce power use and temps (magic voo doo dust I think )



You say you've read the review, then one sentence later say "we still don't know what they've done to reduce power use and *temps*".

Are you blind? Did you miss the gigantic 3-slot beast sitting on top of the card? Yeah, they must have used "magic voodoo dust" to lower the temps...

Oh, and we know exactly what they have done to reduce the power use: lowered the temperatures. How do we know? Because when the fans are artificially slowed down, the card gets hotter, and it consumes more power. No magic voodoo dust, just hotter = more power.


----------



## mdsx1950 (Jun 2, 2010)

Can't wait to see some reviews on this card.


----------



## Deleted member 24505 (Jun 2, 2010)

The electric companies should put some bonus coupons in the box with these.


----------



## newfellow (Jun 2, 2010)

And we've got a new grill in town. Now, where are the steaks?!

@Roph

hehe, good1


----------



## Hayder_Master (Jun 2, 2010)

I smell a GTX 495.


----------



## mdsx1950 (Jun 2, 2010)

I honestly doubt this card will beat the HD 5970. :shadedshu


----------



## theonedub (Jun 2, 2010)

I think more manufacturers should do what this card does and break the width spec of normal cards instead of making them longer. It would make them easier on cases and give more room for larger heatsinks. (It is wider, isn't it?)


----------



## Tatty_One (Jun 2, 2010)

trt740 said:


> don't be so sure the 465 gtx does not run near as hot as a 480 gtx



Clearly, but two of them glued together with the hot air flowing inside the case is going to make temps difficult for those with more "enclosed" cases. Am I one of the only ones who thinks, though, that from a performance perspective this might actually be a good thing? In reality it "should" be cheaper than the 5970. Possibly slower, yes, but who knows; this might be just the first dual-GPU offering we see from them this year... To me, choice is good; we shouldn't knock it.


----------



## phanbuey (Jun 2, 2010)

465s are weak, though... they are well behind a 5850, and 5850 CF is basically a 5970... This might match it in some tests, but this is no 5970 killer.


----------



## GenTarkin (Jun 2, 2010)

*Yeah, this isn't gonna take off... their dual GTS 250 seems to have died...*

I don't see the GTS 250 dual-GPU card they announced a few months back yet... so why would this card be any different?!? lol


----------



## indybird (Jun 2, 2010)

**Generic comment about how much power this will use and how hot it will run**


----------



## _JP_ (Jun 2, 2010)

indybird said:


> **Generic comment about how much power this will use and how hot it will run**



**Generic comment on what's been said here about the power consumption in relation to the heat**

**Generic EDIT to add that it will probably have a massive cooler and thus be able to cool properly, regardless of how many slots it takes, in order to be retailed**


----------



## erixx (Jun 2, 2010)

Like I said in the Galaxy 'space' video card thread: at least these guys show some innovation!


----------



## crow1001 (Jun 2, 2010)

Galaxy produced the below GFX card...nuff said...


----------



## theorw (Jun 2, 2010)

Any price yet? Who bets it's over $700?


----------



## Tatty_One (Jun 2, 2010)

theorw said:


> Any price yet?Who bets its over 700$?



What, for two 465s? Naaaa. A good benchmark tends to be twice the retail price of a single 465, minus about 10-15%.
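That rule of thumb, as a quick sketch (the $280 single-card street price below is hypothetical):

```python
def dual_card_price_estimate(single_price, discount=0.125):
    """Rule of thumb: twice a single card's retail price, minus ~10-15%."""
    return 2 * single_price * (1 - discount)

# With a hypothetical $280 GTX 465 street price:
print(round(dual_card_price_estimate(280.0)))  # 490
```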


----------



## _JP_ (Jun 2, 2010)

theorw said:


> Any price yet?Who bets its over 700$?



My guess is around 450€ to 500€, because it's not supposed to top an HD 5970 (I guess), so it shouldn't cost as much.
In the worst case, maybe 600€, but that would be pushing it (even knowing it is NVIDIA).


----------



## JayliN (Jun 2, 2010)

newtekie1 said:


> And yeah, two GTX480s perform the same as two HD5*9*70s, so really two GTX465s would probably scale well enough to match a single HD5970.





newtekie1 said:


> If you just want to focus on a high resolution(and lets face it, no one buying this configuration is running it on 1024x768):
> HD5970 Crossfire=19% Better than a single HD5970
> GTX480 SLI=18% Better than a single HD5970



I'm lost as to what point you're trying to make. Where have you proven that two GTX 465s will match a single 5970?

All your graphs have shown is that 2x GTX 480 in SLI is faster than a single 5970, which is a no-brainer, since the GTX 480 is 11% faster than a 5870 and a 5970 is actually two downclocked 5870s in CrossFire.

It's no secret that GPUs don't scale well beyond two. It doesn't matter whether it's CrossFire or SLI. To imply that CrossFire is inferior to SLI by comparing benchmarks of how two ATI GPUs scale to four against how one NVIDIA GPU scales to two is nonsense.

If you think two 465s (a card that's ~37% slower than a 480 and ~32% slower than a 5870 at 2560x1600) are going to match a 5970, you're going to be surprised.

For your consideration: http://www.guru3d.com/article/geforce-gtx-465-sli-review/12


----------



## naram-sin (Jun 2, 2010)

Sorry for this, but F equals FAIL... or Fermi... whichever comes to mind (or the situation) first... and it's an epic fail... Buahahahahahahaah!!!

Although I hope they have a nice comeback, because we don't need another Intel/MS here... prices should go DOWN!


----------



## OneCool (Jun 2, 2010)

LAME!!!!!!!!!!!!!

If you can't bring the big guns to a dual-GPU card, it = FAIL in my book.

They should have used crippled 480s :shadedshu


----------



## the54thvoid (Jun 2, 2010)

How on earth does the card being hotter mean it uses more power? In physics, heat is a by-product of current; essentially it is waste energy (unless heat is what you are after). Therefore, does it not follow that because the card is drawing so much power, it is getting hotter?
The more cycles a processor performs, the hotter it becomes, because of the extra 'power' required to do the extra cycles. This is why liquid nitrogen (LN2) cooling is used for OC records. The processor does not consume more power because it is hot; it is hot because it consumes more power.

It is a bloody rule of electrical power: that which requires more power becomes hotter. Heat does not generate power.

"_However, the fact is, temperature and temperature alone is what is causing Fermi to consume so much power_"

This is not correct. What is really happening is that so much heat being lost by an inefficient design means that more power has to be pumped in to perform the given task.* If the system is more efficient, as W1zz's review sample must be, it loses less heat and therefore does not require more power. So, to be more accurate:

"Temperature (heat) loss is what causes Fermi to consume so much power"

Well, that and the fact it requires a lot more power in the first place.

*i.e. if I need 100 units (of whatever) to perform an operation and I have a 100% efficient system, I only need to draw 100 units. However, if my system is inefficient, say 75%, then I lose 25 units as heat (or light or sound). This leaves me with 75 units, which is not enough for the operation, so I have to draw 25 more. My total draw for a 100-unit task is now 125 units because of my 25-unit heat loss. Well done if you followed that!


----------



## Fourstaff (Jun 2, 2010)

the54thvoid said:


> How on earth does the card being hotter mean it uses more power?  In physics heat is a by product of current, essentially it is waste energy (unless heat is what you are after).  Therfore does it not follow that because the card is drawing so much power, it is getting hotter?
> The more cycles a processor performs the hotter it becomes because of the extra 'power' required to do the extra cycles.  This is why liquid gas (N2) cooling is used on OC records.  The processor does not consume more power because it is hot - it is hot because it consumes more power.
> 
> It is a bloody rule of electrical power - that which requires more power becomes hotter.  Heat does not generate power.
> ...



I hope your physics teacher told you that hot objects have higher resistance, so to supply the same amount of current, more power is needed. Hence the thermal runaway we see in Fermi.
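The effect described here can be illustrated with the textbook linear resistance-temperature model; the numbers below (copper's temperature coefficient, a 10 mΩ path, 50 A) are illustrative assumptions, not measurements from the card:

```python
# Linear resistance-temperature model: R(T) = R0 * (1 + alpha*(T - T0)).
ALPHA_COPPER = 0.0039  # copper's temperature coefficient, per degC

def resistance_at(r0_ohms, t0_c, t_c):
    """Resistance of a conductor at temperature t_c."""
    return r0_ohms * (1 + ALPHA_COPPER * (t_c - t0_c))

def dissipation_w(current_a, r_ohms):
    """Joule heating: P = I^2 * R."""
    return current_a ** 2 * r_ohms

# Same 50 A through the same 10 milliohm path, cool vs. hot:
cool_loss = dissipation_w(50, resistance_at(0.010, 20, 40))
hot_loss = dissipation_w(50, resistance_at(0.010, 20, 95))
print(hot_loss > cool_loss)  # True -> hotter means more power wasted
```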


----------



## Kantastic (Jun 2, 2010)

the54thvoid said:


> *i.e. if i need 100 units (of whatever) to perform an operation and i have a 100% efficient system, i only need to draw 100 units.  However if my system is innefficient, say 75%, then i lose 25 units as heat (or light or sound).  This leaves me with 75 units which is not enough for the operation, so i have to draw 25 more.  My total draw for a 100 unit task is now 125 units because of my 25 unit heat loss.  Well done if you follow that!



If you're losing 25% of your units, then you will need to draw 133.33 units in order to maintain a stable 100.
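The correction is just dividing by the efficiency, since every extra unit drawn also loses 25% of itself. A one-liner to check:

```python
def required_draw(useful_watts, efficiency):
    """Power that must be drawn so useful_watts survive the losses."""
    return useful_watts / efficiency

# 100 useful units through a 75%-efficient system:
print(round(required_draw(100, 0.75), 2))  # 133.33
```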


----------



## HillBeast (Jun 3, 2010)

JayliN said:


> I'm lost as to what point you're trying to make. Where have you proven that two gtx465s will match a single 5970?



Why are people still going on about 465s? It clearly says "DUAL GTX 470" below the card. The cores may be marked as 465s, but bear in mind that this is a mock-up. Do you honestly think that card will resemble the final product? Remember the original Fermi mock-up: it had wood screws in it. Someone pull their GTX 480 apart and tell me where the wood screws are. Do you seriously think Galaxy is going to waste a good GTX 470 core on a mock-up card? Do you know how expensive those things are?

It's a dual 470, and it's as simple as that. If Galaxy says it is, then it is.


----------



## newtekie1 (Jun 3, 2010)

JayliN said:


> I'm lost as to what point you're trying to make. Where have you proven that two gtx465s will match a single 5970?
> 
> All your graphs have shown is that 2x GTX480 in SLI is faster than a single 5970 which is a no brainer since the GTX480 is 11% faster than a 5870 and a 5970 is actually 2x downclocked 5870s in crossfire.
> 
> ...



Read the quoted text, it really isn't that hard to follow.



Fourstaff said:


> I hope your physics teacher told you that hot objects have higher resistance, therefore to supply the same amount of current, more power is needed. Hence, the thermal runaway we see in Fermi.



Correct, it is for this same reason that power supplies are less efficient at higher temps.


----------



## hat (Jun 3, 2010)




----------



## a_ump (Jun 3, 2010)

I also remember reading that Fermi GF100 has bad leakage. Heat increases leakage, so as said before, in order to maintain stability and compensate for the leakage in Fermi, more power or current must be fed.


----------



## my_name_is_earl (Jun 3, 2010)

This is when your power supply gets a good workout.


----------



## HammerON (Jun 3, 2010)

Wow, this thread is another "crap on Fermi" thread, and that is just too bad :shadedshu

As far as Galaxy releasing this dual GTX 465 card: we will just have to wait and see if they do or don't. And then we can judge it accordingly...


----------



## HillBeast (Jun 3, 2010)

GenTarkin said:


> I dont see their GTS250 dual gpu card they announced a few months back yet.....so why would this card be any different?!?! lol



Well, it's not like that card was going to be hard to make either, considering it was a single-PCB 9800 GX2, and they gave up on it.


----------



## Yellow&Nerdy? (Jun 3, 2010)

newtekie1 said:


> I really don't care about Vantage scores, or any Synthectic benchmark.
> 
> All you need to look at is:
> 
> ...



I'm not defending ATI or anything, but I think it's because the review of the 5970 Crossfire was done when they were released, which is over 8 months ago, and one of the things that improves with new drivers is multi-card scaling. But you do have a point there: SLI usually scales better than Crossfire.


----------



## pr0n Inspector (Jun 3, 2010)

HillBeast said:


> Ugh. This card is going to be terrible. Even if they use two 465s, that is 200W TDP each, totalling to 400W! The thing has 2 8 pin PCI-E plugs which can do MAX 150W and the PCI-E socket provides MAX 75W. That's 375W total, and even if the TDP figures are accurate (which they never are) this thing will pull over 450W at least in FurMark. I can't see any cooler being good enough to keep this thing under 100C. Only water cooling would be viable but even that would be a struggle if it's on the same loop as the CPU.
> 
> Seriously by the time they get the TDP good enough to work, it simply won't be powerful enough to beat a 5970.
> 
> EDIT: Just noticed in the picture it says below it 'GTX 470 Dual'  so that's 215W x 2 = 430W. FAIL!



Those numbers are the minimum requirements. Decent PSUs (read: 18AWG wires, powerful 12V rail(s)) can provide well over 150W on a PCI-E 8-pin* connector.


*In fact, even 6-pin can do it, since most 8-pin connectors are 6+2.
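For reference, the spec-limit arithmetic being argued over works out like this. A minimal sketch: the 75 W slot and 150 W 8-pin figures are the PCI-E spec limits discussed above, and the 200 W-per-GPU TDP is the GTX 465 figure quoted earlier in the thread:

```python
# PCI-E power-budget arithmetic for a hypothetical dual-GPU board.
SLOT_W = 75        # spec maximum drawn from the PCI-E slot
EIGHT_PIN_W = 150  # spec maximum per 8-pin auxiliary connector

def spec_budget_w(n_eight_pin):
    """Total board power allowed on paper by the spec limits."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W

def within_spec(tdp_per_gpu_w, n_gpus, n_eight_pin):
    """True if the combined GPU TDP fits inside the spec budget."""
    return n_gpus * tdp_per_gpu_w <= spec_budget_w(n_eight_pin)

print(spec_budget_w(2))        # 375
print(within_spec(200, 2, 2))  # False: 2 x 200 W exceeds 375 W on paper
```

Which is exactly the disagreement here: on paper the budget is blown, in practice a decent PSU delivers well past the per-connector figure.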


----------



## Tatty_One (Jun 3, 2010)

phanbuey said:


> 465's are weak tho... they are well behind a 5850.  5850cf is basically a 5970... This might match it in some tests, but this is no 5970 killer.



You say weak, and yes, in comparison to a 5850, 5870, 470 or 480, but they fit a niche and are positioned as a "mid range" (or perhaps lower-mid) product, much like the HD4850 was (eventually, once the HD4890 was released). Now, I crossfired two of those 1GB HD4850s at the time and their performance, to be honest, was pretty awesome, though not quite as good as an HD4870X2. That didn't stop ATi releasing an HD4850X2, although it was only taken up by 2 board partners. My point being... what's the difference here?


----------



## poo417 (Jun 3, 2010)

newtekie1 said:


> You say you've read the review, then one sentence later say "we still dont' know what they've done to reduce power use and *temps*".
> 
> Are you blind?  Did you miss miss this gigantic 3 slot beast sitting on top of the card?  Yeah, they must have used "magic voo doo dust" to lower the temps...
> 
> Oh, and we know exactly what they have done to reduce the power use, lowered the temperatures.  How do you know?  Because when the fans are artifically slowed down, the card gets hotter, and it consumes more power.  No magic voo doo dust, just hotter = more power.



I don't think what I was trying to say was very clear. On that card, yes. Other cards that have the newer BIOS, still all on the stock heatsink, are cooler and use less power. I am not talking just about the Zotac card. Will sticking a finger on the fan make it pull even more current, because it is told to run at a certain speed and can't, so it draws more?

Perhaps you can enlighten me as to what you know they have done to the stock cards' BIOSes to lower temps and power without added cooling or faster fans, which is what I was trying to say. Applying the graph from the Zotac review to a water-cooled card at 48C under load would mean it should use a lot less power than it does in the review.

Anyway, back to this post. It will be interesting to see what a bigger company comes up with. With the 4GB 5970s seeming to all have 3-slot coolers, that leaves a lot of space for a big, efficient cooler. Will be interesting to see what they all come up with. We need something to knock the 5970 off its perch.


----------



## Fourstaff (Jun 3, 2010)

I can't see how this can be a good "mid range" graphics card; power consumption is going to be quite high. But I do see this card as a filler between the GTX480 and HD5970.


----------



## Tatty_One (Jun 3, 2010)

Fourstaff said:


> I can't see how this can be a good "mid range" graphics card; power consumption is going to be quite high. But I do see this card as a filler between the GTX480 and HD5970.



Agreed; however, the "average" consumer/gamer does not get too wound up about wattage and amperage, they simply try to find out if the card will play their favorite game and buy it. The enthusiast, as we know, is a different beast. The bottom line is, if the card's performance is above that of an HD5870 and GTX470/480 but below that of an HD5970, and it does not cost as much as the 5970, then it will sell. Quite simply, there are millions of people out there who will not buy ATi because they have always had NVIDIA, and of course the same applies the other way round. I am not suggesting for one minute that this will be a good card, just saying that it probably will have a market, albeit the dual-GPU market is pretty narrow in any case. Personally, it does not interest me in the slightest; the only dual-GPU card I have owned is the HD4870X2 and I didn't like it, it was hot and noisy. The upside was that I didn't like the neighbours who lived 4 houses away, and when I gamed with the 4870X2 it always woke their children up


----------



## HalfAHertz (Jun 3, 2010)

Best of luck to Galaxy on their newest endeavor.


----------



## pr0n Inspector (Jun 3, 2010)

Nah, noise won't be a problem. This card begs for water.


----------



## JayliN (Jun 3, 2010)

newtekie1 said:


> Read the quoted text, it really isn't that hard to follow.



465s in SLI matching a 5970?

Yeah, it's clear you made a BS statement with meaningless numbers to back it up. Try again.


----------



## JayliN (Jun 3, 2010)

HillBeast said:


> Why are people still going on about 465s? It clearly says below the card "DUAL GTX 470". The cores may be marked as 465s, but bear in mind that this is a mock-up. Do you honestly think that card will resemble the final product? Remember the original Fermi mock-ups: they had wood screws in them. Someone pull their GTX480 apart and tell me where the wood screws are. Do you seriously think Galaxy is going to waste a good GTX 470 core on a mock-up card? Do you know how expensive those things are?
> 
> It's a Dual 470, and it's as simple as that. If Galaxy says it is, then it is.



Beyond what is marked on the GPU itself, the number of memory chips points to a 256-bit bus, as pointed out numerous times in this thread. GTX470s run off a 320-bit interface.


----------



## Bjorn_Of_Iceland (Jun 3, 2010)

JayliN said:


> Beyond what is marked on the GPU itself, the number of memory chips points to a 256-bit bus, as pointed out numerous times in this thread. GTX470s run off a 320-bit interface.



They haven't shown the back of the card's PCB, so the memory-chip count can't be confirmed for now...


----------



## the54thvoid (Jun 3, 2010)

After my physics lesson, here's a history lesson. Thanks for correcting me, guys (but it's still an issue that power generates heat, albeit heat increases resistance, which increases power draw; ooh, it's almost like the Matrix in its infinite circular complexity).

Nvidia Naming

GTX 295 = 2 x GTX 275
Mars GTX 295 = 2 x GTX 285

So it doesn't matter if it's called a Dual GTX 470; that does not mean it is 2 x GTX 470.

Similarly, an HD5970 = 2 x 'downclocked to 5850' 5870s,
and the new 4GB monsters are HD5970 = 2 x 5870s (at stock).

GFX nomenclature is as reliable as a chocolate solar panel.

However, on the power front, the TweakTown review of GTX 465 SLI puts it at more draw than the HD5970 and almost the same (about 40 watts apart at load) as a 4GB HD5970 (higher at idle than one, anyway). The point being, 465 SLI is a good indicator of the maximum potential of this card if it is 2 x GTX 465.
But I must stress TweakTown's game choices are exceptionally pro-NVIDIA (Resident Evil 5, Far Cry 2, Batman, Darkest of Days etc). I'd like to see Crysis and BFBC2 in there.


----------



## [I.R.A]_FBi (Jun 3, 2010)

mmm, silicon kebabs


----------



## Flavius (Jun 3, 2010)

the54thvoid said:


> ...
> GTX 295 = 2 x GTX 275
> ...
> Similarly, a HD5970 = 2 x 'downclocked to 5850' 5870's.
> ...


or GTX295 = 2x "downclocked to GTX260 216sp" GTX275's.


----------



## the54thvoid (Jun 3, 2010)

I think NV should realise how much they've missed the boat in terms of newer, cooler, more efficient tech, and launch a dual GF100 480-core (x2) monster and call it 'Inferno'. I'm serious. Take a hit for what they've got wrong and capitalise on it. They'd regain the 'single card' crown and have an absolute beast. Sell it with the same cooler as the 4GB Sapphire HD5970 or water-cool it from scratch. Surely EVGA could do this?

I wouldn't buy it, but it'd be a blast to see! They'd have to sell it with a sticker on the box saying "F*ck the PCI power specs - this is INFERNO!!!"


----------



## newtekie1 (Jun 3, 2010)

JayliN said:


> 465s in sli matching a 5970?
> 
> yea, its clear you made a bs statement with meaningless numbers to back it up. try again.



No, two GTX480s matching two HD5970s, see the quoted text and learn to read.

I'll make it easy for you:

My original comment:


newtekie1 said:


> And yeah, two GTX480s perform the same as two HD5*9*70s, so really two GTX465s would probably scale well enough to match a single HD5970.



His response:


douglatins said:


> Try getting 40K in vantage with 2 480's like 2 5970 can.



It was at that point that the conversation shifted from two GTX465s outperforming a single HD5970 to two GTX480s outperforming two HD5970s, which led to my post explaining that two GTX480s outperform two HD5970s. See, was that hard to follow?

Again, my original statement about the GTX465s was hardly BS, as it was obviously my opinion on the subject based on other experience with SLI/Crossfire scaling (you can tell by my use of the words "so" and "probably"), because there really haven't been any good reviews that directly address the issue.


----------



## mdsx1950 (Jun 3, 2010)

newtekie1 said:


> No, two GTX480s matching two HD5970s, see the quoted text and learn to read.




I gotta disagree there. Maybe 3x GTX 480s match 2x HD 5970s. 

Sorry bro. But you just spoke BS up there. :shadedshu


----------



## newtekie1 (Jun 3, 2010)

mdsx1950 said:


> I gotta disagree there. Maybe 3x GTX 480s match 2x HD 5970s.
> 
> Sorry bro. But you just spoke BS up there. :shadedshu



*Sigh*

Look at the charts.  

Overall
2x HD5970 = 12% Better than single HD5970
2x GTX480 = 13% Better than single HD5970

@2560x1600
2x HD5970 = 19% Better than single HD5970
2x GTX480 = 18% Better than single HD5970

There is nothing more conclusive than that.  It seems SLi between two GTX480s scales a whole hell of a lot better than two HD5970s.

I love it when someone posts charts from very respected reviews that prove their point, and then another person comes in and says "Nope, I disagree, what you said is BS" without even bothering to show a small amount of evidence to back themselves up...
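The chart comparison above is just ratio arithmetic. A quick sketch using the figures as cited in this post (everything normalised so a single HD5970 = 100; illustrative only):

```python
# Relative-performance arithmetic behind the quoted chart figures
# (overall numbers as cited in the post above; illustrative only).
def pct_better(score, baseline):
    """Percent advantage of `score` over `baseline`."""
    return (score / baseline - 1.0) * 100.0

single_5970 = 100.0
dual_5970   = 112.0   # "2x HD5970 = 12% better" (overall)
dual_gtx480 = 113.0   # "2x GTX480 = 13% better" (overall)

print(round(pct_better(dual_gtx480, single_5970), 2))  # 13.0
print(round(pct_better(dual_gtx480, dual_5970), 2))    # 0.89 - effectively a tie
```

Which also shows why both sides of this argument have a point: a one-percentage-point gap over a common baseline is well inside review-to-review noise.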


----------



## DaMulta (Jun 3, 2010)

can u run two of these for 4gpus?


----------



## newtekie1 (Jun 3, 2010)

DaMulta said:


> can u run two of these for 4gpus?



It seems to have an SLI finger on it, so I would think yes, or at least I hope so, because that would be sweet.


----------



## OnBoard (Jun 3, 2010)

OK, now NVIDIA has gone crazy. Forget the GTX 465/470 times two.

Nvidia prepares dual chip card
http://www.fudzilla.com/content/view/19052/1/

Two 460s?! Really? I had a hard time believing the GTX 460 would even beat the 5770 yesterday, but now it should beat a 5850 (to beat the 5970).

So what could they have done with GF104? Full 60 TMUs like in the GTX 480, or close to that? Then you'd get the same shader performance but more fillrate.


----------



## the54thvoid (Jun 3, 2010)

newtekie1 said:


> *Sigh*
> 
> Look at the charts.
> 
> ...



There is one glaring omission here, and I'm not disagreeing with the superb scaling of the GTX4xx range in SLI - BUT, 2 x HD5970 isn't Crossfire, it's quad, as in quad-SLI. We've got to bear in mind the rankings: GTX480 > HD5870 > GTX470 > HD5850. An HD5970 is roughly 2 x 5850, which beats out the GTX480 but not by double, so when the GTX480 doubles up it beats the HD5970, even against 2x HD5970, because the returns diminish across 2-, 3- and 4-card setups.

No arguments though, GTX 4xx series scale very well.


----------



## erocker (Jun 3, 2010)

Well, I'm very much looking forward to this card, the price and power requirements. If they all fit within my range I'm thinking I'd like to try one out.


----------



## HillBeast (Jun 4, 2010)

pr0n Inspector said:


> Those numbers are the minimum requirements. Decent PSUs(read: 18AWG wires, powerful 12V rail(s)) can provide well over 150W on a pci-e 8 pin* connector.
> 
> 
> *in fact, even 6-pin can do it since most 8-pin connectors are 6+2.



Yeah, but like you said, only the decent ones do it. There are still others out there that won't do more than 150W, and it's not going to be safe. They still shouldn't risk going over the PCI-E specs.


----------



## mdsx1950 (Jun 4, 2010)

newtekie1 said:


> *Sigh*
> 
> Look at the charts.
> 
> ...



Newtekie, since that time there have been a lot of drivers for the 5970 boosting performance. The drivers at the time the 5970 was released sucked; there were tons of issues. NVIDIA scales better than ATi, I agree there. But the 5970 performs much better on the 10.4a drivers for me. Maybe it's just my personal experience, but I can assure you that 5970s in Crossfire perform better now than in the review. :shadedshu


----------



## newconroer (Jun 4, 2010)

HillBeast said:


> Only water cooling would be viable but even that would be a struggle if it's on the same loop as the CPU.



Which is why enthusiast hardware users don't use chintzy Thermaltake on a loop with four components in it.

You build a separate system for the GPU(s).

Now just to see how this stacks against 480SLI...


----------



## filip007 (Jun 5, 2010)

NVIDIA must prepare for the next level, like ATi with Southern Islands on 28nm; that's probably the ultimate win overall.


----------



## Fourstaff (Jun 5, 2010)

filip007 said:


> NVIDIA must prepare for the next level, like ATi with Southern Islands on 28nm; that's probably the ultimate win overall.



They just launched Fermi recently; I think they should finish their top-to-bottom lineup and a few revisions first before moving to "Fermi 2". ATi has finished their top-to-bottom lineup, found they didn't need a revision, and so they are speeding up Southern Islands. That's what I think, anyway.


----------



## newtekie1 (Jun 5, 2010)

mdsx1950 said:


> Newtekie, since that time there have been a lot of drivers for the 5970 boosting performance. The drivers at the time the 5970 was released sucked; there were tons of issues. NVIDIA scales better than ATi, I agree there. But the 5970 performs much better on the 10.4a drivers for me. Maybe it's just my personal experience, but I can assure you that 5970s in Crossfire perform better now than in the review. :shadedshu



As I said, you are welcome to show some proof of that, but I go on what I've seen, and that is what I have seen. I haven't read every review, and things may have changed since then, but I'm going on what I've seen.

However, to not even show a little evidence for what you say really makes it seem like you are the one speaking BS, not me. And I would suggest that the next time you say what someone says is BS, especially when that person has shown proof from respectable reviews, you at least make a little effort to show why; otherwise it is just trolling. You have had two chances to show some proof to back your statements up, yet you haven't, so that is even worse trolling.


----------



## mdsx1950 (Jun 6, 2010)

newtekie1 said:


> As I said, you are welcome to show some proof of that, but I go on what I've seen, and that is what I have seen. I haven't read every review, and things may have changed since then, but I'm going on what I've seen.
> 
> However, to not even show a little evidence for what you say really makes it seem like you are the one speaking BS, not me. And I would suggest that the next time you say what someone says is BS, especially when that person has shown proof from respectable reviews, you at least make a little effort to show why; otherwise it is just trolling. You have had two chances to show some proof to back your statements up, yet you haven't, so that is even worse trolling.





GTX 480 3-Way SLI vs 5970 CFX



Now please don't come up with some lame excuse saying the figures are fake or that the site is BS. I checked 90% of the figures against the TPU review and they are very similar.

Only when it comes to tessellation does the GTX 480 overpower the HD 5970. 

Guess you're right as usual.


----------



## newtekie1 (Jun 6, 2010)

mdsx1950 said:


> GTX 480 3-Way SLi VS 5970 CFX
> 
> 
> 
> ...



Very nice, you actually managed to find some proof; I'm glad. Now if only you had posted something to that effect in the beginning, you wouldn't have looked like such a fool...


----------



## mdsx1950 (Jun 6, 2010)

newtekie1 said:


> Very nice, you actually managed to find some proof, I'm glad.  Now if only you have posted something to that effect in the beginning, you wouldn't have looked like such a fool...



You should have listened to what I said from the beginning instead of being ignorant. I have experienced GTX 480 SLI on a computer very similar to mine, and I currently own 2 HD 5970s. I didn't need proof to voice my opinion.  

I looked like a fool? 
Keep dreaming. 

Anyways, nice debating with ya!


----------



## carrera997 (Jun 8, 2010)

douglatins said:


> Try getting 40K in vantage with 2 480's like 2 5970 can.


 
Agree. I have tested two 5970s and 2 480s on the same setup, and I can tell you that two 480s are no match for 2 5970s. Tri-SLI is a different story though; definitely better than two 5970s.


----------



## rick22 (Jun 8, 2010)

I think it's overkill...


----------



## mdsx1950 (Jun 8, 2010)

carrera997 said:


> Agree. I have tested two 5970s and 2 480s on the same setup, and I can tell you that two 480s are no match for 2 5970s. *Tri-SLI is a different story though; definitely better than two 5970s*



It's not definitely better than two 5970s. Two 5970s do manage to beat three GTX 480s in some games. Check here.


----------

