Friday, April 9th 2010

My production Fermi arrived!

This morning the first of the long-awaited retail GeForce GTX 480 cards arrived. The card came from Zotac, and photos are below. It is from the same shipment that will be sold on store shelves in the coming days. At first glance the card looks identical to the NVIDIA press sample, with the exception of some black foam on the back of the card. Another difference is the BIOS, which I have put up for download here.

I am told that the reason for the black foam is to ensure some spacing between the cards when running in SLI mode, so that they can breathe. It also acts as a safeguard against short circuits, which could happen when the metal cooler surface of one card touches the back of the other card.

134 Comments on My production Fermi arrived!

#76
SteelSix
Zubasa: Exactly, if they want us to pay the big bucks for that card, at least make it look high quality.
I just hate it when the Fermi looks so half-assed in terms of cooling. :shadedshu
Everything on that card has a "last minute solution" feeling to it....
Wow, I have to disagree. The 480's cooler looks to be very well crafted. Look at Wiz's pics, that baseplate is intricate and very well designed. The heatsink is top notch. The 470's heatsink is very good quality too. If anything, it's what they're cooling that makes them look bad. Imagine that cooler on a 5870..
#77
Zubasa
SteelSix: Wow, I have to disagree. The 480's cooler looks to be very well crafted. Look at Wiz's pics, that baseplate is intricate and very well designed. The heatsink is top notch. The 470's heatsink is very good quality too. If anything, it's what they're cooling that makes them look bad. Imagine that cooler on a 5870..
Looks can be deceptive. By making the cooler "look good", nVidia hinders its performance.
They could easily make that card run cooler if they removed that 5th heatpipe and made that part of the heatsink as wide as the rest.


They could also use a larger-diameter blower, which would reduce the noise.


Lastly, sorry guys for derailing this thread. :shadedshu
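
To put rough numbers on this fin-area argument: at steady state, junction temperature scales as T ≈ T_ambient + P × R_θ, so any tweak that lowers the cooler's sink-to-air thermal resistance pays off linearly in degrees. A minimal sketch; every resistance value here is an assumption for illustration, not a measured GTX 480 figure:

```python
# Toy steady-state model: T_gpu = T_ambient + P * R_theta.
# All resistance values below are assumptions for illustration,
# not measured GTX 480 figures.

def gpu_temp(power_w: float, r_theta_c_per_w: float, t_ambient_c: float = 25.0) -> float:
    """Steady-state GPU temperature for a given sink-to-ambient resistance."""
    return t_ambient_c + power_w * r_theta_c_per_w

POWER_W = 250.0  # watts, roughly the GTX 480's board-power class

stock_r = 0.26   # C/W, assumed stock cooler
wider_r = 0.22   # C/W, assumed cooler with wider fin stack + larger blower

print(f"stock cooler:    {gpu_temp(POWER_W, stock_r):.0f} C")  # 90 C
print(f"improved cooler: {gpu_temp(POWER_W, wider_r):.0f} C")  # 80 C
```

With 250 W flowing, even a 0.04 °C/W improvement is worth about ten degrees, which is why fin area and fan diameter matter so much on a card in this power class.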
#78
gigabit942007
I think they put that piece of foam there just to piss off people. *WTF!! A piece of sticky foam?*
#79
EarlZ
@Zubasa

Nvidia needs guys like you to design their heatsink ;)
#80
R_1
It is well designed, it just needs some steroids, because it can't keep up with the GF100's TDP.
#81
Kaleid
Seems to me that the all-too-tightly spaced fins require a high-RPM fan...
I just hate the idea of using the stock cooler, because not only will it be noisy, it will also end up trapping tons of dust.

Stock coolers are generally epic failures.. :shadedshu
#82
SteelSix
EarlZ: @Zubasa

Nvidia needs guys like you to design their heatsink ;)
Nice work, Zubasa, I see what you mean about the 480's heatsink. They could have greatly increased the 480's fin surface area. It looks like they were thinking more about SLI interface clearance than maximum fin surface area. Damn nice spot.

I hadn't considered the fan diameter either. They have room for a larger-diameter fan. Yep, nice work sir..
#83
vagxtr
Zubasa: Looks can be deceptive. By making the cooler "look good", nVidia hinders its performance.
They could easily make that card run cooler if they removed that 5th heatpipe and made that part of the heatsink as wide as the rest.

They could also use a larger-diameter blower, which would reduce the noise.
While I agree that they could use a much better cooler, and not only because of the 10x60 mm aluminium fins they sliced out of a 250 W+ power-hungry monster (even if it really draws only 210-220 W), I think you're on the wrong path if you think that heatpipe reduction would improve the situation at all. The TR HR-03 Plus, popular back when the first power-hungry monster, the 8800 Ultra, arrived, had six heatpipes specifically for those cards and was much larger. And you still needed a high-CFM blower to cool it down.

As for the 80 mm fan, it's pretty standard; no G80/GT200 card ever had a better stock blower. These kinds of fans aren't produced in as many aftermarket sizes as others, 100 mm is the next size up, and that would probably mess up the PCB layout. And they're simply not interested in designing special fans just for overclockers' dream cards like the GTX 280/GTX 480, which are produced in extremely low volumes.

Maybe MSI will give us one of their SuperPipe OC editions like they did for the GTX 285 :p
#84
HalfAHertz
I agree. The problem with the GTX 480 isn't the cooler, it's the chip. We can't help but compare it to a dual-chip solution, not only because it delivers similar performance but because it runs just as hot; there's no denying the fact that it produces just as much heat, if not more. The problem is that unlike a dual-chip solution, where you have the luxury of dissipating the heat from each chip separately, it concentrates all that heat in one spot, and it's only natural that this hinders heat dissipation.
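
HalfAHertz's point can be made concrete with the same back-of-the-envelope model: a dual-chip card splits the load so each die pushes half the power through its own die-to-air path, while a single big die pushes everything through one. All numbers below are assumptions chosen only for illustration:

```python
# One 250 W die versus two 125 W dies, each with its own sink.
# All resistances are assumptions, not measured card data.

T_AMB_C = 25.0

def die_temp(power_w: float, r_theta: float) -> float:
    return T_AMB_C + power_w * r_theta

# Single chip: all the heat crosses one die->sink->air path.
single = die_temp(250.0, 0.26)

# Dual chip: each die pushes half the power through a similar path
# (assume a slightly worse per-die resistance from the smaller sinks).
dual = die_temp(125.0, 0.32)

print(f"single 250 W die: {single:.0f} C")  # 90 C
print(f"each 125 W die:   {dual:.0f} C")    # 65 C
```

Even with a somewhat worse per-die cooler, halving the power per hotspot wins comfortably, which is the heat-density disadvantage a single big GF100 die carries.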
#85
Zubasa
vagxtr: While I agree that they could use a much better cooler, and not only because of the 10x60 mm aluminium fins they sliced out of a 250 W+ power-hungry monster (even if it really draws only 210-220 W), I think you're on the wrong path if you think that heatpipe reduction would improve the situation at all. The TR HR-03 Plus, popular back when the first power-hungry monster, the 8800 Ultra, arrived, had six heatpipes specifically for those cards and was much larger. And you still needed a high-CFM blower to cool it down.

As for the 80 mm fan, it's pretty standard; no G80/GT200 card ever had a better stock blower. These kinds of fans aren't produced in as many aftermarket sizes as others, 100 mm is the next size up, and that would probably mess up the PCB layout. And they're simply not interested in designing special fans just for overclockers' dream cards like the GTX 280/GTX 480, which are produced in extremely low volumes.
Oh, I was just pointing out some of the most apparent problems.
The fins can easily be extended towards the fan by at least half an inch.
As for the heatpipes, their job is only to help heat transfer; if the fins can't shed heat fast enough, more heatpipes are just useless.
Thermalright is known for designing top-notch coolers, and it is not the number of heatpipes that makes them so good.
HalfAHertz: I agree. The problem with the GTX 480 isn't the cooler, it's the chip. We can't help but compare it to a dual-chip solution, not only because it delivers similar performance but because it runs just as hot; there's no denying the fact that it produces just as much heat, if not more. The problem is that unlike a dual-chip solution, where you have the luxury of dissipating the heat from each chip separately, it concentrates all that heat in one spot, and it's only natural that this hinders heat dissipation.
The cooler certainly has something to do with it.
You make a hot chip, you'd better get us to STFU with a good cooler.
If nVidia actually put more effort into the cooling and managed to cool this thing down, I'd hardly have much to bitch about.
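
Zubasa's heatpipe argument is really a claim about series thermal resistances: heat flows die to base/heatpipes to fins to air, the resistances add, and the largest term dominates the sum. A hedged sketch with made-up values:

```python
# Series thermal path: die -> base/heatpipes -> fins -> air.
# Resistances add, so the largest term dominates. All values are
# made up to illustrate the argument, not measured.

def junction_temp(power_w: float, r_pipes: float, r_fins: float, t_amb: float = 25.0) -> float:
    return t_amb + power_w * (r_pipes + r_fins)

P_W = 250.0

# Fin-to-air is the bottleneck here (0.20 C/W vs 0.05 C/W transport).
print(junction_temp(P_W, r_pipes=0.05, r_fins=0.20))  # 87.5 C: baseline
print(junction_temp(P_W, r_pipes=0.04, r_fins=0.20))  # 85.0 C: extra heatpipe, little gain
print(junction_temp(P_W, r_pipes=0.05, r_fins=0.16))  # 77.5 C: bigger fin stack, real gain
```

Shaving the transport term buys little while the fin-to-air term is four times larger; enlarging the fin stack attacks the dominant term directly, which is Zubasa's point.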
#86
OUTSIDE
NEW TEMPS!

With ONE monitor:

IDLE with 100% FAN: 35ºC!
IDLE with AUTO-fan: 43ºC!

Core -> 51MHz / Shader -> 101MHz / Mem -> 135MHz

With TWO monitors:

IDLE with 100% FAN: 42ºC!
IDLE with AUTO-fan: 72ºC!

Core -> 405MHz / Shader -> 810MHz / Mem -> 1848MHz

In Furmark... it NEVER PASSES 88ºC!

All of that with the card mounted "in the air", not in a case.

ByE!
#87
KainXS
What??? You have the core at what?
#88
trickson
OH, I have such a headache
W1zzard: This morning the first of the long-awaited retail GeForce GTX 480 cards arrived. The card came from Zotac, and photos are below. It is from the same shipment that will be sold on store shelves in the coming days. At first glance the card looks identical to the NVIDIA press sample, with the exception of some black foam on the back of the card. Another difference is the BIOS, which I have put up for download here.

I am told that the reason for the black foam is to ensure some spacing between the cards when running in SLI mode, so that they can breathe. It also acts as a safeguard against short circuits, which could happen when the metal cooler surface of one card touches the back of the other card.

www.techpowerup.com/img/10-04-09/zotac1_thm.jpg www.techpowerup.com/img/10-04-09/zotac3_thm.jpg www.techpowerup.com/img/10-04-09/zotac5_thm.jpg
My question is: how well is this one holding true to your previous review, performance-wise and heat-wise?
Is it now working in SLI mode? And if so, any chance to see some pics of the final install?
#89
mdsx1950
OUTSIDE: NEW TEMPS!

With ONE monitor:

IDLE with 100% FAN: 35ºC!
IDLE with AUTO-fan: 43ºC!

Core -> 51MHz / Shader -> 101MHz / Mem -> 135MHz
Wow. The core, shader, memory speeds suck but the temps are so high?? I gotta say... What the fuck is that????? :twitch:

P.S.: Is that real?
#90
Unregistered
OUTSIDE: NEW TEMPS!

With ONE monitor:

IDLE with 100% FAN: 35ºC!
IDLE with AUTO-fan: 43ºC!

Core -> 51MHz / Shader -> 101MHz / Mem -> 135MHz

With TWO monitors:

IDLE with 100% FAN: 42ºC!
IDLE with AUTO-fan: 72ºC!

Core -> 405MHz / Shader -> 810MHz / Mem -> 1848MHz

In Furmark... it NEVER PASSES 88ºC!

All of that with the card mounted "in the air", not in a case.

ByE!
You can sure see it doesn't like running two monitors.
#91
trickson
OH, I have such a headache
Yeah, all this is much ado about nothing!
I mean, I want pictures posted up. Let's see what they look like in SLI, show some screenshots of benchmarks and temps! Come on, we know you did a review, but let's see some action now with non-reference cards!
#92
twistedneck
TheMailMan78: I would have used rubber or something. That would be much better IMO. Foam just seems like "Man, we built a badass video card that can run SLI and........DAMN IT, if they touch they will fry! Quick, cut off a piece of packing foam and give me some Elmer's glue. We got work to do!".

This would also explain why Fermi was 6 months late.
Typical production start-up: you don't know how the launch will work out until the actual cards start coming in. Engineers have years of lessons learned, but it's impossible to stuff them all inside your brain. Even the so-called proven design methods go out the window when you are late, when that $/month delay figure gets pinned on you.

I have been there, and I am there now with a product launch. Not electronics, but another part. Still, it's the same thing: humans can only remember 5 things at once. :twitch:
#93
Black Panther
OUTSIDE: NEW TEMPS!

With ONE monitor:

IDLE with 100% FAN: 35ºC!
IDLE with AUTO-fan: 43ºC!

Core -> 51MHz / Shader -> 101MHz / Mem -> 135MHz

With TWO monitors:

IDLE with 100% FAN: 42ºC!
IDLE with AUTO-fan: 72ºC!

Core -> 405MHz / Shader -> 810MHz / Mem -> 1848MHz

In Furmark... it NEVER PASSES 88ºC!

All of that with the card mounted "in the air", not in a case.

ByE!
The stock speeds for the GTX 480 are core 700 MHz, mem 3800 MHz.
I'm assuming you either made a lot of typos here, or you purposely underclocked the card to get low temperatures :confused:
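
A side note that may reconcile these numbers: GDDR5 speeds get reported three different ways. OUTSIDE's 1848 MHz is the double-data-rate figure for the GTX 480's published 924 MHz memory clock, and the quad-pumped "effective" rate is 3696 MHz, presumably the roughly 3800 MHz figure quoted above:

```python
# GDDR5 clock bookkeeping. 924 MHz is the GTX 480's published memory
# clock; the multipliers are how monitoring tools and spec sheets
# respectively report the same speed.

base_mhz = 924                 # command clock
ddr_mhz = base_mhz * 2         # 1848 MHz, the figure OUTSIDE's tool shows
effective_mhz = base_mhz * 4   # 3696 MHz, the quad-pumped "effective" rate

print(base_mhz, ddr_mhz, effective_mhz)  # 924 1848 3696
```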
#95
DOM
in idle NVIDIA runs the card at clocks of 50 MHz core, 68 MHz memory and 100 MHz shaders!


That's from Wiz's review, didn't any of you look at it? Noobs :p
#96
sneekypeet
Retired Super Moderator
n00bs my foot, who cares about idle temps :p From what he is showing, my 275 runs hotter ;)
#98
erocker
*
OUTSIDE: NEW TEMPS!

With ONE monitor:

IDLE with 100% FAN: 35ºC!
IDLE with AUTO-fan: 43ºC!

Core -> 51MHz / Shader -> 101MHz / Mem -> 135MHz

With TWO monitors:

IDLE with 100% FAN: 42ºC!
IDLE with AUTO-fan: 72ºC!

Core -> 405MHz / Shader -> 810MHz / Mem -> 1848MHz

In Furmark... it NEVER PASSES 88ºC!

All of that with the card mounted "in the air", not in a case.

ByE!
Sounds awesome, could we have a screenshot or two? :) Your idle temps with the fan on AUTO are lower than in every review I've seen on the internet!
#99
WSP
Maybe that has something to do with ambient temp?
#100
Assimilator
Black Panther: I honestly can't understand why, the moment a new driver is loaded and you have 2 cards, the Nvidia CP defaults to single-card mode. :wtf:

It happened to me a couple of times on the laptop; I'd update the drivers, then go straight to a benchmark or game, and start thinking that my cards were well on the road to becoming obsolete.
Then I'd have a fffuuu moment :ohwell: when I remembered that the driver had disabled my SLI and I was running on one card :shadedshu
Ah yeah, reminds me of the days of quad-SLI'd 7950 GX2s... every time I installed a new driver, performance would go down, and only then would I remember to manually go into the nVidia CP and enable Quad SLI.

Even better, with a 9800 GX2 it defaults to PhysX enabled and SLI disabled... WHY???