# ZOTAC GeForce GTX 480 AMP! Edition



## W1zzard (May 25, 2010)

ZOTAC's GeForce GTX 480 AMP! Edition comes with a mighty Zalman VF3000 cooler strapped to it which reduces the card's temperatures by 20°C. Thanks to the new cooling solution, the card has no problems running at higher clock speeds out of the box. But is that enough to declare this triple slot card a winner?

*Show full review*


----------



## v12dock (May 26, 2010)

Excellent review! Overall it looks like a very nice improvement, but I'm sure the 257 drivers helped.


----------



## newtekie1 (May 26, 2010)

Very interesting...

It has huge power consumption drops, despite actually running at a higher voltage.  Is it possible that the better cooling on the VRM area has made it more efficient?


----------



## ozama (May 26, 2010)

Hi, is it possible to adjust the fan speed to a lower level with RivaTuner to reduce the noise in idle mode?

Thanks


----------



## newtekie1 (May 26, 2010)

I would assume it is possible since it uses the reference PCB.


----------



## LAN_deRf_HA (May 26, 2010)

I'd be interested in nvidia or zotac's take on the power draw difference.


----------



## Cold Storm (May 26, 2010)

Well, the normal VF3000 (N/A) comes with a fan controller, but I don't see one with the Zotac.


----------



## newtekie1 (May 26, 2010)

The standard VF3000 connects to a standard fan header, though; the custom one Zotac is using connects directly to the card's fan header.  Which I would guess means it has fan control.


----------



## Cold Storm (May 26, 2010)

newtekie1 said:


> The standard VF3000 connects to a standard fan header, though; the custom one Zotac is using connects directly to the card's fan header.  Which I would guess means it has fan control.



Very true. I would assume it goes straight to fan control.


----------



## ozama (May 26, 2010)

I hope it's possible. Thanks for your replies, guys.


----------



## Hayder_Master (May 27, 2010)

Thanks W1zzard for the first review on the whole web.
I'm thinking about getting this card.


----------



## alexsubri (May 27, 2010)

:shadedshu: Still hoping to see more competition against the 5970 :shadedshu:


----------



## Jiraiya (May 27, 2010)

Thanks man, but what's with these games? These are old games:

*Enemy Territory: Quake Wars

Quake 4

Prey

Far Cry

Call of Duty 4*

These are new games:



> Metro 2033
> 
> Just Cause 2
> 
> ...


----------



## Hayder_Master (May 27, 2010)

alexsubri said:


> :shadedshu: Still hoping to see more competition against the 5970 :shadedshu:



You can't compare it with a dual-GPU card. The GTX 480 is the fastest GPU right now; put two in SLI and then compare.


----------



## wahdangun (May 27, 2010)

newtekie1 said:


> Very interesting...
> 
> It has huge power consumption drops, despite actually running at a higher voltage.  Is it possible that the better cooling on the VRM area has made it more efficient?



Maybe it's because they use a more efficient cooler instead of that high-amperage fan, and because of the lower temperatures the fan spins less, making power consumption drop.


----------



## Bjorn_Of_Iceland (May 27, 2010)

alexsubri said:


> :shadedshu: Still hoping to see more competition against the 5970 :shadedshu:


Surely you weren't expecting that a mere change of cooler would make it faster than the 5970?


----------



## afw (May 27, 2010)

hayder.master said:


> You can't compare it with a dual-GPU card. The GTX 480 is the fastest GPU right now; put two in SLI and then compare.



People buy video cards, not GPUs ...

They won't care whether the 5970 has 2 cores or 20 cores as long as its price/performance/temps are good ...

But I have to admit the 5970's price is a bit high ...


----------



## Bjorn_Of_Iceland (May 27, 2010)

afw said:


> But I have to admit the 5970's price is a bit high ...


It's priced competitively; it's really not that high IMO.


----------



## Fourstaff (May 27, 2010)

Do we have a winrar? I think this card is going to pile a lot of pressure on ATi's offerings.


----------



## newtekie1 (May 27, 2010)

Bjorn_Of_Iceland said:


> It's priced competitively; it's really not that high IMO.



HA

When there is nothing to compete against, that isn't a valid statement.

And I would hardly consider it priced competitively when the closest option performance-wise is $100 cheaper, and of course by closest I mean virtually identical performance.


----------



## mlee49 (May 27, 2010)

Jiraiya said:


> thx man , wt this games ?? this old games
> 
> *Enemy Territory: Quake Wars
> 
> ...



+1, please stop working so hard!  You can remove these games from the review process; they are really outdated.

Great review as always Wiz! Thanks!


----------



## puma99dk| (May 27, 2010)

I like the BIOS tweak Zotac has done: almost 20 watts less power usage than NVIDIA's reference card. That's awesome; maybe with some more tweaks it can get even better ^^


----------



## claylomax (May 27, 2010)

I'm not surprised about the power consumption; it's been compared to the power consumption of the "launch card", not the mass-production card. My GTX 480 consumes just 50 W more at load than my old GTX 285. By the way, most GTX 480 owners seem to reach 830-840 on the core with stock voltage. Otherwise, another excellent review.


----------



## mascotzel (May 27, 2010)

The fans have 4 wires, so they should be capable of being dynamically controlled.
Maybe there is something wrong with the sample TPU received. I've just read another ZOTAC GTX 480 test and it shows big variations in idle/load dBA, so the fans must be controlled.


----------



## WW_Dagger (May 28, 2010)

*Will it fit in SLI?*

So if I wanted to put two of these big daddies on my Asus Rampage III Extreme motherboard... would I have to use slots 1 and 4? I wonder if slot 4 would even have enough room below it...


----------



## W1zzard (May 28, 2010)

mascotzel said:


> The fans have 4 wires, so they should be capable of being dynamically controlled.
> Maybe there is something wrong with the sample TPU received. I've just read another ZOTAC GTX 480 test and it shows big variations in idle/load dBA, so the fans must be controlled.



The fans are temperature-controlled. It's just that there is so little difference between the idle and load fan speeds that the noise difference is smaller than 1 dBA.
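For anyone wondering how idle and load speeds can differ yet stay within 1 dBA: a common fan-law rule of thumb says noise rises with roughly the 5th power of rotational speed, so a few percent of RPM difference is nearly inaudible. A minimal sketch, with hypothetical RPM figures (not measured values from the review):

```python
import math

def spl_delta_db(rpm_low: float, rpm_high: float) -> float:
    # Fan-law rule of thumb: sound pressure level scales with roughly
    # the 5th power of rotational speed, i.e. dL = 50 * log10(n2 / n1).
    return 50.0 * math.log10(rpm_high / rpm_low)

# Hypothetical idle/load speeds about 4% apart stay under 1 dBA
delta = spl_delta_db(1920, 2000)
print(f"{delta:.2f} dBA")  # 0.89 dBA
```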


----------



## caleb (May 28, 2010)

1. > Significantly lower power consumption thanks to lower temperatures

   Since when does lower heat mean lower power consumption?

2. *PLEASE* benchmark on new games!!!


----------



## newtekie1 (May 28, 2010)

W1zzard said:


> The fans are temperature-controlled. It's just that there is so little difference between the idle and load fan speeds that the noise difference is smaller than 1 dBA.



I wonder if some more aggressive fan speed control set by the user would change that.  I think the fan speed curve Zotac is using just doesn't lower the fan speed enough when idle.



caleb said:


> 1. Since when does lower heat mean lower power consumption?
> 2. *PLEASE* benchmark on new games!!!



Cooler VRM area means they work more efficiently.


----------



## caleb (May 28, 2010)

> Cooler VRM area means they work more efficiently.


Most likely by a margin which might be considered a measurement error.


----------



## newtekie1 (May 28, 2010)

caleb said:


> Most likely by a margin which might be considered a measurement error.



20-30 W definitely isn't a margin that would be considered a measurement error, and go back and read the review.  W1z, being the great reviewer that he is, did additional testing to see what was going on, and sure enough the power draw goes up with temperature in a linear fashion.
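The linear trend is easy to sanity-check numerically. A minimal sketch fitting a least-squares slope to hypothetical (temperature, power) pairs, chosen only to mimic the review's finding, not its actual measurements:

```python
# Hypothetical (GPU temperature degC, board power W) pairs chosen to
# mimic the review's linear trend; the real figures are in the article.
samples = [(60, 320), (70, 332), (80, 344), (90, 356)]

# Ordinary least-squares slope: extra watts drawn per degC
n = len(samples)
st = sum(t for t, _ in samples)
sp = sum(p for _, p in samples)
stp = sum(t * p for t, p in samples)
stt = sum(t * t for t, _ in samples)
slope = (n * stp - st * sp) / (n * stt - st ** 2)
print(f"{slope:.2f} W per degC")  # 1.20 W per degC, ~24 W over a 20 degC drop
```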


----------



## swaaye (May 28, 2010)

Varying power consumption in these cards is perhaps caused by wider than usual variances in the leakage current from these GPUs. Unsurprising considering their problems with the process. So it's the luck of the draw on whether you get one that's fairly efficient or one that's a particularly wasteful minifurnace.


----------



## newtekie1 (May 28, 2010)

swaaye said:


> Varying power consumption in these cards is perhaps caused by wider than usual variances in the leakage current from these GPUs. Unsurprising considering their problems with the process. So it's the luck of the draw on whether you get one that's fairly efficient or one that's a particularly wasteful minifurnace.



Once again, read the review: power consumption increased with temperature.  So it has nothing to do with luck of the draw.


----------



## swaaye (May 29, 2010)

So that means that the typical GTX 480 runs its VRMs so hot that they become rather impressively inefficient. That is pretty sloppy of NV. It could just be the tested samples here though.

But leakage current does vary between these GPUs. You can bet on that. That is the case with every GPU out there.

Regardless, these chips are just not that great. They need a refresh for sure. I can't wait to see how that turns out and whether this ends up being one of the more inefficient GPUs. I'd bring up the R600 -> RV670 transition, but they enjoyed a major new manufacturing process there and added PowerPlay. I can't see the refresh to this beast being on anything other than 40 nm again.


----------



## W1zzard (May 29, 2010)

swaaye said:


> So that means that the typical GTX 480 runs its VRMs so hot that they become rather impressively inefficient. That is pretty sloppy of NV. It could just be the tested samples here though.



I measured GPU temperature, not VRM temperature.


----------



## newtekie1 (May 29, 2010)

W1zzard said:


> I measured GPU temperature, not VRM temperature.



Right, but it is logical to assume that if the GPU is heating up, the VRMs are too.  And since the VRMs are where the power is converted for the GPU, their efficiency is what is dropping and causing the higher power draw.


----------



## W1zzard (May 29, 2010)

newtekie1 said:


> Right, but it is logical to assume that if the GPU is heating up, the VRMs are too.  And since the VRMs are where the power is converted for the GPU, their efficiency is what is dropping and causing the higher power draw.



I doubt the difference in VRM temps would be that big, but unfortunately there's no sensor in the VRMs, so there's no way to find out.
Also, the power draw changes near-instantly with GPU temperature; the VRMs would need some time to heat up from the reduced airflow of the fans going slower.


----------



## wahdangun (May 29, 2010)

Could it be the fan?

The reference GTX 480 fan draws very high amperage and runs at high RPM,

while this Zotac uses a very efficient cooler that keeps the GPU cooler at lower RPM.


----------



## HillBeast (May 31, 2010)

caleb said:


> Most likely by a margin which might be considered a measurement error.



It's a known fact that when electronics get hot, they become inefficient. That's why I don't let my CPU go over 60°C or my GPU over 70°C: they just start getting really inefficient, plus it's not good for them. When you are dealing with something with 3 billion transistors, it's definitely going to make a difference when they start expanding from heat and such.

Anyway, nice review W1zzard. Good to see the inclusion of a power consumption graph showing how it uses more power when it gets hotter. I knew this happened but didn't realise it made that much of a difference. Probably why this card has lower power consumption: lower temps.


----------



## swaaye (May 31, 2010)

HillBeast said:


> It's a known fact that when electronics get hot, they become inefficient. That's why I don't let my CPU go over 60°C or my GPU over 70°C: they just start getting really inefficient, plus it's not good for them. When you are dealing with something with 3 billion transistors, it's definitely going to make a difference when they start expanding from heat and such.


Chips are specified for much more heat than 60C and they have to maintain their design power across their designated temp range. Lifetime is of no concern to me because, comparatively, I'm not running 486s or Pentium IIIs anymore and what I have today will be just as pathetic in 10 years as those are today.

I'm still thinking that the GTX 4xx cards have significantly varying leakage current. These GPUs are obviously the most complex ever and are very difficult to manufacture so they are probably not having the best of luck with chip quality overall. Just doing a search regarding GF100 and 'leakage current' brings tons of results, unsurprisingly. They need 28nm to work amazingly well but I'm not sure what to expect there and it's a long time off I believe. It's amazing how far and hard they push processes these days compared to 10 years ago. That's competition for ya.
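The leakage argument can be illustrated with a common rule of thumb: static (leakage) power grows roughly exponentially with temperature, often quoted as doubling every ~10°C, though the exact interval depends on process, voltage, and the individual chip. A minimal sketch with hypothetical wattages:

```python
def leakage_power(p_ref: float, t_ref: float, t: float,
                  doubling_c: float = 10.0) -> float:
    # Rule-of-thumb model: static (leakage) power roughly doubles for
    # every `doubling_c` degC rise above a reference point. The exact
    # doubling interval varies by process, voltage, and sample.
    return p_ref * 2.0 ** ((t - t_ref) / doubling_c)

# Hypothetical 30 W of leakage at 70 degC grows 4x by 90 degC
print(leakage_power(30.0, 70.0, 90.0))  # 120.0
```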


----------



## HillBeast (Jun 1, 2010)

swaaye said:


> Chips are specified for much more heat than 60C and they have to maintain their design power across their designated temp range. Lifetime is of no concern to me because, comparatively, I'm not running 486s or Pentium IIIs anymore and what I have today will be just as pathetic in 10 years as those are today.



While they are specified for higher, it's still not good for them. It doesn't matter how well a chip is built; it's simply physics: when something heats up, it expands. When you have that many tiny little things all bunched together, expansion is the last thing you want, no matter how little.


----------



## Adhdownload (Jun 1, 2010)

*I'm happy as is.*

Seeing the stats, I'm happy with my 4870 X2 CrossFire. OK, it pulls about 1000 W at full power, but the visuals are fine, even with only an i7 920.


----------



## Blacklash (Jun 9, 2010)

Yeah, people do buy cards, not GPUs, and many of those same folks whine when they realize their 2 GB card has a 1 GB frame buffer and performs worse than a single GPU in games when a proper driver profile isn't present. I see it constantly on gaming forums.


----------

