# NVIDIA GeForce GTX 480 Fermi



## W1zzard (Mar 19, 2010)

Today marks the release of NVIDIA's new GeForce Fermi architecture. After excruciating months of delays, NVIDIA has finally given the green light for their new products. The GeForce GTX 480 offers all the latest features like DirectX 11, tessellation, gaming on multiple monitors, and GPU computing. Did NVIDIA's new card manage to claim the throne?

*Show full review*


----------



## qubit (Mar 26, 2010)

Nice, it's got the performance crown. I want one! 

Thanks for another great review, W1zzard. 

EDIT: Ok, taking a slightly longer look at it, I think I'd rather wait for the improved "GTX485" variant which will come with a die shrink and all 512 shaders enabled (hopefully). The GTX480 runs far too hot and noisy for my liking. My 285 doesn't make an irritating noise, even when worked hard and that's essential for me. I'd rather lose performance than put up with it.

Heck, I might even get one of the smaller variants coming out soon, if I want nvidia DX11 capability to play with.


----------



## ERazer (Mar 26, 2010)

Great review, Wizz. 

Card's kind of a letdown.


----------



## theonedub (Mar 26, 2010)

96C loaded, not F@H friendly


----------



## ShiBDiB (Mar 26, 2010)

Only 5-8 FPS better in most tests than my 295... win on my part.


----------



## Binge (Mar 26, 2010)

BWAHAHAHAHAHAHAHA 320 Watts!


----------



## Soylent Joe (Mar 26, 2010)

A big, hot, expensive, badass, sexy, power-sucking pile of awesome. Can't wait to see how the 470 does.


----------



## SteelSix (Mar 26, 2010)

ATI: Catalyst 9.12 ??


----------



## dir_d (Mar 26, 2010)

yay...i wanna see 10.3 benches with it now


----------



## Marineborn (Mar 26, 2010)

Not bad, but once that AA hits, those 480s start to choke, especially at high resolutions. Notice the 5850 at 2560x1600 with AA almost beats the 480. Still not bad, though; it should have been better given the 6-month delay.


----------



## jasper1605 (Mar 26, 2010)

SteelSix said:


> ATI: Catalyst 9.12 ??



yeah, interesting.  Didn't 10.2 or 10.3 announce FPS increases for a multitude of games?


----------



## SystemViper (Mar 26, 2010)

sweeeet, again another toy on the "git me one" list, man that list is growing...

Nice review, as always!


----------



## DaC (Mar 26, 2010)

As expected, I would say... but no thanks, the electric bill would come into play because of the very big difference in power draw.


----------



## qubit (Mar 26, 2010)

Looking at the benchies, it's amazing how well the 4870 X2 still does. It actually beats the GTX 480 in some cases, which is a great achievement by AMD for making such a relatively future-proof (in frame rate at least) card.

Although I want the GTX 480, the performance difference isn't enough for me to buy it anytime soon. A big reason is that my trusty GTX 285 still gets really good frame rates in the games and resolutions I play at. If and when DX11 becomes more mainstream and games I'm interested in start using it, e.g. HL2: EP3, it will make a more compelling purchase.

I reckon the Fermi GPU will really come into its own on the next die shrink. It'll run cooler and have the extra shaders enabled.


----------



## xaira (Mar 26, 2010)

It only shows a significant performance advantage in a few situations where both cards already average over 100 FPS. Disappointing pile of fail; at least the fanbois will still buy it. 300 watts vs 188 watts, whoop-tee-doo. Well done, nFailia :shadedshu


----------



## Fourstaff (Mar 26, 2010)

Performs very predictably, but still a reasonable product. Especially useful during winter; it keeps you warm. However, in summer...


----------



## SteelSix (Mar 26, 2010)

jasper1605 said:


> yeah, interesting.  Didn't 10.2 or 10.3 announce FPS increases for a multitude of games?



yep.. 10.3 and 10.3a


----------



## Divide Overflow (Mar 26, 2010)

jasper1605 said:


> yeah, interesting.  Didn't 10.2 or 10.3 announce FPS increases for a multitude of games?


Yeah, it sure did.  Things are somewhat closer with current drivers.


----------



## crow1001 (Mar 26, 2010)

WTF, how come you're running the 9.12s from December 2009? Misprint?


----------



## pantherx12 (Mar 26, 2010)

Not impressed at all.

Power draw is ludicrous.

Considering NV had an extra 6 months to fiddle with the card I was expecting better performance as well.


----------



## Reefer86 (Mar 26, 2010)

Well, that's what I thought. WTF! The 10.3s are surely better to use.

That kinda made the review worthless to me, sorry to say.

Yeah, OK, we can see the performance of the card, BUT not using the latest drivers means we can't see if it has taken the crown.


----------



## human_error (Mar 26, 2010)

Well, I'm not surprised, but I am quite disappointed no testing was done with the 10.3a/b/stock drivers for the 5K series. I feel a mini-review on how the 5Ks do against Fermi on those drivers would be very interesting.

I am very impressed by the 50 MHz core clock on the desktop, but less than impressed at the power needed to run this thing.

Glad my 5970's still the fastest thing around too.


----------



## newtekie1 (Mar 26, 2010)

Price to performance is about where I expected. What is surprising is how much better this card did in all the DX11 tests. Seems like nVidia's claim of 100% better performance in tessellation wasn't too far off. Too bad DX11 won't matter until at least the next generation of cards. I think I'll pass, unless I can get a free step-up from eVGA...


----------



## johnnyfiive (Mar 26, 2010)

Knew it.

So who's still buying a 480? At $499+ (more like $549), the 5870 seems a lot more appealing now, doesn't it?


----------



## SteelSix (Mar 26, 2010)

johnnyfiive said:


> Knew it.
> 
> So whos still buying a 480? 5870 seems a lot more appealing now doesn't it?



It sure does. ATI should have launched all options today, mainly the 2GB cards. I'm ready to buy now damn it!!


----------



## alexsubri (Mar 26, 2010)

Hmm.. I knew I wouldn't be disappointed by ATI when I purchased my 5850 and XFX 5850. I was actually hoping NVIDIA would beat the 5970s.


----------



## KainXS (Mar 26, 2010)

The power consumption alone makes this card a failure in my eyes. It's crazy; if it were like 280 W I could understand, but 300+ . . . . .


----------



## SteelSix (Mar 26, 2010)

alexsubri said:


> Hmm.. I knew I wouldn't be dissapointed by ATI when I purchased my 5850 and XFX 5850. I was actually hoping nVidia would beat the 5970s



I had 5850's in my basket at launch and held off. I wish I'd grabbed them. Launch price too..


----------



## jasper1605 (Mar 26, 2010)

KainXS said:


> the power consumption alone makes this card a failure in my eyes, its crazy



3 of those in SLI w/ a heavily OC'd processor. You'd need a 1.5 kW PSU. Not to mention an air conditioner in place of the side panel to keep the heat from melting the steel.
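As a sanity check on that 1.5 kW figure, here is a back-of-envelope sketch in Python. The ~320 W per-card load number is the one quoted in this thread; the CPU and rest-of-system wattages are rough assumptions, not measured values.

```python
# Back-of-envelope PSU sizing for tri-SLI GTX 480.
gpu_load_w = 320          # per-card load, figure quoted in this thread
num_gpus = 3
cpu_oc_w = 250            # heavily overclocked CPU: rough assumption
rest_of_system_w = 100    # board, drives, fans: rough assumption

dc_load = gpu_load_w * num_gpus + cpu_oc_w + rest_of_system_w
print(dc_load)  # 1310 -> a 1.5 kW PSU leaves sensible headroom
```

Under those assumptions the system pulls about 1310 W at the DC rails, so a 1.5 kW unit is roughly the minimum sane choice.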


----------



## SteelSix (Mar 26, 2010)

KainXS said:


> the power consumption alone makes this card a failure in my eyes, its crazy, if it were like 280 i could understand but 300+ . . . . .



I'm not one to bitch about power consumption, but this is crazy; the temps are insane compared to the 5870's. Infreakinsane..


----------



## KainXS (Mar 26, 2010)

Yeah, the temps are bad... SLI = ?

But the fan on the card is very powerful; I don't think it would overheat before it sounded like a jet in your PC.


----------



## Azza_1 (Mar 26, 2010)

I'm also confused by the ATI drivers used...


----------



## Loosenut (Mar 26, 2010)

SteelSix said:


> I'm not one to bitch about power consumption but this is crazy, the temps are insane when compared to 5870's. Infreakinsane..



I said it a few weeks ago and I'll say it again. You wanna run these cards in tri-SLI? Get something like Fits' Mega rig. Otherwise, with summer here soon, EPIC FAIL.


----------



## pentastar111 (Mar 26, 2010)

Hmmm... I see no reason to replace my 285s yet. And if I do, it'll be so that I can do Eyefinity, which means no NVIDIA anyway. Also, there is waaaay tooooo much heat and power draw.


----------



## fochkoph (Mar 26, 2010)

Any chance Bad Company 2 will become a standard in the benchmarks?


----------



## epicfail (Mar 26, 2010)

The oh-so-great new card by NVIDIA gets beaten by the old 4870 X2 in some tests ;-) Not lots of tests, but still.

I'll stay with my old card, ty.


----------



## LifeOnMars (Mar 26, 2010)

Not impressed, I'll wait for future revisions.


----------



## MrMilli (Mar 26, 2010)

SteelSix said:


> ATI: Catalyst 9.12 ??



A review that uses 10.3a for all ATI cards:
http://www.pcgameshardware.com/aid,...eviewed-Fermi-performance-benchmarks/Reviews/


----------



## Reefer86 (Mar 26, 2010)

I'm sorry, but using the 9.12 drivers has really made the 5870 look like a chump.

For example, in Heaven 2.0 @ 1920x1200:

the review gave it 27.1.

I'm using the 10.3 drivers and I get 39.4, which beats the 480.

That's a huge difference; I can't believe it was overlooked.
We want to know how good the 480 is compared to a 5870 now (with the latest drivers), not 3 months ago.

I'll have to look around for a more complete review.


----------



## KainXS (Mar 26, 2010)

I waited with my GTX 280 long enough. I'm getting two 5970s and going back to AMD land.


----------



## F2K (Mar 26, 2010)

Sorry, but I had to skip the review because of the old ATI drivers. It would paint a wrong picture in my head.


----------



## HalfAHertz (Mar 26, 2010)

Nvidia definitely delivered on performance, but everything else about the card is quite sketchy. BTW I'm still selling my easy-bake solutions!


----------



## DaedalusHelios (Mar 26, 2010)

I am glad ATi fanboys will now cease posting "semiaccurate" website links that are BS, now that the cards are on the table. Maybe I spoke too soon on the growing-up part. 

I am sticking with my 5870's thanks to the better pricing. Power consumption is not my worry. Initial cost to purchase the cards is my main concern. Nvidia will have to drop prices to move serious volume in this competitive market. Nvidia needs to drop them by $100 at least.


----------



## crow1001 (Mar 26, 2010)

The cards are not available, semi has been right all along.


----------



## Steevo (Mar 26, 2010)

*Epic FAIL!!!*

And utter RAPE!!!!


$379 with Dirt2


----------



## eidairaman1 (Mar 26, 2010)

Basically, AMD has executed a chokehold maneuver on NVIDIA in the pricing market. AMD has a lead in DX11 sales and will continue to succeed, and will then lower pricing of the first-gen 5800 series with the launch of the second-gen 5800 series. At least that is what I expect.


----------



## Maban (Mar 26, 2010)

How much power does the card consume when OCed?


----------



## ERazer (Mar 26, 2010)

Six months of waiting and you get a toaster oven.

@HalfAHertz - sign me up for one


----------



## DaedalusHelios (Mar 26, 2010)

Maban said:


> How much power does the card consume when OCed?



Without increasing volts, it should be the default 320 W under load.


----------



## Maban (Mar 26, 2010)

Surely it would need more amps though wouldn't it?


----------



## xrealm20 (Mar 26, 2010)

Not quite, Helios. An overclocked chip will consume more power at a higher clock speed even at stock voltage.
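For what it's worth, the first-order CMOS dynamic-power model (P roughly proportional to f × V²) backs this up: clock speed alone scales power linearly even at stock voltage. A small illustrative sketch in Python; the clocks and voltages are made-up values, not actual GTX 480 specs:

```python
# First-order dynamic power model: P ~ f * V^2 (CMOS switching power).
# All numbers here are illustrative, not NVIDIA specifications.
def scaled_power(base_w, base_clock, new_clock, base_v=1.0, new_v=1.0):
    """Scale power linearly with clock, quadratically with voltage."""
    return base_w * (new_clock / base_clock) * (new_v / base_v) ** 2

print(scaled_power(320, 700, 770))             # ~352 W: +10% clock, stock volts
print(scaled_power(320, 700, 770, 1.0, 1.05))  # ~388 W: same OC plus +5% volts
```

So a 10% overclock at stock voltage would add roughly 10% to the load power under this model, and bumping the voltage compounds that quadratically.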


----------



## crazyeyesreaper (Mar 26, 2010)

Meh, old drivers = not fully accurate, although I'm glad for the review, Wizz. Between your review and the multitude of others, we can get a good idea of where the cards truly stand.

But all I can think is that two 5850s at launch cost me $500, and I get better-than-5970 performance. Going by your review, I hit 300 W max from both GPUs under heavy load, so two 5850s at max non-volt-mod clocks easily beat the 480 for the same price. You need to redo the 5K cards with the 10.3 drivers, Wizz, so we can get a better idea; I've seen a HUGE jump in half my games between 9.12 and 10.3, with my average increase ranging from 5% in Crysis to 25-30% in Dirt 2 and about 15-20% in the Heaven bench.


----------



## rpsgc (Mar 26, 2010)

9.12 drivers? FAIL.

And 320 W at load? HAHAHAHAHAHAHAHA

HA!


----------



## aCid888* (Mar 26, 2010)

I'm just happy my Crossfire 4870's are still holding up well.


Needless to say, I won't be one of the many throwing my cash at this card.


----------



## Black Panther (Mar 26, 2010)

Some rumours appeared to be right after all... 

This was disappointing; this card was supposed to compete with the 5870 (I was actually thinking it might be better than the 5970...). 

It ended up being:

-much hotter 
-only slightly better 
-more expensive 
-much more power-hungry

than the 5870.

I'm sure Nvidia could have done better?


----------



## zyklon (Mar 27, 2010)

Black Panther said:


> Some rumours appeared to be right after all...
> 
> This was disappointing, this card was supposed to compete with the 5870 (I was actually thinking it might be better than the 5970...)
> 
> ...



+1
...very disappointed


----------



## DaedalusHelios (Mar 27, 2010)

xrealm20 said:


> not quite helios -- an overclocked cpu will consume more power at a higher clockspeed even at stock voltage.



Yeah, I have never measured the difference, as I always use a PSU with a much larger wattage than needed.


----------



## LittleLizard (Mar 27, 2010)

320 W AND good performance in Metro 2033.

In my opinion, NVIDIA is using a card with such high power consumption, which works in a post-apocalyptic Russia, to promote nuclear energy. 

Anyway, I like its appearance and performance, but with such high power consumption, if I had the money I would just go for a 5870.

@w1zz: great review


----------



## Mistral (Mar 27, 2010)

Many thanks for the great review, W1zzard. 

It's faster than the 5870, that's what I was hoping for. The power consumption and thermals are wacko crazy nuts though.

So, can we then expect a 470 review with the 10.3 used for ATI?


----------



## cool_recep (Mar 27, 2010)

Finally, NVIDIA revealed its recommended power supply for the GTX 480:









> A device at Sandia Labs releasing its capacitors' charge. It is about the size of a basketball court. It has achieved temperatures of 3.7*10^9 K (highest man-made temperature ever). During the 100 ns discharge, the power output is 2.9*10^14 W, equal to 80 times the power output of all the power plants on earth.



Oh yeah and the Fail Truck...


----------



## HillBeast (Mar 27, 2010)

Wow. I really need to go on eBay and sell my HD 5870 so I can get the more power-hungry, hotter, uglier, and in some cases slower GTX 480, because that's clearly the best card in the world...

I think you were being a bit kind giving it an 8.2. It may be the most powerful single-GPU solution, but when it manages to be hotter and less power-efficient than ATI's most powerful dual-GPU card, you need to factor that into the conclusion a bit more.

Personally, I wouldn't give it more than a 6.


----------



## Steevo (Mar 27, 2010)

Nvidia and Global Warming. 

Teaming up for your future!


----------



## HillBeast (Mar 27, 2010)

crazyeyesreaper said:


> you need to redo the 5k cards with 10.3 drivers wizz so we can get a better idea  ive seen a HUGE jump in half my games between 9.12 and 10.3 my average increase is 5% in crysis to 25-30% in Dirt2  and about 15-20% in Heaven bench



I agree. The newer drivers have made everything so much faster on my system, and apparently they also reduced power consumption, but I've never seen this myself.


----------



## aCid888* (Mar 27, 2010)

HillBeast said:


> Personally I wouldn't give it more than a 6.



When was the last time anything in a TPU review got less than 7.5???


----------



## Semi-Lobster (Mar 27, 2010)

...I still don't get why 10.3 wasn't used?


----------



## HillBeast (Mar 27, 2010)

aCid888* said:


> When was the last time anything in a TPU review got less than 7.5???



lol. True. I think we have set our standards a little low lately. I remember back when a computer with a 300W power supply was over the top but now we have graphics cards which alone need more than that.


----------



## douglatins (Mar 27, 2010)

http://www.pcgameshardware.com/aid,...Fermi-performance-benchmarks/Reviews/?page=17

Funny quote from them

_The charts below show a list of games that we have been testing with the GTX 480/470 and the multi GPU cards GTX 295 and Radeon HD 5970. Please bear in mind that besides the increased rate of incomparability problems, the increased power consumption and the irregular frame distribution (including an input lag) the gaming experience delivered by a multi GPU card can be inferior to the experience on a single GPU card. The results below don't represent the felt performance. _






They failed to say that the 480 consumes a lot more than a 5970


----------



## PopcornMachine (Mar 27, 2010)

Looks like a very powerful card. A giant pig of a very powerful card.

And I wonder how long they will last running at 90 °C+? That's the most surprising thing to me. Can't be good.


----------



## a_ump (Mar 27, 2010)

Hmmm, the GTX 480 is pretty much where it was guessed to fall: 10-15% faster than the HD 5870. So will ATI counter with an HD 5890 or just lower prices?


----------



## douglatins (Mar 27, 2010)

a_ump said:


> hmmm the GTX 480 is pretty much where it was guessed to fall, 10-15% faster than the HD 5870. So will ATI counter with a HD 5890 or just lower prices



They don't need to lower prices, and they still have the all-around perf king, so why bother?


----------



## HillBeast (Mar 27, 2010)

PopcornMachine said:


> Looks like a very powerful card. A giant pig of a very powerful card.
> 
> And I wonder how long they will last running at +90C?  That's the most surprising thing to me.  Can't be good.



Xbox 360s last about 3 months at lower temperatures than those, so let's say... a day?

My Radeon HD 5870 never got above 75 °C before I custom-cooled it, and after putting a Scythe Setsugen on it, it never exceeds 55-60 °C unless I Furmark it.

Worst... Graphics Card... Ever.


----------



## Deleted member 24505 (Mar 27, 2010)

So, I wonder how many companies will produce overpriced full-cover water blocks for the 470/480 now.


----------



## freaksavior (Mar 27, 2010)

Considering the GTX is $499 and the 5870 is $350-400, the 5870 is the better buy here. Maybe drivers will mature and give better performance, though. 

Either way, I'm slightly disappointed.


----------



## pantherx12 (Mar 27, 2010)

So when is the review going to be redone?


----------



## HalfAHertz (Mar 27, 2010)

+1. These are still beta drivers, so maybe after NVIDIA releases some more stable drivers, we could have a re-review?


----------



## a_ump (Mar 27, 2010)

Um, who cares about NVIDIA's "beta" drivers? 9.12 for ATI? Hello, WTF lol


----------



## Dahaka (Mar 27, 2010)

Why 9.12 and not 10.2 or 10.3a? It's ridiculous to see that in this review.

Sorry, guys, but you disappoint a lot with that detail. I don't know... this is frustrating: all day waiting for your review (one of the best reviews on the web), and you use those old, creeping drivers for what? To leave NVIDIA ahead?...

Waiting for a good review, a real one.


----------



## sneekypeet (Mar 27, 2010)

Wow, all the time and effort that went into this review and all you guys can do is complain?

Maybe he wanted to compare out-of-the-box experiences, and since the launches were so far apart he went with release drivers; or it's possibly a typo. Either way, it was his prerogative to choose, and regardless, it is still an informative look at the 480.


----------



## crow1001 (Mar 27, 2010)

Err, the review is meant to give people an idea of the current performance between the GPUs; using three-month-old drivers makes this review invalid IMO.


----------



## sneekypeet (Mar 27, 2010)

while I get your take, it can be resolved with basic math. If ATI got a 10-15% increase, is it really that hard to "figure" the 10.3 numbers?


----------



## Dahaka (Mar 27, 2010)

sneekypeet said:


> while I get your take, it can be resolved with basic math. If ATI got a 10-15% increase, is it really that hard to "figure" the 10.3 numbers?



One thing is to "figure" it, and a totally different thing is to see the real difference, not "figure" it....


----------



## senninex (Mar 27, 2010)

Wow... it doubles your electric bill + increases the heat in your room!

5890 on the way!!!!!!!!!!!!!!


----------



## Jamborhgini313 (Mar 27, 2010)

Major disappointment... I was considering picking one up, but wow, not much difference from the GTX 295.


----------



## sneekypeet (Mar 27, 2010)

I'm not here to defend W1zzard or his results.

I just think you all are being really harsh on his work; this type of thing isn't done in an hour, but over a few days of his time.

All the info is there. 
If it is simply a typo, you all are way out of line.
If he did use those drivers, the information is still worth its weight (as basic math isn't out of my realm of figuring logic). Saying it's worthless in any way is over the top in any case, and that's where I leave this thread.

That doesn't even account for the time and dedication it takes to give you a review in time for release, and to keep a site up that gives you a place and the freedom to slam him and his views.


----------



## pantherx12 (Mar 27, 2010)

sneekypeet said:


> while I get your take, it can be resolved with basic math. If ATI got a 10-15% increase, is it really that hard to "figure" the 10.3 numbers?



It's misleading, simply put.



Found a great conclusion about the 480 on bit-tech XD
"Yes, the GTX 480 offers great performance in our test games, especially in Dirt 2 and Bad Company 2, but compared to the competition, it doesn't make a strong enough case for itself, especially when you consider that there are just so many caveats involved with buying this card. The higher price, the 100W of extra power consumption, scorchingly hot temperatures and a much noisier stock cooler are all extremely detrimental to its desirability. The HD 5870 remains a far better choice if you're a gamer; while we've yet to see how the GTX 480 performs with CUDA apps and Folding, at this stage Fermi looks like a flop."


Also, sneeky, he published the review publicly; we're all welcome to criticise it.


----------



## jamsbong (Mar 27, 2010)

I can't believe many of those rumors are actually true!
The list is below:
1. It is hotter and consumes more power.
2. It is 10-15% faster.
3. It has 480 cores, not 512.

The one thing that was wrong is the power consumption: it is rated at 250 W, not 298 W.

Overall, I'm disappointed. Six months late and it is just a paper launch?

ATI will have a good answer for this card soon; a higher-clocked 5870 can easily close the gap to the GTX 480. 
The only thing ATI has to watch is the tessellation performance of Fermi. ATI has to produce something much faster at tessellation when the time comes.


----------



## boulard83 (Mar 27, 2010)

HardOCP ran GTX 480 SLI and they nearly touched 594 W from the GPUs... ouch!

As others said... power hungry / heat generator, but a very nice card!


----------



## senninex (Mar 27, 2010)

boulard83 said:


> Hard OCP ran a SLI of GTX480 and the nearly touch 594watts from the GPU ... ouch !
> 
> As other said .. power ungry/heat generator but very nice card !



Yes... the GTX 480 is the no. 1 VGA for generating global warming...


----------



## v12dock (Mar 27, 2010)

Meh, time for a 5890 w/ 2GB of memory


----------



## a_ump (Mar 27, 2010)

Yeah, I think we all know that if ATI wanted to, they could release an HD 5890 and take back the crown, or at least match it.


----------



## crow1001 (Mar 27, 2010)

Don't get me wrong, it's still an honest, good review, but all the other sites used the 10.3 drivers that have a major performance boost for the 5850/70; I hope it is an error.


----------



## senninex (Mar 27, 2010)

Sure... 5890 > 1 GHz... 

Wanna OC the Fermi?.............. You may cause a fire in your apartment.


----------



## Black Panther (Mar 27, 2010)

Enrico _Fermi_ (29 September 1901 – 28 November 1954) was an Italian physicist, particularly remembered for his work on the development of the first nuclear reactor...


----------



## bpgt64 (Mar 27, 2010)

850W power draw in SLI...HAHAHAHAHHA


----------



## senninex (Mar 27, 2010)

I believe the ATI guys are with us now....

So please tell us the good news... when will you guys release the HD 5890? I'm waiting for it...


----------



## Animalpak (Mar 27, 2010)

I think, even with lowered frequencies, it's impossible to see a contender to the dual-GPU 5970.

Anyway, the performance crown still goes to NVIDIA.


----------



## pantherx12 (Mar 27, 2010)

Animalpak said:


> I think, even lowering the frequences impossible to see a contender to the dual GPU 5970.
> 
> Anyway the performance crown is still go to nvidia.





You should read some other reviews. 

It's very close between the 5870/480 according to other places; it depends on the game, really.


----------



## Paintface (Mar 27, 2010)

10.3 or not, I've never seen NVIDIA fail like this.

Using 250 W instead of the 150 W of the 5870.

Second loudest card of all.

Goes up to 96 °C compared to the 65 °C of the 5870, most of which now come with custom cooling.

If this card is selling for 500 dollars, the custom-cooled 5870s are up to $70 cheaper...

It's sometimes faster and sometimes slower than the 5870, and everything else isn't exactly in favor of the GTX 480.


----------



## v12dock (Mar 27, 2010)

I wonder how hot the card gets under intense sustained gaming... and how long the chip can last down the road.


----------



## Animalpak (Mar 27, 2010)

aCid888* said:


> I'm just happy my Crossfire 4870's are still holding up well.
> 
> 
> Needless to say, I wont be one of the many throwing my cash at this card.





And how would you see DX11?

You know you'll have to change GPUs.


----------



## dr emulator (madmax) (Mar 27, 2010)

sneekypeet said:


> I'm not here to defend W1zzard or his results.
> 
> I just think you all are being really harsh on his work, this type of thing isnt done in an hour, but with a few days of his time.
> 
> ...



btarunr edits the boss's work, so  
maybe the offending views need censoring and the thread closing


----------



## TheMailMan78 (Mar 27, 2010)

This sucks for all of us. I was really hoping NVIDIA would come out winning this time. Instead we got a 2900 XT.

Also, W1zz just upgraded his review stats/drivers a few months ago. Do you guys have ANY idea how long it takes to run all those tests correctly? If he re-reviewed every time a new driver came out, we would never see a new review! I swear you guys are like clockwork. You bitch about the same things EVERY TIME we get a new GPU review.


----------



## senninex (Mar 27, 2010)

Sapphire HD 5870 Toxic Vapor-X = GTX 480 in performance... 

but on the other specs (temp, noise, power consumption, price, availability, etc.)... Sapphire (ATI) wins.


----------



## TheMailMan78 (Mar 27, 2010)

If you guys want 10.3a results, just take the numbers W1zz posted and add 15%. BAM, you have your FPS.
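That flat "add 15%" rule of thumb takes only a couple of lines of Python. The Heaven 2.0 figure below is the one quoted earlier in the thread; the Dirt 2 number is purely hypothetical, for illustration:

```python
# Estimate Catalyst 10.3 results from the review's 9.12 numbers using
# the flat "add 15%" rule of thumb suggested in this thread.
def estimate_new_driver_fps(fps, uplift=0.15):
    """Apply a flat fractional uplift to a benchmark result."""
    return round(fps * (1 + uplift), 1)

# Heaven 2.0 value is from the thread; Dirt 2 value is hypothetical.
results_9_12 = {"Heaven 2.0 @ 1920x1200": 27.1, "Dirt 2": 60.0}
estimates = {bench: estimate_new_driver_fps(fps)
             for bench, fps in results_9_12.items()}
print(estimates)
```

With the review's 27.1 FPS Heaven result, this lands at roughly 31.2 FPS, still short of the 39.4 quoted upthread for the actual 10.3 drivers, which is exactly why a flat percentage is only a rough guess.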


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> This sucks for all of us. I was really hoping Nvidia would come to win this time. Instead we got a 2900XT.
> 
> Also W1zz just upgraded his review stats/drivers a few months ago. Do you guys have ANY idea how long it takes to run all those tests correctly? If he re-reviewed everytime a new driver came out we would never see a new review! I swear you guys are like clockwork. You bitch about the samethings EVERYTIME we get a new GPU review.





This is an "important" review; it should have been done right.


----------



## TheMailMan78 (Mar 27, 2010)

pantherx12 said:


> This is an "important" review, it should of been done right.



It was. Read the post above yours.


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> It was. Read the post above yours.



What about the people that don't know about that?

More than just us read the reviews.

It's incredibly misleading if Wiz has not made a typo and did in fact use 9.12.

Especially since they offer so much improvement.


----------



## TheMailMan78 (Mar 27, 2010)

pantherx12 said:


> What about people that don't know about that?
> 
> More then just us who read the reviews.
> 
> ...



He used 9.12. And in all fairness, it's only a 15% difference with 10.3a vs NVIDIA's immature drivers. Seems about fair.


----------



## Animalpak (Mar 27, 2010)

We know, the new baby is strong...

I cannot justify all the consumption and heat produced by such a graphics card; it seems ATI has succeeded in its goal.

I am interested in the dual-GPU version; in my opinion, that project has not even been thought of yet.

59 degrees idle, and 90-100 degrees at full load... simply unacceptable.

I think the delay was caused by the engineers' desperation to find proper heat dissipation. And they have failed miserably.


----------



## afw (Mar 27, 2010)

Here are some reviews where the 10.3a preview drivers have been used: 
http://news.firingsquad.com/hardware/nvidia_geforce_gtx_480_470_performance/
http://www.pcgameshardware.com/aid,...eviewed-Fermi-performance-benchmarks/Reviews/
http://www.legitreviews.com/article/1258/1/
http://www.hardwarecanucks.com/foru...iews/30297-nvidia-geforce-gtx-480-review.html
http://www.guru3d.com/article/geforce-gtx-470-480-review/1
http://www.techspot.com/review/263-nvidia-geforce-gtx-480/
http://www.tomshardware.com/reviews/geforce-gtx-480,2585.html
http://www.anandtech.com/video/showdoc.aspx?i=3783&p=1

The GTX performs somewhere between 10-15% better than a 5870...

The only thing that NVIDIA can boast about is that they have the fastest GPU on earth now... 

It could have been better... but I won't say it's an utter failure... 

EDIT: The average load temp reading from all the above sites is around 95 °C... so please make sure you all don't play games during summer.


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> He used 9.12. And in all fairness its only a 15% difference vs Nvidias immature drivers. Seems about fair.




Not at all; that 15% would have made the 5870 pretty much tickle the 480's balls, it would have been so close.

Check other reviews to see. 

That makes a massive difference in the results.

Remember, NV have had 6 extra months to sort things out; immature drivers are no excuse.


----------



## senninex (Mar 27, 2010)

Do ATI cards support 3D Vision?

I want to play next-gen 3D games... on my 42" OLED TV...


----------



## DaC (Mar 27, 2010)

senninex said:


> Is ATI VGA support 3D vision?
> 
> I want to play next gen 3D games... on my 42" OLED display TV...



You could use a plasma TV instead to get even greater power consumption....


----------



## pentastar111 (Mar 27, 2010)

There is NOTHING wrong with the review... This card is just sometimes "marginally" edging out ATI, and with out-of-date drivers vs. immature ones. Not impressive... It's definitely not as impressive as the jump from the 7 series to the 8 series, that's for sure.


----------



## senninex (Mar 27, 2010)

DaC said:


> You could use plasma tv instead to get a even greater power consumption....



Man...

Plasma is out of date...... 

OLED is so much better.............


----------



## Dahaka (Mar 27, 2010)

sneekypeet said:


> I'm not here to defend W1zzard or his results.
> 
> I just think you all are being really harsh on his work, this type of thing isnt done in an hour, but with a few days of his time.
> 
> ...



No one is disputing that; I did reviews a long time ago too, and I know how much you put into them: time, effort, hours without sleeping, and stuff like that.

There are many much better reviews on the web, like AnandTech's, for example.


----------



## DeathByTray (Mar 27, 2010)

Nice review Wizz.
Too bad about the drivers, but the picture is ugly enough for the GTX 480 either way.


----------



## pantherx12 (Mar 27, 2010)

senninex said:


> Is ATI VGA support 3D vision?
> 
> .




Yes, ATI cards support 3D; ATI just doesn't make the software for it. It's just a case of downloading it from somewhere else : ]


----------



## mlee49 (Mar 27, 2010)

Thank you Wiz for silencing the myths.  It's too bad Nvidia's putting out such a hot card.  I really wanted  a 470 but think I'll wait for a dual GPU card before I consider buying an upgrade.


----------



## Steevo (Mar 27, 2010)

NVIDIA has had how many months of delays to optimize drivers? Stop pole smoking your favorite green candy stick and face it: ATI wins, and wins big.


----------



## Sasqui (Mar 27, 2010)

I don't know if it's just me, but an 8.2 score seemed high after reading the article and benches.  I love the review comments.

A waste of good silicon.  When they get the fabs right, clocks up, temps down, perhaps it will impress.


----------



## SUPERREDDEVIL (Mar 27, 2010)

Tnx for the reviews, BUT.....


----------



## senninex (Mar 27, 2010)

Sasqui said:


> I don't know if it's just me, but an 8.2 score seemed high after reading the article and benches.  I love the review comments.
> 
> A waste of good silicon.  When they get the fabs right, clocks up, temps down, perhaps it will impress.



You can read this news about why NVIDIA delayed their product... something about A1 silicon... I don't know.

Read this:
http://www.semiaccurate.com/2010/02/17/nvidias-fermigtx480-broken-and-unfixable/


----------



## DonInKansas (Mar 27, 2010)

mlee49 said:


> Thank you Wiz for silencing the myths.  It's too bad Nvidia's putting out such a hot card.  I really wanted  a 470 but think I'll wait for a dual GPU card before I consider buying an upgrade.



A dual GPU card with this GPU would probably burn down the entirety of Lawrence.


----------



## PopcornMachine (Mar 27, 2010)

Not only do these cards draw way too much power themselves, but they'll cause even more of a drain when people have to turn up the AC so they can stay in the same room. :LOL:


----------



## HillBeast (Mar 27, 2010)

TheMailMan78 said:


> He used 9.12. And in all fairness it's only a 15% difference with 10.3a vs NVIDIA's immature drivers. Seems about fair.



The thing is, NVIDIA's drivers have always been on the ball. They always come out pretty decent, and drivers do improve performance, but not as much as ATI's drivers do.

The other thing is, these chips have been in production since like January and they have had all that time to improve the drivers, so you can't really say the drivers are immature. Especially when the card was 6 months late.


----------



## iamverysmart (Mar 27, 2010)

ATI: Catalyst 9.12


----------



## human_error (Mar 27, 2010)

I feel I may have been a little harsh in my initial read of the review - so much so I forgot to say thanks, w1zz, for doing this. Give it a few days after availability and we'll have more current-driver numbers than you can shake a stick at for comparisons. I think we've all forgotten w1zz doesn't get paid for these reviews, and as others have said they take a long time to bench.

That doesn't mean I don't think this card fails hard - it does - I just feel we should be grateful to have a set of benchmarks out the minute the NDA was lifted, and an SLI scaling review too.


----------



## csendesmark (Mar 27, 2010)

Fermi? you mean *FAIL*-me? 
320W? OMG


----------



## SUPERREDDEVIL (Mar 27, 2010)

When the NVIDIA GTX 495 comes out, ATI will already have the HD 6970 for the win... Admit it, everyone! ATI has the crown right now!


----------



## Melvis (Mar 27, 2010)

Seems to outperform the 5870 in just about everything, well done NVIDIA for that. But if you have a 4870X2 or a GTX 295 then this wouldn't be an upgrade (unless you want DX11); from any other card, then yes, it would be an upgrade.

Shame it's one hot, power-hungry card for a single GPU, wow.


----------



## mlee49 (Mar 27, 2010)

DonInKansas said:


> A dual GPU card with this GPU would probably burn down the entirety of Lawrence.



History could repeat itself (Lawrence, KS was burned down in the 1800s by a bunch of Missouri A-holes - hence Bleeding Kansas).


DX11 titles need to be developed before we can really utilize these new cards; yet another reason not to buy.


----------



## sneekypeet (Mar 27, 2010)

As a cooler guy, this card has make-or-break written all over it for aftermarket cooler manufacturers. If you can cool a Fermi with it, you've got yourself one helluva product.


----------



## Cold Storm (Mar 27, 2010)

There is more to look at than just the factor of things being "up to date". In order for a video card to work right, you get drivers that "mature" it. Drivers work to "fix" and "help" the video card along.

Yes, NVIDIA has had 6 months to work on things. But how many months has ATI had a card out to make a driver "mature", due to having millions, and I mean MILLIONS, of people to work out and find all the problems that the drivers have?

Now, if you really want to go and play hard ball.. driver wise on 9.12.. here's some ss to help you along.

9.12 drivers





*169k reports*

10.1 drivers




*389k reports*

10.2 drivers




*360k reports*



As you can see from the SS, the "safer" drivers to have are the 9.12 drivers. Yes, they are not the most "mature" drivers, but they've made people feel safe using them.

Also, you gotta look at this: 10.3 became "official" as of March 24th, a little over 2 days ago. W1z uses 25 benches to do his reviews, 5 tests for each: 25x5=125. So that's 125 benches he would have to redo in 2 days. Did some sites do it? Yes. Why? How? Because they probably don't have anything else to do but get that going.
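The workload described above can be tallied in a quick sketch (the minutes-per-run figure is an assumed, illustrative number; only the 25x5=125 count comes from the post):

```python
# Re-benchmark workload sketch: 25 benchmarks x 5 tests each, per card.
# The 20-minute figure per run is an assumed, illustrative number.
benchmarks = 25
tests_per_benchmark = 5
runs_per_card = benchmarks * tests_per_benchmark   # 25 x 5 = 125

minutes_per_run = 20                               # assumption: load, run, log
hours_per_card = runs_per_card * minutes_per_run / 60

print(f"{runs_per_card} runs per card, roughly {hours_per_card:.0f} hours of benching")
```

Even under that guessed 20-minute pace, re-running a full card's suite eats a couple of working days, before multiplying by the number of cards in the comparison charts.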

W1zzard has more going on than just a "RELEASE" of a card. GPU-Z has gotta get a workout now. You've got services that have to get going in order to get this new "series" going. He's not here just to do a review, not like some, probably most, sites have it going.




Just my 2 cents...


----------



## HillBeast (Mar 27, 2010)

human_error said:


> I feel i may have been a little harsh in my initial read of the review - so much so i forgot to say thanks w1zz for doing this



Same here. W1zzard worked hard to get the review out, so thanks to him for it. I don't hold it against him for using 9.12 drivers, because it still shows how useless GF100 is compared to a Radeon 5870 that isn't at its full potential.


----------



## HillBeast (Mar 27, 2010)

Cold Storm said:


> 9.12 drivers
> *169k reports*
> 
> 10.1 drivers
> ...



Not really a fair test, seeing as it's Google. 160k of those will most likely be porn or unrelated.


----------



## senninex (Mar 27, 2010)

GTX480?
I'm surprised that I'm not surprised.


----------



## pentastar111 (Mar 27, 2010)

sneekypeet said:


> As a cooler guy, this card has make-or-break written all over it for aftermarket cooler manufacturers. If you can cool a Fermi with it, you've got yourself one helluva product.


That is very true! lol


----------



## Cold Storm (Mar 27, 2010)

HillBeast said:


> Not really a fair test, seeing as it's Google. 160k of those will most likely be porn or unrelated.



Hey, all good. Want to play even more? Go deeper. Go site to site and see what the "SEARCH" button might find you.

Or look at what people here have said.

NVIDIA had drivers out not too long ago, I think it was the first 191 drivers. People will tell others to use them. Why? Because they're a "stronger" driver suite than any of the newer ones. Yeah, after a few "newer" drivers you got one that was better, but you still had people say they tried the newer ones, and they did, just to go back to that certain one.


----------



## HillBeast (Mar 27, 2010)

Cold Storm said:


> NVIDIA had drivers out not too long ago, I think it was the first 191 drivers. People will tell others to use them. Why? Because they're a "stronger" driver suite than any of the newer ones. Yeah, after a few "newer" drivers you got one that was better, but you still had people say they tried the newer ones, and they did, just to go back to that certain one.



I use the latest ATI 10.3 drivers on my computer and have no bugs I can report, but what I can report is a huge boost in performance over older drivers, which is what matters in a review. If a game crashed in the benchmark then they should mention it. They used to do that with the original SLI cards; they would say 'We gave the 7950GX2 0 FPS because it crashed in benchmarks'.

Using old drivers makes the GTX 480 look faster. Which it isn't. Not really.


----------



## Cold Storm (Mar 27, 2010)

HillBeast said:


> I use the latest ATI 10.3 drivers on my computer and have no bugs I can report, but what I can report is a huge boost in performance over older drivers, which is what matters in a review. If a game crashed in the benchmark then they should mention it. They used to do that with the original SLI cards; they would say 'We gave the 7950GX2 0 FPS because it crashed in benchmarks'.
> 
> Using old drivers makes the GTX 480 look faster. Which it isn't. Not really.



Nice that you're using the latest drivers. A cookie should go your way for it. But you took one part of my post and are now going away from the whole thing.

The drivers came out 2 days ago. 25 benchmarks are used, so 125 benchmark runs would have to be redone. Can it be done in 2 days? Yeah... maybe.

His reviews are done a little differently than others'. Yeah, most do the "up to date" games, Just Cause 2, all the latest games. Not him. He probably does the games which most people will still have on their computer, with the factor of replaying them when a new driver comes out.
^^^ this is what I "think" he does, not a statement of what he "really" does ^^^


----------



## yuriylsh (Mar 27, 2010)

*ATI 9.12 driver?*

Unfortunately I had to skip the review after seeing that 9.12 drivers were used for the ATI cards.


----------



## HillBeast (Mar 27, 2010)

Cold Storm said:


> Nice that you're using the latest drivers. A cookie should go your way for it. But you took one part of my post and are now going away from the whole thing.



You go off at me for going away, when you were talking about how they used the old driver because it's more reliable, and I rebutted that by saying the newer drivers are just as good. Just because NVIDIA can't make drivers for s**t anymore doesn't mean ATI are following suit. Your Google search is a completely inaccurate way of finding how many bugs there are in the new drivers.

What I was trying to say is that if bugs arose in the benchmark when using new drivers, he should have mentioned them after the graphs like they used to do in reviews.

If you think the review is fair then get a GTX 480. I am just trying to say the review isn't accurate enough to make a fair judgment from. TechSpot used 10.2 drivers in their review and the 480 barely managed to get past, and in most cases was beaten by, the 5870 - let alone the 5970, which is what this card is really going to compete against, seeing as they have roughly equal TDPs and prices...


----------



## Roph (Mar 27, 2010)

I took my browser elsewhere too when I noticed that the ancient 9.12 drivers were used, and thus all results would be skewed in NVIDIA's favour. I recommend FiringSquad's review; it's quite thorough and also uses drivers actually released this year.

I think I'll call this card the GeForce FX 480. It sounds more fitting. They might as well have just tacked an extra one hundred onto the model number, so we could have another easily recognisable "5 series" GeForce failure.

Overall I'm very disappointed.


----------



## HillBeast (Mar 27, 2010)

Quoting Guru3Ds review (http://www.guru3d.com/article/geforce-gtx-470-480-review/35):

_ATI reaps mucho benefits from the release of their Catalyst 10.3 driver (used in this review). The new driver brings significant performance boosts throughout the Radeon HD 5800 and 5900 series. Performance was enhanced in a lot of game titles. Would ATI not have released Catalyst 10.3 on time, then this review would have looked different. You'll probably notice a review or two out on the web using older drivers._

I rest my case.


----------



## Cold Storm (Mar 27, 2010)

HillBeast said:


> You go off at me for going away, when you were talking about how they used the old driver because it's more reliable, and I rebutted that by saying the newer drivers are just as good. Just because NVIDIA can't make drivers for s**t anymore doesn't mean ATI are following suit. Your Google search is a completely inaccurate way of finding how many bugs there are in the new drivers.
> 
> What I was trying to say is that if bugs arose in the benchmark when using new drivers, he should have mentioned them after the graphs like they used to do in reviews.
> 
> If you think the review is fair then get a GTX 480. I am just trying to say the review isn't accurate enough to make a fair judgment from. TechSpot used 10.2 drivers in their review and the 480 barely managed to get past, and in most cases was beaten by, the 5870 - let alone the 5970, which is what this card is really going to compete against, seeing as they have roughly equal TDPs and prices...




If you want me to go off at you, I can. Trust me, that wouldn't be a hard thing to do. But I haven't.

I get that other sites are using the newer drivers. But you have to look at how reviewers do their reviews. Every review is different. A lot of times a reviewer will even have to use a "certain" thing in order to do their review, or there's the factor that they are "sponsored" by someone, so they have to watch their reviews under a microscope.

I never said the review was "fair". I was just trying to back up, and give reasons for, why a person would, or may, use those "certain" drivers.

Yeah, Google wouldn't be the "ideal" way to search the problem, but it's a start. Yes, a lot of "strange" things may come up with the numbering, but that was just a look at what "might be".

I remember a review I read a few days ago. Great review of a product. But when someone asked why he didn't use a certain product in his review, the reviewer's answer was: give me one and I'll show you the numbers.

Now, I'm glad I got to get in your head, HB. Yeah, these NVIDIA cards are showing some craziness, making us wonder what the hell has been going on.

To me, as a person that has gone back and forth between both camps, I'm glad I have tried out the 5870s... but I do wish I still had my GTX 295.


----------



## Atom_Anti (Mar 27, 2010)

So why didn't the ATI cards use the newest 10.3 driver?? Unfortunately this test does not give the real picture.


----------



## HillBeast (Mar 27, 2010)

Cold Storm said:


> I get that other sites are using the newer drivers. But you have to look at how reviewers do their reviews. Every review is different. A lot of times a reviewer will even have to use a "certain" thing in order to do their review, or there's the factor that they are "sponsored" by someone, so they have to watch their reviews under a microscope.



I know arguing in forums is immature, but I just want you to see my point. Guru3D used the newer 10.3 drivers and said they did because they are so much better, and because of that the GTX 480 was slaughtered by the 5870. Now I'm not trying to say the 480 is a fail because ATI's drivers are better (well, I am, but that's not my point). What I'm trying to say is that the 480 isn't what it was hyped up to be. Even if you consider NVIDIA's drivers immature and ATI's superior, and try to make it even by using the release drivers you get with each card - which makes no sense, because reviews are for the end user to look at and see which card is best at this point in time - you simply can't get that conclusion from a review that uses horribly outdated drivers which are 6 months old.

There is no excuse for it in my opinion, sponsors or no. If they wanted NVIDIA to be happy then I suppose using old drivers for the ATI cards is understandable, but if they wanted ATI to be happy then they would use the new stuff.


----------



## HillBeast (Mar 27, 2010)

Also, I should mention (and I'm not accusing TPU of anything) that I have always found TPU to be more AMD-biased, so the sponsors argument doesn't make sense, because in that case TPU would have just lost AMD as a sponsor.


----------



## Cold Storm (Mar 27, 2010)

Hey, just like everyone, a person has their opinion on things. Yeah, it might have come off wrong in the wording. It's the internet. I was just throwing out factors on why a person "may" use that sort of thing. Calm the review bashing down.


----------



## kingkongtol (Mar 27, 2010)

I don't think I will buy it, maybe after 3 or so revisions...
The GeForce FX 5800 may have failed, but the FX 5900 was not bad; it could still compete with the Radeon 9800.
With the reference design, does this card get gray screens too when playing games?


----------



## Atom_Anti (Mar 27, 2010)

MrMilli said:


> A review that uses 10.3a for all ATI cards:
> http://www.pcgameshardware.com/aid,...eviewed-Fermi-performance-benchmarks/Reviews/



Thanks man, this is the test I was looking for!


----------



## HillBeast (Mar 27, 2010)

Cold Storm said:


> Calm the review bashing down.



Advice to everyone: read several reviews before deciding to buy. I'll leave it at that.


----------



## HillBeast (Mar 27, 2010)

kingkongtol said:


> I don't think I will buy it, maybe after 3 or so revisions...
> The GeForce FX 5800 may have failed, but the FX 5900 was not bad; it could still compete with the Radeon 9800.
> With the reference design, does this card get gray screens too when playing games?



I can actually see NVIDIA repeating what they did with the GTX 2x0 cards, where they came out hot and powerful and then they released the 55nm versions. I can see a GTX 485 coming out in due time, and it will probably do well. That is the main thing I have against the GTX 480: it's so damned hot!


----------



## OnBoard (Mar 27, 2010)

Binge said:


> BWAHAHAHAHAHAHAHA 320 Watts!



Thank you NVIDIA, my GTX 280 is no longer the hottest single chip GPU around 

The idle noise/wattage would go down if they'd drop the 2D voltage. Only a 0.03V drop from the load voltage is tiny; the card would happily work with 0.89V, or a 0.1V drop, on the desktop (and most likely even less).

The GTX 470 will be interesting: with its core/shaders being so much slower in comparison, wattage will also go down and the OCing range will go up. An overclocked GTX 470 will have GTX 480 performance at a lower price than the HD 5870, making it a good buy.

But an OC'd GTX 470 will most likely tickle the 300W barrier as well... One less SM and a couple fewer memory chips won't have that much of an effect.

If a stock GTX 470 is the same 250W (or less) at peak, then it's a contender for my next graphics card. If not, then the GTS 450, but it might not be any faster than my current card. Well, there is always ATI, but the 5850 only brings a 10% performance boost and the 5870 is too expensive, so I'm good for the next few to six months.


----------



## HillBeast (Mar 27, 2010)

OnBoard said:


> The idle noise/wattage would go down if they'd drop the 2D voltage. Only a 0.03V drop from the load voltage is tiny; the card would happily work with 0.89V, or a 0.1V drop, on the desktop (and most likely even less).



My 5870 is 0.95V idle and 1.15V under load, which is a 0.2V drop. My CPU does that same drop, so why can't the GTX 480 do that?

I can see that being a BIOS problem though, rather than a driver issue. I suppose some clever bugger could use NiBiTor to edit the BIOS and bring that down a bit.
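The reason idle voltage matters is that dynamic power scales roughly with frequency times voltage squared. A quick sketch with made-up clocks and voltages (none of these figures are from the review; the 0.89V target is just the number floated above) shows how much a deeper 2D voltage drop could save:

```python
# Rough sketch of why dropping 2D (idle) voltage matters: dynamic power
# scales roughly with frequency * voltage^2. All figures are illustrative.
def relative_dynamic_power(v, f, v_ref, f_ref):
    """Dynamic power relative to a reference operating point (P ~ f * V^2)."""
    return (f / f_ref) * (v / v_ref) ** 2

# Hypothetical card: 1.00 V / 700 MHz under load, idle clock 150 MHz.
idle_at_097 = relative_dynamic_power(0.97, 150, 1.00, 700)  # tiny 0.03 V drop
idle_at_089 = relative_dynamic_power(0.89, 150, 1.00, 700)  # deeper 0.11 V drop

saving = (1 - idle_at_089 / idle_at_097) * 100
print(f"Extra idle dynamic-power saving from 0.97 V -> 0.89 V: {saving:.0f}%")
```

Under this simplified model a deeper undervolt shaves roughly another sixth off idle dynamic power, on top of whatever the idle clock drop already saves; leakage (which Fermi reportedly suffers from) is extra and also falls with voltage.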


----------



## MilkyWay (Mar 27, 2010)

Man, that was a total letdown. Sure, it's slightly ahead in the single-card area, but it's nothing major for the price, and that power consumption is ridiculous for a single card!

They promised some sort of monster, so it should have beaten the 5970, or equalled it, given all that hype.

The 5870 is pretty expensive, but right now, dang, the 5850 looks pretty good for the price.


----------



## lukesky (Mar 27, 2010)

HillBeast said:


> I know arguing in forums is immature, but I just want you to see my point. Guru3D used the newer 10.3 drivers and said they did because they are so much better, and because of that the GTX 480 was slaughtered by the 5870.


Uhh. How is the GTX 480 slaughtered when it is consistently 10-20% faster than a 5870 benchmarked with 10.3a performance drivers? Guru3D came to the same conclusion: the GTX 480 regains the performance crown, but with an excess of heat.


----------



## EchoMan (Mar 27, 2010)

SteelSix said:


> ATI: Catalyst 9.12 ??



I'll now go to another site... pce.


----------



## pantherx12 (Mar 27, 2010)

lukesky said:


> Uhh. How is the GTX 480 slaughtered when it is consistently 10-20% faster than a 5870 benchmarked with 10.3a performance drivers? Guru3D came to the same conclusion: the GTX 480 regains the performance crown, but with an excess of heat.




Price performance ratio.
Power consumption
Heat output


----------



## MilkyWay (Mar 27, 2010)

From what I saw it did regain the performance crown, but not significantly enough. From what I can see you get 10-20 frames per second over the 5870, and it all depends on the game and resolution; for example, on Unreal Engine 3 it's doing pretty well, then on other engines it's only 5 fps faster.

Who wants to pay the extra for that? Add in the ridiculous heat and power consumption, just like pantherx12 said.


----------



## TheMailMan78 (Mar 27, 2010)

Again you guys bitching about the review drivers have no damn clue what you are talking about. I suggest you pull up a chair and let the big boys discuss this new GPU.


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> Again you guys bitching about the review drivers have no damn clue what you are talking about. I suggest you pull up a chair and let the big boys discuss this new GPU.



No clue?


C'mon man, we have plenty of "clue"; that's why we're bitching.


This reminds me of when Kellogg's say their diet cereals help people lose weight, when in reality it's just because their studies were done with 30g of diet cereal vs 50g of regular cereal XD

It's misleading, that's the important point, Mailman, so our complaints are justified.


----------



## MilkyWay (Mar 27, 2010)

Uh, well, I thought it wasn't worth its price based on this review.

I don't know why people are talking about other reviews, since this is the thread for *this* review. You're supposed to discuss the product based on *this* review.

I still think the 5850 is a really good card for the price; if it went sub-£200 it would be great and I could afford one lol.


----------



## v12dock (Mar 27, 2010)

W1zzard quick question: What was the temperature of the room you were testing in?


----------



## pantherx12 (Mar 27, 2010)

v12dock said:


> W1zzard quick question: What was the temperature of the room you were testing in?



22°C, he said in the review.


----------



## BraveSoul (Mar 27, 2010)

At last. Very good review, and Metro 2033 is included; digging this.


----------



## v12dock (Mar 27, 2010)

Ahh, need to learn how to read... lol


----------



## pantherx12 (Mar 27, 2010)

v12dock said:


> Ahh need to learn how to read.. lol,




I think it was only mentioned in the conclusion actually, which is 3x longer than usual, so it's fair enough that you missed it!


----------



## newtekie1 (Mar 27, 2010)

johnnyfiive said:


> Knew it.
> 
> So whos still buying a 480? At $499+ (More like $549), the 5870 seems a lot more appealing now doesn't it?



Well, with the GTX 470 performing the same as the HD 5870, and actually better in DX11, at a projected $50 less in price... no, the HD 5870 doesn't seem a lot more appealing...


----------



## MilkyWay (Mar 27, 2010)

newtekie1 said:


> Well, with the GTX 470 performing the same as the HD 5870, and actually better in DX11, at a projected $50 less in price... no, the HD 5870 doesn't seem a lot more appealing...



uh so the gtx 470 is better than the 5870 and its cheaper? REALLY?


----------



## exodusprime1337 (Mar 27, 2010)

From AnandTech... lol, shows a 5870 beating a 480 SLI setup lol
http://images.anandtech.com/graphs/nvidiageforcegtx480launch_032610115215/22177.png


----------



## EchoMan (Mar 27, 2010)

TheMailMan78 said:


> Again you guys bitching about the review drivers have no damn clue what you are talking about. I suggest you pull up a chair and let the big boys discuss this new GPU.



So I should read a review that pits 3-month-old software against current software? Uninstall plz. Troll out.


----------



## newtekie1 (Mar 27, 2010)

MilkyWay said:


> uh so the gtx 470 is better than the 5870 and its cheaper? REALLY?



Yes, read the review MrMilli posted. In the end, the GTX 470 was equal with the HD 5870, sometimes slightly behind, sometimes slightly ahead, and really ahead in DX11. The HD 5870 goes for $400+; the GTX 470 is supposed to retail for $349 (I would expect $400 pricing in the first few weeks though).

uh so the gtx470 is better than the HD5870 and its cheaper. REALLY.

The HD 5870 wins on power consumption for sure, but really I don't care. If I cared about power consumption, I wouldn't have an HD 4890 or a GTX 285.



MrMilli said:


> A review that uses 10.3a for all ATI cards:
> http://www.pcgameshardware.com/aid,...eviewed-Fermi-performance-benchmarks/Reviews/


----------



## MilkyWay (Mar 27, 2010)

newtekie1 said:


> Yes, read the review MrMilli posted.  In the end, the GTX470 was equal with the HD5870, sometimes slightly behind sometimes slightly ahead, really ahead in DX11.  The HD5870 goes for $400+, the GTX470 is supposed to retail for $349(I would expect $400 pricing in the first few weeks though).
> 
> uh so the gtx470 is better than the HD5870 and its cheaper. REALLY.








Cool, looks like a winner then. If that's true, then the GTX 480 looks terrible in price-to-performance compared to its compatriot the GTX 470.


----------



## entropy13 (Mar 27, 2010)

But in other reviews, the GTX470 only equals the 5850.....


----------



## pantherx12 (Mar 27, 2010)

newtekie1 said:


> Yes, read the review MrMilli posted.  In the end, the GTX470 was equal with the HD5870, sometimes slightly behind sometimes slightly ahead, really ahead in DX11.  The HD5870 goes for $400+, the GTX470 is supposed to retail for $349(I would expect $400 pricing in the first few weeks though).
> 
> uh so the gtx470 is better than the HD5870 and its cheaper. REALLY.




Just a small correction: not in DirectX 11.


Have you misread the data somewhere?

Sure, it wins in Metro 2033 (which is an NVIDIA-optimised game), but it is not "really ahead" in DiRT 2, is it?

And those are the only 2 DirectX 11 benchmarks that both the ATI and NV cards have done.


----------



## EchoMan (Mar 27, 2010)

470 fails more than anything on all fronts


----------



## MilkyWay (Mar 27, 2010)

Man, I think reviews are really erratic in their findings; they never find the same FPS for the same card in the same game.

The fact that there are only 2 DirectX 11 compatible games really makes it unclear what it will be like in the future.


----------



## pantherx12 (Mar 27, 2010)

MilkyWay said:


> Man i think reviews are really erratic in their findings, they never find the same FPS for the same card on the same game.
> 
> The fact there is only 2 direct x 11 compatible games really makes it unclear as to what it will be like in the future.



Only 2?

There's 5 so far : ]


----------



## OnBoard (Mar 27, 2010)

qubit said:


> Nice, It's got the performance crown, I want one!
> 
> EDIT: Ok, taking a slightly longer look at it, I think I'd rather wait for the improved "GTX485" variant which will come with a die shrink and all 512 shaders enabled (hopefully). The GTX480 runs far too hot and noisy for my liking.



As far as I know the next-generation cards will also be 40nm, so there won't be any die shrink. They can only do a 4870->4890 thing with this gen. A GTX 485 could come if they figure out how to keep power draw down (A4 silicon?). I don't think there is much point in releasing the full 512-shader cards at the moment, even though they already have those. It would make the card even more power hungry and noisier.

Save some 512-shader chips for an ASUS GTX 495 Mars? Let them worry about how to cool that monster



HillBeast said:


> My 5870 is 0.95V idle and 1.15V under load which is .2V drop. My CPU does that same drop so why can't the GTX480 do that?
> 
> I can see that is a BIOS problem though rather than driver issue. I suppose some clever bugger could use NiBiTor to edit the BIOS to bring that down a bit.



That they will do once there is support for the new cards and the voltage controller chip. But I doubt even dropping the load voltage will help this one get under 300W. Rather, it seems they have already dropped it to keep it cooler, if ATI runs 1.15V on 40nm.


----------



## TheMailMan78 (Mar 27, 2010)

9.12 drivers vs immature drivers is pretty damn equal. In the BEST case 10.3 adds UP TO 15% better performance. As you can see it's not a big deal. For fuck's sake, do you want me to do the math for you?



newtekie1 said:


> Yes, read the review MrMilli posted.  In the end, the GTX470 was equal with the HD5870, sometimes slightly behind sometimes slightly ahead, really ahead in DX11.  The HD5870 goes for $400+, the GTX470 is supposed to retail for $349(I would expect $400 pricing in the first few weeks though).
> 
> uh so the gtx470 is better than the HD5870 and its cheaper. REALLY.
> 
> The HD5870 wins with power consumption for sure though, but really I don't care.  If I cared about power consumption, I wouldn't have an HD4890 or GTX285.



I really hope you don't trust those over W1zz.


----------



## LAN_deRf_HA (Mar 27, 2010)

entropy13 said:


> But in other reviews, the GTX470 only equals the 5850.....



That's what I was seeing too, which means it's pointless to get a GTX 470 because of the price. Hell, in games like Crysis an overclocked 5850 can beat a stock GTX 480 with the latest drivers. If you were lucky enough to grab one at the $260 launch price, you got the best card deal in recent history, possibly ever.


----------



## btarunr (Mar 27, 2010)

HalfAHertz said:


> +1 These are still beta drivers, so maybe after NVIDIA releases some more stable drivers, we could have a re-review?





a_ump said:


> um, who cares about nvidia's "beta" drivers. 9.12 for ATI? hello, wtf lol



Why didn't people like you (with the same opinion) complain when we first reviewed HD 5870, HD 5970, etc? You think ATI gave reviewers stable, released, WHQL-signed drivers?

And this won't be our last GTX 480 review, so don't worry.


----------



## araditus (Mar 27, 2010)

TheMailMan78 said:


> 9.12 drivers vs immature drivers is pretty damn equal. In the BEST case 10.3 adds UP TO 15% better performance. As you can see it's not a big deal. For fuck's sake, do you want me to do the math for you?



The current NVIDIA drivers for the 480 are very mature, by the way. Let's look at a timeline:
5 months ago (a few weeks after the great, if short on units, 5870 release), NVIDIA released a public statement saying they were done with the Fermi project and had "foundry issues" preventing large volume; this was a corporate lie/cover-up. I'm telling you right now (and there will be no source to back this up because it's speculation, although I am experienced in business) that the issue was: the foundry made their card, they got it, tested it, and it SUCKED. They then spent the next 5 months working on the software side and the thermodynamics of things (this card looks a hell of a lot different than the initial photos of the card). They couldn't re-engineer a new architecture or adjust their own because NVIDIA is cheap and signed a contract with their chip maker (who had made dies already) that the dies would be used for x amount of cards.

So, in my opinion, this card is the final, mature, "best we could manage" from the green camp, so really these drivers vs. the same amount of time ATI had to "mature" theirs is extremely equal.

Fermi has been done for a long time; the release only came when their accountants said, do it now or lose more money. Honestly, you get DX11 and Tessellation, gogo! One other thing (I do not know all the differences at all): hasn't the Quadro version of Fermi been out for a while anyway? Similar basic architecture, right?


----------



## TheMailMan78 (Mar 27, 2010)

araditus said:


> The current Nvidia drivers for the 480 are actually quite mature. Let's look at the timeline: five months ago (a few weeks after the successful, if supply-constrained, 5870 launch), Nvidia released a public statement saying the Fermi project was done but that "foundry issues" were preventing volume production. That was a corporate lie/cover-up. I'm telling you right now (there will be no source to back this up because it's speculation, though I am experienced in business) that the real issue was that the foundry made their card, they got it, tested it, and it SUCKED. They then spent the next five months working on the software side and the thermodynamics of things (this card looks a hell of a lot different from the initial photos). They couldn't re-engineer a new architecture, or adjust their own, because Nvidia is cheap and had signed a contract with their chip maker (who had already made the dies) that the dies would be used for x amount of cards.
> 
> So, in my opinion, this card is the final, mature, "best we could manage" effort from the green camp, so these drivers, versus the same amount of time ATI had to "mature" theirs, are on extremely equal footing.



I'm not talking about the card. I am talking about the drivers. ATI has had MONTHS of MILLIONS of people testing the 5xxx series. Nvidia has had ZERO wild testing.


----------



## Bundy (Mar 27, 2010)

Many thanks for the review Wiz

As a suggestion, I'd prefer to see fewer comparisons in games that pump over 100 fps, only because the usefulness of such info is limited. I'd also like to see minimum-fps comparisons, because that's what really matters to gamers.

I also agree with the driver choice. It makes your review better because we see more closely what the hardware differences are, rather than distorting the figures with different maturity drivers. 

lol, my main issue is that the review has not made it any easier to pick which card I want next. Pros and cons, pros and cons. I wanted a clear winner one way or the other, ah well.

Anyway, well done mate.


----------



## eidairaman1 (Mar 27, 2010)

Then explain why several users are having issues with the 197 drivers?



araditus said:


> The current Nvidia drivers for the 480 are actually quite mature. Let's look at the timeline: five months ago (a few weeks after the successful, if supply-constrained, 5870 launch), Nvidia released a public statement saying the Fermi project was done but that "foundry issues" were preventing volume production. That was a corporate lie/cover-up. I'm telling you right now (there will be no source to back this up because it's speculation, though I am experienced in business) that the real issue was that the foundry made their card, they got it, tested it, and it SUCKED. They then spent the next five months working on the software side and the thermodynamics of things (this card looks a hell of a lot different from the initial photos). They couldn't re-engineer a new architecture, or adjust their own, because Nvidia is cheap and had signed a contract with their chip maker (who had already made the dies) that the dies would be used for x amount of cards.
> 
> So, in my opinion, this card is the final, mature, "best we could manage" effort from the green camp, so these drivers, versus the same amount of time ATI had to "mature" theirs, are on extremely equal footing.


----------



## araditus (Mar 27, 2010)

TheMailMan78 said:


> I'm not talking about the card. I am talking about the drivers. ATI has had MONTHS of MILLIONS of people testing the 5xxx series. Nvidia has had ZERO wild testing.



They had five months of in-house testing, and it was their only focus (other than next gen). Trust me, a lot of man-hours went into those drivers.


----------



## btarunr (Mar 27, 2010)

pantherx12 said:


> This is an "important" review; it should have been done right.



It was done right. All test results were double checked. You sit and do 96 tests on dozens of ATI cards each time a new driver comes out. Then you'll know.


----------



## Steevo (Mar 27, 2010)

newtekie1 said:


> Yes, read the review MrMilli posted.  In the end, the GTX470 was equal with the HD5870, sometimes slightly behind sometimes slightly ahead, really ahead in DX11.  The HD5870 goes for $400+, the GTX470 is supposed to retail for $349(I would expect $400 pricing in the first few weeks though).
> 
> uh so the gtx470 is better than the HD5870 and its cheaper. REALLY.
> 
> The HD5870 wins with power consumption for sure though, but really I don't care.  If I cared about power consumption, I wouldn't have an HD4890 or GTX285.



They do perform better at 1280x1024, but at 1920x1200 and higher the edge drops to zero, so the difference is playing a game at 200 FPS versus 220 FPS at 1280x1024. Whoopee.


At 1920x1200 I beat or match the 480 thanks to this card's overclocking headroom, and THAT is something an enthusiast cares about. My little overclock already crushes the 480 in Unigine Heaven, something Nvidia brags about in PR spin videos.


----------



## TheMailMan78 (Mar 27, 2010)

araditus said:


> They had five months of in-house testing, and it was their only focus (other than next gen). Trust me, a lot of man-hours went into those drivers.



Dude, no in-house team in the world can test as well as drivers in the wild. Sorry, but your argument fails.


----------



## Steevo (Mar 27, 2010)

TheMailMan78 said:


> Dude, no in-house team in the world can test as well as drivers in the wild. Sorry, but your argument fails.



If that is the case, then the idea that Nvidia released better drivers is shit, and I never want to hear it again.


----------



## araditus (Mar 27, 2010)

TheMailMan78 said:


> Dude, no in-house team in the world can test as well as drivers in the wild. Sorry, but your argument fails.



I'm sure you're a driver developer by profession? (Thought not.) But five nerds in a room with a lot of Mountain Dew and one task for five months either go insane or do the best they can, and of the millions you talk about, maybe 1% will actually give feedback (there aren't a million 5xxx cards in existence, btw).


----------



## TheMailMan78 (Mar 27, 2010)

araditus said:


> I'm sure you're a driver developer by profession? (Thought not.) But five nerds in a room with a lot of Mountain Dew and one task for five months either go insane or do the best they can



5 nerds vs MILLIONS of nerds? Hmmmmmm


----------



## 20mmrain (Mar 27, 2010)

*Let's see here..... It's six months late. Its performance increases aren't that impressive. While the card does sometimes outperform the (six-month-older) 5870, in some games and resolutions it gets beaten by a 5870.
It costs more than a 5870 by quite a decent amount in this economy. It is very power hungry, which makes an SLI setup really hard on any power supply under 800 watts.

Plus the thing runs at 94°C under load......... This card is a huge disappointment!!!! Nvidia should be ashamed of bringing this out!!!

Well, there's still one piece of good news about it...... It still has this feature!!!*







*This is Nvidia's 2900XT! 

And why buy it???? By the time they bring out the full line and all the specialty versions of this card start releasing, the ATI refresh or 6800 series will be released! There is no reason to buy this card, 
except maybe for benching and then selling it.

Nvidia------->*

*There go my hopes for any ATI price drops any time soon! Damn it, I was really hoping this card would be better than it is.*


----------



## LAN_deRf_HA (Mar 27, 2010)

Bundy said:


> Many thanks for the review Wiz
> 
> As a suggestion, I'd prefer to see fewer comparisons in games that pump over 100 fps, only because the usefulness of such info is limited.



Since most games are console ports, and consoles are standing still while graphics cards continue to advance, there are very few games that tax modern graphics cards. That's why you see so many reviews now at 2560x1600; it's the only way to make things really taxing. Except this really bugs me, since so few will use such a display. I wouldn't even use one if I had a $50,000 budget for my PC setup.



eidairaman1 said:


> Explain why several users are having issues with 197 drivers then?



People always have problems with drivers, both sides, all the time. Every time there's a new driver release thread someone talks about random problems.


----------



## araditus (Mar 27, 2010)

TheMailMan78 said:


> 5 nerds vs MILLIONS of nerds? Hmmmmmm



Don't post so fast, read the edits.


----------



## TheMailMan78 (Mar 27, 2010)

araditus said:


> I'm sure you're a driver developer by profession? (Thought not.) But five nerds in a room with a lot of Mountain Dew and one task for five months either go insane or do the best they can, and of the millions you talk about, maybe 1% will actually give feedback (there aren't a million 5xxx cards in existence, btw).



And 1% of 10,000 people on different hardware is STILL more than 5 guys in a room. Dude, wild testing will ALWAYS trump a lab.


----------



## araditus (Mar 27, 2010)

TheMailMan78 said:


> And 1% of 10,000 people on different hardware is STILL more than 5 guys in a room. Dude wild testing will ALWAYS trump a lab.



Hate to sound immature, but prove it. How is a bunch of jackoffs dropping these reactors into their mom's Dell (WHICH is the MAJORITY of the consumer market) going to help?


----------



## Deleted member 24505 (Mar 27, 2010)

So jackoffs spend 500 bucks to put one in their mom's shitty Dell? Wake up.


----------



## araditus (Mar 27, 2010)

tigger said:


> So jackoffs spend 500 bucks to put one in their mom's shitty Dell? Wake up.



Sorry, I was referring to the general GPU market, not just 480s.


----------



## TheMailMan78 (Mar 27, 2010)

araditus said:


> Hate to sound immature, but prove it. How is a bunch of jackoffs dropping these reactors into their mom's Dell (WHICH is the MAJORITY of the consumer market) going to help?



Have you ever gotten an error on a GPU that many other people were getting? Let's say gray bars across the screen? Yeah, well, a lot of people on TPU did, and you know what? ATI had no clue until the community told them about it. Why? BECAUSE THE CARDS WERE TESTED IN THE WILD! Why the hell do you think they beta test so damn much? You have to understand something: drivers for a card are not mature until almost the next generation, and are only fully stable about three months in.


----------



## Bundy (Mar 27, 2010)

araditus said:


> I'm sure you're a driver developer by profession? (Thought not.) But five nerds in a room with a lot of Mountain Dew and one task for five months either go insane or do the best they can, and of the millions you talk about, maybe 1% will actually give feedback (there aren't a million 5xxx cards in existence, btw).



I don't know anything about writing drivers, but I hardly know of any card released in the past few years that didn't see significant performance improvements from driver updates in the first months after release. You are probably right about how the drivers are made, but in practice we see different.


----------



## araditus (Mar 27, 2010)

Never had a problem with any driver, ever (should probably mention I haven't downloaded them all). Believe me if you want or not; I've been in this game since 2002 and have played every mainstream title on all sorts of hardware.


----------



## TheMailMan78 (Mar 27, 2010)

araditus said:


> Never had a problem with any driver, ever (should probably mention I haven't downloaded them all). Believe me if you want or not; I've been in this game since 2002 and have played every mainstream title on all sorts of hardware.



Then you and a unicorn should have a baby.


----------



## LAN_deRf_HA (Mar 27, 2010)

This thread is fun, except for w1zzard... unless he's used to people taking a dump and smearing it all over his work.


----------



## araditus (Mar 27, 2010)

TheMailMan78 said:


> Then you and a unicorn should have a baby.



That line doesn't help the topic. Seems you might be feeling personally attached to this. I'll be in FL next month; maybe you can babysit for me while I go out on my boat?


----------



## TheMailMan78 (Mar 27, 2010)

araditus said:


> That line doesn't help the topic. Seems you might be feeling personally attached to this.



Oh, I'm not. I'm just stating that it's beyond RARE that you have NEVER had a problem. But that's beside the point. As long as you understand the drivers are far from mature.


----------



## araditus (Mar 27, 2010)

TheMailMan78 said:


> Oh, I'm not. I'm just stating that it's beyond RARE that you have NEVER had a problem. But that's beside the point. As long as you understand the drivers are far from mature.



Quote me on this: I'll stick to my guns about how I feel about the drivers. In three months, let's compare reviews and see what the drivers do; only then will I humbly accept defeat.


----------



## Bundy (Mar 27, 2010)

Shhhh, I can hear a mods footsteps coming.


----------



## TheMailMan78 (Mar 27, 2010)

araditus said:


> Quote me on this: I'll stick to my guns about how I feel about the drivers. In three months, let's compare reviews and see what the drivers do; only then will I humbly accept defeat.



It's simple, man. Look at old reviews. You'll see I'm right.


----------



## hertz9753 (Mar 27, 2010)

I have been camping on this thread for a while now.  The flaming is not cool guys.


----------



## TheMailMan78 (Mar 27, 2010)

hertz9753 said:


> I have been camping on this thread for a while now.  The flaming is not cool guys.



Who's flaming?


----------



## araditus (Mar 27, 2010)

TheMailMan78 said:


> It's simple, man. Look at old reviews. You'll see I'm right.



Why on earth would you compare an old generation's history to a new architecture? Don't answer that, I won't be here to read it; heading out to the airport.


----------



## DOM (Mar 27, 2010)

Have you guys seen the EVGA SuperClocked cards on Newegg?

GTX 470: core 607 → 625 MHz, for $30 more than the $350 base price

GTX 480: core 700 → 725 MHz, for $30 more than the $500 base price

wow, some super cards XD


----------



## hertz9753 (Mar 27, 2010)

TheMailMan78 said:


> Who's flaming?



I guess I should have called it defending, for you and sneekypeet, and please stop, pantherx12.


----------



## thebluebumblebee (Mar 27, 2010)

ERazer said:


> Six months of wait and you get a toaster oven



Ah, that's why they have exposed metal on the heat sink.  Grill marks!


----------



## theorw (Mar 27, 2010)

It's quite a letdown to me... not worth the 500+ EUR it will cost here, considering $ just turns into EUR here....
Considering Nvidia used to pwn ATI with every new launch, this is mediocre...
Much better off with a 5850 at around 270 EUR. 2x 5850s will be around the same price and beat the 480.
Just IMO anyways.


----------



## wahdangun (Mar 27, 2010)

Wew, crazy-ass card, what the hell is Nvidia doing?

I've never seen 320 W power consumption on a single GPU, so the rumours were right after all.



EDIT: Thanks Wizz, I really appreciate your benchmarks. I know how difficult it is for you to bench all these cards, and that you didn't have much time to bench with the latest driver; I hope you will benchmark it again with ATI's newer driver.


----------



## hertz9753 (Mar 27, 2010)

thebluebumblebee said:


> Ah, that's why they have exposed metal on the heat sink.  Grill marks!



I agree, I can't get past the heat that this card puts out.


----------



## HillBeast (Mar 27, 2010)

Why does Fermi feel like Intel's NetBurst? They had something awesome planned but didn't have the time to refine it, and because of that they rushed it out and it was nowhere near what they intended it to be. They made a chip way too big, inefficient, and way too hot.

Sound like NetBurst?


----------



## Kitkat (Mar 27, 2010)

SteelSix said:


> It sure does. ATI should have launched all options today, mainly the 2GB cards. I'm ready to buy now damn it!!



They're pretty close, all April.


----------



## W1zzard (Mar 27, 2010)

i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interested in buying a tech site ?


----------



## Tartaros (Mar 27, 2010)

theorw said:


> It's quite a letdown to me... not worth the 500+ EUR it will cost here, considering $ just turns into EUR here....
> Considering Nvidia used to pwn ATI with every new launch, this is mediocre...



Not always; remember the FX series. And things change; ATI doesn't have to lag behind every time Nvidia launches a new GPU.

I expected a worse card. A single-GPU card that can more or less match a 295, without the inherent problems of a dual-GPU card (like in Dirt 2), is quite fine. But I guess the real thing will come later with revisions, GTX 485s, 495s and so on... 320 W is just crazy.


----------



## LAN_deRf_HA (Mar 27, 2010)

W1zzard said:


> i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interested in buying a tech site ?



How much?


----------



## W1zzard (Mar 27, 2010)

LAN_deRf_HA said:


> How much?



email me serious offers to w1zzard@techpowerup.com


----------



## TAViX (Mar 27, 2010)

qubit said:


> Nice, It's got the performance crown, I want one!



What crown?? The 5970 is still the leader! Doesn't matter if it's dual-GPU. *It's one card*!

I don't want to sound like a fanboy, but heck, the GTX 480 drains more power than a 5970!!! And it's hotter and louder....


----------



## btarunr (Mar 27, 2010)

Oh, now look, you pissed him off. Good job :shadedshu


----------



## jimmyz (Mar 27, 2010)

W1zzard said:


> email me serious offers to w1zzard@techpowerup.com



It wouldn't matter who bought it; without W1zzard it wouldn't be TPU. You would be selling a site that would be doomed. This community is here in large part because of you!


----------



## kylew (Mar 27, 2010)

Cold Storm said:


> There is more to look at than just things being "up to date"... in order for a video card to work right, you get drivers that "mature" it. Drivers work to "fix" and "help" the video card along.
> 
> Yes, Nvidia has had 6 months to work on things... but how many months has ATI had a card out to let its drivers "mature", with millions, and I mean MILLIONS, of people to work out and find all the problems the drivers have?
> 
> ...



You've failed a bit there; you are aware that Google searches for the individual words used, right?

Searching "ATi" "10.2" "driver" "fail" has no bearing on the stability of the drivers; a certain ordering of words and numbers just makes the result count go up or down.


----------



## Fourstaff (Mar 27, 2010)

Great. I went to sleep thinking TPU is the best site after yet another excellent review, and woke up to this mess.


----------



## TAViX (Mar 27, 2010)

Fourstaff said:


> Great. I went to sleep thinking TPU is the best site after yet another excellent review, and woke up to this mess.



Good. I'm still sleeping...


----------



## Cap'n Killmore (Mar 27, 2010)

First time poster, long time reader...

I seriously felt like I needed to make this post... this is by far my favorite HW site on the interwebs.

Hearing the way people always complain, I can understand w1zzard's feelings about this.

W1zz does a great job, every single time. Think about it for once: if you had spent the better part of a week writing a review, just to see people bashing it. Not fair, guys!

Maybe W1zz has other things to do than pleasing an unpleasable crowd of whiners.

Perhaps he has an education to look after...

Seriously, how much do you pay for these reviews?

W1zz, please don't give up... I have found your reviews to be the most accurate on the web 95% of the time.

love from here mate


----------



## kylew (Mar 27, 2010)

W1zzard said:


> i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interested in buying a tech site ?




You should know better than most that a person in your position will be most subject to criticism.

To give up and have a moan that people are moaning is just lame.

If you can't accept and hear people's views then you shouldn't publish reviews anyway.

The issue that people seem to have is that most other websites managed to use the 10.3a/b drivers for the ATi cards; that automatically makes a distinction between your review and the other reviews.

It's fair enough that people mention that.

Seems like something big to overlook, in my opinion.

Do you honestly care that people are moaning? Do you think they have no grounds? Are you unwilling to accept their views?

If you do care, then you should care for the right reasons; what's a tech review website without its readers? Just words on a web page.

Be bothered about WHAT they're saying, not simply because you think they're moaning.

I personally think you should have used the most recent ATi drivers considering it's a big launch. That's my opinion; you don't seem to think it's a problem, fair enough, your opinion too.

But don't whinge that your reader base criticises your review; you should be used to criticism by now, especially from the most enthusiastic computer hardware enthusiasts.


----------



## W1zzard (Mar 27, 2010)

kylew said:


> If you can't accept and hear people's views then you shouldn't publish reviews anyway.
> 
> If you do care, then you should care for the right reasons, what's a tech review website without its readers? Just words on a web page.



yes, that's why i'm quitting, seems i'm not the right person to do this


----------



## kylew (Mar 27, 2010)

W1zzard said:


> yes, that's why i'm quitting, seems i'm not the right person to do this





Additionally, if you think "us people" have always got to have a moan about your reviews, does that not even slightly trigger a "maybe there's an aspect of the reviews that I could do better which is why people are moaning"?


----------



## mR Yellow (Mar 27, 2010)

Wizz, don't quit. Just give the tedious benchmarking to someone else.


----------



## W1zzard (Mar 27, 2010)

kylew said:


> Additionally, if you think "us people" have always got to have a moan about your reviews, does that not even slightly trigger a "maybe there's an aspect of the reviews that I could do better which is why people are moaning"?



what do you think i have been doing for the past 7 years? it's just now that i realized that people here can never be pleased ... i feel like i am wasting my precious time and will go do something else


----------



## Fourstaff (Mar 27, 2010)

This kind of reminds me of the PP and MM problem. Some people on TPU just don't appreciate what others have done for us, and instead bash them for their errors. W1z isn't our review slave, and we shouldn't treat him as such. If there's anything you don't agree with him on, you can kindly ask him about it; it's not like he is a stubborn mule.


----------



## sneekypeet (Mar 27, 2010)

You just don't get it. How about compassion, or constructive criticism?

Please don't kick the guy when he is down.

You had the review, and the place to complain..... all brought to you by the guy who has been told, pretty much, that he knows nothing about what he does.

As a reviewer I take all criticism to heart. I put hard time and effort into everything I turn in; in essence it's a little piece of me. 

If you don't like the review or the way it was done, I completely get that; just don't disrespect the man or rub your muddy shoes throughout his house.


----------



## kylew (Mar 27, 2010)

W1zzard said:


> what do you think i have been doing for the past 7 years? it's just now that i realized that people here can never be pleased ... i feel like i am wasting my precious time and will go do something else




Well, I only said that because TPU reviews appear to use the exact same method for years now; I just thought people complaining + same method = you should mix things up a bit.

There will always be moaners, and a lot of them will moan for the sake of moaning anyway.


----------



## W1zzard (Mar 27, 2010)

kylew said:


> TPU reviews appear to use the exact same method for years now



power, noise, average performance, performance per watt, performance per dollar, more resolutions, 2560x1600, 64-bit OS

all that has changed and probably even more, if you dont know what you are talking about shut up and go somewhere else


----------



## kylew (Mar 27, 2010)

sneekypeet said:


> you just don't get it. How about compassion or constructive criticism?
> 
> Please don't kick the guy when he is down
> 
> ...



I don't think anyone is seriously saying he doesn't know what he's talking about.

The best make mistakes at times; from what I've read (not the whole thread) it's mostly people saying "10.2s, bad move".

If people are genuinely saying he doesn't know what he's doing/talking about, then do you seriously believe their "opinion" should be valued in any way when it's blatant provocative talk?


----------



## TAViX (Mar 27, 2010)

W1zzard said:


> yes, that's why i'm quitting, seems i'm not the right person to do this





W1zzard said:


> what do you think i have been doing for the past 7 years? it's just now that i realized that people here can never be pleased ... i feel like i am wasting my precious time and will go do something else



Quitting is definitely not the answer. On the contrary.

Think about this: the higher the number of people visiting the forum, the higher the percentage of trolls, flamers, or uneducated kids among them. It's just the way it is. Also, 70% of your readers are kids; they have never worked a day in their lives and they think they know everything. Just like in politics, you cannot judge an entire nation by the actions of only a few people...


----------



## btarunr (Mar 27, 2010)

Additionally, look up "reasonable". Catalyst 10.3 was released when, yesterday? We have 16 ATI cards in today's review (a relative performance chart, so they all need to be up-to-date). 26 benchmarks across five resolutions works out to roughly 100 tests per card (not every benchmark runs at every resolution). That's 1600 tests. Try sitting next to a hot pile of hardware watching sixteen hundred tests go through without a hitch, then typing all those figures down, and then coming up with accurate graphs. 

Additionally, if other tech sites want to show you Catalyst 10.3 scores, see if they're benching this many ATI cards, and whether they're put through this many tests. 

Also see if the HD 5870 turns into Superman with 10.3, such that it can suddenly pwn the GTX 480 or even come up with significantly better scores. 

Once again, if you expect the moon from a tech site despite the effort put into this review, choose with your web browser.
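For scale, the workload math above can be sketched in a few lines. The card, test, and test-count figures come from the post; the per-test runtime is my own assumption, not a number from the thread:

```python
# Rough sketch of how quickly a full re-bench balloons:
# cards x tests per card, times an assumed average run time.
cards = 16             # ATI cards in the relative-performance chart
tests_per_card = 100   # ~26 benchmarks across five resolutions
minutes_per_test = 5   # assumed average, including setup (hypothetical)

total_tests = cards * tests_per_card
hours = total_tests * minutes_per_test / 60

print(total_tests)       # 1600
print(round(hours, 1))   # 133.3 hours of benching for one driver bump
```

Even at an optimistic five minutes per test, that is weeks of machine time, which is why a one-day driver refresh across all sixteen cards isn't realistic.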


----------



## kylew (Mar 27, 2010)

W1zzard said:


> power, noise, average performance, performance per watt, performance per dollar, more resolutions, 2560x1600, 64-bit OS
> 
> all that has changed and probably even more,* if you dont know what you are talking about shut up and go somewhere else*



Wow, I hope you're not being serious there and aiming that me. No need for being arsey like that.

Also note that I said "appear". I didn't say they were exactly the same, however they do come across as very similar. Maybe it's the layout, I don't know, but I know I wouldn't complain about a fresh new look.

However, this is my very first time "complaining" and I'm not so much as really "complaining" as such, rather pointing out my observations.

Telling people to shut up and go away is the wrong (and seemingly immature) attitude to have, especially considering I (feel) haven't said anything out of line to you.


----------



## kylew (Mar 27, 2010)

TAViX said:


> Quitting is definitely not the answer. On the contrary.
> 
> Think about this: the higher the number of people visiting the forum, the higher the percentage of trolls, flamers, or uneducated kids among them. It's just the way it is. Also, 70% of your readers are kids; they have never worked a day in their lives and they think they know everything. Just like in politics, you cannot judge an entire nation by the actions of only a few people...



This too, I mean, you should take criticism on board, but don't take it to heart so much just because it's negative.

I'm sure most of what happens at TPU is a team effort anyway.

As I said before, there are always going to be people that just want to troll, that's the way things are.


----------



## Fourstaff (Mar 27, 2010)

btarunr said:


> Additionally, look up "reasonable". Catalyst 10.3 was released when, yesterday? We have 16 ATI cards in today's review (a relative performance chart, so they all need to be up-to-date). 26 benchmarks across five resolutions works out to roughly 100 tests per card (not every benchmark runs at every resolution). That's 1600 tests. Try sitting next to a hot pile of hardware watching sixteen hundred tests go through without a hitch, then typing all those figures down, and then coming up with accurate graphs.
> 
> Additionally, if other tech sites want to show you Catalyst 10.3 scores, see if they're benching this many ATI cards, and whether they're put through this many tests.
> 
> ...



QFT. The amount of work just to bench a card is a lot. And the benchers have a life other than benching.


----------



## Tartaros (Mar 27, 2010)

> I don't think any one is seriously saying he doesn't know what he's talking about.
> 
> The best make mistakes at times, from what I've read (not the whole thread) it's mostly people saying "10.2s, bad move".
> 
> If people are genuinely saying he doesn't know what he's doing/talking about, then do you seriously believe that they should have their "opinion" valued in any such way when it's blatant provocative talk?



Maybe you forget this isn't the first time this has happened. When you get a kick in the nuts every time you do something, it's natural to get fed up.

If you really want to know what I'm saying, become a reviewer or fansubber yourself. You'll end up a misanthrope.


----------



## kylew (Mar 27, 2010)

btarunr said:


> Additionally, look up "reasonable". Catalyst 10.3 was released when, yesterday? We have 16 ATI cards in today's review (a relative performance chart, so they all need to be up-to-date). 26 benchmarks across five resolutions works out to roughly 100 tests per card (not every benchmark runs at every resolution). That's 1600 tests. Try sitting next to a hot pile of hardware watching sixteen hundred tests go through without a hitch, then typing all those figures down, and then coming up with accurate graphs.
> 
> Additionally, if other tech sites want to show you Catalyst 10.3 scores, see if they're benching this many ATI cards, and whether they're put through this many tests.
> 
> ...



Fair enough that they came out yesterday; I don't think many people were suggesting that you re-review all the ATi cards with the 10.3 drivers, only the relevant ones such as the 5850 and 5870.

Other sites seemed to have the time to do so (quite a few, by the looks of things).

I don't think many people are expecting "the moon", nowhere near that at all; I know I'm not, at least.

But I'm sure most people would see it as simply "well, other sites managed to use the 10.3s, why couldn't TPU?".


----------



## Wile E (Mar 27, 2010)

W1zzard said:


> i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interested in buying a tech site ?



Don't let douchey comments get to you, W1z. If you can deal with people like me being on the site for years, you're strong enough to handle anything, lol.

At any rate, in your opinion, if you take power consumption out of the equation, 480 or 5870 @ 1920x1200 resolution?


----------



## TAViX (Mar 27, 2010)

...sometimes trolling around and making long, useless posts on the forum is good for traffic, hehe...


----------



## Fourstaff (Mar 27, 2010)

kylew said:


> Fair enough that they came out yesterday, I don't think many people were suggesting that you re-reviewed all ATi cards using the 10.3 drivers though, only the relevant cards such as the 5850 and 5870.
> 
> Other sites seemed to have the time to do so (quite a few by the looks of things).
> 
> ...



Right, so we expect W1z to do 1600 tests AND provide us with juicy graphs, all within a day. I recommend you try running a full benching suite the way W1z did on any graphics card and time how long it takes you. Then come back when it's done and report the time you took. Anyone with a brain would know that's an absurd request.


----------



## Cap'n Killmore (Mar 27, 2010)

this came to mind : http://verydemotivational.files.wordpress.com/2010/03/129128286672236624.jpg


----------



## kylew (Mar 27, 2010)

Fourstaff said:


> Right, so we expect W1z to do 1600 test AND provide us with juicy graphs all within a day. I recommend you to try to do a full benching suite the way W1z did on any graphics card and report how long it took you. And then come back when its done with the time you took. Anyone with brains would know that is an absurd request.



How did other sites manage the same task?


----------



## Wile E (Mar 27, 2010)

kylew, just drop it. Seriously, you are now being a douche. You made your point, now stfu.


----------



## Fourstaff (Mar 27, 2010)

kylew said:


> How did other sites manage the same task?



Oh, so the other sites provide reviews as thorough as W1z's. And they do it without pay. And they have a life.


----------



## btarunr (Mar 27, 2010)

kylew said:


> Fair enough that they came out yesterday, I don't think many people were suggesting that you re-reviewed all ATi cards using the 10.3 drivers though, only the relevant cards such as the 5850 and 5870.



That's not how it works. The test-bed should stay uniform at all times, with the same drivers in place. If you plan to change the drivers, you rebench every single card. Again, you can't come up with a relative performance chart from a salad of different cards running with different settings. Credibility takes a blow.

Credibility also takes a blow when you come up with things such as "relevant cards". The HD 5870 doesn't deserve special status in this review; it isn't even in the same price range. It's just a presentation of how the cards play out.



kylew said:


> Other sites seemed to have the time to do so (quite a few by the looks of things).



Show me such a site with 1600 tests in the review. 



kylew said:


> I don't think many people are expecting "the moon", no way near that at all, I know I'm not at least.
> 
> But I'm sure most people would see it as simple as "well other sites managed to use 10.3s, why couldn't TPU?".



Because it shouldn't matter. A review strives to be neutral, and you read a review taking note of test-bed configuration before you get into tests. Giving certain cards certain drivers is not a show of neutrality, just as sticking to 9.12 is not a show of partiality.


----------



## DaedalusHelios (Mar 27, 2010)

W1zzard said:


> what do you think i have been doing for the past 7 years? it's just now that i realized that people here can never be pleased ... i feel like i am wasting my precious time and will go do something else



W1zzard, 

You are a good guy with good reviewing skills and known for one of the most influential and trusted utilities on GPUs. When enthusiasts think GPU they think GPU-z, W1zzard, and TechPowerUp among other things not directly associated to your site. People trust you and your opinion to make purchases with fairly large amounts of money. That is power that you don't directly see although it reaches a far greater scope than just this forum. You are respected across the internet. If that power and fame isn't enough to keep going with reviews then stop reviewing. I just want you to enjoy life. Whatever that takes, do it. Life is too short to do it any other way.


----------



## kylew (Mar 27, 2010)

Wile E said:


> kylew, just drop it. Seriously, you are now being a douche. You made your point, now stfu.



Can't have an opinion that isn't popular?  

I'm just as entitled to express my opinion as others. I don't see why I'm the one getting moaned at, considering I've been far more reasonable than those doing the real moaning.

So you can take your "stfu" and use it yourself considering you don't seem to be able to have a debate like a mature adult.


----------



## Wile E (Mar 27, 2010)

kylew said:


> Can't have an opinion that isn't popular?
> 
> I'm just as entitled to express my opinion as others. I don't see why I'm the one getting moaned at, considering I've been far more reasonable than those doing the real moaning.
> 
> So you can take your "stfu" and use it yourself.



But the problem is, you keep dragging out the issue. State your opinion, and move on. No need to keep going with it. All you are doing at this point is pissing people off.


----------



## kylew (Mar 27, 2010)

Wile E said:


> But the problem is, you keep dragging out the issue. State your opinion, and move on. No need to keep going with it. All you are doing at this point is pissing people off.



I got replies, I replied to them. Seriously, talk about dragging it on when you're doing the very thing yourself.


----------



## TAViX (Mar 27, 2010)

Back on topic.

Still, does anyone else besides me think that nvidia won big mostly in the POPULAR games and nvidia-optimized games??? I mean, I'm looking at games or benches with good implementations, like 3DMark or those with the Source engine....hmmm

The 3DMark thing is also very interesting. I was expecting more from nvidia here, especially since it was always their favorite bench...


----------



## crow1001 (Mar 27, 2010)

Wile E said:


> kylew, just drop it. Seriously, you are now being a douche. You made your point, now stfu.



+1

Yeah, W1zz is a legend: ATITool, GPU-Z. OK, maybe some have gone over the top with the driver thing, but either way TPU is highly respected on the web and is among the top sites people visit for hardware reviews and the latest tech info. I always recommend TPU to members on other forums when it comes to tech reviews.

W1zz and his crew must be doing something right.


----------



## LAN_deRf_HA (Mar 27, 2010)

Yeah sure, I'd like to have seen it done with the new drivers, but let's face it, that would have been practically impossible with the depth of comparison you get with w1zzard's reviews. Too many cards are compared for retesting them all to be that frequent. The only thing I can think of that would have been possible with the time constraints would have been to mix driver versions and just list the driver version next to each card.... but that sort of defeats the point of the comparison.


----------



## kylew (Mar 27, 2010)

btarunr said:


> That's not how it works. The test-bed should stay uniform at all times, with the same drivers in place. If you plan to change the drivers, rebench every single card. Again you can't come up with a relative performance chart with a salad of different cards running with different settings. Credibility takes a blow.
> 
> Credibility also takes a blow when you come up with things such as "relevant cards". HD 5870 doesn't deserve special status in this review, it isn't even the same price-range. It's just a presentation of how cards play out.
> 
> ...



The reason I mentioned relevance, regardless of price bracket, is that the GTX 480 will naturally be assumed to go up against the 5870, and the GTX 470 against the 5850; considering their performance, it's not something you'd wonder about.

I'd reckon, when it comes down to it, price aside, people will be wondering what they should get: a 5870 or a GTX 480.

Do you REALLY rebench all included cards each time you review a new one?


----------



## btarunr (Mar 27, 2010)

Wile E said:


> At any rate, in your opinion, if you take power consumption out of the equation, 480 or 5870 @ 1920x1200 resolution?



Look at the performance/$ @ 1920x1200 graph. If even $$$ isn't a factor, GTX 480. Yours is an open bench, so cooling isn't an issue.


----------



## Frick (Mar 27, 2010)

kylew said:


> I got replies, I replied to them, seriously, talk about dragging it on when you're doing the very thing your self.



You're dragging it on. I just came back from a week's vacation to this. You are dragging it. Stop that.

Anyway, Fermi is pretty much exactly what I thought it would be in gaming. A bit too hot, but it scales really well with overclocking. Should be interesting to see what happens when people put some LN2 on them.


----------



## DaedalusHelios (Mar 27, 2010)

kylew said:


> Do you REALLY rebench all included cards each time you review a new one?



For an entire brand, if the driver version changes and is being compared in a chart, then yes.


----------



## crow1001 (Mar 27, 2010)

TBH you would have to be a masochist to go with a 480. It's a terrible release: overclock a 5870 and the 480 is irrelevant, the temps are crazy, the power draw is borderline ridiculous, and it's way overpriced.


----------



## DaedalusHelios (Mar 27, 2010)

crow1001 said:


> TBH you would have to be a masochist to go with a 480, it's a terrible release, overclock a 5870 and the 480 is irrelevant, temps are crazy, power draw is borderline ridiculous, way overpriced.



These are launch drivers, so accurate performance figures with mature drivers are really months away from release.


----------



## AMDfur (Mar 27, 2010)

I've always used TPU for my daily dosage of hardware news, and of course will do so in the future. I wonder what this place would be without guys like w1zz, btarunr and the other guys who make this "the" neutral and correct place to be.

So, for this fantastic and neutral review, I really don't give a **** that an older driver was used on the ATI cards. I think this is now well known. The "funny" thing is that the nearly six-month-old cards from ATI perform very well against the fresh card from nVidia, and really, fanboy or not, ATI is in the driver's seat, evolution-wise. I was really shocked that the rumors about nVidia were actually true. What happened to them, really? Still, ATI has the best single card in the 5970, but I think nVidia will come up with a dual solution soon as well.

Ok, well, great that the competition lives on; great to see a competitive card from nVidia again! Not my flavor, though.


----------



## Tartaros (Mar 27, 2010)

Now I realize we need to get mules and slaves to build the new power plant for the 3-way SLI review  I don't want w1z's house collapsing into a black hole when he turns the PC on.


----------



## Wile E (Mar 27, 2010)

crow1001 said:


> TBH you would have to be a masochist to go with a 480, it's a terrible release, overclock a 5870 and the 480 is irrelevant, temps are crazy, power draw is borderline ridiculous, way overpriced.



But what about from the standpoint of someone like me? Temps and power are a total non-issue for me. I have a huge PSU and I watercool everything. Plus, let's not forget the benefit of CUDA apps.

Although I do have enough psu, there's still the power bill to consider. 

It's just such a tough decision for me. I really like the folding and CUDA power of the nv cards, but the power consumption of the ATI cards is so spectacular.


----------



## Tartaros (Mar 27, 2010)

About temps, it seems people forget the temps on the 8800s and the HD 3000 series... The 3870 X2 was about 100° under full load, if I remember right. And my 8800 GTX ran at about 90°.


----------



## crow1001 (Mar 27, 2010)

DaedalusHelios said:


> It is release drivers so accurate performance figures with mature drivers would be months from release really.



Yeah, I think we'll see a refresh before major driver boosts; the card is THAT bad.



Wile E said:


> But what about from the standpoint of someone like me. Temps and power are a total non-issue for me. I have a huge psu and I watercool everything. Plus, lets not forget the benefit of CUDA apps.
> 
> Although I do have enough psu, there's still the power bill to consider.
> 
> It's just such a tough decision for me. I really like the folding and CUDA power of the nv cards, but the power consumption of the ATI cards is so spectacular.



If cash is not an issue for you, then yes, the 480 is the card for you for folding and CUDA.


----------



## LAN_deRf_HA (Mar 27, 2010)

You know, it's worth pointing out: I've noticed that usually the people most likely to comment are the small percentage who have an issue... the large percentage of satisfied people go on their way merrily. This is one of the reasons why Newegg ratings don't always accurately reflect a product's actual reliability.


----------



## DaedalusHelios (Mar 27, 2010)

crow1001 said:


> Yeah I think we will see a refresh before major driver boosts, the card is THAT bad.



Same situation as ATi releasing the 3 series against the G92s and failing hard. Now Nvidia is in the same situation, plus the tabloid nature of fringe blogs grating on the nerves of those with decency and politeness.

The way I see it, Nvidia made a new card which is pretty good for a prototype of what is to come. They now have a new architecture to build upon and revise. It's like the building stage of an American NFL team: you don't bet on them to take it all the way to the Super Bowl in the first year of a rebuild, as they are creating a new foundation (team). My only beef with Nvidia is that the prices are too high right now. If they lower prices and accept less of a margin, they should be fine and more competitive this series.


----------



## Melvis (Mar 27, 2010)

I can't believe what I've been reading in this thread; I'm absolutely appalled by some people's replies. And then to make W1zzard want to give it all away, geez, that's just bad.

I take my hat off to W1zzard for what he does. Putting out a great review in such a short amount of time is, to me, just amazing. To me his reviews are the best on the net, and I only go by his reviews for information on new products. They are presented well, across a wide range of apps, and IMO are not biased reviews, and that's what I'm looking for. TPU is the ONLY forum I'm signed up on for computer.......anything, that's how good I think it is.

It takes a huge amount of time to do a review; heck, it takes me all day just to bench 3 different cards on three different systems, let alone running bench tests across 30 different games/apps/etc.

I'm just disgusted to see people whine that much over something that takes a lot of hard work and time. To get out an excellent review that we ALL have been waiting for, for so long, and then be beaten down for it? tsk tsk :shadedshu

If you leave, W1zzard, you will be missed.


----------



## shevanel (Mar 27, 2010)

Did I hear him say you need *2!!* Fermis to run 3 displays???

Good god, 60°C+ at idle!! IDLE!!

He said it gets to 96°C, then the fan blasts away at the same loudness as a 4870 X2 and louder than a GTX 295! Holy crap! So average load temps are 90°C with 50 dBA



> It's a pleasant *22* degrees this time of the year being Spring, but the GTX 480 still scores a scorching 96 degrees Celsius on typical gaming load. Crossing the 100 degrees mark won't be tough for this GPU in even slightly hotter places, especially with Summer coming up.



So in the summertime I need to keep my house @ 71°F with constantly moving cool air to keep this card from turning my room into a friggin' sauna.. yeah, I remember you making me sweat, Mr. GTX 275..

I'm thrilled with the performance in lower-res gaming, but after seeing the *325 W load draw*, *MASSIVE heat*, *fan noise*, *price* and limited availability, I think I'm going to pass on trying this sucker out, even though I wanted to; I just cannot sacrifice a cool and quiet bedroom for an average gain of 9-13% over what I currently have.

The only way I would buy this is if the hottest girl on my block only dated dudes with GTX 480s (block-peen), and I've never even met a chick who even knows what an nvidia is LOL..

If it wasn't so hot and loud this would be the best deal ever, but there is a HUGE sacrifice to be made to live alongside one of these crock pots.


----------



## shevanel (Mar 27, 2010)

I think w1z's use of 9.12 was a great way to compare the GTX 480 at launch to the 5870's performance at launch. So what if he didn't use 10.3 .. don't you think the drivers nvidia releases 3 months from now will make that bench obsolete?

The review was perfect at comparing the gtx 480 to the competition as if they were both released at the same time.

NOW DIGG IT or TURN in YOUR TPU BADGE!


----------



## HillBeast (Mar 27, 2010)

I would just like to apologise to W1zz. I know it would be hard doing the review, getting it out on time and making few to no errors in it, and he did a pretty bang-up job covering all the bases. My comments earlier about the drivers were a little rude, so I do apologise.

Please don't quit man. I really do just want to know why you chose to use 9.12 drivers, and if you can tell us then I think everyone will be happy.


----------



## Wile E (Mar 27, 2010)

shevanel said:


> *Did I hear him say you need 2!! fermis to run 3 displays???*



It's not like a single ATI card can run 3 displays in games with decent frames anyway. You would still need 2 or more to get acceptable framerates at higher settings.

Now, needing 2 cards for 3 displays does suck from a desktop/2D standpoint.


----------



## LiveOrDie (Mar 27, 2010)

A bit of a letdown if I say so myself; not much better than 2x GTX 280s if you look at it that way.


----------



## Wile E (Mar 27, 2010)

Live OR Die said:


> abit of a let down if i say so not much better than 2xgtx 280's if u look at it that way



The 5870 isn't any better than two 4870s either. But 4870s and 280s don't have DX11, do they?


----------



## human_error (Mar 27, 2010)

w1zzard,

I do feel a lot of the hate directed at the article is not because people don't think it's very good (I know we would all rather have one of your reviews to read than anyone else's, especially when it comes to the analysis at the end where we get all the price/power/performance metrics). I feel we have hate here atm because the people who wanted ATI to show nvidia up are disappointed the 480 is better than the 5870 in fps, and the nvidia fans are disappointed that it didn't beat the 5870 by enough and is a power hog. Both these groups are annoyed because of how the cards perform, but are taking it out on the review and each other (which is unfair, but this is the internet and people can be dicks over the slightest thing).

Many of us understand the hard work that goes into the review (hell, I get bored benching my own card to check its overclock is stable, let alone the number of cards you deal with). If anyone who is complaining stopped to think how long the review has taken you so far, and then how much longer it would take to re-do the ATi cards with newer drivers, and then the older nvidia cards with newer drivers to be fair to them, I'm sure very few people, if any, would complain. Yes, I would prefer 10.3 drivers, but because you can't be expected to spend your whole life testing, I accept that isn't possible. The review is still very thorough; I can see where the new cards have strengths and weaknesses in the games, and with all the other data you provide I can happily see everything I want/need to about them.

Everyone else needs to remember this isn't an ATi vs Nvidia article. It is a review of a new card, and the other cards are thrown in not as a competition, but as a point of reference which lets people get a feel for where the card sits in comparison to others. W1zzard's reviews are always impartial, and when it comes to reviewing new parts the focus is on what is needed: how do these new cards perform? If you want a competition between ATi and Nvidia to see which cards bench highest or give the highest fps, take it to the forums in a benchmarking thread, not to a thread about a review. The review delivers what it should: a thorough analysis of how Fermi does. As with all of W1zzard's reviews, I don't feel like I need to go read other reviews to find missing info on the new card. All I want is here, and that shows W1zzard has succeeded in producing a comprehensive picture of what Fermi is.

*Everyone, can we stop discussing why 9.12 was used (it would have taken a LOT longer to test everything again with 10.3s and newer drivers for all cards; we wouldn't have a Fermi review at all for a week or two, all so we don't have to add 10% in our heads for ATI cards if we really care). We should start discussing Fermi, which is what this thread is meant to be about (and also the review, but not something which has been argued over to death).*


----------



## oily_17 (Mar 27, 2010)

HillBeast said:


> I really do just want to know why you chose to use 9.12 drivers, and if you can tell us then I think everyone will be happy.



Well, if you take 7 ATI cards x 21 benchmarks at 5 resolutions, that's 735 tests.

If each test took 2 minutes, that would be ~24 hours of non-stop testing, for drivers that were released 2 days ago.
Maybe he had other things to get done in that time as well: a new release of GPU-Z and the GeForce GTX 480 PCI-Express Scaling review.
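For what it's worth, the arithmetic above checks out. A quick back-of-the-envelope sketch in Python, using the card/benchmark counts from the post; the 2-minutes-per-test figure is the post's assumption, not a measured number:

```python
# Rough estimate of re-benching effort for the ATI cards alone,
# using the figures quoted in the post above.
cards = 7             # ATI cards in the comparison charts
benchmarks = 21       # benchmarks run per card
resolutions = 5       # resolutions tested per benchmark
minutes_per_test = 2  # assumed average per test run

tests = cards * benchmarks * resolutions
hours = tests * minutes_per_test / 60

print(tests)            # 735 individual test runs
print(round(hours, 1))  # 24.5 hours of non-stop benching
```

Scale that up to the full 16-card, 26-benchmark suite mentioned earlier in the thread and the "absurd request" point makes itself.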


----------



## shevanel (Mar 27, 2010)

Well, I know someone wouldn't buy a GTX 480 for non-gaming purposes, but a security firm can slip a single 5000-series card in and run multiple monitors for cameras or something like that, right?

I don't know, dude. I was really excited when I came home an hour ago to read the GTX 480 review, and while the performance is impressive, I think nvidia expects too much of a sacrifice from the user for moderate performance gains.

I never owned ATI and thought I never would... always chose the NV brand over the ATI brand my whole life, but when the 5800-series benches came out I just had to have one.. it was a well-rounded video card, and I was hoping the GTX 480 review would make me want to own one, but it would be like buying a dump truck as a commuter vehicle.

I know the majority of people are saying the same things, but I don't think people truly realize the bigger picture here, which is that it takes a 6-month-late $500 card pushed to its limits in power consumption, heat, noise and PSU requirements to outpace a $400, 6-month-old card.

I just hoped for more. I would have sold my 5870 for $325 if this card from NV wasn't such an upset.. I was really hoping for something great from NV, and instead all we were given was a glimpse at what sacrifices must be made to get 4.8 more fps in Dirt 2 or 5 more frames in Metro 2033. An overclocked 5870 still shines..

Too bad the 5000 series didn't use a 384-bit bus, or else NV would've really had to raise the bar.

I'm hoping people don't have any stability issues with these $500 beasts; the card manufacturers can't afford massive RMAs.


----------



## 983264 (Mar 27, 2010)

Well, I don't know if this could help you guys...

If you're the kind of guy who's hooked on the latest PC hardware, doesn't care about power bills as long as the rig is powerful, doesn't care about the card's problems as long as you want it, doesn't care about the price tag, and doesn't care about global warming, go for the GTX 480.

But if you're on a budget and need a powerful card at just the right price, don't want to increase your bills, want the card with the fewest problems as of now, care about global warming, and still want something that is among the latest hardware, go for the 5870...

My conclusion about the GTX 480: not advisable in hot regions, especially for users who aren't in cool places, and WTH, what a power hog... I can say this because I live in the Philippines and it's God-&@%$ very hot here because of El Niño, and my rig is not water cooled, only air cooled...


----------



## LiveOrDie (Mar 27, 2010)

Wile E said:


> 5870 isn't any better than 2 4870's either. But 4870's and 280's don't have DX11, do they?



True, but it's not even worth going to a DX11 card till the end of the year; by then they will have dropped in price. I was hoping the 480 would open up a gap like when the 8800s came out, but I guess I was wrong.


----------



## shevanel (Mar 27, 2010)

Maybe GTX 295s will stop being $529 now.


----------



## nt300 (Mar 27, 2010)

Well, it's true Fermi runs HOT. I can't believe the GTX 480 doubles the HD 5870 in watts  Charlie, with his original article, was correct: something is wrong with this Fermi.
The performance gap is not by much. I say no thank you to Fermi. ATI has won another day.

How long will Fermi last, since it's hotter than a frying pan


Tartaros said:


> About temps, it seems people forget the temps in the 8800's and hd3000... I guess the 3870x2 was about 100º when full if I remember. And my 8800gtx was about 90º.


That's irrelevant; we are talking about complex hardware with advancements in process and design. These Fermi cards should run cooler, but it's quite evident there's something wrong with this design, and Nvidia really needs a re-spin or something.


----------



## HalfAHertz (Mar 27, 2010)

W1zzard said:


> power, noise, average performance, performance per watt, performance per dollar, more resolutions, 2560x1600, 64-bit OS
> 
> all that has changed and probably even more, if you dont know what you are talking about shut up and go somewhere else



TPU is always the first place I go for my hardware fix; hell, it's at the top of my bookmarks list. You have provided one of the most comprehensive reviews on the web, so don't let a few rotten eggs get you down! My suggestion is to keep up the great work, and maybe drop benchmarks of some older games if you find the workload overwhelming. You review like 20+ games? That's more than my entire gaming collection, simply crazy


----------



## SteelSix (Mar 27, 2010)

W1zzard said:


> email me serious offers to w1zzard@techpowerup.com



Ouch.

W1zzard, I initially questioned 9.12 and feel bad that so many have responded so harshly. Your reviews are by far the most detailed, most informative ones to be found. Your pics and detailed card tear-downs are absolutely priceless. NO ONE DOES THIS like you do. You're the only one who takes the time, every time, to tear a card down. I've linked to your photos dozens of times here and on other forums to answer people's questions.

Have a nice weekend, blow off some steam, and please don't let the negative comments upset you. Thank you sir, TPU is an amazing site.


----------



## brandonwh64 (Mar 27, 2010)

From the looks of it, the 480 seems to be exchanging blows with the 5870/GTX 295 and sometimes even a 5850/4870 X2.

For the price of these cards and the power consumption, in reality it's not worth it, IMO.


----------



## HalfAHertz (Mar 27, 2010)

If you are an fps-crazed fanatic, then this is the card for you. It manages to beat the 5970 in some cases! And we all know that single-GPU solutions provide more stable gameplay and a better overall experience than SLI/CFX.


----------



## HillBeast (Mar 27, 2010)

oily_17 said:


> If it took 2min per test that would be ~24h non stop testing for drivers that were released 2 days ago.



I was more referring to at least 10.1 or 10.2.


----------



## laszlo (Mar 27, 2010)

Lol, good card, but the power draw is almost that of a full PC.


----------



## HillBeast (Mar 27, 2010)

nt300 said:


> Wel its true Fermi runs HOT. I can't believe the GTX 480 doubles the HD 5870 in WATTs  Charlie with his original article was correct, something is wrong with this Fermi.
> The performance gap is not by much. I say no thank you for Fermi. ATI has won another day.
> 
> How long will the Fermi last for since its hotter than a frying pan
> ...



Like I said: Netburst.


----------



## Solaris17 (Mar 27, 2010)

kylew said:


> How did other sites mange the same task?



They manage because:

1. They don't run as many tests.

2. They don't get as many cards. Just because we don't have the Toxic or Atomic or XFX card etc. up doesn't mean he doesn't already have it.

3. They will get the new drivers into a later review. These benches were obviously done first. What, you think w1zz got his card yesterday?

4. Seriously, STFU if you don't know what you're talking about. Go get some hardware to review, then come back and talk to the adults when you have a clue, instead of just jumping on some bandwagon. What does it matter anyway? Are you actually going to buy one? And if you are, what do the drivers matter? Go download the new ones; problem solved.


----------



## brandonwh64 (Mar 27, 2010)

HalfAHertz said:


> If you are a fps crazed fanatic then this is the card for you. It manages to beat the 5970 in some cases! And we all know that 1GPU solutions provide a more stable gameplay and a better overall experience than Sli/CFX



I might have overlooked it, but I didn't see a single bench where it beat the 5970.


----------



## BarbaricSoul (Mar 27, 2010)

thanks for the thorough review Wizz, I appreciate it.

As for Fermi's performance, I'm happy I went from nvidia to ATI and got a 5870 with all things considered


----------



## shevanel (Mar 27, 2010)

brandonwh64 said:


> i might of overlooked it but i didnt see a single bench were it beat the 5970



http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/27.html


----------



## pantherx12 (Mar 27, 2010)

I am so confused; why is everyone getting annoyed with the people who have criticized?

You post something publicly, expect criticism. I do, at any rate. That way I'm pleasantly surprised when I get none.

Having read the entire thread, I have to say no one was "harsh". Harsh would have been calling W1zz names; pointing out what we perceive as a flaw in a review is not name-calling

Anyhow, the main fact of the matter is that the card should be compared to how cards perform now.

That's all people had an issue with.

Not harsh, not wrong.

An opinion which people are perfectly within grounds to have and to talk about.

No sorry from me 



Nothing more to say about the GPU or the review, mind you, so I won't be checking the thread again; PM me if you wanna reply


----------



## brandonwh64 (Mar 27, 2010)

shevanel said:


> http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/27.html



Oh, I see, it was Metro 2033. The only reason it got such better frames is the heavy PhysX that game uses. If you notice, the ATI cards are at the bottom of the list mostly because of that issue.

To compare this game on an nvidia card to an ATI card without PhysX: I used a 9600 GT with this game and got around 20-30 FPS at 1600x1200, 4x AA, all gfx on medium. Well, the same test with my new 5850 got almost the same result, with the 5850 ahead by 5 FPS.

Is there any other bench where the 480 beats the 5970 besides Metro 2033?

BTW

*YOU ROCK W1ZZ!!!!!!*


----------



## WhiteLotus (Mar 27, 2010)

The noise on that thing is ridiculous. I shouldn't have to be deafened every time I load up a pretty game.


----------



## Solaris17 (Mar 27, 2010)

WhiteLotus said:


> The noise on that thing is ridiculous. I shouldn't have to be deafened every time I load up a pretty game.



Let me tell ya, dude, if it gets into the 90s like that, rooms are going to get hot. My GX2s did, and that will bake a room.


----------



## WhiteLotus (Mar 27, 2010)

I already have the morning sun coming into my room; that and one of these would actually start boiling me alive.


----------



## shevanel (Mar 27, 2010)

It's in the 90s in a 71F environment.

I usually keep my house at about 77F if I want to have any extra money to spend after the power bill comes in.

My ol' GTX 275 used to make my room 3-4F warmer than the rest of the house... if the house is 77, my room was fucking 80. Unacceptable.

It even caused the window AC at my old house to condense all over the floor every time I turned my PC on.

Play with these hot-ass cards all you want... might lose a few pounds too.

96 degrees Celsius = 204.8 degrees Fahrenheit 

with a hair dryer pumping heat into your room @ 50dbA AWESOME!
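For reference, the conversion quoted above checks out; a quick sketch in plain Python (nothing assumed beyond the 96 °C load temperature from the review):

```python
# Sanity check of the Celsius-to-Fahrenheit figure quoted above.
def c_to_f(celsius: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit: F = C * 9/5 + 32."""
    return celsius * 9 / 5 + 32

print(c_to_f(96))  # -> 204.8, the GTX 480 load temperature in Fahrenheit
```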


----------



## TheMailMan78 (Mar 27, 2010)

DaedalusHelios said:


> W1zzard,
> 
> You are a good guy with good reviewing skills and known for one of the most influential and trusted utilities on GPUs. When enthusiasts think GPU they think GPU-z, W1zzard, and TechPowerUp among other things not directly associated to your site. People trust you and your opinion to make purchases with fairly large amounts of money. That is power that you don't directly see although it reaches a far greater scope than just this forum. You are respected across the internet. If that power and fame isn't enough to keep going with reviews then stop reviewing. I just want you to enjoy life. Whatever that takes, do it. Life is too short to do it any other way.



Damn it DaedalusHelios I NEED W1zz reviews. Don't be telling him to be all happy and shit.

W1zz as the ban stick dummy I command you to keep reviewing and beating me. If you close the site where will I go? THINK OF THE CHILDREN!


----------



## shevanel (Mar 27, 2010)

Rememeber that rocket sled demo?

It's available for download now http://www.nvidia.com/object/cool_stuff.html#/demos/2117

but I guess NV doesn't want people seeing it run on ATI cards, because this one is GTX 400 series only; you will be denied.

ty nvidia


----------



## Solaris17 (Mar 27, 2010)

TheMailMan78 said:


> Damn it DaedalusHelios I NEED W1zz reviews. Don't be telling him to be all happy and shit.
> 
> W1zz as the ban stick dummy I command you to keep reviewing and beating me. If you close the site where will I go? THINK OF THE CHILDREN!



its true im 12 and what is this? if i dont get reviews ill refuse to eat cheerios get beat by my rents and kill myself


----------



## HillBeast (Mar 27, 2010)

shevanel said:


> Rememeber that rocket sled demo?
> 
> It's available for download now http://www.nvidia.com/object/cool_stuff.html#/demos/2117
> 
> ...



They did that with Adrianne and Human Head. Most of their stuff apart from the PhysX demos doesn't work on ATI.


----------



## TAViX (Mar 27, 2010)

shevanel said:


> http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/27.html



That game is optimized for Nvidia and it also runs physics on the GPU, not the CPU. Completely irrelevant! Just like testing Batman: A.A. with that $hit on.


----------



## SteelSix (Mar 27, 2010)

TAViX said:


> That game is optimized for nvidia and also it uses phisics on the GPU not CPU. Completely irrelevant! Just like when testing Batman:A.A. with that $hit on.



Love that 7970 pic in your sig! Damn if that XFX custom 5970 coming out doesn't make one wonder..


----------



## Yellow&Nerdy? (Mar 27, 2010)

Great review as always. I especially like the "Performance per Dollar/Watt", TPU is the only site that provides that.

The GTX480 wasn't bad at all in performance and price, both sitting exactly between the 5870 and 5970. But it's a big pile of steaming fail in thermal performance, that is, power consumption, noise and temps. It consumes power like crazy, more than a 5970. And the temps the card puts out are in no way acceptable. A card should never go beyond 90 degrees, even with Furmark, not to mention normal load like games or Vantage. The fan is noisy too, even though it doesn't cool the card properly. 

And Nvidia has had time to tinker and tweak the card, 6 months to be exact, so I did expect these issues to be somewhat solved. Makes me think that either Fermi wasn't even half-done when ATI came out with the 5000 series, or Nvidia has been lazy, OR the Fermi architecture is just too big and hot. Try again, Nvidia.


----------



## tkpenalty (Mar 27, 2010)

The performance crown still goes to AMD at the moment... the card is terribly disappointing when you consider how long Nvidia has taken with it. Not really their fault, more like TSMC's fault. I don't think Nvidia even contemplated that the silicon would run so hot. 

They don't really need this card to sell, however, since they're making more money with Tegra and other lines (hint: the cockroach GPU).


----------



## Sasqui (Mar 27, 2010)

So I gotta ask - should we expect a 470 review soon?


----------



## entropy13 (Mar 27, 2010)

Considering we've reached 40C for most of the week (and summer's just starting!) I wonder if you can only use Fermi in watercooled setups, or with aftermarket coolers, and an air-conditioned room, in this type of weather...


----------



## facepunch (Mar 27, 2010)

Great review wizz; after reading this I'm glad I got my 5970


----------



## hv43082 (Mar 27, 2010)

Kinda disappointing performance.  Can they make the dual GPU version of this with such high power draw and heat output?


----------



## Dahaka (Mar 27, 2010)

W1zzard said:


> yes, that's why i'm quitting, seems i'm not the right person to do this



Cry me a river ..

Don't be so dramatic; there is a mistake, a huge one, but it is the first I've read here.

Recognize the mistake and fix it, not just for me, but for all the users and readers you have had over time.


----------



## btarunr (Mar 27, 2010)

Using Catalyst 9.12 is not a mistake, not a flaw, nothing there to fix.


----------



## Black Panther (Mar 27, 2010)

Why don't you quit dragging this out?
I see no 'mistake' either, just a dam good, informative, and in-depth review.

Don't you think W1z would have used 10.3 had he _wanted_ to?


----------



## newtekie1 (Mar 27, 2010)

Seriously, this thread certainly did take a turn...lol

W1z still does the best reviews on the net. PERIOD.  They are the only reviews I trust fully.

The drivers uses are not that old, something like 3 months, W1z tends to do a complete rebench with the latest drivers every quarter, which means new driver review will likely be coming.  W1z could have benched just the HD5800 cards with the new drivers, but then people would still bitch about the older drivers being used with the older cards, and the review not helping give an idea of how much of a performance boost people get over their current cards...  It is lose lose.  I'd rather see one single driver used for all the cards than some cards with one driver and others with another.

Having a few nice constructive comments about the older drivers would have been fine, however IMO going on and on about it, posting the same non-constructive comments about it and implying that W1z doesn't know what he is doing is nothing more than trolling and flaming, and it should be handled as such with infractions and bans.  You don't just come onto *HIS* website and start bashing his work, it isn't acceptable.

If you can't make your comments constructive, DON'T make any.

IMO, posts about the drivers by the following people should have been deleted, and met with infactions or bans:
crow1001
Reefer86
rpsgc
Semi-Lobster
a_ump
Dahaka

And mind you, that isn't everyone that talked about the drivers, just those that were really just trolling about them and had crap like WTF and FAIL in the post, and nothing contructive.

IMO, the comments for a review should be able the product, and not the review process itself.  A constructive comment or two about a certain issue with the review is fine, but once it is posted by someone else, that is it, no need to repost the same thing 30 times.


----------



## Mussels (Mar 27, 2010)

why was cat 9.12 used? 5K cards get quite the performance boost in the latest drivers.

the reason it matters is especially with the 5970, it would have very outdated crossfire profiles for the newer DX11 games - 3 months could be the difference between one GPU or both GPU's being used in some titles.


----------



## W1zzard (Mar 27, 2010)

added your oh so precious catalyst 10.3 scores for the 5870 ... 2% was definitely worth the drama .. 5970 coming later

now ask all your 10.3 review websites whether they used 10.3a or the real whql build


----------



## Mussels (Mar 27, 2010)

W1zzard said:


> added your oh so precious catalyst 10.3 scores for the 5870 ... 2% was definitely worth the drama .. 5970 coming later
> 
> now ask all your 10.3 review websites whether they used 10.3a or the real whql build



10.3, 10.3a and 10.3 OGL4 :S
betas make it messy!


we ask YOU to test it, because we trust your results!


----------



## Cold Storm (Mar 27, 2010)

nah, it's not that betas make mistakes.. Betas aren't "official"..

Thx W1zzard for the add. 


Stuff has to start somewhere..


----------



## Dahaka (Mar 27, 2010)

Black Panther said:


> Why don't you quit dragging this out?
> I see no 'mistake' either, just a dam good, informative, and in-depth review.
> 
> Don't you think W1z would have used 10.3 had he _wanted_ to?




Ok, let me ask you something: how much did Nvidia pay for this review....


----------



## Cold Storm (Mar 27, 2010)

Dahaka said:


> Ok, let me ask you something, how much pay nvidia for this review....



let me ask you something. 


want to stop trolling?


----------



## W1zzard (Mar 27, 2010)

Mussels said:


> 10.3, 10.3a and 10.3 OGL4 :S
> betas make it messy!
> 
> 
> we ask YOU to test it, because we trust your results!



i'm not testing a beta driver that appears right before the launch of the biggest competitor. you're still not happy with 10.3 whql ? apparently all of the 9.12 naysayers fell for amd's propaganda of 15% faster drivers


----------



## pantherx12 (Mar 27, 2010)

W1zzard said:


> added your oh so precious catalyst 10.3 scores for the 5870 ... 2% was definitely worth the drama .. 5970 coming later
> 
> now ask all your 10.3 review websites whether they used 10.3a or the real whql build




I appreciate you going to the trouble.

If you consider that the sample 480 you have only overclocks 10%, a 2% overall difference just from drivers is still a difference that needs to be taken into account, eh 

Although I see that in a few games the little extra the drivers give puts it ahead of the 480 XD 


Cheers again Wiz.

Whilst we're here, are there any other DX11 games you can use for future reviews?
Does AVP have a benchmark, for example?


Also I realise I said I wouldn't post again but this doesn't count


----------



## Mussels (Mar 27, 2010)

W1zzard said:


> i'm not testing a beta driver that appears right before the launch of the biggest competitor. you're still not happy with 10.3 whql ? apparently all of the 9.12 naysayers fell for amd's propaganda of 15% faster drivers



happy with anything you test - we wanted to know if 9.12 was holding them back due to the age, from a verified source (read: not AMD's release notes)

10.3 is legit WHQL, so its quite valid to use that one.


----------



## pantherx12 (Mar 27, 2010)

W1zzard said:


> apparently all of the 9.12 naysayers fell for amd's propaganda of 15% faster drivers



I'm going off my friends'/associates' results using the new drivers, actually 

I didn't even know AMD had stated such a thing


----------



## mtosev (Mar 27, 2010)

I was expecting it to be on par with the 5970. Oh well


----------



## TotalChaos (Mar 27, 2010)

W1zzard said:


> i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interesting in buying a tech site ?


The ones that bitch are usually the ones that have never seriously taken the time to try and do a VGA card review. It takes huge amounts of time and effort, and that's if things go according to plan. Once you bring a whole new product range into the picture, no doubt that time frame triples as driver and compatibility questions arise.  W1zzard, I hope you realize that you carry a high level of respect among many sites, not just here at TPU, and that once the de-stressing passes and you recover from GPU burnout you will be here for a long time providing quality reviews.


----------



## oily_17 (Mar 27, 2010)

TotalChaos said:


> The ones that bitch are the ones that have never seriously taken the time out to try and do a VGA card review. It takes huge amounts of time and effort and that's if things go according to plans. Once you bring a whole new product range into the picture no doubt that time frame increases three times as drivers and compatibility questions arise.  W1zzard I hope you realize that you do carry a high level of respect among many sites, not just here at TPU and that once the de-stressing passes and you recover from GPU burnout that you are here for a long time providing quality reviews.



Couldn't agree more. It's the weekend W1zz, go drink some beer and visit some hookers


----------



## pantherx12 (Mar 27, 2010)

TotalChaos said:


> The ones that bitch are the ones that have never seriously taken the time out to try and do a VGA card review.



Don't be presumptuous


----------



## TheMailMan78 (Mar 27, 2010)

W1zzard said:


> i'm not testing a beta driver that appears right before the launch of the biggest competitor. you're still not happy with 10.3 whql ? apparently all of the 9.12 naysayers fell for amd's propaganda of 15% faster drivers



At 2%. I thought those 15% increases were too good to be true. And there you have it, ya dumb asses. NOW do you understand why this was all pointless and the review was in fact awesome?

W1zz, rest assured that most people on this site are very grateful for your work. I wouldn't have spent my time on that logo if I didn't believe in this place......except for newtekie1. He's just a fanboi.


----------



## newtekie1 (Mar 27, 2010)

W1zzard said:


> added your oh so precious catalyst 10.3 scores for the 5870 ... 2% was definitely worth the drama .. 5970 coming later
> 
> now ask all your 10.3 review websites whether they used 10.3a or the real whql build



Thank you W1z, and I appreciate you going through the trouble.  2% really puts all those people trolling and bitching in their place...

Really makes a difference, and really shows how little ATi has done with their drivers over the time the cards have been out.

I'm really looking forward to your review of the GTX470, and would have preferred your time be spent reviewing that rather than redoing ATi cards with drivers that make no real difference...


----------



## Mussels (Mar 27, 2010)

i think i missed the pages of trolling, .. i didnt read the last 10 pages before i posted mine.


I think we need to organise a beer fundraiser for w1zzy.


----------



## jagd (Mar 27, 2010)

Dont be emo and thanks 

http://www.semiaccurate.com/forums/showpost.php?p=37483&postcount=17



W1zzard said:


> added your oh so precious catalyst 10.3 scores for the 5870 ...


----------



## TheMailMan78 (Mar 27, 2010)

newtekie1 said:


> Thank you W1z, and I appreciate you going through the touble.  2% really puts all those people trolling and bitching in their place...
> 
> Really makes a difference, and really shows how little ATi has done with their drivers over the time the cards have been out.



The only thing I will guess is those "15%" increases are in some sections of the game. Not over all. Anyway thats my best guess.


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> The only thing I will guess is those "15%" increases are in some sections of the game. Not over all. Anyway thats my best guess.





Did you look back through the review? 2% is the average difference.

In some games you get that full 15% (and in some other games you get as much as 20%).


It's important to look at all the data, not just the conclusions.


----------



## W1zzard (Mar 27, 2010)

pantherx12 said:


> Also I realise I said I wouldn't post again



i was so looking forward to that


----------



## Mussels (Mar 27, 2010)

W1zzard said:


> i was so looking forward to that





I'm so out of this thread.


----------



## pantherx12 (Mar 27, 2010)

W1zzard said:


> i was so looking forward to that



It's a pleasure to disappoint 


In all seriousness, what's the problem though? We have a difference of opinion, and that is it.

Can't we all just get along


----------



## TheMailMan78 (Mar 27, 2010)

pantherx12 said:


> Did you relook through the review? 2% is the average difference.
> 
> In some games you get that full 15% ( and in some other games you get as much as 20%)
> 
> ...



And in some cases it got LESS FPS. So yeah running the 10.3 was a waste of time.


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> And in some cases it got LESS FPS.




And that's the precise reason why current drivers should be used, as it gives you the most accurate current results.


----------



## newtekie1 (Mar 27, 2010)

TheMailMan78 said:


> And in some cases it got LESS FPS. So yeah running the 10.3 was a waste of time.



Yep, an average of 2% means that overall it was only 2% better.  If it was 15% better in one game, it must have been correspondingly worse in others...  Either way, the whole driver argument was a waste of time for the most part, as no one will notice a 2% performance difference.
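To illustrate the point with a sketch (the per-game numbers here are made up, not figures from the review): a small overall average is perfectly consistent with a couple of games gaining 15-20% while most others move little or even regress slightly.

```python
# Hypothetical per-game FPS changes between driver versions (fractions,
# not data from the review): two big winners, the rest roughly flat or
# slightly worse.
gains = [0.20, 0.15, 0.01, -0.02, 0.00, -0.01, -0.03, -0.04, -0.05, -0.01]

average = sum(gains) / len(gains)
print(f"average uplift: {average:+.1%}")       # a ~2% overall average
print(f"best single game: {max(gains):+.0%}")  # +20% in one title
```

So both sides of the thread can be right at once: a real 15-20% jump in a favorite title, and a near-invisible change on the overall average.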


----------



## mlee49 (Mar 27, 2010)

Is there a dual-GPU 4xx series card coming out?  Check this pic from EVGA's YouTube video of PAX 2010 and tell me that doesn't look like a dual-GPU water-cooling block:


----------



## TheMailMan78 (Mar 27, 2010)

pantherx12 said:


> And that's the precise reason why current drivers should be used, as it gives you the most accurate current results.



Was 2% worth all the drama? Did it bring it closer to beating Fermi? Come on man, this was all fucking pointless. All that 2% did was piss W1zz off and made NO REAL difference in the outcome, other than boosting your ego that you might have some kind of pull on this site. It's stupid. I hate when this kind of crap happens. It takes away from the meat and potatoes.


----------



## btarunr (Mar 27, 2010)

mlee49 said:


> IS there a dual gpu 4xx series card coming out?  Check this pic from Evga's youtube video of PAX 2010 and tell me that doesnt look like a dual gpu watercooling block:
> 
> http://img.techpowerup.org/100327/Capture001.jpg



Nah, check the EVGA news post. It's just a GTX 480 full-coverage block from Swiftech.


----------



## TheMailMan78 (Mar 27, 2010)

Here does everyone feel better now?

Good lets get back to Fermi.


----------



## newtekie1 (Mar 27, 2010)

mlee49 said:


> IS there a dual gpu 4xx series card coming out?  Check this pic from Evga's youtube video of PAX 2010 and tell me that doesnt look like a dual gpu watercooling block:



I don't think that is a dual-GPU block.  However, I wouldn't be surprised if a dual GTX470 card came out to at least match the HD5970, from what I've seen it should be possible.  I doubt a dual GTX480 is possible right now though.


----------



## TheMailMan78 (Mar 27, 2010)

newtekie1 said:


> I don't think that is a dual-GPU block.  However, I wouldn't be surprised if a dual GTX470 card came out to at least match the HD5970, from what I've seen it should be possible.  I doubt a dual GTX480 is possible right now though.



Dude with those temps I would be surprised.


----------



## newtekie1 (Mar 27, 2010)

TheMailMan78 said:


> Dude with those temps I would be surprised.



The GTX470 seems to get the same temps as the GTX480, but has a much weaker heatsink, with a smaller and quieter fan.  I think it would be possible for a dual-GPU card based on GTX470 with some beefy cooling.


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> Was 2% worth all the drama? Did it bring it closer to beating Fermi? Come on man this was all fucking pointless. All that 2% did was piss W1zz off and made NO REAL difference in the outcome other than boosting your ego that you might have some kind of pull in this site. Its stupid. I hate when this kinda crap happens. It takes away from the meat and potatos.



To me, yes; accuracy in a review that is obviously trying to be accurate is paramount.

It also hurt W1zz's rep, regardless of what we as individuals think; that's certainly important to consider as a professional reviewer, is it not?


----------



## TheMailMan78 (Mar 27, 2010)

pantherx12 said:


> To me yes, accuracy in a review that is obviously trying to be accurate is paramount.
> 
> It also hurt W1zz rep regardless of what we as individuals think, that's certainly important to consider as a professional reviewer is it not?



WTF, how does it hurt his rep? He's ALWAYS done reviews this way. I call BS on any site that claims to have done a PROPER review with 2-day-old drivers. I mean really, man. You lose on this one.


----------



## cdawall (Mar 27, 2010)

I'm kinda happy I found a 4870X2 and a 4850X2; those together should beat a pair of 480s, and I don't need to rent a 12V generator to get my PC to turn on. 320W under load is a wee bit ridiculous; hell, I remember when I had an FSP 250W add-on PSU that can't even power one of these cards :shadedshu

After so much delay this card should have wiped the floor, new drivers or not. The 5970 performed better, and the 5870 threw a good stab at it for less money and power consumption; I think it's the better card. Hell, with an OC the 5870 may actually beat the 480 OC'd. I just think my almost-year-old 4870X2 should have been destroyed by both the 5870 and the GTX480, not kept up with them in some games and beaten them in others. That is stupid.

Maybe Nvidia will fix Fermi, as it seems the 480 is already a failed version of a bigger card with its core partially shut down. I assume that means the rumors of fab issues were true and still exist. Maybe a smaller, less complicated die would help; 1B more transistors than ATI might be an issue. Oh well, a GTX490 with a fully unlocked GPU should do well against the 5970; I doubt it will beat it when both are OC'd, but they will do well against each other.


Oh, on a lighter note, FSP sells a 2000W PSU that will be perfect for SLI with 3 of these cards


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> WTF how does it hurt his rep?




As I said, regardless of what we think, people would have seen him using old drivers and straight away dismissed the review; you can see that in this very topic.


It is not opinion, it is fact that doing it hurt his rep. How much? Probably not a lot, but damage is damage.


----------



## TheMailMan78 (Mar 27, 2010)

newtekie1 said:


> The GTX470 seems to get the same temps as the GTX480, but has a much weaker heatsink, with a smaller and quieter fan.  I think it would be possible for a dual-GPU card based on GTX470 with some beefy cooling.



The power draw, if you're right, wouldn't be worth it IMO. I think Fermi is Nvidia's stepping stone, much like the 2900 was for ATI. I'm willing to bet the next release from Nvidia will be epic.



pantherx12 said:


> As I said, regardless if what we think. People would of seen him using old drivers and straight away dismissed the review, you can see that in this very topic.
> 
> 
> It is not opinion, it is fact that doing it hurt his rep, how much? probably not a lot but damage is damage.



And those people don't have a clue. What would you rather have: FACTS, or a bunch of made-up shit to sound cool? Yeah, that's what I thought.


----------



## newtekie1 (Mar 27, 2010)

TheMailMan78 said:


> The power draw if your right wouldn't be worth it IMO. I think Fermi is Nvidia stepping stone much like the 2900 was for ATI. I'm willing to bet the next release will be epic from Nvidia.



It'd only be about 60W more than an HD5970, judging by the ~30W more the GTX470 draws over the HD5870.  And the GTX470's clocks could also be lowered if they are just after matching the HD5970, since the HD5970 has lower clocks than an HD5870.



TheMailMan78 said:


> And those people dont have a clue. What would you rather have. FACTS or a bunch of made up shit to sound cool? Yeah thats what I thought.



I think the driver issue has been talked about enough.  Everyone can have their own opinion, the two sides of the argument have been stated, there is no need to continue to re-hash it.  If we don't have comments about the card, don't post.


----------



## pantherx12 (Mar 27, 2010)

TheMailMan78 said:


> And those people dont have a clue. What would you rather have. FACTS or a bunch of made up shit to sound cool? Yeah thats what I thought.




.... How does that even have any relevance ? 


Lets call it a day anyway, clearly getting wires crossed.


----------



## crow1001 (Mar 27, 2010)

You would have seen a 15-20% increase in Dirt 2 if you ran it in DX11, just saying... Unigine 2 has not been benched with 10.3 either, from what I can tell.


----------



## Yellow&Nerdy? (Mar 27, 2010)

Lol, I didn't even notice the driver "issue" before I started reading this thread.

Doesn't bother most of us, I would believe. 2% overall isn't worth an extra round of work for the reviewer. Looking at the number of cards and the games they were tested in, and thinking about the amount of work involved, makes me feel kind of overwhelmed, but also thankful that I can get all these results with just a couple of clicks. That's how I see things. There will always be people that don't realize the amount of work that must go into making a good review, and nitpick about minor flaws that in the end don't matter. The difference could be 0.1%; the fans/trolls will always strike


----------



## Steevo (Mar 27, 2010)

Wow, what a bunch of fucking assholes. Wipe the snot off your blubbering vaginas and go get bent with your own review if you don't like this one. It shows an even comparison at low through high game settings and resolutions, plus the temps, overclocks, and power draw with the most played modern games, on a decent system that a good gamer would use to play on. 


I benched Heaven 2.0 on 10.3 drivers overclocked, and there is a thread with many other users running stock clocks posting results, so use search.


----------



## DOM (Mar 27, 2010)

W1zzard 

You do nice reviews  I really don't even look at other sites' reviews, cuz yours cover new and older tech, plus all the resolutions you run them at, so my hat's off to you.

I was waiting for your review to decide which card I would be upgrading to; thanks for your time doing what you do. Don't let the dumb ppl get to you, there's always going to be haters out there finding reasons to bitch.


----------



## 20mmrain (Mar 27, 2010)

*Did you guys see this????...... I was looking at some of these cards and I noticed that the EVGA FTW editions run so hot they require a water block. This can't be a good sign!

http://www.evga.com/articles/00539/

Maybe I have that wrong and they are just special-edition GTX480/470s, and they will come out with a non-water-blocked FTW version. But I don't think I am wrong about this.

BTW Nvidia, what is going to stop us ATI owners from just overclocking a little bit and matching the power of the GTX 480? Then we could save ourselves $500 bucks. *


----------



## @RaXxaa@ (Mar 27, 2010)

It didn't look very impressive, kinda the same performance as other cards around. Now they're gonna just tweak some clocks here and there and call it a GTX 485 that's as fast as the 5890, then make it a bit better and call it a 495... what's the use


----------



## ShiBDiB (Mar 27, 2010)

We <3 u wizz... just ban all the ati fanboys from the site and we'll all be happier.


----------



## erocker (Mar 27, 2010)

Wow, do people get defensive about the products they own. Great review on Fermi; I could care less what drivers are used for the other cards. If I'm interested in a review, I'm interested as a potential buyer of the reviewed product. From this very good review I came to the conclusion that my current video card setup is better than this card, and I won't be changing anytime soon.

Anyways, many people all over the world have a false sense of entitlement and a false sense of the importance of their opinion; you see, when they were young their mommies told them "they can do anything" and gave them anything they wanted. Now they sit behind their computers all day with overinflated egos and are as useful as a bucket of dirt. People think they matter on the internet. It's all just blablablablablablablablabla. 

What is really sad is most people don't know how to criticize constructively. Sitting back in their fake leather chairs, eating Doritos and drinking Mountain Dew, staring at a computer screen, there is no time to learn to be a real critic. Words like, OMG, FAIL, WTF, etc. are the extent of their intelligence.

Great review that includes a wide variety of video cards to compare from a reviewer that I trust. Thanks for all the hard work to give us this free and valuable information!


----------



## Wrigleyvillain (Mar 27, 2010)

Hey w1z I'd love to buy it! Only $15M, right? 

Seriously, please don't stop doing reviews. But these fanboys are aggravating toolboxes all right. It's just a freakin' video card you losers.


----------



## nt300 (Mar 27, 2010)

tkpenalty said:


> performance crown still goes to AMD at the moment... the card is terribly dissapointing when you consider how long nvidia has taken with it. Not really their fault, more like TSMC's fault. I don't think nvidia even contemplated that that silicon would run so hot.
> 
> They dont really need this card to sell however since they're making more money with tegra and other lines (hint: the cockroach GPU).


More like a design flaw. They did something wrong, because re-spinning didn't seem to help, hence the 6-plus months of delay.

Fermi's performance is not all that great when you consider price and performance. Nvidia can easily fix that by lowering the price tag, but the true problem is power consumption; many don't want a card that sucks back so much power. 

This is bullcrap, I tell you.

Sources I trust for reviews:
TechPowerUp & Anandtech 

*Temperature, Power, & Noise: Hot and Loud, but Not in the Good Way*
http://www.anandtech.com/video/showdoc.aspx?i=3783&p=19
*Final Thoughts*
http://www.anandtech.com/video/showdoc.aspx?i=3783&p=20


----------



## 20mmrain (Mar 27, 2010)

Wrigleyvillain said:


> Hey w1z I'd love to buy it! Only $15M, right?
> 
> Seriously, please don't stop doing reviews. But these fanboys are aggravating toolboxes all right. It's just a freakin' video card you losers.



In all seriousness.....Wizzard..... it's because you can't stop every bad egg from entering. Young people especially have a hard time telling constructive criticism from non-constructive criticism. 
It's one thing to make a joke and have fun with your comments..... like I see a lot of responsible people doing. 

But it's another to see people trolling and making bad and mean comments with no merit behind them.

Maybe a good thing to do would be to have a meeting with your moderators about it...... and try to find a way to stop it.

Because when it starts getting to you that much, it is time to do something to stop it.

Keep up the good work!!!


----------



## TheMailMan78 (Mar 27, 2010)

erocker said:


> Most likely 5 series owners getting their undies in a bundle. Great review on Fermi, I couldn't care less what drivers are used for the other cards. If I'm interested in a review, I'm interested as a potential buyer of the reviewed product. Through this very good review I came to the conclusion that my current video card setup is better than this card and I won't be changing anytime soon.
> 
> Anyways, many people all over the world have a false sense of entitlement and a false sense of the importance of their opinion. You see, when they were young their mommies told them "they can do anything" and gave them anything they wanted. Now they sit behind their computers all day with overinflated egos and are as useful as a bucket of dirt. People think they matter on the internet. It's all just blablablablablablablablabla.
> 
> ...



This post is teh failz.


----------



## thebluebumblebee (Mar 27, 2010)

W1zzard said:


> i think i'm done ..



*NOOOOOOOOOOOOOOOOO*

Seriously W1zz, I EXPECT your reviews to be the same.  Boringly similar, even.  Same format.  Same tests.  Same thoroughness.  Same VGA-card-only power usage.  I like the fact that I can go back and look at one of your older reviews and see the same info (except for that whiz-bang power meter thingy, which is fine), and I've never picked up a hint of bias in your reviews.  I open two sites when I get up in the morning: a news site and TechPowerUp!



W1zzard said:


> now ask all your 10.3 review websites whether they used 10.3a or the real whql build



Testing with beta drivers WOULD be a FAIL.  Driver updates are always "over promise, under deliver".

Now, back to what this thread is SUPPOSED to be about, the GF100.  Is it an advancement of GPUs?  Did anyone think the 2900XT was?  Think of it this way.  A car company brings out a new car that transports 5 people in comfort, does 0-60 in 2.1 sec, does the 1/4 mile in 10 sec at 150 mph, and has safety features that wow everyone.  The only thing is, it uses gas like a 1970 New Yorker.  Would the car even be considered for car of the year?  The GF100 is built on the 40nm process, which led us to expect lower power consumption, but 3.2 billion transistors need to be fed.  And those 2D clocks...  
I'm looking forward to the GF104 cards.  It will be interesting to see how the GTS450 compares to the GTX285.


----------



## dr emulator (madmax) (Mar 27, 2010)

W1zzard said:


> i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interested in buying a tech site ?



Just post your reviews, boss, then close the thread.

I really can't believe the disrespect there has been here.
Wow, how some of you are still here is beyond me.
Come and give me some stick over at General Nonsense instead.


----------



## nt300 (Mar 27, 2010)

W1zzard said:


> i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interested in buying a tech site ?


Criticism is all part of the game, W1zzard. You just keep doing what you do best on the net, and that is trustworthy and highly honest reviews. Let those feather dusters talk away.
There's a cycle: one day ATI wins the day, and the next day Nvidia wins the day.

People complaining and criticizing is nothing new, it happens all the time. In regards to older, new & beta drivers, reviewers have their reasons for choosing any of them. I've seen many review sites update the review in question with new driver releases as they get released day by day, week by week, month by month.  Could it be Nvidia or ATI lining some pockets, perhaps 

All in all, keep up the good work, nobody wants to see you go 

Hey, never thought to use the WORD spell checker before


----------



## jessicafae (Mar 27, 2010)

I think some of the most interesting things about the GTX470/GTX480 (Fermi) are:
1) it does compete nicely with the 5850 and 5870, but needs to "run hot" to do it.
2) the architecture looks really clean and scalable into the future.
3) the "problems" are heat and power (not architecture), and maybe lower clocks than expected (because of the heat and power issues).
If 28nm is a clean process, these "problems" could go away and Fermi could have a major performance jump (without any major re-architecture).  But I think Fermi shows most of NVidia's architecture plans for a couple of generations. ATI's 6xxx will be their "new" architecture. The 28nm generation battle could have a lot more surprises.

PS: I don't know why everyone is so bent out of shape over this review.  w1zzard's reviews are clean and simple and as close to "scientific" as one can get. I mean, who else includes so many old games with >100fps in their averages?  By using so many games the "biases" average out much better.  And with so many graphs, if someone is interested in only one game at one resolution they can check that out and make a choice based on it. And the performance/price and performance/power graphs show real numbers, not just someone's "feeling" in a summary paragraph.  If w1zzard stopped reviewing it would be a real loss for the community.  OK, that's my 2 cents...


----------



## nopower09 (Mar 27, 2010)

W1zzard said:


> i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interested in buying a tech site ?



Some people complain, some say great job. Why, because of some people who complain, would you leave the people who still need you, Boss? 
I joined this forum because of this Fermi review, and after reading some other good reviews (like the Sapphire 5770, which I bought because of your review).


----------



## nt300 (Mar 27, 2010)

nopower09 said:


> Some people complain, some say great job. Why, because of some people who complain, would you leave the people who still need you, Boss?
> I joined this forum because of this Fermi review, and after reading some other good reviews (like the Sapphire 5770, which I bought because of your review).


*WELCOME to TECHPOWERUP* 

I have a question for everybody posting in this thread: why are there about 380 posts in this thread but only *13 Diggs?* You can find the 13 Diggs on W1zzard's 1st post. 

*Come on people, let's DIGG the review, OK?*
http://forums.techpowerup.com/showthread.php?t=117929


----------



## jasper1605 (Mar 27, 2010)

shevanel said:


> I think w1z's use of 9.12 was a great way to compare the gtx 480 at launch to the 5870's performance at launch. So what if he didn't use 10.3... you don't think the drivers released from nvidia 3 months from now will make that bench obsolete?
> 
> The review was perfect at comparing the gtx 480 to the competition as if they were both released at the same time.
> 
> NOW DIGG IT or TURN in YOUR TPU BADGE!



Well stated.

Compare launch to launch to get an idea of equally matched card power.  Drivers update and change and add li'l bits of performance, but to get a fair COMPARISON of cards, drivers should be matched up as closely as possible.  

It'd be like taking a Hennessey Viper Venom 1000TT and racing it against a Corvette C5-R.  The Viper is more powerful, yes, but what if it had to race on E85 ethanol (which it can't, but work with me here lol) and the C5-R had 110-octane Jet-A fuel?  I reckon the C5-R would look a lot better than it would if both were using the same grade of fuel.

Good job, wizzard.  As a noob looking up to a genius... I salute you


----------



## [I.R.A]_FBi (Mar 27, 2010)

Yes, I'm late to the party, but give wizz some slack; I couldn't read further after you guys ragged on his work.

Old  Mar 19, 2010, 12:30 PM 

That's when he posted the reviews and, I guess, hid them. I think you guys are being too hard on him.


----------



## jaredpace (Mar 27, 2010)

Alright, I've been reading through this thread for a bit now.  Just want to say a few things.  W1zzard & Btarunr are awesome.  This site rocks and has always had great reviews, and is considered one of the best on the web.  The performance per dollar, performance per watt, and all the added bonuses to more recent and this latest gtx480 review that take up plenty of time are very nice to have.  Such a comprehensive review takes plenty of hard work and man hours, no doubt.  I consider them perks to visiting this site.  Don't take everyone's criticism of this review as hate.

(W1z, please don't give up this awesome work, go relax, buy an orange lambo or something JK).  

However, as many others have already mentioned, this review sucks.  There is too much content to do justice to in a week or two of preparation.  This is the biggest review TPU or any GPU hardware review site has published in the past 9 months or will publish in the next 9.  It's obvious who prepared and who did not.  The most important review in a year and a half.  9.12 on all the older ATI benches skews the output of all the wonderful new bonus review content, making the review seem VERY outdated compared to the rest of the web's review content.  The 10.3 preview came out in early Feb, 10.3 final came out March 2nd, 10.3a came out March 13th, the OpenGL 4 preview March 23rd, and 10.3b March 25th.  Why put all the effort into the added features and not use the newest drivers?

So you can listen to this thread's and other "places on the interweb"s criticism of this huge important TPU review, and moan right back.  
_Or_ quit your job as you said earlier and sell the website for cash.  
_Or_ just leave it as it is: a very thorough review done with old drivers and old benchmarks.  
_Or_ amend some benches within the review (which seems to be happening today).
_Or_ redo a current, modern review in the TRADITIONAL TPU BADASSEDNESS with only the newest and most relevant drivers, benches, games, cards, resolutions, and settings.  
Take all the time you want and release it when you're good and ready; the review doesn't have to include Riva TNT2 Ultras, Voodoo Banshees, and 7600GSs at 640x480.  Your fanbase isn't going anywhere anytime soon, but a dozen or so more old-ass reviews will surely deter some of your readers, as evidenced already in the first 14 pages of this thread.  

*Pros* of the review:  Excellent written content, great photos, Wizzard's overclocking section, his high-definition photos of the PCB, 4 methods of power consumption tested directly on the card, a thorough card disassembly, his actual measured voltage readings, his additions to and creation of the GPU-Z database, and the most games and most cards tested (although a drawback as time progresses and drivers are 4 months old).  I love seeing hundreds of games tested at 5 resolutions on 30 cards, and the relative performance graphs (but face it, that probably isn't possible every time without a month or more of preparation).  

*Cons*:  Old drivers.  (Should be 10.3, 10.3a, 10.3b)

Anyway, cheers, W1zzard & Btarunr.  This site is excellent, and the forums are packed with laid-back people.  It's really cool to see you as active members of the forums with good tech knowledge.  It's cool to see you on the TPU and XS forums.  And your contributions go way back.  GPU-Z rocks.  RBE & BAGGZLASh rock, and the popular and current news section is great.  The community is cool, the reviews are good, and I'm sure there are dozens of other great attributes to TPU I've yet to explore.   I'm glad to be a member here and recommend it to lots of people.  It's a good place to be, even if this review is getting the criticism it deserves.  Thanks for all your hard work over the years.  Not enough people give you praise.  Regretfully, I never praise anyone or anything, and here I am, in conclusion, sending (IMO a great deal of) praise to you.  Take that for what it's worth.  

Cheers 

-Jared


----------



## jasper1605 (Mar 27, 2010)

nt300 said:


> *WELCOME to TECHPOWERUP*
> 
> I have a question for everybody posting on this thread, why is there about 380 posts in this thread but only *13 Diggs?* You can find the 13 Diggs on W1zzards 1st post.
> 
> ...



consider it dugg.  I created a digg account just to bump it up one!


----------



## johnnyfiive (Mar 27, 2010)

I just wanted to say, GREAT REVIEW, w1z. I know exactly what it's like (first hand) to have minimal time to get a review done. We had issues at Bjorn3D with our review sample (which I won't go into), but thanks to NVIDIA for stepping up and handling things, the review was finished and out on the 26th. I felt so bad for our guys reviewing it though; they were up soooooo late because of the crazy time constraints. People who don't do reviews won't really understand the time and effort that goes into them, especially quality ones like yours, w1z. It takes me days to do what most people think are simple reviews; the reality is it takes many hours to write, take pics, run benches, and complete a solid quality review. So for everyone complaining about a simple typo, you really need to stop the bitching and read before you go on a rant-typing frenzy.

Anyway, great review, w1z; your reviews are always solid and very informative. My only suggestion would be to implement games like BC2, STALKER CoP, etc. But overall, great work as always, especially considering the short time I'm sure you had to work with.


----------



## Hayder_Master (Mar 27, 2010)

Complete review, w1zzard; excellent work, and a nice move to put the Heaven benchmark and Metro 2033 in the test.


----------



## TheMailMan78 (Mar 27, 2010)

I have a review request for NEXT time. Could you include Bad Company 2? It's also DX11 and uses its own engine.


----------



## DrPepper (Mar 27, 2010)

I love how he says he's quitting and everyone changes their tune. Seriously, reviews are hard to do and very boring. The amount of effort that goes into each review is ridiculous. It took me about 3 hours to do a quick bunch of memory benchies that barely filled one post; now look at what w1zz has done. Not to mention keeping this site afloat, and what he gets in return is bitchin'. If w1zz did quit I wouldn't be surprised.


----------



## johnnyfiive (Mar 27, 2010)

*For the people who want Bad Company 2 GTX 4xx numbers, here ya go*
http://www.bit-tech.net/hardware/2010/03/27/nvidia-geforce-gtx-480-1-5gb-review/9


----------



## DrPepper (Mar 27, 2010)

johnnyfiive said:


> *For the people who want Bad Company 2 GTX 4xx numbers, here ya go*
> http://www.bit-tech.net/hardware/2010/03/27/nvidia-geforce-gtx-480-1-5gb-review/9



I think the performance between the 5870 and GTX480 is too close, tbh.


----------



## stinzza (Mar 27, 2010)

Showing the best of ATI to the newcomer from nvidia is logical... show the newcomer what he's fighting against today!


----------



## stinzza (Mar 27, 2010)

I saw the 10.3 update now... my bad, and nice, Mr. Wizz... funny in some tests though... hehe


----------



## nt300 (Mar 27, 2010)

DrPepper said:


> I love how he says he's quitting and everyone changes their tune. Seriously, reviews are hard to do and very boring. The amount of effort that goes into each review is ridiculous. It took me about 3 hours to do a quick bunch of memory benchies that barely filled one post; now look at what w1zz has done. Not to mention keeping this site afloat, and what he gets in return is bitchin'. If w1zz did quit I wouldn't be surprised.


Reading your post and looking at your AVATAR, I swear I can hear HOUSE talking to me.


----------



## pjladyfox (Mar 27, 2010)

W1zzard said:


> i'm not testing a beta driver that appears right before the launch of the biggest competitor. you're still not happy with 10.3 whql ? apparently all of the 9.12 naysayers fell for amd's propaganda of 15% faster drivers



I'm coming into this kind of late, but I'd like to say, for the record, thank you for all of your hard work on this. After getting introduced to this site by Wile E and others, this has pretty much been my first stop for reviews as well as one of the best communities to be a part of. While I was confused by the 9.12s being used at first, I also considered that the 10.3s had really only been out a VERY short time, and it just would not have made sense to re-test everything and delay the review.

Do not let all of the groaning over this decision make you quit. I've seen what happens to a website when one of its founders goes away, and it's painful to watch. 

So, please, don't go?


----------



## btarunr (Mar 27, 2010)

tkpenalty said:


> The performance crown still goes to AMD at the moment... the card is terribly disappointing when you consider how long nvidia has taken with it. Not really their fault, more like TSMC's fault. I don't think nvidia even contemplated that the silicon would run so hot.
> 
> They don't really need this card to sell, however, since they're making more money with Tegra and other lines (hint: the cockroach GPU).



Don't blame TSMC, it's not their job to design GPUs, but to just take designs and manufacture them. This is the same company which manufactures AMD Cypress, which has awesome thermal characteristics. If GF100 fails with its thermals, it's because NVIDIA stuck to its up-the-arse "heil monolithic" philosophy, and the way it went about executing it.


----------



## dir_d (Mar 27, 2010)

btarunr said:


> Don't blame TSMC, it's not their job to design GPUs, but to just take designs and manufacture them. This is the same company which manufactures AMD Cypress, which has awesome thermal characteristics. If GF100 fails with its thermals, it's because NVIDIA stuck to its up-the-arse "heil monolithic" philosophy, and the way it went about executing it.



This is true...

On a side note, why can't Stanford make the 5 series cards scream like the BOINC guys have? I really feel that if the 5870 could fold like it performs under BOINC, the GTX 480 would be irrelevant altogether until another revision or 28nm.


----------



## W1zzard (Mar 27, 2010)

my personal paris pics when i was there for the nvidia press event:
http://www.techpowerup.com/wizzard/paris10/


----------



## Tatty_One (Mar 27, 2010)

Great review. I was looking at 2 others before I saw this one; most of their tests were run at 20XX res with only 3 or 4 games. I wonder why?  I'll leave you all to guess. This one, as ever, is very thorough, completely unbiased, and, for TPU members who follow all of W1z's reviews, an easy ready reference and comparison to all the other GPU reviews.

Now, many people can request that he do different things or use different games or different drivers, etc. etc.; damn, let's all start a thread on what colour underwear he should wear next time he does one.... ffs, be happy it's unbiased and honest. There are fewer and fewer sites that genuinely give completely unbiased reviews; let's not lose the best of them!  Kind of reminds me of my oldest daughter's 18th birthday: I gave her £2000 to buy her first car, and she gave me a hard time because she thought she was getting £3000!  In the end she didn't get the car..... I wonder why 

Edit:  If we do vote for a colour of underwear I'll go for Pink with green spots.


----------



## Scrizz (Mar 27, 2010)

qubit said:


> Nice, It's got the performance crown, I want one!
> 
> Thanks for another great review, W1zzard.
> 
> ...



lol, DIE shrink.
Look at all the problems they've had already.


----------



## wahdangun (Mar 27, 2010)

W1zzard said:


> i think i'm done .. every time there is a review you people have something to complain about .. suddenly lost all motivation to do any further vga reviews ... anyone interested in buying a tech site ?



Don't do that, wizz, I really like your reviews. Maybe this was the result of a long time without a fanboy war; some people just felt they'd had enough, let out whatever they had, and wanted to find something to blame.


----------



## LAN_deRf_HA (Mar 27, 2010)

W1zzard said:


> my personal paris pics when i was there for the nvidia press event:
> http://www.techpowerup.com/wizzard/paris10/



That's actually pretty neat, I've never seen the tower from the underside before. What's the name of the angular building in the last 2 pics?


----------



## eidairaman1 (Mar 27, 2010)

DrPepper said:


> I think performance between the 5870 and GTX480 are too close tbh.



It took a large core for NV to gain anything, where AMD did it with a smaller core. Besides, you can scale the AMD part higher.  Beyond that, running cooler and drawing less power are a must in today's age, when energy prices are going up everywhere.


----------



## Tatty_One (Mar 27, 2010)

It's simple for me; I don't take sides. I pay the bills, so electricity usage is my problem; I'll knacker the wife's hairdryer and hide the food mixer to make up the bill.... However, roughly a 10% performance gain across the board but more than a 20% premium on price in the UK over the 5870, I'm afraid, IS fail for me.


----------



## wahdangun (Mar 27, 2010)

Thanks, wizz, for the Cat 10.3 driver benches.


On another note, why didn't you test the games that have native DX11 support (like Dirt 2, BattleForge, AVP, STALKER CoP) instead of using the DX9 path?


----------



## Kitkat (Mar 27, 2010)

Wizard, your reviews are great; don't get caught up in the BS, dude, keep at it. lol, I don't care what driver set you tested; it was in the middle of a driver beta-release cycle (which is suddenly new, ATI going public with betas for everyone; guess they got tired of the leaks). Totally not your fault. They are looking for some showdown that won't happen for like 5 months to a year 
(driver gain by driver gain, which will be an even longer battle with these monster cards).
Don't cast stones at the wizard for bringing you the news. And don't hate on Fermi; it is what it is. 

(The sad part is you could have had a much cooler version of it a long time ago, and have been waiting for nothing  )  Burn.


----------



## Para_Franck (Mar 27, 2010)

I finally have my answer: my next card will be a 5850. The performance per watt of these new Nvidia cards is just not acceptable for me. I really did not know what to expect here, but I wanted to wait and see how it would turn out. While I would have liked to see 5850 prices drop a little bit more in that time, I now know exactly what card I want.

These GTX 4XX cards are fast cards with a great set of features and all, but I want more efficiency, as I do for my car and my snowmobile (thank you, Ski-Doo, for those clean and efficient E-TEC and ACE engines).


----------



## TheMailMan78 (Mar 27, 2010)

newconroer said:


> Out of the whole thing, this was the relevant part :
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_1920_1200.gif http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_2560_1600.gif



Why was that more relevant than anything else? All I see is a horribly unoptimized game.


----------



## newconroer (Mar 27, 2010)

wahdangun said:


> thanx wizz for cat. 10.3 driver benches.
> 
> 
> in other note, why didn't you test game that have native DX11(like : dirt2, battle forge, AVP, stalker CoP) support instead you use DX 9 path?



Yes, why are we still getting copy/paste titles and results from engines that are several years old now, ending up with reviews where only one game (Crysis, no surprises there) actually stresses the cards? I don't know, but at least we did get something relevant and useful this time around... :


EDIT: @Mailman

It's a hell of a lot more relevant than ten-plus games that run at 100+ fps and tell us absolutely nothing about minimum frame rates. At least Metro 2033 is a current game with current tech that the card(s) can make use of, and as I previously said, GTX performance will probably be on par with the current 5XXX cards but have the edge in some of the newer DX11 features. Which means it's a single-card solution that's competitive and stronger in the future-proofing department.


----------



## TheMailMan78 (Mar 27, 2010)

newconroer said:


> Yes, why we're still getting copy/paste titles and results from engines that are several years old now, and end up with reviews where only one game(Crysis, no surprises there) makes the cards actually stress - I don't know, but at least we did get something relevant and useful this time around ... :
> 
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_1920_1200.gif http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_2560_1600.gif
> ...



And you do understand all of those "old games" use engines that are still in use today in other games. Other than Metro, show me one game that uses the same engine... yeah, I thought so.


----------



## shevanel (Mar 27, 2010)

31.3 fps doesn't speak volumes about the game, especially since it has nvidia written all over the box.

Seems poorly optimized?


----------



## eidairaman1 (Mar 27, 2010)

TheMailMan78 said:


> And you do understand all of those "old games" use engines that are still in use today in other games. Other than Metro show me one game that uses the same engine......yeah I thought so.



TBH, THQ is not a big company when it comes to PC gaming; they are mainly a budget outfit for console games, so that tells me this game is unoptimized for PC use. Midway is the same way; UT3 wasn't as good as 2004 or '99.


----------



## TheMailMan78 (Mar 27, 2010)

shevanel said:


> 31.3 fps doesn't speak volumes about the game, especially since it has nvidia written all over the box.
> 
> Seems poorly optimized?



Um, yeah, but don't burst his bubble.


----------



## shevanel (Mar 27, 2010)

I don't remember seeing any Batman: AA benches... I probably overlooked them, but that title might be a good way to see relative performance across other NV cards.


----------



## TheMailMan78 (Mar 27, 2010)

shevanel said:


> I don't remember seeing any batman AA benches.. I probbaly overlooked but that title might be a good way to see relative performance across other nv cards.



Now that seems like a good point. However, given its "Batman-gate" history, people would go nuts if W1zz used it. Maybe just use it in Nvidia reviews? The games I would like to see are the following.

1. BC2, because of the HUGE following it has on TPU.
2. Anything that uses the Source engine.
3. The id Tech 5 engine, when it hits the market.


----------



## shevanel (Mar 27, 2010)

http://www.techspot.com/review/263-nvidia-geforce-gtx-480/page8.html


----------



## TheMailMan78 (Mar 27, 2010)

shevanel said:


> http://www.techspot.com/review/263-nvidia-geforce-gtx-480/page8.html



I call BS on that review. They have a single 5770 beating a 5870 in MW2.

Edit: I misread it. lol. BUT DAMN, two 5770s beat the 480?!?


----------



## shevanel (Mar 27, 2010)

Has BC2 benches:

http://www.legionhardware.com/articles_pages/nvidia_geforce_gtx_480_fermi_arrives,4.html


----------



## DrPepper (Mar 27, 2010)

eidairaman1 said:


> Took a Large Core from NV to gain anything where AMD did it with a smaller core. Besides you can scale the AMD part higher.  Beyond that Cooler, less power draw are a must in today's age when fuel prices are going up everywhere.



The design doesn't matter to me, really. Despite the fact that nvidia's core is ridiculously huge, I mean WTF huge, the only things that matter are that it has a good performance/price ratio, runs at acceptable temps (95 degrees C and down), and has decent power draw. I don't see Fermi as a fail like others do, but I certainly won't be getting one.


----------



## DrPepper (Mar 27, 2010)

pantherx12 said:


> If it wasn't for the limited cooling options the 5770s are fucking insane cards man
> 
> Mines currently keeping up with my old 4890 1gb : ]



I wish you'd never said that. I always dismissed the 5770 because I thought my 4890 was faster.


----------



## boulard83 (Mar 27, 2010)

The GTX480 is a great piece of tech, but it's looking like a beta version to me... overheating and noisy, but with good perf out of the card. 

Running 3-way SLI can pump 900+ watts under load just from the GPUs... Hope your AC is good. 

I'll wait until Nvidia can make Fermi the way it's supposed to be.


----------



## shevanel (Mar 27, 2010)

I had 5770 CF before the 5870 I have now... the 5770s were faster... but that's okay, they won't be faster than 2x 5870 in about 4-5 months


----------



## TheMailMan78 (Mar 27, 2010)

DrPepper said:


> I wish you never said that. I always dismissed the 5770 because I thought my 4890 was faster



A single 4890 is faster than a 5770. TWO 5770s will eat a 4890 for lunch.


----------



## Tatty_One (Mar 27, 2010)

DrPepper said:


> I wish you never said that. I always dismissed the 5770 because I thought my 4890 was faster



It is, don't worry. Well, mine is..... don't know about yours


----------



## ShogoXT (Mar 27, 2010)

Thank you very much for this awesome review, W1z. I am also one of those people who would wait any amount of time to see your finished review. I always come here first, before anywhere else on the net, for the accurate, done-right review. I hope you continue to do your awesome reviews. 

I also see it as the one that is more accurate and right vs. all others. I might not even notice if you scrambled up the results for an April Fools' Day joke, and I might still post and link it to others! 

That said, even though I usually am a slight fanboi, this does not bode well for us all, according to the results. I was really hoping nvidia would come through on this one so we could see competition once again.


----------



## DrPepper (Mar 27, 2010)

Tatty_One said:


> It is don't worry, well mine is..... don't know about yours



Mine is at stock and runs hot as hell. So maybe it's equal to one. Either way, it's fast enough for what I need it to do.


----------



## HillBeast (Mar 27, 2010)

shevanel said:


> i had 5770Cf before the 5870 i have.. the 5770's were faster.. but thats okay they wont be faster than 2x5870 in about 4-5 months



I can second that. My mate has two 5770s and they beat my 5870. Strange that ATI let that slide, considering I think it's cheaper to buy two 5770s than a single 5870...


----------



## eidairaman1 (Mar 27, 2010)

Try that game at a resolution higher than what it states. It's just something I noticed about that chart MailMan posted.


----------



## Wile E (Mar 28, 2010)

cdawall said:


> I'm kinda happy I found a 4870x2 and 4850x2; those together should beat a pair of 480s, and I don't need to rent a 12V generator to get my PC to turn on. 320W under load is a wee bit ridiculous; hell, I remember when I had an FSP 250W add-on PSU that can't even power one of these cards :shadedshu After so much delay this card should have wiped the floor, new drivers or not. The 5970 performed better, and the 5870 threw a good stab at it for less money and power consumption; I think it's a better card, and hell, with an OC the 5870 may actually beat the 480 OC'd. I just think my almost year-old 4870x2 should have been destroyed by both the 5870 and GTX480, not kept up with them in some games and beaten them in others; that is stupid. Maybe nvidia will fix Fermi, as it seems the 480 is already a cut-down version of a bigger card with its core partially shut down. I assume that means the rumors of fab issues were true and still exist. Maybe a smaller, less complicated die would help; 1B more transistors than ATI might be an issue. Oh well, a GTX490 with a fully unlocked GPU should do well against the 5970; I doubt it will beat it when both are OC'd, but they will do well against each other.
> 
> 
> On a lighter note, FSP sells a 2000W PSU that will be perfect for SLI with 3 of these cards


Our 4870x2 draws even more power, performs worse, and has fewer features. The 480 is the better card.


As for the heat issues, I want to know why nVidia has chosen to use an IHS. I bet it would run a good bit cooler with a naked die. Though power draw would still be insane, at least temps would be helped.


----------



## SK-1 (Mar 28, 2010)

After all the wait, just another power-hungry, sweaty pig. Even with lipstick, still a sweating pig. :shadedshu  ATI is still a much better overall graphics card choice. Thanks, W1zz.


----------



## senninex (Mar 28, 2010)

The 5870 still remains the best choice for gamers...

Anyway, the 5890 is on the way...


----------



## Kantastic (Mar 28, 2010)

Thanks for the great review W1z, I was waiting for it. 


Now was that so hard to say?


----------



## Scrizz (Mar 28, 2010)

Thanks for the EXCELLENT review W1zzard, I was waiting for it. 
Hope to see you on BC2 again


----------



## TAViX (Mar 28, 2010)

Here is something few people know:

The new GTX 480 and 470 are a DISASTER if you try to run two monitors at once. In order to operate, the cards switch to 3D frequencies all the time, even sitting idle at the desktop! That way they reach 90C and maximum power consumption! NVIDIA, WTF? EPIC FAIL!

http://www.legitreviews.com/article/1258/15/


----------



## Frizz (Mar 28, 2010)

Great review w1zzard, was waiting for this also . I wonder what will happen next in this competition .


----------



## stinzza (Mar 28, 2010)

> wile e=... As for the heat issues, I want to know why nVidia has chosen to use an IHS. I bet it would run a good bit cooler with a naked die. Tho power draw would still be insane, at least temps would be helped.



a nice point there friend


----------



## wahdangun (Mar 28, 2010)

newconroer said:


> Yes, why we're still getting copy/paste titles and results from engines that are several years old now, and end up with reviews where only one game(Crysis, no surprises there) makes the cards actually stress - I don't know, but at least we did get something relevant and useful this time around ... :
> 
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_1920_1200.gif http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_2560_1600.gif
> ...



No, that's not what I mean. I mean DiRT 2, BattleForge, STALKER: CoP, and AVP3 have built-in DX11 paths; why doesn't W1zz use those?

And I really wish W1zz could add BC2 to the benches; it's a really great game.


----------



## Frick (Mar 28, 2010)

TAViX said:


> Here is something a few people know:
> 
> The new GTX480 and 470 are a DISASTER if you try to run 2 monitor at once. In order to operate, the cards are switching to the 3D freqs all the time, including in desktop idle mode!!!!!! That way they get to 90C and up to max power consumption!!!! nvidia, WTF???? EPIC FAIL!!!!!!!
> 
> http://www.legitreviews.com/article/1258/15/



Ouch. Just ... ouch. WTF indeed.


----------



## shevanel (Mar 28, 2010)

funniest shit i ever read



> All that power makes for one seriously hot graphics card. Do not be fooled by the 97 degree load temperature. While this is the maximum temperature that we recorded, the fan was spinning so fast things started sliding across my desk towards the test system. The GeForce GTX 480 was truly deafening when running our full load test.
> 
> Letting the card sit at the Windows 7 desktop for 20 minutes after stress testing saw the temperature only drop to 65 degrees. When it came time to remove the GeForce GTX 480 from our test system I had to wear a pair of gloves to avoid any possible burns or dropping the extremely hot graphics card. The PCI Express power cables were amazingly soft from all the heat that had been thrown at them.
> 
> In short the GeForce GTX 480 is around 11% hotter than the Radeon HD 5870 under load and a whopping 71% or 27 degrees hotter at idle.


----------



## hertz9753 (Mar 28, 2010)

shevanel said:


> funniest shit i ever read



This card reminds me of that NVIDIA 7800 card that I bought a while back. The one that came with the free leaf-blower sound effects.


----------



## spud107 (Mar 28, 2010)

i never thought a gpu could cause so much drama . . .


----------



## OnBoard (Mar 28, 2010)

TAViX said:


> Here is something a few people know:
> 
> The new GTX480 and 470 are a DISASTER if you try to run 2 monitor at once. In order to operate, the cards are switching to the 3D freqs all the time, including in desktop idle mode! That way they get to 90C and up to max power consumption!
> 
> http://www.legitreviews.com/article/1258/15/



It's been like that with GT200 too. I'm at full 3D clocks as I write this, because I have an LCD TV on the other DVI. They said they'd look into it over a year ago and nothing has happened.

However, with my card temps don't rise; it idles happily at 40C 24/7 with those clocks.

The only thing you can do is disable "extend desktop to this display"; then you'll get 2D clocks. NVIDIA won't fix it; it will be like that forever. If it's an issue for you, don't buy Fermi. (Or you could just plug in an 8x00 card and run the second monitor from that.)

And here's a link to the issue with earlier cards. If they can make a 3-billion-transistor card, surely they can add a "force 2D clocks always" option to the drivers.
http://forums.nvidia.com/index.php?showtopic=86932


----------



## wahdangun (Mar 28, 2010)

OnBoard said:


> It's been like that with GT200 too. I'm with full 3D clocks as I write, because I have LCD TV on the other DVI. They said they'll look into it over a year ago and nothing has happened.
> 
> How ever, with my card temps don't rise, idling happily 40C 24/7 with those clocks.
> 
> ...



It's a feature, you know! NVIDIA never has problems in their drivers


----------



## hertz9753 (Mar 28, 2010)

Hey W1zzard, the review was good BTW.


----------



## TheMailMan78 (Mar 28, 2010)

TAViX said:


> Here is something a few people know:
> 
> The new GTX480 and 470 are a DISASTER if you try to run 2 monitor at once. In order to operate, the cards are switching to the 3D freqs all the time, including in desktop idle mode!!!!!! That way they get to 90C and up to max power consumption!!!! nvidia, WTF???? EPIC FAIL!!!!!!!
> 
> http://www.legitreviews.com/article/1258/15/



That has to be a simple driver mess up. I'm sure they will fix that.


----------



## Deleted member 24505 (Mar 28, 2010)

TAViX said:


> Here is something a few people know:
> 
> The new GTX480 and 470 are a DISASTER if you try to run 2 monitor at once. In order to operate, the cards are switching to the 3D freqs all the time, including in desktop idle mode!!!!!! That way they get to 90C and up to max power consumption!!!! nvidia, WTF???? EPIC FAIL!!!!!!!
> 
> http://www.legitreviews.com/article/1258/15/



I agree, that is fail. I just read a couple of pages of that review, and he even said he had to wear gloves to take it out so as not to burn his hands.

It might be better than the 5870 in some games but, since I run two monitors all the time, feck that; I would not even consider getting a GTX 480 for this reason. It is a hot-running, inefficient waste of space.

The guy even said that after one day with it in his machine, his computer room was noticeably hotter than the rest of his house. This might be OK if you live in Iceland, but for someone who lives in a warmer climate it would not be very good at all.


Nice review W1zzard; screw all the whinging twats, your reviews have always been thorough and informative. Thank you.


----------



## TheMailMan78 (Mar 28, 2010)

tigger said:


> I agree,that is fail.I just read a couple of pages of that review,and he even said he had to wear gloves to take it out so as not to burn his hands.
> 
> It might be better than the 5870 in some games but,i run two monitors all the time so feck that i would not even consider getting a gtx480 because of this reason.It is a hot running inefficient waste of space.
> 
> ...



I think that whole glove thing was a joke? If not, I would hate to see this thing run in my home state of Florida.


----------



## Black Panther (Mar 28, 2010)

TheMailMan78 said:


> I think that whole glove thing was a joke? If not I would hate to see this thing run in my home state. Florida.



I don't think it's a joke. If the gpu is 95 degrees while gaming the outside of the card won't be exactly cool...
Btw, once I nearly burned myself removing a passively cooled FX5500.


----------



## TheMailMan78 (Mar 28, 2010)

Black Panther said:


> I don't think it's a joke. If the gpu is 95 degrees while gaming the outside of the card won't be exactly cool...
> Btw, once I nearly burned myself removing a passively cooled FX5500.



Passively cooled I understand but that 480 has a HUGE cooler on it.


----------



## Deleted member 24505 (Mar 28, 2010)

He did show the temp at the back end at 67C though, which is quite hot. I suppose it does depend on how long the PC had been off before he took it out though.


----------



## TheMailMan78 (Mar 28, 2010)

tigger said:


> He did show the twmp at the back end at 67c though,which is quite hot,i suppose it does depend how long the pc had been off before he took it out though.



Ya know, with better cooling and voltage pushed as high as the 480's, I wonder how the 58xx series would fare.

I mean, my idle temp is 32C and my load never breaks 65. That's with a mild OC too. I wish someone with balls of steel would really set one of these puppies up to the temp levels of the 480 and do a bench.


----------



## option350z (Mar 28, 2010)

This reminds me of the PWMs on my ASUS 4870; they go beyond this temp under load. The 480 is a great single card, but I'm interested to see how it scales in SLI. I would think temps would be above 100C at that point.


----------



## Deleted member 24505 (Mar 28, 2010)

I think there is also the problem of two cards bleeding that 67C heat into your case; you'd better have good case ventilation.


----------



## TAViX (Mar 28, 2010)

TheMailMan78 said:


> That has to be a simple driver mess up. I'm sure they will fix that.



Maybe, maybe not... But it seems users had this kind of problem with earlier generations too, so...

But think about this: if you need two cards to run three monitors, and obviously they are running at 3D frequencies all the time, how much heat is produced?

Remind me during the winter to shut off the heating. Those babies can heat my entire building!!
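For scale, a rough back-of-envelope on that heat load (the ~320 W per-card figure is from the review; everything else is just unit conversion, and the space-heater comparison is an illustration, not a measurement):

```python
# Rough estimate of the heat two GTX 480s dump into a room at full 3D clocks.
# Assumes ~320 W per card (the review's measured peak board power); essentially
# all electrical power a GPU draws ends up as heat in the room.

CARD_POWER_W = 320          # per-card draw under load, from the review
NUM_CARDS = 2

heat_w = CARD_POWER_W * NUM_CARDS    # total heat output in watts
heat_btu_per_hr = heat_w * 3.412     # 1 W = 3.412 BTU/h

print(f"{heat_w} W = {heat_btu_per_hr:.0f} BTU/h")
# Comparable to a small electric space heater (typically 500-1500 W).
```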


----------



## W1zzard (Mar 28, 2010)

Black Panther said:


> I don't think it's a joke. If the gpu is 95 degrees while gaming the outside of the card won't be exactly cool...
> Btw, once I nearly burned myself removing a passively cooled FX5500.



i burned myself several times too .. basically everything on the card is hot .. and the heatpipes are REALLY hot, really easy to touch them with your thenar muscle when handling the card


----------



## wahdangun (Mar 28, 2010)

W1zzard said:


> i burned myself several times too .. basically everything on the card is hot .. and the heatipes are REALLY hot, really easy to touch them with your thenar muscle when handling the card



WHAT? Wow, I never expected the card to be THAT hot 


I hope the board partners include some protective gloves, so no one gets burned by this thing


So anyone who says these temperatures are fine is crazy


----------



## pantherx12 (Mar 28, 2010)

Amazing that it's getting hot enough to burn even with a fan on it, really.

The only times I've managed to burn myself on heatsinks is when I've left them running passively.

(Once had my TRUE get to 90C; that was painful)


----------



## wahdangun (Mar 28, 2010)

Yeah, I once got burned by an NB heatsink when I was trying to OC the CPU.


Whew. So anyone bashing W1zz's review is a moron who doesn't know the dedication it took to deliver it.


----------



## TheMailMan78 (Mar 28, 2010)

W1zzard said:


> i burned myself several times too .. basically everything on the card is hot .. and the heatipes are REALLY hot, really easy to touch them with your thenar muscle when handling the card



I stand corrected. Man, Fermi is an epic fail on NVIDIA's part. I had a feeling they would bomb, but DAMN, I had no idea it was this bad.


----------



## Cold Storm (Mar 28, 2010)

All I can say anymore... Revisions, revisions, revisions!!!


----------



## Steevo (Mar 28, 2010)

Wile E said:


> Our 4870x2 draws even more power, performs worse, and has less features. 480 is the better card.
> 
> 
> As for the heat issues, I want to know why nVidia has chosen to use an IHS. I bet it would run a good bit cooler with a naked die. Tho power draw would still be insane, at least temps would be helped.



Probably to help stabilize the core on the board: no thermal problems with the interface, no issues with the core popping off at 105C, and no risk of the die being chipped, since the IHS is a protective layer thick enough to bear the brunt of a heatsink.


I want to see what a watercooled card with a good powersupply can do.


----------



## hat (Mar 28, 2010)

I wonder what kind of aftermarket heatsink designs they'll come up with for this...


----------



## Steevo (Mar 28, 2010)

More than five heatpipes, which look to be 8mm, with a powerful blower fan? I doubt anything aftermarket right now will cool this as well as the stock HS combo. Just face it: 300W is your target cooling capacity, and the only thing that can handle that is water. Cranking my 5870 up to 1.35V and running 1100 core, I actually start to raise my water temps; it feels almost hot coming out of the card. 

This card with overclocking is going to take a 360 radiator by itself, plus a 240 for your CPU, and at least a 120 for your chipsets.
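That radiator sizing can be sanity-checked against the common hobbyist rule of thumb of roughly 100 W of heat per 120 mm radiator section at moderate fan speeds; the per-component wattages below are assumptions for illustration, not measurements:

```python
# Quick sanity check on radiator sizing, using the rule of thumb of roughly
# 100 W of heat per 120 mm radiator section at moderate fan speeds.
# The per-component wattages below are illustrative guesses, not measurements.

import math

W_PER_120MM_SECTION = 100   # rule-of-thumb dissipation per 120 mm section

def sections_needed(watts):
    """Number of 120 mm radiator sections to handle a given heat load."""
    return math.ceil(watts / W_PER_120MM_SECTION)

gpu_overclocked = 350       # an overclocked GTX 480, assumed
cpu = 150                   # an overclocked quad-core, assumed
chipset = 50                # NB/SB combined, assumed

print(sections_needed(gpu_overclocked))  # 4 -> even a 360 is marginal
print(sections_needed(cpu))              # 2 -> a 240
print(sections_needed(chipset))          # 1 -> a single 120
```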


----------



## pantherx12 (Mar 28, 2010)

hat said:


> I wonder what kind of aftermarket heatsink designs they'll come up with for this...



Think traditional water-cooler radiator, but using heatpipes instead of water pipes.

They'd have to maximise surface area by using the zig-zag style of a traditional rad.

I imagine support bars would be a must as well .



Think T-Rad2, but with all the fins zig-zagging : ] 


Although just a T-Rad would still do a better job than the stock cooler, I'd bet.


----------



## a_ump (Mar 28, 2010)

Steevo said:


> More than five heatpipes, that are 8mm by the look, with a powerful blower fan? I doubt if anything aftermarket right now will cool this as well as the stock HS combo. Just face it, 300W is your target cooling capacity. The only thing that can handle that is water. Cranking up my 5870 to 1.35V and running 1100 core I actually start to raise my water temps. It feels almost hot coming out of the card.
> 
> This card with overclocking is going to take a 360 radiator by itself, plus a 240 for your CPU, and at least a 120 for your chipsets.



I agree; the stock heatsink is on the same level as, if not higher than, aftermarket cooling. I mean, five 8mm HDT heatpipes? Is there even an aftermarket cooler out right now that has that? 

Course, there is the logic that with an aftermarket cooler that only has to cool the GPU, with a fan dedicated to that, temps might improve, since you would have separate heatsinks taking care of the memory and the other extras. If I recall, weren't there some HD 4870s that didn't even have heatsinks on the memory chips?


----------



## pantherx12 (Mar 28, 2010)

Whilst this is 3rd party (ASUS), this is probably the best stock cooler around.

So there is better stock cooling out there, and much, much better aftermarket cooling XD

Whilst the stock cooler used on the 480 has a lot of heatpipes, the rest of the design is pretty average, if not inefficient.


With a few tweaks it'd be a great cooler for the 480 : ]


----------



## wahdangun (Mar 28, 2010)

pantherx12 said:


> Think traditional water cooler radiator but using heatpipes instead of water pipes.
> 
> They'd have to maximise surface area by using the zig zag style of a traditional rad.
> 
> ...



But I think something like this is not really efficient. Just look at what happened to designs like this in CPU coolers; they failed miserably.


----------



## pantherx12 (Mar 28, 2010)

wahdangun said:


> but i think something like this is not really efficient. just look what happen to something like this in CPU cooler, they fail miserable




The T-Rad2 is the best aftermarket GPU cooler I've used so far 

Hell, I ran several cards on it passively!

A 4850 @ 750MHz didn't go over 60C passively : ] (Custom case at the time though, so airflow was top notch)


----------



## imperialreign (Mar 28, 2010)

After all the hype nVidia were throwing out, I was seriously expecting these cards to flat-out spank the 5000 series (except the 5970) . . . all-in-all, I'm really not seeing it.  Aside from PhysX support, it looks to me like the 5850/70 are a generally better deal for most gaming setups.



*w1z* - any consideration to running the STALKER: Call of Pripyat benchmark for DX10/11 comparisons?  

As well, I think it'd be a good benchmark to add to the testing list - allows for benching DX9, DX10, DX10.1 and DX11.


----------



## Steevo (Mar 28, 2010)

GPU PhysX, for all 15 games out there? No thanks.

Panther, that is a good-looking cooler, except for the fact that 300W worth of heat remains in your case. So 90% of users with closed cases will gain an extra 40C of heat in their case, which means less heat extraction, which means higher temps, which means performance the same as or lower than stock cooling that exhausts heat from the GPU to the outside of the case while also drawing in fresh, cool air.


Plus your 4850, at whatever speed, is at most half the load of this, so not even in the same ballpark.


----------



## blkhogan (Mar 28, 2010)

I was really considering NVIDIA for my next GPU upgrade, soon to be happening. Not too impressed overall. It's still a powerhouse and a great leap forward for NVIDIA, but the power required to feed this beast, and the heat produced, are really going to be a sticking point for some people. My more efficient but much less powerful 5770 will have to last a bit longer. Can't wait to start seeing numbers and benches from some members. Great review as always.


----------



## pantherx12 (Mar 28, 2010)

Steevo said:


> GPU Physx for all 15 games out there? No thanks.
> 
> Panther, that is a good looking cooler, except the fact that 300W woth of heat remains in your case, so 90% of users with closed cases will gain a extra 40C of heat in their case, and that means less heat extraction, and that means higher temps, and that means performance the same as or lower than stock cooling that removes heat from the GPU to the outside of the case, and also drawing in more fresh cool air.
> 
> ...



Well, it's hardly my fault people don't know about proper case airflow management 


As for the latter point, obviously, man  But stick some fans on a T-Rad and it will outperform that stock heatsink, no problem.

It has more surface area and better heat distribution, for one.

Decent 92mm fans on it and you get around 110cfm blowing on it too : ]


----------



## Steevo (Mar 28, 2010)

Surface area has little to do with total heat dissipation anymore, at the size these coolers already are. Thermal conductivity, laminar flow, and temperature differential have more to do with it than anything. That's the reason an 8-, 12-, 24- or 200-pipe CPU cooler performs much the same: the limitation is the interface, conductivity, and total load capacity, not the surface area.

More air at higher speed over a smaller area with higher thermal conductivity will cool better than a larger-area, low-speed cooling solution. 

That's the reason we see two or more fans on tower CPU coolers: more air speed. But the effects are still much less pronounced than with even a single 120mm water cooler; the interface medium is better.
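That argument can be sketched as a simple series thermal-resistance model; the resistance values below are illustrative assumptions, not measured figures for any real cooler:

```python
# The point above framed as thermal resistances in series: die temperature
# rise is dT = P * (R_interface + R_sink), so past a certain point adding
# fin area (lowering R_sink) barely moves the total, because the interface
# term dominates. All resistance values are illustrative assumptions.

def die_temp(power_w, r_interface, r_sink, ambient_c=25.0):
    """Steady-state die temperature for a simple series thermal model."""
    return ambient_c + power_w * (r_interface + r_sink)

P = 300.0            # watts, roughly a GTX 480 under load
R_IFACE = 0.15       # C/W across IHS + TIM (assumed)

# Halving the sink resistance (e.g. doubling effective fin area) ...
t_big_sink = die_temp(P, R_IFACE, r_sink=0.05)
t_huge_sink = die_temp(P, R_IFACE, r_sink=0.025)

print(f"{t_big_sink:.0f} C vs {t_huge_sink:.0f} C")
# ... only buys a handful of degrees, because the interface dominates.
```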


----------



## pantherx12 (Mar 28, 2010)

*facepalm* The point has been missed.

Yes, it's the sum of all things that makes a good cooler; does that change the fact that the T-Rad is a GOOD cooler?

Stop being so damn patronising and assuming someone doesn't know something because they did not use technical language .


Whilst I did miss a comma here:

"has more surface area better heat distribution for one."

See the bit about heat distribution?

----------



## thebluebumblebee (Mar 28, 2010)

Steevo said:


> GPU Physx for all 15 games out there? No thanks.
> 
> Panther, that is a good looking cooler, except the fact that 300W woth of heat remains in your case, so 90% of users with closed cases will gain a extra 40C of heat in their case, and that means less heat extraction, and that means higher temps, and that means performance the same as or lower than stock cooling that removes heat from the GPU to the outside of the case, and also drawing in more fresh cool air.
> 
> ...



I can't find it, but Nvidia actually talked about "approved cases" for cooling!


----------



## Steevo (Mar 28, 2010)

Tests with one 120mm fan show it taking a tweaked 4850 to 60C at full load in an open-air test. So it would easily take a 480 to 90C; add a case and you have just reached your 105C limit.

It is a great cooler for 200W loads, but not 300+.


----------



## R_1 (Mar 28, 2010)

Some inexpensive exhaust fan with high CFM will be needed, like the Evercool EC12038M12BA. Keep in mind that it is noisy and produces lots of vibration.


----------



## pantherx12 (Mar 28, 2010)

Steevo said:


> Tests with 1 120mm show it taking a 4850 tweaked to 60C at full load in a open air test. So it would easily take a 480 to 90C, add a case and you have just reached your 105C limit.
> 
> It is a great cooler for 200W loads, but not 300+.



You're forgetting that the hotter something is, the quicker it loses its energy 

Also, two 92mm fans are more effective, as they have greater static pressure.

Using a 120mm fan on the T-Rad2 is for when you want to cool it quietly.


----------



## Steevo (Mar 28, 2010)

They need to get a custom water loop for this card, like they did with some of the older ATI cards: self-contained pump, radiator, fan, shroud, and a flowmeter that shows up on the PWM header as RPM.

http://hothardware.com/Articles/Sapphire-Toxic-Radeon-X1900-XTX/




pantherx12 said:


> Your forgetting the hotter something is the quicker it looses its energy
> 
> also 2 92mm fans are more effective, as they have greater static pressure.
> 
> Using a 120mm fan on the T-rad2 is if you want to cool it quietly.




You buy one, try it, and post a review. For me, after trying a lot of different air coolers at work, at home, and for friends, I like to get wet.


----------



## pantherx12 (Mar 28, 2010)

Pfft can't afford a 480, I'm unemployed!

ha ha


----------



## Steevo (Mar 28, 2010)

I am a network and system administrator with a fleet of 20+ PCs, 6 laptops, two servers, and one AS/400, besides my home machine and the builds I've worked on for friends. At work I have everything from stock Intel and AMD coolers to some nice Zerotherm coolers supporting overclocked machines, some Arctic Cooling tower-style coolers, and a four-pipe Opteron cooler on a dual-core AMD build that is overclocked to 3.6GHz. 


I have a lot of computers I have tried different things on. I'm currently watercooling with a VRM-4 on, with extra heatsinks glued onto my motherboard because the CPU power load was heating the board too much. And my computer is quiet.

While I don't go crazy with TEC or LN2 yet, I enjoy my system: it requires almost no maintenance, and it's fast. Really fast. 

A person trying to half-ass cool a card like a 480 is like someone driving a Ferrari with two flats on a dirt road. Just asking for trouble.


----------



## pantherx12 (Mar 28, 2010)

Yeah, I've fucked around with a lot of coolers and computers too.

: ]


I'm not sure what you're trying to achieve by telling us your experiences, Steevo D:


----------



## Steevo (Mar 28, 2010)

I guess that when you ante up, buy a 480, and try it, then come tell us all it won't work any better than the stock solution. AKA, talk is cheap. 


For now this card is only going to be taken seriously with a waterblock strapped to it. Or you could see if W1zz will try it, as no one will be getting a card until sometime next month at the earliest. 


I really do want to see this hot bastard watercooled. With leakage as high as it is, stable voltage at high load should be easy if you can only keep it cool. Can't wait to see someone run it on LN2.


----------



## erocker (Mar 28, 2010)

Steevo said:


> I guess that when you ante up, buy a 480, try it then come tell us all it won't work any better than the stock solution. AKA, talks cheap.
> 
> 
> For now this card is only going to be taken seriously when it has a waterblock strapped to it. Or you could see if W1zz will try it, as no one will be getting a card until sometime next month at the earliest.
> ...



You can expect a plethora of Fermi aftermarket coolers very soon. One thing I noticed with Fermi is that the actual metal cooler part could have been larger. That being said, whoever comes up with the best aftermarket cooler that doesn't take up 3 to 4 motherboard slots is winnar!


----------



## pantherx12 (Mar 28, 2010)

I won't be buying a 480 anyway, might get a 5870 next year though : ]

Anyway, I came to my conclusion about the cooling performance of the stock cooler by studying its design.

It simply is not that great!


To use the T-Rad as an example (since we're talking about it anyway, and it's a cooler I'm very familiar with; I'm holding it in my hands right now, actually):

The 480's stock cooler simply has fewer contact points for the heat-pipes to distribute their heat.

That, combined with less surface area, makes it not the best heatsink in the world.

Probably the best reference stock heatsink ever designed, to be fair.

But not all that great.


----------



## erocker (Mar 28, 2010)

pantherx12 said:


> I won't be buying a 480 anyway, might get a 5870 next year though : ]
> 
> Anyway I was coming to my conclusion about cooling performance of the stock cooler by studying its design.
> 
> ...



In relation to the size of the chip, the cooler is tiny.


----------



## wahdangun (Mar 28, 2010)

pantherx12 said:


> I won't be buying a 480 anyway, might get a 5870 next year though : ]
> 
> Anyway I was coming to my conclusion about cooling performance of the stock cooler by studying its design.
> 
> ...



But if that thing is not so great with contact points, why did W1zz still get burned when he touched it?

I think what this cooler lacks is the ability to shed heat fast enough.


----------



## pantherx12 (Mar 28, 2010)

wahdangun said:


> but if that thing was not so great with contact points, why wizz still burned when he touch it ?
> 
> i think what this cooler lack was, it's abillity to dispel heat fast enough




You hit the nail on the head unintentionally there  

The combination of fewer contact points for the fin array* and a relatively small surface area means the heatsink does not efficiently dissipate the massive amount of thermal energy it picks up.

The heatsink becomes "heat-soaked". A really great heatsink should only feel warm to the touch, because it dissipates its thermal energy so well and so evenly 


*points where the heat-pipes touch the fin array.
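The heat-soak idea maps onto a basic lumped-capacitance model: the sink warms until what it sheds matches what the GPU pumps in. The R and C values below are made-up illustrative numbers, not measurements of the T-Rad or the GTX 480 cooler:

```python
# "Heat-soak" as a lumped-capacitance model: the sink warms until the heat
# it sheds, (T - T_ambient) / R, equals the power pumped in. A sink with
# lower thermal resistance settles at a lower, merely-warm temperature.
# R (C/W) and C (J/C) values are illustrative assumptions.

def soak_curve(power_w, r_c_per_w, heat_cap_j_per_c,
               ambient_c=25.0, seconds=600, dt=1.0):
    """Simulate heatsink temperature over time; return the final value."""
    temp = ambient_c
    for _ in range(int(seconds / dt)):
        net_w = power_w - (temp - ambient_c) / r_c_per_w  # in minus out
        temp += net_w * dt / heat_cap_j_per_c
    return temp

# Same 300 W load into a good sink vs a marginal one:
good = soak_curve(300, r_c_per_w=0.10, heat_cap_j_per_c=400)      # settles ~55 C
marginal = soak_curve(300, r_c_per_w=0.25, heat_cap_j_per_c=400)  # heat-soaked
print(f"good sink {good:.0f} C, marginal sink {marginal:.0f} C")
```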


----------



## erocker (Mar 28, 2010)

wahdangun said:


> but if that thing was not so great with contact points, why wizz still burned when he touch it ?
> 
> i think what this cooler lack was, it's abillity to dispel heat fast enough



With a 22-watt fan. I would chalk all of this up to a failure of the GPU's thermal characteristics, meaning the GTX 480 is not very good. It performs better than a 5870 and that's about it. The negatives seem to outweigh the positives.


----------



## pantherx12 (Mar 28, 2010)

erocker said:


> With a 22 watt fan. I would chalk all of this up to GPU thermal characteristic failure. Meaning the GTX480 is not very good. It performs better than a 5870 and that's about it. The negatives outweigh the positives it seems.



Good point, I forgot to mention the 480s are just hot as hell in general


----------



## wahdangun (Mar 28, 2010)

pantherx12 said:


> You hit the nail on the head unintentionally there
> 
> The combination of less contact points for the fin array* and having what is a relatively small surface area it means the heatsink does not dissipate the massive amounts of thermal energy it picks up efficiently.
> 
> ...



Yeah, but I think NVIDIA should use a vapor chamber, just like the HD 5970, because it also has similar thermal characteristics.  



erocker said:


> With a 22 watt fan. I would chalk all of this up to GPU thermal characteristic failure. Meaning the GTX480 is not very good. It performs better than a 5870 and that's about it. The negatives outweigh the positives it seems.




Yeah, I was really surprised when I first saw that fan; I've never seen any fan that uses that kind of power (except in servers)


----------



## pantherx12 (Mar 28, 2010)

Heat-pipes are better than vapour chambers : ]

A heatpipe makes 360-degree contact with every fin it passes through, and on top of that, multiple heat-pipes mean more 360-degree contact points  You can also run the heat-pipes through the centre of a fin so either side of the fin gets equal amounts of thermal energy, etc 

Vs. the one contact point a vapour chamber offers.

Vapour chambers are cost-effective, mind you; my 5770 uses a vapour chamber (stock batmobile-style 5770)
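The contact-area claim can be put in rough numbers; the fin count and fin thickness below are illustrative guesses, not measurements of either cooler:

```python
# Rough contact-area arithmetic behind the heatpipe argument: each pipe
# makes 360-degree contact with every fin it passes through, so total
# pipe-to-fin contact area grows with pipe count and fin count.
# All dimensions are illustrative guesses, not measured from any cooler.

import math

def heatpipe_fin_contact_mm2(pipe_diameter_mm, fin_thickness_mm,
                             num_fins, num_pipes):
    """Total cylindrical contact area between pipes and the fin stack."""
    per_fin = math.pi * pipe_diameter_mm * fin_thickness_mm
    return per_fin * num_fins * num_pipes

# e.g. five 8 mm pipes (as on the GTX 480 cooler) through an assumed
# 50-fin stack with 0.4 mm fins:
area = heatpipe_fin_contact_mm2(8, 0.4, 50, 5)
print(f"{area:.0f} mm^2 of pipe-to-fin contact")
```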


----------



## btarunr (Mar 28, 2010)

In my opinion a cooler is only nice if it keeps the GPU at acceptable levels (with RT at 25C: idle @ 45C, load @ 65C). This cooler does not. Hence it's crappy.


----------



## erocker (Mar 28, 2010)

pantherx12 said:


> Heat-pipes are better then vapour chambers : ]
> 
> a heatpipe makes 360 degree contact through every fin it passes through , ontop of that multiple heat-pipes mean more 360 degree contact points
> 
> ...



Yet the 5850 cooler is far superior to the 5770 cooler and then some. With the 5770, mine idles around 38C at 35% fan; my 5850s idle at 32C with 21% fan. Regardless, either cooler would probably fail on a GTX 480, but that is one mod I would absolutely love to see. Fermi with a batmobile cooler would be win... even though it would fail.


----------



## LAN_deRf_HA (Mar 28, 2010)

What really bugs me is how close this thing gets to its thermal max of 105C. It apparently gets up to 98C before the fan really kicks in under load. I think this is why the fan is so insane: it might get really hot, but the fan has the headroom to just ramp higher and higher to keep that temp under 100C. It would be safer for them to have the fan ramp up at lower temps, but that would have meant hearing more noise more often. So we have all this going on in the high 90s because of the struggle to keep the noise down while staying under 100C. 

My concern is that if you don't keep this thing blown out and clean, either it's going to get louder and louder over time or it's going to start shutting down.
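The trade-off described above can be sketched as a fan curve with a quiet zone; the thresholds and duty values are hypothetical illustrations, not NVIDIA's actual fan-control firmware:

```python
# A sketch of the trade-off described above: a fan curve that stays quiet
# until the GPU nears its target, then ramps hard to hold it below the
# thermal limit. Thresholds and duty values are hypothetical assumptions.

def fan_duty(temp_c, target_c=90, limit_c=105, idle_duty=40, max_duty=100):
    """Return fan duty (%) for a given GPU temperature."""
    if temp_c <= target_c:
        return idle_duty                      # quiet zone: tolerate high temps
    if temp_c >= limit_c:
        return max_duty                       # at the thermal limit: flat out
    # ramp linearly between target and limit
    frac = (temp_c - target_c) / (limit_c - target_c)
    return idle_duty + frac * (max_duty - idle_duty)

for t in (60, 90, 98, 105):
    print(t, f"{fan_duty(t):.0f}%")
# The quiet zone is exactly why load temps sit in the high 90s before
# the fan "really kicks in".
```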


----------



## pantherx12 (Mar 28, 2010)

erocker said:


> Yet the 5850 cooler is far superior to the 5770 cooler and then some. With the 5770, mine idles around 38c at 35% fan. With my 5850's they idle at 32c with 21% fan. Regardless, either cooler would probablly fail on a GTX 480, but that is one mod I would absolutely love to see. Fermi with a batmobile cooler would be win.. even though it would fail.




It's because the 5850 uses heatpipes : ]


Also, check out the size of a 5770 cooler. Tiny!

To give you an idea, the horizontal measurement between the two holes on the backplate is about 32mm (seeing as diagonally they are 45mm).


----------



## wahdangun (Mar 28, 2010)

pantherx12 said:


> Heat-pipes are better then vapour chambers : ]
> 
> a heatpipe makes 360 degree contact through every fin it passes through , ontop of that multiple heat-pipes mean more 360 degree contact points  You can also run the heat-pipes through the centre of a fin so either side of the fin gets equal amounts of thermal energy etc
> 
> ...



Wait... wait a sec, didn't the HD 5970 use a VAPOR chamber?


----------



## pantherx12 (Mar 28, 2010)

wahdangun said:


> wait.... wait a sec, didn't HD 5970 use VAPOR chamber?



Yes it did. It would have been much more expensive to produce a card that uses eight large-diameter heat-pipes, and heatpipes would also have affected air resistance a lot in a dual-GPU blower-style cooler.

5970

5870

Thanks to TPU for the images; I have to say I find taking photos of the heatsinks themselves very handy!


----------



## wahdangun (Mar 28, 2010)

So why did you say heat pipes were better, when a vapor chamber can clearly dissipate up to 400 watts of heat?

So what do you think is better for Fermi: a heat-pipe cooler or a vapor-chamber cooler?


----------



## pantherx12 (Mar 28, 2010)

wahdangun said:


> So why did you say heat pipes were better, when a vapor chamber can clearly dissipate up to 400 watts of heat?




Didn't understand what I said above?


More info, then.

Producing one big, long vapour chamber is easier/cheaper than using heat-pipes.

Heat-pipes also affect air flow, so in a dual-GPU card that uses one fan and one massive heat-sink it's more cost-effective to use a vapour chamber.

Even though, in theory, a heat-pipe heat-sink would distribute the thermal energy more evenly and efficiently.


----------



## Super XP (Mar 28, 2010)

Black Panther said:


> I don't think it's a joke. If the gpu is 95 degrees while gaming the outside of the card won't be exactly cool...
> Btw, once I nearly burned myself removing a passively cooled FX5500.


I go away for about 2 days and this thread went nuts 
There are a few reviews that measured the 480 at 97C


----------



## HookeyStreet (Mar 28, 2010)

Nice review, thank you o wise one 

BUT, I think the price/performance crown is still owned by the red team


----------



## wahdangun (Mar 28, 2010)

pantherx12 said:


> Didn't read what I said above did you?



sorry, I was a little confused when you said

"Heat-pipes are better than vapour chambers : ]

a heatpipe makes 360-degree contact with every fin it passes through; on top of that, multiple heat-pipes mean more 360-degree contact points  You can also run the heat-pipes through the centre of a fin so either side of the fin gets equal amounts of thermal energy etc 

Vs the one contact point the vapour chambers offer."

but if heat pipes were better, why does Fermi still reach 95 degrees under load? Surely the cooler can't take that much heat at once and becomes "heat soaked"

EDIT: even the HD 5970 still has more overclocking headroom than Fermi


----------



## pantherx12 (Mar 28, 2010)

Because the GPU used by Nvidia outputs much, much more heat than any other GPU produced.


Hell, it's the hottest piece of consumer-level silicon you can get, I think 

( don't quote me on that as I could be very wrong on that point)


----------



## btarunr (Mar 28, 2010)

wahdangun said:


> so why did you say heat pipes was better? when it's clearly vapor chamber can disspell up to 400 wat of heat ??
> 
> so what do you think is better for fermi : a heat pipes type cooler or vapor chamber type cooler ?



Of course the vapor chamber thingy. It's keeping the GPU cooler than what Fermi's cooler manages. Don't look at the "means", look at the "ends". If NVIDIA's cooler was really great, it would have kept the GPU at sub-70C in Furmark. It doesn't.


----------



## pantherx12 (Mar 28, 2010)

btarunr said:


> Of course the vapor chamber thingy. It's keeping the GPU cooler than what Fermi's cooler manages. Don't look at the "means", look at the "ends". If NVIDIA's cooler was really great, it would have kept the GPU at sub-70C in Furmark. It doesn't.




You're highlighting how little you know about cooling here BTA; the cooler is EASILY the best stock (i.e. from NV and ATI) cooler I've seen for a single-GPU card.

It's the GPU at fault here, not the heatsink.


Just scroll up and look at the 5870 cooler; the 480 cooler is beautiful in comparison XD


----------



## btarunr (Mar 28, 2010)

pantherx12 said:


> You're highlighting how little you know about cooling here BTA, the cooler is EASILY the best stock ( I.E from nv and ati) cooler I've seen for a single GPU card.
> 
> Its the GPU at fault here not the heatsink.
> 
> ...



You're highlighting illogicality. 

People don't care how complex their cooler is, or how much of a task the GPU is laying on it. People care about temperatures and fan-noise levels. If this was a good GPU cooler, it would have kept the GPU cool. Like sub-70C cool, not >90C with normal gaming (not Furmark, but normal gaming). 

The cooler can look like Chuck Norris, but with those temperatures and noise levels the consumer ends up with, it's all moot.


----------



## pantherx12 (Mar 28, 2010)

*edit*

I removed all the handy information.

Instead I'll just stare blankly into space.


----------



## R_1 (Mar 28, 2010)

I think that some epoxy glue, used as in the HD 5870 cooler, might bring the GPU temp down a few degrees.


----------



## btarunr (Mar 28, 2010)

pantherx12 said:


> "If this was a good GPU cooler"
> 
> So you are implying its a bad GPU cooler?



Yes, absolutely.



pantherx12 said:


> Stick it on a 5870 ( you'd have to make a IHS for the 5870) and see how wrong you are.



It won't fit, your point is moot. 




pantherx12 said:


> Its not a BAD cooler, its a HOT gpu.
> 
> How can you not understand that?



It is a bad cooler because it's not designed to keep that "HOT gpu" at acceptable temperatures. Hemlock's cooler is keeping TWO GPUs, 4 billion transistors, a PCI-E switch, two sets of vGPU, vMem, and uncore VRM circuits cooler than what this cooler manages.


----------



## Steevo (Mar 28, 2010)

Go search around and see. Heatpipes have a maximum capacity of about 50-60W each, in a vertical installation, in ideal conditions; above this they start to lose their cooling ability. 

For example, the Noctua NH-D14 is one of the best coolers out: 12,000 square cm of surface area, 12 heatpipes, two fans, and it still only moves 200-ish watts of heat at 70C.

All those specs, and compared to a more thermally efficient, cheaper waterblock it sucks.
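The per-pipe figure above implies a quick sanity check. This is just the thread's own numbers plugged into a two-line calculation (illustrative only; the 50-60W per pipe and the review's ~320W board power are values quoted in this thread, not measurements):

```python
import math

# Figures quoted in this thread (not measured): GTX 480 board power and
# a rough per-heatpipe capacity in ideal, vertical conditions.
board_power_w = 320
for watts_per_pipe in (50, 60):
    pipes_needed = math.ceil(board_power_w / watts_per_pipe)
    print(f"At {watts_per_pipe} W/pipe: ~{pipes_needed} pipes for {board_power_w} W")
```

That lands at 6-7 pipes to soak the full board power, which is more than a typical stock cooler carries, so the back-of-envelope agrees with the point being made.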


----------



## LAN_deRf_HA (Mar 28, 2010)

I think they did the heat pipes right, but the overall cooler size wrong. For a gpu that hot they just need more metal for those pipes to dump into.


----------



## R_1 (Mar 28, 2010)

Yep, more surface area is needed.


----------



## pantherx12 (Mar 28, 2010)

Not much extra they could have added, to be honest






Bigger fan perhaps.

Nvidia should have done what ATI did and made the cards long, to allow more space for bigger heatsinks.

Again, it's not the heatsink, it's the card/GPU design.


----------



## LAN_deRf_HA (Mar 28, 2010)

pantherx12 said:


> Not much extra they could of added to be honest
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/cooler4_small.jpg
> 
> Bigger fan perhaps.



If they gave up on venting out the back, they could have spread the cooler out along the entire length of the card and used dual top fans like many aftermarket coolers.


----------



## Steevo (Mar 28, 2010)

http://www.frostytech.com/articleview.cfm?articleid=2480&page=5

I don't know how much clearer it can get: 14.1C above ambient room temp with 10 heat pipes, in perfect conditions, at 150W load. 

So double it, assuming it still maintains perfect efficiency even though heatpipes lose efficiency at higher temps; I know the heat-loading ability becomes greater due to the larger differential. 

That's 28.2C above ambient at 300W load, in perfect conditions. So mount your GPU upside down to maintain correct orientation for the heatpipes, keep it in open air, at max fan RPM. 


My max GPU temp was 42C under water. So I still win.
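The doubling step above is just the linear lumped model: at a fixed thermal resistance R_theta (degrees C per watt), the temperature rise scales with power, dT = P * R_theta. A minimal sketch using the Frostytech data point quoted above (illustrative only; it assumes the heatpipes stay equally effective at the higher load, which the post itself flags as optimistic):

```python
# Linear lumped thermal model: delta_T = power * R_theta.
# R_theta is derived from the Frostytech data point quoted above
# (14.1 degC over ambient at a 150 W load); illustrative only.
r_theta = 14.1 / 150.0  # degC per watt

for power_w in (150, 300):
    print(f"{power_w} W -> {power_w * r_theta:.1f} degC above ambient")
```

This reproduces the 14.1C and 28.2C figures in the post.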


----------



## R_1 (Mar 28, 2010)

pantherx12 said:


> Not much extra they could of added to be honest
> 
> ...



They can, if an additional back plate is added and connected to the main heat-sink with a heat-pipe.


----------



## pantherx12 (Mar 28, 2010)

So what you're suggesting is that the 480 should be another $75 more expensive then, guys?

To get the cooling it needs?

In my opinion it's already too expensive XD


----------



## Fourstaff (Mar 28, 2010)

Nvidia is keeping load temps high so that they can keep your coffee warm and cook pot noodles. Nuff said.


----------



## Steevo (Mar 28, 2010)

So: 6 heatpipes for the front, two for the back, two fans (one on front, one on back), an extra heatsink on the back. Now we've gone from 50dB and 96C to 40dB and 80C, it costs an extra $100, and it doesn't allow SLI because of the extras.


Great, sign me up! It wasn't a winner before, but with all that extra bling how can anyone refuse?


----------



## pantherx12 (Mar 28, 2010)

Steevo said:


> So: 6 heatpipes for the front, two for the back, two fans (one on front, one on back), an extra heatsink on the back. Now we've gone from 50dB and 96C to 40dB and 80C, it costs an extra $100, and it doesn't allow SLI because of the extras.
> 
> 
> Great, sign me up! It wasn't a winner before, but with all that extra bling how can anyone refuse?




Glad you're getting it


----------



## Mike89 (Mar 28, 2010)

Good review. Still has 1280x1024 comparisons. Since I'm still a 19" LCD monitor user (and plan to be for quite some time), I look for reviews that haven't forgotten about me.

This card I'm not really impressed with. Power demands, heat issues and noise issues put this card over the top for me, I don't care how good it is. It's just not something I want to deal with. Since I still game at 1280x1024, my GTX 280 has plenty of teeth left, so I'm not too worried about my gaming situation right now. When that time comes, I will seriously consider switching to ATI unless nvidia comes up with something better than this (for the reasons I stated).


----------



## Steevo (Mar 28, 2010)

I still stand by my idea: a single closed-loop integrated water cooler. Maybe 70C temps still, but much better than 96C and probably close to the same price. 


I'll take a guess that these cards won't last beyond two years, from thermal stress issues.


----------



## btarunr (Mar 28, 2010)

pantherx12 said:


> Not much extra they could of added to be honest
> 
> http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/cooler4_small.jpg
> 
> ...



Contradictio in terminis. Blue: you agree this cooler is insufficient for this GPU. Which means yes, this is a bad cooler. Red: and then you blame it on the card design/GPU.

I'll say it again. Hemlock's (Radeon HD 5970's) cooler is keeping far more hot components cool than what Fermi's cooler is handling. It's cooling two GPUs with 2 billion transistors each, two sets of 3+2+2-phase digital PWM VRM, 16 memory chips, and still ends up cooler. That implies it's better by design.


----------



## pantherx12 (Mar 28, 2010)

Steevo said:


> I still stand by my idea, a single closed loop integrated water cooler. Maybe 70C temps still but much better than 96C and probably close to the same price.
> 
> 
> I will take a guess these cards won't last beyond two years from thermal stress issues.




I agree actually.

Nvidia, if they were smart, would make the heat-sink a non-profit part and just charge the consumer what it costs them to implement on the card.

That way the price would barely take a knock.

unless they're already doing this 



*edit*
BTA, I'm going to try to explain one more time why you are wrong.

Firstly, a contradiction? No way, you obviously can't read. When did I say the cooler WAS sufficient for the job? I was merely correcting your statement that the cooler is BAD, which it is not.

Secondly:

Two under-clocked 5870 GPUs produce less heat than a single 480; that, combined with having a LOOOOOOOOOOOOOOOOOOOOONG card (and thus space for an epic cooler), allows the 5970 cooler to perform better.

So once again, it's the GPU at fault here, and Nvidia's choice to make the cards short.

If you can't understand that, I won't even bother to reply to any subsequent posts you make lol



OOOOOOHHHH, thought of an even simpler way to explain it. I won't even use technical terms.

Item 1 produces 100 heat 
Item 2 produces 150 heat

Item 1 is in a nice BIG room, so it has a nice big radiator attached to it. : ]

Item 2 is in a SMALL room, so unfortunately it has to settle for a small radiator : [


Now which item will be hotter, BTA?

And is it the fault of the radiator, or is it due to the higher original heat output and the space available FOR the radiator?


----------



## Steevo (Mar 28, 2010)

If I throw a ball to you, which way is it going?

A) The ball is coming to you.
B) The ball is going away from me.

Either way is correct.



Is Fermi too hot? Yes
Is it the fault of the cooler they chose? Yes
Is it nvidias Fault? yes


The cooler sucks, the card sucks, the heat sucks, and it sucks too much power.


----------



## pantherx12 (Mar 28, 2010)

Steevo said:


> Is Fermi too hot? Yes
> Is it the fault of the cooler they chose? Yes
> Is it nvidias Fault? yes



Fixed

Is Fermi too hot? Yes
Is it the fault of the cooler they chose? *No*
Is it nvidias Fault? yes

Its the fault of the GPU being hot as fuck


----------



## TheMailMan78 (Mar 28, 2010)

pantherx12 said:


> OOOOOOHHHH thought of an even simpler way to explain it. I won't even use technical terms.
> 
> Item 1 produces 100 heat
> Item 2 produces 150 heat
> ...



Wait, what happened to item 1?


----------



## El_Mayo (Mar 28, 2010)

Does it matter?


----------



## pantherx12 (Mar 28, 2010)

TheMailMan78 said:


> Wait what happen to item 1?



A typo, cheers for pointing it out.


----------



## TheMailMan78 (Mar 28, 2010)

pantherx12 said:


> A typo  cheers for pointing it out.



So item 1 was a typo?


----------



## pantherx12 (Mar 28, 2010)

Well I fixed it now, so just scroll up.


----------



## TheMailMan78 (Mar 28, 2010)

pantherx12 said:


> Well I fixed it now, so just scroll up.



I know Panther. I was just f@#kin with ya.

Anyway, I think you are missing everyone's point....

Steevo and Bta, correct me if I'm wrong.....

Basically, the GPU could be as hot as the sun and a decent cooler would keep it cooler than Fermi's stock cooler does. The thermal output of the GPU is irrelevant in this case.

Bta gave you an example: the 5970 cools TWO GPUs, among other things, and STILL manages to cool them better than Fermi's stock HS does. So I'll use your terms.

Item 1 produces 200 heat
Item 2 produces 150 heat

Item 1 is in a SMALL room and cooled with a big radiator. The heat is now down to 100 heat.
Item 2 is in a BIG room and cooled with a big radiator. The heat is now down to 130 heat.

Whats to be concluded? Fermis cooler sucks.


----------



## pantherx12 (Mar 28, 2010)

Two GPUs producing less heat, mailman, with a much bigger cooler.

As I said, it's a combination of the space available for the heat-sink and the GPU producing metric shit tons of thermal energy.

As I said, it's the best stock cooler I've seen in a long while : ]

It simply can't keep up with 300W. Does that make it a bad GPU cooler? No, because not many GPU coolers can dissipate that much heat.


----------



## TheMailMan78 (Mar 28, 2010)

pantherx12 said:


> Two gpus producing less heat mailman, with a much bigger cooler.
> 
> As I said its a combination of space available for the heat-sink and the gpu producing metric shit tons of thermal energy.
> 
> ...



And its also cooling a LOT more.


----------



## jasper1605 (Mar 28, 2010)

TheMailMan78 said:


> And its also cooling a LOT more.



what's the power draw of the 5970?


----------



## TheMailMan78 (Mar 28, 2010)

jasper1605 said:


> what's the power draw of the 5970?



It's less. However, the 4870 X2 pulls more and also stays cooler.


----------



## LAN_deRf_HA (Mar 28, 2010)

You design a cooler for how hot your GPU is, and they didn't do a great job of that, as people have pointed out changes that could have increased cooling performance. Instead they chose to just put an insane-ass fan in there that can ramp up to crazy levels. You can blame the GPU design, but the fact is, once you know what the thermal demands of the GPU will be, you need to design a cooler that can keep up with it, which they only barely managed.


----------



## TheMailMan78 (Mar 28, 2010)

LAN_deRf_HA said:


> You design a cooler for how hot your GPU is, and they didn't do a great job of that, as people have pointed out changes that could have increased cooling performance. Instead they chose to just put an insane-ass fan in there that can ramp up to crazy levels. You can blame the GPU design, but the fact is, once you know what the thermal demands of the GPU will be, you need to design a cooler that can keep up with it, which they only barely managed.



Well, that is true; HOWEVER, that GPU is WAY too hot for what it does. Fermi itself is the most epic fail since the Spruce Goose.


----------



## pantherx12 (Mar 28, 2010)

LAN_deRf_HA said:


> , which they only barely managed.



Precisely 



Also, mailman, on the 5970 it's in fact cooling less 

And with a bigger cooler!


----------



## TheMailMan78 (Mar 28, 2010)

pantherx12 said:


> Precisely
> 
> 
> 
> ...


It's not that much less, and the cooler is not that much bigger.
Also, what about the 4870 X2?


----------



## pantherx12 (Mar 28, 2010)

Around the same total thermal energy produced, and the temperatures reflect that 

With extra space for a bigger heatsink, I might add.


"Radeon HD 4870 X2 is a daunting task. According to the Catalyst Control Center, the GPUs were as hot as 92-95°C under load. "

Quote from a xbitlabs review.
http://www.xbitlabs.com/articles/video/display/his-hd4870x2_4.html#sect0


----------



## TheMailMan78 (Mar 28, 2010)

pantherx12 said:


> Around the same thermal energy produced total, and the temperatures reflect that
> 
> With extra space for a bigger heatsink I might add.
> 
> ...



Yeah, that's where it can peak, but it doesn't stay there, unlike Fermi.


----------



## Velvet Wafer (Mar 28, 2010)

I wouldn't blame the cooler for not being able to cool this piece of MISengineering to adequate levels! Nvidia sadly wasn't able to bring a worthy opponent into the game, if you are concerned about more than performance. 
It's simply not balanced, to be true.
Not even an Accelero-style aftermarket cooler would be able to dissipate that much heat! 
And those things are known for being very close to watercooling.
Anyhow, the heated air they emit will warm up a room in no time.

If someone, someday, has a GTX 480 SLI setup, he may want to do the following experiment:
1. install both GTX 480s
2. close all windows and doors in the room where the PC is
3. turn off all heaters in that room, so their heat isn't added
4. fire up Furmark in SLI
5. let everything sit for 2-3 hours; meanwhile, leave the room, read a book, watch a movie, anything that will kill the time
6. after that time, remove your clothes, besides your underwear, and enter the room
7. surprise! everything is warm like in Zimbabwe! Your Africa Thermal Condition Simulator is ready!
8. = Profit.


I hope the GTX 580 will be really strong, efficient, and silent, so ATI has to struggle and lower their prices again!
Consumers would get better prices if there was a head-to-head race; this current ATI domination hurts us Red teamers more than it pleases us


----------



## a_ump (Mar 29, 2010)

i agree with everyone  lol. And it will be interesting to see what the GTX 580 will be. Hopefully just a die-shrunk, reworked, higher-clocked Fermi. But they're definitely going to have to increase shaders; even reworked, the HD 5870 is too close to Fermi's performance. So the GTX 580 will need a good bit of improvement, as a merely tweaked Fermi will just get walked on by the HD 6XXX series... i think haha


----------



## Super XP (Mar 29, 2010)

Steevo said:


> The cooler sucks, the card sucks, the heat sucks, and it sucks too much power.


  

Just in case nobody's aware, ATI's "NEXT GEN" design is going to destroy NVIDIA, so they'd better get Fermi 2 right.

If anybody thinks the HD 5970 is real fast, think about the X800XT vs. the HD 5970. Now replace the X800XT with the HD 5970 and replace the HD 5970 with ATI's Next Gen  we are talking about massive performance


----------



## a_ump (Mar 29, 2010)

so yea, i thought we had a lot of Diggs, but i googled the review and TPU isn't there until the first link on the 2nd page. Does Google go by diggs? Cause i personally feel that TPU does the most thorough reviews on the web, so it kinda surprises me that it isn't on the first page.


----------



## pantherx12 (Mar 29, 2010)

Why would Google go off diggs, man?

Having a high digg rating means it's high up in digg results XD


Google is based on traffic and frequency of search terms etc.


----------



## Robbaz (Mar 29, 2010)

I'm going to express my feelings with a video
http://www.youtube.com/watch?v=If0Bkfnifi4


----------



## hertz9753 (Mar 29, 2010)

Robbaz said:


> I'm going to express my feelings with a video
> http://www.youtube.com/watch?v=If0Bkfnifi4


----------



## @RaXxaa@ (Mar 29, 2010)

Six months later and it can barely outperform a 5870, let alone a 5890. 
The way it's meant to be!


----------



## OnBoard (Mar 29, 2010)

You sure went on about how to cool this. It's really easy: the Accelero Xtreme GTX 480 does it with ease. It will be the same as with the GTX 280, but with a moved contact point. Maybe they'll add a pipe just to be cool.

45C idle and 75C load is my prediction with the Accelero 

edit: forgot, not all of the 320W has to be cooled, just the core. That number includes the memory + voltage regulation parts.


----------



## Delta6326 (Mar 29, 2010)

http://tpucdn.com/reviews/NVIDIA/GeForce_GTX_480_Fermi/images/metro_2033_2560_1600.gif Metro 2033, really?

Also, nice review, and dang, the 480 is better than I was thinking, but it still uses a lot of power to do it, and with Cat 10.3 the 5870 was pretty close. Also, if you think about it, the 5870 is a better choice because it's cheaper. Yes, the 480 is faster per se, but in most games they're BOTH getting over 50+ frames and people won't tell the difference.


----------



## pantherx12 (Mar 29, 2010)

Probably needs a lot of video RAM.
That, and it being an Nvidia-optimised game, wouldn't help XD


----------



## btarunr (Mar 29, 2010)

pantherx12 said:


> I agree actually.
> 
> Nvidia if they were smart would make the heat-sink a non profit part and just charge the consumer how much it cost them to implement on the card.
> 
> ...



You have great difficulty understanding simple things. Granted, hypothetically this cooler would ace on a Radeon HD 5870, giving better cooling (although in reality it won't even fit), but that doesn't make it the "best cooler", or even a good one at that. If it were the best cooler, it would at least work well on the GPU it is designed to cool. Since the GPU is ending up >90C for even basic gaming, this cooler isn't doing the job it's meant for. Hence it is not the "best cooler" by a long shot. 

Don't use the "bigger/smaller" argument; we're talking "best" here. So don't throw semantics at me. Size and weight won't count in the argument. Radeon HD 5970's cooler is able to cool way more components (combined TDP higher), and ends up with lower temperatures. Hence it's better than the GTX 480's cooler, if not "the best" cooler. 

Even otherwise, I suspect that Fermi actually weighs more than the Radeon HD 5970, by the looks of it. I'm searching for a review with such weight measurements. That would nail your "but it's bigger" argument.


----------



## imperialreign (Mar 29, 2010)

TheMailMan78 said:


> Yeah thats where it can peak but doesnt stay unlike Fermi.



Regarding the 4870 X2, I can attest to that . . . I have two of those beasts spooning in my setup, all 4 GPUs OCed, and none of the 4 GPUs stay that hot under short to moderate workloads . . . _only_ if they're working their tails off for a couple of hours straight (only gaming) do they settle in at 90C+

But regarding their coolers, the 4870 X2's setup is actually *pathetic* and still cools better, IMHO - just for comparison:







_(inside the fan shroud, with fan removed)_






. . . and it somehow manages to keep two RV770s, 2GB of GDDR5, the PCIe chip, and all the other nuclear PCB components happy . . .


What do I see wrong with the 480's cooler?  It appears to be aluminum fins + HP - copper would've been a more _efficient_ choice . . . although heavier, copper is more willing to give up heat than aluminum is.  Also, the anodized coating negatively affects cooling efficiency as well . . . is it a major impact?  Not typically; but when you're dealing with components that become so hot so quickly, an item designed specifically to cool an extremely hot component that must operate in an extremely enclosed space, plus a rather poor choice of metal to begin with, I'm sure it's probably doing more harm than good.  The cooler could make do with more fins, too . . .

Either way, it really looks like the cooler was designed more for show than go . . . it looks pretty and all, but it's simply not cut out for the work it must endure.


----------



## pantherx12 (Mar 29, 2010)

God, BTA, it's like you can't even read, or you read what you want to read.

When did I say it was the "best cooler"?

Or did I say it was probably the "best stock cooler"?


Thought I'd point something very important out: I am only disagreeing with you implying (and then later confirming) that you think the cooler is BAD. That is all.
The cooler is actually good for a stock cooler, and if it were fitted to any other card people would be ranting and raving. That would make it in fact a GOOD stock cooler.
See what I mean now?


Also, BTA, it looks to me like the 480 uses 55mm mounting holes, thus the cooler would fit on a 5870 

Or hell, switch it round and stick a 5870 cooler on the 480 and watch the card fry.


Regardless of how right you think you are, you are wrong.

By your logic, metric shit tons of aftermarket coolers are now bad coolers thanks to Fermi : [


Also, Bta, for the last time: the 480 GPU by itself outputs MASSIVE amounts of thermal energy, easily nearly as much as everything the 5970 has to cool (but with less room for cooling!)

God damn, what a roundabout!




Nonsensical order to this post, sorry folks! I've not slept XD


Oh, by the way, guy above me: copper is less willing to give up heat. Thought I'd clear that up. It's brilliant at soaking up heat though : ]

also 

"they're working their tails off for a couple of hours straight (only gaming) do they settle in at 90C+"

You realise that to get temperatures in reviews they often use benchmarking programs that really stress the GPU, right? Thus it's under 100% load constantly.


----------



## btarunr (Mar 29, 2010)

Of course we're talking about stock coolers, isn't that a given? Otherwise wouldn't I pull out some Arctic Cooling Accelero Xtreme numbers? Smell the coffee.


----------



## imperialreign (Mar 29, 2010)

pantherx12 said:


> By the way, guy above me copper is less willing to give up heat. thought I'd clear that up.




No.


Copper is more thermally conductive, and therefore dissipates heat better, over longer periods of time, from the component it's trying to cool.  

What makes it generally confusing is that if you were to remove an aluminum or a copper HS from a component, the alum would return to room temp faster than the copper - copper is a denser metal, so it retains more heat during cool-down . . . 

Also, because of its density, copper is able to absorb more heat from the component, whereas aluminum can't - if you have sufficient airflow across the cooler, copper will outperform aluminum every time.

Dissipation of heat is simply the opposite of absorption . . . that's where the fan comes in, to help with this process.  The copper dissipates _more_ heat than alum because it absorbs _more_ heat than alum.

They could've gone with a hybrid-style cooler (copper base/HP + aluminum fins), which would give _better_ cooling efficiency than the current setup while not being as heavy as a full-blown copper cooler . . . but copper still regularly out-performs the vast majority of hybrid coolers as well.
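The conductivity argument above can be put in numbers with Fourier's law for conduction through a slab, Q = k * A * dT / d. A minimal sketch with standard handbook conductivity values; the plate geometry and temperature drop are assumptions picked purely for illustration, not measurements of any actual cooler:

```python
# Fourier's law through a cooler base plate: Q = k * A * dT / d.
# Conductivities are standard handbook values; geometry is assumed.
k_w_per_m_k = {"copper": 401.0, "aluminum": 237.0}
area_m2 = 40e-3 * 40e-3   # assumed 40 mm x 40 mm contact patch
thickness_m = 5e-3        # assumed 5 mm base plate
delta_t = 20.0            # assumed 20 degC drop across the base

for metal, k in k_w_per_m_k.items():
    q = k * area_m2 * delta_t / thickness_m
    print(f"{metal}: ~{q:.0f} W conducted through the base")
```

Same geometry, same temperature drop: the copper base conducts roughly 70% more heat, which is the "copper outperforms aluminum given sufficient airflow" point made above.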


----------



## pantherx12 (Mar 29, 2010)

btarunr said:


> Of course we're talking about stock coolers, isn't that a given?



Not really, given the discussion we've been having 


Anyway, fact of the matter is it's not a bad GPU cooler.

Stick it on any other card (and it would fit) and it would perform better than whatever stock cooling is already on.

( single gpu)




Also, poster above me, mind if I reply in the morning? I'm tired as hell ha ha

Too tired to run around finding info etc.

But for a quick example, check out reviews of the True 120 Extreme and the copper version. The difference would be greater if copper could dissipate heat as fast as it can absorb it.


----------



## btarunr (Mar 29, 2010)

pantherx12 said:


> Not really given the discussion we've been having
> 
> 
> Anyway fact of the matter is its not a bad GPU cooler.
> ...



No, it's a given; use common sense. If I were talking about "the best" cooler, I'd be talking about Kingpin LN2 evaporators. I was talking about stock coolers all this while; it's like you're the one who can't read or make things out in context.


----------



## pantherx12 (Mar 29, 2010)

btarunr said:


> No it's a given, use commonsense. If I were talking about "the best" cooler, I'd be talking about Kingpin LN2 evaporators. I was talking about stock coolers all this while, it's like you're the one who can't read or make things out in contexts.





Common sense goes out the window when someone disagrees with science and logic


----------



## btarunr (Mar 29, 2010)

pantherx12 said:


> Common sense goes out the window when someone disagrees with science and logic



And yours is critically flawed.


----------



## pantherx12 (Mar 29, 2010)

btarunr said:


> And yours is critically flawed.




Not by a long shot.

In fact, my reasoning and logic have been described as "off the scale" before now.




Seriously, stop trying to insult me and get back on topic.

If this cooler were on a 5870, would it be considered a good stock cooler? (a copper plate would be added to the bottom of the cooler to simulate an IHS)


----------



## imperialreign (Mar 29, 2010)

pantherx12 said:


> Also poster above me, mind if I reply in the morning I'm tired as hell ha ha
> 
> To tired to run around finding info etc.
> 
> But for a quick example check out reviews of the true 120 ex and the copper version. The difference would be greater if copper could dissipate heat as fast as it could absorb heat.





You can reply in the morning, no problem . . . but it still won't change the fact that copper is a better cooling solution than aluminum.

It's simply one of the basics of physics - energy in must equal energy out.


----------



## pantherx12 (Mar 29, 2010)

imperialreign said:


> It's simply one of the basics of physics - energy in must equal energy out.




I'm not debating that at all; however, the rate at which something loses its thermal energy is important.

I need Alex p  He's better at explaining it than me, but check out all the thermal properties of copper, you'll find some interesting numbers : ]


----------



## DaedalusHelios (Mar 29, 2010)

Dear pantherx12 and btarunr,

                      I regret to inform you both that you have been in disagreement through a 
simple miscommunication. Btarunr has been saying the cooler is bad because 
it barely handles its intended application. Pantherx12 you are judging the 
cooler by itself and not by the single action of cooling Fermi. 

PS. You are both right but arguing different things.


----------



## pantherx12 (Mar 29, 2010)

DaedalusHelios said:


> Dear pantherx12 and btarunr,
> 
> I regret to inform you both that you have been in disagreement through a
> simple miscommunication. Btarunr has been saying the cooler is bad because
> ...




I figured that out quite a few posts back 

Cheers though, hopefully it clears things up for other people.


----------



## btarunr (Mar 29, 2010)

pantherx12 said:


> Not by a long shot.
> 
> In fact my reasoning and logic have been described as " off the scale" before now.



Leave others to judge that, trolls judge themselves. There are simple arguments presented to you. 

There is a stock cooler (Radeon HD 5970) that handles way more components, and keeps them cooler. Doesn't that make it better than the one on the GTX 480? 
Who cares if it's longer/bigger/wider? We're talking about which is better, and the one on the HD 5970 is handling a higher thermal load, and ending up cooler.
A GeForce GTX 480 cooler won't fit a Radeon HD 5870, just as the HD 5970 cooler won't fit it, so we're left to hypothesize; and the HD 5970's cooler would do a better job, since it would be handling half its usual thermal load.
A good VGA cooler keeps temperatures down at acceptable levels; the one on the GTX 480 doesn't. It's worse than the HD 5970's cooler in this aspect. The immediate repercussions of GPUs running at 90°C fall on overclocking potential. For example, look at EVGA's product lineup: the out-of-the-box overclocking is unusually low by NVIDIA's/EVGA's standards, and for a mere 7% higher clock speed (FTW variant), EVGA immediately jumps to water-cooling (there's no air-cooled FTW variant).



pantherx12 said:


> Seriously stop trying to insult me and get back on topic.



Among us, you're the one who started with "You're highlighting how little you know about cooling here BTA", and later "can't you read". If I took that as an attempt to insult, I would probably have banned you. When I reply with "maybe you're the one who can't", it's better you not take it as an insult. If you do, you shouldn't have "can't you read" me first. 



pantherx12 said:


> If this cooler was on a 5870 would it be considered a  good stock cooler? ( a copper plate would be added to the bottom of the cooler to simulate a IHS)



Probably, but it's not. A stock cooler is designed to meet the needs of the specific CPU/GPU it ships with. Since this cooler is unable to keep the GPU it's designed for at comfortable temperatures, it's not a good cooler.


----------



## imperialreign (Mar 29, 2010)

pantherx12 said:


> I'm not debating that at all, however the rate at which something loses thermal energy is important.
> 
> I need Alex p  He's better at explaining it than me, but check out all the thermal properties of copper, you'll find some interesting numbers : ]




Exactly, and the rate at which a metal gives off thermal energy is equivalent to the rate at which it absorbs thermal energy.

Cu absorbs larger amounts of thermal energy over a given period of time than Al is capable of . . . if you were to have both an Al and Cu HS at room temperature, and apply an equivalent amount of heat to them, you would find that the Al HS will warm up faster than Cu will . . . the converse is true as well, remove that source of heat, and Al will return to room temperature faster than Cu.

Does this mean that Cu is a less-efficient cooler material than Al?  No, quite the contrary.  

The density of the metal has a lot to do with the _amount_ of thermal energy the metal is absorbing.  Cu is absorbing more thermal energy than Al, and when applied to cooling a component, this means that Cu is absorbing more thermal energy being given off by the component, and therefore releasing more thermal energy as well.

This is basic physics, man.


----------



## btarunr (Mar 29, 2010)

DaedalusHelios said:


> Dear pantherx12 and btarunr,
> 
> I regret to inform you both that you have been in disagreement through a
> simple miscommunication. Btarunr has been saying the cooler is bad because
> ...



I'm arguing on his plane. Not only am I saying that "it barely handles its intended application", but also "there's another stock cooler that handles a higher thermal load, and stays cooler, so is better".


----------



## Grings (Mar 29, 2010)

Has anyone tested how hot Fermi gets when manually setting the fan to 100%?

I really don't think a T-Rad or Accelero Xtreme style cooler would work very well here; with heat levels like this you NEED to be exhausting heat out of the rear... I wouldn't be surprised if aftermarket coolers for this card all have big-ass ducting systems to get more heat out of the case.

I wonder if we'll see more of those cases with 'wind tunnels' for the graphics cards soon?


----------



## pantherx12 (Mar 29, 2010)

"aluminum = 0.896 kJ per kg per Kelvin
copper = 0.383 kJ per kg per Kelvin"


BAM!

There's the interesting numbers I was after : ]


There's huge debates on Copper Vs Aluminium so this isn't the place to do it, another thread perhaps fella? 


Bta I found it astounding you would even consider banning someone for such reasons : /

Regarding everything else I won't reply, someone has already pointed out we're arguing over nothing essentially.

I will say though "Who cares if it's longer/bigger/wider?" A lot of people actually lol I'll leave the dick joke for someone else though.



Just saw this in your post further up: "there's another stock cooler that handles higher thermal load, and stays cooler, so is better". We obviously disagree on what makes something better, then.

To me, if the cooler were the same size yet performed better, it would be better; if it's bigger and performs better, it's simply scale.


----------



## Wile E (Mar 29, 2010)

So you are both arguing semantics at this point? 

Seriously, let it drop you 2. It's kinda getting silly now. lol


----------



## pantherx12 (Mar 29, 2010)

Wile E said:


> So you are both arguing semantics at this point?
> 
> Seriously, let it drop you 2. It's kinda getting silly now. lol





Aye True enough, kept me occupied today though  



BTA I'll drop it if you agree to drop it too?

If you want to argue further we can do it via PM should you like


----------



## a_ump (Mar 29, 2010)

i'm no genius and haven't even taken physics or anything yet, but i gotta say imperialreign sounds dead on. 

I see BTA's point and panther's as well... i think lol. The GTX 480's stock cooler sucks because, for its use, it is more inefficient than other stock coolers. If you're just looking at the heatsink and nothing else, then one could definitely say it's probably the best stock cooler by design that we've seen from nvidia. Sorta like... hmmmm, analogy... if you have a 17/32 nut for an 18/32 bolt, is the 17/32 nut bad? No, it's just not the right component for the job. Will it still work? Fuck if i know lol, it'll be a pita if it does, just like Nvidia's heatsink.


----------



## DaedalusHelios (Mar 29, 2010)

btarunr said:


> I'm arguing on his pane. Not only am I saying that "it barely handles its intended application", but also "there's another stock cooler that handles higher thermal load, and stays cooler, so is better".



But he was also bringing the "in the same size constraint" comment into the discussion. I am not sure how that would change the overall outcome of the measurement, but it is worth mentioning if you are in fact talking about the same thing. I think we have reached a point of escalation where we might need to have you two agree to disagree on the subject. It won't change the facts on the table regardless.


----------



## btarunr (Mar 29, 2010)

pantherx12 said:


> I will say though "Who cares if it's longer/bigger/wider?" A lot of people actually lol I'll leave the dick joke for someone else though.



We're talking about "the best", so in that argument it doesn't count which is bigger. A lot of people can't actually run a GTX 480 in their systems; a lot of cases can't handle its heat. Hell, NVIDIA tells users they shouldn't do SLI in just any case (chassis) and recommends a "certified cases" list. 




pantherx12 said:


> To me if the cooler was the same size yet performed better it would be better, if its bigger and performs better its simply scale.



To me better is better. HD 5970's cooler is the best stock cooler since it's handling higher thermal load and staying cooler.  GTX 480's cooler is not better than that of the HD 5970. You already acknowledged it should have been longer (so you concede it's inefficient at its size).


----------



## btarunr (Mar 29, 2010)

DaedalusHelios said:


> But he was also bringing the "in the same size constraint" comment into the discussion. I am not sure how that would change the overall outcome of the measurement, but it is worth mentioning if you are in fact talking about the same thing. I think we have reached a point of escalation where we might need to have you two agree to disagree on the subject. It won't change the facts on the table regardless.



No, he just asserted it's the best stock cooler without "size to size", in that he's wrong.


----------



## pantherx12 (Mar 29, 2010)

True if I had said that, however I did specifically state for single GPU.

In that, I'm probably right eh?


Infact my exact wording was

"the cooler is EASILY the best stock ( I.E from nv and ati) cooler I've seen for a single GPU card."

Which pretty much covered my ass if there's a better single GPU stock cooler as well, by stating that its the best single gpu cooler *I have* seen : ]



All cleared up now?



*edit* I want to apologise to everyone that didn't find the discussion amusing/enlightening in some way; I should have made my points and left, or just kept editing one post! 
When it comes to arguments/discussions I get pretty passionate.

I'm sure a lot of you can understand that though.


----------



## btarunr (Mar 29, 2010)

pantherx12 said:


> True, however I did specifically state for single GPU.
> 
> In that, I'm probably right eh?



Wrong. Size to size, a GeForce 210 passive heatsink that weighs as much as a few coins and is as big as a baseball card keeps its GPU cooler. Hence, size to size, it's better. You said "best stock cooler" without stating "for single GPU". 

It was here where you made that assertion, without any "single GPU" clause.


----------



## pantherx12 (Mar 29, 2010)

Bta I'm not even going to reply, this is getting silly even by my standards.

I'll probably clash with you in another thread sometime though! : ]

Cheerio!


----------



## btarunr (Mar 29, 2010)

You made direct assertions, backed those assertions when you could, took tangents (but it's bigger/ size-to-size / single-GPU / but I said xxx ) when you couldn't, and now you're defining standards. Good job.


----------



## OnBoard (Mar 29, 2010)

Grings said:


> Has anyone tested how hot Fermi gets when manually setting the fan to 100%?
> 
> I really don't think a T-Rad or Accelero Xtreme style cooler would work very well here; with heat levels like this you NEED to be exhausting heat out of the rear... I wouldn't be surprised if aftermarket coolers for this card all have big-ass ducting systems to get more heat out of the case.
> 
> I wonder if we'll see more of those cases with 'wind tunnels' for the graphics cards soon?



They'd need earplugs to test 100% 
_"This picture was taken during the temperature testing shown above and it shows the GeForce GTX 480 screaming along at 67dB at 85% fan speed, which is ~4200RPM. This is not a silent video card by any means."_ -LegitReviews

I've read just a couple of tests, no 100% mention in them, probably because it's already too loud below 100%.

As for 'exhausting heat out of the rear': one PCI-slot hole versus 3x 120mm holes  People have been obsessing for years about coolers pushing hot air out of the rear. If the only thing you have is one 120mm exhaust fan (or worse) then yes, a stock-type cooler is best. I'd rather get a new case first, one in which you don't have to worry about what sort of cooler is in it, than get a worse-performing rear-exhaust cooler.

For me it's like this: a 30C cooler GPU > a 2C hotter CPU. I have 250W of heat coming out of the case while gaming and you can easily feel it by placing a hand over the top of the case. The difference is that it's not burning, like it is with a stock cooler's exhaust, just lukewarm.

And no, there won't be any wind tunnels. ATI and NVIDIA need to start making dual/quad cores and so on, like the CPU makers have been. Make it simpler and use more of them. We're now at the limit where a 2-slot cooler simply doesn't work anymore. Unless they make the stock cooler 3 slots wide, this is as far as it goes without improvements in energy consumption.

edit: ATI example; an HD 5750 CrossFire setup outperforms an HD 5850 and costs less. HD 5770 CrossFire outperforms an HD 5870 and costs less. The CrossFire versions consume just a little more power, but if you made a dual core out of them I'm sure it would be less. A quad-core 5770 would kick even a GTX 495's butt, consume less wattage and cost less than ONE GTX 480.


----------



## shevanel (Mar 29, 2010)

its called a 5970 lol

damn nvidia.. why did you have to go mess everything up.. now prices will continue to suck.

would be nice to see a 2gb 384bit 5890


----------



## OnBoard (Mar 29, 2010)

shevanel said:


> its called a 5970 lol



That costs way more than anything out there and isn't really even available. A dual core isn't the same as a GPU sandwich, even if you make them sit close to each other on a gaming-class card 

I want to see two GPUs on same silicon. One for low end, two for performance and four for high end. Then top that with SLI/Crossfire if you are not happy.


----------



## btarunr (Mar 29, 2010)

shevanel said:


> would be nice to see a 2gb 384bit 5890



Can't get 2 GB with 384-bit, but 3 GB is possible.


----------



## W1zzard (Mar 29, 2010)

added statement from nvidia to the conclusion, regarding power consumption


----------



## laszlo (Mar 29, 2010)

many debates i see about heat, cooler, design... etc....

someone correct me if i'm wrong, but it's not the cooler's fault.

every gpu or cpu gets hot due to power leakage 

a perfect gpu or cpu has minimal power leakage with minimal heat output

the fact that fermi is hot is a design flaw, as TSMC uses the same production line for amd too.

we won't see any improvement in future revisions because there are too many transistors packed in, and the design won't improve unless they cut the transistor count, which implies a new gpu in my opinion.

i think they pushed the limit of current technology with their monolithic big-die design; their approach seems viable but not perfect


----------



## Wile E (Mar 29, 2010)

W1zzard said:


> added statement from nvidia to the conclusion, regarding power consumption



So, did you send them a retort telling them that they are bold faced liars?


----------



## btarunr (Mar 29, 2010)

In my opinion they just altered the definition of "max board power" to suit their convenience.


----------



## Wile E (Mar 29, 2010)

btarunr said:


> In my opinion they just altered the definition of "max board power" to suit their convenience.



Yep. AKA: Bold faced liars. lol.


----------



## OnBoard (Mar 29, 2010)

btarunr said:


> In my opinion they just altered the definition of "max board power" to suit their convenience.



I think with Cryostasis (that got my card really hot) or some other games with PhysX you'd go over the 250W too. I don't remember if Batman AA had the same impact.

---

W1zzard: did you get GTX 470 or is that April 12th time?


----------



## TheMailMan78 (Mar 29, 2010)

WOW this thread now reminds me of the old Abbott and Costello skit "Who's on first?"


If you have never seen it WATCH IT.


----------



## W1zzard (Mar 29, 2010)

and added more clarification from nvidia


no i didn't get a gtx 470 yet, right now board partners have absolutely no cards to give away to the press (or sell), and nvidia didn't give me a 470


----------



## TheMailMan78 (Mar 29, 2010)

W1zzard said:


> and added more clarification from nvidia
> 
> 
> no i didn't get a gtx 470 yet, right now board partners have absolutely no cards to give away to the press (or sell), and nvidia didnt give me a 470



So Nvidia claimed 295 and you pulled 320? Yeah, I'm with Wile E on this one. That's just a bold-faced lie. I would be SO pissed off if I bought one of these and my PSU couldn't take the load.


----------



## W1zzard (Mar 29, 2010)

TheMailMan78 said:


> So Nvidia claimed 295 and you pulled 320? Yeah I'm with Wile E on this one. Thats just a bold face lie. I would be SO pissed off if I bought one of these and my PSU couldn't take the load.



nvidia claims 250 W "TDP (max board power)" which means "we measure 3-4 games and take the highest single reading"


----------



## Wile E (Mar 29, 2010)

TheMailMan78 said:


> So Nvidia claimed 295 and you pulled 320? Yeah I'm with Wile E on this one. Thats just a bold face lie. I would be SO pissed off if I bought one of these and my PSU couldn't take the load.



Yep. They can PR and spin it all they want. The fact of the matter is, this card is capable of consuming over 300W in stock form. I'm happy to see W1z sticking to his guns.


----------



## btarunr (Mar 29, 2010)

You know what's interesting: they continue to recommend a 600W PSU for systems with even a single GTX 480. Minus the graphics card, the average system (whatever is inside the case) draws around 150W. So even with the crappiest 600W PSU you can find (75% efficiency, so ~450W usable, minus 150W), that's asking consumers for 300W of headroom for a single card, and a whopping 450W of room when the PSU peaks (cheapo manufacturers market peak wattage as the rated wattage; good quality ones market continuous power).
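The headroom arithmetic above can be sketched in a few lines (a rough illustration using only the post's own figures, and treating the 75% efficiency as a straight derating on the label wattage, the way the post does):

```python
# Sketch of the PSU headroom argument above, using the post's numbers:
# a "600 W" PSU derated to 75% (the post's cheap-PSU efficiency figure),
# minus ~150 W for everything in the system except the graphics card.
PSU_LABEL_W = 600
DERATING = 0.75          # the post treats cheap-PSU efficiency as a derating
SYSTEM_DRAW_W = 150      # average system draw minus the graphics card

def card_headroom(label_w, derating, system_w):
    """Watts left over for the graphics card alone."""
    return label_w * derating - system_w

print(card_headroom(PSU_LABEL_W, DERATING, SYSTEM_DRAW_W))  # 300.0 W left for the card
```

Which is roughly what a single GTX 480 was measured pulling at its worst, hence the post's point about the 600W recommendation.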


----------



## PopcornMachine (Mar 29, 2010)

FYI, it's not "Bold-Faced" but  "Bald-Faced" Liar: 

_The correct term is bald-faced, and refers to a face without whiskers. Beards were commonly worn by businessmen in the 18th and 19th century as an attempt to mask facial expressions when making business deals. Thus a bald-faced liar was a very good liar indeed, and was able to lie without the guilt showing on his face._


----------



## shevanel (Mar 29, 2010)

421W was AnandTech's total system load during Crysis. That's 102W more than a 5870.

Furmark was 479W, just a few watts short of the dual-GPU GTX 295.

I would think whatever the GTX 295 needs would be good enough for the 480?

For a single GPU this is sad, really. Sad because we have so much time to wait before we ever get to buy something from nvidia that is worth buying (next-gen toys)... god knows what this stepping-stone of a GPU will lead to... but I sincerely hope this type of product isn't going to be what I must get used to using if I want the "fastest". I care less about power draw and more about heat when performance isn't the topic of discussion, but in this case both issues are on the table, and quite major ones too.

For you guys like me that have been gaming since the Quake days... it's 2010, y'all... 2010. This is not progressive. (Especially since it's only _sometimes_ faster by 10-13%, and slower in other situations like BC2, which I shouldn't have to remind you is a hell of a lot more popular than Metro 2033.)

I don't complain and bitch to put nvidia down, I buy their shit too (I've always bought NV until this year)... I am just disappointed in what their choice for next-gen graphics is, and I was hoping for a lot more... well, on the bright side I get to keep some more money in my pocket.


----------



## imperialreign (Mar 29, 2010)

pantherx12 said:


> "aluminum = 0.896 kJ per kg per Kelvin
> copper = 0.383 kJ per kg per Kelvin"





Hold the spork, man!

Did you take the time to interpret those numbers?

per kg, it takes 0.896 kJ of energy to raise Al 1 K . . . and, per kg, it takes 0.383 kJ of energy to raise Cu 1 K . . .

Taking into account energy in must equal energy out, then we could easily say that it takes 0.896 kJ of energy to lower Al 1 K, and 0.383 kJ of energy to lower Cu 1 K . . .

So, kg for kg, it takes *less* energy to raise and lower Cu 1 K than it would to raise Al 1 K . . .

For a material to require less energy to raise or lower its temp, that says to me it's less _resistant_ (not a very scientific term here ) to the heat, and more willing to give off heat, which equates to being more thermally conductive.

But that's all kg for kg.  If you had a 2kg Al HS, it would be much larger than a 2kg Cu HS . . . and based on those numbers you posted, kg for kg, Cu more readily absorbs and dissipates heat, which means . . .

. . . it's still the better material for cooling.
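For anyone wanting to sanity-check the kg-for-kg arithmetic being argued here, a quick sketch using the specific heat values quoted earlier in the thread (ΔT = Q / (m·c)). Note this only covers heat *capacity*; thermal conductivity, which is where copper's advantage as a heatsink material actually lies, is a separate property:

```python
# Quick check of the specific-heat arithmetic from this thread.
# delta_t = Q / (m * c): how much 1 kJ warms (or cools) 1 kg of each metal.
# Specific heats as quoted in the thread (kJ per kg per Kelvin).
C_ALUMINUM = 0.896
C_COPPER = 0.383

def delta_t(energy_kj, mass_kg, specific_heat):
    """Temperature change for a given energy input (or output)."""
    return energy_kj / (mass_kg * specific_heat)

for name, c in (("aluminum", C_ALUMINUM), ("copper", C_COPPER)):
    print(f"1 kJ into 1 kg of {name}: {delta_t(1.0, 1.0, c):.2f} K")

# Copper's temperature swings more per kJ *per kg* -- but a copper heatsink
# of the same volume weighs ~3.3x more (densities ~8960 vs ~2700 kg/m^3),
# so volume-for-volume copper actually stores more heat per degree.
```

In other words, the quoted numbers alone don't settle the Cu vs Al debate; per unit mass Al needs more energy per degree, per unit volume Cu does, and conductivity is yet another variable.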


----------



## HalfAHertz (Mar 29, 2010)

btarunr said:


> You know what's interesting, they continue to recommend a 600W PSU for systems with even a single GTX 480. Minus a graphics card, the average system (whatever is inside the case) would draw around 150W. So even with the crappiest 600W PSU you can find (75% efficiency ; -150W), that's asking consumers for a 300W headroom for a single card, a whopping 450W room when the PSU peaks (cheapo manufacturers market peak wattage as wattage, good quality ones market continuous power as wattage).



Not sure if I understood you correctly, but if you're stating that 600W is overkill for a gaming system, I'd have to disagree. You must keep in mind that most PSUs reach optimal efficiency at ~50-55% load. So a 600W PSU is usually targeted at 300-400W systems.


----------



## pantherx12 (Mar 29, 2010)

imperialreign said:


> Hold the spork, man!
> 
> Did you take the time to interpret those numbers?
> 
> ...





It's very dependent; Google it, there's pretty much an epic-sized debate.


I'll have to find more data, but as I said I've not slept, so I won't be doing any proper digging any time soon 

But needless to say, there's a reason why a combination of aluminium and copper is used in heatsinks, and it's not just to be cost effective.

As I said, check out the TRUE vs TRUE Copper: you would expect the true copper to completely piss all over the standard version, but it beats it by only 1-2 Celsius or so.

Whether copper makes an effective heatsink is very dependent on heatsink design/size and what airflow is available.



Since I'm too tired to explain myself, I'm quoting other people! wooooo very layman unfortunately, but finding things is hard when sleepy, still looking though.

"
Copper and aluminum are both effective materials for heat sink construction, but they have different requirements. If you want to know why, consider a great chef's kitchen.

Aluminum sure can move heat, if it has been done right. It very efficiently absorbs and transfers heat to its environment and things that interact with it. This works great for bacon in the morning, and even for boiling water, but isn't so good for a large, thick filet mignon. That cold slab of beef sucks the heat right out of the aluminum, and there isn't any left to keep up the cooking. Many people who buy aluminum cookware have a lot of trouble doing steaks properly for this very reason. Aluminum has a low thermal capacity, and a very high thermal conductivity.

As such, aluminum just wicks heat away with little concern for anything else. It won't wick as much as copper, but it sure will move it quickly; dumping its capacity as soon as any heat leaves the sink, and quickly soaking up more.

Copper moves heat as well, even if it hasn't been done all that well. Copper very efficiently absorbs and transfers heat as does aluminum. It does it faster, as well. That said, copper has an incredibly high thermal capacity. That big fat steak just can't suck up all the heat that copper will hold on to, and this is where copper and aluminum differ in requirements. Copper won't readily dump all the heat energy it picks up, because it holds so much of it before it changes temperature to any great degree.

That leaves us with a problem. Copper needs help. Somehow, you have to remove all that heat from the copper, as it will just hold on to it otherwise. A copper heat sink can work much better than an aluminum one, but you have to either have loads of pipes and lots of fins and airflow, or you need peltier/water cooling with excellent transfer to help it out.

The thermal capacity of copper, when compared to an aluminum heat sink of the same design, completely removes the benefit of using copper in the first place without help. As a matter of fact, a poorly designed copper sink can be much worse than an aluminum model.

The best way to use the materials is being tried nowadays, and that is combining them. As with most good things, they work better together than apart.
"

More data using cookware as an example lol

"Copper cookware almost always compares favorably to other types of cookware. Stainless steel is not the best conductor, although its strength, durability and ease of cleaning make it a favorite among some cooks. A heavy-gauge aluminum bottom on a stainless steel pan will increase the pan's efficiency, but a thick-gauge aluminum pan is, overall, a better conductor. Aluminum, however, reacts to acidic foods by imparting a metallic taste and sometimes discoloring them -- egg whites beaten in aluminum, for instance, may turn gray. It also does not retain heat for long periods. "

and finally

"ake 2 same sized blocks of metal... one aluminium one copper... heat them to the same temperature. Now monitor temperatures as they cool... one probe in contact with the metal, one a half inch above it's surface, note what happens... The copper block will stay hotter, longer with lower free air temps. The aluminium will cool faster with higher free air temps... because the copper, being higher mass, will retain heat longer.

Yes, because of it's higher thermal conductivity copper soaks up more heat more quickly, but because of it's higher mass it's going to STAY hot. Aluminium isn't as good a heat absorber but, because of it's lower mass, it releases the heat more quickly.

"


Basically the design their using ( copper heatpipes and aluminium fins) gets you the best of both worlds.



Whilst copper absorbs heat around twice as fast, it dissipates heat at about half the rate that aluminium does, due to its density.

As I said, 'tis dependent.

May have used the wrong wording earlier, maybe; again, I blame tiredness!

ha ha


----------



## Super XP (Mar 29, 2010)

Hello everybody, I found out what really happened. NVIDIA could have released these a lot sooner with lower speed and power, but chose not to, which dug them in deep today.

This is a great non-biased article and well written. 



> *Why Nvidia cut back the GTX480
> Less is more*
> by Charlie Demerjian
> March 28, 2010
> http://www.semiaccurate.com/2010/03/28/why-nvidia-hacked-gtx480/





> *Here is some of the article, now I understand what went wrong:*
> Remember when we said that one problem was 'weak' clusters that would not work at the required voltage? Well, if you want to up the curve on yields, you can effectively lower the curve on power to compensate, and Nvidia did just that by upping the TDP to 250W. This the classic overclocking trick of bumping the voltage to get transistors to switch faster.
> 
> While we don't for a second believe that the 250W TDP number is real, initial tests show about a 130W difference in system power between a 188W TDP HD5870 and a '250W' GTX480, that is the official spec. Nvidia lost a 32 shader cluster and still couldn't make 10,000 units. It had to bump the voltage and disable the clusters to get there. Unmanufacturable.
> ...


----------



## imperialreign (Mar 29, 2010)

pantherx12 said:


> It's very dependent; Google it, there's pretty much an epic-sized debate.
> 
> 
> I'll have to find more data, but as I said I've not slept, so I won't be doing any proper digging any time soon
> ...





The biggest reason why manufacturers go with full-aluminum or hybrid designs boils down to cost of manufacture, cost of material, and weight.

Weight plays a big factor in design.  Take a Noctua, for example, an excellent cooler that works extremely well - there's a lot of surface area . . . but if that beast were pure copper, it would break a motherboard if installed in a vertical position.

Notice, though, how well full-copper coolers perform compared to larger aluminum coolers - the Zalman 9700, for example, still performs on par with these larger beasts, and it's much smaller in size.

. . . and we still haven't taken various other aspects into account, such as thermal expansion, either . . .

It can be debated ad nauseam, but it won't change the fact that copper is simply better for cooling than aluminum.


----------



## btarunr (Mar 29, 2010)

HalfAHertz said:


> Not sure if I understood you correctly, but if you're stating that 600W is overkill for a gaming system I'd have to dissagree. You must keep in mind that most PSUs reach optimal efficiency at ~50-55% load. So a 600W PSU is usually targeted at 300-400W systems.



No, I meant that NVIDIA is lying about board power, blatantly. The 600W PSU requirement hints at that. They can't cut that requirement down to, say, 500W, because without giving the card 300W, it would become unstable/crash.


----------



## pantherx12 (Mar 29, 2010)

I added to my post by the by XD

I'll say it here though.


Whilst copper is around twice as good at absorbing heat, Aluminium dissipates its heat around twice as fast.

A combination of both is therefore the best solution : ]


----------



## entropy13 (Mar 29, 2010)

pantherx12 said:


> I added to my post by the by XD
> 
> I'll say it here though.
> 
> ...



Coppinium!


----------



## pantherx12 (Mar 29, 2010)

entropy13 said:


> Coppinium!



TO THE LAB!!!!!!!!!1


----------



## Velvet Wafer (Mar 29, 2010)

entropy13 said:


> Coppinium!



you would laugh, there really is a metallurgical product that is exactly that, just with a pinch of arsenic; it's named CuAl5As.

there are many different ones though, even if this should be the most thermally conductive one!
just take a look: http://www.dgm.de/past/2004/metallographie/download/686_17.pdf
it's in engineering German though, and very complicated... don't be surprised, ok?


----------



## newtekie1 (Mar 29, 2010)

TheMailMan78 said:


> So Nvidia claimed 295 and you pulled 320? Yeah I'm with Wile E on this one. Thats just a bold face lie. I would be SO pissed off if I bought one of these and my PSU couldn't take the load.



And what did ATi claim? 182W or something like that, and we are seeing 212W.  The Furmark maximum number is far beyond what you will ever see during any other normal use of the card.
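Put side by side, the claimed-versus-measured gap for both vendors works out like this (a rough sketch using only the wattages quoted in this thread; exact Furmark readings vary between reviews):

```python
# How far each measured Furmark draw exceeds the claimed board power,
# using the wattages mentioned in this thread.
claims = {
    "HD 5870 (ATI claim vs Furmark)": (182, 212),
    "GTX 480 (NVIDIA claim vs Furmark)": (250, 320),
}

def overshoot_pct(claimed_w, measured_w):
    """Percentage by which the measured draw exceeds the claim."""
    return 100.0 * (measured_w - claimed_w) / claimed_w

for card, (claimed, measured) in claims.items():
    print(f"{card}: +{overshoot_pct(claimed, measured):.1f}%")
```

So both vendors undershoot their claims under Furmark; the argument in the thread is over whether a worst-case synthetic load is a fair basis for the spec at all.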


----------



## TheMailMan78 (Mar 29, 2010)

newtekie1 said:


> And what did ATi claim? 182w or something like that, and we are seeing 212w.  The furmark maximum number is far beyond what you will ever see during any other normal use of the card.



Nobody is talking about ATI. This is a thread about Nvidia. A lie is a lie no matter who states it.


----------



## newtekie1 (Mar 29, 2010)

TheMailMan78 said:


> Nobody is talking about ATI. This is a thread about Nvidia. A lie is a lie no matter who states it.



That wasn't the point I was getting at. The point was that the 320W will never be seen in real-world use, so there is no reason for ATi or nVidia or anyone to use the Furmark numbers.

It isn't a lie to say 250W, because under normal use, that is the maximum power draw (257W according to this review).


----------



## Steevo (Mar 29, 2010)

Will they warranty a board that has been run with Furmark? Seriously, it is a valid 3D application, and in an enclosed case, during a warm summer, in a warm room, what will happen....


Care to try it W1zz? I am genuinely interested whether it will cook this POS, or if it starts to downclock to protect itself and thus loses a huge amount of performance for normal people trying to use it.


----------



## W1zzard (Mar 29, 2010)

Steevo said:


> Will they warranty a board that has been run with furmark? Seriously, it is a valid 3D application and in a enclosed case during a warm summer in a warm room what will happen....
> 
> 
> Care to try it W1zz? I am genuinely interested if it will cook this POS or if it starts to down clock to protect itself, and thus lose a huge amount of performance for normal people trying to use it.



not possible .. i tried .. card goes to 96° then fan ramps up and it goes to 91°






not working as a grill


----------



## Steevo (Mar 29, 2010)

Is that what the whisper-quiet 50 dB was from? So real users can expect that sort of noise during gaming?

How long did you run Furmark for?


----------



## W1zzard (Mar 29, 2010)

Steevo said:


> Is that what the whisper-quiet 50 dB was from? So real users can expect that sort of noise during gaming?
> 
> How long did you run Furmark for?



like 40 mins while i was recording HD video in the hope I'd get a nice omelette


----------



## Steevo (Mar 29, 2010)

If you want it watercooled and tortured send it my way. I would love to find the maximum for this card, I would just have to chill my loop in my -15F deep freeze.


----------



## Black Panther (Mar 29, 2010)

Steevo said:


> Will they warranty a board that has been run with furmark? Seriously, it is a valid 3D application, and in an enclosed case during a warm summer in a warm room, what will happen....
> 
> 
> Care to try it W1zz? I am genuinely interested if it will cook this POS or if it starts to down clock to protect itself, and thus lose a huge amount of performance for normal people trying to use it.



Lol I could try that out in August if Nvidia care to lend me one for testing 

The room where my desktop is faces south-west, and in summer, if I don't turn on the AC and leave the curtains open so the sun shines in, there's a greenhouse effect where the room's temperature reaches 40 degrees!

I'd put the 480 inside a normal mainstream case and run Furmark. Ambient temperature 40 C...
I'd keep a fire extinguisher at hand obviously.


----------



## W1zzard (Mar 29, 2010)

card is already on its way back to nvidia


----------



## Fourstaff (Mar 29, 2010)

W1zzard said:


> card is already on its way back to nvidia



Even W1z was so appalled by it that he had to send it back :shadedshu


----------



## phanbuey (Mar 29, 2010)

or there are so few of them that they will actually sell it


----------



## HalfAHertz (Mar 29, 2010)

W1zzard said:


> card is already on its way back to nvidia



Those devils! How will you compare future reviews?


----------



## bpgt64 (Mar 29, 2010)

With future cards


----------



## PopcornMachine (Mar 29, 2010)

The heat issue made me think of a scene from one of my favorite movies.

If only I was clever enough to place an NVIDIA or GTX 480 logo on the green tubes of nuclear fuel. 

http://www.youtube.com/watch?v=GIOjFYzD0QE

_Gimme some heat, mahn!_


----------



## Black Panther (Mar 29, 2010)

PopcornMachine said:


> If only I was clever enough to place an NVIDIA or GTX 480 logo on the green tubes of nuclear fuel.
> 
> _Gimme some heat, mahn!_



The closest I could find


----------



## [I.R.A]_FBi (Mar 29, 2010)

epic fail guy


----------



## W1zzard (Mar 29, 2010)

HalfAHertz said:


> Those devils! How will you compare future reviews?



unless i get another card i wont be able to compare


----------



## Black Panther (Mar 29, 2010)

Epic lol, saving some of my tears from seeing retailers pushing the price of the 5xxx series further up.


----------



## W1zzard (Mar 29, 2010)

Black Panther said:


> Epic lol, saving some of my tears from seeing retailers pushing the price of the 5xxx series further up.



epic


----------



## [I.R.A]_FBi (Mar 29, 2010)

im so sorry i didnt buy a 5850 at launch


----------



## TheMailMan78 (Mar 29, 2010)

newtekie1 said:


> That wasn't the point I was getting at. The point was that the 320w will never be seen in real world use, so there is no reason for ATi or nVidia or anyone to use the furmark numbers.
> 
> It isn't a lie to say 250w, because under normal use, that is the maximum power draw (257w according to this review).



That is the point. No vendor should advertise "normal" use. It should be max/min numbers, and in this case Nvidia was WAY off. So far off that I feel they were misleading.


----------



## newtekie1 (Mar 29, 2010)

TheMailMan78 said:


> That is the point. No vendor should advertise "normal" use. It should be max/min numbers, and in this case Nvidia was WAY off. So far off that I feel they were misleading.



So what you are saying is that they should advertise max usage, even though that number will never actually be relevant? Which is more important: how much power the card will actually consume when being used in real-world applications, or the max it uses when put under one single torture test? Unless you're running furmark 24/7, the amount of power used during real world usage is more important and the more accurate number to report.

And as for nVidia being way off, the only claim I seem to remember is 300w, in which case 20w over in furmark isn't really that misleading...


----------



## Bundy (Mar 29, 2010)

pantherx12 said:


> Whilst Copper absorbs heat around twice as fast, it dissipates heat about half the rate that aluminium does due to its density.



This is not correct in a scientific way but does _appear_ to exist in CPU coolers. Heat transfer is a function of temperature difference and thermal conductivity. It works the same in both directions.

When comparing materials, a cooler with higher thermal conductivity will conduct the same heat as a lower conductivity material but at lower delta T. 

The confusing issue is that copper is more dense and can hold more heat. This means copper coolers take longer to cool down and longer to heat up, that's all. These factors do not influence performance however; only thermal conductivity does.

Copper is the better product to use than aluminium but aluminium is often preferred because it is lighter.
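Bundy's point above can be put in numbers. Below is a minimal sketch of Fourier's law applied to a cooler base plate; the plate dimensions and the 10 K temperature difference are made-up illustrative values, while the conductivities are standard textbook figures (~401 W/m·K for copper, ~237 W/m·K for aluminium):

```python
# Steady-state conduction through a cooler base plate, per Fourier's law:
#   Q = k * A * dT / d   (watts)
# Conductivities are textbook values; plate dimensions are illustrative.

K_COPPER = 401.0     # W/(m*K)
K_ALUMINIUM = 237.0  # W/(m*K)

def conducted_watts(k, area_m2, thickness_m, delta_t):
    """Heat conducted through a plate of thermal conductivity k."""
    return k * area_m2 * delta_t / thickness_m

# 40 mm x 40 mm contact patch, 3 mm thick base, 10 K across it
area, thickness, dT = 0.04 * 0.04, 0.003, 10.0

q_cu = conducted_watts(K_COPPER, area, thickness, dT)
q_al = conducted_watts(K_ALUMINIUM, area, thickness, dT)

# Copper moves ~69% more heat at the same delta-T. Density (heat
# capacity) only changes how fast the plate warms up, not this number.
print(f"copper: {q_cu:.0f} W, aluminium: {q_al:.0f} W")
```

Note the ratio is exactly k_cu/k_al regardless of geometry, which is the "only thermal conductivity matters" point above.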

Edit: apologies for being off topic - suggest replies are PM to me.


----------



## Tatty_One (Mar 29, 2010)

When all is said and done......... put simply........... this thing is friggin hot, let's just hope the average consumer has the sense to actually do a little research before he/she buys a George Foreman fookin Grill to play their games on :shadedshu


----------



## DrPepper (Mar 29, 2010)

Tatty_One said:


> When all is said and done......... put simply........... this thing is friggin hot, let's just hope the average consumer has the sense to actually do a little research before he/she buys a George Foreman fookin Grill to play their games on :shadedshu



It's a shame that they don't, and we will be seeing things in the future like "my GPU is broke" etc etc.


----------



## xrealm20 (Mar 29, 2010)

Black Panther said:


> Epic lol, saving some of my tears from seeing retailers pushing the price of the 5xxx series further up.



Freaking epic! Great find!


----------



## imperialreign (Mar 29, 2010)

Black Panther said:


> Epic lol, saving some of my tears from seeing retailers pushing the price of the 5xxx series further up.



ROFLMFGDAO!!    


Holy shit!  That is one of the funniest things I've seen in a *LONG* time!

Much thanks for that link, BP - I've been in need of a great laugh!


----------



## 20mmrain (Mar 29, 2010)

Tatty_One said:


> When all is said and done......... put simply........... this thing is friggin hot, let's just hope the average consumer has the sense to actually do a little research before he/she buys a George Foreman fookin Grill to play their games on :shadedshu



I posted it once before and I'll do it again..... I told you what Nvidia was doing!!!






*This is still funny as hell to me!!!*

On another thought about this video card: while it might not have that much more gaming power than the 5870 in older games, I really do wonder about Fermi's longevity, because look at its capabilities with Tessellation, Ray Tracing and the like. Sure, games aren't making much use of some of these technologies right now, but what about in the future? Once DX11 really takes off, will Fermi and Fermi's design take off with it?

Just a thought.

There are flaws even in this thought. For instance, Fermi's performance in Metro 2033 is not that bad, but then you look at Dirt 2 or AVP, other DX11 games, and Fermi's performance is not that impressive. If the thought above were true, Fermi should really take off in those other examples.


----------



## Wile E (Mar 30, 2010)

newtekie1 said:


> That wasn't the point I was getting at. The point was that the 320w will never be seen in real world use, so there is no reason for ATi or nVidia or anyone to use the furmark numbers.
> 
> It isn't a lie to say 250w, because under normal use, that is the maximum power draw (257w according to this review).



I like to bench furmark, tyvm.


----------



## facepunch (Mar 30, 2010)

found this one on extremesystem forum


----------



## TheMailMan78 (Mar 30, 2010)

newtekie1 said:


> So what you are saying is that they should advertise max usage, even though that number will never actually be relevant? Which is more important: how much power the card will actually consume when being used in real-world applications, or the max it uses when put under one single torture test? Unless you're running furmark 24/7, the amount of power used during real world usage is more important and the more accurate number to report.
> 
> And as for nVidia being way off, the only claim I seem to remember is 300w, in which case 20w over in furmark isn't really that misleading...



Yes. I want max usage. When you buy tires do you look for "round-about" numbers or do you look for facts? I mean WTF we are talking about very precise things. Power draw and such. There should be no "gray" area. Save that crap for driver "performance gains".


----------



## WarEagleAU (Mar 30, 2010)

Wow, not bad, and the cooling apparatus is badass. However, as many have said, six months and you are barely beating a 5850/5870 while drawing more power? Not overly impressed, though the DX 11 support is a nice bump.

Also, 645 replies to this; can we say shmokin?


----------



## 20mmrain (Mar 30, 2010)

facepunch said:


> found this one on extremesystem forum



LMAO!!!


----------



## a_ump (Mar 30, 2010)

Black Panther said:


> Epic lol, saving some of my tears from seeing retailers pushing the price of the 5xxx series further up.



lmfao!!!!!rofl hahaha i haven't laughed that hard from a youtube vid in forever. that should be on TPU's front page lol


----------



## shevanel (Mar 30, 2010)

anyone ever see the video a few months ago of the fake fermi benchmark that starts up like a gta 4 bench, then pops sparks out of the case and it explodes?

so funny but i cannot find it!


----------



## TheMailMan78 (Mar 30, 2010)

I just found this bench which pits two 480s in SLI vs a 5970

http://www.pcgameshardware.com/aid,...Fermi-performance-benchmarks/Reviews/?page=17

In most cases the 5970 eats the 480 in SLI. However I chalk that up to immature drivers.


----------



## LAN_deRf_HA (Mar 30, 2010)

If a single 5970 is beating a 480 SLI in some instances then the Asus ROG Ares in crossfire will probably be the fastest gpu setup in the world. Which is nice because you'll have slot room for sound/network/raid cards (vs a tri-480 setup).


----------



## Steevo (Mar 30, 2010)

Trying to fake max usage and convince users that 600W is enough with a quad, lots of fans to keep a case cool, 6GB of DDR3, one HDD or SSD, and some lights, is OK until they decide to benchmark.

Bam, system shuts down.....or fries.


At least I can still run X-fire and doubly rape the shit out of a 480.


----------



## caleb (Mar 30, 2010)

Where is BF Bad company 2 performance?


----------



## crazyeyesreaper (Mar 30, 2010)

TheMailMan78 said:


> I just found this bench which pits two 480s in SLI vs a 5970
> 
> http://www.pcgameshardware.com/aid,...Fermi-performance-benchmarks/Reviews/?page=17
> 
> In most cases the 5970 eats the 480 in SLI. However I chalk that up to immature drivers.



that's NOT SLI

that site is misleading; look at the single gpu 480 scores in the games BEFORE that list

that final set of benchmarks is the gtx 470 and 480 VS the dual gpu GTX 295 and 5970, ie the gtx 470 and 480 in single card configs vs the 2 big dogs from this and last gen

very very misleading


*The charts below show a list of games that we have been testing with the GTX 480/470 and the multi GPU cards GTX 295 and Radeon HD 5970*


----------



## DaedalusHelios (Mar 30, 2010)

Steevo said:


> Trying to fake max usage and convince users that 600W is enough with a quad, lots of fans to keep a case cool, 6GB of DDR3, one HDD or SSD, and some lights, is OK until they decide to benchmark.
> 
> Bam, system shuts down.....or fries.
> 
> ...



Who would really run a PSU close to its rating anyway? PSUs are much more efficient, and last longer, when you aren't pushing them close to the limit.


----------



## jessicafae (Mar 30, 2010)

W1zzard said:


> like 40 mins while i was recording HD video in the hope I'd get a nice omelette





W1zzard said:


> card is already on its way back to nvidia



Did you clean the grill, um I mean heatsink, before you sent it back?
We should be on the lookout for some future forum post where someone complains that Nvidia sent them a used card because there was egg in the creases of the heatsink....


----------



## TheMailMan78 (Mar 30, 2010)

crazyeyesreaper said:


> thats NOT SLI
> 
> that site is misleading look at the single gpu 480 scores in the games BEFORE that list
> 
> ...



You're right. That does read strange. Honestly I'm not even sure WTF they are doing there.


----------



## Velvet Wafer (Mar 30, 2010)

trying to be funny again? im sure he hasnt even noticed his mistake


----------



## W1zzard (Mar 30, 2010)

jessicafae said:


> Did you clean the grill, um I mean heatsink, before you sent it back?
> We should be on the lookout for some future forum post where someone complains that Nvidia sent them a used card because there was egg in the creases of the heatsink....



cleaned it as much as i could (no teflon coating = real mess) .. there may be a bit of leftovers but i doubt they will be visible on any pictures


----------



## Solaris17 (Mar 30, 2010)

W1zzard said:


> cleaned it as much as i could (no teflon coating = real mess) .. there may be a bit of leftovers but i doubt they will be visible on any pictures



did they request it back? or are you sending it for another reason?


----------



## Delta6326 (Mar 30, 2010)

LOL hahahhahaha epic fail. I love it; that it's so true is the sad part.


----------



## Steevo (Mar 30, 2010)

DaedalusHelios said:


> Who would really run a PSU close to its rating anyway? PSUs are much more efficient, and last longer, when you aren't pushing them close to the limit.



Search the forums, and many of the other forums. I used to mod at a very very very busy pc help forum; about 50% of the hardware related questions were due to underpowered, or just plain mismatched, components.

A 200W PSU to run an old Pentium 2 with a RAGE card; the guy upgraded his video card to an AGP 9800XT and wondered why, every time his games launched, it would shut down.

Hell just look at the number of questions "Is this enough to run X card?" I just ran a couple searches and ended up with almost 400 from just two simple terms. That doesn't count the number of other things people think is wrong before they check the power requirements.

Plus we are a tech savvy site, so most know what is required.

Do we get awards for making this the longest cussed, discussed, and argued review thread?


----------



## springs113 (Mar 30, 2010)

not to flame or thread crap... but isn't ATI a little too quiet? 1st off, the 5970 is not a true cypress xt... 5890, anyone? or what about the 2gb 5870's performance?


----------



## Super XP (Mar 30, 2010)

Why would ATI be loud? It seems reviews are taking care of it. We all now know Fermi is broken and currently un-fixable


----------



## shevanel (Mar 30, 2010)

Why does the 480 require 42 amps on +12?


----------



## Super XP (Mar 30, 2010)

shevanel said:


> Why does the 480 require 42 amps on +12?


Interesting, not sure. You've got me curious now


----------



## shevanel (Mar 30, 2010)

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130551&cm_re=gtx_480-_-14-130-551-_-Product


----------



## Super XP (Mar 30, 2010)

shevanel said:


> http://www.newegg.com/Product/Product.aspx?Item=N82E16814130551&cm_re=gtx_480-_-14-130-551-_-Product





> System Requirements Minimum of a 600 Watt power supply. (Minimum recommended power supply with +12 Volt current rating of 42 Amps.)


I know Fermi has a power issue, but that is a lot for a GPU, no?

Interesting that you need a 600W power supply for 1 x GTX 480 and 600W for 2 x HD 5870s in Crossfire. I wonder if NVIDIA is ever going to fix this power issue?


----------



## shevanel (Mar 30, 2010)

13A draw with the 5870, 18A using furmark?

40A x 12V = 480W

Im not good with electrical figures, but it seems 750 watts should have been the recommendation, but idk

I think the 42A recommendation is so that the entire system is covered by the psu and not just the gpu

some reviews had total system load @ 460-525 watts.. so that would be close to 40-42 amps afaik

and some systems are diff than others... like my i5 doesn't draw as much as my old i7 920 when oc'd
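The arithmetic above can be sanity-checked directly with P = V × I on the 12 V rail. The 460-525 W loads are the review figures quoted in the post; everything else is basic bookkeeping:

```python
# Power on the 12 V rail: P = V * I
RAIL_V = 12.0

def watts(amps_):
    """DC power delivered at the given rail current."""
    return RAIL_V * amps_

def amps(watts_):
    """Rail current needed to deliver the given DC power."""
    return watts_ / RAIL_V

# nVidia's recommended 42 A rail covers about 504 W...
print(watts(42))  # 504.0

# ...and the 460-525 W total-system loads seen in reviews work out to
# roughly 38-44 A, which supports the reading that the 42 A figure is a
# whole-system number rather than a GPU-only one.
print(round(amps(460)), round(amps(525)))  # 38 44
```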


----------



## Super XP (Mar 30, 2010)

Good point. How about people that have a lot of HDDs and other components that require power?


----------



## imperialreign (Mar 31, 2010)

springs113 said:


> not to flame or thread crap... but isn't ATI a little too quiet? 1st off, the 5970 is not a true cypress xt... 5890, anyone? or what about the 2gb 5870's performance?



ATI has remained quiet since nVidia first started tooting their own horn over Fermi . . . (or maybe I should say, winding up their own megaphone?) . . .

Right now, I don't really see the performance difference being enough for ATI to get too worried over - especially considering the other "issues" this card is coming out of the psychiatry department with.  Now, if driver releases somehow *magically* fix some problems (or the card receives a revision rather quickly), it wouldn't surprise me if ATI dumped a 5890 into the GPU skirmish and giggled with red-faced glee.

There were some rumours of a potential 5890 before the end of last year, but ATI have clammed up since.  I also think they're "rationing" the 5970s right now, keeping the flood gates closed (for the moment) and milking the success as much as possible.




Super XP said:


> Good point. How about people that have a lot of HDD's and other components that require power.



I can't really say too much one way or the other - I'm one of those that adhere to the philosophy of having as much headroom as possible . . . I prefer to keep my PSUs working in the 50%-65% load range.
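That 50%-65% sweet spot converts into a simple sizing rule. A quick sketch, where the 450 W system load is a hypothetical example rather than a measured figure:

```python
# Size a PSU so that a given DC load sits at a target fraction of its rating.
def psu_rating_for(load_w, target_load_fraction):
    """PSU rating (watts) that puts load_w at the given fraction of capacity."""
    return load_w / target_load_fraction

# Hypothetical 450 W gaming load, aimed at the 50%-65% band above:
lo = psu_rating_for(450, 0.65)  # ~692 W -> the load sits at 65% of rating
hi = psu_rating_for(450, 0.50)  # 900 W  -> the load sits at 50% of rating
print(round(lo), round(hi))     # 692 900
```

So the headroom philosophy roughly doubles the nameplate wattage you shop for, compared with sizing at 90%+ of rating.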


----------



## Wile E (Mar 31, 2010)

imperialreign said:


> I can't really say too much one way or the other - I'm one of those that adhere to the philosophy of having as much headroom as possible . . . I prefer to keep my PSUs working in the 50%-65% load range.



Yep, me too.


----------



## Benetanegia (Mar 31, 2010)

*Why Furmark power consumption is misleading.*

I've been trying to find any info suggesting that what I'm going to present has changed, but I didn't find any, so here we go:

First of all, I'm not saying that Fermi isn't too power hungry or hot, but it's definitely not as dramatic as many have claimed. Most claims are based on Furmark readings, especially those claiming that Nvidia is lying about TDP, and those readings are absolutely misleading when it comes to any brand comparison. The reason is simple: Ati cards throttle down under Furmark to prevent going too high:

http://www.techpowerup.com/index.php?69799

Renaming Furmark will no longer help, as AMD successfully "fixed" that "problem" since Cat 9.8:

http://www.geeks3d.com/20090914/catalyst-9-8-and-9-9-improve-protection-against-furmark/

But that's not all! HD5xxx cards have hardware protection (throttling when a limit is exceeded) against stress tests like Furmark, and although that's a good thing for the product, since no single game will stress the cards as much as Furmark, the numbers are totally misleading. Furmark numbers don't represent absolute max load as they do on Nvidia cards.

http://www.geeks3d.com/20090925/ati...on-against-power-virus-like-furmark-and-occt/

That feature in the HD5xxx series is fantastic, don't get me wrong, but the fact remains that such protection makes any comparison under Furmark load invalid.

HD5970 throttling back: 

http://www.legionhardware.com/articles_pages/ati_radeon_hd_5970_overclocking_problems,4.html



> If you look at our FurMark tests, the temperature quickly spiked from 60 degrees at idle, *to over 100 degrees* within 40 seconds. However to avoid crashing or cooking itself, the Radeon HD 5970 quickly throttled the core frequency to 550MHz, which reduced stress temperatures to just below 90 degrees.
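The protection being described works roughly like a thermostat on the core clock. Below is a toy sketch of that idea only; it is NOT AMD's actual algorithm. The 725/550 MHz clocks, the 60 C idle temperature and the ~100 C trip point echo the HD 5970 quote above, while the thermal coefficients are invented for illustration:

```python
# Toy model of temperature-triggered clock throttling (illustrative only;
# not AMD's real algorithm). Clocks and the ~100 C trip point follow the
# HD 5970 numbers quoted above; the thermal response is made up.

STOCK_MHZ, THROTTLED_MHZ = 725, 550
TRIP_C = 100

def furmark_step(temp_c, clock_mhz):
    """One time step: heating scales with clock, cooling with temperature."""
    heating = 0.08 * clock_mhz  # invented coefficient
    cooling = 0.45 * temp_c     # invented coefficient
    return temp_c + 0.1 * (heating - cooling)

temp, clock = 60.0, STOCK_MHZ   # idle temperature from the quote
for _ in range(400):
    temp = furmark_step(temp, clock)
    if temp > TRIP_C:
        clock = THROTTLED_MHZ   # protection kicks in and stays engaged

print(f"settled at {temp:.0f} C, {clock} MHz")
```

At stock clock the toy model's equilibrium temperature sits above the trip point, so the clock drops and the card settles just below it, mirroring the "throttled to 550 MHz, just below 90 degrees" behaviour described in the quote.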


----------



## 20mmrain (Mar 31, 2010)

imperialreign said:


> ATI has remained quiet since nVidia first started tooting their own horn over Fermi . . . (or maybe I should say, winding up their own megaphone?) . . .
> 
> Right now, I don't really see the performance difference being enough for ATI to get too worried over - especially considering the other "issues" this card is coming out of the psychiatry department with.  Now, if driver releases somehow *magically* fix some problems (or the card receives a revision rather quickly), it wouldn't surprise me if ATI dumped a 5890 into the GPU skirmish and giggled with red-faced glee.
> 
> ...



Although supposedly ATI has already started on their next gen card, which will most likely be out before Fermi is fully released. This means that ATI would have another reason not to be worried about the performance difference, even if there are some driver improvements with Fermi, or any type of improvements for that matter.

http://vr-zone.com/articles/-rumour-ati-s-next-generation--southern-islands/8722.html


----------



## TheMailMan78 (Mar 31, 2010)

Benetanegia said:


> I've been trying to find any info that suggested that what I'm going to expose has changed but I didn't find any so here we go:
> 
> First of all, I'm not saying that Fermi isn't too power hungry or hot, but it's definitely not as dramatic as many have claimed. Most claims are based on Furmark readings, especially those claiming that Nvidia is lying about TDP, and those readings are absolutely misleading when it comes to any brand comparison. The reason is simple: Ati cards throttle down under Furmark to prevent going too high:
> 
> ...



Hmm, but Nvidia doesn't have that feature, so it will in fact run hot and go WAY past its listed power draw, which equals a lie. Plus that's a 5970 you're talking about. Dual GPU? I'm afraid you're grabbing at straws here man.


----------



## Benetanegia (Mar 31, 2010)

TheMailMan78 said:


> Hmm, but Nvidia doesn't have that feature, so it will in fact run hot and go WAY past its listed power draw, which equals a lie. Plus that's a 5970 you're talking about. Dual GPU? I'm afraid you're grabbing at straws here man.



It's not a lie, since only under Furmark will it go beyond the specified TDP. AMD has put in protection so that Furmark does not stress the GPU to those limits. If anything, it's AMD who is lying in that respect, because Furmark is not showing the real max consumption of their cards; on Nvidia cards it is. Except that it is not lying either, since a card will never reach those limits in any REAL application. Same goes for Nvidia cards: they will never reach those levels under gaming or CUDA apps or whatever you throw at them, as long as it is not a synthetic app specifically designed to stress the GPU that far. Any real application will do much more than just stress the shader processors; the SPs do their work, but that work has to go somewhere, be processed there, and then go elsewhere, etc. That's why AMD's raw flop numbers are totally meaningless for real apps: although the SPs can essentially work that hard, the data generated would never be able to get out and be useful. And that's what Furmark does: stress the shaders without the need for the generated data to be useful, without the need for the data to go outside the SPs.

Taking the above into account, and the links I posted, which cards are worse under Furmark before throttling kicks in? Well, both the HD4850 and the HD5970 went way above 100C before throttling kicked in, in just 40 seconds of Furmark!!! God knows how far they could go after some minutes at full load. On the other hand, the GTX480 stays at around 95C even though there's no throttling going on, so Nvidia took action in the hardware itself to keep the card cool, while AMD used artificial measures. *Both* are what I would call legit measures, since both are rightly "assuming" that nobody will be able to reach those limits under any real conditions, and after several years of these cards being out there, it's obvious they are right. HD4850s have not died while gaming, right? Fermi won't either.

Now if you have to run Furmark all day... yeah, you'd need a card that artificially cripples performance to prevent it from burning inside your PC. Maybe Nvidia should release a similar protection in their next drivers? Would you all be happy? Thing is, I doubt it. In fact I bet that although AMD did it first AND is still doing it, if Nvidia did that in their next drivers and Fermi power consumption went lower, especially in Furmark, we would see many many complaints about how Nvidia is cheating, because, well, it's Nvidia. Sad but true.


----------



## Super XP (Mar 31, 2010)

Do you know why FurMark is one of the best ways to determine a GPU's power consumption, along with other synthetic benchmarks? So far it's the only way to measure a card's true power consumption in a consistent manner, by utilising a workload that is constant and won't change the GPU's stress level mid-run the way real world gaming does. This is the one thing that is great about synthetic benchmarks: it's the best way to measure a card's behaviour against previous gen cards.

I believe the results speak for themselves 


> *While there are a few games in which the GTX 480 was faster, there are many resolutions in our test games where the HD 5870 comes out on top. Clearly, the GTX 480 is not the world's fastest single-GPU card.*





> *3DMark06 Canyon Flight test, 1,280 x 1,024 0xAA 16xAF, Peak Temperature
> Power Consumption (Idle and Gaming)*
> http://www.bit-tech.net/hardware/2010/03/27/nvidia-geforce-gtx-480-1-5gb-review/10
> 
> ...


----------



## Super XP (Mar 31, 2010)

There's no denying the facts; it's so evident that the GTX 480 & 470 are hot, power hungry, and sound like a jet engine. Anyway, we've got more than 25 reviews proving this with different methods of testing. No point in defending something that cannot be defended. All we can do right now is live with the results and wait for a possible Fermi refresh, if and when it gets released, but we won't see anything for another 6+ months IMO. Until then, everybody enjoy your nice cool running HD 5800 series cards, O.K.


> Thermals and Power Consumption
> Living with a card's thermal characteristics, power consumption and noise levels are just as important as its graphics horsepower and it’s here that the GeForce GTX 480 really runs into trouble.
> 
> Power consumption at idle was the highest we’ve seen from a *single GPU card at 186W system power draw*, 18W more than the HD 5870. At load though it entered a whole new dimension for a *single GPU card sucking down a massive 382W* while looping the canyon flight demo in 3DMark 06. *That’s a full 106W more than the Radeon HD 5870 in the same test, 30W more than the dual GPU Radeon HD 5970* and only 6W less than the dual GPU GeForce GTX 295!
> ...


And I would have to agree 100%


> The higher price, the 100W of extra power consumption, scorchingly hot temperatures and a much noisier stock cooler are all extremely detrimental to its desirability. The HD 5870 remains a far better choice if you're a gamer; while we've yet to see how the GTX 480 performs with CUDA apps and Folding, at this stage *Fermi looks like a flop*.
> 
> *GTX 480*
> Performance - 9/10
> ...


----------



## HalfAHertz (Mar 31, 2010)

@Benetanegia: Haha, well let's see, what do I prefer... to have a fried GPU, or be "cheated" and have my GPU throttle down to safe temps... hmm, what a hard decision. We've had throttling CPUs since the P4 days and nobody complained; I don't think anyone ever will, considering the consequences otherwise. I think Nvidia's solution is simpler but just as effective.

About Furmark: yes, it is a power virus because, just as you said, it stresses the SPs *beyond* what they were meant to do by bypassing the rest of the GPU pipeline and overloading them with calculations, a situation not useful in real life. You wouldn't see that in any other GPU application.

And why do you insist on saying that Ati's gflops numbers are wrong? I remember we had a similar discussion before. They aren't; the only problem is that you'd need smart coding to get to the low level hardware functionality. Remember the SPs are in groups of 5: 1 for complex calculations and 4 for simple calcs. But if you want to do only double-precision calculations, you can group the 4 simple ones and simulate a second complex SP, in effect reducing the SP count to 640, which is why the DP gflops figure is only 2/5 of the SP gflops figure. The numbers stated by Ati are indeed achievable, but only with smart coding specifically for their architecture.

Edit: SuperXP, stop spamming your negative propaganda  I remember that the single slot HD4850s and the original 4870X2 also ran at 90+ degrees, and nobody complained as much...
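Taking the shader grouping exactly as described in the post, the 2/5 figure falls out of simple counting (the 1600-SP count is the HD 5870's):

```python
# VLIW5 grouping as described above: each group of 5 stream processors
# is 4 "simple" ALUs + 1 "complex" ALU. For double precision, the 4
# simple ALUs pair up to act as one more DP-capable unit per group.
from fractions import Fraction

TOTAL_SPS = 1600  # HD 5870 / Cypress shader count
GROUP = 5         # VLIW5 group size

groups = TOTAL_SPS // GROUP    # 320 VLIW groups
dp_units = groups * 2          # complex ALU + the fused simple ones
ratio = Fraction(dp_units, TOTAL_SPS)

print(groups, dp_units, ratio)  # 320 640 2/5
```

This only counts units, following the post's description; actual DP throughput per unit depends on the architecture's issue rates.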


----------



## spud107 (Mar 31, 2010)

sometimes not disengaging safety features is a good thing . . .








Benetanegia said:


> But that's not all! HD5xxx cards have hardware protection (throttling when a limit is exceeded) against stress tests like Furmark, and although that's a good thing for the product, since no single game will stress the cards as much as Furmark, the numbers are totally misleading. Furmark numbers don't represent absolute max load as they do on Nvidia cards.


----------



## Benetanegia (Mar 31, 2010)

HalfAHertz said:


> @Benetanegia: Haha, well let's see, what do I prefer... to have a fried GPU, or be "cheated" and have my GPU throttle down to safe temps... hmm, what a hard decision. We've had throttling CPUs since the P4 days and nobody complained; I don't think anyone ever will, considering the consequences otherwise. I think Nvidia's solution is simpler but just as effective.



I have no option but to wonder why you have to take everything personally... I talk about Ati fanboys in one post and *you* reply with what *you* think. I talk about how people would complain and *you* reply with what you'd prefer. I'm thinking of an F word and it's not f--k.



> And why do you insist on saying that Ati's gflops numbers are wrong? I remember we had a similar discussion before. They aren't; the only problem is that you'd need smart coding to get to the low level hardware functionality. Remember the SPs are in groups of 5: 1 for complex calculations and 4 for simple calcs. But if you want to do only double-precision calculations, you can group the 4 simple ones and simulate a second complex SP, in effect reducing the SP count to 640, which is why the DP gflops figure is only 2/5 of the SP gflops figure. The numbers stated by Ati are indeed achievable, but only with smart coding specifically for their architecture.



I never said they were wrong. They are not achievable in any real application; not even AMD's internal apps achieve anything beyond 75% or so, and that's in very very specific apps. Why do I insist? I was not insisting on the matter. I mentioned it because normal usage is way below the raw "potential", and that's why under normal usage a HD4850 would not go much higher than 90C, but in Furmark, where artificial stressing pushes usage close to its potential, well, we don't know how high it could reach; all we know is that it would reach 105C++ and get fried. The reason is simple, and it's what I was getting at when I mentioned it. Typical AMD shader usage is around 40%, which is around 7% higher than the usage found under SGEMM. That's real usage. In Furmark it probably reaches something close to 100%; tbh I have no idea, and probably nobody knows the exact number or even an approximation, except AMD. From 33-40% to 100% there's a long way, though, enough to put temps through the roof.

As to the performance side of things, if it can't be achieved in normal escenarios, it can't be achieved period. Sure you can create an app that uses 4 simple and 1 complex one and bla bla bla, but that's not an application, that's a benchmark, a demo, a showcase. No single application (not even games, transcoding, SGEMM...) will be close to being able to do that, real apps need what they need in the exact moment they need them and AMD's architecture simply isn't suited for that. Period, you can argue as much as you want.[/QUOTE]


----------



## HalfAHertz (Mar 31, 2010)

Uhm, I was actually trying to agree with you in the above post. I was basically saying that I don't care exactly how they prevent a GPU from frying - be it a throttling function or a powerful fan - as long as my expensive GPU doesn't turn into an expensive paperweight 

I'm no mathematician nor a coder, so I don't really know how hard it is to write code that utilizes all that hardware, but I'll tell you one thing - there are many people much smarter than you and I who can and will, given enough incentive to do so...


----------



## Benetanegia (Mar 31, 2010)

HalfAHertz said:


> I'm no mathematician nor a coder so I don't really know how hard it is to write a code utilizing all that hardware, but I'll tell you one thing - there are many people much smarter than you and I who can and will if given enough incentive to do so...



You don't get what I'm trying to say, but it's probably my fault; it's usually difficult for me to explain such complicated things in a foreign language. It's not only a question of whether you can use all the shaders - it's that using all those shaders won't always be a true benefit.

Take games for example: average shader (ALU) use has been established (Beyond3D, Devnet...) to be around 3.8 out of 5 on AMD SPs, which is 76%, but even that number is not exact or true by any means. Let me explain: of course 76% of shaders are working, but not all of them are producing genuine results - many of them are duplicating work (I couldn't find a better word than genuine). This becomes obvious as soon as you realize that 76% of 1.2 TFlops (HD4870) is 912 Gflops, way more than the theoretical 708 Gflops on a GTX285 or 536 Gflops on a GTX260, and those don't have 100% efficiency either, not at all. Basically the HD4870 is calculating twice as much for the same task; otherwise, if every flop operation were genuine, that would mean around 900 Gflops was required for that level of performance, and the GTX cards would be seriously bottlenecked by their shaders.

What most probably happens is that, like I said, the AMD card duplicates many of the calculations, and it makes sense if you think about it: when you have many spare ALUs and your bandwidth is more limited, it doesn't pay to store some results in vram, even if you know you will need them later, because you know you will have spare ALUs then too - so you just calculate things (most things) as they come. Nvidia, on the other hand, prefers efficiency over throughput and hence stores the output, and as a result they need better caches and interconnects. Like I have always said, two different ways of achieving the same thing.
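The back-of-the-envelope numbers in the post above can be checked directly. Note the 3.8/5 occupancy figure is the poster's estimate (attributed to Beyond3D/Devnet discussion), not an official spec; the peak GFLOPS figures are the published theoretical rates cited in the post:

```python
# Rough effective-throughput arithmetic from the post above.
# alu_occupancy is the poster's ~76% average VLIW-slot utilization estimate.

hd4870_peak_gflops = 1200   # HD 4870 theoretical single-precision peak
gtx285_peak_gflops = 708    # GTX 285 theoretical peak (as cited in the post)
alu_occupancy = 3.8 / 5     # ~76% of the 5 slots busy on average

hd4870_effective = hd4870_peak_gflops * alu_occupancy
print(f"HD 4870 'used' throughput: {hd4870_effective:.0f} GFLOPS")
print(f"GTX 285 theoretical peak:  {gtx285_peak_gflops} GFLOPS")

# The post's point: ~912 GFLOPS of executed work on the HD 4870 versus a
# 708 GFLOPS ceiling on the GTX 285, for similar game performance, suggests
# not every executed flop on the wider architecture is unique useful work.
assert hd4870_effective > gtx285_peak_gflops
```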


----------



## shevanel (Mar 31, 2010)




----------



## dir_d (Mar 31, 2010)

hmmmm...
http://www.legitreviews.com/article/1264/1/
blame it on w1zz?


----------



## shevanel (Mar 31, 2010)

> We relayed this information on to NVIDIA and they informed us that our dual monitor idle temp problem will be solved by another new VBIOS that will be released this week that will ramp up the fan speed starting in the 70s instead of the 80s.



yeah, wonderful solution... ramp that fan up, boys!


----------



## SK-1 (Mar 31, 2010)

Thanks, kids. Thanks for pissing off the admin so much he's leaving TPU. Hope you're all proud of yourselves.


----------



## mlee49 (Mar 31, 2010)

No shit, way to go.  Bash on the reviewer and look at what happens.


----------



## trickson (Mar 31, 2010)

Great review. 
I was thinking of getting one - now I just may.


----------



## freaksavior (Mar 31, 2010)

Honestly, after seeing the review, I have no idea which card I want my girlfriend to buy me. 

I really like the non-reference cooler 5870s, but it looks like the GTX480 does just about as well, and we all know how the driver game goes. 

btw, thanks w1zzard


----------



## PaulieG (Mar 31, 2010)

Stop the pointless arguing. If the negativity continues, I'll be handing out major custom infractions.


----------



## HalfAHertz (Mar 31, 2010)

Benetanegia said:


> You don't get what I'm trying to say, but it's probably my problem, it's usually difficult for me to explain such complicated things in a foreign language. Well, it's not only the fact that you can use all the shaders, it's that not always the fact that you are using all those shaders will suppose a true benefit.
> 
> ...



Ok now I see what you mean. So basically if I got what you're saying, the built in scheduler sucks and does some of the calculations multiple times, thus wasting sp cycles?


----------



## Andy77 (Mar 31, 2010)

Hm... long story? I hope it was not because of the 9.12!

When I first saw the review I was like "9.12? WTF?!" But then I thought: well, he has lots of cards and lots of tests, and they go way back, so there was no reason to bother him about it - knowing there would be kids who'd do just that - and I hoped that 10.3 would be released... and it was!  Now I need one for the 5970... I'll probably find something out there in time.


----------



## qubit (Mar 31, 2010)

SK-1 said:


> Thanks kids. Thanks for pissing off the admin. so much he's leaving TPU. Hope your all proud of yourselves.





mlee49 said:


> No shit, way to go.  Bash on the reviewer and look at what happens.



Eh? Who's leaving TPU?


----------



## Fitseries3 (Mar 31, 2010)

Am I the only person who gets this?

The March 26th "event" was another conundrum** to help Nvidia further delay the REAL release of the retail version of the 4-series cards, so that Nvidia could buy more time to fix the issues at hand. 

The "review" samples that were given out were known to be faulty in several ways, but they gave everyone something to talk about instead of "when is Fermi coming out? It's 6 months late."

Yes, maybe it looks "bad" that the cards are hot and draw a ton of power, but it alleviates one problem and starts another. 

The big thing I see here is... NO ONE CAN EVER BE HAPPY ABOUT A DAMN THING. 

If the GTX480 were 30x faster than the 5970 you would still bitch because the price is too high. But why is the price so high? Because it's bleeding-edge technology, and that's the price you pay.

I notice a lot of you guys bitching that "you shoulda used the 10.x driver for ATI... it's better." Yes, perhaps it is - but why is that? Because ATI has had time to fix and optimize their drivers for better performance. Has Nvidia had time to do that? NO. Does it cross your mind that perhaps the older driver was used so that both ATI's and Nvidia's offerings could be compared as they were released?

If you were comparing two brand-new cars off the showroom floor, would it be "fair" to let company A fix a bunch of their problems before the comparison while company B is judged on what they brought to the plate as it stands? NO. 

These reviews are done with immature cards and immature drivers. Why do you expect so much from them?

Perhaps I'm being an asshole here, but I just want to remind you to take these early reviews with a grain of salt. 

If you think you can do a better review, then do it yourself. Oh wait... you can't... you don't have any GTX480s or GTX470s. 

Give the man some respect. 




**Conundrum: a logical postulation that evades resolution; an intricate and difficult problem.


----------



## mdsx1950 (Mar 31, 2010)

qubit said:


> Eh? Who's leaving TPU?



You're kidding, right?


----------



## DOM (Mar 31, 2010)

Fitseries3 said:


> Am i the only person who gets this?
> 
> the march 26th "event" was another conundrum** to help nvidia further delay the REAL release of the retail version of the 4 series cards so that nvidia could buy more time to fix the issues at hand.
> 
> ...



Fit, don't waste your time - some ppl will never change. That's a fact; look at the world we live in today, ppl bitch about everything. 

And he does retest every time he does a new review, so idk what all the crying was about. If I had the money I would have two of every card to play with, but I don't.


----------



## Deleted member 24505 (Mar 31, 2010)

Close this thread to stop the arguing and bitching.


----------



## [I.R.A]_FBi (Mar 31, 2010)

dir_d said:


> hmmmm...
> http://www.legitreviews.com/article/1264/1/
> blame it on w1zz?



=\


----------



## Benetanegia (Mar 31, 2010)

HalfAHertz said:


> Ok now I see what you mean. So basically if I got what you're saying, *the built in scheduler sucks and does some of the calculations multiple times*, thus wasting sp cycles?



NO! Not at all. There's nothing wrong with the scheduler and I never suggested anything similar. But the more parallel an architecture is, the more inefficient it is. That's inherent to this kind of architecture; there's nothing wrong there. But you will never reach the efficiency (efficiency as actual perf/raw perf) of a less parallel architecture, and that's why you can't compare AMD's flops with Nvidia's flops.

Regarding AMD cards doing the same calculations multiple times, bear in mind it's just my own speculation, and I wasn't saying it in a bad way; it's just my view of how it probably is, and a way to explain how AMD uses twice as many flops to do the same thing. Often something is calculated in a shader that is going to be used later; in that case it's usual to store it either in the cache or vram (imagine some lighting data that is going to be used as the input for HDR, bloom, color correction and whatever filter). My idea is that sometimes (especially if that data has to be moved to vram) in a chip like AMD's it could be better/faster to recalculate those things when they are required again, storing them in the SP's registers or L1 cache just to use them on the next ready clock cycle, instead of reading the old result stored in L2/vram, because those memories cost more cycles and AMD has spare SPs most of the time anyway. Its architecture favors that kind of brute-force programming, while Nvidia's architecture, with its bigger and faster registers and caches, favors the other method. That's not to say Nvidia's method is better, as both architectures have traded blows depending on the generation. ATI's method lets them pack more raw Gflops into the same die area or transistor budget, but those can't be used as efficiently; Nvidia's are efficient, but take more space. As I said, looking at the performance-to-transistor ratio, both have been pretty close. Take into account that although G92 and GT200 had more transistors than RV670 and RV770, the competing products GTX260 and 8800GT had many clusters disabled. If Nvidia had made them that size, i.e. 216 SPs/28 ROPs instead of 240/32, GT200 would have been pretty much the same size as RV770.
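The recompute-vs-store tradeoff described above can be sketched with a toy cost model. All cycle counts here are invented for illustration, not real GPU latencies; the point is only to show when each strategy wins:

```python
# Toy cost model of the "recompute vs. store" tradeoff from the post above.
# Cycle counts are illustrative assumptions, not measured GPU latencies.

def total_cycles(uses, recompute_cost, fetch_cost, store_cost):
    """Cycles to obtain a value `uses` times under each strategy."""
    recompute = uses * recompute_cost  # redo the math every time it's needed
    store = recompute_cost + store_cost + (uses - 1) * fetch_cost  # compute once, fetch later
    return recompute, store

# Wide chip with spare ALUs: recomputing is nearly free, VRAM is far away.
r, s = total_cycles(uses=4, recompute_cost=4, fetch_cost=300, store_cost=300)
print(f"spare-ALU chip:  recompute={r} cycles, store/fetch={s} cycles")

# Chip with fast caches but busier ALUs: fetching the stored result wins.
r, s = total_cycles(uses=4, recompute_cost=40, fetch_cost=20, store_cost=20)
print(f"fast-cache chip: recompute={r} cycles, store/fetch={s} cycles")
```

With the first set of assumptions recomputing wins by a wide margin; with the second, storing and fetching wins - which is the post's claim that neither design is simply "better".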


----------



## douglatins (Mar 31, 2010)

OMGOMGOMGOGMOGM Best review EVUR, I love this review and I want to print it and sleep next to it, and that's no sarcasm...


----------



## HalfAHertz (Mar 31, 2010)

Benetanegia said:


> NO! Not at all. There's nothing wrong on the scheduler and I never suggested anything similar. But the more parallel an architecture is, the more innefficient it is. That's something inherent to that kind of architecture, but there's nothing wrong there. But you will never reach the same efficiency (efficiency as actual perf/raw perf) of a less parallel architecture, that's why you can't compare AMD's flops with Nvidia's flops.
> 
> Regarding AMD cards doing the same calculations multiple times, bear in mind it's just my own speculation and I wasn't saying that in a bad way, it's just my view of how it probably is and a way to explain how AMD use twice as much flops to do the same thing. Often times something is calculated for a shader that is going to be used later, in this case is usual to store it either on the cache or vram (imagine some lighting data that is going to be used as anthe input for HDR, bloom, color correction and whatever filter). My idea is that sometimes (especially if that thing has to be moved to vram) in a chip like AMD's it could be better/faster to calculate some of those things when they are required again, storing it in the SP's registers or L1 cahe, just to use them in the next ready clock cycle, instead of reading the old result stored in L2/vram, because those memories will use more cycles and AMD has spare SPs most of the times anyway. Its architecture favours that kind of brute force programing, while Nvidia's architecture, with its bigger and faster registers and caches, favors the other method. That's not to say Nvdia's method is better as both architecures have been trading blows depending on the generation. Ati's method lets them pack more raw Gflops in the same die area or transistor budget, but they can't be used as efficiently and Nvidia's are efficient, but take more space. As I said, looking back at the performance to transistor ratio, both have been pretty close. Take into account that although G92 and GT200 had more transistors than RV670 and 770, the competing products GTX260 and 8800GT had many clusters disabled. If Nvidia had made them that size i.e 216 SPs/28 ROPs instead of 240/32 it would have been pretty much the same size as RV770.



Ok once you put it that way, it makes a lot more sense


----------



## Black Panther (Mar 31, 2010)

douglatins said:


> OMGOMGOMGOGMOGM Best review EVUR, i love this review and I want to print it and sleep next to it, and thats no sarcasm...



It's an excellent review as I stated in previous posts.

But allow the *'child'* in me to speak: Now I detest fermi more than ever. 


And that means a lot, since I've always used Nvidia cards for the past... 15 years... and bought my very first ATI card a couple of months ago.
So no, it's not fanboyism speaking here.


----------



## qubit (Mar 31, 2010)

mdsx1950 said:


> Your kidding right?



No, I'm not.

There are 700 posts in this thread and I'm not going to plow through them all to get up to speed - would you?

I'd still like to know and if you want to reply by PM to avoid flames (there's enough on here) I'd be grateful.


----------



## Blacklash (Mar 31, 2010)

5870 Crossfire for me.

http://techreport.com/articles.x/18682/12


----------



## erocker (Mar 31, 2010)

qubit said:


> No, I'm not.
> 
> There's 700 posts in this thread and I'm not gonna plow through them all to get up to speed - would you?
> 
> I'd still like to know and if you want to reply by PM to avoid flames (there's enough on here) I'd be grateful.



Check the front page. www.techpowerup.com


----------



## roast (Mar 31, 2010)

dir_d said:


> hmmmm...
> http://www.legitreviews.com/article/1264/1/
> blame it on w1zz?



You're taking information from a half-baked website that doesn't bother with the real stuff and just f**king cooks eggs on a cooler??


----------



## Flyordie (Mar 31, 2010)

sneekypeet said:


> you just don't get it. How about compassion or constructive criticism?
> 
> Please don't kick the guy when he is down
> 
> ...



A+, man.  So far, all of the reviews I have read from W1zz have been great.  TPU as a whole is also great, as it persuaded me to get an HD5770 instead of waiting for Fermi in all its lateness. 
-
W1zz, I have never once been critical of you or your reviews - many of us haven't - but the few who have, you are letting them get to you.

Look at the bigger picture, W1zz: you helped save me $300+, so how many other people have you saved from the hassle of buying a crap product? Helped launch products into the mainstream?  Helped kill bad products? 

That's all I've got to say. If this is real, which I feel it is... I hope you hand the site to a good runner-up.


----------



## fundayjinx (Apr 1, 2010)

Come on Wizz you have to stay with us I love what you've done with this site and you can't give up on your work now...


----------



## qubit (Apr 1, 2010)

erocker said:


> Check the front page. www.techpowerup.com



Jeez, I hadn't seen that. That's a real shame.  W1zz makes for a great admin and I'm sorry that a bunch of bickering, sniping posters had this effect on him.

I think this feeling must have been building up for a while now, through various negative experiences on the forum though - someone doesn't leave a site he founded after just one lousy thread.

Added this:



W1zzard said:


> what do you think i have been doing for the past 7 years? it's just now that i realized that people here can never be pleased ... i feel like i am wasting my precious time and will go do something else



Well, I have always been happy with your reviews and I know most others here appreciate them, too. I can't believe people made such a fuss over the driver version for the ATI cards. In the end, it was only a 2% difference! Hardly worth getting knickers in a twist over.  I'm sure W1zz had a good technical reason to go with the older driver.

I've always used yours as a benchmark to compare others with. Yours are the ones I quote when discussing graphics cards.

Respect to The Man.


----------



## a_ump (Apr 1, 2010)

dir_d said:


> hmmmm...
> http://www.legitreviews.com/article/1264/1/
> blame it on w1zz?



dir_d, you should be banned. It's people like you who don't appreciate how much W1zz puts into this website, and because of members with your kind of disrespect, W1zz has run his course. There isn't a damn thing wrong with the review. In fact, the only flaws I've ever found in W1zz's reviews have been occasional typos - so in reality, nothing. Find another review of the GTX 480/470 that covers and provides as much information as W1zz's, and then you can say something.


----------



## douglatins (Apr 1, 2010)

Black Panther said:


> It's an excellent review as I stated in previous posts.
> 
> But allow the *'child'* in me to speak: Now I detest fermi more than ever.
> 
> ...



I usually do let the child in me speak. And I was trying to save something for all it's worth.


----------



## chaotic_uk (Apr 1, 2010)

good review as always


----------



## jellyrole (Apr 1, 2010)

Thanks for that, Fits, but I don't think some of these people understood anything you just said. You seem to really like the NV side of things (judging from all the cards you've had and from conversations with you), and you were able to tell it how it really is. Still, it's not ATI's fault that they had time to put a more finessed touch on their drivers by releasing earlier than NV did. Anyway, I'm glad you 'sploded on all the assholes who don't understand the points you made.

Very nice review, W1zzard - as thorough as can be, as usual!


----------



## Fitseries3 (Apr 1, 2010)

Look through this thread....


93% of the people bitching about which drivers were used to test the ATI cards are owners of mid- to low-end ATI cards. 

Why? Because they are "fanboys" and want to see Nvidia beaten by ATI, even if they don't own THE ATI card that is beating Nvidia's offerings.

Another statistic...

89% of the people in this thread bitching about the heat and/or temps of the GTX480 don't currently own a GPU that gets even near as hot - most own older ATI and Nvidia G92/G94 cards that don't get very hot. 

So why are they bitching? Because they probably haven't ever owned a card that runs hotter than 65C on air. 



and the best thing i've seen all day....

EVGA did a test with a GTX 470 and a GTX 480 on the stock air cooler against the same cards with their HydroCopper waterblock in a typical water loop.

results?

GTX 470
Load Heatsink: 83c
Load Waterblock: 38c

GTX 480
Load Heatsink: 95c
Load Waterblock: 49c


http://www.evga.com/forums/tm.aspx?m=271190



So that tells me the stock HDT cooler fucking sucks.
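For what it's worth, the EVGA numbers above work out like this (treating the raw °C load temps as-is, which is the crude but common way these "X% cooler" claims get computed):

```python
# Temperature drop from the EVGA air-vs-water load temps quoted above.

loads = {
    "GTX 470": (83, 38),   # (stock heatsink °C, HydroCopper waterblock °C)
    "GTX 480": (95, 49),
}

for card, (air, water) in loads.items():
    drop = air - water
    pct = 100 * drop / air
    print(f"{card}: {drop}C lower on water ({pct:.0f}% reduction)")
```

That is a 45-46°C drop on both cards, i.e. roughly a 48-54% reduction - right in line with the "40-50%" figure Naekuh cites further down the thread.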


----------



## Steevo (Apr 1, 2010)

At 1.35 volts - with a 1.6 V Phenom II before it in the loop - my 5870 at almost 1100 MHz core only runs at 42C.

480 is a hot bitch. This was even stated by the man who actually had it, tested it, and tried it.


Have you?


----------



## Fitseries3 (Apr 1, 2010)

It is hot, but that is to be expected with new tech that hasn't had time to mature.


----------



## Steevo (Apr 1, 2010)

6 months isn't time? Give me all your money for 6 months I will tell you your investment hasn't had time to mature, give it more time.


fanboi


----------



## imperialreign (Apr 1, 2010)

Fitseries3 said:


> and the best thing i've seen all day....
> 
> EVGA did a test with gtx480 on stock air and another gtx480 with their hyrdocopper waterblock with a typical waterloop.
> 
> ...





Wow . . .

Those are _drastic_ differences.

I wonder, though, whether those results _are_ legit and not fudged by "lab environment" testing... or whether the cooler is more of a "reference" design.  It's a little unusual for the board partners to have a "performance cooler" setup this close to launch - especially _before_ the series release.  It leads me to figure that the vendors knew of the thermal issues well in advance - especially if they've had enough time to design alternative coolers.

Just my musings . . .


----------



## Fitseries3 (Apr 1, 2010)

Steevo said:


> 6 months isn't time? Give me all your money for 6 months I will tell you your investment hasn't had time to mature, give it more time.
> 
> 
> fanboi



There is no schedule for when the next GPU HAS TO come out. It happens when it happens.

And no matter what anyone thinks... I am not an Nvidia fanboy. 

I have had just as many ATI cards as I have Nvidia cards. 

I am currently hunting for some ATI 5k cards to play with.


----------



## Steevo (Apr 1, 2010)

Other than the broken promises and constantly changing specs, the final release is something ATI could have done with the 5870: pump in more volts and let it run hotter. I assure you almost every chip could have made a 1 GHz 5870 at 1.25 volts and still run cooler than Fermi, with performance scaling in step with core speed, as every 5870 review shows. At 1062 MHz I beat a 480 by a decent margin, and still spent less, with a game included. I could hit that on air with the same noise and heat a 480 makes.


Those in the know just know Fermi sucks unless you plan on some massive cooling. I still really want one to play with - try a 437 W TEC and a dedicated water loop on it. Or phase. I believe with LN you could even hit 900 on the core, but who is going to have the balls to solder PSU mains to the board to do it?


----------



## Fitseries3 (Apr 1, 2010)

So let's say ATI wins this round of the battle...

That's 1 in what... 12?

Again, I'm not a fanboy, but don't pass judgement until the product hits retail. 

And why does it matter that the specs changed?

If you are in competition with someone, who in their right mind would want any specs to leak to their competitors? They shouldn't let us know any details until the day it launches.


----------



## Steevo (Apr 1, 2010)

The Rage and the like were a poor man's 3D solution while ATI was in its infancy. I will give you that point. But......

8500: http://www.guru3d.com/review/ati/radeon8500/ 

Wait, does that look like... tessellation? TruForm - something Nvidia put pressure on devs not to use. Close performance to the Ti, and it got better.


The 9000 series beat Nvidia soundly, twice. As a 9600 and 9800 owner I know how well they performed.

The X800XT and the setup around it was a huge push for this site - the BIOS modding for unlocks and the overclocking potential.

The X1800XT was the fastest card when released, shortly followed by the X1900 series. (Also almost the loudest!)


A fail with the 2900, mediocrity with the 3xxx series, and performance-per-dollar dominance with the 4xxx series, continued into the 5xxx series.


So 1 to 12? No - about even, with ATI bringing lots of other new features and forcing Nvidia to play catch-up. ATI Rage Theater.


----------



## Jadawin (Apr 1, 2010)

Probably the best magazine for professionals in Germany has reviewed the GeForce GTX 480, too... and they write: "Die GeForce GTX 480 besitzt einen sechs- und einen achtpoligen Stromanschluss und soll laut Nvidia maximal 250 Watt aufnehmen. In unseren Messungen traten jedoch sogar Spitzen von bis zu 302 Watt auf."

Translated, this means: "The GeForce GTX 480 uses one six-pin and one eight-pin power plug and, according to Nvidia, should draw a maximum of 250 watts. In our measurements, however, we saw peaks of up to 302 watts."

http://www.heise.de/newsticker/meldung/Nvidia-praesentiert-GeForce-GTX-470-und-GTX-480-965390.html


Case closed.
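For scale, the gap heise measured works out as follows (a trivial calculation on the two figures quoted above):

```python
# How far the measured peak exceeds Nvidia's stated 250 W board power,
# using heise's 302 W reading quoted in the post above.

rated_w = 250
measured_w = 302

excess = measured_w - rated_w
print(f"{excess} W over spec ({100 * excess / rated_w:.0f}% above the 250 W rating)")
```

That is roughly a fifth more than the official board power, which is why the single 6-pin plus 8-pin connector arrangement (good for 300 W total including the slot) drew so much comment.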


----------



## DaedalusHelios (Apr 1, 2010)

Steevo said:


> 6 months isn't time? Give me all your money for 6 months I will tell you your investment hasn't had time to mature, give it more time.
> 
> 
> fanboi



Calling Fit an Nvidia fanboy is funny. He held onto two 3870X2s for a while when the 9800GX2 tore ATI a new asshole. 

So if anything, call him an ATI fanboi like you - but that wouldn't have the same shock value, would it? 

Fit isn't really a fanboi, though. He tries to be objective, because he's obsessed with whatever offers the highest performance and with trying out all the new tech. I like to try out all the new tech too, but I don't have pockets as deep - or perhaps I'm too busy having my money consumed by other pursuits. Regardless, you can call him what you want, but an "Nvidia fanboi" he definitely is not.


----------



## roast (Apr 1, 2010)

pantherx12 said:


> This is an "important" review, it should of been done right.



I've only read up to page 8 of this thread, and already you're wrecking my head.
The review _was_ done right.

And if for whatever stupid reason you think it wasn't, in your deluded little world, convince yourself that if you want something done right, you should do the fxcking review yourself.

The review was perfect. End of. Everyone stop whinging or I'm gonna have to release my ginger ninjas on y'all.


----------



## qubit (Apr 1, 2010)

@roast: yup the review was indeed perfect. Naysayers should quit their moaning.

+1.


----------



## animal007uk (Apr 1, 2010)

All I can say is I'm not impressed. I haven't liked Nvidia for years, but that's only down to how much their cards cost: in my own tests my old ATI X1650 Pro beat anything Nvidia had to offer in the same price range. In fact, I personally found Nvidia cards costing around £20 more and offering less performance.

Nvidia has a lot to do before I will ever buy one again, and one good thing I like about ATI is that they have cards for all kinds of people, from the low end to the top of the range, at good prices.

Good luck to you, Nvidia, but I will not buy anything you offer for a long time.

P.S. Good review, even if it was done with the old drivers - and another thing: not everyone gets the improvements with newer drivers, so I think it was a fair review, personally. OK, I'm out; I have no more to say on this and have better things to do.


----------



## gvblake22 (Apr 1, 2010)

Cap'n Killmore said:


> First time poster, long time reader...
> 
> seriously felt like i needed to make this post.... this is by far my favorite HW site on the interwebs.
> 
> ...


I second this (or third, or fourth, depending on how many other people have quoted it).  I visit the site multiple times every day hoping for more content from the great staff here, and I love reading the video card reviews in particular.  If you are indeed leaving, W1zzard, you'd better make damn sure that whoever replaces you continues the performance summary charts!


----------



## overclocker (Apr 1, 2010)

I am going to miss W1zz, and I think he's the reason a lot of people are here. Good luck, W1zz - you can do better.


----------



## Andy77 (Apr 1, 2010)

DaedalusHelios said:


> Calling Fit a Nvidia fanboy is funny. He held onto two 3870X2's for a while when 9800Gx2 tore ATi a new asshole.



Funny - looking at his specs reveals something else...  I hope those are not real?

As to what Fit said earlier about drivers: it has no relevance right now how optimized some drivers are and how others aren't. Not fair? Save it for the poster boy! Right now, a buyer has the choice to spend this amount of money on one card or the other - one with optimized drivers and one without. FWIW, Nvidia might not even bother optimizing this failed Fermi series, and would do better to push everything it has toward a better Fermi II. Think of it like Vista vs. 7: you think MS will give a damn about Vista? They will push the "you could upgrade to 7 for x.99" line, and people will have to comply or suffer lower support, if any.

Besides, they have had the A2 lower-clock silicon since December or January... not optimized? What about their own Heaven 1.1 benchmark, which people couldn't even download? IDK, but I smell optimizations here.

What W1zz's review showed is that even against an unoptimized Cypress, Fermi does a poor job for its position in the market... and the masses' desire to see comparisons with the latest drivers could have been anticipated. It's only fair to say that right now, this is the exact performance anyone could get from current hardware.

I wonder if what Fit said will hold true when Southern/Northern Islands come out: would current beta drivers for Fermi against the then-to-be-released drivers that support the Islands make for a "fair" review? We could continue this until it never ends...

The review is comprehensive, with lots of tests, and the performance summaries are quite useful (gvblake22 also seems to like them), but it's a sad, dumb way - no words for this - to see it end.

One thing's for sure: no one can please everyone, never mind rabid fanboys, no matter how much work is put in... that's why ignorance is bliss.


----------



## Super XP (Apr 1, 2010)

If you guys don't want another thread closed, stop clowning around.


----------



## Naekuh (Apr 1, 2010)

imperialreign said:


> Wow . . .
> 
> Those are _drastic_ differences.
> 
> ...



Water averages a 40-50% reduction in load temps *on a GPU*. 

It's not fudged; that's just what a well-built water setup does.



Fitseries3 said:


> so why they bitching? because they probably havent ever owned a card that runs hotter than 65c on air.



Am i allowed to bitch then since i dont fit in both of your statistics?


----------



## shevanel (Apr 1, 2010)

I bet watercooled/overclocked 480's are going to slam if those temps really stay down that low.


----------



## DaedalusHelios (Apr 1, 2010)

Naekuh said:


> Am i allowed to bitch then since i dont fit in both of your statistics?



Nope, you forfeit those privileges when using anime as an avatar. It is one of those unsaid rules of the internet.


----------



## Wile E (Apr 2, 2010)

DaedalusHelios said:


> Nope, you forfeit those privileges when using anime as an avatar. It is one of those unsaid rules of the internet.



What?!?!?!?!? FFFFFFFFFFFUUUUUUUUUUUUUU!!!!!!!!!!!!!!!!!


----------



## imperialreign (Apr 2, 2010)

Naekuh said:


> water averages a 40-50% reduction in load temps *on GPU*.
> 
> Its not fudge, thats just how powerful a well built water setup does.




I'm still simply shocked by it - I've never even had the urge to look into HOH cooling for any of my GPU setups . . . mostly due to the fact that my rig has simply gotten out of control and I don't have the space for it - and two, that I upgrade my GPUs so often I can't justify the expense when GPU coolers are not always "universal" . . . especially for hardware that is yet to be released.

Still, though, for any of the green camp brands to already have a HOH cooler ready for release . . . that kinda hints at the possibility that the whole green camp knew these cards were going to be nuclear.  Usually there's a good 2-3 month stint between initial release and the brands releasing "performance cooled" cards.


----------



## Wile E (Apr 2, 2010)

imperialreign said:


> I'm still simply shocked by it - I've never even had the urge to look into HOH cooling for any of my GPU setups . . . mostly due to the fact that my rig has simply gotten out of control and I don't have the space for it - and two, that I upgrade my GPUs so often I can't justify the expense when GPU coolers are not always "universal" . . . especially for hardware that is yet to be released.
> 
> Still, though, for any of the green camp brands to already have a HOH cooler ready for release . . . that kinda hints at the possibility that the whole green camp knew these cards were going to be nuclear.  Usually there's a good 2-3 month stint between initial release and the brands releasing "performance cooled" cards.



Nah. Evga and BFG have been doing factory h2o cards for years now.

And buy/build a tech station, and watercool everything. It's how I run 2 loops.


----------



## entropy13 (Apr 2, 2010)

DaedalusHelios said:


> Nope, you forfeit those privileges when using anime as an avatar. It is one of those unsaid rules of the internet.



<---That's you btw.


----------



## nt300 (Apr 2, 2010)

imperialreign said:


> I'm still simply shocked by it - I've never even had the urge to look into HOH cooling for any of my GPU setups . . . mostly due to the fact that *my rig has simply gotten out of control* and I don't have the space for it - and two, that I upgrade my GPUs so often I can't justify the expense when GPU coolers are not always "universal" . . . especially for hardware that is yet to be released.
> 
> Still, though, for any of the green camp brands to already have a HOH cooler ready for release . . . that kinda hints at the possibility that the whole green camp knew these cards were going to be nuclear.  Usually there's a good 2-3 month stint between initial release and the brands releasing "performance cooled" cards.


That’s an understatement - your rig is a disaster, one I wish I had.


----------



## imperialreign (Apr 2, 2010)

Wile E said:


> Nah. Evga and BFG have been doing factory h2o cards for years now.
> 
> And buy/build a tech station, and watercool everything. It's how I run 2 loops.




Oh, I know there've been some brands with factory HOH coolers - I'm just not used to seeing them so soon after a card's release.

Regarding the rig - I've been giving thought to building a tech station this last month . . . possibly something that can support/contain 2-3 systems . . . I really need to draw up some plans.



nt300 said:


> That’s an understatement, your rig is a disaster, one which I wish I had



Thanks.  Although, I swear sometimes that hardware just spawns within my rig.  It's not unusual for me to be fiddling around with things and say "WTF did I buy this?!"


----------



## Steevo (Apr 2, 2010)

Your power bill.


----------



## crazyeyesreaper (Apr 2, 2010)

Can we just have this thread locked now? The review is out, so is everyone else's, and the thread no longer has any significant contribution - just back-and-forth mudslinging after the first few pages.


----------



## Steevo (Apr 2, 2010)

But it's fun. 


I can't wait to see the reviews and overclocked performance of the watercooled editions. For now this is probably the most cussed and discussed review on TPU. I say leave it.


----------



## dumo (Apr 2, 2010)

GTX480s will replace my HD5970.

You have to try it to be able to give an opinion, right?

If it turns out to be a flamethrower, then stick a water block on it.


----------



## Fitseries3 (Apr 2, 2010)

dumo said:


> GTX480s will replace my HD5970
> 
> You have to try it to be able to give an opinion right?
> 
> If it turned out to be a flame thrower then stuck it with water block



+100

temps are good on water


----------



## DaedalusHelios (Apr 2, 2010)

Fitseries3 said:


> +100
> 
> temps are good on water



Who knows what it could scale to on water and maybe more volts. Vantage record anyone?


----------



## dumo (Apr 2, 2010)

DaedalusHelios said:


> Who knows what it could scale to on water and maybe more volts. Vantage record anyone?


It will chug LN2 like winos in a vineyard


----------



## Maban (Apr 2, 2010)

I would still like to know the power consumption while overclocked. I think you should add that to every review.


----------



## DaedalusHelios (Apr 2, 2010)

dumo said:


> It will chug LN2 like winos in a vineyard



As with any Vantage record really. Fermi is no different in that respect. I like regular TEC just as much and it isn't as dangerous. I have never used LN2 on computer parts though.


----------



## qubit (Apr 2, 2010)

I haven't seen any F@H benchies for this thing. I'd be interested to know how it compares to the 285 & 295. Of course, I'm only interested in a W1zzard review, not any other site...


----------



## cowie (Apr 3, 2010)

Glad W1zz did not use the ATI cheater drivers... what, you wouldn't take the payola, W1zz? Good for you.


http://www.pcgameshardware.de/aid,7...ias-GF100-Generation/Grafikkarte/Test/?page=3

http://www.xtremesystems.org/forums/showthread.php?t=248755

This 480 card is going to haul ass when cold.


----------



## DaedalusHelios (Apr 3, 2010)

cowie said:


> glad wizz did not use the ati cheater drivers...what you would not take the payola wizz? good for you.
> 
> 
> http://www.pcgameshardware.de/aid,7...ias-GF100-Generation/Grafikkarte/Test/?page=3
> ...



I love how the ATi fanboys in the linked thread said silly faggy comments like, "We are talking about ATi here. What did they ever do to us?". Most pathetic crap I have ever seen. No company has ever made me their bitch.


----------



## qubit (Apr 3, 2010)

*The moral of the story*

Also, for anyone that thinks W1zzard's reviews aren't comprehensive enough, compare the latest GTX 480 review with his 8800 GTX review from November 2006. He only compared it to 3 cards back then. Now, it's what, 10-15 at a time and with more games and resolutions? That's a lot more work than before and much more informative for us. There's also more information on various aspects of the card in current reviews.

The moral of the story is to appreciate and respect the guy for his quality reviews and to provide _constructive_ criticism and suggestions only. Leave the fanboi flaming at home.


----------



## LAN_deRf_HA (Apr 3, 2010)

cowie said:


> glad wizz did not use the ati cheater drivers...what you would not take the payola wizz? good for you.
> 
> 
> http://www.pcgameshardware.de/aid,7...ias-GF100-Generation/Grafikkarte/Test/?page=3
> ...



How quickly people forget what they don't want to remember. Nvidia did the exact same thing with their drivers back when we had 3870 vs 8800 GT reviews.


----------



## crazyeyesreaper (Apr 3, 2010)

And I'm gonna be honest: looking at that, I see no fucking difference, period, on that TINY section of screen, so I sure as hell am not gonna notice it when I'm running around shooting people and blowing stuff up. People seriously need to take a step back, look close, and tell me that IQ difference is worth the shitstorm I see on multiple forums now. All I can think is: WTF, who cares? I'm semi-blind anyway; I surely can't tell the difference. Can you?


----------



## Benetanegia (Apr 3, 2010)

crazyeyesreaper said:


> and im gonna be honest looking at that i see no fucking difference period on that TINY section of screen so im sure as hell not gonna notice it when im running around shooting ppl and blowing stuff up ppl seriously need to take a step back and look close and tell me that IQ difference is worth the shit storm i see on multiple forums now.  all i can think is wtf who cares im semi blind anyway i surely cant tell the difference can you?



Of course we can tell the difference lol. It's like going back to the 90's.

I'd like to note that this is not new at all. For those with short memories, this "optimization" was discovered by Alienbabeltech.com just weeks after the HD5xxx series was released, and was mainly posted and defended by Bo_Fox in several threads. Although I could clearly see the difference in many games, I didn't insist too much on the matter, but it did start the succession of events that led to the permanent ban of Bo_Fox. I'm not saying he was banned to censor him - he was banned for how he did it, not for what he was saying, namely breaking many rules - but it was the constant "criticism" he suffered from Ati fanboys, who were constantly denying the facts and at some points insulting him, which led to his behavior.

http://alienbabeltech.com/main/?p=12648&page=2









> With 8xSS, the surfaces in the gray rings look absolutely superb, and all star patterns are almost invisible. Sadly the sharp transitions are still present however.



http://alienbabeltech.com/main/?p=12648&page=6



> Also if you look carefully in all of the 5770 pictures, you can see two borders running horizontally that very slightly mark transitions of varying texture detail. This is probably caused by the same issue that caused the sharp gray transitions earlier, and I’ve marked the 8xSS picture with white arrows to show them. It’s tough to see them in still screenshots, but they’re more apparent during movement.


----------



## crazyeyesreaper (Apr 3, 2010)

Well, I loaded Crysis and Warhead and I notice no difference in image quality at all, and I'm on the 10.3b drivers. If there's a difference, so be it; I don't really care. I've seen a bigger image-quality difference going from monitor to monitor, so it has no impact on me.


----------



## TheMailMan78 (Apr 3, 2010)

Benetanegia said:


> Of course we can tell the difference lol. It's like going back to the 90's.
> 
> I'd like to note that this is not new at all. For those with short memories, this "optimization" was discovered by Alienbabeltech.com just weeks after the HD5xxx series was released and was mainly posted and defended by Bo_Fox in several threads. Although I could clearly see the difference in many games, I didn't insist too much on the matter, but it did start the succession of events that led to the permanent ban of Bo_Fox. I'm not saying it was banned to censor him, since he was banned for how he did it and not for what he was saying, namely breaking many rules, but it was the constant "criticism" he suffered from Ati fanboys, who were constantly denying the facts and at some points insulting him, which led to his behavior.
> 
> ...



Bah more excuses.


----------



## Benetanegia (Apr 3, 2010)

TheMailMan78 said:


> Bah more excuses.



Bah, more denying facts.


----------



## TheMailMan78 (Apr 3, 2010)

Benetanegia said:


> Bah, more denying facts.



Nope, it's just that people keep going on and on about TWIMTBP and ATI AI when both sides use them as excuses for their cards' shortcomings. Get over it. Both sides do it.


----------



## Benetanegia (Apr 3, 2010)

TheMailMan78 said:


> Nope its just people keep going on and on about TWIMTBP and ATI AI when both sides uses them as excuses for their cards short comings. Get over it. Both sides do it.



Oh yeah, of course both sides do it; that's why this should get attention, so that it gets fixed. Nvidia fixed the 169.04 issue in a little over a week, when they released the 169.21 betas. If Ati does the same and fixes it soon, there's no harm. But that's exactly why this has to be mentioned everywhere: it's been happening for months with little attention, and hence it was not fixed!!


----------



## TheMailMan78 (Apr 3, 2010)

Benetanegia said:


> Oh yeah, of course both sides do it, that's why this should have attention, so that it gets fixed. Nvidia fixed the issue with 169.04 in a little over a week, when they released the 169.21 betas. If Ati does the same and fixes it soon, there's no harm. But that's exaclty why this has to be mentioned everywhere, because it's been happening for months, but with little notoriety and hence it was not fixed!!



Because it's not noticeable. Who cares? I mean, really.


----------



## Benetanegia (Apr 3, 2010)

TheMailMan78 said:


> Because its not noticeable. Who cares. I mean really.



It's not noticeable? 

Maybe you don't notice it, but I do, and many do. I know I would never pay for a faster, newer card so I can increase IQ (i.e. AA levels) just to get the IQ reduced instead. Yeah, many people claim they see no difference between AA and no AA, or AF and no AF, but there is one. The seams between LODs are very noticeable; the fact that some can and some can't see the difference means nothing. We would need to know whether you all really see no difference. Ask a PS3 or Xbox 360 owner if they notice the lack of AA/AF, or the fact that their 1080p games are upscaled...


----------



## TheMailMan78 (Apr 3, 2010)

Benetanegia said:


> It's not noticeable?
> 
> Maybe you don't, but I do and many do. I know that I would never pay for a faster newer card so that I can increase IQ (i.e AA levels) just to get the IQ reduced instead. Yah many people claim they see no difference between AA and no AA, or AF and no AF, but there is. The seams between LODs is very noticeable, the fact that some or some can't see the difference means nothing. We would need to know if you all really see no difference. Ask a PS3 or Xbox360 owner if they notice the lack of AA/AF or the fact that their 1080p games are upscaled...



The difference we are talking about is nowhere near that. Of course you can tell the difference between having AA/AF and lacking it, but the issue he was talking about is a mild engine tweak at the driver level. You are exaggerating.


----------



## crazyeyesreaper (Apr 3, 2010)

And I'm semi-blind, so either way, can we stop the pissing match? If it's a bug, it's a bug; if it's a cheat, it's a cheat. I don't give a shit unless it STOPS my game from running - then I have a problem.


----------



## Benetanegia (Apr 3, 2010)

TheMailMan78 said:


> The difference we are talking about is no where near that. Of course you can tell the difference between a lack of AA/AF but the issue he was talking about is a mild engine tweak at driver level. You are exaggerating.



No. For me those seams are far more noticeable than AA at 1920x1200, or than the difference between 2xAF and 16xAF. There's no comparison between some little jaggies and a constant, obvious line 2 meters in front of you.

I suppose you never use Vsync, then, since that would be the most stupid thing you could do. If you can't see a permanent line, why on earth would you see a line that appears for just 1 or 2 frames every second...


----------



## TheMailMan78 (Apr 3, 2010)

Benetanegia said:


> No. For me those seams are far more noticeable than AA at 1920x1200 or between 2xAF and 16xAF. There's no comparison between some little jaggies and a constant obvious line 2 meters in front of you.
> 
> You never use Vsync I suppose, since it would be the most stupid thing you could do. If you cant see a permanent line, why on earth would you see a line that appears just 1 or 2 frames every second...



Show me the screen shots.


----------



## dumo (Apr 3, 2010)

Probably better to hold off on the pissing match until next week (at least for the North America region), when the GTX 480 and 470 will be available in the retail channel.

Buy the card or cards and see for yourself.


----------



## HalfAHertz (Apr 3, 2010)

TheMailMan78 said:


> Because its not noticeable. Who cares. I mean really.



The problem is that if they don't fix it, and say two driver updates later there's another bug that degrades IQ, and then 3 updates later another one, sooner or later it all adds up and we end up with much poorer IQ because of laziness...


----------



## TheMailMan78 (Apr 3, 2010)

HalfAHertz said:


> The problem is that if they don't fix it and say two driver updates later there's another bug that degrades IQ and then 3 updates later another one, sooner or later it all adds up and we end up with much poorer IQ because of laziness...



Like I said, I want to see some screens.


----------



## Benetanegia (Apr 3, 2010)

TheMailMan78 said:


> Like I said I want to see some screens



It's far more noticeable while gaming than in a still image. And seriously, I'm not going to waste my time showing you further evidence; if it doesn't bother you, that's fine, but don't say there's nothing there, because there is. This is like discussing whether there's a difference between 4xAA and 32xAA: of course there's a big improvement, but it's subjective and some will not see it (some will not care, some will be willing to accept it for improved performance...). But in this analogy we have been using 32xAA since the late 90's, and some of us just don't want to go back.


----------



## cowie (Apr 3, 2010)

I'm sorry to post that, but keep it civil and don't get booted - I would feel bad if that happened.
If it's true (which it looks 1000% to be), the thing that gets my goat is that no one gives a rat's ass. We might be playing games that look like Pong again, lol.
If no one says shit about it, it won't get fixed or even admitted.


----------



## TheMailMan78 (Apr 3, 2010)

Benetanegia said:


> It's far more noticeable while gaming than on a still image. And seriously I'm not going to waste my time showing you further evidences, if it doesn't bother you, that's right, but don't say there's nothing there, because there is. This is like dicussing if there is difference between 4xAA and 32xAA, of course there is a heavy improvement, but it's subjective and some will no see it (some will not care, some will be willing to accept it for improved performance...). But in this analogy we have been using 32xAA since late 90's and some of us just don't want to go back.





cowie said:


> i'm sry to post that but keep it civil and dont get booted,i would feel bad if that would happen.
> if true(which it looks 1000%) the thing that gets my goat no 1 gives a rats ass might be playing games that look like pong again lol
> if no one say shit about it wont get fixed or even admitted



Honestly I have yet to see any real proof of it.


----------



## DaedalusHelios (Apr 3, 2010)

TheMailMan78 said:


> Nope its just people keep going on and on about TWIMTBP and ATI AI when both sides uses them as excuses for their cards short comings. Get over it. Both sides do it.



Nvidia makes a mistake = outrage toward those evil fucks! 

ATi makes a mistake = Get over it. Both sides do it.

Double-standard BS. The love of ATi reminds me of feminism: opinions based on emotion and not logic. Nvidia bashing is expected and welcomed; ATi bashing is somehow wrongdoing? Let's be consistent. I question both companies and their motives. We all should.


----------



## Wile E (Apr 3, 2010)

I don't like that IQ is reduced to gain frames. I want the control over that, not my drivers.


----------



## Tatty_One (Apr 3, 2010)

DaedalusHelios said:


> Nvidia makes a mistake = outrage torward those evil fucks!
> 
> ATi makes a mistake = Get over it. Both sides do it.
> 
> Double standard BS. The love of ATi reminds me of feminism. Opinions based on emotion and not logic. Nvidia bashing is expected and welcomed. ATi bashing is somehow wrong doing? Lets be consistent. I question both companies and their motives. We all should.



I agree, sadly there is no logic in fanboi wars but hey...... I suppose the real secret is to not be a fanboi!


----------



## DaedalusHelios (Apr 4, 2010)

Tatty_One said:


> I agree, sadly there is no logic in fanboi wars but hey...... I suppose the real secret is to not be a fanboi!



I used to be somewhat of an ATi fanboi a while ago, but people like you taught me to be more objective.


----------



## TheMailMan78 (Apr 4, 2010)

DaedalusHelios said:


> Nvidia makes a mistake = outrage torward those evil fucks!
> 
> ATi makes a mistake = Get over it. Both sides do it.
> 
> Double standard BS. The love of ATi reminds me of feminism. Opinions based on emotion and not logic. Nvidia bashing is expected and welcomed. ATi bashing is somehow wrong doing? Lets be consistent. I question both companies and their motives. We all should.



What mistake? I want to see some screens; all I hear is a bunch of hearsay. Anyway, an ATI IQ "mistake" won't blow a PSU like an Nvidia voltage "mistake". Apples and oranges. Should ATI's issue be fixed? It damn sure should, if it's true, but Nvidia's could cost you money. Big difference. Also, I have no "outrage". I couldn't care less. I buy what's best for me at any given time. Right now that's ATI. Next year it could be Nvidia.

All I'm asking is to see some screens.


----------



## cliffmidnite (Apr 5, 2010)

TheMailMan78 said:


> What mistake? I want to see some screens. All I hear is a bunch of hearsay. Anyway an ATI IQ "mistake" wont blow a PSU like a Nvidia voltage "mistake". Apples and oranges. Should ATIs issue be fixed? It damn sure should if its true but Nvidia could cost you money. Big difference. Also I have no "outrage". I could care less. I buy whats best for me at any given time. Right now thats ATI. Next year it could be Nvidia.
> 
> All Im asking is to see some screens.



I too buy what is best for me at the time of purchase. Right now it looks like ATI has the best price per performance and performance per watt. I grabbed an 8800 GTS when they dropped to about $100, then grabbed a GTX 260 when they hit $180. Then I snatched up a 5870 on sale for $379 in February. I'm thinking this card will tide me over for 2 years or so, unless more demanding games actually start to catch up to the GPU power we have available. Cheers to all the unbiased non-fanboys!


----------



## nt300 (Apr 5, 2010)

Benetanegia said:


> It's far more noticeable while gaming than on a still image. And seriously I'm not going to waste my time showing you further evidences, if it doesn't bother you, that's right, but don't say there's nothing there, because there is. This is like dicussing if there is difference between 4xAA and 32xAA, of course there is a heavy improvement, but it's subjective and some will no see it (some will not care, some will be willing to accept it for improved performance...). But in this analogy we have been using 32xAA since late 90's and some of us just don't want to go back.


I see a difference when using 32xAA versus 4xAA or 8xAA when playing Left 4 Dead 1 and 2, along with Far Cry, Oblivion, Titan Quest and a few others. It seems like the game gets a little darker and cleaner, or sharper, with 32xAA, but I do have a hard time seeing a PQ difference between 24xAA and 32xAA.


----------



## Black Panther (Apr 5, 2010)

The ETA on OcUK's site is tomorrow.

Edit: Now that 'tomorrow' has come, they changed the ETA to 12th April... meh


----------



## Sanhime (Apr 5, 2010)

*revisions coming very soon?*

Perhaps, only a couple of months after Nvidia's 480 release, we'll start seeing a GTX 485 very soon?


----------



## DaedalusHelios (Apr 5, 2010)

TheMailMan78 said:


> What mistake? I want to see some screens. All I hear is a bunch of hearsay. Anyway an ATI IQ "mistake" wont blow a PSU like a Nvidia voltage "mistake". Apples and oranges. Should ATIs issue be fixed? It damn sure should if its true but Nvidia could cost you money. Big difference. Also I have no "outrage". I could care less. I buy whats best for me at any given time. Right now thats ATI. Next year it could be Nvidia.
> 
> All Im asking is to see some screens.



If you don't see the trend, perhaps you and many others just do it subconsciously. It's no big deal to me, as I don't care much for either company, but it is something I have noticed, and not just from you. This isn't meant to single you out; heck, you are not the best example of it by far.


----------



## Sanhime (Apr 5, 2010)

Something important was missing in the review.  I would like to see the performance of a 480GTX SLI setup.


----------



## TheMailMan78 (Apr 5, 2010)

Sanhime said:


> Something important was missing in the review.  I would like to see the performance of a 480GTX SLI setup.



Good luck getting two Fermis.


----------



## Marineborn (Apr 5, 2010)

Sanhime said:


> Something important was missing in the review.  I would like to see the performance of a 480GTX SLI setup.



good luck getting a powerplant, LOLZ


----------



## mdsx1950 (Apr 6, 2010)

Sanhime said:


> Perhaps only a couple months after Nvidia releases 480, we'll start seeing a 485gtx very very soon?



And hopefully GTX495 too  I would love to see the performance and the heat of the sun generated by the dual GPU card that beats the 5970


----------



## nt300 (Apr 6, 2010)

Sanhime said:


> Perhaps only a couple months after Nvidia releases 480, we'll start seeing a 485gtx very very soon?


Not likely. Looks like Nvidia needs to scrap GF100 completely and move to something that works and makes them a profit.



> Getting back to the GF100, sources at Nvidia are confirming that the yields are far less than 20 percent on both of the variants, GTX480 and GTX470, combined. This means that the silicon for the GPU, before packaging and testing, costs at least $250 for each part. Once you add all the components and make the card, there is no way the GTX470 and GTX480 can make a profit, given what they are being sold for.
> 
> If you are queued up for the second batch, don't expect there to be one. If Nvidia is losing money on each board it sells, trying to make it up in volume is not a sane way to proceed. Expect Nvidia to once again pretend that the cards are not EOL'd and trickle the stock it puts aside out over the next few months. If this sounds eerily familiar, that is because Nvidia is doing the exact same thing with the GTX285. It is keeping up appearances while its partners get no stock and suffer the financial consequences.
> 
> ...
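The quoted $250-per-part figure is easy to sanity-check with rough wafer math. The wafer price and die count below are illustrative assumptions of mine (not sourced figures); the point is just that cost per *good* die scales inversely with yield:

```python
# Rough per-good-die cost model for a large GPU like GF100.
# All inputs are illustrative assumptions: a 300 mm TSMC 40 nm wafer
# in the ~$5,000 range and roughly 100 candidate dies per wafer for
# a ~530 mm^2 chip.

def cost_per_good_die(wafer_cost: float, dies_per_wafer: int, yield_rate: float) -> float:
    """Silicon cost attributed to each working die."""
    return wafer_cost / (dies_per_wafer * yield_rate)

wafer_cost = 5000.0   # USD, assumed 40 nm wafer price
dies = 100            # assumed candidate dies per wafer
for y in (0.20, 0.40):
    print(f"yield {y:.0%}: ${cost_per_good_die(wafer_cost, dies, y):.0f} per good die")
```

Under those assumptions, a sub-20% yield gives $250+ per die, matching the quote, while doubling the yield to 40% would halve the silicon cost.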


----------



## Steevo (Apr 6, 2010)

I'm guessing Nvidia will stockpile the best cores, use the worst cores for low-end cards and cheapo workstation cards, then release a high-end Fermi part that can hit a certain speed with full shaders, name it the 495 or some bullshit, add a limited-time-only thingy, throw on some huge and insane cooler (or force watercooling), and sell it for $699 each. 


GPUs need to make the move to SOI.


----------



## SetsunaFZero (Apr 6, 2010)

Steevo said:


> I'm guessing Nvidia will stockpile the best cores, use the worst cores for low end cards and cheapo workstation cards, then release a high end Fermi part that can hit a certain speed with full shaders, name it the 495 or some bullshit, add a limited time only thingy, throw on some huge and insane cooler, or force watercooling, and sell it for 699 each.


That's what I thought too. 

I guess the refresh is gonna be 32nm. For now, I'm gonna stick with my GTX 275 for another year. 
It doesn't pay off to buy a new card now.:shadedshu


----------



## Super XP (Apr 6, 2010)

SetsunaFZero said:


> thats what i thought to
> 
> i guess the refresh is gonna be 32nm. As for now im gonna stick with my gtx275 for another year.
> It doesn't pays off to bay a new card now.:shadedshu


Didn't you hear? There is no 32nm; it got scrapped. We're going to see 28nm sometime between Q4 2010 and Q1 2011 for AMD's Northern Islands. I don't think 28nm is going to help Fermi much with speed, but maybe with the heat problems.


----------



## SetsunaFZero (Apr 6, 2010)

Super XP said:


> Didn't you hear, there isn't 32nm it got scrapped. We are going to see 28nm sometime between Q4 2010 to Q1 2011 for AMD's Northern Island. I don't think 28nm is going to help Fermi with speed all that much, but heat problems maybe.


Cool.  Yeah, speed won't change, but power consumption and heat will. That's what I'm looking forward to.


----------



## newtekie1 (Apr 6, 2010)

TheMailMan78 said:


> What mistake? *I want to see some screens. All I hear is a bunch of hearsay. Anyway an ATI IQ "mistake" wont blow a PSU like a Nvidia voltage "mistake".* Apples and oranges. Should ATIs issue be fixed? It damn sure should if its true but Nvidia could cost you money. Big difference. Also I have no "outrage". I could care less. I buy whats best for me at any given time. Right now thats ATI. Next year it could be Nvidia.
> 
> All Im asking is to see some screens.



I just have to point out the irony of that statement, considering there is no proof of nVidia's "mistake" killing a single PSU.

Doesn't that kind of prove DaedalusHelios' point about the hypocritical BS?  You demand proof of ATi's issue, but are more than happy to believe hearsay about nVidia without a single shred of proof?

Oh, and of course there were screenshots proving ATi's mistake, if you read the original thread about it...

And if you really want to compare apples to apples, look at the HD4850's issues with Furmark loops burning up the cards... Furmark running in a loop is torture on a machine; that's why it's called a torture test.



DaedalusHelios said:


> If you don't see the trend perhaps you and many others just do it subconciously. It is no big deal to me as I don't care much for either company but it is something that I have noticed and not just from you. This isn't meant to single you out. Heck you are not the best example of it by far.



I'm just glad I'm not the only one who notices the trend.  It's probably why most people notice me "bashing ATi" so much, when really I'm just defending nVidia from idiotic fanboys who believe that any ill talk of ATi or positive talk of nVidia makes me an nVidia fanboy.  If they paid attention, they would probably see that I usually defend ATi in any idiotic thread made about them as well; those threads just come along far less often around here, and when they do and I defend ATi, most don't even notice, because they're too busy reading only the posts that are negative about ATi - they tend to just skim over the positive ones.


----------



## TheMailMan78 (Apr 7, 2010)

newtekie1 said:


> I just have to point out the irony of that statement.  Considering there is no proof of nVidia's "mistake" killing a single PSU.
> 
> Doesn't that kind of prove Deadhelios' point about the hypocritical BS?  You demand proof of ATi's issue, but are more than happy to believe hearsay about nVidia without a single shred of proof?
> 
> ...


And that is why W1zz contacted Nvidia about their mistake? I guess it was just a typo, right? I mean, voltage is just as much a matter of opinion as, say, image quality?  As for any "proof" of Nvidia killing a PSU, let's not be stupid. The damn card isn't on the shelf yet, and even if it were, Nvidia would never admit (nor would ATI) to such a mistake. Playing with voltage numbers is FAR more dangerous than bending IQ to "cheat". Also, speaking of IQ, I saw no proof in that link. So let's sum this up simply:

1. Nvidia "played" with the voltage numbers. That's a fact, unless you think W1zz lies.
2. ATI cheating in the IQ section has yet to have any proof. 

With that being said, I have no idea if ATI is doing what some of you claim. It wouldn't surprise me, to be honest. I just want some concrete proof.


----------



## erocker (Apr 7, 2010)

To everyone:

This "thread" is for the discussion of this GTX 480 review. This is not an ATi vs. Nvidia thread. Post your thoughts on the review and move along.

Thanks for your cooperation.


----------



## 20mmrain (Apr 7, 2010)

The fact is that Nvidia screwed up this round and was too ambitious. They made a card that runs hotter than the competition. They made a card that uses more power than the competition. They also made a card that, despite arriving 6 months later, is only about 10% better than the competition.
From what I hear, Nvidia was having money problems to begin with. I don't think this will be the end for Nvidia, but I think we might see a couple of low years from them. I think we might also see the market share become more 50/50 over the next year or two.

While I have always been more in favor of the red team..... these problems that Nvidia could start having still don't make me excited. All these problems mean that price wars are probably out of the question. I think that is bad news for everyone.

Let's hope that with some driver tweaks and a little bit of reworking (like an 8800 GT to a 9800 GT, or a GTX 280 to a GTX 285) Nvidia can pull this technology off.... and we can see even better technology in the future, as well as some decent price wars.

Although these cards are listed for sale everywhere, no one actually has one. So maybe after non-reference designs come out and we get to play with them..... they might not be as bad as they seem (except for the heat issue). I am still not sure how they can control that without a really beefed-up air cooler, if not water cooling.

When I saw this..... I didn't think it was a good sign.....

When I saw this ..... I didn't think it was a good sign.....

http://www.evga.com/products/moreInfo.asp?pn=015-P3-1489-AR&family=GeForce%20400%20Series%20Family

The only way EVGA can produce a FTW edition card is to have this thing water cooled. That isn't good at all!


----------



## erocker (Apr 7, 2010)

20mmrain said:


> The fact is that Nvidia screwed up this round and was too ambitious. They made a card that runs hotter than the competition. They made a card that uses more power than the competition. They also made a card that, despite arriving 6 months later, is only about 10% better than the competition.
> From what I hear, Nvidia was having money problems to begin with. I don't think this will be the end for Nvidia, but I think we might see a couple of low years from them. I think we might also see the market share become more 50/50 over the next year or two.
> 
> But while I have always been more in favor for the red team..... these problems that Nvidia could start having still doesn't make me excited. All these problems mean that price wars are probably out of the question. I think that is bad news for everyone.
> ...



I'll take one of those.  Running that card in my loop would be nothing but good!    -not too pricey though...


----------



## 20mmrain (Apr 7, 2010)

erocker said:


> I'll take one of those.  Running that card in my loop would be nothing but good!



Well, for sure if you had a loop..... but for people like me (who haven't invested in a loop yet) and for most other people in the world..... it won't be practical.

Face it, it wouldn't be practical for most gamers either. 

I think a lot of Nvidia people were waiting for this card to come out and be the next big thing...... I believe most of those same people were waiting for it ..... so that they could upgrade their aging 8800 GTXs or 9800 GTXs. I know I have heard a lot of people claiming that. 
So if I were someone waiting for this card..... and had put all my eggs in one basket.... I would be upset too.

I mean, sure, they could always buy from ATI..... but come on.... call it fanboyism, call it whatever you want. Brand loyalty is just a fact of enthusiast life, and if your brand comes out with a bomb.... you're going to be a little bit angry. Especially if you need special equipment that you don't have in order to run it. 

But with that said.... once I get my loop I would gladly play with a GTX 480 FTW too (if someone gave it to me). I bet a good water loop could get that card down to at least 80°C LOL 



> -not too pricey though...



I don't know, man.... $649.99 for really mild extra performance. That seems like a big jump to me from $499.99 or $549.99. You could get a card that is 30% to 40% more powerful for about the same price.


----------



## erocker (Apr 7, 2010)

Meh, I think people know what it is now. It's not spectacular by any means, and people do have a choice. There will be plenty of air-cooled versions, and if someone buys one without knowing these are hot cards that need very good airflow, it's on them. I know Lamborghinis suck a lot of gas and break down often, but that's not the reason I'd buy one.


----------



## 20mmrain (Apr 7, 2010)

erocker said:


> Meh, I think people know what it is now. It's not spectacular by any means. People do have a choice. There will be plenty of air cooled versions and if someone buys one without knowing these are hot cards that need very good airflow it's on them. I know a Lamborghini's suck a lot of gas and break down often but that's not the reason I'd buy one.



You've got a point there..... but I would compare this more to.... a Bentley...... fast enough and comfortable...... but by no means fast enough to compete with an Enzo.

But well spoken nonetheless, Erocker


----------



## entropy13 (Apr 7, 2010)

Lamborghinis suck a lot of gas because of the big engines, but they're quite efficient compared to American "super cars" with big engines.

GTX 480 would be more like American super cars than European.


----------



## 20mmrain (Apr 7, 2010)

entropy13 said:


> Lamborghinis suck a lot of gas because of the big engines, but they're quite efficient compared to American "super cars" with big engines.
> 
> Nvidia would be more like American super cars than European.



+1  But my comparison was more GTX 480 vs 5970 .... though I think yours might have flowed better!

*****Edit Add-on*****

I wonder how many heads are rolling for this screw-up at Nvidia. A card with specs like this one should easily be way more powerful than it is. They still pulled off an engineering marvel of a video card, and Fermi actually works..... but with how hot it runs, I can't expect the longevity of these cards to be very good.


----------



## erocker (Apr 7, 2010)

entropy13 said:


> Lamborghinis suck a lot of gas because of the big engines, but they're quite efficient compared to American "super cars" with big engines.
> 
> GTX 480 would be more like American super cars than European.



I was using Lamborghini as a somewhat generic term, but you seem to know your cars. The newer Ford GT comes to mind then. Now that sucks fuel! (drives way nicer IMO )

The 5850 is a supercharged Elise.

Now look what I'm doing. 

Cheers.


----------



## entropy13 (Apr 7, 2010)

erocker said:


> I was using Lamborghini as a somewhat generic term, but you seem to know your cars. The newer Ford GT comes to mind then. Now that sucks fuel! (drives way nicer IMO )
> 
> The 5850 is a supercharged Elise.
> 
> ...



Well an Elise isn't as fast as the Ford GT, so yeah the 480 is faster than the 5850, but the Ford consumes more.


----------



## phanbuey (Apr 7, 2010)

5850 = caterham super 7


----------



## TheMailMan78 (Apr 7, 2010)

entropy13 said:


> Lamborghinis suck a lot of gas because of the big engines, but they're quite efficient compared to American "super cars" with big engines.
> 
> GTX 480 would be more like American super cars than European.



Then you do not know much about American engines, if you think that. But that's a discussion for another thread. PM me if you want details.



20mmrain said:


> The fact is that Nvidia screwed up this round and was too ambitious. They made a card that runs hotter than the competition. They made a card that uses more power than the competition. They also made a card that, despite arriving 6 months later, is only about 10% better than the competition.
> From what I hear, Nvidia was having money problems to begin with. I don't think this will be the end for Nvidia, but I think we might see a couple of low years from them. I think we might also see the market share become more 50/50 over the next year or two.
> 
> But while I have always been more in favor for the red team..... these problems that Nvidia could start having still doesn't make me excited. All these problems mean that price wars are probably out of the question. I think that is bad news for everyone.
> ...



Honestly, I think Nvidia will come back next round and blow ATI's socks off. Fermi, as I predicted, is Nvidia's HD 2900. We know what that led to.


----------



## entropy13 (Apr 7, 2010)

> And you do not know much about American engines then if you think that. But thats a disscussion for another thread. PM me if you want details.


Obviously off-topic...

Ok, I don't know much about American engines. I wonder why there are more Japanese cars over here than American cars, even though the American cars are cheaper (because of "trade concessions" we have had with the US in place ever since they "gave us" independence).

I'm not referring to the high-end cars though, but rather the Honda Civic and City, the Toyota Altis and Vios, the Ford Focus and Lynx, and the Chevrolet cars whose names I forget. When it comes to the expensive vehicles over here, it's usually European cars, so you can't really compare the American high-ends here.


----------



## Wile E (Apr 7, 2010)

20mmrain said:


> Well for sure if you had a loop..... but for people like me (who haven't invested in a loop yet) and for most of the other people in the world..... it won't be practical.
> 
> Face it even most gamers it wouldn't be practical for.
> 
> ...


Practicality is not a concern at the top end, period. You don't buy top-end cards to be practical.

That said, $650 is actually a fair price for the eVGA water-cooled card. The stock card is $500, and full-coverage blocks generally go for between $120 and $150. An enthusiast wouldn't really save any money by buying a reference card and an aftermarket full-coverage block.
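For what it's worth, here is the arithmetic behind that comparison, using the prices quoted in this thread (the exact block price varies by vendor, so the DIY total is a range, not a firm figure):

```python
# Prices as quoted in the thread, in USD
HYDRO_COPPER = 650                 # eVGA factory water-cooled GTX 480
STOCK_CARD = 500                   # reference GTX 480
BLOCK_LOW, BLOCK_HIGH = 120, 150   # typical full-coverage water block

# Total cost of the do-it-yourself route: reference card + separate block
diy_low = STOCK_CARD + BLOCK_LOW
diy_high = STOCK_CARD + BLOCK_HIGH

print(diy_low, diy_high)   # 620 650 -- DIY lands right at the $650 factory price
```

So at the quoted prices the factory card costs about the same as buying the parts separately, which is the point being made above.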


----------



## Apocolypse007 (Apr 7, 2010)

erocker said:


> I was using Lamborghini as a somewhat generic term, but you seem to know your cars. The newer Ford GT comes to mind then. Now that sucks fuel! (drives way nicer IMO )
> 
> The 5850 is a supercharged Elise.
> 
> ...



so is this the 5970?

http://www.autointhenews.com/insanely-fast-now-has-a-face-the-2011-hennessey-venom-gt/


----------



## Deleted member 24505 (Apr 7, 2010)

Yeah, but can it go around corners? 

I think the 5970 is an Ariel Atom (awesome little motor)


----------



## KainXS (Apr 7, 2010)

there are custom designs popping up that use less power


> As we’ve stated before in our GTX 480 review, there will likely be a ton of custom PCB designs out there that could possibly help Nvidia stay in the game, although as it stands right now there still aren’t any GTX 480’s out there for sale yet and the card is indeed quite hot and power hungry. AIBs hope to remedy both of these issues with custom coolers and non-reference designs.
> 
> Taiyanfa's custom design PCB lowers the maximum power consumption to *225W* [75W from PCIe x16, two 6-pin connectors count for additional 150W total], which is most probably achieved by lowering the GPU clock. However, we see a lot of potential in this design - for instance, populating six PCIe slots on All-PCIe boards such as ASUS P6T6/P6T7 or EVGA's X58 Classified 4-Way SLI for a GPGPU computing monster, single card design for HTPC chassis, or simply users that don't have 8+6-pin power - such as numerous workstation chassis or already assembled computers from Dell, HP and the like



would be nice to see a 470 with this design
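The 225 W figure in the quote falls straight out of the PCI Express power-delivery limits: 75 W from the x16 slot itself, plus 75 W per 6-pin connector (an 8-pin adds 150 W). A quick sketch of that budget arithmetic; the function name here is just for illustration:

```python
# PCI Express power-delivery limits, in watts (per the PCIe spec)
SLOT_X16 = 75     # an x16 slot supplies up to 75 W on its own
SIX_PIN = 75      # each 6-pin auxiliary connector adds up to 75 W
EIGHT_PIN = 150   # each 8-pin auxiliary connector adds up to 150 W

def board_power_limit(six_pin=0, eight_pin=0):
    """Maximum board power for a card sitting in an x16 slot."""
    return SLOT_X16 + six_pin * SIX_PIN + eight_pin * EIGHT_PIN

# Taiyanfa's custom design: slot + two 6-pin connectors
print(board_power_limit(six_pin=2))                # 225
# Reference GTX 480: slot + one 6-pin + one 8-pin
print(board_power_limit(six_pin=1, eight_pin=1))   # 300
```

That is why dropping the 8-pin connector forces the board under 225 W, presumably via lower clocks, as the quote suggests.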


----------



## dumo (Apr 7, 2010)

It will be in stock tomorrow  http://www.newegg.com/Product/Product.aspx?Item=N82E16814125319&cm_re=GTX_480-_-14-125-319-_-Product

Dunno how many they will have in stock; probably it will be sold out in a few hours


----------



## mdsx1950 (Apr 7, 2010)

I wonder what 3-way SLI performance will be like with the GTX 480.


----------



## eidairaman1 (Apr 7, 2010)

If you can keep the temps down long enough to run them in SLI.


----------



## TheMailMan78 (Apr 7, 2010)

entropy13 said:


> Obviously off-topic...
> 
> Ok, I don't know much about American engines. I wonder why there are more Japanese cars over here than American cars, even though the American cars are cheaper (because of "trade concessions" we have with the US in place every since they "gave us" independence).
> 
> I'm not pertaining to the high-end cars though, but rather the Honda Civic, City, Toyota Altis, Vios, and the Ford Focus, Lynx, the Chevrolet cars that I forgot the names. When it comes to the expensive vehicles over here its usually European cars, so you can't really compare the American high-ends over here.



If you honestly want to learn, PM me. What I don't know, I'm sure Wile E and Erocker could help with. Wile E is heavy into Japanese tuning and Erocker owns an auto shop. But I'll tell you right now, you are following a myth about American engines. A good example is the LS series or the Ford Modulars.


----------



## 20mmrain (Apr 7, 2010)

Wile E said:


> Practicality is not a concern at the top end, period. You don't buy top-end cards to be practical.
> 
> That said, $650 is actually a fair price for the eVGA water-cooled card. The stock card is $500, and full-coverage blocks generally go for between $120 and $150. An enthusiast wouldn't really save any money by buying a reference card and an aftermarket full-coverage block.



Yeah, that might be true about practicality..... but even most high-end machines don't have water cooling. And even if our high-end systems are not practical, that still doesn't mean I will buy a water-cooling loop just to buy a video card. Buying a water-cooling loop just to get a video card that is factory overclocked is not just impractical..... it's kind of not smart. But if you're buying it already planning to go water-cooled, or you already are....... then sure, it makes sense. So my point still stands.

Also...... a card that needs a water block, if you ask me..... should have one hell of an overclock to go with it. That card really does not. Even at 100 extra MHz....... a video card should not need a water block.



> Honestly I think Nvidia will come back the next round and blow ATIs socks off. Fermi as I predicted is Nvidias HD2900. We know what that lead to



Yeah, but also remember how far behind Nvidia is with their next generation. ATI could be on their second generation after the 5800 series by the time that rolls around. 

No, in reality..... I just think that things are finally evening out. I think from now on we will see even competition for a while! One manufacturer will beat the other by 10% from time to time, but otherwise they will be on a pretty even keel.

But I could be wrong..... just remember that ATI's commander-in-chief made it a mission statement to beat Nvidia from now on. I don't think that will always happen..... but I think it will be more even because of that push!


----------



## phanbuey (Apr 8, 2010)

20mmrain said:


> Yeah that might be true about the practicality thing but..... Even most high end machines don't have water cooling. And even if our high end systems are not practical. That still doesn't mean I will buy a water cooling loop just to buy a video card. Buying a water cooling loop just to get a video card that is stock overclocked is not just un-practical..... it's kind of not smart. But if your buying it with the original thought of going to be water cooled or you already are....... then sure it makes sense.  So my point still stands.
> 
> Also......to need a water block if you ask me..... should have one hell of an over clock to go with it. That card really does not. Even at 100 extra mhz....... a video card should not need a water block for that.




No, it's for people like me who already HAVE a full loop... and the water block is for whatever overclock you apply to it, assuming the card has voltage control of some sort.

I would totally buy that card if it were cheaper than a stock card plus a waterblock. Especially if they could guarantee that the card was cherry-picked.


----------



## 20mmrain (Apr 8, 2010)

phanbuey said:


> No its for people like me that already HAVE a full loop... and the waterblock is for whatever overclock you apply to it.  Assuming that card has voltage control of some sort.
> 
> I would totally buy that card if it was cheaper than a stock card and a waterblock.  Especially if they could guarantee that this card was cherry-picked.



No, I agree.... you are right, it is for people who already have a loop. But for as long as I can remember, most FTW versions were air-cooled first, before water-cooled...... I might be wrong on that.

But even so..... my whole point is that it must be a bad sign that, in order to even make a FTW version, it needs to be water-cooled. It more or less says that the longevity of this card is going to be terrible because of the heat it runs at.


----------



## Wile E (Apr 8, 2010)

20mmrain said:


> Yeah that might be true about the practicality thing but..... Even most high end machines don't have water cooling. And even if our high end systems are not practical. That still doesn't mean I will buy a water cooling loop just to buy a video card. Buying a water cooling loop just to get a video card that is stock overclocked is not just un-practical..... it's kind of not smart. But if your buying it with the original thought of going to be water cooled or you already are....... then sure it makes sense.  So my point still stands.
> 
> Also......to need a water block if you ask me..... should have one hell of an over clock to go with it. That card really does not. Even at 100 extra mhz....... a video card should not need a water block for that.
> 
> ...


The water-cooled cards are targeted at people who already have a loop (or 2 in my case. lol), or those who already plan to get one. And the factory water-cooled cards never were really uber-clocked; they left that to the buyer most of the time.

If I planned on grabbing a 480, I'd seriously consider this card for its single-slot profile. It would take buying a full-coverage block to get it down to single slot anyway, so why not buy one that already comes that way for around the same price as DIY?

The eVGA card is targeted at a different market than the air-cooled cards. It is definitely niche and can't really be compared to the air cards, tbh.


----------



## erocker (Apr 8, 2010)

TheMailMan78 said:


> If you honestly want to learn PM me. What I don't know I'm sure Wile E and Erocker could help. Wile E is heavy into Jap tuning and Erocker owns an auto shop. But Ill tell you right now you are following a myth on American engines. A good example is the LS series or the Ford Modulars.



We were speaking about supercars. The Europeans have us beat and then some in that department. They also have the sensibility to use diesels in cars. A fully loaded Jaguar XJ that does 60 mpg. Can't get that here. We should take this over to generalnonsense.net. Here's the thread: http://www.generalnonsense.net/showthread.php?p=36036#post36036


----------



## Wile E (Apr 8, 2010)

erocker said:


> We were speaking about supercars. The Europeans have us beat and then some in that department. They also have the sensibility to use diesels in cars. Fully loaded Jaguar XJ that does 60mpg. Can't get that here. We should take this over to generalnonsense.net Here's the thread: http://www.generalnonsense.net/showthread.php?p=36036#post36036



One last OT. Americans still have the super car crown. http://www.shelbysupercars.com/car-specs.php 

I'll take mine in Palladium, please. lol


----------



## entropy13 (Apr 8, 2010)

erocker said:


> We were speaking about supercars. The Europeans have us beat and then some in that department. They also have the sensibility to use diesels in cars. Fully loaded Jaguar XJ that does 60mpg. Can't get that here. We should take this over to generalnonsense.net Here's the thread: http://www.generalnonsense.net/showthread.php?p=36036#post36036



Approve my post there please LOL

How long has post moderation been in place? Last time I checked out generalnonsense there wasn't any post moderation yet.


----------



## mdsx1950 (Apr 8, 2010)

eidairaman1 said:


> if you can keep the temps down long enough to run them in SLI.



On water cooling you should be able to run them in 3-way SLI.


----------



## 20mmrain (Apr 8, 2010)

Wile E said:


> The water cooled cards are targeted at people that already have a loop (or 2 in my case. lol), or those that already plan to get them. And the Factory watercooled cards never were really uber clocked. They left that to the buyer most of the time.
> 
> If I planned on grabbing a 480, I'd seriously consider this card for it's single slot profile. It would take me buying a full coverage block to get it down to single slot anyway, so why not buy one that already comes that way for around the same price as DIY?
> 
> The eVGA card is targeted at a different market than the air cooled cards. It is definitely niche, and can't really be compared to the air cards, tbh.



My whole thing is that the GTX 285 and GTX 280 FTW were overclocked by over 100 MHz and didn't have to be water-cooled. Same with the 9800s and the 8800s, if I remember correctly.

That doesn't mean EVGA did not make a version that was.

My whole point is that it just shows how hot this card runs. 

With that said..... if there is an enthusiast planning on doing 3-way SLI with these..... I would choose no other card but these. Water is the only way you will get SLI to work, because of how much power they take. I don't see them coming out with a GX2 version for quite some time. 

It would take up less space, even at stock speeds it would give off less heat, and it would be the only way not to kill the temps in the rest of your case.

But anyway, don't get me wrong, these cards are still the new big boy on the block and their performance is still really cool. 

There are just too many problems for me...... to actually consider them. Especially the heat issue. If I buy a card, I don't want it dead in 1 year because it runs hotter than a card really should.


----------



## Wile E (Apr 9, 2010)

Yeah, they aren't for everyone, that's for sure.


----------



## Sanhime (Apr 9, 2010)

nt300 said:


> Not likely. Look like Nvidia need to scrap GF100 completely and move to something that works and make them profit.



I don't know what Nvidia needs to do, but the power consumption and heat output are really up there.  Granted, "on paper" GF100 is supposed to be more powerful than the 5970, so it is logical to expect more power consumption and heat output.  That being said, we can look at this differently: this threshold may be the new norm.  Since the late 3dfx Voodoo and Nvidia TNT days, companies have been pushing watts and watts and lots of watts.  But we can only push raw watts so far; there's a limit before you start melting plastics haha.  Think how much the Pentium 4 series was pushed in terms of raw watts and voltages before companies started looking at mainstream multicore CPUs.  I am sure (or hope) that 6 months from now, Nvidia will find a way to shrink the die even more, make some core changes, and release a "485/495", and we will return to the typical 70°C/80°C loads.  

As a sort of side note, I am worried that Nvidia might be making the same mistake 3dfx made with their Voodoo 3/4/5 series.  I honestly don't know how different GF100 is from its predecessors.  Is it a totally new architecture, or just an improvement on the same idea?


----------



## dumo (Apr 10, 2010)

Galaxy GTX480 and EVGA GTX480 Hydro Copper will be available soon


----------



## KaelMaelstrom (Jun 13, 2010)

We are hating this card just because of the heat, the noise, and such, but EVGA has relieved us with their GTX 480 Hydro Copper. The GTX 480 is only for extreme tessellation. Say, W1zzard, did you put the tessellation setting on Extreme in the Heaven 2.0 benchmark, or did you just increase the resolution? I tested my rig with one normal GTX 480 and got 20-24 fps with all settings maxed, then I tried the HD 5970 with all settings maxed and all I got was 8-22 fps. Very weird; note that I didn't overclock or superclock these 2 beasts. Then with the GTX 480 FTW Hydro Copper, I got 25-32.3 fps. So I say get this card after 4 months or more, and many people in the future will like it. I heard someone named Trubritar on YouTube say, "So, respect this card or the beast may just bite you back!!!"


----------



## Conditioned (Aug 4, 2010)

Great work on the review! Also tried to digg it but it whined :/

Anyway, I'm considering this card, and I missed fan noise figures under 'middle' load, i.e. video playback.


----------

