
NVIDIA GeForce GTX 480 Fermi

So Nvidia claimed 295 W and you pulled 320 W? Yeah, I'm with Wile E on this one. That's just a bold-faced lie. I would be SO pissed off if I bought one of these and my PSU couldn't take the load.

Yep. They can PR and spin it all they want. The fact of the matter is, this card is capable of consuming over 300 W in stock form. I'm happy to see W1z sticking to his guns.
 
You know what's interesting: they continue to recommend a 600 W PSU for systems with even a single GTX 480. Minus the graphics card, the average system (whatever is inside the case) draws around 150 W. So even with the crappiest 600 W PSU you can find (75% efficient, so roughly 150 W lost), that's asking consumers for 300 W of headroom for a single card, and a whopping 450 W of room when the PSU peaks (cheapo manufacturers market peak wattage as the rating; good-quality ones market continuous power).
 
FYI, it's not "Bold-Faced" but "Bald-Faced" Liar:

The correct term is bald-faced, and refers to a face without whiskers. Beards were commonly worn by businessmen in the 18th and 19th centuries as an attempt to mask facial expressions when making business deals. Thus a bald-faced liar was a very good liar indeed, able to lie without the guilt showing on his face.
 
421 W was AnandTech's total system load during Crysis. That's 102 W more than a 5870.

Furmark was 479 W, just a few watts short of the dual-GPU GTX 295.

I would think whatever the GTX 295 needs would be good enough for the 480?

For a single GPU this is sad, really. Sad because we have so much time to wait before we ever get to buy something from Nvidia that is worth buying (next-gen toys)... God knows what this stepping-stone of a GPU will lead to, but I sincerely hope this type of product isn't what I must get used to if I want the "fastest". I care less about power draw and more about heat when performance isn't the topic of discussion, but in this case both issues are front and center, and quite major ones too.

For you guys like me that have been gaming since the Quake days... it's 2010, y'all... 2010. This is not progress. (Especially since it's only sometimes faster by 10-13%, and slower in other situations like BC2, which I shouldn't have to remind you is a hell of a lot more popular than Metro 2033.)

I don't complain and bitch to put Nvidia down; I buy their shit too (I've always bought NV until this year). I am just disappointed in what their new choice for next-gen graphics is, and I was hoping for a lot more... well, on the bright side I get to keep some more money in my pocket.
 
"aluminum = 0.896 kJ per kg per Kelvin
copper = 0.383 kJ per kg per Kelvin"



Hold the spork, man!

Did you take the time to interpret those numbers?

per kg, it takes 0.896 kJ of energy to raise Al 1 K . . . and, per kg, it takes 0.383 kJ of energy to raise Cu 1 K . . .

Taking into account energy in must equal energy out, then we could easily say that it takes 0.896 kJ of energy to lower Al 1 K, and 0.383 kJ of energy to lower Cu 1 K . . .

So, kg for kg, it takes less energy to raise or lower Cu 1 K than it does Al . . .

For a material to require less energy to raise or lower its temp, that says to me it's less resistant (not a very scientific term here) to heat, and more willing to give off heat, which equates to being more thermally conductive.

But, that's all kg for kg. If you had a 2 kg Al HS, it would be much larger than a 2 kg Cu HS . . . and based on those numbers you posted, kg for kg, Cu more readily absorbs and dissipates heat, which means . . .

. . . it's still the better material for cooling. :toast:
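As an aside, here's the same arithmetic with handbook densities added, to see what "kg for kg" hides (a sketch; the densities are my figures, not from the quote):

```python
# Sanity check on the specific heats quoted above, plus standard handbook
# densities (assumption, not from the thread) to compare per-volume instead.
c_al, c_cu = 0.896, 0.383      # kJ/(kg*K), specific heats from the quote
rho_al, rho_cu = 2700, 8960    # kg/m^3, handbook densities (assumption)

# Per kilogram, copper indeed needs less energy per kelvin:
print(c_cu < c_al)  # True

# But a heatsink occupies volume, not mass. Per cubic metre:
print(round(c_al * rho_al))  # 2419 kJ/(m^3*K) for aluminium
print(round(c_cu * rho_cu))  # 3432 kJ/(m^3*K) for copper
```

So a copper block of the same *size* actually stores more heat per kelvin than an aluminium one; "kg for kg" and "block for block" point in opposite directions.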
 
You know what's interesting: they continue to recommend a 600 W PSU for systems with even a single GTX 480. Minus the graphics card, the average system (whatever is inside the case) draws around 150 W. So even with the crappiest 600 W PSU you can find (75% efficient, so roughly 150 W lost), that's asking consumers for 300 W of headroom for a single card, and a whopping 450 W of room when the PSU peaks (cheapo manufacturers market peak wattage as the rating; good-quality ones market continuous power).

Not sure if I understood you correctly, but if you're stating that 600W is overkill for a gaming system, I'd have to disagree. You must keep in mind that most PSUs reach optimal efficiency at ~50-55% load, so a 600W PSU is usually targeted at 300-400W systems.
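That sizing rule can be sketched numerically (the helper name is mine; the 50-55% figures come from the post above):

```python
# Rough "sweet spot" sizing rule: most PSUs hit peak efficiency around
# 50-55% load, so a 600 W unit is really aimed at ~300-330 W systems.
def sweet_spot(rating_w, low=0.50, high=0.55):
    """DC load range (watts) where a PSU of this rating is most efficient."""
    return round(rating_w * low), round(rating_w * high)

print(sweet_spot(600))  # (300, 330)
```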
 
Hold the spork, man!

Did you take the time to interpret those numbers?

per kg, it takes 0.896 kJ of energy to raise Al 1 K . . . and, per kg, it takes 0.383 kJ of energy to raise Cu 1 K . . .

Taking into account energy in must equal energy out, then we could easily say that it takes 0.896 kJ of energy to lower Al 1 K, and 0.383 kJ of energy to lower Cu 1 K . . .

So, kg for kg, it takes less energy to raise or lower Cu 1 K than it does Al . . .

For a material to require less energy to raise or lower its temp, that says to me it's less resistant (not a very scientific term here) to heat, and more willing to give off heat, which equates to being more thermally conductive.

But, that's all kg for kg. If you had a 2 kg Al HS, it would be much larger than a 2 kg Cu HS . . . and based on those numbers you posted, kg for kg, Cu more readily absorbs and dissipates heat, which means . . .

. . . it's still the better material for cooling. :toast:

:laugh:

It's very dependent; Google it and you'll find it's pretty much an epic-sized debate.


I'll have to find more data, but as I said, I've not slept, so I won't be doing any proper digging any time soon :laugh:

But needless to say, there's a reason why a combination of aluminium and copper is used in heatsinks, and it's not just to be cost-effective.

As I said, check out the TRUE vs. the TRUE Copper: you would expect the all-copper version to completely piss all over the standard one, but it beats it by only 1-2 °C or so.

Whether copper makes an effective heatsink is very dependent on heatsink design/size and what airflow is available.



Since I'm too tired to explain myself, I'm quoting other people! Wooooo. Very layman, unfortunately, but finding things is hard when sleepy; still looking, though.

"
Copper and aluminum are both effective materials for heat sink construction, but they have different requirements. If you want to know why, consider a great chef's kitchen.

Aluminum sure can move heat, if it has been done right. It very efficiently absorbs and transfers heat to its environment and things that interact with it. This works great for bacon in the morning, and even for boiling water, but isn't so good for a large, thick filet mignon. That cold slab of beef sucks the heat right out of the aluminum, and there isn't any left to keep up the cooking. Many people who buy aluminum cookware have a lot of trouble doing steaks properly for this very reason. Aluminum has a low thermal capacity, and a very high thermal conductivity.

As such, aluminum just wicks heat away with little concern for anything else. It won't wick as much as copper, but it sure will move it quickly, dumping its capacity as soon as any heat leaves the sink and quickly soaking up more.

Copper moves heat as well, even if it hasn't been done all that well. Copper very efficiently absorbs and transfers heat as does aluminum. It does it faster, as well. That said, copper has an incredibly high thermal capacity. That big fat steak just can't suck up all the heat that copper will hold on to, and this is where copper and aluminum differ in requirements. Copper won't readily dump all the heat energy it picks up, because it holds so much of it before it changes temperature to any great degree.

That leaves us with a problem. Copper needs help. Somehow, you have to remove all that heat from the copper, as it will just hold on to it otherwise. A copper heat sink can work much better than an aluminum one, but you have to either have loads of pipes and lots of fins and airflow, or you need peltier/water cooling with excellent transfer to help it out.

The thermal capacity of copper, when compared to an aluminum heat sink of the same design, completely removes the benefit of using copper in the first place without help. As a matter of fact, a poorly designed copper sink can be much worse than an aluminum model.

The best way to use the materials is being tried nowadays, and that is combining them. As with most good things, they work better together than apart.
"

More data, using cookware as an example, lol:

"Copper cookware almost always compares favorably to other types of cookware. Stainless steel is not the best conductor, although its strength, durability and ease of cleaning make it a favorite among some cooks. A heavy-gauge aluminum bottom on a stainless steel pan will increase the pan's efficiency, but a thick-gauge aluminum pan is, overall, a better conductor. Aluminum, however, reacts to acidic foods by imparting a metallic taste and sometimes discoloring them -- egg whites beaten in aluminum, for instance, may turn gray. It also does not retain heat for long periods. "

and finally

"ake 2 same sized blocks of metal... one aluminium one copper... heat them to the same temperature. Now monitor temperatures as they cool... one probe in contact with the metal, one a half inch above it's surface, note what happens... The copper block will stay hotter, longer with lower free air temps. The aluminium will cool faster with higher free air temps... because the copper, being higher mass, will retain heat longer.

Yes, because of its higher thermal conductivity copper soaks up more heat more quickly, but because of its higher mass it's going to STAY hot. Aluminium isn't as good a heat absorber but, because of its lower mass, it releases the heat more quickly.

"


Basically, the design they're using (copper heatpipes and aluminium fins) gets you the best of both worlds.



Whilst copper absorbs heat around twice as fast, it dissipates heat at about half the rate that aluminium does, due to its density.
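For what it's worth, handbook values put rough numbers on "absorbs faster, dissipates slower" (a sketch under textbook lumped-capacitance assumptions; the conductivity and density figures are mine, not from the thread):

```python
# Thermal conductivity k governs how fast heat soaks into a block; the lumped
# cooling time constant tau = rho*c*V/(h*A) governs how fast an identical
# block sheds heat in the same airflow. Handbook values (assumption):
k_cu, k_al = 401, 237          # W/(m*K), thermal conductivities
rho_c_cu = 8960 * 383          # J/(m^3*K), volumetric heat capacity of Cu
rho_c_al = 2700 * 896          # J/(m^3*K), volumetric heat capacity of Al

print(round(k_cu / k_al, 2))          # 1.69: Cu conducts ~70% faster, not quite "twice"
print(round(rho_c_cu / rho_c_al, 2))  # 1.42: an identical Cu block cools ~40% more slowly
```

Which is at least directionally consistent with the quoted two-blocks experiment: copper soaks heat in faster and stays hot longer.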

As I said, 'tis dependent.

I may have used the wrong wording earlier; again, I blame tiredness!

ha ha
 
Hello everybody, I found out what really happened. NVIDIA could have released these a lot sooner, with lower speed and power, but chose not to, which dug them deep today.

This is a great non-biased article, and well written. :toast:

Why Nvidia cut back the GTX480
Less is more

by Charlie Demerjian
March 28, 2010
http://www.semiaccurate.com/2010/03/28/why-nvidia-hacked-gtx480/
Here is some of the article, now I understand what went wrong:
Remember when we said that one problem was 'weak' clusters that would not work at the required voltage? Well, if you want to up the curve on yields, you can effectively lower the curve on power to compensate, and Nvidia did just that by upping the TDP to 250W. This is the classic overclocking trick of bumping the voltage to get transistors to switch faster.

While we don't for a second believe that the 250W TDP number is real (initial tests show about a 130W difference in system power between a 188W TDP HD5870 and a '250W' GTX480), that is the official spec. Nvidia lost a 32 shader cluster and still couldn't make 10,000 units. It had to bump the voltage and disable the clusters to get there. Unmanufacturable.

If you are still with us, we did mention that the 480 shader part was faster. How? With the slowest cluster gone, that bumps the speed curve up by a fair bit, and the worst part of the tail is now gone. Bumping the voltage moves the speed curve up more, and the end result was that Nvidia got 700MHz out of a 480 shader unit. That 700/1400 number sounds quite familiar, doesn't it?
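A rough sketch of why that voltage bump is expensive, using the usual dynamic-power rule of thumb (the voltages below are illustrative assumptions, not Nvidia's actual numbers):

```python
# Dynamic switching power scales roughly as P ~ C * V^2 * f, so voltage
# bumps cost power quadratically while clock bumps cost it linearly.
def power_scale(v_old, v_new, f_old=700, f_new=700):
    """Relative dynamic power after a voltage/frequency change."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# A 10% voltage bump at the same 700 MHz clock costs ~21% more power:
print(round(power_scale(1.00, 1.10), 2))  # 1.21
```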
 
:laugh:

It's very dependent; Google it and you'll find it's pretty much an epic-sized debate.


I'll have to find more data, but as I said, I've not slept, so I won't be doing any proper digging any time soon :laugh:

But needless to say, there's a reason why a combination of aluminium and copper is used in heatsinks, and it's not just to be cost-effective.

As I said, check out the TRUE vs. the TRUE Copper: you would expect the all-copper version to completely piss all over the standard one, but it beats it by only 1-2 °C or so.

Whether copper makes an effective heatsink is very dependent on heatsink design/size and what airflow is available.



The biggest reason why manufacturers go with full-aluminum or hybrid designs boils down to cost of manufacture, cost of material, and weight.

Weight plays a big factor in design. Take a Noctua, for example: an excellent cooler that works extremely well, with a lot of surface area . . . but if that beast were pure copper, it would break a motherboard if installed in a vertical position.

Notice, though, how well full-copper coolers perform compared to larger aluminum coolers: the Zalman 9700, for example, still performs on par with these larger beasts, and it's much smaller.

. . . and we still haven't taken various other aspects into account, such as thermal expansion, either . . .

It can be debated ad nauseam, but it won't change the fact that copper is simply better for cooling than aluminum.
 
Not sure if I understood you correctly, but if you're stating that 600W is overkill for a gaming system, I'd have to disagree. You must keep in mind that most PSUs reach optimal efficiency at ~50-55% load, so a 600W PSU is usually targeted at 300-400W systems.

No, I meant that NVIDIA is lying about board power, blatantly. The 600W PSU requirement hints at that. They can't cut that requirement down to, say, 500W, because without giving the card 300W it would become unstable/crash.
 
I added to my post, by the by XD

I'll say it here though.


Whilst copper is around twice as good at absorbing heat, aluminium dissipates its heat around twice as fast.

A combination of both is therefore the best solution : ]
 
I added to my post, by the by XD

I'll say it here though.


Whilst copper is around twice as good at absorbing heat, aluminium dissipates its heat around twice as fast.

A combination of both is therefore the best solution : ]

Coppinium! :laugh:
 
So Nvidia claimed 295 W and you pulled 320 W? Yeah, I'm with Wile E on this one. That's just a bold-faced lie. I would be SO pissed off if I bought one of these and my PSU couldn't take the load.

And what did ATi claim? 182 W or something like that, and we are seeing 212 W. The Furmark maximum number is far beyond what you will ever see during any other normal use of the card.
 
And what did ATi claim? 182 W or something like that, and we are seeing 212 W. The Furmark maximum number is far beyond what you will ever see during any other normal use of the card.

Nobody is talking about ATI. This is a thread about Nvidia. A lie is a lie no matter who states it.
 
Nobody is talking about ATI. This is a thread about Nvidia. A lie is a lie no matter who states it.

That wasn't the point I was getting at; the point was that the 320W will never be seen in real-world use. So there is no reason for ATi or Nvidia or anyone to use the Furmark numbers.

It isn't a lie to say 250W, because under normal use, that is the maximum power draw (257W according to this review).
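Putting numbers on that argument, here are the over-spec percentages implied by the wattages quoted in this thread (a quick sketch; the helper name is mine):

```python
# Percent by which measured draw exceeds the claimed TDP, using the
# Furmark and "normal use" wattages quoted in this thread.
def over_spec(measured_w, claimed_w):
    """Percent overage of measured power over the claimed figure."""
    return round((measured_w / claimed_w - 1) * 100, 1)

print(over_spec(212, 182))  # 16.5 -> HD 5870 Furmark vs claimed
print(over_spec(320, 250))  # 28.0 -> GTX 480 Furmark vs claimed
print(over_spec(257, 250))  # 2.8  -> GTX 480 normal-use max vs claimed
```

So under Furmark both vendors blow past their claimed figures, but under ordinary load the GTX 480 sits only a few percent over spec, which is the distinction being argued here.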
 
Will they warranty a board that has been run with Furmark? Seriously, it is a valid 3D application, and in an enclosed case, during a warm summer, in a warm room, what will happen....


Care to try it, W1zz? I am genuinely interested whether it will cook this POS, or if it starts to downclock to protect itself and thus lose a huge amount of performance for normal people trying to use it.
 
Will they warranty a board that has been run with Furmark? Seriously, it is a valid 3D application, and in an enclosed case, during a warm summer, in a warm room, what will happen....


Care to try it, W1zz? I am genuinely interested whether it will cook this POS, or if it starts to downclock to protect itself and thus lose a huge amount of performance for normal people trying to use it.

not possible .. i tried .. card goes to 96 °C, then the fan ramps up and it drops to 91 °C

[attached screenshot: Capture598.jpg]

not working as a grill
 
Is that what the whisper-quiet 50 dB was from? So real users can expect that sort of noise during gaming?

How long did you run Furmark for?
 
Is that what the whisper-quiet 50 dB was from? So real users can expect that sort of noise during gaming?

How long did you run Furmark for?

like 40 mins while i was recording HD video in the hope I'd get a nice omelette
 
If you want it watercooled and tortured, send it my way. I would love to find the maximum for this card; I would just have to chill my loop in my -15 °F deep freeze.
 
Will they warranty a board that has been run with Furmark? Seriously, it is a valid 3D application, and in an enclosed case, during a warm summer, in a warm room, what will happen....


Care to try it, W1zz? I am genuinely interested whether it will cook this POS, or if it starts to downclock to protect itself and thus lose a huge amount of performance for normal people trying to use it.

Lol, I could try that out in August, if Nvidia cares to lend me one for testing :laugh:

The room my desktop is in faces south-west, and in summer, if I don't turn on the AC and I leave the curtains open so the sun shines in, there's a greenhouse effect where the room's temperature reaches 40 degrees!

I'd put the 480 inside a normal mainstream case and run Furmark, with an ambient temperature of 40 °C...
I'd keep a fire extinguisher at hand, obviously. :D
 
card is already on its way back to nvidia
 