
NVIDIA GeForce GTX 580 1536 MB

....the plain old microstuttering and driver problems from ati...

Change the record, dude; it's getting worn out. I have no driver issues with my CrossFire cards. I just want a single GPU in my case. Two hugging cards get hot and noisy on occasion.

I think a lot of people are missing the point of the GTX580. While it did take the crown as the top single GPU (though the GTX480 already held it), the main point was not to provide a huge performance increase. Instead, similar to G80 to G92 and G70 to G71, its main purpose is to revise the silicon for better thermals and power consumption while still allowing a marginal performance increase.

Yes, when downclocked to GTX480 speeds, the performance difference is marginal. However, when clocked beyond GTX480 speeds, with all 512 SPs enabled, the GTX580 uses less power and puts out less heat.

First Bold Point - while true, it had been touted as X% faster and as the fastest DX11 card ever. That's a pretty strong indicator they're touting it as a faster card whose sole purpose is actually to combat the HD 6970 (Tom Petersen from NV has said as much).

Second Bold Point - It uses less power... Does it? I'm not going to cherry-pick reviews, as only ignorant folk do that. But while some reviewers show it consuming far less power (GTX 260 levels), others show it consuming a tad more. The problem is the throttle they have on it for power protection; it's artificially skewing power consumption. W1zz and a few others have worked around it, but some reviewers have used Furmark and come away with (i.e. GTX 260) results.

Bear in mind there are some 480s out there with reworked PCBs and better circuitry that consume less power than a standard 480. Given the things I'm reading more of, and how well the rejigged AIC 480s perform, I'm more and more unimpressed.
 
First Bold Point - while true, it had been touted as X% faster and as the fastest DX11 card ever. That's a pretty strong indicator they're touting it as a faster card whose sole purpose is actually to combat the HD 6970 (Tom Petersen from NV has said as much).

Well, of course that is the way they marketed it. What do you think is going to get them more sales: "We have the fastest, most badass card on the market" or "The power consumption and heat output are lower than our previous cards but still not as good as the competition"?

Second Bold Point - It uses less power... Does it? I'm not going to cherry-pick reviews, as only ignorant folk do that. But while some reviewers show it consuming far less power (GTX 260 levels), others show it consuming a tad more. The problem is the throttle they have on it for power protection; it's artificially skewing power consumption. W1zz and a few others have worked around it, but some reviewers have used Furmark and come away with (i.e. GTX 260) results.

Bear in mind there are some 480s out there with reworked PCBs and better circuitry that consume less power than a standard 480. Given the things I'm reading more of, and how well the rejigged AIC 480s perform, I'm more and more unimpressed.

If you read the reviews, the good reviews, yes, it does use less power. Furmark is the only application that the throttle affects, and it is also the application I totally ignore when talking about power consumption because it is not realistic. Any review that only puts out Furmark power consumption numbers isn't a valid review in any way, and I tend not to read them at all because it is obvious the reviewer has no clue what they are doing.

In real-world use, as shown by W1z's review, you are looking at ~20-30W less. Even if you remove the limiter from the equation and use Furmark numbers, it is still consuming 15W less under Furmark load, so you are wrong about the limiter artificially making it look like it consumes less power. It might make it seem like it consumes a lot less power than it really does, but it does consume less power nonetheless.
 
the good reviews

Cherries Picked.

Nah, I'm only having a laugh. You're right. Guru3D had an engineering sample which was pulling 30-40W more at load.

However, a well-binned GF100 chip on a PCB redesigned by a board partner, at similar clocks, produces a GTX 480 that consumes fewer watts, makes less noise, and delivers 10% more performance than a standard GTX 480.

I think, with that evidence, what NV has done is simply refine what the partners were already doing with the 480s. And on that front (with clocks at 772 versus 702 MHz - 10% higher), the GF110 doesn't actually blow my socks off.
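For what it's worth, the 10% figure quoted above checks out; here's a throwaway sketch using the clock speeds as quoted in the thread (reference GTX 480 at 702 MHz, GTX 580 at 772 MHz):

```python
# Clock values as quoted in the thread, in MHz.
gtx480_clock = 702  # reference GTX 480 core clock
gtx580_clock = 772  # reference GTX 580 core clock

# Relative core-clock advantage of the GTX 580.
clock_gain = (gtx580_clock - gtx480_clock) / gtx480_clock
print(f"GTX 580 core clock advantage: {clock_gain:.1%}")  # → 10.0%
```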

Don't get me wrong, I have a bad feeling the HD 6970 is going to be hot and noisy. I'm very impatient, but 3-4 more weeks and I can make my purchasing decisions.

So far it is:

HD 6970 or
KFA GTX 480 Anarchy http://www.hexus.net/content/item.php?item=26757
Gigabyte GTX 480 SOC http://www.hexus.net/content/item.php?item=27253
MSI GTX 480 Lightning http://www.techpowerup.com/reviews/MSI/N480GTX_GTX_480_Lightning/
Sparkle GTX 580 Calibre http://www.techpowerup.com/134204/Sparkle-Announces-Calibre-X580-Graphics-Card.html

As you can see, there's a good chance I'll be jumping ship to green unless the HD 6970 is my dream card.
 
If you read the reviews, the good reviews, yes, it does use less power. Furmark is the only application that the throttle affects, and it is also the application I totally ignore when talking about power consumption because it is not realistic. Any review that only puts out Furmark power consumption numbers isn't a valid review in any way, and I tend not to read them at all because it is obvious the reviewer has no clue what they are doing.

Furmark is a useful tool to find the max power consumption and temps of the GPU. I think it's good to include that just for reference; other than that, I agree.
 
What's wrong with a re-engineer? Cutting down transistors does not mean that they are cutting down on anything, especially if they are optimising the transistors. After all, you would happily pay just as much (or slightly less) for the 300M transistors of an Athlon II X3 rather than the 1.05 billion transistors of the 5750.

There is nothing wrong; price/performance improvement is OK. But the product becomes cheaper to manufacture and the price goes up. I don't know about you guys, but somehow it feels like prices are calculated in agreement between ATI and NV. New cards are coming all the time, but prices of old ones are stuck. Neither ATI nor NV is interested in price cutting. And they are very keen on cutting cores.
 
There is nothing wrong; price/performance improvement is OK. But the product becomes cheaper to manufacture and the price goes up. I don't know about you guys, but somehow it feels like prices are calculated in agreement between ATI and NV. New cards are coming all the time, but prices of old ones are stuck. Neither ATI nor NV is interested in price cutting. And they are very keen on cutting cores.

True, the HD5850 was never sold under its MSRP, unlike the 4850, 4870, or 4890.
 
However, a well-binned GF100 chip on a PCB redesigned by a board partner, at similar clocks, produces a GTX 480 that consumes fewer watts, makes less noise, and delivers 10% more performance than a standard GTX 480.

I think, with that evidence, what NV has done is simply refine what the partners were already doing with the 480s. And on that front (with clocks at 772 versus 702 MHz - 10% higher), the GF110 doesn't actually blow my socks off.

Actually, you made me think of something just now. If you look back at the review W1z did of the Zotac GTX480 with the Zalman cooler on it, even using the same reference PCB design and components, the GTX480 was capable of ~20-30W less power consumption just as a result of cooling the GPU core down! :eek:

That really makes me think that the improvements we are seeing here in the GF110 are not really a result of the GPU being tweaked, but instead a result of the cooler being tweaked to be more efficient... :rolleyes:

Furmark is a useful tool to find the max power consumption and temps of the GPU. I think it's good to include that just for reference; other than that, I agree.

I think Furmark has its place in reviews as well, and power consumption numbers with Furmark should be in a review too, but I just don't think they are that important. If I had my choice of only one power consumption number to put in a review, it would be real-world power consumption, not Furmark.
 
I think VIA is going to PWN all of them this time next year!
 
There is nothing wrong; price/performance improvement is OK. But the product becomes cheaper to manufacture and the price goes up. I don't know about you guys, but somehow it feels like prices are calculated in agreement between ATI and NV. New cards are coming all the time, but prices of old ones are stuck. Neither ATI nor NV is interested in price cutting. And they are very keen on cutting cores.

You are quite right, but given that AMD is still a money-losing business as a whole, I doubt they are going to cut prices anytime soon (the release of the 6870 forced prices down for a few days, though), and if Nvidia sticks with matching AMD's pricing, nobody is going to lower prices anytime soon. The retailers are not keen on cutting prices either.

The MSRP of the stock 5870 was $400 when it was released, and now you can find versions which sell for $300 (Sapphire); a 25% price cut in a year and a bit seems quite reasonable to me.
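The 25% figure above is just the drop relative to launch MSRP; a quick sketch with the prices quoted in the thread ($400 launch, $300 current cheapest):

```python
# Prices as quoted in the thread, in USD.
launch_msrp = 400   # stock HD 5870 MSRP at release
current_low = 300   # cheapest current version (Sapphire) mentioned above

# Price cut relative to the launch MSRP.
price_drop = (launch_msrp - current_low) / launch_msrp
print(f"Price drop since launch: {price_drop:.0%}")  # → 25%
```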

True, the HD5850 was never sold under its MSRP, unlike the 4850, 4870, or 4890.

MSRP of the 5850 was $299:

ASUS EAH5850 DIRECTCU/2DIS/1GD5 Radeon HD 5850 (Cy...
 

I'd assume they mean launch price, which was $260. Then it sold out in about an hour, and AMD jacked the price up massively due to a lack of competition, lasting even after the 400-series launch. Why should these companies bother fighting a price war when they can both just jack things up equally? I seriously expect the 6970 to hit at $500 and the 580 to drop to $500, with both sitting there for a painfully long time.

For those asking about the clock for clock difference between the 480/580

http://images.anandtech.com/graphs/graph4012/33983.png
 
You are quite right, but given that AMD is still a money-losing business as a whole, I doubt they are going to cut prices anytime soon (the release of the 6870 forced prices down for a few days, though), and if Nvidia sticks with matching AMD's pricing, nobody is going to lower prices anytime soon. The retailers are not keen on cutting prices either.

The MSRP of the stock 5870 was $400 when it was released, and now you can find versions which sell for $300 (Sapphire); a 25% price cut in a year and a bit seems quite reasonable to me.



MSRP of the 5850 was $299:

ASUS EAH5850 DIRECTCU/2DIS/1GD5 Radeon HD 5850 (Cy...

MSRP was $260; it went to $299 after release.
 
I'd assume they mean launch price, which was $260. Then it sold out in about an hour, and AMD jacked the price up massively due to a lack of competition, lasting even after the 400-series launch. Why should these companies bother fighting a price war when they can both just jack things up equally? I seriously expect the 6970 to hit at $500 and the 580 to drop to $500, with both sitting there for a painfully long time.

Ah. R&D is not cheap, so it's to be expected. Anyway, even if they could lower the price, they wouldn't, just as Bentleys and Rolls-Royces do not come cheap BECAUSE they are top of the range. Premium stuff demands a premium price; any economist will tell you that. Mid-range cards now are perfectly capable of ripping most games to shreds, so why bother with the top end?

MSRP was $260; it went to $299 after release.

The current MSRP is $260, but that was not the case at launch once you factor in the initial discounted price.
 
Anyway, you get my point: the 4870 started at $299 and reached EOL at $140.
The problem was that Nvidia did not bring any real competition, so we consumers were stuck with the same prices for a whole year.
 
I'd assume they mean launch price, which was $260. Then it sold out in about an hour, and AMD jacked the price up massively due to a lack of competition, lasting even after the 400-series launch. Why should these companies bother fighting a price war when they can both just jack things up equally? I seriously expect the 6970 to hit at $500 and the 580 to drop to $500, with both sitting there for a painfully long time.

For those asking about the clock for clock difference between the 480/580

http://images.anandtech.com/graphs/graph4012/33983.png

Those scores can't be legit. They suck.
 
Anyway, you get my point: the 4870 started at $299 and reached EOL at $140.
The problem was that Nvidia did not bring any real competition, so we consumers were stuck with the same prices for a whole year.

Well, the 5870 is still going strong and not at EOL yet, so dropping from $399 to $299 for the cheapest one is quite good seeing that it's only been a year. I expect it to drop to $250 or less when it reaches EOL, roughly similar to the drop (in percentage) of the 4870.

The 4870 launched in June 2008 according to Wiki, and went EOL because the 5xxx series beat it badly, hence the lower EOL prices.

I do get your point, though. Right now AMD is finding themselves at a point like where they were during the Athlon vs Pentium 4 days, so they are trying to profit as much as they still can to reduce their debt.
 
Well, the 5870 is still going strong and not at EOL yet, so dropping from $399 to $299 for the cheapest one is quite good seeing that it's only been a year. I expect it to drop to $250 or less when it reaches EOL, roughly similar to the drop (in percentage) of the 4870.

The 4870 launched in June 2008 according to Wiki, and went EOL because the 5xxx series beat it badly, hence the lower EOL prices.

I do get your point, though. Right now AMD is finding themselves at a point like where they were during the Athlon vs Pentium 4 days, so they are trying to profit as much as they still can to reduce their debt.

I bought my 4870 for $150 in June 2009. :)
And it was very strong at that time.
 
I do get your point, though. Right now AMD is finding themselves at a point like where they were during the Athlon vs Pentium 4 days, so they are trying to profit as much as they still can to reduce their debt.

That is so true, lol, they are trying to recover all the money they lost.
 
I bought my 4870 for $150 in June 2009. :)

Perhaps I was wrong then. The 5770 is still selling for $120 at the cheapest, so the price drop throughout this year and half of last year is almost negligible. But then there is DX11 and all that, and you might have jumped on a very good deal.
 
Perhaps I was wrong then. The 5770 is still selling for $120 at the cheapest, so the price drop throughout this year and half of last year is almost negligible. But then there is DX11 and all that, and you might have jumped on a very good deal.

Good point there; DX11 is one more factor that influenced them to keep prices up.
 
Those scores can't be legit. They suck.

Yeah, I'm sure AnandTech fudges their numbers all the time. Or, you know, you could infer that it's at 2560x1600.
 
Take a look at my revised "soon to be" spec, guys. Crazyeyesreaper just whooped my ass about the old one...
 
Take a look at my revised "soon to be" spec, guys. Crazyeyesreaper just whooped my ass about the old one...

Tell Crazyeyes to go blow a goat.
 
He was saying pretty much what you guys were saying. That and the case I was going to buy wouldn't fit the card. XD
 
Would it be a worthwhile upgrade from an EVGA GTX 285, or should I just save the cash and grab a 6000-series AMD card? Resolution used would be 1080p.
 
Would it be a worthwhile upgrade from an EVGA GTX 285, or should I just save the cash and grab a 6000-series AMD card? Resolution used would be 1080p.

No right answer to that, as the 69xx cards are not out yet.
 