# NVIDIA GeForce GTX 580 SLI



## W1zzard (Nov 7, 2010)

NVIDIA's GeForce GTX 580 has claimed the single GPU performance throne today. We take two of these cards for a spin to see what performance users can expect from this $1000 GPU combination. The review will also give insight into potential performance numbers for 3D Vision Surround.

*Show full review*


----------



## aCid888* (Nov 9, 2010)

The best review on the Internet, as per usual.


Thanks.  



Well done to nVidia on the power consumption, and I only hope the 580 makes the cards I'm interested in cheaper to buy!


----------



## Hayder_Master (Nov 9, 2010)

nice w1z, first sli review, dig it guys


----------



## jfgwapo (Nov 9, 2010)

New upcoming drivers will definitely improve SLI scaling.


----------



## johnnyfiive (Nov 9, 2010)

Cards are pretty boss... but too expensive. I'll be just fine with my $480 CrossFired 6870s.
(But if anyone feels generous, I'll take two 580s. )


----------



## newtekie1 (Nov 9, 2010)

Hey W1z, I think it would be cool in reviews like this to include some other CrossFire/SLI configurations for comparison as well.  Like maybe in this one it would be nice to see two HD5970s in there to see how things compete (but I know this would be impossible if you don't have two HD5970s).

Other than that, another great review!


----------



## heky (Nov 9, 2010)

+1 on that. Maybe some 6870 crossfire comparison, or later a 6970 crossfire comparison.


----------



## Athlon2K15 (Nov 9, 2010)

Great review as always no complaints here.


----------



## PopcornMachine (Nov 9, 2010)

heky said:


> +1 on that. Maybe some 6870 crossfire comparison, or later a 6970 crossfire comparison.



Yes.  A comparison with multicard 6870s, 6850s, and GTX 460s (1GB and 768MB) would provide more bang-per-buck and bang-per-watt information.

Thanks for the review.


----------



## DaMulta (Nov 9, 2010)

Value and Conclusion



> NVIDIA GeForce GTX 580 SLI has it all: screaming performance, and additional features. If one GTX 580 wasn't enough, two of them in SLI will certainly help you bulldoze through just about any game, at any resolution. If 2560 x 1600 isn't enough, you can actually span your display head across multiple physical displays using NVIDIA's 3D Vision Surround feature. The added horsepower will definitely help with stereoscopic 3D gaming, when your monitor is displaying twice the information.


I definitely agree, and I see this as a good move forward for Nvidia.


----------



## the54thvoid (Nov 9, 2010)

Sli GTX 580 and only 33 fps in Metro 2033 at 2560 res.  $1000 for 33 fps.  That game must be the shittest coded game in history.  Its programmers should be shot.


----------



## johnnyfiive (Nov 9, 2010)

It's no different than how Crysis was 2 years ago. Both developers should be shot!


----------



## Sadasius (Nov 9, 2010)

johnnyfiive said:


> It's no different than how Crysis was 2 years ago. Both developers should be shot!



Very much agree. They both have developers... 

Very nice review, extremely informative and detailed. 

Looks like the GTX 580 is an awesome card. Now I will wait and see what Cayman can do. I am leaning more towards Nvidia but then again we shall see. I keep hopping over the fence all the time.


----------



## Black Panther (Nov 9, 2010)

newtekie1 said:


> Hey W1z, I think it would be cool in reviews like this to include some other Crossfire/SLi configuration for comparision as well.  Like maybe in this one it would be nice to see two HD5970s in there to see how things compete(but I know this would be impossible if you don't have two HD5970s).
> 
> Other than that, another great review!



Can't agree with that really:

580 SLI = 2 GPU's
5970 CF = 4 GPU's


----------



## pantherx12 (Nov 9, 2010)

the54thvoid said:


> Sli GTX 580 and only 33 fps in Metro 2033 at 2560 res.  $1000 for 33 fps.  That game must be the shittest coded game in history.  Its programmers should be shot.



I imagine, like Crysis, everything is rendered in real time, which actually makes Metro and Crysis very, very well coded. 

All those subtle effects the games have add up, you see. They make them more lifelike, or more dynamic (whatever effect they're after), but they do eat up resources.

It's just a case of diminishing returns, though; other games look nearly as good, so people assume bad coding.

But it's really just a case of people not seeing all those subtle things!


----------



## newtekie1 (Nov 9, 2010)

the54thvoid said:


> Sli GTX 580 and only 33 fps in Metro 2033 at 2560 res.  $1000 for 33 fps.  That game must be the shittest coded game in history.  Its programmers should be shot.



It isn't shitty coding, it is the fact that they have actually used DX11 heavily, including heavy tessellation.  The main problem is actually memory at that resolution, which is why, if you look, W1z had to drop down to 0xAA just to get usable results.  Whenever a game pushes a huge amount of detail along with a huge amount of tessellation, you are going to run into poor performance at extreme resolutions.

I say the developers should be praised, because without people like them pushing the envelope of performance and demanding more from the hardware, we would never see advances in hardware and we'd all be playing crappy DX9 console ports at 720p...



Black Panther said:


> Can't agree with that really:
> 
> 580 SLI = 2 GPU's
> 5970 CF = 4 GPU's



How many GPUs doesn't matter to me.  The two cards are priced similarly and perform similarly, so it'd be nice to see how they do when paired.  Kind of like a "This is what $1000 buys you from nVidia, and this is what $1000 buys you from ATi."  That is information that someone looking to put out $1000 might be interested in, IMO.


----------



## Frag_Maniac (Nov 9, 2010)

Once again, a great, thorough review. I'm sure driver improvements will have SLI performance doing much better. It is rather odd, though, that in Metro it does OK at 2560 but is severely crippled at 1920. I'm beginning to think even Cryostasis is far better optimized than Metro.


----------



## DaMulta (Nov 9, 2010)

newtekie1 said:


> It isn't shitty coding, it is the fact that they have actually used DX11 heavily, including heavy tessellation.  The main problem is actually memory at that resolution, which is why, if you look, W1z had to drop down to 0xAA just to get usable results.  Whenever a game pushes a huge amount of detail along with a huge amount of tessellation, you are going to run into poor performance at extreme resolutions.
> 
> I say the developers should be praised, because without people like them pushing the envelope of performance and demanding more from the hardware, we would never see advances in hardware and we'd all be playing crappy DX9 console ports at 720p...
> 
> ...



1 grand is a lot, but it's not a whole lot of money. Spread over a year it's less than 100USD per month, plus around 50-100USD per month in electricity depending on how you use them. 

This market is for serious gamers who just want to play games full out, or someone who only upgrades every few years. That's how I view the fastest-of-the-fast GPU market.


2 GPUs vs 4 GPUs doesn't matter, because it's two cards at almost the same price point. I agree with newtekie.


----------



## PopcornMachine (Nov 9, 2010)

DaMulta said:


> 1 grand is a lot, but it's not a whole lot of money.



What other planet do you live on?


----------



## PopcornMachine (Nov 9, 2010)

newtekie1 said:


> It isn't shitty coding, it is the fact that they have actually used DX11 heavily, including heavy tessellation.



If someone develops a game that completely ignores the capability of the hardware available to the average person, then that is also shitty coding.


----------



## newtekie1 (Nov 9, 2010)

PopcornMachine said:


> If someone develops a game that completely ignores the capability of the hardware available to the average person, then that is also shitty coding.



That is why there are lower settings in the game.  If they forced you to play at the maximum settings, then they would be ignoring the capability of the hardware available to the average person.  They are using all the feature sets available to them as much as possible, that is a good thing.  

What is the point of DX11 if it isn't used?  Why still code games with DX9/10 when we have DX11 GPUs?  That is essentially what you are wanting them to do, right?  You don't want them to use DX11 to its fullest.


----------



## Fourstaff (Nov 9, 2010)

PopcornMachine said:


> If someone develops a game that completely ignores the capability of the hardware available to the average person, then that is also shitty coding.



It's called a port from PS3/Xbox 360, which is what PC gamers have been receiving of late.


----------



## PopcornMachine (Nov 9, 2010)

newtekie1 said:


> That is why there are lower settings in the game.  If they forced you to play at the maximum settings, then they would be ignoring the capability of the hardware available to the average person.  They are using all the feature sets available to them as much as possible, that is a good thing.
> 
> What is the point of DX11 if it isn't used?  Why still code games with DX9/10 when we have DX11 GPUs?  That is essentially what you are wanting them to do, right?  You don't want them to use DX11 to its fullest.



Show me where this game is playable at any resolution that anyone here wants to play at, without spending a $1000 on video cards.

What is the point of arguing if you're going to ignore facts?


----------



## newtekie1 (Nov 9, 2010)

PopcornMachine said:


> Show me where this game is playable at any resolution that anyone here wants to play at, without spending a $1000 on video cards.
> 
> What is the point of arguing if you're going to ignore facts?



I have no problem playing it with a single $190 GTX460 and $50 Celeron at 1920x1080...

One of us is ignorant of the facts, but it ain't me.


----------



## PopcornMachine (Nov 9, 2010)

newtekie1 said:


> I have no problem playing it with a single $190 GTX460 and $50 Celeron at 1920x1080...
> 
> One of us is ignorant of the facts, but it ain't me.



My mistake.  27fps is playable. Have fun with that.


----------



## newtekie1 (Nov 9, 2010)

PopcornMachine said:


> My mistake.  27fps is playable. Have fun with that.



I get pretty steady 60FPS, but you just keep up those ignorant posts.


----------



## PopcornMachine (Nov 9, 2010)

newtekie1 said:


> I get pretty steady 60FPS, but you just keep up those ignorant posts.



Sure you do.

Hey look, it's a unicorn!


----------



## ShogoXT (Nov 9, 2010)

I had CrossFire 4850s and 4890s. Not doing it anymore, considering microstutter and min FPS. A single powerful GPU is where it's at.


----------



## Black Panther (Nov 9, 2010)

OK guys let's remember the title of this thread now?


----------



## Sadasius (Nov 9, 2010)

ShogoXT said:


> I had CrossFire 4850s and 4890s. Not doing it anymore, considering microstutter and min FPS. A single powerful GPU is where it's at.



I also prefer a single powerful GPU. You always have to rely on SLI or CrossFire profiles being updated, and many times they have errors. I've had it both ways, and I find a single powerful GPU much smoother in gameplay and system stability.



Black Panther said:


> OK guys let's remember the title of this thread now?



"NVIDIA GeForce GTX 580 SLI "  K now what?!? 

Oh crap you did not say 'Simon says'!


----------



## newtekie1 (Nov 9, 2010)

PopcornMachine said:


> Sure you do.
> 
> Hey look, it's a unicorn!



I guess you missed my point about being able to turn down settings in the game, didn't you?

Guess what: when you turn down the settings, the game actually gets pretty good performance.  I know, amazing, right?


----------



## ShogoXT (Nov 9, 2010)

I agree, though, that you can't compare the 5970 because it's 2 GPUs. So people saying it's faster is mostly irrelevant and debatable.



newtekie1 said:


> I guess you missed my point about being able to turn down settings in the game, didn't you?



Lies! But seriously, though, I guess when you upgrade your rig enough you start raising your standards on quality and performance. When I first started PC gaming and my old computer could barely run Diablo 2 (Pentium 166, 2MB integrated 2D graphics, 24MB RAM), I thought a 28k modem and freezing every minute was playable. 

The scaling charts are invaluable for these reviews, so thanks for putting those up. Maybe I'll reconsider if I get a 30-inch.

Sorry for going off topic.


----------



## newtekie1 (Nov 9, 2010)

ShogoXT said:


> But seriously though I guess when you upgrade your rig enough you start raising your standards on quality and performance.



And I would agree with you.  However, when you lower the settings on Metro it still looks just as good as the other games that I play with their settings maxed out, sometimes even better still.  That was my point from the beginning.  Just because the option is there doesn't mean you have to use it.  Everyone doesn't have to play every game maxed out.  And every game that doesn't run on mid-range hardware maxed out isn't poorly coded.  Anyway, enough about Metro, back on topic.


----------



## Deleted member 67555 (Nov 10, 2010)

newtekie1 said:


> And I would agree with you.  However, when you lower the settings on Metro it still looks just as good as the other games that I play with their settings maxed out, sometimes even better still.  That was my point from the beginning.  Just because the option is there doesn't mean you have to use it.  Everyone doesn't have to play every game maxed out.  And every game that doesn't run on mid-range hardware maxed out isn't poorly coded.  Anyway, enough about Metro, back on topic.



Good point! I can still game on a 4830 at certain settings that I'm quite OK with...

But I sure would like a card like this... just not at $500. This is a card for Fits. 
I sure would like to play with one, though.


----------



## mlee49 (Nov 10, 2010)

Wonder if EVGA will release an SLI profile update soon.

Nice review W1z, but some of these games/benches are too much.  I'd like to see removed:

UT3
3DMark (all of them)
WoW (even though I'm sure it actually does bring in search results)
and any other game/bench with over 200fps

It seems like a waste of time to run 5 resolution runs of 20-some games where you're just murdering the frame rate.  Maybe clear up the review by cutting some 10 pages of benches I just skip over anyway.

Not sure if you agree.


----------



## AlienIsGOD (Nov 10, 2010)

I agree on the WoW point; if you want to look at an MMO for DX's sake, then LOTRO is a more accurate representation of DX10/11 graphics.  WoW is too cartoony overall; a few bosses may look pretty, but it still has that "I was made in 2004" look to it.  LOTRO at least makes use of DX10/11 to a more noticeable visual degree, IMO.


----------



## W1zzard (Nov 10, 2010)

WoW is one of the few DirectX 11 titles out there, and it gets an expansion and graphics update next month anyway. The UT3 engine is used for lots and lots of current and upcoming games.


----------



## crazyeyesreaper (Nov 10, 2010)

True, UT3 is used a lot, but just about every game on the UT3 engine will run fine on an 8800GT for the most part. It's like running, say, Aquamark: there's no point really. Other examples of bad games to bench with:

Splinter Cell: Conviction, Cryostasis, World in Conflict, etc. 

Good games to use are already in the TPU reviews:

Bad Company 2
CoD 4 (probably should be updated to a newer CoD game to better reflect performance, but overall not needed to replace)
Far Cry 2 is still relevant
Stalker should be changed over to Call of Pripyat
Most of the rest is fine, but WoW is pretty much useless to bench at this point. Then again, it brings traffic to the site, I suppose.

3DMark is just garbage. It's great for whoever wants to measure e-peen with scores, but the numbers aren't really worth looking at: a gain of 30% in 3DMark ends up being 5% in games, or so it seems.

Metro is the new Crysis, so it's going to stay for a long time to come.

But overall I don't really give a damn; W1zz does all the work, so I'm just thankful I get reviews of new hardware that don't scream "said company paid for good lip service". That, and the numerous games and benches tested put most review sites to shame these days.


----------



## DrunkenMafia (Nov 10, 2010)

630W!!!! Holy crap, at least NVIDIA is keeping all the 1kW PSUs selling...


----------



## Wile E (Nov 10, 2010)

Black Panther said:


> Can't agree with that really:
> 
> 580 SLI = 2 GPU's
> 5970 CF = 4 GPU's



The number of GPUs doesn't matter. All that matters is how many slots it takes. Both occupy a single x16 slot and are a single card. As consumers, we don't buy one GPU at a time like we do CPUs; we buy an entire package. 

The 5970 is a single package, therefore a single card, therefore it is completely OK to compare 2 of them vs 2 580s.


----------



## heky (Nov 10, 2010)

+1 on that. 100% agree with Wile E.


----------



## HalfAHertz (Nov 10, 2010)

Wile E said:


> The number of GPUs doesn't matter. All that matters is how many slots it takes. Both occupy a single x16 slot and are a single card. As consumers, we don't buy one GPU at a time like we do CPUs; we buy an entire package.
> 
> The 5970 is a single package, therefore a single card, therefore it is completely OK to compare 2 of them vs 2 580s.



I think the point here was more that there may always be underlying problems with scaling and new titles, micro-stuttering, etc.

Anyway, as always great job W1z! Keep up the good work!


----------



## Blacklash (Nov 10, 2010)

Meh, if 6870 CrossFire were in the article, few would bother with 580 SLI. It costs less than a single 580 and doesn't trail 580 SLI by enough to warrant the obscene price difference.


----------



## Easy Rhino (Nov 10, 2010)

I don't remember the last time any new top-of-the-line card, let alone an nVidia card, was also the best performance per dollar.


----------



## Andrew (Nov 10, 2010)

*Nice review*

Nice review!
Please, when the time comes, do a review of 4-way 580 SLI vs 5970 CrossFire Black Edition.
Thanks


----------



## Bjorn_Of_Iceland (Nov 10, 2010)

W1zzard said:


> WoW is one of the few DirectX 11 titles out there, and it gets an expansion and graphics update next month anyway. The UT3 engine is used for lots and lots of current and upcoming games.


A wise decision imo


----------



## Frick (Nov 10, 2010)

I missed a comparison to 480 SLI, but otherwise it's all good.


----------



## entropy13 (Nov 10, 2010)

Easy Rhino said:


> I don't remember the last time any new top-of-the-line card, let alone an nVidia card, was also the best performance per dollar.



It would seem no new top-of-the-line card has ever been the best perf/$, although I'm only going back as far as the 3870X2...


----------



## mdsx1950 (Nov 10, 2010)

Impressive. Quad SLI GTX 580 reviews, anyone?


----------



## TAViX (Nov 11, 2010)

Not pleased. Where are the CrossFire comparisons????


----------



## Frag_Maniac (Nov 11, 2010)

Frick said:


> I missed a comparison to 480 SLI, but otherwise it's all good.


480 SLI stomps it in most tests. Usually this just calls for driver tweaking, especially with new architecture. What's scary, though, is that the 480 at launch scaled fairly well in SLI; it was really only 2560 performance that needed improving. 

So while on the one hand I'm psyched about the 580, I'm tentative too, until I see much improved SLI scaling.


----------



## The Jedi (Nov 11, 2010)

I'm ambivalent about 3DMark, but as "another metric" I noticed 3DMark Vantage scores were missing.


----------



## Forbin Rhodes (Nov 12, 2010)

*Quadfire X*

Following are benchmarks for Quadfire X (2 x 5970 @ stock 725/1000):

Crysis (Very High settings, 4x, 1920x1080): 66.91 fps avg

Heaven 2.1 (1920x1080, 4x): 102.5 fps avg

When I decide to water cool the graphics cards, the scores should increase by 35%.

I didn't see any temps in the GTX 580 review?


----------



## Over_Lord (Nov 12, 2010)

HD5970 and HD6870 CFX still rock. And now the HD5970 costs 388 pounds or so; that's cheaper than the GTX 580, with more horsepower. And I think 2x HD6870 costs about $500, same as the GTX 580, yet they perform really well, or maybe better.


----------



## OrbitzXT (Nov 13, 2010)

Did I overlook the temperature/fan noise part of the review or was it not included for some reason? Part of the reason I stopped using ATI cards is because of the heat and noise they output compared to most nVidia cards. I was curious how this configuration holds up.


----------



## Frag_Maniac (Nov 15, 2010)

thunderising said:


> HD5970 and HD6870 CFX still rock. And now the HD5970 costs 388 pounds or so; that's cheaper than the GTX 580, with more horsepower. And I think 2x HD6870 costs about $500, same as the GTX 580, yet they perform really well, or maybe better.


Well, at the 580 launch here in the US, there were still 5970s priced well above the base-model 580 on Newegg. Also, the 5970 really doesn't have more power; it trades blows evenly with the 580, and when you factor that in, the 580 is still the better card because it has way better DX11 support.


----------

