# ASUS GeForce GTX 570



## W1zzard (Dec 6, 2010)

Today NVIDIA launches its new GeForce GTX 570 graphics card, which is based on the new, more power-efficient Fermi technology that we saw on the GTX 580. ASUS' GTX 570 is a full reference design implementation; the only difference is a small clock speed increase of 10 MHz.

*Show full review*


----------



## newtekie1 (Dec 7, 2010)

Ok, so where are all the people saying it is overpriced now?

I think for $350 it is a good value considering it matches a GTX 480. The performance that you got less than a year ago for $500 you can now get for $350. I like it.


----------



## entropy13 (Dec 7, 2010)

No box and bundle for this it seems.


----------



## Yellow&Nerdy? (Dec 7, 2010)

If the price settles just above 300€, it's pretty good. Great review, as always. This ought to put some pressure on AMD Cayman. Maybe we'll get some good ol' price war.


----------



## cherubrock (Dec 7, 2010)

Why isn't lack of DisplayPort listed as a con? Nvidia isn't getting with the times and is still only supporting old connector standards, while AMD started supporting DP as soon as it was available. Yes, I know that DP availability on monitors isn't widespread yet, but if you're buying a video card that you hope will be a long-term investment, you might be wondering whether your future monitor will have DP and whether you'll want to use it.

Personally, lack of DP is the main reason keeping me away from Nvidia's cards. I know I'll be getting a pro monitor someday soon, and I'm hoping to leave behind ancient DVI, with its screws and bendable pins.


----------



## W1zzard (Dec 7, 2010)

entropy13 said:


> No box and bundle for this it seems.



no box, but asus sent the accessories.


----------



## assaulter_99 (Dec 7, 2010)

cherubrock said:


> Why isn't lack of DisplayPort listed as a con? Nvidia isn't getting with the times and is still only supporting old connector standards, while AMD started supporting DP as soon as it was available. Yes, I know that DP availability on monitors isn't widespread yet, but if you're buying a video card that you hope will be a long-term investment, you might be wondering whether your future monitor will have DP and whether you'll want to use it.
> 
> Personally, lack of DP is the main reason keeping me away from Nvidia's cards. I know I'll be getting a pro monitor someday soon, and I'm hoping to leave behind ancient DVI, with its screws and bendable pins.



Why don't you give some concrete reasons as to why it should be noted as a con? It already has HDMI, which will be around for some time. Also, it should be noted that the GPU does support DisplayPort, if the board manufacturers want to get in the game.


----------



## N3M3515 (Dec 7, 2010)

Excellent price!!! I hope AMD can bring the same performance with the HD 6950 for $299!


----------



## AltecV1 (Dec 7, 2010)

Fixed the noise and heat, but the power consumption is still horrible... WHY?????


----------



## CDdude55 (Dec 7, 2010)

Looks like a fantastic card; it matches, and in most tests beats, a GTX 480 for a nice price.

I thought it was going to cost over $400 at first which pulled me away.


----------



## mdsx1950 (Dec 7, 2010)

I gotta say this is awesome! GTX480 performance for $50 less. With better power consumption and lower heat.


----------



## crow1001 (Dec 7, 2010)

Still benching Dirt 2 in DX9, I see. Hell, why not bench Metro in DX9 as well?


----------



## W1zzard (Dec 7, 2010)

crow1001 said:


> Still benching Dirt 2 in DX9 I see, hell why not bench Metro in DX9 as well.



oh .. dirt 2 isn't supposed to be in those benchmarks. it's supposed to be replaced by f1 2010. ignore the dirt 2 numbers. while correct, they will be removed in the next review


----------



## dir_d (Dec 7, 2010)

$329 MSRP is pretty good for this card. This is the card to get so far if you held out on DX11. Now we just need to see what AMD has been up to.

Great Review W1zz i enjoyed it.


----------



## newtekie1 (Dec 7, 2010)

crow1001 said:


> Still benching Dirt 2 in DX9 I see, hell why not bench Metro in DX9 as well.



Because DX11 makes a visual difference in Metro2033, and it makes no difference in Dirt2 perhaps?


----------



## crow1001 (Dec 7, 2010)

FFS, there are plenty of comparisons out there of DX9 and DX11 in Dirt 2, and there is a difference. Benching DX11 cards that fly in DX11 Dirt 2 in DX9 is crazy. And yeah, Metro 2033 and DX11 tessellation: you need to take stills and zoom in to tell the difference lol. I can't name any other site that benches it in DX9.

http://www.pcgameshardware.com/aid,...mpared-Top-article-of-December-2009/Practice/


----------



## assaulter_99 (Dec 7, 2010)

Jeez, why are people such jackasses? There are plenty of benches in DX11, and all they do is whine about Dirt 2, which afaik was one of the most gimmicky DX11 games out there. What did we get in that mode? Better flag response to wind? Sexier water pools and splashes? It's not like it would have stressed the GPU 50% more than it did.


----------



## cadaveca (Dec 7, 2010)

The real DX11 features in Dirt 2 are only really used in replays, crow1001. Without using the exact same replay, it'd be impossible to ensure that the DX11 load is consistent across many cards. I think you ask for a bit too much, and give other sites a bit too much credit.


----------



## newtekie1 (Dec 7, 2010)

crow1001 said:


> FFS there is plenty of comparisons out there of DX9 and DX11 in dirt 2, there is a difference, DX11 cards that fly in DX11 Dirt 2 but benched in DX9 is crazy. And yeah Metro 2033 and DX11 tessellation, you need to take stills and zoom in to tell the difference lol. I can't name any other site that benches it in DX9.
> 
> http://www.pcgameshardware.com/aid,...mpared-Top-article-of-December-2009/Practice/



Yes, the crowd and flags look slightly better, big deal, but do you notice this when flying around the track doing 150 MPH?  No.

Differences that you have to stop the gameplay and zoom way into the scene to see aren't differences worth taking a performance hit for, IMO.  Which is exactly why I play Dirt 2 in DX9.


----------



## Imsochobo (Dec 7, 2010)

newtekie1 said:


> Ok, so where are all the people saying it is overpriced now?
> 
> I think for $350 it is a good value considering it matches a GTX480.  So for the performance that you got less than a year ago for $500 you can now get for $350, I like.



newtekie, you're a little fanboy tho.
But it's more acceptable now; prices are still way, way too high... How long does it have to take before my video card drops under its original price of £180???
It's currently £200 over a year later, which shows Nvidia failing to properly compete.
I shall congratulate Nvidia for making the 580, which is good; the 570 is good too, but power consumption is still pretty bad....
and the price should be $300 if it had a more efficient design.

Let's think about Cayman: 10% fewer transistors for the same performance, and AMD is already miles ahead in terms of performance/mm^2 and performance/watt.

Nvidia, I sincerely hope you can compete with the next gen, and maybe prove that Fermi is actually good.
Like ATI with the HD 2900 (a fail for many, except extreme OC).

As a last note:
it competes with ATI's current pricing, which won't drop unless ATI sees a drop in sales, or Nvidia prices lower.


----------



## crow1001 (Dec 7, 2010)

LOL, just got my spanking new DX11 570, let's go play some Dirt 2 with all them nice DX11 features like tessellation and DX11 lighting effects. But first I must edit the config file to run in DX9, because why would I want to run a DX11 title in DX11, lol. I could understand if it killed performance, but it does not, so why run in DX9 again??? Any punter out there who is considering buying a DX11 card like the 570 will run Dirt 2 on its default DX path, which is DX11; what use is it to them running it in DX9? May as well dump the bench for something more useful. Or run all the other DX11 benchmarks in DX9/10, as there are plenty of other titles out there that look not much different in DX11. AvP... catch my drift?


----------



## assaulter_99 (Dec 7, 2010)

crow1001 said:


> LOL, just got my spanking new DX11 570, let's go play some Dirt 2 with all them nice DX11 features like tessellation and DX11 lighting effects. But first I must edit the config file to run in DX9, because why would I want to run a DX11 title in DX11, lol. I could understand if it killed performance, but it does not, so why run in DX9 again??? Any punter out there who is considering buying a DX11 card like the 570 will run Dirt 2 on its default DX path, which is DX11; what use is it to them running it in DX9? May as well dump the bench for something more useful.



Well, go have fun with it; we'll sorely miss your negative and pointless comments.


----------



## motasim (Dec 7, 2010)

Imsochobo said:


> newtekie, you're a little fanboy tho.
> But it's more acceptable now; prices are still way, way too high... How long does it have to take before my video card drops under its original price of £180???
> It's currently £200 over a year later, which shows Nvidia failing to properly compete.
> I shall congratulate Nvidia for making the 580, which is good; the 570 is good too, but power consumption is still pretty bad....
> ...




I completely disagree; for this performance and at this price ($350) it is a very good deal. But I'll wait one more week and see what Mr. Cayman has to say before I make up my mind. I am sooooooooooooo glad now that I didn't get the 5870 or the GTX 480, and decided to wait till the end of the year and see. Let the price battle begin!


----------



## Fourstaff (Dec 7, 2010)

Move along, nothing to see here that we didn't expect, except perhaps the nice OC and good price point. And I like how W1zz got rid of the old generation cards; perhaps he thinks Nvidia and AMD both have next gen cards already? (as opposed to ATi's 5xxx against Nvidia's 4xx, which was unimpressive)


----------



## crow1001 (Dec 7, 2010)

assaulter_99 said:


> Well go have fun with it, your negative and pointless comments will sorely miss us.





It's constructive criticism; I'm not one of the sheep or a suckass.


----------



## newtekie1 (Dec 7, 2010)

Imsochobo said:


> newtekie, you're a little fanboy tho.
> But it's more acceptable now; prices are still way, way too high... How long does it have to take before my video card drops under its original price of £180???
> It's currently £200 over a year later, which shows Nvidia failing to properly compete.
> I shall congratulate Nvidia for making the 580, which is good; the 570 is good too, but power consumption is still pretty bad....
> ...



So it competes against ATi's prices, which you seem to not have a problem with, and you praise ATi left and right, yet I'm the fanboy for saying it isn't overpriced.

Are you sure you're labelling the right person the fanboy here?



crow1001 said:


> LOL, just got my spanking new DX11 570, let's go play some Dirt 2 with all them nice DX11 features like tessellation and DX11 lighting effects. But first I must edit the config file to run in DX9, because why would I want to run a DX11 title in DX11, lol. I could understand if it killed performance, but it does not, so why run in DX9 again??? Any punter out there who is considering buying a DX11 card like the 570 will run Dirt 2 on its default DX path, which is DX11; what use is it to them running it in DX9? May as well dump the bench for something more useful. Or run all the other DX11 benchmarks in DX9/10, as there are plenty of other titles out there that look not much different in DX11. AvP... catch my drift?



Why run in DX9?  Because DX11 does impact performance for no visual improvement in Dirt2.  Haven't I already said that?


----------



## crow1001 (Dec 7, 2010)

The 570 does over 90 FPS average in DX11 at 1080p, ultra quality, 4xAA; looks like a big hit in performance right there.

As said, where's the noticeable visual improvement in Aliens vs. Predator in DX11? May as well bench it in DX10 then.


----------



## Imsochobo (Dec 7, 2010)

motasim said:


> I completely disagree; for this performance and at this price ($350) it is a very good deal, but I'll wait for one more week and see what Mr. Cayman has to say before I make up my mind  I am sooooooooooooo glad now that I didn't get the 5870 nor the GTX 480 and decided to wait till end of the year and see, let the price battle begin



Compared to what the prices of the actual cards from AMD could be, you will be, well, shocked...
Nvidia usually uses twice as much die space for the same workload (4870 vs. 260, 5870 vs. 470), yeah..
and yet they're priced pretty much the same... and the memory bus is also smaller with AMD.
Just saying, graphics cards could be a lot cheaper, by at least around $50, for the 5850, 5870, 5970, 480, 570 and 580.


----------



## W1zzard (Dec 7, 2010)

W1zzard said:


> oh .. dirt 2 isn't supposed to be in those benchmarks. its supposed to be replaced by f1 2010. ignore the dirt 2 numbers. while correct they will be removed in the next review



that. please stop the dirt 2 discussion. i'd remove the numbers now, but then the summaries wouldn't be correct anymore


----------



## assaulter_99 (Dec 7, 2010)

crow1001 said:


> It's constructive criticism, I'm not one of the sheep and a suckass.



I just had a look at your recent posts; you sure are a really negative person. I won't get into cheap arguments. Once again, have fun with your GTX 570.


----------



## wolf (Dec 7, 2010)

fantastic review W1zzard, this is exactly what I expected from the GTX 570: GTX 480 performance with all the improvements GF110 brought.

I also see a few reviews commenting that, while the noise level does measure XX.X dB, the sound is a lot less harsh on the ear than other cards with the same noise output. Really looks like Nvidia did their research and planning with the new coolers.


----------



## WhiteLotus (Dec 7, 2010)

W1zzard said:


> that. please stop the dirt 2 discussion. i'd remove them now but then the summaries wouldn't be correct anymore



How many benches do you actually do? To me that statement indicates that you already did the F1 bench but forgot to include it. So do you run others that aren't included...?


----------



## ArmoredCavalry (Dec 7, 2010)

Sweet, $350 is a good price point imo.

Now, time to wait another week, and hope AMD can present some good competition. Maybe I'll upgrade my HD 5850.


----------



## W1zzard (Dec 7, 2010)

WhiteLotus said:


> How many benches do you actually do? To me that statement indicates that you already did the F1 bench but forgot to include it. So do you run others that aren't included...?



i rebenched all cards on new drivers. forgot to remove dirt 2 from the testing process
i bench crysis 2 every time but can't show you anything


----------



## crow1001 (Dec 7, 2010)

Crysis 2, is that a typo lol.


----------



## Fourstaff (Dec 7, 2010)

crow1001 said:


> Crysis 2, is that a typo lol.



How so?


----------



## assaulter_99 (Dec 7, 2010)

W1zzard said:


> i bench crysis 2 every time but cant show you anything



Hmm, very interesting indeed! I wonder if the launch is still set for early next year (since you seem to have it in hand) and how these cards fare with the game! If you did this on purpose, you are one cunning man!  That is only gonna start the hype!


----------



## Wyverex (Dec 7, 2010)

newtekie1 said:


> Ok, so where are all the people saying it is overpriced now?
> 
> I think for $350 it is a good value considering it matches a GTX480.  So for the performance that you got less than a year ago for $500 you can now get for $350, I like.


Not trying to be rude or anything, but where do you see the price as $350? 

Taken from the review:




I still think that $400 is a bit too much, but it's still a nice card for the performance it offers. Well, hopefully Cayman will be good and start a new price war, so we, the customers, can win again.


----------



## Fourstaff (Dec 7, 2010)

Wyverex said:


> Not trying to be rude or anything, but where do you see the price as $350?



http://www.newegg.com/Product/Product.aspx?Item=N82E16814130593

Amidoinitrite?


----------



## assaulter_99 (Dec 7, 2010)

Well, it's stated on the last page of the review!

NVIDIA's GeForce GTX 570 comes at an MSRP of $329.


----------



## CDdude55 (Dec 7, 2010)

Wyverex said:


> Not trying to be rude or anything, but where do you see the price as $350?
> 
> I still think that $400 is a bit too much, but it's still a nice card, for the performance it offers. Well, hopefully Cayman will be good and start a new price war so we, the customers, can win again



End of the review:

Newegg:
----------



## Wyverex (Dec 7, 2010)

So much for my _detailed_ reading.
I stand corrected.

PS: maybe W1zz should update the first page of the review.


----------



## DanTheMan (Dec 7, 2010)

W1zzard said:


> i rebenched all cards on new drivers. forgot to remove dirt 2 from the testing process
> i bench crysis 2 every time but cant show you anything



W1zzard, I know you work very hard, but you've got to love a guy that can "play" the upcoming games that we can only dream of right now. I bet your local friends say all the time, "can I come over and try out that new video card/game?" I know you have to honor the NDA, but man, I bet it's tempting to let your friends see the level of detail before stuff is released.  Hopefully AMD cards will play nice with Crysis 2; I'm really waiting for this game.


----------



## CDdude55 (Dec 7, 2010)

DanTheMan said:


> Hopefully *all* cards will play nice with Crysis 2 - I'm really waiting for this game.



fixed.


----------



## wolf (Dec 7, 2010)

crysis 2?

here's hoping it plays a lot better than the original on the same hardware.


----------



## johnnyfiive (Dec 7, 2010)

Would be awesome to see some 570 SLI numbers compared to 6870 CrossFire numbers.


----------



## N3M3515 (Dec 7, 2010)

DanTheMan said:


> W1zzard I know you work very hard, but you've got to love a guy that can "play" the upcoming games that we can only dream of right now. I bet your local friends say all the time - can I come over and try out that new video card/game. I know you have to honor the NDA but man I bet it's tempting to let your friends see the level of details before stuff is released.  Hopefully AMD cards will play nice with Crysis 2 - I'm really waiting for this game.



I think W1zzard was just being sarcastic......


----------



## ensabrenoir (Dec 7, 2010)

*Seeing green*

Enough with all this tech talk!  Let's talk about our feelings!  I hope someone out there invents a hack, mod kit or magic potion that allows us to combine and run AMD & Nvidia cards at the same time, combining the best of both (like Fusion on steroids).  This is the best I've seen from the green team in quite a while.  My next build will be venturing into SLI country.


----------



## Fourstaff (Dec 7, 2010)

ensabrenoir said:


> Enough with all this tech talk!  Let's talk about our feelings!  I hope someone out there invents a hack, mod kit or magic potion that allows us to combine and run AMD & Nvidia cards at the same time, combining the best of both (like Fusion on steroids).  This is the best I've seen from the green team in quite a while.  My next build will be venturing into SLI country.



It's called Lucid Hydra:

http://en.wikipedia.org/wiki/Hydra_Engine


----------



## wolf (Dec 7, 2010)

Fourstaff said:


> Its called Lucid Hydra
> 
> http://en.wikipedia.org/wiki/Hydra_Engine



lol I was going to say the same thing!


----------



## douglatins (Dec 7, 2010)

The 580 seems overpriced now.

I should have gotten 2 of these... maybe I will after I sell my 480. Actually, I will for sure!

W1zz, would you lower the score of the 580 now? This matters a lot to me.


----------



## ensabrenoir (Dec 7, 2010)

Fourstaff said:


> Its called Lucid Hydra
> 
> http://en.wikipedia.org/wiki/Hydra_Engine



Oh yeah, forgot about that. Heard it has issues though.


----------



## Fourstaff (Dec 7, 2010)

ensabrenoir said:


> oh yeah forgot about that  heard it has issues though



Yeah, Lucid is still in its infancy, and ATI and Nvidia are not playing nice with them.


----------



## ensabrenoir (Dec 7, 2010)

Fourstaff said:


> Yeah, Lucid is still currently in its infancy, and ATI and Nvidia is not playing nice with them.



Yeah, they are the issue!  Can almost understand their p.o.v. though.  Sort of like putting a Mustang engine in a Camaro: the purists won't have it, and the corporations don't want to share the profits.


----------



## Relayer (Dec 8, 2010)

So the new 480sp card (GTX 570) isn't any faster, slower actually, than the last gen 480sp card (GTX 480)? How is this a win? It uses less power and is cooler than the most power hungry, hottest GPU ever. That doesn't seem like a really big accomplishment. It is cheaper than the GTX 480, though.

Suppose Barts was 1600sp; what do you think its performance would be? Certainly it would kick the hell out of the 5870, not just match it.


----------



## wolf (Dec 8, 2010)

Relayer said:


> So the new 480sp (gtx-570) isn't any faster, slower actually, than the last gen 480sp (gtx-480)? How is this a win? It uses less power and is cooler than the most power hungry, hottest GPU ever. That doesn't seem like a really big accomplishment. It is cheaper than the gtx-480.



it IS an accomplishment: it's on par with what was for 9 months the fastest single GPU you could buy, and now it's cooler, quieter and a considerable amount cheaper too. also consider it likely that it would have been a GTX 475 had ATi not jumped to next generation naming. you have to think of it as a first gen refresh, because that's what it is. can't wait to see what ATi can pull off in terms of a single GPU this generation; Nvidia always tends to lead there and that's why I prefer them.



Relayer said:


> Suppose Barts was 1600sp, what do you think it's performance would be? Certainly it would kick the hell out of the 5870, not just match it.



nope, essentially it would match it. the reason they cut down the SPs in the first place was that cypress was unbalanced in terms of ROPs and SPs (too many SPs), so they tuned that a little, and slightly beefed up the tessellation.

supposing barts were 1600 SPs, it is essentially a 5870.


----------



## Relayer (Dec 8, 2010)

wolf said:


> nope, essentially it would match it. the reason they cut down the SPs in the first place was that cypress was unbalanced in terms of ROPs and SPs (too many SPs), so they tuned that a little, and slightly beefed up the tessellation.
> 
> supposing barts were 1600 SPs, it is essentially a 5870.




Sorry, I don't buy that premise at all. A 1600sp Barts would cream the 5870. They, of course, would increase other areas of the card to match.


----------



## wolf (Dec 8, 2010)

Relayer said:


> Sorry, I don't buy that premise at all. A 1600sp Barts would cream the 5870.



we'll never know, but I stand by my logic.



Relayer said:


> They, of course, would increase other areas of the card to match.



then that's not a 1600sp barts, is it


----------



## Relayer (Dec 8, 2010)

wolf said:


> we'll never know, then that's not a 1600sp barts is it



Whatever. You can try to avoid the point I was making. It's not hard to improve on the hottest, most power hungry design ever after 8 months. Especially if you are going to make it slower overall, clock for clock.


----------



## jamsbong (Dec 8, 2010)

This is a good value, high performance card. Kinda like the 5850 was.

One thing some of you may have noticed is that the 2nd generation NV 40nm chips are much more mature. On the other hand, ATI's 40nm has always been mature, so the 68xx cards felt less special when they launched. Maybe the 69xx cards could change that perception.

The next critical GPU performance upgrade will depend heavily on new manufacturing tech. I think both ATI and NV are counting on GF or TSMC to give them 28nm or similar.


----------



## wolf (Dec 8, 2010)

Relayer said:


> Whatever. You can try and avoid the point I was making. It's not hard to improve on the hottest, most power hungry design ever after 8mos. Especially if you are going to make it slower overall, clock for clock.



no problem, it's just a difference in opinion. however, I already made my answer to your point: the refinement to GF100 is a decent accomplishment after Nvidia dropped the ball so badly with it.

GF110 is essentially the same performance clock for clock; what they've been able to do with the refinement is increase yields and up the clockspeeds. clock for clock you'd have to be comparing the 570 to a 470, where it is faster solely because more SPs are active, and perhaps a little in FP16 heavy titles (as GF110 is capable of double GF100's throughput in that regard). the real advantage is decent clockspeed increases and much better OC headroom for enthusiasts, and let's face it, GF110 is an enthusiast GPU.

All I think is that Nvidia have done a good job turning around the bad situation they were in. we now have a GTX 580 which consistently beats a 480 while using less power and making less heat, and likewise for the 570 vs the 470; power is almost the same, but the performance delta is bigger than 480 to 580: more like 20-25% (570 vs 470) as opposed to 10-15% (580 vs 480).


----------



## TinksMeOff (Dec 8, 2010)

*2560X1600 conclusion*

The conclusion made this statement, and it's unclear to me what it means:

"In most of the latest DirectX 10 and DirectX 11 games, the GTX 570 will provide you comfortable gameplay with quite some eye-candy enabled, at 1920 x 1200 resolutions. It will also make gaming at 2560 x 1600 possible with some loss of detail."

I just bought a 30" HP ZR30w IPS LCD and I am wondering if the 580 has the same quality loss issue.  What cards are you comparing the 'quality' factor against exactly?  Why is there a quality loss with the 570?  Thanks for any clarification on this statement.

Edit:  It just dawned on me that you're talking about turning down game graphics settings in order to get proper FPS.


----------



## N3M3515 (Dec 8, 2010)

wolf said:


> supposing barts were 1600 sp essentially is a 5870.



How you think barts with 1600 sp would perform just like a 5870 is beyond my comprehension, considering 1120 sp barts is faster than 1440 sp cypress.

On topic: seeing how the 570 is 22% avg faster than the 470, that would make it ~35% faster than a 5850; so the HD 6950 would "only" have to be 40-45% faster than the 5850 in order to beat the GTX 570, and I bet the 6950 is going to cost $299.

If you think about it, you reach this conclusion:

9800GTX = HD4850
GTX 285 < HD5850
GTX 480 (GTX 570) < HD6950 ??
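The "22% faster than a 470 means ~35% faster than a 5850" estimate above is just compounded relative speedups. A minimal sketch of that arithmetic, where the 22% figure is from the post but the assumed GTX 470 vs HD 5850 gap (~11%) is an illustrative placeholder, not a measured value:

```python
def chain(*speedups):
    """Compound a chain of relative speedups (1.22 means +22%)."""
    result = 1.0
    for s in speedups:
        result *= s
    return result

gtx570_vs_gtx470 = 1.22  # from the post above
gtx470_vs_hd5850 = 1.11  # assumed placeholder for illustration

gtx570_vs_hd5850 = chain(gtx570_vs_gtx470, gtx470_vs_hd5850)
print(f"GTX 570 vs HD 5850: +{(gtx570_vs_hd5850 - 1) * 100:.0f}%")
# With these inputs, roughly +35%, in line with the post's estimate.
```

Note the gains multiply rather than add, which is why +22% on top of +11% lands at +35% rather than +33%.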


----------



## Benetanegia (Dec 8, 2010)

Relayer said:


> Whatever. You can try and avoid the point I was making. It's not hard to improve on the hottest, most power hungry design ever after 8mos. Especially if you are going to make it slower overall, clock for clock.



You can look at it this way: if GF100 ==> R600, then GF110 ==> RV670, but without the luxury and advantage of a smaller node. Nvidia has done a much, much better job "fixing" GF100 than AMD did "fixing" R600. Who knows what will happen with the next chip.



N3M3515 said:


> How you think barts with 1600 sp would perform just like a 5870 is beyond my comprehension, considering 1120 sp barts is faster than 1440 sp cypress.



He thinks that because it's the plain truth. 1120 SP Barts is NOT faster than 1440 SP Cypress, not at all, lol. What's beyond my comprehension is how you can even make that comparison. 1440 SP Cypress @ 900 MHz >> Barts @ 900 MHz, and both attain similar clocks when OCed, so it's not as if Barts could do 1200 MHz.

The reason that Barts with 1120 SPs is similar to Cypress is that the architecture couldn't handle so many SPs to begin with, hence they lowered the amount of them.


----------



## N3M3515 (Dec 8, 2010)

Benetanegia said:


> You can look at it this way. If GF100 ==> R600 then GF110 ==> RV670, but without the luxury and advantage of having an smaller node. Nvidia has done a much much better job "fixing" GF100 than AMD did "fixing" R600. Who knows what will happen with the next chip.



Much, much better?? Why?
I agree it's better, but not that much much much much......... like if it was 50% or something.
The GTX 580 is 20% faster than the GTX 480 and better at perf per watt.
The HD 3870 = HD 2900 XT, but much, much better at perf per watt.


----------



## Benetanegia (Dec 8, 2010)

N3M3515 said:


> Much much better?? why?
> I agree better, but no that much much much much.........like if it was 50% or something
> gtx 580 20% faster than gtx 480 and better at perf per watt
> hd 3870 = hd 2900xt but much much much better at perf per watt



No, it didn't really have better perf/watt if we consider that the node change alone already yields about 2x the perf/watt. All the improvements came from going to 55nm. Nvidia has achieved significantly better performance at lower power consumption on the same "fucked up" node.


----------



## N3M3515 (Dec 8, 2010)

Benetanegia said:


> No it didn't have better perf/watt really if we consider that the node change already yields about 2x the perf/watt. All the improvements came from going 55nm. Nvidia has achieved significantly better performance at lower power consumption on the same "fucked up" node.



Then you can't compare the two; you are basing them on different scenarios. A fair comparison would be if AMD had to do the trick at the same node.

EDIT: a fair comparison will be when Cayman is out.


----------



## wolf (Dec 8, 2010)

N3M3515 said:


> How you think barts with 1600 sp would perform just like a 5870 is beyond my comprehension, considering 1120 sp barts is faster than 1440 sp cypress.



keep in mind they did make other tweaks, mainly to tessellation performance, but not to the straightforward shader architecture.

and now that you're comparing 1120sp barts to 1440sp cypress, keep in mind the clockspeed differences between the two. 1440sp cypress is a 5850 clocked at 725mhz core; the 1120sp barts is clocked at 900mhz core, with the same amount of ROPs.

keep in mind also that when a 5850 and 5870 are clocked the same, the difference is almost nil, somewhere in the vicinity of 2-3%, despite losing 10% of the SPs. this confirms the chip was unbalanced in terms of ROPs vs SPs.

I am being swayed to think that a "barts" with 1600 SPs would be a tad faster, but only because it would have more tessellation performance, not more shader performance. just keep in mind 6800s are clocked faster than 5800s to help make up for their lack of SPs:

5850 = 725mhz, 6850 = 775mhz
5870 = 850mhz, 6870 = 900mhz

again it's just a difference in opinion, and we will never know the difference because such a card won't be made now.

sorry for being so off topic
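The clock figures listed above translate into roughly a 6-7% core-clock advantage for the 68xx cards. A quick sketch of that comparison, assuming performance scales roughly linearly with core clock (a first-order approximation only; real scaling is usually a bit lower):

```python
# Core clocks from the post above, in MHz.
clocks_mhz = {
    "HD 5850": 725, "HD 6850": 775,
    "HD 5870": 850, "HD 6870": 900,
}

def clock_advantage(new: str, old: str) -> float:
    """Relative core-clock advantage of `new` over `old`, as a fraction."""
    return clocks_mhz[new] / clocks_mhz[old] - 1

for new, old in (("HD 6850", "HD 5850"), ("HD 6870", "HD 5870")):
    print(f"{new} vs {old}: +{clock_advantage(new, old) * 100:.1f}% core clock")
```

So the higher clocks plausibly offset much of the SP reduction, which is the argument being made here.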


----------



## N3M3515 (Dec 8, 2010)

wolf said:


> keep in mind they did make other tweaks, mainly to tessellation performance, but not to the straightforward shader architecture.
> 
> and now that you're comparing 1120sp barts to 1440sp cypress, keep in mind the clockspeed differences between the two. 1440sp cypress is a 5850 clocked at 725mhz core; the 1120sp barts is clocked at 900mhz core, with the same amount of ROPs.
> 
> ...



You make a good point; the card's shaders aren't being used to their full potential. I guess that's what AMD addressed with the revamped shader setup in the 6900s, or whatever that is called.


----------



## Benetanegia (Dec 8, 2010)

N3M3515 said:


> Then you can't compare the two, you are based on different scenarios, a fair comparison is if amd had to do de trick at the same nm.



Then maybe they would do as good a job as Nvidia has done now, but that's beside the point: Nvidia has done a better job than AMD did back then, and that's all that matters to the point I was making.

HD3870 vs HD2900XT perf/watt (same perf): 68% vs 100%

GTX480 vs GTX570 perf/watt: 74% vs 100%, without a node change.

So let's define what "much much better" is, because it's clear that we are both using it liberally, but ultimately if HD3870 had a much much better perf/watt then Nvidia is doing a much much better job. 
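Those relative figures reduce to simple arithmetic. A minimal sketch of the comparison, assuming two cards with equal performance where the less efficient one draws ~47% more power (the performance and wattage inputs here are hypothetical placeholders, not measurements from the review):

```python
def rel_perf_per_watt(perf_a, watts_a, perf_b, watts_b):
    """Card A's performance per watt as a percentage of card B's."""
    return (perf_a / watts_a) / (perf_b / watts_b) * 100

# Equal performance, card A drawing ~47% more power than card B:
# A lands at roughly 68% of B's perf/watt, an HD2900XT-style gap.
print(round(rel_perf_per_watt(100, 147, 100, 100)))  # 68
```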



N3M3515 said:


> EDIT: a fair comparison will be when cayman is out



This is not AMD vs Nvidia. Why do people have to turn every discussion into an "AMD is better than Nvidia", or vice versa, argument? It's stupid.

I am talking about how the 2 companies stepped back from their failures and how Nvidia seems like it's doing better. Does it mean Nvidia is better? No. Does it mean the next Nvidia chip will be a lot better? No, but it does open up a very, very big possibility.


----------



## wolf (Dec 8, 2010)

N3M3515 said:


> You make a good point, the card's shaders aren't being used at its full potential, i guess that's what amd addressed with the revamped shader system in the 6900's or whatever that is called.



yeah, they must have figured out the proper ratio needed for using the chip to its peak potential, and decided to amp up the tessellation while in there.


----------



## newtekie1 (Dec 8, 2010)

Relayer said:


> So the new 480sp (gtx-570) isn't any faster, slower actually, than the last gen 480sp (gtx-480)?



You know there are more specs than just number of SPs.



Relayer said:


> How is this a win? It uses less power and is cooler than the most power hungry, hottest GPU ever. That doesn't seem like a really big accomplishment. It is cheaper than the gtx-480.



Lower power consumption and lower temperatures was the point of the tweaks, not to make the card faster clock for clock.  In fact, the card would likely be equal clock for clock if the memory system was kept the same.


----------



## Hayder_Master (Dec 8, 2010)

great review, awesome overclocking for this card


----------



## Relayer (Dec 8, 2010)

newtekie1 said:


> Lower power consumption and lower temperatures was the point of the tweaks, not to make the card faster clock for clock.  In fact, the card would likely be equal clock for clock if the memory system was kept the same.



Well, you've obviously been given information that I wasn't aware of. I just assumed that more performance was always what it was about. They could have saved themselves a lot of money and effort if they just added the software tweak to reduce peak consumption with Furmark/OCCT to the 480 and dropped the price.


----------



## CDdude55 (Dec 8, 2010)

Relayer said:


> Well, you've obviously been given information that I wasn't aware of. I just assumed that more performance was always what it was about. They could have saved themselves a lot of money and effort if they just added the software tweak to reduce peak consumption with Furmark/OCCT to the 480 and dropped the price.



But that wasn't the point, they refreshed GF100 to create a more efficient design. Of course, if all they wanted to do was reduce the clock speeds in OCCT and Furmark, they would have just added a power limiter to the 400 series. But they instead addressed the issues while giving the cards a nice performance boost.


----------



## jasper1605 (Dec 8, 2010)

I like the addition of system power usage figures on the voltage tweaking page (sorry if I missed that on other reviews, but it's helpful to see)


----------



## Relayer (Dec 8, 2010)

CDdude55 said:


> But that wasn't the point, they refreshed GF100 to create a more efficient design. Of course, if all they wanted to do was reduce the clock speeds in OCCT and Furmark, they would have just added a power limiter to the 400 series. But they instead addressed the issues while giving the cards a nice performance boost.



That's the point, the 570 offers no performance boost over the comparable last-gen 480sp GPU, just lower power and pricing. While I appreciate both of those "features", I'm just a bit disappointed that's all they've done.


----------



## Benetanegia (Dec 8, 2010)

Relayer said:


> That's the point, the 570 offers no performance boost over the comparable last gen. 480sp GPU, just lower power and pricing. While I appreciate both of those "features" I'm just a bit disappointed that's all they've done.



Comparable "last gen" is GTX470 so yes it does offer a significant performance boost. 

It's not next gen anyway and everybody knows that. It's all part of marketing wars, AMD named Barts HD6000 although it's not a new generation either so Nvidia is forced to move up one generation too.

And speaking of Barts (sorry, this is off topic), there's one thing I realized when looking at this review, and that is that Barts is not as fast as it first appeared to be:







It's a lot slower than Cypress despite running at nearly 10% higher clocks. I'm 100% sure that Barts looked faster because of the new optimizations in the newer drivers, and what we are looking at now in the chart above is HD58xx cards performing much better (also compared to the GTX470) than they did at Barts' launch.

From Guru3d



> Speaking of AMD, the ATI graphics team at default driver settings applies an image quality optimization which can be seen, though very slightly and in certain conditions. It gives their cards ~8% extra performance. NVIDIA does not apply such a tweak and opts better image quality. We hope to see that move from AMD/ATI soon as well.



It's just that extra 8% that made Barts look so fast in release reviews, and now that new drivers have been used on all the cards, it's the reason the HD5xxx cards are faster now.


----------



## N3M3515 (Dec 8, 2010)

Benetanegia said:


> Comparable "last gen" is GTX470 so yes it does offer a significant performance boost.
> 
> It's not next gen anyway and everybody knows that. It's all part of marketing wars, AMD named Barts HD6000 although it's not a new generation either so Nvidia is forced to move up one generation too.
> 
> ...



Still, after seeing that, the 6850 is still faster than the GTX 460 1GB, and the 6870 is faster than the 5850 and equal to the GTX 470, and it has way better scaling in CrossFire than Cypress ever had. So it's fairly fast for its price tag imho. Whatever that 8% was, I don't see it relative to the GTX 470, HD5850 or GTX 460 1GB, maybe the HD5870.

EDIT: do you have an image that compares with and without the optimization?


----------



## newtekie1 (Dec 8, 2010)

Relayer said:


> Well, you've obviously been given information that I wasn't aware of. I just assumed that more performance was always what it was about. They could have saved themselves a lot of money and effort if they just added the software tweak to reduce peak consumption with Furmark/OCCT to the 480 and dropped the price.



Except that isn't what lowered the power consumption, that only lowers power consumption when OCCT and Furmark are run, not everywhere else.  So the lower power consumption we see everywhere else was thanks to the tweaks.

Performance is not always the driving force behind tweaks and refreshes.  Look at some history and do a little research:

G80 -> G92 = Not for Performance, for Power and Heat Improvements
R600 -> RV670 = Not for Performance, for Power and Heat Improvements
G70 -> G71 = Not for Performance, for Power and Heat Improvements



Relayer said:


> That's the point, the 570 offers no performance boost over the comparable last gen. 480sp GPU, just lower power and pricing. While I appreciate both of those "features" I'm just a bit disappointed that's all they've done.




Again, looking at one specification of the graphics card, and simply comparing two graphics cards on that single spec alone and saying the two are the same, is absurd.  This isn't a GTX480.  Yes, it has 480SPs like the GTX480, but it also has a 320-bit memory bus like the GTX470.  Why doesn't the HD5770 perform better than the HD4890?  They are both 800SP cards, so you must be fuming that the HD5770 doesn't outperform the HD4890.  It is a real disappointment to you, I'm sure.  Why not make some more absurd comparisons and then base your opinion on those?  The 1600SP HD5870 gets its ass handed to it by the 512SP GTX580, so the HD5870 must be a huge piece of shit by your standards.:shadedshu


----------



## Relayer (Dec 9, 2010)

Benetanegia said:


> Comparable "last gen" is GTX470 so yes it does offer a significant performance boost.
> 
> It's not next gen anyway and everybody knows that. It's all part of marketing wars, AMD named Barts HD6000 although it's not a new generation either so Nvidia is forced to move up one generation too.
> 
> ...



I was comparing the 2 480sp parts. The GTX 470 should be slower. It's older and has fewer SPs.

AMD optimizations are irrelevant. A 1600sp Barts would be more than 8% faster than Cypress. I know you don't agree. We'll never be able to settle that other than to try and apply some common sense. So, I guess we'll just disagree about it.

AMD's fault that nVidia changed to the 500 series. OK, if you say so.


----------



## Benetanegia (Dec 9, 2010)

N3M3515 said:


> Still after watching that, 6850 still faster than gtx 460 1GB and 6870 faster than 5850 and equal to gtx 470, and has way better scaling in crossfire than cypress ever had. So it's fairly fast for its price tag imho. Whatever that 8% was i don't see it relative to gtx 470, hd5850 or gtx 460 1GB, maybe hd5870.



But it does tell a very different story than "Barts is as fast as Cypress while having fewer SPs". It's been demonstrated like a million times that a HD5850 @ HD5870 clocks is just as fast as a HD5870, so that clearly means that 1440 SPs at 850 MHz are 12% faster (91/81 = 1.12 = +12%) than 1120 SPs @ 900 MHz, as you can see in the chart above. And probably you could actually disable more SPs and would still get similar performance/clock down to 1280 SPs. That's why Cypress is only about 60% faster than RV790 or Juniper at the same clocks despite being 2x them: because the dispatch unit was never able to feed so many SIMDs. Why do you think that Barts has 2 of them but only 14 SIMD units? Because that's the hot spot.

That's why a 1600 SP Barts would only be just as fast as Cypress (+/- 5%), because Barts actually is Cypress with 6 fewer SIMDs.
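The utilization argument can be sanity-checked against raw theoretical throughput. A rough sketch, using the SP counts and clocks quoted in this post plus the 91% vs 81% chart readings (treat these as quoted figures, not fresh measurements):

```python
# Theoretical shader throughput scales with SP count x clock;
# the measured gap from the chart is much smaller, which is the
# under-utilization argument: Cypress can't keep all 1440 SPs fed.
theoretical = (1440 * 850) / (1120 * 900)  # ~1.21, i.e. +21%
measured = 91 / 81                         # ~1.12, i.e. +12%

print(f"theoretical gain: +{(theoretical - 1) * 100:.0f}%")
print(f"measured gain:    +{(measured - 1) * 100:.0f}%")
```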

And about the GTX460 and GTX470 and how they relate to Barts performance... any guess why both cards got a 50 MHz bump just some weeks before Barts launched? Where do you think a 725 MHz HD6850 would be in the chart?



> EDIT: do you have an image that compares with and without the optimization?



Sure, you can find some here:

http://www.guru3d.com/article/exploring-ati-image-quality-optimizations/


----------



## micropage7 (Dec 9, 2010)

I'm just waiting for a custom board layout, it would be nice since the stock cooler is kinda boring


----------



## Volkszorn88 (Dec 9, 2010)

For about 450 USD, you can purchase a Sapphire HD 5970, which is still a power house monster.


----------



## N3M3515 (Dec 9, 2010)

Benetanegia said:


> But it does tell a very different story than "Barts is as fast as Cypress while having fewer SPs". It's been demonstrated like a million times that a HD5850 @ HD5870 clocks is just as fast as a HD5870, so that clearly means that 1440 SPs at 850 MHz are 12% faster (91/81 = 1.12 = +12%) than 1120 SPs @ 900 MHz, as you can see in the chart above. And probably you could actually disable more SPs and would still get similar performance/clock down to 1280 SPs. That's why Cypress is only about 60% faster than RV790 or Juniper at the same clocks despite being 2x them: because the dispatch unit was never able to feed so many SIMDs. Why do you think that Barts has 2 of them but only 14 SIMD units? Because that's the hot spot.
> 
> That's why a 1600 SP Barts would only be just as fast as Cypress (+/- 5%), because Barts actually is Cypress with 6 fewer SIMDs.
> 
> ...



And what about the amazing crossfire scaling of barts?

EDIT: I read the article; honestly, I thought it would be worse.


----------



## Benetanegia (Dec 9, 2010)

N3M3515 said:


> And what about the amazing crossfire scaling of barts?



And why does midrange always scale better than high-end in multi-GPU setups?

Because the system is less of a "bottleneck"

and

Lower SP count = better internal management and utilization of resources = better scaling

And besides that, has anyone tested HD58xx CrossFire scaling with the newer drivers? I have not seen any review doing so. Maybe scaling is just better with newer drivers, and that alongside the lower SP count (= better utilization) makes Barts look much better, when it's not "much" better, only a bit better. 

Almost everyone compares reviews and reviews are made when cards are launched. Comparing HD5xxx CF scaling and HD68xx CF scaling reviews by W1zzard, for example, is pointless right now, there's been a full year of optimizations in between.

Actually I'm just asking, has anyone extensively compared them with latest drivers to see if that's true?



> EDIT: I read the article; honestly, I thought it would be worse.



But you can see that there IS an 8% performance difference, which was my point. Regarding the visuals, it's an optimization, and an optimization should never be part of default settings, no matter to what extent it is noticeable or how many people are actually able to see it while gaming. 99.9% of people would not be able to tell the difference between a "raw" 25 GB 1080p Blu-ray movie and a good 5 GB 1080p DivX rip, but that's not a green light for anyone to start selling DVDs with lossy DivX movies on them as if they were Blu-rays, or simply as HD.

Very few people are able to see the difference between an actual diamond and zirconia or moissanite, but if you buy a diamond and *pay* for a diamond, you want a diamond. You get the point.

AMD should be honest about it and disable them by default.

For me it is very noticeable and annoying. You can hardly see it in screenshots, but in games (or videos) it is very noticeable, at least to many people. Me, I probably wouldn't care so much, because the first thing I do when I install new drivers is go to the control panel and enable the High Quality profile. Regardless of that, I don't like companies cheating, and I do consider it cheating. For me, "if you don't see it, it's not cheating" is not an excuse.


----------



## N3M3515 (Dec 9, 2010)

Benetanegia said:


> And why does midrange always scale better than high-end on multi-GPU setups?
> 
> Because system is less of a "bottleneck"
> 
> ...



Man, you can write... lol
So Barts is nothing, and they could have launched it at the beginning of the year?
Comparing to old benchmarks? Correct me if I'm wrong, but every site benchmarks all the graphics cards with the latest drivers! Only W1zz didn't include 5870 and 5850 CrossFire results, but a lot of other sites did, and Barts scales way better than Cypress according to them.


----------



## Benetanegia (Dec 9, 2010)

N3M3515 said:


> Man, you can write... lol
> So Barts is nothing, and they could have launched it at the beginning of the year?



Definitely.



> Correct me if I'm wrong, but every site benchmarks all the graphics cards with the latest drivers!



Then I correct you. 

Most reviews that I remember use older drivers for the older cards and the launch drivers (beta drivers most of the time) for the new cards. Maybe my memory is failing me on this.

Anyway, can you link me to one of those reviews? I don't remember seeing HD58xx CF in HD68xx reviews, but I may have just overlooked them. 

And please link me to an extensive review, not one of those that test 3-4 games at one resolution... that's far from conclusive, and more probably than not any advantage seen there comes from specific optimizations made for those games, and the games used in the review as well as the settings were "suggestions" from the manufacturer...


----------



## N3M3515 (Dec 10, 2010)

Benetanegia said:


> Definitely.
> 
> 
> 
> ...



Look here:
Techreport
Anandtech
Guru3d


----------

