# Zotac GeForce GTX 260 Amp² Edition 216 Shaders



## W1zzard (Sep 16, 2008)

Today NVIDIA released their updated GeForce GTX 260 GPUs which come with 24 extra shaders, for a total shader count of 216. Zotac has also overclocked their card beyond the GTX 280 clock speeds which results in a card that is just 1% slower than the regular GTX 280, yet costs over $130 less.

*Show full review*


----------



## btarunr (Sep 16, 2008)

Does someone still want to buy a GTX 280? Awesome card for $299.


----------



## newtekie1 (Sep 16, 2008)

An awesome review, and one I was waiting for ever since the updated GTX260 was announced.  I totally agree that the new card should have a different name, GTX260+ would be my preference, but maybe they are saving the + label for the die-shrunk cards.  It seems, though, that the manufacturers of the cards are taking control of the naming and at least making it pretty clear in their own ways that this isn't a normal GTX260.

I assume that this card can be used in SLi with a normal GTX260?

And damn, it is great to see that it does what nVidia wanted and gives them a card above the HD4870 that doesn't cost an arm and a leg to get.  $300 is just an amazing price for this card, and since this is an overclocked version, I wonder if that means normal GTX260 216's will be in the $280-$290 range?  That would be wonderful as it will drive the prices of the normal GTX260's down further.  Still $300 for GTX280 performance is wonderful, of course once both are overclocked the GTX280 still holds the lead, but is it worth an extra $130?

It was very interesting to see that overclocking wasn't affected, usually adding more shaders lowers the final overclock as it is harder to get more shaders stable.  However, it seems that all the GTX cards max out at about 715MHz.  What is even more impressive is that it uses less power than a GTX260, I wonder why?


----------



## btarunr (Sep 16, 2008)

Does this have the same DeviceID as the older GTX 260? If so, yes, SLI with an older 260 looks possible. The GTX 280 isn't worth $420 anymore; the HD 4870 X2 is a no-brainer. For $560, you'd get two of these GTX 260 216SP cards (stock speed) for use in SLI. Watch them own a HD 4870 X2 while being just $10 more.


----------



## Mussels (Sep 16, 2008)

ahh, gotta love the graphs at the end, saves me wading through a million benchmark results. This card is better per $ and per watt than the original GTX260, and it almost matches a GTX280.

Seems like a card i could upgrade to at last...


----------



## DaMulta (Sep 16, 2008)

They should have named it 270


----------



## Edito (Sep 16, 2008)

The card is simply amazing at this price range, and it's interesting the way NVIDIA manages their resources, the shaders and the clocks, because this card has fewer shader processors and a lower memory clock yet runs better in various aspects. I like it... I think I'll buy one at Christmas...


----------



## johnnyfiive (Sep 16, 2008)

NVidia are the sneakiest bastards ever! That is an AWESOME CARD for the price! Wow... totally would get that over a 4870 HANDS DOWN.


----------



## Darkrealms (Sep 16, 2008)

Thanks for another great review *W1zzard*!

I agree, there goes half the sales that would have been GTX280s; it's almost a "why bother".

Question, maybe this was just me (or I didn't read enough text) but why did this card do SO poorly on the Quake 4 section??

I agree a name change would have been helpful GTX265 was my vote, LoL.

So from what I've read of the comments this could be SLI'd with a current/old GTX260?  
Any thoughts on if this would be bogged down by the older "poorly performing" (LoL) models?


----------



## W1zzard (Sep 16, 2008)

i think the problems in quake 4 are because we use AA, which seems to work right only on ati drivers. next complete rebench i'll test quake 4 without AA


----------



## Kwod (Sep 16, 2008)

W1zzard said:


> i think the problems in quake 4 are because we use AA, which seems to work right only on ati drivers. next complete rebench i'll test quake 4 without AA



Ready 4 a stoopid question
Can you run a 260 with a P45 mobo?


----------



## erocker (Sep 16, 2008)

Yes you can.  Any PCI-E x16 slot will work.  Now I'm off to look for a cheaper 260GTX without the +.


----------



## Darkrealms (Sep 16, 2008)

W1zzard said:


> i think the problems in quake 4 are because we use AA, which seems to work right only on ati drivers. next complete rebench i'll test quake 4 without AA


Thanks, actually that's probably a good thing.  It lets people know how the cards do with everything on in the games they like.  I was just curious.


Kwod said:


> Ready 4 a stoopid question
> Can you run a 260 with a P45 mobo?





erocker said:


> Yes you can.  Any PCI-E x16 slot will work.  Now I'm off to look for a cheaper 260GTX without the +.


I think he was harassing me for all my questions  ; )


----------



## EastCoasthandle (Sep 16, 2008)

GeForce GTX 260 AMP2! Edition
GPU 650MHz
Mem 1050MHz
216 Processor Cores @ 1400MHz

vs

GeForce GTX 260
GPU 576MHz
Mem 999MHz
192 Processor Cores @ 1242MHz


I'm willing to assume that if the regular 260 was OC'd to the GeForce GTX 260 AMP2! Edition's clocks, the results would have been very similar.  I honestly don't believe anyone with either a regular 260 or a 4870 has anything to worry about.  A simple overclock should bring the regular 260 and 4870 up to snuff IMO.
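As a rough sanity check on that assumption, shader throughput scales with shader count times shader clock. A back-of-the-envelope sketch using the clocks listed above (this deliberately ignores memory bandwidth, ROPs, and everything else that matters):

```python
# Back-of-the-envelope shader throughput: shader count x shader clock.
# Clocks are taken from the spec comparison above.
amp2 = 216 * 1400   # AMP2! Edition: 216 SPs @ 1400MHz
stock = 192 * 1242  # regular GTX 260: 192 SPs @ 1242MHz

out_of_box_gap = amp2 / stock - 1   # gap at each card's stock clocks
matched_clock_gap = 216 / 192 - 1   # gap if both ran the same shader clock

print(f"out of box: {out_of_box_gap:.1%}, matched clocks: {matched_clock_gap:.1%}")
```

So an overclocked regular 260 closes most of the gap but still trails by the 24 extra shaders (12.5% in raw shader throughput), which is consistent with the point being made here.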


----------



## r9 (Sep 16, 2008)

DaMulta said:


> They should have named it 270



Go figure NVIDIA out. They don't change anything on the card, just the model name: 8800 GS to 9600 GSO, 8800 GT to 9800 GT. And the first time they actually change something, they want to keep the name.


----------



## newtekie1 (Sep 16, 2008)

r9 said:


> Go figure NVIDIA out. They don't change anything on the card, just the model name: 8800 GS to 9600 GSO, 8800 GT to 9800 GT. And the first time they actually change something, they want to keep the name.



This isn't the first time they have done this.  They did it with the 8800GTS 640MB too, then they reused the 8800GTS name again with the 512MB 8800GTS, which was essentially an entirely different card.

I don't know about their naming scheme, it is too confusing IMO, but that is why you have to do your research before buying.


----------



## Scrizz (Sep 17, 2008)

dam nice card!


----------



## Kursah (Sep 17, 2008)

I just signed up for EVGA's Step-Up to their version, the GTX260 Core 216... I'm in line atm, can't wait to get this process over with! I figured all I gotta pay is shipping, so I might as well get a more powerful product that will last a little longer lol.


----------



## Sasqui (Sep 17, 2008)

Shit!  I now wish the silicon fairies would turn my 4870 into a 260+

I'm still amazed by the difference between my old 2900 and my now 4870, but...

Ultimate question - can AMD/ATI do something similar?


----------



## Ekklesis (Sep 17, 2008)

Sasqui said:


> Ultimate question - can AMD/ATI do something similar?






Not a chance...


----------



## Edito (Sep 17, 2008)

ppl, I'm a bit confused: the 4870 has 800 shader processors, and now the GTX260 has only 216 shader processors, yet it's better than the HD4870??? How come? Isn't that somehow strange???


----------



## Mussels (Sep 17, 2008)

Edito said:


> ppl, I'm a bit confused: the 4870 has 800 shader processors, and now the GTX260 has only 216 shader processors, yet it's better than the HD4870??? How come? Isn't that somehow strange???



no. they're different GPUs by different companies. It's like saying a car engine at 1,500RPM is faster than an engine at 1,000RPM... you're ignoring factors such as how many cylinders the engine has (4 cylinder, 6 cylinder, V8, etc.), what gear it's in, and how much load it's pulling.

It's a hell of a lot more complicated than just the one number!


----------



## Edito (Sep 17, 2008)

Ok ok, I understand, but damn, 800 shader processors... feels like all 800 shaders are doing nothing... but I agree with u...


----------



## Mussels (Sep 17, 2008)

its marketing. they love quoting the biggest number they legally can.


----------



## Bjorn_Of_Iceland (Sep 17, 2008)

wow.. just wow. what a great card


----------



## Ekklesis (Sep 17, 2008)

Edito said:


> Ok ok, I understand, but damn, 800 shader processors... feels like all 800 shaders are doing nothing... but I agree with u...





Actually, we can say without being wrong that in "the real world" ATI's cards have 160 SPs, because if I remember correctly they are organised in clusters of 4+1 SPs, and they cannot act like 800 SPs all the time...


----------



## alexp999 (Sep 17, 2008)

Just had an email through from EVGA about their GTX 260 with 216 cores. Just waiting for the official press release to put it in the news. Seems that they have not clocked it as high. I see w1zz got this one to 715 core. Is that as high as was attainable on the old GTX 260?

Check these out:

http://www.evga.com/products/prodlist.asp?family=GeForce+GTX+200+Series+Family


----------



## insider (Sep 17, 2008)

That's a rip off price, you could buy a 4850 X2 soon for around $400 that will easily thrash that card.


----------



## newtekie1 (Sep 17, 2008)

alexp999 said:


> Just had an email through from EVGA about their GTX 260 with 216 cores. Just waiting for the official press release to put it in the news. Seems that they have not clocked it as high. I see w1zz got this one to 715 core. Is that as high as was attainable on the old GTX 260?



715MHz was the max W1z could get the old 260GTX and 280GTX.  It seems the GT200 core just maxes out at 715MHz.


----------



## DarkMatter (Sep 17, 2008)

newtekie1 said:


> 715MHz was the max W1z could get the old 260GTX and 280GTX.  It seems the GT200 core just maxes out at 715MHz.



IIRC W1zzard put the GTX260 to 726MHz.

EDIT: Nope. It's 715MHz. Both 260 and 280. Are they limited to 715MHz?

I'm sure I've seen a GTX260 @ 726MHz somewhere, so it might be something only Zotac has done?


----------



## yogurt_21 (Sep 18, 2008)

Edito said:


> ppl, I'm a bit confused: the 4870 has 800 shader processors, and now the GTX260 has only 216 shader processors, yet it's better than the HD4870??? How come? Isn't that somehow strange???



The 800 is divided up into 160 clusters of 5 shaders, 1 complex and 4 simple. So there's 160 complex and 640 simple shaders on the 4800 series; this means that in complex vs. complex it's more like 160 vs. 216.
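Put into numbers, the cluster math described above works out like this (a quick illustrative sketch):

```python
# RV770 (HD 4800 series) shader layout as described above:
# 160 clusters, each with 1 complex + 4 simple ALUs.
clusters = 160
complex_per_cluster, simple_per_cluster = 1, 4

complex_total = clusters * complex_per_cluster   # 160
simple_total = clusters * simple_per_cluster     # 640
marketing_total = complex_total + simple_total   # the "800 SP" figure

print(complex_total, simple_total, marketing_total)  # 160 640 800
```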


----------



## zithe (Sep 18, 2008)

johnnyfiive said:


> NVidia are the sneakiest bastards ever! That is an AWESOME CARD for the price! Wow... totally would get that over a 4870 HANDS DOWN.



So, the 4870 just can't be overclocked?

Any reason why you just blew off the 4k series?

Edit: Read the date.


----------



## Edito (Sep 18, 2008)

yogurt_21 said:


> The 800 is divided up into 160 clusters of 5 shaders, 1 complex and 4 simple. So there's 160 complex and 640 simple shaders on the 4800 series; this means that in complex vs. complex it's more like 160 vs. 216.



Thanks for the explanation, I'm clear now with your help and Ekklesis's...

but they will need to improve even more because of the 4850 X2; maybe they will need to pull out a 260 GX2 or 280 GX2...


----------



## newtekie1 (Sep 18, 2008)

Edito said:


> but they will need to improve even more because of the 4850 X2; maybe they will need to pull out a 260 GX2 or 280 GX2...



Not really, nVidia already has a card that performs at about where the HD4850x2 will perform, the generation old 9800GX2.  And the 9800GX2 already goes for under the expected $300 price tag of the HD4850x2.


----------



## zithe (Sep 18, 2008)

newtekie1 said:


> Not really, nVidia already has a card that performs at about where the HD4850x2 will perform, the generation old 9800GX2.  And the 9800GX2 already goes for under the expected $300 price tag of the HD4850x2.



...So 4850 crossfire is the same speed as a 9800gx2? M'kay...


----------



## Poisonsnak (Sep 18, 2008)

*performance / dollar and performance / watt scales*

I must be missing something since he did this in the 4850 review from the 17th too, but the scales on the graphs for the Performance / Dollar and Performance / Watt seem funny to me.

Usually the card under review is at 100% and the other cards fall in at their relative performance, but in this review W1zzard put the card at 74% under performance per watt and at 28% under performance per dollar.  I guess you could just go through and multiply all the values by (100/74) to change the scale to reference 100%, but since it's on the other review too I thought there might be a reason for it (and that's why I'm asking).
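For reference, the rescaling described here is just a renormalisation; a minimal sketch, with hypothetical card names and chart values made up for illustration:

```python
def rescale(values: dict[str, float], reference: str) -> dict[str, float]:
    """Scale every chart value so the reviewed card lands at 100% (up to float rounding)."""
    factor = 100 / values[reference]
    return {card: value * factor for card, value in values.items()}

# Hypothetical perf-per-watt chart with the reviewed card stuck at 74%:
chart = {"GTX 260 AMP2": 74.0, "GTX 280": 70.0, "HD 4870": 80.0}
print(rescale(chart, "GTX 260 AMP2"))
```

Every other card keeps its relative position; only the scale's reference point changes.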


----------



## newtekie1 (Sep 18, 2008)

zithe said:


> ...So 4850 crossfire is the same speed as a 9800gx2? M'kay...



Read here.  The two are very close in performance.


----------



## W1zzard (Sep 18, 2008)

Poisonsnak said:


> I must be missing something since he did this in the 4850 review from the 17th too, but the scales on the graphs for the Performance / Dollar and Performance / Watt seem funny to me.
> 
> Usually the card under review is at 100% and the other cards fall in at their relative performance but in this review W1zzard put the card at 74% under performance per watt and at 28% under performance per dollar.  I guess you could just go through and multiply all the values by (100/74) to change the scale to reference 100%, but since it's on the other review too I thought there might be a reason for it (and that's why I'm asking)



no reason. its a bug, i forgot to adjust the scaling factor


----------



## Poisonsnak (Sep 18, 2008)

W1zzard said:


> no reason. its a bug, i forgot to adjust the scaling factor



Oh ok, no biggie, just was curious.  With all the time it takes to do a review like that I can imagine you miss the odd thing here or there.  (seriously - I can't believe you take the time to benchmark 15 games + 3dmark + all the other stuff - it's awesome)


----------



## Kwod (Sep 19, 2008)

newtekie1 said:


> Not really, nVidia already has a card that performs at about where the HD4850x2 will perform, the generation old 9800GX2.  And the 9800GX2 already goes for under the expected $300 price tag of the HD4850x2.



The 9800GX2 collapses at 2560x1600 in some games, but the 4870 X2 2GB and the 4850 X2 2GB will keep pumping out the numbers. Plus ATI has an HDMI dongle to connect to an HDTV that's superior to old-gen NVIDIA's.


----------



## Tatty_One (Sep 19, 2008)

alexp999 said:


> Just had an email through from EVGA about their GTX 260 with 216 cores. Just waiting for the official press release to put it in the news. Seems that they have not clocked it as high. I see w1zz has got this one to 715 core. Is that as high as was atainable on the old gtx 260?
> 
> Check these out:
> 
> http://www.evga.com/products/prodlist.asp?family=GeForce+GTX+200+Series+Family



715 I think on the old one..... would that mean that the old 260 would actually be faster if it clocked, say, 20MHz+ higher on the core than the new +??


----------



## Tatty_One (Sep 19, 2008)

insider said:


> That's a rip off price, you could buy a 4850 X2 soon for around $400 that will easily thrash that card.



How soon at $400?  I didn't see any notifications of BIG price reductions?


----------



## Kwod (Sep 19, 2008)

Is the 4850x2 on sale in the US?


----------



## W1zzard (Sep 19, 2008)

Kwod said:


> Is the 4850x2 on sale in the US?



it has only been paper-launched so far. but i hear "very soon" regarding market availability


----------



## newtekie1 (Sep 19, 2008)

Kwod said:


> The 9800GX2 collapses at 2560x1600 in some games, but the 4870 X2 2GB and the 4850 X2 2GB will keep pumping out the numbers. Plus ATI has an HDMI dongle to connect to an HDTV that's superior to old-gen NVIDIA's.



I don't think the people spending $1,300+ on a monitor are going to be worried about getting anything less than high-end graphics cards, so the HD4850x2 or 9800GX2 isn't going to be in their consideration. Your argument doesn't really work in a practical, real-world situation.


----------



## Kwod (Sep 20, 2008)

newtekie1 said:


> I don't think the people spending $1,300+ on a monitor are going to be worried about getting anything less than high-end graphics cards, so the HD4850x2 or 9800GX2 isn't going to be in their consideration. Your argument doesn't really work in a practical, real-world situation.



Newer games will put all 512MB cards under pressure at 1920x1200, so I think the GX2 is old news, and it has an inferior HDTV solution.


----------



## newtekie1 (Sep 20, 2008)

Kwod said:


> Newer games will put all 512MB cards under pressure at 1920x1200, so I think the GX2 is old news, and it has an inferior HDTV solution.



I highly doubt we will see any games in the lifespan of either card that will be limited by 512MB at 1920x1200.  And there is nothing wrong with nVidia's HDTV solution: it outputs in HDMI, the 9800GX2 even has the connector on it natively, and it outputs whatever sound configuration your sound card supports, be it 5.1 or 7.1.  I'd actually like you to provide some proof that the HD4850x2 pulls ahead of the 9800GX2 at 2560x1600, as I am not just going to take your word (and I don't have a 2560x1600 monitor to test myself).


----------



## Kwod (Sep 20, 2008)

newtekie1 said:


> I highly doubt we will see any games in the lifespan of either card that will be limited by 512MB at 1920x1200.  And there is nothing wrong with nVidia's HDTV solution: it outputs in HDMI, the 9800GX2 even has the connector on it natively, and it outputs whatever sound configuration your sound card supports, be it 5.1 or 7.1.  I'd actually like you to provide some proof that the HD4850x2 pulls ahead of the 9800GX2 at 2560x1600, as I am not just going to take your word (and I don't have a 2560x1600 monitor to test myself).



You don't need to take my word, champ...... you can look at some GPU shoot-outs just as I did, and if you pay attention, you might notice that some (if not all) of the 512MB cards choke at 1600p in some games.


----------



## newtekie1 (Sep 20, 2008)

I simply ask you to point me to said shootouts. You made the statement, now prove it; I'm too lazy to do your research to prove your point for you.  You have to do a little work here.


----------



## Kwod (Sep 20, 2008)

newtekie1 said:


> You made the statement, now prove it; I'm too lazy to do your research to prove your point for you.  You have to do a little work here.



I have nothing to prove to you.....put me on ignore and problem solved.


----------



## Darkrealms (Sep 20, 2008)

Why are people comparing the last-gen 9800GX2 vs the 4870/4850 X2 2GB?  Doesn't that in itself kind of prove NVIDIA is still current?  Of course ATI's new cards should beat it.  I honestly think that NVIDIA won't put out a GTX X2; if they were going to, it should have been out a while ago now.  My thoughts.

*Kwod* if you are going to make statements it is your responsibility to back them up.  You have to expect people to ask for proof when you make a claim.  This is a Tech forum, people want to see the facts.

On topic.  There is an upgrade path for the new GTX260 through EVGA?  What does it cost?


----------



## Kwod (Sep 20, 2008)

Darkrealms said:


> *Kwod* if you are going to make statements it is your responsibility to back them up.  You have to expect people to ask for proof when you make a claim.  This is a Tech forum, people want to see the facts.



I thought it was common knowledge..... I'm the newbie after all




> Yikes. So your first question probably is: what happened with the three-card 4870 X2 + 4870 setup? The answer: with only 512MB of memory per GPU, they just couldn't handle GRID at this resolution. That's why I've excluded some other configs, as well. They just can't do this. The cards with more than 512MB of memory can, though, and the 4870 X2 is tops among them.



http://techreport.com/articles.x/15293/9

http://forums.techpowerup.com/showthread.php?t=71468

Also, see this thread for a pic of the 4870 1 gig destroying the 4870 512 in Crysis.

IMO, buying the 4850 512 isn't such a bad move as it's cheap as chips, but I don't think it's a good idea to be buying any expensive 512's anymore.


----------



## Darkrealms (Sep 20, 2008)

Kwod said:


> I thought it was common knowledge.....I''m the newbie afterall
> 
> 
> 
> ...


No worries  ; )
There are people that post and never back it up (yes, some of them are crazy), so if people don't, they tend to be ignored, sometimes even if they have valid points.  We've just had too many crazy claims : (

Honestly if I wasn't an invidiot I'd probably have a 4850 right now.  But I'm a fan, I've always admitted it.  They are both good cards (especially at the $240 price point).


----------



## Kwod (Sep 21, 2008)

Darkrealms said:


> No worries  ; )
> There are people that post and never back it up (yes some of them are crazy) ).



Oh fair go Darkrealms, I only make baseless claims and let fly with wild exaggerations 60% of the time at most


----------



## newtekie1 (Sep 21, 2008)

Darkrealms said:


> Why are people comparing last gen 9800GX2 vs 4870/4850 X2 2GB?  Doesn't that in itself kind of prove Nvidia is still current.  Of course ATI's new cards should beat it.  I honestly think that Nvidia won't put out a GTX X2, if they were going to it should have been a while ago now.  My thoughts.



Well the fact is that the G92 cards were simply amazing, and it has taken ATi this long to bring out something to compete with them.  The HD4870 and HD4850 were great cards, but the fact is that the G92 9800GTX was dead even with the HD4850.  The RV770 was hyped by a lot of people as God's gift to graphics cards, but only the very top end HD4870 could actually outperform the top end of nVidia's last generation, and even then I've seen pre-overclocked 9800GTXs getting pretty damn close to the HD4870's performance.  RV770 was a wonderful thing for both ATi and the graphics card market in general because it finally allowed ATi to actually be competitive with nVidia, something we haven't really seen since the X1950 days.  The GT200 core is just a waste, IMO; it is way too big, and way too expensive.  Personally, I think nVidia was already onto a good thing with the G92 core, and should have just improved on that instead of designing a monolithic GPU like the GT200.  There really is no reason that a simply improved G92 couldn't compete with the RV770.  I personally would have liked to see a G92 with DX10.1 support, GDDR5 support, and another set or two of shaders.

I also highly doubt we will see a GTX X2 card, especially not before the die shrink hits.  Though, I doubt we would have seen it by now either way; the 9800GX2 didn't come out until way after the original cards.  I just think the GT200 core is way too big of a beast to put two in a single card; it would be too hard to keep it cool.



Darkrealms said:


> On topic.  There is an upgrade path for the new GTX260 through EVGA?  What does it cost?



It depends on how much you paid for your previous card.  The GTX260 216SP costs $300 from eVGA's site.


----------



## Darkrealms (Sep 22, 2008)

newtekie1 said:


> Well the fact is that the G92 cards were simply amazing, and it has taken ATi this long to bring out something to compete with them.  The HD4870 and HD4850 were great cards, but the fact is that the G92 9800GTX was dead even with the HD4850.  The RV770 was hyped by a lot of people as God's gift to graphics cards, but only the very top end HD4870 could actually outperform the top end of nVidia's last generation, and even then I've seen pre-overclocked 9800GTXs getting pretty damn close to the HD4870's performance.  RV770 was a wonderful thing for both ATi and the graphics card market in general because it finally allowed ATi to actually be competitive with nVidia, something we haven't really seen since the X1950 days.  The GT200 core is just a waste, IMO; it is way too big, and way too expensive.  Personally, I think nVidia was already onto a good thing with the G92 core, and should have just improved on that instead of designing a monolithic GPU like the GT200.  There really is no reason that a simply improved G92 couldn't compete with the RV770.  I personally would have liked to see a G92 with DX10.1 support, GDDR5 support, and another set or two of shaders.
> 
> I also highly doubt we will see a GTX X2 card, especially not before the die shrink hits.  Though, I doubt we would have seen it by now either way; the 9800GX2 didn't come out until way after the original cards.  I just think the GT200 core is way too big of a beast to put two in a single card; it would be too hard to keep it cool.
> 
> ...


I hear a lot of people saying how ATI is all that and NVIDIA's offerings are crap : (  I don't see many people put it in perspective like you just did.
Thanks for the info about how EVGA does their Step-Up.  I wonder if that's before or after rebates  ; P


----------



## Kursah (Sep 22, 2008)

Step-Up is not affected by rebates; all they care about is the price you initially paid, and you can screenshot the invoice and upload it to them for verification too. I am in the midst of a step-up from a 260 192SP to a 216SP. I initially paid $299, then Newegg had a sale for about $269 a day or two later and they refunded the difference, then I sent in a $30-40 MIR (I forget lol)... but since I initially paid $299, and the new card is $299, I only pay the cost of shipping.

That probably doesn't happen for everyone, but at least with rebates, they don't hold it against ya.



EDIT: http://www.evga.com/stepup/default.asp


----------



## newtekie1 (Sep 28, 2008)

Yeah, just report it, the mods will be around eventually and delete the message and ban the account, and they can see it even if he deletes it.


----------



## Darkrealms (Sep 28, 2008)

Wouldn't be the first time I reported a post and got a mod mad at me because the OP removed it before they got a hold of it.  My post is deleted and would have been anyway.  
Thanks *erocker*.


----------



## newtekie1 (Sep 28, 2008)

Darkrealms said:


> Wouldn't be the first time I reported a post and got a mod mad at me because the OP removed it before they got a hold of it.  My post is deleted and would have been anyway.
> Thanks *erocker*.



That issue has long been fixed; in fact, W1z fixed it after my incident. Now all deleted messages are still viewable to the mods, and edited posts have a history so the mods can see the originals after someone edits them.


----------

