# NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders



## btarunr (Aug 21, 2008)

In a move that can be seen as retaliation against the HD 4870 variants that come with high-performance cores and up to 1 GB of GDDR5 memory, and as preparation to counter an upcoming Radeon HD 4850 X2, NVIDIA has decided to give the GeForce GTX 260 an upgrade, with an additional Texture Processing Cluster (TPC) enabled in the GTX 260's G200 core. The original GTX 260 graphics processor (GPU) had 8 TPCs (8 × 24 = 192 SPs); the updated core will have 9 TPCs, which amounts to an additional 24 shader processors and should increase the core's shader compute power significantly more than merely increasing frequencies would. It is unclear at this point what the resulting product will be called.

Everything else remains the same: frequencies, memory size, and memory bus width. This upgrade could take shape by this September.
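The TPC arithmetic above can be checked in a few lines of Python; the GTX 280's 10-TPC figure is included for comparison:

```python
# Worked arithmetic from the article: each Texture Processing Cluster
# (TPC) on the G200 core contains 24 shader processors (SPs).
SPS_PER_TPC = 24

original_sps = 8 * SPS_PER_TPC   # launch GTX 260: 192 SPs
updated_sps = 9 * SPS_PER_TPC    # revised GTX 260: 216 SPs
gtx280_sps = 10 * SPS_PER_TPC    # GTX 280, for comparison: 240 SPs

uplift = (updated_sps - original_sps) / original_sps
print(original_sps, updated_sps, gtx280_sps)  # 192 216 240
print(f"shader uplift: {uplift:.1%}")         # shader uplift: 12.5%
```

So the extra cluster is a 12.5% increase in shader count at unchanged clocks, and leaves the revised part one cluster short of a GTX 280.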



----------



## jbunch07 (Aug 21, 2008)

So were the shaders already there, just not activated?


----------



## btarunr (Aug 21, 2008)

jbunch07 said:


> So were the shaders already there, just not activated?



Read any video card review, specifically the page where the GPU information is laid out, like here: http://www.techpowerup.com/reviews/Sapphire/HD_4870_X2/ 

Does the transistor count row tell you something?


----------



## EastCoasthandle (Aug 21, 2008)

People who already own a 4800 series or 200 series card can pretty much max out all settings with 4xAA/16xAF, and in most cases I believe it's over 50 FPS, though it depends on their native resolution. So IMO I don't see this being a popular card. Besides, I couldn't imagine those who just purchased a 4800 series or 200 series card buying this. And I would be curious to know whether people with 260s could actually step up. If they can, I would imagine the bulk of sales would come from that, IMO.


----------



## jbunch07 (Aug 21, 2008)

btarunr said:


> Read any video card review, specifically the page where the GPU information is laid out, like here: http://www.techpowerup.com/reviews/Sapphire/HD_4870_X2/
> 
> Does the transistor count row tell you something?



I see that the GTX 280 and GTX 260 have the same transistor count.


----------



## EastCoasthandle (Aug 21, 2008)

jbunch07 said:


> I see that the GTX 280 and GTX 260 have the same transistor count.



Are you saying that the 280 and 260 are essentially the same? They disabled some features on the 280 and called it a 260, then later re-enabled some features, claiming that they added 24 shaders on the other 260?


----------



## jbunch07 (Aug 21, 2008)

EastCoasthandle said:


> Are you saying that the 280 and 260 are essentially the same? They disabled some features on the 280 and called it a 260, then later re-enabled some features, claiming that they added 24 shaders on the other 260?



That's what it looks like to me.


----------



## kyle2020 (Aug 21, 2008)

jbunch07 said:


> That's what it looks like to me.



That'll be a knife in Nvidia's back if too many people notice it.


----------



## EastCoasthandle (Aug 21, 2008)

jbunch07 said:


> That's what it looks like to me.



This will be very interesting if it turns out to be true. I can only guess that some 260 owners wouldn't like this (if they couldn't step up).


----------



## candle_86 (Aug 21, 2008)

That's what Nvidia always does, people. How do you think the 8800GTS 640 112 was made?


----------



## Sasqui (Aug 21, 2008)

kyle2020 said:


> That'll be a knife in Nvidia's back if too many people notice it.



I'd be pissed if I had paid for a 260 already!


----------



## jbunch07 (Aug 21, 2008)

EastCoasthandle said:


> This will be very interesting if it turns out to be true. I can only guess that some 260 owners wouldn't like this (if they couldn't step up).



Well, I don't think it would be the first time something like this has happened.


----------



## Darkrealms (Aug 21, 2008)

Go figure I just ordered a 260 yesterday.  Oh well.  
Thanks for the info BTA.


----------



## Kursah (Aug 21, 2008)

jbunch07 said:


> That's what it looks like to me.



Yeah, that's what I was wondering too...but how they've disabled the shaders/cores is what I've been curious about...there's no way to get 1GB of memory though, since the extra chip(s) are missing, but if a GTX260 could get the same amount of shaders as a GTX280, I wouldn't complain!

Though I'd rather see the 200b's released in 260 and 280 flavors sooner rather than later instead of a shader increase, imo...but either way some extra performance wouldn't hurt, especially if prices stay similar or decline to keep things competitive. I may have a couple step-up options from EVGA coming soon then!


----------



## kyle2020 (Aug 21, 2008)

Sasqui said:


> I'd be pissed if I had paid for a 260 already!



So would I. Seems like companies enjoy doing this sort of thing, and the faithful buyers will always buy from them. Heads out of the sand, people! Brand loyalty is out the window!


----------



## EastCoasthandle (Aug 21, 2008)

jbunch07 said:


> Well, I don't think it would be the first time something like this has happened.



True, but during the G80 era there was no real competition, so it flew under the radar as an acceptable practice.




kyle2020 said:


> So would I. Seems like companies enjoy doing this sort of thing, and the faithful buyers will always buy from them. Heads out of the sand, people! Brand loyalty is out the window!



Well, that's the whole point of being indoctrinated...I think...


----------



## candle_86 (Aug 21, 2008)

You buy on release, you get burned. How do you think 8800GTS users felt when the 8800GTS with 112 shaders instead of 96 popped up?


----------



## jbunch07 (Aug 21, 2008)

Kursah said:


> Yeah, that's what I was wondering too...but how they've disabled the shaders/cores is what I've been curious about...there's no way to get 1GB of memory though, since the extra chip(s) are missing, but if a GTX260 could get the same amount of shaders as a GTX280, I wouldn't complain!
> 
> Though I'd rather see the 200b's released in 260 and 280 flavors sooner rather than later instead of a shader increase, imo...but either way some extra performance wouldn't hurt, especially if prices stay similar or decline to keep things competitive. I may have a couple step-up options from EVGA coming soon then!



I just wanna know what's involved in enabling those extra shaders, and whether someone could do it themselves?


----------



## Kursah (Aug 21, 2008)

jbunch07 said:


> I just wanna know what's involved in enabling those extra shaders, and whether someone could do it themselves?



We'll find out when they show up! Whether it's a simple BIOS tweak, a change in the GTX260 fab process, or whatever the deal is...too bad it's not just a driver tweak! I suppose it could be...but doubtful.

I wonder if they'll run the GTX260's at 1.18v to maintain stability with more shaders, or if they can keep them at 1.12v. My GTX runs nice and cool overall...even OC'd I hit 65C load at 1.12v; I think the highest I hit at 1.18v was around 73C.


----------



## jbunch07 (Aug 21, 2008)

Kursah said:


> We'll find out when they show up! Whether it's a simple BIOS tweak, a change in the GTX260 fab process, or whatever the deal is...too bad it's not just a driver tweak! I suppose it could be...but doubtful.
> 
> I wonder if they'll run the GTX260's at 1.18v to maintain stability with more shaders, or if they can keep them at 1.12v. My GTX runs nice and cool overall...even OC'd I hit 65C load at 1.12v; I think the highest I hit at 1.18v was around 73C.



Hmm, I guess we will just have to wait and see what all is involved. 
If someone on here knows the answer, please stand up.


----------



## candle_86 (Aug 21, 2008)

Prolly more complex; they were most likely laser cut.


----------



## wolf2009 (Aug 21, 2008)

EastCoasthandle said:


> Are you saying that the 280 and 260 are essentially the same? They disabled some features on the 280 and called it a 260, then later re-enabled some features, claiming that they added 24 shaders on the other 260?



I'm surprised how many people don't know this. I read somewhere that the chips Nvidia manufactures that have defects get put into cards like the G80 8800GTS and GTX260; the defective shaders, or something like that, are "disabled". The perfect chips go into the GTX280 and 8800GTX. This saves Nvidia money.


----------



## newtekie1 (Aug 21, 2008)

Not really surprising, and everyone needs to realize this has been a common practice in the computer industry for decades. The processor manufacturers do it, and so do the video card manufacturers. ATi and nVidia have been cutting down cores to make lower cards for a very long time, so don't get in a big huff about it now.

Though I hope nVidia doesn't keep the GTX260 name; I would prefer GTX270 or GTX265 to keep confusion down.



wolf2009 said:


> I'm surprised how many people don't know this. I read somewhere that the chips Nvidia manufactures that have defects get put into cards like the G80 8800GTS and GTX260; the defective shaders, or something like that, are "disabled". The perfect chips go into the GTX280 and 8800GTX. This saves Nvidia money.



Yep, exactly. And the G92's that are defective get put in 8800GT's and 8800GS's (9600GSO's). ATi didn't do it with the RV670, but they did with the R600: the 2900GT was just a defective R600 with the defective shaders turned off.

Intel and AMD use similar techniques with their processors. The original E6300 and E6400 were just defective Conroe cores that had the defective parts of the L2 cache disabled. Same thing with the Celerons and Pentium E2000 series; they are just Allendale cores (from the E4000 series) with the defective parts of the L2 cache disabled. The Celeron 400 series are Allendale cores with an entire processing core disabled to give the appearance of a single-core processor.

AMD does this too: some of their single-core processors are really dual-core processors with a defective core turned off. They started doing this at the end of the Socket 939 era.



jbunch07 said:


> I just wanna know what's involved in enabling those extra shaders, and whether someone could do it themselves?



No, you can't do it yourself; nVidia (and ATi) long ago stopped this by physically breaking the connection on the die itself.
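The binning practice described in this post can be sketched in a few lines. The SKU thresholds below mirror the G200 line-up discussed in the thread, but the rule itself is a hypothetical illustration, not NVIDIA's actual test flow:

```python
# A minimal sketch of die binning: dies with a defective cluster are
# sold as the cut-down SKU rather than discarded. Thresholds are
# illustrative only (assumed, not NVIDIA's real criteria).

def bin_die(working_tpcs: int) -> str:
    """Assign a G200 die to a SKU from how many of its 10 TPCs passed testing."""
    if working_tpcs >= 10:
        return "GTX 280"           # fully functional die, 240 SPs
    if working_tpcs == 9:
        return "GTX 260 (216 SP)"  # one cluster disabled
    if working_tpcs == 8:
        return "GTX 260 (192 SP)"  # two clusters disabled
    return "scrap"                 # too many defects to salvage

print(bin_die(10))  # GTX 280
print(bin_die(9))   # GTX 260 (216 SP)
print(bin_die(7))   # scrap
```

The same scheme explains the CPU examples above: swap TPCs for L2 cache blocks or processor cores and the binning rule is identical.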


----------



## jbunch07 (Aug 21, 2008)

candle_86 said:


> Prolly more complex; they were most likely laser cut.



I got some lasers!


----------



## btarunr (Aug 21, 2008)

Not "defective", just the ones that happen to perform lower when binned, compared to what's required to make it to a GTX 280.


----------



## jbunch07 (Aug 21, 2008)

newtekie1 said:


> Not really surprising, and everyone needs to realize this has been a common practice in the computer industry for decades. The processor manufacturers do it, and so do the video card manufacturers. ATi and nVidia have been cutting down cores to make lower cards for a very long time, so don't get in a big huff about it now.
> 
> Though I hope nVidia doesn't keep the GTX260 name; I would prefer GTX270 or GTX265 to keep confusion down.
> 
> ...



That's what I figured.

Oh, and I was well aware that CPU manufacturers have been doing it for quite some time now; I guess I just didn't really think of them doing it quite as much with video cards. But it seems I was wrong.


----------



## btarunr (Aug 21, 2008)

Darkrealms said:


> Go figure I just ordered a 260 yesterday.  Oh well.
> Thanks for the info BTA.



If you can cancel the order, do it.


----------



## Megasty (Aug 21, 2008)

Haven't they learned from this silly garbage yet? GD, just put out the best card you can at the time & move on. They're basically unlocking the 260 to perform near 280 levels OOB & kicking the 280 up with a 55nm version. The sandbagging is making me sick.


----------



## EastCoasthandle (Aug 21, 2008)

wolf2009 said:


> I'm surprised how many people don't know this. I read somewhere that the chips Nvidia manufactures that have defects get put into cards like the G80 8800GTS and GTX260; the defective shaders, or something like that, are "disabled". The perfect chips go into the GTX280 and 8800GTX. This saves Nvidia money.



I'm not; most people don't get that involved. But from my understanding, I don't believe it's necessarily defective. It's just binned differently. I guess it's just examples like this that make the 200 series different from the 4800 series.

Edit:
Looks like BTA beat me to it.








Megasty said:


> Haven't they learned from this silly garbage yet? GD, just put out the best card you can at the time & move on. They're basically unlocking the 260 to perform near 280 levels OOB & kicking the 280 up with a 55nm version. The sandbagging is making me sick.



I have to admit that would be a good example of sandbagging.


----------



## newtekie1 (Aug 21, 2008)

Megasty said:


> Haven't they learned from this silly garbage yet? GD, just put out the best card you can at the time & move on. They're basically unlocking the 260 to perform near 280 levels OOB & kicking the 280 up with a 55nm version. The sandbagging is making me sick.



They did put out the best card they could at the time: the GTX280. They also put out a lower card to fill another market; that is how the video card industry works. Companies can't just put out the high-end card and nothing else, they have to fill several price points.


----------



## alexp999 (Aug 21, 2008)

Just out of interest, will these actually make a difference (other than in benches)? I'm assuming this is a hardware unlock as opposed to a BIOS unlock?


----------



## Megasty (Aug 21, 2008)

newtekie1 said:


> They did put out the best card they could at the time: the GTX280. They also put out a lower card to fill another market; that is how the video card industry works. Companies can't just put out the high-end card and nothing else, they have to fill several price points.



But in doing that, they're just telling us that the 260 is just an underpowered 280, even though most of us already knew that. However, most of the folks that bought a 260 didn't know & also wouldn't be pleased at all to find out. I'm not saying they should only put out HE cards; that's stupid. I'm saying they shouldn't tinker with the mid-high range cards. That leads to market confusion & consumer dissatisfaction. It wouldn't be a good idea to leave it with the 260 name, just as you said. They just might end up calling it a plus or something, although that would probably be just as bad.


----------



## newconroer (Aug 21, 2008)

Kursah said:


> Yeah, that's what I was wondering too...but how they've disabled the shaders/cores is what I've been curious about...there's no way to get 1GB of memory though, since the extra chip(s) are missing, but if a GTX260 could get the same amount of shaders as a GTX280, I wouldn't complain!
> 
> Though I'd rather see the 200b's released in 260 and 280 flavors sooner rather than later instead of a shader increase, imo...but either way some extra performance wouldn't hurt, especially if prices stay similar or decline to keep things competitive. I may have a couple step-up options from EVGA coming soon then!



Ya, but with this whole fiasco about the 200b being released in Q4 instead of Aug/September, things just don't add up.

Why release a + version of a 260 and not a 280? I have a feeling that it goes like this:

260+ announced
280+ announced not long after

Neither will be 55nm, but the former gets the extra shaders, and the latter gets a shader clock increase.

'200b' will most likely end up as an entry model to the GT300 series, or THE GT300 itself, complete with GDDR5 and 55nm and other stuff.

I was thinking earlier that unless the GT200b was going to be better than, or at least appropriately comparable to, the X2, showing it at Nvision seems like a whole 'to-do' for nothing; possibly almost embarrassing.

Releasing rehashed/updated versions would seem more logical, and then they can get back to work on the next cards.

I'll give it one more month, and if nothing pops up, I'll move over to ATi in one of my machines.


----------



## AddSub (Aug 21, 2008)

Question: Would I be able to SLI a plain GTX 260 with this one (GTX 265?)


----------



## newconroer (Aug 21, 2008)

Too early to tell, Add.


----------



## Selene (Aug 21, 2008)

I would think you could use a GTX260 and a GTX265, but you'd probably have to run the GTX260 in the first slot.
But who knows; this type of stuff does not bother me. I get the card that I need for the games I play.
If new games come out and I can't play them at max settings for my screen res, I upgrade so that I can.
I normally upgrade every 1 to 1.5 years to kinda keep up, but almost never buy anything when it first comes out.
Now, I did jump on the 8800GT bandwagon: got two of them, paid $289.99 each, and have yet to have them not give me what I need to play my games.

But my next upgrade will be the GTX300. I have a friend who I'll not name, but he has told me that the GTX300 will be DX10.1, and this is what I'm wanting for SC2 and Diablo 3.


----------



## DarkMatter (Aug 21, 2008)

btarunr said:


> Not "defective", just the ones that happen to perform lower when binned, compared to what's required to make it to a GTX 280.



And not exactly that either. They disable some of the clusters for redundancy, which means that IN CASE one or two are defective or not as fast, it doesn't matter. Just like what Sony did with the Cell processor. Of course defective GTX280 chips (chips first selected to be GTX280, prior to any testing) are also used as GTX260s. What I mean is that not only do defective GTX280s become GTX260s; many chips are labeled as GTX260 without ever checking whether they could be GTX280s. They sell a lot more cheap cards than expensive cards, after all. It's common business. How do you guys think so many people have been able to flash lesser cards into their big brothers otherwise?

This move must mean that yields have improved a lot (the continuous price drops already suggested this too); otherwise it would be nearly impossible for them to do this, IMO.
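The redundancy/yield argument above can be illustrated with a small Monte Carlo sketch. The 5% per-cluster defect rate is invented purely for illustration, but it shows why keeping spare clusters disabled makes so many more dies sellable:

```python
import random

# Hedged illustration of cluster redundancy: assume each of the 10
# TPCs on a die fails independently with a made-up 5% probability,
# then see what fraction of dies is sellable at each cut-down level.
random.seed(0)
DEFECT_RATE = 0.05
TRIALS = 100_000

counts = [sum(random.random() > DEFECT_RATE for _ in range(10))
          for _ in range(TRIALS)]

share_10 = sum(c == 10 for c in counts) / TRIALS  # sellable fully enabled
share_9 = sum(c >= 9 for c in counts) / TRIALS    # sellable with 1 spare cluster
share_8 = sum(c >= 8 for c in counts) / TRIALS    # sellable with 2 spare clusters

print(f"all 10 clusters good: {share_10:.1%}")
print(f"at least 9 good:      {share_9:.1%}")
print(f"at least 8 good:      {share_8:.1%}")
```

Under these assumed numbers only about 60% of dies are fully functional, while over 90% have at least 9 good clusters, which is exactly why a 9-TPC part is much cheaper to supply than a 10-TPC one, and why improving yields would free NVIDIA to ship more enabled clusters.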


----------



## lemonadesoda (Aug 21, 2008)

What on earth is all the fuss about? ATI has been doing it for ages, e.g. X800XT vs. X800Pro, etc. So has Intel with their clock multipliers. Exactly the same chip/tranny count, just locked to a lower performance level to:

1./ Provide a different price/performance offering
2./ Have a lower-end product that makes use of dies that didn't pass the highest quality testing, e.g. shaders locked out due to failures.


----------



## newtekie1 (Aug 21, 2008)

Megasty said:


> But in doing that, they're just telling us that the 260 is just an underpowered 280, even though most of us already knew that. However, most of the folks that bought a 260 didn't know & also wouldn't be pleased at all to find out. I'm not saying they should only put out HE cards; that's stupid. I'm saying they shouldn't tinker with the mid-high range cards. That leads to market confusion & consumer dissatisfaction. It wouldn't be a good idea to leave it with the 260 name, just as you said. They just might end up calling it a plus or something, although that would probably be just as bad.



We've already known the 260 is just an underpowered 280; if you read any review of the card, you would already know that. If you do just the slightest bit of research before buying, which everyone should, then you already knew that the 260 was just a cut-down 280. If you didn't know that before buying it, then you deserve to get "screwed" because you didn't do your research. It isn't like nVidia tried to hide the fact in any way; they made it pretty obvious, from before the cards were even released, that both would be using the same core. Again, this isn't anything new; it is a practice that has been used for decades in the computer industry, and the video card companies (both ATi and nVidia) have been doing it since at least 2002, and probably before then, I just can't remember that far back.


----------



## candle_86 (Aug 21, 2008)

Agreed; you didn't hear 8800GT users whine about the 8800GTS 512, or 8800GTS 512 buyers whine about the 9800GTX+.


----------



## DarkMatter (Aug 21, 2008)

newtekie1 said:


> We've already known the 260 is just an underpowered 280; if you read any review of the card, you would already know that. If you do just the slightest bit of research before buying, which everyone should, then you already knew that the 260 was just a cut-down 280. If you didn't know that before buying it, then you deserve to get "screwed" because you didn't do your research. It isn't like nVidia tried to hide the fact in any way; they made it pretty obvious, from before the cards were even released, that both would be using the same core. Again, this isn't anything new; it is a practice that has been used for decades in the computer industry, and the video card companies (both ATi and nVidia) have been doing it since at least 2002, and probably before then, I just can't remember that far back.



I can't remember a time when this didn't happen. Well, I do: it was when graphics chips were "single core" and there were no clusters to disable. 

Anyway, the comment itself was stupid, Megasty. What is the HD4850 besides an underpowered HD4870? It's the same practice, but instead of disabling cores they lower the clock below what most of the chips could achieve, to ensure that most chips will function.


----------



## btarunr (Aug 21, 2008)

Considering this revised GTX 260 is just 24 SP's away from GTX 280, wouldn't they rather call this the GTX 2*7*0 ? ATI seems to be having luck with the number 7 these days 

jk


----------



## jbunch07 (Aug 21, 2008)

btarunr said:


> Considering this revised GTX 260 is just 24 SP's away from GTX 280, wouldn't they rather call this the GTX 2*7*0 ? ATI seems to be having luck with the number 7 these days
> 
> jk



So are they going to rename or rebadge it, or just leave it a 260?


----------



## Megasty (Aug 21, 2008)

newtekie1 said:


> We've already known the 260 is just an underpowered 280; if you read any review of the card, you would already know that. If you do just the slightest bit of research before buying, which everyone should, then you already knew that the 260 was just a cut-down 280. *If you didn't know that before buying it, then you deserve to get "screwed" because you didn't do your research.* It isn't like nVidia tried to hide the fact in any way; they made it pretty obvious, from before the cards were even released, that both would be using the same core. Again, this isn't anything new; it is a practice that has been used for decades in the computer industry, and the video card companies (both ATi and nVidia) have been doing it since at least 2002, and probably before then, I just can't remember that far back.



I have to completely agree with you there. It's kinda sad that the majority of 'enthusiasts' just buy cards for the name; they just keep the buying spree within the range of what they can afford. I'm just saying that it's wrong to screw around with already established cards. The outcome only ends up being a few more fps than the originals anyway. Wasting manpower on manipulating 'old' cards just makes the 'new' models suffer. It hinders progression overall. It took NV 2 years to release a single GPU that destroyed the 8800GTX. Most of that was due to little competition. But constantly releasing cards that perform within 5-10% of later versions does nothing for us.


----------



## PVTCaboose1337 (Aug 21, 2008)

Too bad for all GTX 260 Owners...  Good thing I got my good ole 4850!


----------



## DarkMatter (Aug 21, 2008)

btarunr said:


> Considering this revised GTX 260 is just 24 SP's away from GTX 280, wouldn't they rather call this the GTX 2*7*0 ? ATI seems to be having luck with the number 7 these days
> 
> jk



Haha. But they are having even more luck with the 5, aren't they? HD48*5*0. _Yeah, I know you probably meant because of RV*77*0 too._

But I agree with most of you; they should use a different name. The article doesn't say it is going to be named GTX260 anyway. Nvidia could replace the GTX260 with another card with a different name, and you could still present the news the same way they did.

Anyway, I think this new card is what Nvidia wanted the GTX260 to be, but due to low yields they couldn't do it. One cluster disabled, just like with the 8800GT. That's what I think.


----------



## Megasty (Aug 21, 2008)

DarkMatter said:
			
		

> Anyway, the comment itself was stupid, Megasty. What is the HD4850 besides an underpowered HD4870? It's the same practice, but instead of disabling cores they lower the clock below what most of the chips could achieve, to ensure that most chips will function.



You don't seem to understand what I was saying. All high-end cards have lower counterparts. I'm just saying that it's dumb to constantly fumble around with cards which are *ALREADY* out, especially when they only end up slightly faster than the prior versions; it's not a true step up from anything. Who is going to buy a 260+, 265 or whatever, when the 260 is much cheaper and only gets a few fps less than the _newer_ model... It's just like the 9800GTX & GTX+: the "+" costs $50-70 more & is only 5-10% better.


----------



## newtekie1 (Aug 21, 2008)

Megasty said:


> I have to completely agree with you there. It's kinda sad that the majority of 'enthusiasts' just buy cards for the name; they just keep the buying spree within the range of what they can afford. I'm just saying that it's wrong to screw around with already established cards. The outcome only ends up being a few more fps than the originals anyway. Wasting manpower on manipulating 'old' cards just makes the 'new' models suffer. It hinders progression overall. It took NV 2 years to release a single GPU that destroyed the 8800GTX. Most of that was due to little competition. But constantly releasing cards that perform within 5-10% of later versions does nothing for us.



That is pretty much how the industry works though. They didn't get a card out that could outperform the 8800GTX in graphical horsepower; however, it did outperform it in other aspects. The 9800GTX might not have really performed any better than the 8800GTX, and in some cases it performed worse. However, nVidia made huge gains in power consumption and heat output while keeping the same performance level, not to mention the increased overclocking headroom and improved price. ATi did the same thing with the HD3800 series over the HD2900 series. And before that, nVidia did it with the 7900 series over the 7800 series, and ATi did it with the x1950 series over the x1900 series.

It is a process: every few years they release something that makes a huge leap in performance. But it usually puts out an insane amount of heat, sucks up an insane amount of power, and costs a fortune. Then they work on lowering the heat output and the power consumption, and release a product that performs similarly in games but is overall better. This is the reason I always skip the first-generation cards. It is the reason I went with G92-based cards and never bought a G80 card. It is the reason I went with an HD3850 and never bothered with the HD2900's. And it is the reason I have 9800GTX's as my primary cards right now and won't move on to the GTX280's; I'll wait to move onto that generation once they have worked on them.


----------



## DarkMatter (Aug 21, 2008)

Megasty said:


> You don't seem to understand what I was saying. All high-end cards have lower counterparts. I'm just saying that it's dumb to constantly fumble around with cards which are *ALREADY* out, especially when they only end up slightly faster than the prior versions; it's not a true step up from anything. Who is going to buy a 260+, 265 or whatever, when the 260 is much cheaper and only gets a few fps less than the _newer_ model... It's just like the 9800GTX & GTX+: the "+" costs $50-70 more & is only 5-10% better.



The GTX+ costs more because retailers sell them for more, not because Nvidia wants it that way.


----------



## btarunr (Aug 21, 2008)

DarkMatter said:


> But I agree with most of you; they should use a different name. The article doesn't say it is going to be named GTX260 anyway. Nvidia could replace the GTX260 with another card with a different name, and you could still present the news the same way they did.



"One more TPC! NVIDIA will offer an upgraded GTX 260 in mid-September" is the article's title.


----------



## Megasty (Aug 21, 2008)

DarkMatter said:


> The GTX+ costs more because retailers sell them for more, not because Nvidia wants it that way.



You're forgetting about UMAP


----------



## mdm-adph (Aug 21, 2008)

newtekie1 said:


> It is a process: every few years they release something that makes a huge leap in performance. But it usually puts out an insane amount of heat, sucks up an insane amount of power, and costs a fortune.



That's a very poignant quote, and one that perfectly describes the relationship of the G200 to the G92, and the R600 to the X1950 series.  

(_Especially_ the R600.)


----------



## DarkMatter (Aug 21, 2008)

btarunr said:


> "One more TPC! NVIDIA will offer an upgraded GTX 260 in mid-September" is the article's title.



The 8800GS and 9600GT were announced as cut-down 8800GTs on many sites. Does that mean they are? Well, the GS in some way is, with one TPC cluster and one ROP cluster disabled, but why not a cut-down GTS? You know what I mean?



Megasty said:


> You're forgetting about UMAP



Yet again, that lowest price is not dictated by Nvidia. UMAP only forces them to advertise a minimum price, not the price they actually want.


----------



## AddSub (Aug 21, 2008)

Well, either way I think I can work it out. If they can't be SLI'd (GTX 260 and GTX 265/270), I can always sell my single GTX 260 and get two of those new ones. Make a profit even, since I got my GTX 260 for only $174, and I'm sure I can eBay it off for more than that.


----------



## Megasty (Aug 21, 2008)

Most e-tailers are like that, but the ones that get the most business show the UMAP price along with rebates & junk. One example is the egg: on the egg, the 9800GTX+ is overall cheaper than the regular one. But the egg doesn't care about UMAP, do they? Everyone else is basically ruled by UMAP.


----------



## btarunr (Aug 21, 2008)

DarkMatter said:


> The 8800GS and 9600GT were announced as cut-down 8800GTs on many sites. Does that mean they are? Well, the GS in some way is, with one TPC cluster and one ROP cluster disabled, but why not a cut-down GTS? You know what I mean?



With the same frequencies / memory, etc?

"Everything else remains the same: frequencies, memory size, and memory bus width. This upgrade could take shape by this September."


----------



## DarkMatter (Aug 21, 2008)

btarunr said:


> With the same frequencies / memory, etc?
> 
> "Everything else remains the same: frequencies, memory size, and memory bus width. This upgrade could take shape by this September."



You didn't get what I mean, but it's my fault, as I wasn't clear. I mean that one thing is what Nvidia will do, and another thing is what a site says. They can sometimes match. Sometimes is the key word. 

They are replacing the GTX260 with the new one, which means the regular GTX260 will not continue selling (except the ones already in stores, of course). They could name it GTX260, and they probably will, but they could just as well do something different. We just don't know. It would be something similar to the FX5600 and FX5700, IIRC.


----------



## Megasty (Aug 21, 2008)

DarkMatter said:


> You didn't get what I mean, but it's my fault, as I wasn't clear. I mean that one thing is what Nvidia will do, and another thing is what a site says. They can sometimes match. Sometimes is the key word.
> 
> They are replacing the GTX260 with the new one, which means the regular GTX260 will not continue selling (except the ones already in stores, of course). They could name it GTX260, and they probably will, but they could just as well do something different. We just don't know. It would be something similar to the FX5600 and FX5700, IIRC.



If they do that then it's great... but are they going to charge the same for it, like when ATI swapped the failing X1800s for the X1900s, or will it be a new card altogether? Completely replacing the 260 with this one would work if it doesn't carry a price premium. But I can see all kinds of confusion occurring if the regular 260 is available alongside this.


----------



## btarunr (Aug 21, 2008)

Ok, I made an edit.


----------



## DarkMatter (Aug 21, 2008)

Megasty said:


> If they do that then it's great... but are they going to charge the same for it, like when ATI swapped the failing X1800s for the X1900s, or will it be a new card altogether? Completely replacing the 260 with this one would work if it doesn't carry a price premium. But I can see all kinds of confusion occurring if the regular 260 is available alongside this.



The X1900's price was significantly higher, at least here. But again, that was not ATI's fault.

Confusion will only occur if the card has the same name. We don't know the name; we just know what a confusing article said on one site. As I said before, I remember very well how many sites (I think Expreview was one of them) presented the 9600GT as a cut-down version of the G92, because it had many things in common. It's very common to do such things. It's like:

-Hey dude, I bought a sports car.
-Which one?
-The McLaren F1.
-I don't know it.
-It's like a Saleen S7, but with...

But that doesn't mean it is literally identical. This new card will be like the GTX260, but with one more cluster. So semantically it does make sense to present it as a GTX260 with one more cluster (or a GTX280 with one less), instead of as "a new GT200-based chip with 9 TPC clusters, x ROP partitions, xxx MHz, etc." And the same chips that are currently selected as GTX260s will be selected as the "GTX260+" (I suppose both chips won't coexist), but that doesn't mean they will carry the same name.

Again, we don't really know anything. I replied because I think many people were drawing conclusions from news that, well, is far from conclusive.


----------



## Bull Dog (Aug 21, 2008)

Wonderful, just what we need.

Either we get another SKU (the better of the two options), or, worse, we get one muddled SKU with two different performance levels. Brilliant.


----------



## Solaris17 (Aug 21, 2008)

I know of a way to see if it's a BIOS tweak... any brave 260 soul out there want to flash it with a 280 BIOS?


----------



## trt740 (Aug 21, 2008)

Wow, this will perform almost exactly like a GTX 280 when OC'd.


----------



## candle_86 (Aug 21, 2008)

Not quite, but close, yes. About as close as the 112-shader 8800GTS 640 got to the 8800GTX.


----------



## trt740 (Aug 21, 2008)

candle_86 said:


> Not quite, but close, yes. About as close as the 112-shader 8800GTS 640 got to the 8800GTX.



No, the current GTX 260, when overclocked, already almost matches stock GTX 280 performance. Add these shaders, and I bet it's gonna be 55 nm, making it overclock insanely; you're gonna have GTX 280 performance.


----------



## alexp999 (Aug 21, 2008)

So, anyone tried a 280 BIOS on a 260 like Solaris said?

I'm tempted, but I don't want to brick it unless I know I can blind-flash it or use another gfx card to flash it back.


----------



## candle_86 (Aug 21, 2008)

Well, I flashed an 8800GS with an 8800GT BIOS. It was screwy in Windows for sure (4 ROPs, 64 shaders, etc.), but I flashed it back and everything went back to normal.


----------



## alexp999 (Aug 21, 2008)

candle_86 said:


> Well, I flashed an 8800GS with an 8800GT BIOS. It was screwy in Windows for sure (4 ROPs, 64 shaders, etc.), but I flashed it back and everything went back to normal.



Do you reckon it would work if you know the 260 can do 280 clock/shader/mem speeds?


----------



## candle_86 (Aug 21, 2008)

Well, to be on the safe side, I'd modify that BIOS with stock 260 clocks before flashing it.


----------



## alexp999 (Aug 22, 2008)

candle_86 said:


> Well, to be on the safe side, I'd modify that BIOS with stock 260 clocks before flashing it.



Just thinking about it, it probably won't work because the memory config is different.

It would still be interesting to try the BIOS that comes out for this one; maybe it will even be supported by a BIOS editor...?


----------



## candle_86 (Aug 22, 2008)

It worked on my 8800GS; it didn't make it a 256-bit bus since I only had 6 chips. The Nvidia BIOS detects memory on boot-up, unlike the ATI BIOS, where bus width and size are hard-coded.
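The bus-width point follows from board wiring rather than the BIOS: each GDDR3 chip supplies a 32-bit channel, so the width the card "detects" is just populated chips × 32. A minimal sketch of that relationship (the chip counts below are the cards' real configurations; the helper name is made up for illustration):

```python
# Simplified model: each GDDR3 chip contributes a 32-bit memory channel,
# so total bus width scales with the number of populated chips,
# regardless of what BIOS the card is flashed with.

def bus_width_bits(chip_count, bits_per_chip=32):
    """Memory bus width implied by the number of populated DRAM chips."""
    return chip_count * bits_per_chip

print(bus_width_bits(6))    # 8800GS:  192-bit
print(bus_width_bits(8))    # 8800GT:  256-bit
print(bus_width_bits(14))   # GTX 260: 448-bit
print(bus_width_bits(16))   # GTX 280: 512-bit
```

This is also why a 280 BIOS can't conjure up a wider bus on a 260 board: the two missing chips simply aren't there to detect.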


----------



## alexp999 (Aug 22, 2008)

candle_86 said:


> It worked on my 8800GS; it didn't make it a 256-bit bus since I only had 6 chips. The Nvidia BIOS detects memory on boot-up, unlike the ATI BIOS, where bus width and size are hard-coded.



Ok, fair enough, that actually seems sensible (so would that mean if one chip busted, the card could still carry on functioning?).

Still, I think I'll leave it to someone more experienced in gfx card flashing.


----------



## candle_86 (Aug 22, 2008)

Yes, actually. I had a 6600GT lose one of its RAM modules, and it kept working.


----------



## trt740 (Aug 22, 2008)

Flashing it won't work, as you have said.


----------



## Megasty (Aug 22, 2008)

I wouldn't even attempt that sort of flash anyway. The card is fast enough already... yeah, I know I keep surprising myself with the stuff I say at times. The 280 BIOS wouldn't just be a massive OC for the 260, because of the points mentioned above - hardware and structural limitations. The 280 BIOS would just fry it instantly.


----------



## Kursah (Aug 22, 2008)

I've considered flashing my 260 with a 280 BIOS, but with the missing memory chips and the most likely hardware-disabled GPU sections... even if it did work, you'd just have an overvolted and overclocked GTX 260. But maybe someone out there will try it and find something more positive? Won't be me! I'll flash my 260, but only with a 260 BIOS!


----------



## flclisgreat (Aug 22, 2008)

newtekie1 said:


> Not really surprising, and everyone needs to realize this has been a common practice in the computer industry for decades.  The processor manufacturers do it, and so do the video card manufacturers.  ATi and nVidia have been cutting down cores to make lower cards for a very long time, so don't get in a big huff about it now.
> 
> Though I hope nVidia doesn't keep the GTX260 name, I would prefer GTX270 or GTX265 to keep confusion down.
> 
> ...




I thought all that was "common" knowledge? People didn't know that? I always just assumed that all the Celerons/Semprons were f'ed-up Allendales/Athlons with cache/cores disabled.


----------



## PCpraiser100 (Aug 22, 2008)

By the time GTX 280 Rev. 2 comes out, it will be too late. The HD 4870 X2 will already be priced lower than the newest revision of the GTX 200 series.


----------



## TooFast (Aug 22, 2008)

What a joke! Is this all Nvidia has??? Once the 4850 X2 hits the street, it will really be over this round.


----------



## eidairaman1 (Aug 22, 2008)

Gotta love competition. You Green Team members should be very happy that the Red Team is giving them competition.


----------



## jbunch07 (Aug 22, 2008)

eidairaman1 said:


> Gotta love competition. You Green Team members should be very happy that the Red Team is giving them competition.



Exactly! Without competition, the 8800GTX would still be their flagship card.


----------



## phanbuey (Aug 22, 2008)

AddSub said:


> Question: Would I be able to SLI a plain GTX 260 with this one (GTX 265?)



Probably not without some sort of modding... and why would you want to?? The slowest card determines the speed in SLI, so you would basically be SLI'ing two normal 260s but paying more for the fancier version.

I was gonna buy another 260 this week to SLI, but I figure I'll wait till their price goes through the floor with this announcement. If you OC a 260 to about 725 core and 1450 shaders, it can play on the same level as a stock 280; if you SLI two OC'd 260s, you're probably looking at the performance of a stock GT300 (384 shaders, etc.)... that's plenty of juice, even for Crysis at Very High.

I'm actually kind of happy this is coming out... means I get to buy a cheapo 260.
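To put back-of-the-envelope numbers on the shader side of that claim, here's a rough throughput sketch. It assumes GT200's oft-quoted theoretical 3 FLOPs per SP per clock (dual-issue MAD + MUL) and the cards' stock shader clocks; real-game performance also depends on ROPs, bandwidth, and texturing, so treat these as ballpark figures only:

```python
# Theoretical peak shader throughput for GT200-based cards.
# Assumes 3 FLOPs per shader processor per clock (MAD + MUL dual issue),
# NVIDIA's theoretical peak figure for this architecture.

def shader_gflops(sp_count, shader_mhz, flops_per_clock=3):
    """Theoretical peak shader throughput in GFLOPS."""
    return sp_count * shader_mhz * flops_per_clock / 1000.0

print(f"GTX 260 stock (192 SP @ 1242): {shader_gflops(192, 1242):7.1f} GFLOPS")
print(f"GTX 260 OC    (192 SP @ 1450): {shader_gflops(192, 1450):7.1f} GFLOPS")
print(f"GTX 280 stock (240 SP @ 1296): {shader_gflops(240, 1296):7.1f} GFLOPS")
print(f"Rumored 216-SP part  (@ 1242): {shader_gflops(216, 1242):7.1f} GFLOPS")
```

By this crude measure, a 1450 MHz shader OC puts a 192-SP card within roughly 10% of a stock 280's theoretical peak, which is consistent with the "plays on the same level" claim above.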


----------



## chron (Aug 22, 2008)

Does anyone here remember when ATI's X1800GTO could be unlocked to an X1800XL? The X1800GTO had 12 pipes, while the XL had 16. All that was needed was a BIOS update. Some cards had an extra VRM allowing you to do this, while others didn't and were laser-locked. Same situation, I think...


----------



## bas3onac1d (Aug 23, 2008)

Decreasing the shader count on a lower-end card is nothing new. They never actually produce different chips for the two highest-end cards. If a G200 is produced and one or two of the clusters are dysfunctional, they can just disable them and use the chip as a GTX 260; the same goes for chips that can't stay stable at 280 speeds, even if all the shaders work.

However, as more 260s sell than 280s (this always happens with cheaper cards), nVidia must use some perfectly functional cores in the 260. This has been going on for generations with both card makers. Before they were laser-locked, these chips could be unlocked and run at full speed. Sometimes people got cores that actually did have dysfunctional parts and the BIOS change never worked for them; it was chance.
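The harvesting described above can be sketched as a toy model. The 10-TPC layout is the real GT200 configuration (all 10 enabled = GTX 280 with 240 SPs, 8 enabled = GTX 260 with 192 SPs); the defect rate and function names are illustrative assumptions, not real yield data:

```python
import random

# Toy model of die harvesting ("binning"). A GT200 die has 10 TPCs:
# all 10 working -> sell as GTX 280; at least 8 working -> sell as
# GTX 260 (any surplus good TPCs get fused off); fewer -> scrap.

def bin_die(working_tpcs, total=10, cut=8):
    """Assign a die to a SKU based on how many TPCs came out working."""
    if working_tpcs == total:
        return "GTX 280"
    if working_tpcs >= cut:
        return "GTX 260"   # may include perfectly functional dies, cut down
    return "scrap"

def simulate(dies=10_000, defect_rate=0.05, total=10, seed=42):
    """Simulate a wafer run with an assumed per-TPC defect probability."""
    rng = random.Random(seed)
    bins = {"GTX 280": 0, "GTX 260": 0, "scrap": 0}
    for _ in range(dies):
        working = sum(rng.random() > defect_rate for _ in range(total))
        bins[bin_die(working, total)] += 1
    return bins

print(simulate())
```

Note the second point from the post: when demand for the cheaper SKU exceeds the supply of defective dies, fully working dies get binned as GTX 260s too, which is exactly why some early, pre-laser-lock cards could be unlocked.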


----------



## newtekie1 (Aug 23, 2008)

phanbuey said:


> Probably not without some sort of modding... and why would you want to?? The slowest card determines the speed in SLI, so you would basically be SLI'ing two normal 260s but paying more for the fancier version.
> 
> I was gonna buy another 260 this week to SLI, but I figure I'll wait till their price goes through the floor with this announcement. If you OC a 260 to about 725 core and 1450 shaders, it can play on the same level as a stock 280; if you SLI two OC'd 260s, you're probably looking at the performance of a stock GT300 (384 shaders, etc.)... that's plenty of juice, even for Crysis at Very High.
> 
> I'm actually kind of happy this is coming out... means I get to buy a cheapo 260.



When nVidia released the 8800GTS 640MB with 112 SPs instead of 96, you could still SLI both together. I don't know if the one with more SPs made use of the extra shaders in SLI, though; I think you are right in saying it wouldn't.

Also, when they did this with the 8800GTS, they released the new card at the same price point as the old 8800GTS and discontinued the previous card, so prices on the old cards didn't actually go down.


----------



## candle_86 (Aug 23, 2008)

No, it doesn't. A 112-SP and a 96-SP card in SLI makes both run as 96.
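That's the lowest-common-configuration rule as commonly described for mixed SLI pairs. A one-line sketch of the rule (an illustration, not NVIDIA's actual driver logic):

```python
# SLI runs a mixed pair at the lowest common shader configuration,
# so a 112-SP card paired with a 96-SP card behaves like two 96-SP cards.

def sli_effective_sps(*cards):
    """Effective per-card shader count for an SLI pairing."""
    return min(cards)

print(sli_effective_sps(112, 96))   # 96
print(sli_effective_sps(192, 192))  # 192
```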


----------



## Hayder_Master (Aug 24, 2008)

Still far from the 4870. And by the way, guys, anyone have a link for the 4870 with 1 GB? I need to know its price too.


----------

