Thursday, August 21st 2008
NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders
In a move that can be seen as retaliation against the HD 4870 variants that ship with high-performance cores and up to 1 GB of GDDR5 memory, and as preparation to counter an upcoming Radeon HD 4850 X2, NVIDIA has decided to give the GeForce GTX 260 an upgrade by enabling an additional Texture Processing Cluster (TPC) in its G200 core. The original GTX 260 graphics processor (GPU) had 8 TPCs (8 x 24 = 192 SPs); the updated core will have 9 TPCs, which amounts to an additional 24 shader processors and should increase the core's shader compute power significantly more than merely raising frequencies would. It is unclear at this point what the resulting product will be called.
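For reference, the shader counts follow directly from the 24-SPs-per-TPC figure quoted above. Below is a minimal Python sketch of that arithmetic; the 10-TPC line for the GTX 280 is included only for comparison.

```python
# Shader-processor counts for GT200-based cards, assuming 24 SPs per TPC
# (the per-TPC figure quoted in the article above).
SP_PER_TPC = 24

def total_shaders(enabled_tpcs: int) -> int:
    """Return the total number of stream processors for a given number of enabled TPCs."""
    return enabled_tpcs * SP_PER_TPC

print(total_shaders(8))   # original GTX 260: 8 TPCs -> 192 SPs
print(total_shaders(9))   # updated GTX 260: 9 TPCs -> 216 SPs
print(total_shaders(10))  # GTX 280 (full die): 10 TPCs -> 240 SPs
```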
Everything else, including frequencies, memory size, and memory bus width, remains the same. This upgrade could take shape by this September.
Source:
Expreview
86 Comments on NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders
Edit:
Looks like BTA beat me to it. I have to admit, that would be a good example of sandbagging.
Why release a + version of a 260 and not a 280? I have a feeling that it goes like this:
260+ announced
280+ announced not long after
Neither will be 55 nm, but the former gets the extra shaders, and the latter gets a shader clock increase.
The '200b' will most likely end up as an entry model in the GT300 series, or THE GT300 itself, complete with GDDR5, 55 nm, and other stuff.
I was thinking earlier that unless the GT200b was going to be better than, or at least appropriately comparable to, the X2, then showing it at Nvision seems like a whole 'to-do' for nothing; possibly almost embarrassing.
Releasing rehashed/updated versions would seem more logical, and then they can get back to work on the next cards.
I'll give it one more month, and if nothing pops up, I'll move over to ATi in one of my machines.
But who knows; this type of stuff does not bother me. I get the card that I need for the games I play.
If new games come out and I can't play them at max settings for my screen res, I upgrade so that I can.
I normally upgrade every 1 to 1.5 years to kinda keep up, but almost never buy anything when it first comes out.
Now, I did jump on the 8800GT bandwagon, got two of them, paid $289.99 each, and they have yet to fail to give me what I need to play my games.
But my next upgrade will be the GTX 300. I have a friend, who I'll not name, who has told me that the GTX 300 will be DX10.1, and that is what I'm wanting for SC2 and Diablo 3.
This move must mean that yields have improved a lot (the continuous price drops already suggested this too), otherwise it would be nearly impossible for them to do this IMO.
1. Provide a different price/performance offering.
2. Have a lower-end product to make use of die that didn't pass the highest-quality testing, e.g. with shaders locked out due to failures.
Anyway, the comment itself was stupid, Megasty. What is the HD 4850 besides an underpowered HD 4870? It's the same practice, but instead of disabling cores they lower the clock below what most of the chips could achieve, to ensure that most chips will function.
jk
But I agree with most of you: they should use a different name. The article doesn't say it is going to be named GTX 260 anyway. NVIDIA could replace the GTX 260 with another card, with a different name, and you could still present the news in the same way they did.
Anyway, I think that this new card is what NVIDIA wanted the GTX 260 to be, but due to low yields they couldn't do it. One cluster disabled, just like with the 8800GT. That's what I think.
It is a process: every few years they release something that makes a huge leap in performance, but it usually puts out an insane amount of heat, sucks up an insane amount of power, and costs a fortune. Then they work on lowering the heat output and the power consumption, and release a product that performs similarly in games but is overall better. This is the reason I always skip the first-generation cards. It is the reason I went with G92-based cards and never bought a G80 card. It is the reason I went with an HD 3850 and never bothered with the HD 2900s. And it is the reason I have 9800 GTXs as my primary cards right now and won't move on to the GTX 280s; I'll wait to move onto that generation once they have worked on it.