Thursday, August 21st 2008
NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders
In a move that can be seen as retaliation against the Radeon HD 4870 variants that ship with high-performance cores and up to 1 GB of GDDR5 memory, and as preparation to counter the upcoming Radeon HD 4850 X2, NVIDIA has decided to upgrade the GeForce GTX 260 by enabling an additional Texture Processing Cluster (TPC) in its G200 core. The original GTX 260 graphics processor (GPU) had 8 TPCs (8 × 24 = 192 shader processors); the updated core will have 9 TPCs, adding 24 shader processors for 216 in total, which should increase the core's shader compute power significantly more than merely raising frequencies would. It is unclear at this point what the resulting product will be called.
Everything else remains the same: frequencies, memory size, and memory bus width are unchanged. This upgrade could take shape by this September.
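The TPC-to-shader arithmetic above can be sketched as a quick sanity check (a minimal illustration; the figure of 24 shader processors per TPC is taken from the article):

```python
# G200 shader-count arithmetic, per the article: each Texture Processing
# Cluster (TPC) on the G200 core contains 24 shader processors (SPs).
SPS_PER_TPC = 24

def shader_count(tpcs: int) -> int:
    """Total shader processors for a G200 part with `tpcs` clusters enabled."""
    return tpcs * SPS_PER_TPC

original = shader_count(8)  # original GTX 260: 8 TPCs
updated = shader_count(9)   # upgraded core: 9 TPCs
print(original, updated, updated - original)  # 192 216 24
```

Enabling one more of the nine physical clusters is what yields the 24 extra shaders, rather than any change to the silicon itself.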
Source:
Expreview
86 Comments on NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders
Does the transistor count row tell you something?
Thanks for the info BTA.
Though I'd rather see the 200b's released in 260 and 280 flavors sooner rather than later instead of a shader increase imo.. but either way some extra performance wouldn't hurt, especially if prices stay similar or decline to keep things competitive. I may have a couple step-up options from EVGA coming soon then!
:toast:
I wonder if they'll run the GTX260's at 1.18v to maintain stability with the extra shaders, or if they can keep them at 1.12v. My GTX runs nice and cool overall.. even OC'd I hit 65C load at 1.12v; I think the highest I hit at 1.18v was around 73C.
If someone on here knows the answer, please stand up. :rockout:
Though I hope nVidia doesn't keep the GTX260 name; I would prefer GTX270 or GTX265 to keep confusion down. Yep, exactly. And the G92's that are defective get put in 8800GT's and 8800GS's (9600GSO's). ATi didn't do it with the RV670, but they did with the R600. The 2900GT was just a defective R600 with the defective shaders turned off.
Intel and AMD use similar techniques with their processors. The original E6300 and E6400 were just defective Conroe cores that had the defective parts of the L2 cache disabled. Same thing with the Celerons and Pentium E2000 series: they are just Allendale cores (from the E4000 series) with the defective parts of the L2 cache disabled. The Celeron 400 series are Allendale cores with an entire processing core disabled to give the appearance of a single-core processor.
AMD does this too; some of their single-core processors are really dual-core processors with a defective core turned off. They started doing this at the end of the Socket 939 era. No, you can't do it yourselves; nVidia (and ATi) long ago stopped this by physically breaking the connection on the die itself.
Oh, and I was well aware that CPU manufacturers have been doing it for quite some time now; I guess I just didn't really think of them doing it quite as much with video cards. But it seems I was wrong.