Thursday, August 21st 2008

NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders

In a move that can be seen as retaliation against the HD 4870 variants that ship with high-performance cores and up to 1 GB of GDDR5 memory, and as preparation to counter the upcoming Radeon HD 4850 X2, NVIDIA has decided to upgrade the GeForce GTX 260 by enabling an additional Texture Processing Cluster (TPC) in its G200 core. The original GTX 260 graphics processor (GPU) had 8 TPCs enabled; with 24 shader processors (SPs) per TPC, that works out to 8 x 24 = 192 SPs. The updated core will have 9 TPCs, adding 24 shader processors for a total of 216, which should increase the core's shader compute power significantly more than merely raising frequencies would. It is unclear at this point what the resulting product will be called.

Everything else, including frequencies, memory size, and memory bus width, remains the same. This upgrade could take shape by this September.
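The shader arithmetic above is straightforward; a minimal sketch, assuming the 24 SPs-per-TPC figure from the article:

```python
# Each TPC in the G200 die contains 24 shader processors (SPs).
SPS_PER_TPC = 24

def shader_count(tpcs: int) -> int:
    """Total shader processors for a given number of enabled TPCs."""
    return tpcs * SPS_PER_TPC

original = shader_count(8)  # GTX 260 as launched: 8 TPCs
updated = shader_count(9)   # upgraded part: one extra TPC enabled

print(original, updated, updated - original)  # 192 216 24
```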
Source: Expreview

86 Comments on NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders

#76
flclisgreat
newtekie1: Not really surprising, and everyone needs to realize this has been common practice in the computer industry for decades. The processor manufacturers do it, and so do the video card manufacturers. ATi and nVidia have been cutting down cores to make lower-end cards for a very long time, so don't get in a big huff about it now.

Though I hope nVidia doesn't keep the GTX260 name; I would prefer GTX270 or GTX265 to keep confusion down.



Yep, exactly. And the G92s that are defective get put in 8800GTs and 8800GSs (9600GSOs). ATi didn't do it with the RV670, but they did with the R600. The 2900GT was just a defective R600 with the defective shaders turned off.

Intel and AMD use similar techniques with their processors. The original E6300 and E6400 were just defective Conroe cores that had the defective parts of the L2 cache disabled. The same goes for the Celerons and the Pentium E2000 series: they are just Allendale cores (from the E4000 series) with the defective parts of the L2 cache disabled. The Celeron 400 series are Allendale cores with an entire processing core disabled, giving the appearance of a single-core processor.

AMD does this too; some of their single-core processors are really dual-core processors with a defective core turned off. They started doing this at the end of the Socket 939 era.



No, you can't do it yourselves; nVidia (and ATi) stopped this long ago by physically breaking the connection on the die itself.
I thought all that was "common" knowledge? People didn't know that? I always just assumed that all the Celerons/Semprons were messed-up Allendales/Athlons with cache/cores disabled.
Posted on Reply
#77
PCpraiser100
By the time GTX 280 Rev. 2 comes out, it will be too late. The HD 4870 X2 will already be priced lower than the newest revision of the GTX 200 series.
Posted on Reply
#78
TooFast
What a joke! Is this all NVIDIA has? Once the 4850 X2 hits the street, it will really be over this round.
Posted on Reply
#79
eidairaman1
The Exiled Airman
Gotta love competition. You Green Team members should be very happy that the Red Team is giving NVIDIA competition.
Posted on Reply
#80
jbunch07
eidairaman1: Gotta love competition. You Green Team members should be very happy that the Red Team is giving NVIDIA competition.
Exactly! Without competition, the 8800 GTX would still be their flagship card.
Posted on Reply
#81
phanbuey
AddSub: Question: Would I be able to SLI a plain GTX 260 with this one (GTX 265?)
Probably not without some sort of modding, and why would you want to? The slowest card determines the speed in SLI, so you would basically be SLI'ing two normal 260s while paying more for the fancier version.

I was going to buy another 260 this week to SLI, but I figure I'll wait till their price goes through the floor with this announcement. If you OC a 260 to about 725 MHz core and 1450 MHz shaders, it can play on the same level as a stock 280; if you SLI two OC'd 260s, you're probably looking at the performance of a stock GT300 (384 shaders, etc.). That's plenty of juice, even for Crysis at Very High.

I'm actually kind of happy this is coming out; it means I get to buy a cheapo 260. :D
Posted on Reply
#82
chron
Does anyone here remember when ATI's X1800GTO could be unlocked to an X1800XL? The X1800GTO had 12 pipes, while the XL had 16. All that was needed was a BIOS update. Some cards had an extra VRM allowing you to do this, while some didn't and were laser-locked. Same situation here, I think.
Posted on Reply
#83
bas3onac1d
Decreasing the shader count on a lower-end card is nothing new. They never actually produce different chips for the two highest-end cards. If a G200 is produced and one or two of its clusters are dysfunctional, they can just disable them and use the chip as a GTX 260; the same goes for chips that can't stay stable at 280 speeds, even if all the shaders work.

However, as more 260s sell than 280s (this always happens with cheaper cards), nVidia must use some perfectly functional cores in the 260. This has been going on for generations with both card makers. Before they were laser-locked, these chips could be unlocked and run at full speed. Sometimes people got cores that actually did have dysfunctional parts and the BIOS change never worked for them; it was down to chance.
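The die-harvesting logic described above can be sketched as a simple decision rule; this is an illustrative toy, not NVIDIA's actual binning criteria, and the thresholds are assumptions (the full G200 has 10 TPCs, the GTX 260 originally shipped with 8 enabled):

```python
# Hypothetical sketch of die harvesting ("binning"): defective or
# slow G200 dies are sold as lower SKUs rather than discarded.
# Thresholds are illustrative assumptions, not real test criteria.

def bin_die(working_tpcs: int, stable_at_280_clocks: bool) -> str:
    """Assign a G200 die to a SKU based on yield-test results."""
    if working_tpcs == 10 and stable_at_280_clocks:
        return "GTX 280"   # fully functional die at full clocks
    if working_tpcs >= 8:
        return "GTX 260"   # disable surplus TPCs, run lower clocks
    return "scrap"         # too many defective clusters to salvage

print(bin_die(10, True))   # GTX 280
print(bin_die(10, False))  # a perfectly functional core sold as a 260
print(bin_die(9, True))    # GTX 260
```

The second case is the point the comment makes: when demand for the cheaper SKU outstrips the supply of defective dies, fully working chips get binned down too.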
Posted on Reply
#84
newtekie1
Semi-Retired Folder
phanbuey: Probably not without some sort of modding, and why would you want to? The slowest card determines the speed in SLI, so you would basically be SLI'ing two normal 260s while paying more for the fancier version.

I was going to buy another 260 this week to SLI, but I figure I'll wait till their price goes through the floor with this announcement. If you OC a 260 to about 725 MHz core and 1450 MHz shaders, it can play on the same level as a stock 280; if you SLI two OC'd 260s, you're probably looking at the performance of a stock GT300 (384 shaders, etc.). That's plenty of juice, even for Crysis at Very High.

I'm actually kind of happy this is coming out; it means I get to buy a cheapo 260. :D
When nVidia released the 8800GTS 640 MB with 112 SPs instead of 96, you could still SLI both together. I don't know if the one with more SPs made use of the extra shaders in SLI, though; I think you are right in saying it wouldn't.

Also, when they did this with the 8800GTS, they released the new card at the same price point as the old 8800GTS and discontinued the previous card, so prices didn't actually go down for the old cards.
Posted on Reply
#85
candle_86
No, it doesn't; pairing a 112 SP card with a 96 SP card runs both as 96 SP cards.
Posted on Reply
#86
Hayder_Master
Still far from the 4870. And by the way, guys, does anyone have a link for the 4870 with 1 GB? I need to know its price too.
Posted on Reply