
NVIDIA to Upgrade GeForce GTX 260 with 24 Additional Shaders

jbunch07

New Member
Joined
Feb 22, 2008
Messages
5,260 (0.86/day)
Location
Chattanooga,TN
Processor i5-2500k
Motherboard ASRock z68 pro3-m
Cooling Corsair A70
Memory Kingston HyperX 8GB 2 x 4GB 1600mhz
Storage OCZ Agility3 60GB(boot) 2x320GB Raid0(storage)
Display(s) Samsung 24" 1920x1200
Case Custom
Power Supply PC Power and Cooling 750w
Software Win 7 x64
Not really surprising, and everyone needs to realize this has been a common practice in the computer industry for decades. The processor manufacturers do it, and so do the video card manufacturers. ATi and nVidia have been cutting down cores to make lower cards for a very long time, so don't get in a big huff about it now.

Though I hope nVidia doesn't keep the GTX260 name, I would prefer GTX270 or GTX265 to keep confusion down.



No, you can't do it yourselves; nVidia (and ATi) stopped allowing this long ago by physically breaking the connection on the die itself.

That's what I figured.

Oh, and I was well aware that CPU manufacturers have been doing it for quite some time now; I guess I just didn't really think of them doing it quite as much with video cards. But it seems I was wrong.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,300 (7.53/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.21/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
Haven't they learned from this silly garbage yet? GD, just put out the best card you can at the time & move on. They're basically unlocking the 260 to perform near 280 levels OOB & kicking the 280 up with a 55nm version. The sandbagging is making me sick.
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (0.96/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-fi Plantinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
I'm surprised how many people don't know this. But I read somewhere that the chips Nvidia manufactures that have defects get put into cards like the G80 8800GTS and the GTX 260. The defective shaders, or something like that, are "disabled". The perfect chips go into the GTX 280 and 8800GTX. This saves Nvidia money.

I'm not; most people don't get that involved. But from my understanding, I don't believe it's necessarily defective. It's just binned differently. I guess it's examples like this that make the 200 series different from the 4800 series.

Edit:
looks like BTA beat me to it.






Haven't they learned from this silly garbage yet? GD, just put out the best card you can at the time & move on. They're basically unlocking the 260 to perform near 280 levels OOB & kicking the 280 up with a 55nm version. The sandbagging is making me sick.

I have to admit that would be a good example of sandbagging.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.09/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Haven't they learned from this silly garbage yet? GD, just put out the best card you can at the time & move on. They're basically unlocking the 260 to perform near 280 levels OOB & kicking the 280 up with a 55nm version. The sandbagging is making me sick.

They did put out the best card they could at the time, the GTX280. They also put out a lower card to fill another market; that is how the video card industry works. Companies can't just put out the high-end card and nothing else, they have to fill several price points.
 

alexp999

Staff
Joined
Jul 28, 2007
Messages
8,012 (1.26/day)
Location
Dorset, UK
System Name Gaming Rig | Uni Laptop
Processor Intel Q6600 G0 (2007) @ 3.6Ghz @ 1.45625v (LLC) / 4 GHz Bench @ 1.63v | AMD Turion 64 X2 TL-62 2 GHz
Motherboard ASUS P5Q Deluxe (Intel P45) | HP 6715b
Cooling Xigmatek Dark Knight w/AC MX2 ~ Case Fans: 2 x 180mm + 1 x 120mm Silverstone Fans
Memory 4GB OCZ Platinum PC2-8000 @ 1000Mhz 5-5-5-15 2.1v | 2 x 1GB DDR2 667 MHz
Video Card(s) XFX GTX 285 1GB, Modded FTW BIOS @ 725/1512/1350 w/Accelero Xtreme GTX 280 + Scythe sinks| ATI X1250
Storage 2x WD6400AAKS 1 TB Raid 0, 140GB Raid 1 & 80GB Maxtor Basics External HDD (storage) | 160GB 2.5"
Display(s) Samsung SyncMaster SM2433BW @ 1920 x 1200 via DVI-D | 15.4" WSXGA+ (1680 x 1050 resolution)
Case Silverstone Fortress FT01B-W ~ Logitech G15 R1 / Microsoft Laser Mouse 6000
Audio Device(s) Soundmax AD2000BX Onboard Sound, via Logitech X-230 2.1 | ADI SoundMAX HD Audio
Power Supply Corsair TX650W | HP 90W
Software Windows 7 Ultimate Build 7100 x64 | Windows 7 Ultimate Build 7100 x64
Benchmark Scores 3DM06: 19519, Vantage: P16170 ~ Win7: -CPU 7.5 -MEM 7.5 -AERO 7.9 -GFX 6.0 -HDD 6.0
Just out of interest, will these actually make a difference (other than in benches)? I'm assuming this is a hardware unlock as opposed to a BIOS unlock?
 

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.21/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
They did put out the best card they could at the time, the GTX280. They also put out a lower card to fill another market; that is how the video card industry works. Companies can't just put out the high-end card and nothing else, they have to fill several price points.

But in doing that, they're just telling us that the 260 is just an underpowered 280, even though most of us already knew that. However, most of the folks that bought a 260 didn't know & also wouldn't be pleased at all to find out. I'm not saying they should just put out HE cards, that's stupid. I'm saying they shouldn't tinker with the mid-high range cards. That leads to market confusion & consumer dissatisfaction. It wouldn't be a good idea to leave it with the 260 name, just as you said. They just might end up calling it a plus or something, although that would probably be just as bad.
 
Joined
Jun 20, 2007
Messages
3,942 (0.62/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
Yeah, that's what I was wondering too... but how they've disabled the shaders/cores is what I've been curious about. There's no way to get 1GB of memory though, since the extra chip(s) are missing, but if a GTX260 could get the same amount of shaders as a GTX280, I wouldn't complain!

Though I'd rather see the 200b's released in 260 and 280 flavors sooner rather than later instead of a shader increase, imo... but either way some extra performance wouldn't hurt, especially if prices stay similar or decline to keep things competitive. I may have a couple of step-up options from EVGA coming soon then!

:toast:

Ya, but with this whole fiasco about the 200b being released in Q4 instead of Aug/September, things just don't add up.

Why release a + version of a 260 and not a 280? I have a feeling that it goes like this :

260+ announced
280+ announced not long after

Neither will be 55nm, but the former gets the extra shaders, and the latter gets a shader clock increase.

'200b' will most likely end up as an entry model to the GT300 series, or THE GT300 itself, complete with GDDR5 and 55nm and other stuff.

I was thinking earlier that unless the GT200b was going to be better than, or at least appropriately comparable to, the X2, showing it at Nvision seems like a whole 'to-do' for nothing; possibly almost embarrassing.

Releasing rehashed/updated versions would seem more logical, and then they can get back to work on the next cards.

I'll give it one more month, and if nothing pops up, I'll move over to ATi in one of my machines.
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
Question: Would I be able to SLI a plain GTX 260 with this one (GTX 265?)
 
Joined
Jun 20, 2007
Messages
3,942 (0.62/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
Too early to tell Add.
 
Joined
Jan 21, 2008
Messages
237 (0.04/day)
System Name PC2.1
Processor Intel i7 3770k @4.6GHZ
Motherboard MSI Z68A-GD80
Cooling Corsair H100i
Memory 16GB Corsair XMS 1866MHz
Video Card(s) SLI EVGA 780 Classifieds
Storage Samsung 830 250gb /Samsung EVO 840 120GB
Display(s) 3x Dell 27" IPS screens
Case Thermaltake T81 Urban
Power Supply Cooler Master V1000
Software Windows 8.1 64bit
I would think you could use a GTX260 and a GTX265, but you'd probably have to run the GTX260 in the first slot.
But who knows; this type of stuff does not bother me, I get the card that I need for the games I play.
If new games come out and I can't play them at max settings for my screen res, I upgrade so that I can.
I normally upgrade every 1 to 1.5 years to kinda keep up, but almost never buy anything when it first comes out.
Now I did jump on the 8800GT bandwagon, got two of them, paid $289.99 each, and have yet to have them not give me what I need to play my games.

But my next upgrade will be the GTX300. I have a friend, who I'll not name, who has told me that the GTX300 will be DX10.1, and this is what I'm wanting for SC2 and Diablo 3.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Not "defective", just the ones that happen to perform lower when binned, compared to what's required to make it to a GTX 280.

And not exactly that either. They disable some of the clusters for redundancy, which means that IN CASE one or two are defective or not as fast, it doesn't matter. Just like what Sony did with the Cell processor. Of course, defective GTX280 chips (chips first selected to be GTX280s, prior to any test) are also used as GTX260s. What I mean is that not only do defective GTX280s become GTX260s; many chips are labeled as GTX260 without ever testing whether they could be GTX280s. They sell a lot more cheap cards than expensive cards, after all. It's common business. How do you guys think so many people have been able to flash lesser cards into their big brothers otherwise?

This move must mean that yields have improved a lot (the continuous price drops already suggested this too), otherwise it would be nearly impossible for them to do this IMO.
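The harvesting-with-redundancy idea above can be sketched as a toy simulation. Only the cluster/SP counts match the real GT200 (10 clusters of 24 SPs); the per-cluster yield, the bin names, and the binning rule are pure assumptions for illustration:

```python
import random
from collections import Counter

# Toy model of die harvesting: a GT200-style die has 10 shader
# clusters (TPCs) of 24 SPs each. Fully working dies become the top
# SKU; dies with a bad or slow cluster ship with clusters fused off.
# The per-cluster yield below is made up, not a real figure.

CLUSTERS = 10
P_CLUSTER_GOOD = 0.97  # assumed per-cluster yield

def bin_die(rng: random.Random) -> str:
    good = sum(rng.random() < P_CLUSTER_GOOD for _ in range(CLUSTERS))
    if good == 10:
        return "GTX 280 (10 clusters, 240 SPs)"
    if good == 9:
        return "GTX 260/216 (9 clusters)"  # one cluster fused off
    if good == 8:
        return "GTX 260 (8 clusters, 192 SPs)"
    return "scrap"

rng = random.Random(0)
counts = Counter(bin_die(rng) for _ in range(100_000))
```

With a decent per-cluster yield, most dies actually qualify for the full part, which fits the point that plenty of GTX260 chips were never defective at all, just binned down to fill demand.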
 
Joined
Aug 30, 2006
Messages
7,223 (1.08/day)
System Name ICE-QUAD // ICE-CRUNCH
Processor Q6600 // 2x Xeon 5472
Memory 2GB DDR // 8GB FB-DIMM
Video Card(s) HD3850-AGP // FireGL 3400
Display(s) 2 x Samsung 204Ts = 3200x1200
Audio Device(s) Audigy 2
Software Windows Server 2003 R2 as a Workstation now migrated to W10 with regrets.
What on earth is all the fuss about? ATI have been doing it for ages, e.g. X800XT vs. X800Pro, etc. So have Intel with their clock multipliers. Exactly the same chip/transistor count, just locked to a lower performance level to:

1./ Provide a different price/performance offering
2./ Have a lower-end product to make use of dies that didn't pass the highest-quality testing, e.g. shaders locked out due to failure.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.09/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
But in doing that, they're just telling us that the 260 is just an underpowered 280, even though most of us already knew that. However, most of the folks that bought 260 didn't know & also wouldn't be pleased at all to fine out. I'm not saying they should just put out HE cards, that's stupid. I'm saying they shouldn't tinker with the mid-high range cards. That leads to market confusion & consumer dissatisfaction. It wouldn't be a good idea to leave it with the 260 name just as you said. They just might end up calling it a plus or something, although that would probably be just as bad.

We've already known the 260 is just an underpowered 280; if you read any review of the card, you would already know that. If you do just the slightest bit of research before buying, which everyone should, then you already knew that the 260 was just a cut-down 280. If you didn't already know that before buying it, then you deserve to get "screwed" because you didn't do your research. It isn't like nVidia tried to hide the fact in any way; they made it pretty obvious from before the cards were even released that they would both be using the same core. Again, this isn't anything new; it is a practice that has been used for decades in the computer industry, and the video card companies (both ATi and nVidia) have been doing it since at least 2002 and probably before then, I just can't remember that far back.
 
Joined
Dec 28, 2006
Messages
4,378 (0.67/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
Agreed, you didn't hear 8800GT users whine about the 8800GTS 512, or 8800GTS 512 buyers whine about the 9800GTX+.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
We've already known the 260 is just an underpowered 280, if you read any review about the card you would already know that. If you do just the slightest bit of research before buying, which everyone should do, then you already knew that the 260 was just a cut down 280. If you didn't already know that before buying it, then you deserve to get "screwed" because you didn't do your research. It isn't like nVidia tried to hide the fact in anyway, they made it pretty obvious from before the cards were even released that they both would be using the same core. Again, this isn't anything new, it is a practice that has been used for decades in the computer industry, and the video card companies(both ATi and nVidia) have been doing it since at least 2002 and probably before then, I just can't remember that far back.

I can't remember a time when this didn't happen. Well, I do. It was when graphics chips were "single core" and they couldn't disable any of the cores. :D

Anyway, the comment itself was stupid, Megasty. What is the HD4850 besides an underpowered HD4870? It's the same practice, but instead of disabling cores they lower the clock below what most of the chips could achieve, to ensure that most chips will function.
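The clock-binning variant described here can be sketched the same way. Only the 750/625 MHz shipping clocks match the real HD4870/HD4850; the tested maxima, safety margin, and distribution are assumptions for illustration:

```python
import random

# Toy sketch of speed binning: every die is tested for its maximum
# stable clock, then shipped at a fixed SKU clock safely below what it
# achieved. Margin and test distribution are made-up numbers.

HD4870_CLOCK = 750  # MHz, real shipping core clocks
HD4850_CLOCK = 625

def assign_sku(max_stable_mhz: float) -> str:
    margin = 50  # assumed safety margin below the tested maximum
    if max_stable_mhz - margin >= HD4870_CLOCK:
        return "HD4870"
    if max_stable_mhz - margin >= HD4850_CLOCK:
        return "HD4850"
    return "scrap"

rng = random.Random(1)
# Pretend tested maxima cluster around 820 MHz with some spread.
skus = [assign_sku(rng.gauss(820, 60)) for _ in range(10_000)]
```

The point of the scheme is the margin: most HD4850 dies could run faster than their shipping clock, which is exactly why they overclock so well.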
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,300 (7.53/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Considering this revised GTX 260 is just 24 SPs away from the GTX 280, wouldn't they rather call this the GTX 270? ATI seems to be having luck with the number 7 these days :laugh:

jk
 

jbunch07

New Member
Joined
Feb 22, 2008
Messages
5,260 (0.86/day)
Location
Chattanooga,TN
Processor i5-2500k
Motherboard ASRock z68 pro3-m
Cooling Corsair A70
Memory Kingston HyperX 8GB 2 x 4GB 1600mhz
Storage OCZ Agility3 60GB(boot) 2x320GB Raid0(storage)
Display(s) Samsung 24" 1920x1200
Case Custom
Power Supply PC Power and Cooling 750w
Software Win 7 x64
Considering this revised GTX 260 is just 24 SPs away from the GTX 280, wouldn't they rather call this the GTX 270? ATI seems to be having luck with the number 7 these days :laugh:

jk

So are they going to rename or rebadge it, or just leave it a 260?
 

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.21/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
We've already known the 260 is just an underpowered 280; if you read any review of the card, you would already know that. If you do just the slightest bit of research before buying, which everyone should, then you already knew that the 260 was just a cut-down 280. If you didn't already know that before buying it, then you deserve to get "screwed" because you didn't do your research. It isn't like nVidia tried to hide the fact in any way; they made it pretty obvious from before the cards were even released that they would both be using the same core. Again, this isn't anything new; it is a practice that has been used for decades in the computer industry, and the video card companies (both ATi and nVidia) have been doing it since at least 2002 and probably before then, I just can't remember that far back.

I have to completely agree with you there. It's kinda sad that the majority of 'enthusiasts' just buy cards for the name; they just keep the buying spree within the range of what they can afford. I'm just saying that it's silly to screw around with already-established cards. The outcome only ends up being a few more fps than the originals anyway. Wasting manpower on manipulating 'old' cards just makes the 'new' models suffer. It hinders progression overall. It took NV two years to release a single GPU that destroyed the 8800GTX. Most of that was due to little competition. But constantly releasing cards that perform within 5-10% of later versions does nothing for us.
 

PVTCaboose1337

Graphical Hacker
Joined
Feb 1, 2006
Messages
9,501 (1.38/day)
Location
Texas
System Name Whim
Processor Intel Core i5 2500k @ 4.4ghz
Motherboard Asus P8Z77-V LX
Cooling Cooler Master Hyper 212+
Memory 2 x 4GB G.Skill Ripjaws @ 1600mhz
Video Card(s) Gigabyte GTX 670 2gb
Storage Samsung 840 Pro 256gb, WD 2TB Black
Display(s) Shimian QH270 (1440p), Asus VE228 (1080p)
Case Cooler Master 430 Elite
Audio Device(s) Onboard > PA2V2 Amp > Senn 595's
Power Supply Corsair 750w
Software Windows 8.1 (Tweaked)
Too bad for all GTX 260 owners... Good thing I got my good ole 4850!
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Considering this revised GTX 260 is just 24 SPs away from the GTX 280, wouldn't they rather call this the GTX 270? ATI seems to be having luck with the number 7 these days :laugh:

jk

Haha. But they are having even more luck with the 5, aren't they? HD4850. Yeah, I know you probably meant because of the RV770 too.

But I agree with most of you, they should use a different name. The article doesn't say it is going to be named GTX260 anyway. Nvidia could replace the GTX260 with another card, with a different name, and you could still present the news the same way they did.

Anyway, I think this new card is what Nvidia wanted the GTX260 to be, but due to low yields they couldn't do it. One cluster disabled, just like with the 8800GT. That's what I think.
 

Megasty

New Member
Joined
Mar 18, 2008
Messages
1,263 (0.21/day)
Location
The Kingdom of Au
Processor i7 920 @ 3.6 GHz (4.0 when gaming)
Motherboard Asus Rampage II Extreme - Yeah I Bought It...
Cooling Swiftech.
Memory 12 GB Crucial Ballistix Tracer - I Love Red
Video Card(s) ASUS EAH4870X2 - That Fan Is...!?
Storage 4 WD 1.5 TB
Display(s) 24" Sceptre
Case TT Xaser VI - Fugly, Red, & Huge...
Audio Device(s) The ASUS Thingy
Power Supply Ultra X3 1000W
Software Vista Ultimate SP1 64bit
DarkMatter said:
Anyway, the comment itself was stupid, Megasty. What is the HD4850 besides an underpowered HD4870? It's the same practice, but instead of disabling cores they lower the clock below what most of the chips could achieve, to ensure that most chips will function.

You don't seem to understand what I was saying. All high-end cards have lower counterparts. I'm just saying that it's dumb to constantly fumble around with cards which are ALREADY out, especially when they only end up slightly faster than the prior versions; it's not a true step up from anything. Who is going to buy a 260+, 265 or whatever when the 260 is much cheaper and only gets a few fps less than the newer model... It's just like the 9800GTX & GTX+; the "+" costs $50-70 more & is only 5-10% better :rolleyes:
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.09/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
I have to completely agree with you there. Its kinda sad that the majority of 'enthusiasts' just buy cards for the name, they just keep the buying spree within the range of what they can afford. I'm just saying that it is screw around with already established cards. The outcome only ends up being a few more fps then the originals anyway. Wasting manpower on manipulating 'old' cards just make the 'new' models suffer. It hinders progression overall. It took NV 2 years to release a single GPU that destroyed the 8800GTX. Most of that was due to little competition. But constantly releasing cards that performs within 5-10% of latter versions does nothing for us.

That is pretty much how the industry works, though. They didn't get a card out that could outperform the 8800GTX in graphical horsepower; however, it did outperform it in other aspects. The 9800GTX might not have really performed any better than the 8800GTX, and in some cases it performed worse. However, nVidia made huge gains in power consumption and heat output while still keeping the same performance level, not to mention the increased overclocking headroom and improved price. ATi did the same thing with the HD3800 series over the HD2900 series. And before that, nVidia did it with the 7900 series over the 7800 series, and ATi did it with the x1950 series over the x1900 series.

It is a process: every few years they release something that makes a huge leap in performance. But it usually puts out an insane amount of heat, sucks up an insane amount of power, and costs a fortune. Then they work on lowering the heat output and power consumption, and release a product that performs similarly in games but is overall better. This is the reason I always skip the first-generation cards. It is the reason I went with G92-based cards and never bought a G80 card. It is the reason I went with an HD3850 and never bothered with the HD2900's. And it is the reason I have 9800GTX's as my primary cards right now, and won't move on to the GTX280's; I'll wait to move onto that generation once they have worked on them.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.27/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
You don't seem to understand what I was saying. All high end cards have lower counterparts. I'm just saying that's it dumb to constantly fumble around with cards which are ALREADY out, especially when they only end up slightly faster than the prior versions - its not a true step up from anything. Who is going to buy a 260+, 265 or whatever when the 260 is much cheaper plus it only get a few fps less than the newer model... Its just like the 9800GTX & GTX+, the "+" costs $50-70 more & is only 5-10% better :rolleyes:

The GTX+ costs more because retailers sell them for more, not because Nvidia wants it that way.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,300 (7.53/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
But I agree with most of you, they should use a different name. The article doesn't say it is going to be named GTX260 anyway. Nvidia could replace the GTX260 with another card, with a different name, and you could still present the news the same way they did.

"One more TPC! NVIDIA will offer an upgraded GTX 260 in mid-September" is the article's title.
 