
NVIDIA GeForce 8800 GS Rebranded to 9600 GSO

imperialreign

New Member
Joined
Jul 19, 2007
Messages
7,043 (1.11/day)
Location
Sector ZZ₉ Plural Z Alpha
System Name УльтраФиолет
Processor Intel Kentsfield Q9650 @ 3.8GHz (4.2GHz highest achieved)
Motherboard ASUS P5E3 Deluxe/WiFi; X38 NSB, ICH9R SSB
Cooling Delta V3 block, XPSC res, 120x3 rad, ST 1/2" pump - 10 fans, SYSTRIN HDD cooler, Antec HDD cooler
Memory Dual channel 8GB OCZ Platinum DDR3 @ 1800MHz @ 7-7-7-20 1T
Video Card(s) Quadfire: (2) Sapphire HD5970
Storage (2) WD VelociRaptor 300GB SATA-300; WD 320GB SATA-300; WD 200GB UATA + WD 160GB UATA
Display(s) Samsung Syncmaster T240 24" (16:10)
Case Cooler Master Stacker 830
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro PCI-E x1
Power Supply Kingwin Mach1 1200W modular
Software Windows XP Home SP3; Vista Ultimate x64 SP2
Benchmark Scores 3m06: 20270 here: http://hwbot.org/user.do?userId=12313
Just my thought - I'll admit (as I have before) that I'm ATI loyal, but I'll still recommend nVidia's hardware as need be. But stuff like this . . . IMO, as long as it's out in the open, it's fine, since I don't see how the extra cost of the newer "product" is justified by what will more than likely be only a software-level performance increase*. If people wish to make the purchase knowing it's a rebadge, that's their prerogative. Now, if nVidia had kept hush-hush about it, I'd say that's crooked, and they would deserve a slap on the wrist.



*and if it truly is only a software-level increase, with no change to the hardware, it wouldn't surprise me if 8800GS owners notice a drop in driver performance right after this card's release; I'd also bet my money on some third-party drivers surfacing for the 8800 GS
 
Joined
Dec 28, 2006
Messages
4,378 (0.67/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
Fail, I ask? You blame Nvidia solely for rebranding - let's take a tour, shall we?

X600 Pro: known as RV370, aka an RV360 with an integrated Rialto chip, aka the 9600XT on PCIe. That applies to the X300 and X1050 also - they are all the 9600XT. Amazing, isn't it?

X1650 (except the XT): X1600 cores shrunk and rebranded, given a new model name, not a new core. Amazing.

X1250 IGP is really RV410, aka the X700. No one is blameless here - read your history.
 

X-TeNDeR

New Member
Joined
Oct 14, 2007
Messages
110 (0.02/day)
Location
Sharpening The Underdog's Teeth
Processor AMD Athlon II X4 630 Propus C2 (2.8GHz) 95w
Motherboard Asus M4A89GTD Pro/USB3 (AMD 890GX/SB850)
Cooling Scythe Yasya+MX-3, 3x140mm+1x120mm case fans
Memory Mushkin 2x2gb DDR3 1600MHz 6-8-6-24 1.65v
Video Card(s) Club3D Radeon HD 4850 512mb (Stock Fan+MX-3)
Storage WD 1TB SATA3 + WD MyBook 1TB ext+Toshiba 160gb ext
Display(s) LG W2486L '24 FHD Led 1920x1080
Case Lancool Dragonlord K62R1 (AMD Edition)
Audio Device(s) ASUS Xonar DX+Microlab X16 2.1+Senn 448 'Phones
Power Supply Seasonic M12II-620 80+ Bronze Modular
Software Micro$oft Windows 7 Ultimate 64 Bit
^ I'm almost certain that these all had different clocks and PCB designs, while some had different core revisions and die sizes.
Here, nVidia is simply rebranding the 8800GS to the new model - that's not the same in my book.

 

moto666

Guest
I wonder how they can do that?
I mean, nothing new was made between the 8000 and the 9000 series????
New HDR or special PureVideo or anything! Nothing???

So they can just rename all of the 8000 cards to 9000 whenever they want... :wtf:
 
Joined
Nov 22, 2007
Messages
1,398 (0.22/day)
Location
Hyderabad,India
System Name MSI apache ge62 2qd
Processor intel i7 5700HQ
Memory 12 Gb
Video Card(s) GTX960m
Storage 1TB
Display(s) Dell 24"
When can I get my 9800 IGP?
On a serious note:
Trog has been saying this since the release of the 8800GT - that it was meant to be a 9-series card, but ATI made them release it. Now Nvidia is trying to have its cake and eat it too.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,941 (3.76/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
That's basically what most others are saying. In light of this, there really is no defense for them to re-badge a video card simply because it didn't sell well under a different name :rolleyes:

Agreed, and the sad thing is, it's a pretty decent card really.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,244 (7.54/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
8800 GS -> 9600 GSO

X1600 XT -> X1650 Pro

How does it affect you? Does the 9600 GSO magically perform better than the 8800 GS? Does NVidia jack up prices?

Simple answer: NO.

But ATIncompetents like BumbRush can continue to crib. So go on, crib.
 

beyond_amusia

New Member
Joined
Feb 20, 2007
Messages
1,140 (0.18/day)
Location
Baltimore, Maryland
System Name Cozad (Asus G60JX)
Processor Core i5 M 430
Memory 8 GB DDR3 1066
Video Card(s) nVidia GeForce 360M
Storage 500GB
Display(s) 16 inch LED LCD
Software Windows 7 Ultimate x64 SP1
That's pretty shady of them... That'd be like Microsoft giving Windows ME a new GUI and calling it Windows Vista... Huh? They already did that???
 

BumbRush

New Member
Joined
Mar 5, 2008
Messages
225 (0.04/day)
Yes, but in the case of the 1650 and 1950 vs. the 1600/1900 cards, they didn't totally change the name to sell them as a totally new card........ *shakes head*

Read the posts back over the last couple of pages, instead of just ranting without understanding why people are bitching.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,244 (7.54/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Oh, so in your description, there's a 'total change' and a partial change? X1650 is a different name, alright. As long as this name change doesn't affect prices and affects only people who enjoy whining about NVidia, it's rather pointless to make it a big issue.

If there's a business strategy you're whining about, big deal. At least there's nothing foul from NVidia. It's not rebadging the GPUs and jacking up the prices.

Besides, such stuff isn't new to the industry. If you think only NVidia plays 'foul', think again. Compare the Radeon 8500 LE to the Radeon 9100 and tell me the difference. Wasn't that a similar rebadge?
 

webwizard

New Member
Joined
Apr 1, 2008
Messages
61 (0.01/day)
Processor Intel Core 2 Duo E8500 3.16 GHz 1333 MHz
Motherboard Asus P5N-E SLI nVidia nForce 650i Intel 775
Cooling Arctic Cooling Freezer 7 Pro plus 2 120mm AntecTriCool DBB
Memory DDR 2 4GB Samsung
Video Card(s) 2 XFX 9600GT XXX Alpha Dog stock cooler
Storage Western Digital 500 GB
Display(s) Dell 22" LCD
Case Sonata III 500
Audio Device(s) Creative Labs Sound Blaster X-Fi Extreme Audio
Power Supply Antec Earthwatts 500
Software XP Pro SP2
I don't know who wanted the card when it was the 8800 GS let alone as 9600 GSO.
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (0.96/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
Oh, so in your description, there's a 'total change' and a partial change? X1650 is a different name, alright. As long as this name change doesn't affect prices and affects only people who enjoy whining about NVidia, it's rather pointless to make it a big issue.

If there's a business strategy you're whining about, big deal. At least there's nothing foul from NVidia. It's not rebadging the GPUs and jacking up the prices.

Besides, such stuff isn't new to the industry. If you think only NVidia plays 'foul', think again. Compare the Radeon 8500 LE to the Radeon 9100 and tell me the difference. Wasn't that a similar rebadge?

Please stop attempting to make some sort of comparison from 6+ years ago. Even though that comparison is not the same, it's still wrong no matter who you find doing it. Also:
- The 9100 was marketed more as an IGP than just a discrete GPU
- The 9100 offered HyperZ while the 8500LE offered HyperZ II
- The 9100 offered UMA (Unified Memory Architecture), while the 8500LE offered High Performance Memory Support
- The 9100 was AGP 3.0 compliant, capable of AGP 8x support with Fast Writes, while the 8500LE supported up to AGP 4x (see the rough bandwidth figures below)
and other options that differentiated it from the 8500LE. It was not simply a re-badged GPU because it didn't sell well as an 8500LE.
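To put the AGP 4x vs. AGP 8x bullet above into rough numbers, here is a minimal back-of-the-envelope sketch (Python, added purely for illustration), assuming the standard 32-bit, ~66 MHz AGP base signalling rate; whether an R200-class card could actually use that extra bus bandwidth is a separate question, and one debated further down the thread.

```python
# Theoretical peak AGP bus bandwidth, assuming the standard 32-bit bus width
# and ~66.66 MHz base clock. Real-world throughput is lower than these peaks.
AGP_BASE_MHZ = 66.66   # AGP base clock in MHz (assumed standard value)
AGP_BUS_BYTES = 4      # 32-bit bus = 4 bytes per transfer

def agp_peak_mb_s(multiplier: int) -> float:
    """Peak transfer rate in MB/s for a given AGP speed multiplier (1x/2x/4x/8x)."""
    # MHz * bytes/transfer * transfers/clock = MB/s
    return AGP_BASE_MHZ * AGP_BUS_BYTES * multiplier

for mult in (4, 8):
    print(f"AGP {mult}x: ~{agp_peak_mb_s(mult):,.0f} MB/s")
# AGP 4x: ~1,067 MB/s   AGP 8x: ~2,133 MB/s
```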




 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,244 (7.54/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Regardless of it being marketed as an IGP, it was a discrete GPU alright, one based on the R200. Right, HyperZ II was a mucho better feature, and whoa, the R200 really needed the bandwidth of AGP 3.0 :rolleyes:

Regardless of the facts, face it: both companies, be it NVidia or St. ATi, have done such a rebadge in the past and will continue to do so in their commercial interests. As long as by doing so they're not making the consumers pay more, complaining about it is pointless.
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (0.96/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
Regardless of it being marketed as an IGP, it was a discrete GPU alright, one based on the R200. Right, HyperZ II was a mucho better feature, and whoa, the R200 really needed the bandwidth of AGP 3.0 :rolleyes:

Regardless of the facts, face it: both companies, be it NVidia or St. ATi, have done such a rebadge in the past and will continue to do so in their commercial interests. As long as by doing so they're not making the consumers pay more, complaining about it is pointless.

After showing you the differences between the two, it's clear that was not a same-architecture rebadge. The only company in the discrete GPU market in 2008 that has done this is Nvidia. No excuse found in what some other company did years ago justifies this practice. Wrong is still wrong, and just because a few think wrong is right doesn't make it right.
 
Joined
Aug 18, 2007
Messages
1,037 (0.16/day)
System Name mine
Processor 2600k@5.2ghz 1.41 vcore
Motherboard asrock z77 OC Formula
Cooling pa120.3, apogee HD, mcp655 pump
Memory 16gigs ddr3 1600mhz sammy's @ 1886
Video Card(s) evga gtx470 under water on a separate loop
Storage Crucial M4 128gig SSD
Display(s) HP 2511x 25inch led
Case none ATM
Audio Device(s) onboard
Power Supply PC Power & Cooling 750watt seasonic x-1250watt(dead)
Software win 8 pro 64bit
I suppose I'm in the minority here, but I agree with nVidia on this move.

First off, the G9x core should have been labeled as a GeForce 9xxx to begin with. It is different enough from the 8800GTS/GTX/Ultra to deserve its own name.

Second, I own an 8800GS and an 8800GT and also a 9600GT. The GS doesn't perform well enough to deserve the 8800 title. Don't get me wrong, for the price it is a very good card, but it's not in the same league as an 8800GT. Its performance when overclocked is almost identical to a 9600GT.

On my backup rig with an e2180@3.5 and 2 gigs of RAM, both the 9600GT @ stock and the 8800GS @ 740/950 score 11.5k in 3DMark06.

One last thing: if nVidia are able to sell the rest of their stock, it means more profit. More profit means more money for R&D. More money for R&D means better, faster, and possibly cheaper cards in the future.

If you don't like the new naming strategy, don't buy one. If you're worried uninformed consumers will buy it because it has a higher model number, they could certainly do worse than a rebadged 8800GS. What about the 1GB 8400/8500? More sales and more profit is always better.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,244 (7.54/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
After showing you the differences between the two, it's clear that was not a same-architecture rebadge. The only company in the discrete GPU market in 2008 that has done this is Nvidia. No excuse found in what some other company did years ago justifies this practice. Wrong is still wrong, and just because a few think wrong is right doesn't make it right.



Come off it. HyperZ II and AGP 3.0 were merely marketing instruments to sell the same old wine (R200), with the same elementary specifications (and GPU parameters), in a new bottle (Radeon 9100). So it pretty much was a rebadge.

Bite on the logic: rebadging GPUs is something both companies have done in the past. I'm sure there's a better example than the R200, I just have to look.

And saying "the only company that did it in 2008" is bad logic. So, let's say ATI does something similar later this year; by your logic I must say, "no no, the only company that did it in September 2008 was ATI". But the fact remains that neither company wears angels' gowns.

And yes, there still is no contest for this statement: "So what if they rebadged it, as long as they're not cheating the consumers by asking them to pay more?"
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
Can we stop with the jokes? They are not funny.

- Christine

That Photoshop pic was pretty amusing.


More or less. It's somewhat dirty - but they already did it with the 8800GT -> 9600GT (with a few minor changes), and ATI did it with the x38x0 cards (again with minor changes).

I guess it's just how things are.

You have to agree though, the 8800GS was a great value card that really failed due to no one knowing about it - and it's only $30 more for the 9600GT now. It needs a relaunch and a price drop.
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (0.96/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
Come off it. HyperZ II and AGP 3.0 were merely marketing instruments to sell the same old wine (R200), with the same elementary specifications (and GPU parameters), in a new bottle (Radeon 9100). So it pretty much was a rebadge.

Bite on the logic: rebadging GPUs is something both companies have done in the past. I'm sure there's a better example than the R200, I just have to look.

And saying "the only company that did it in 2008" is bad logic. So, let's say ATI does something similar later this year; by your logic I must say, "no no, the only company that did it in September 2008 was ATI". But the fact remains that neither company wears angels' gowns.

And yes, there still is no contest for this statement: "So what if they rebadged it, as long as they're not cheating the consumers by asking them to pay more?"

That entire post is flawed. The truth remains that the 8500LE and the 9100 were different. However, we will have to agree to disagree on whether such practices are wrong.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
47,244 (7.54/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Oh, and the GeForce 4 MX 4000 (which essentially had AGP 3.0 compliance) wasn't a rebadge :rolleyes:. The 8500 LE and 9100 were identical to the point where they were the same R200 core with the same clocks and the same GPU/memory parameters, with maybe just the HyperZ II and AGP 3.0 thrown in... so the 9100 is a rebadge of the 8500 LE. And yes, it did exist as a discrete GPU, though it was also used as an IGP.

Agree that such practices are wrong (if they're rebadging something and asking you and me to pay more). Disagree that ATI have stayed clean and never rebadged their GPUs. That's not a double negative; neither company has stayed clean on this issue.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
The difference being, when ATI rebrands a product, they at least make some changes to the card

X1600 used the RV530 GPU
X1650 used the RV535 GPU

real differences in GPU alone: difference in manufacturing process (90nm vs 80nm)

IIRC, the 1650s were also clocked slightly higher.


I'm not saying ATI isn't guilty of rebranding as well, they've done it quite a few times over the years - but they at least make some small changes to improve the card over the previous model.


No they didn't change anything. The x1600Pro and x1650 both use the RV530. The only difference between the two was that ATi upped the memory clock on the x1650 by a whole 10MHz. It is 100% exactly like what nVidia is doing now. The x1600Pro was never going to sell as the x1600Pro with the x1650 series out, so they just renamed it so it would sell. The rest of the x1650 series used RV535, but the basic x1650 used RV530 still, they didn't even change the PCB design. I wasn't talking about entire series of cards, I named two specific cards, do your research. Also, the DDR2 x1650Pro's were rebranded DDR2 x1600XT's, both using the RV530.

Not everyone understands what's underneath the heatsink; some will just follow whatever the highest number is and buy it. Nvidia picked up on that and just exploited it, like the whole 9 series to date lol

And the whole 3800 series to date. Again, I didn't see you complain when ATi did it, and I don't see you making the same comments about ATi. Why? My guess: fanboy.

Anyone that buys computer parts based entirely on the number printed on it deserves to get ripped off. There is no excuse for not doing your research before buying computer parts.

I don't really care that ATi does it, and I don't care that nVidia does it. I'm just pointing out that both of them have done it, still do it, and will continue to do it. If you are going to bash one for doing it, bash the other; otherwise you are a hypocrite and a fanboy.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
No they didn't change anything. The x1600Pro and x1650 both use the RV530. The only difference between the two was that ATi upped the memory clock on the x1650 by a whole 10MHz. It is 100% exactly like what nVidia is doing now. The x1600Pro was never going to sell as the x1600Pro with the x1650 series out, so they just renamed it so it would sell. The rest of the x1650 series used RV535, but the basic x1650 used RV530 still.

he is correct, it was the XT model that had the different core.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
After showing you the differences between the two, it's clear that was not a same-architecture rebadge. The only company in the discrete GPU market in 2008 that has done this is Nvidia. No excuse found in what some other company did years ago justifies this practice. Wrong is still wrong, and just because a few think wrong is right doesn't make it right.

Yep, let's just limit it to 2008; it doesn't matter that it has happened in the past, ignore that, just focus on 2008. If you just look at 2008, a lot of things look a lot better.

We aren't talking years ago, we are talking one generation of cards ago. ATi just did it last generation. The x1650 came out in Feb of '07, so just over a year ago ATi did exactly the same thing nVidia is doing today. We aren't talking ancient history here, like you seem to want to make everyone think.

And what about ATi's move with the RV370? It is best known as the core used in the x300. However, what did they do? They also used it in the x1K series as the x1050. Doesn't seem like that big of a deal, right? In fact, the x1050 probably seemed like a great buy to the average consumer - the people you guys seem to want to say nVidia is trying to trick here. The x1050 had a 400MHz core clock, just like the x300, but the memory clock was 333MHz vs. the 250MHz of the x300. Seems like a great improvement for the consumer, right? Except they slashed the memory bus to 64-bit, effectively making the x1050 perform a lot worse than the x300 despite the higher clock speeds. Now that is a shady trick to try to fool the consumer if I have ever seen one, or at least one that is far worse than what nVidia is doing now.
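To sanity-check that memory-bus argument using only the figures quoted in the post, here is a minimal sketch (Python, for illustration only); treating the quoted 250 MHz / 333 MHz figures as real (pre-doubling) DDR clocks and using a simple peak-bandwidth formula are assumptions on my part, not something stated above.

```python
# Rough peak memory bandwidth comparison for the X300 vs. X1050 example above.
# Assumptions: DDR memory (2 transfers per clock) and the quoted clocks being
# real (pre-doubling) clocks. Peak bandwidth = bus width in bytes * effective rate.

def peak_bandwidth_gb_s(bus_width_bits: int, mem_clock_mhz: float, ddr_mult: int = 2) -> float:
    """Peak memory bandwidth in GB/s."""
    bytes_per_transfer = bus_width_bits / 8
    effective_rate_hz = mem_clock_mhz * 1e6 * ddr_mult
    return bytes_per_transfer * effective_rate_hz / 1e9

x300 = peak_bandwidth_gb_s(bus_width_bits=128, mem_clock_mhz=250)   # ~8.0 GB/s
x1050 = peak_bandwidth_gb_s(bus_width_bits=64, mem_clock_mhz=333)   # ~5.3 GB/s

print(f"X300  (128-bit @ 250 MHz): {x300:.1f} GB/s")
print(f"X1050 ( 64-bit @ 333 MHz): {x1050:.1f} GB/s")
# Despite the higher memory clock, the 64-bit bus leaves the X1050 with roughly
# a third less peak memory bandwidth than the X300 in this simplified model.
```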
 

imperialreign

New Member
Joined
Jul 19, 2007
Messages
7,043 (1.11/day)
Location
Sector ZZ₉ Plural Z Alpha
System Name УльтраФиолет
Processor Intel Kentsfield Q9650 @ 3.8GHz (4.2GHz highest achieved)
Motherboard ASUS P5E3 Deluxe/WiFi; X38 NSB, ICH9R SSB
Cooling Delta V3 block, XPSC res, 120x3 rad, ST 1/2" pump - 10 fans, SYSTRIN HDD cooler, Antec HDD cooler
Memory Dual channel 8GB OCZ Platinum DDR3 @ 1800MHz @ 7-7-7-20 1T
Video Card(s) Quadfire: (2) Sapphire HD5970
Storage (2) WD VelociRaptor 300GB SATA-300; WD 320GB SATA-300; WD 200GB UATA + WD 160GB UATA
Display(s) Samsung Syncmaster T240 24" (16:10)
Case Cooler Master Stacker 830
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro PCI-E x1
Power Supply Kingwin Mach1 1200W modular
Software Windows XP Home SP3; Vista Ultimate x64 SP2
Benchmark Scores 3m06: 20270 here: http://hwbot.org/user.do?userId=12313
No they didn't change anything. The x1600Pro and x1650 both use the RV530. The only difference between the two was that ATi upped the memory clock on the x1650 by a whole 10MHz. It is 100% exactly like what nVidia is doing now. The x1600Pro was never going to sell as the x1600Pro with the x1650 series out, so they just renamed it so it would sell. The rest of the x1650 series used RV535, but the basic x1650 used RV530 still, they didn't even change the PCB design. I wasn't talking about entire series of cards, I named two specific cards, do your research. Also, the DDR2 x1650Pro's were rebranded DDR2 x1600XT's, both using the RV530.

Sorry, but it was a little hard to discern from your initial post whether you were talking about the series as a whole or two specific cards - you mentioned the X1650, and without a suffix we all tend to assume that means the whole 1650 series.

But, like I stated before - the X1600 PRO used the RV530 core; the X1650 might have run the RV530 right off the bat, but it wasn't long after release that they changed to the RV535. Besides, the release of the X1650 XT came very shortly after the initial release of the X1650, and considering those cards were running the RV560, the series was in need of a change, hence X1600 -> X1650.

Is there a major difference between the two cores? No, not at all. It's a die shrink and a slight boost to the memory clocks.

And how could the X1650 PROs be rebranded X1600 XTs? The X1600 XT was released before the introduction of the X1650 PRO. :wtf: Aside from that, IIRC, the X1600 XT used GDDR3 whereas the X1650 PRO used GDDR2.





Anyhow, I'm not saying ATI isn't guilty of rebadging stuff down the line - but they make slight changes to the card itself, or have something new in the works when they do it. nVidia typically doesn't.

But the fact of the matter is - nVidia decided to rebrand the card. It's out in the open; there's nothing shady about it. That's their deal. It's up to the customer to decide if the rebadge bothers them or not. Personally, I couldn't care less - I don't think it's crooked or shady of nVidia. Now, if they had done the rebadge and kept quiet about it, that'd be a different story; but they didn't . . .
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.10/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Sorry, but it was a little hard to discern from your initial post whether you were talking about the series as a whole or two specific cards - you mentioned the X1650, and without a suffix we all tend to assume that means the whole 1650 series.

But, like I stated before - the X1600 PRO used the RV530 core; the X1650 might have run the RV530 right off the bat, but it wasn't long after release that they changed to the RV535. Besides, the release of the X1650 XT came very shortly after the initial release of the X1650, and considering those cards were running the RV560, the series was in need of a change, hence X1600 -> X1650.

Is there a major difference between the two cores? No, not at all. It's a die shrink and a slight boost to the memory clocks.

And how could the X1650 PROs be rebranded X1600 XTs? The X1600 XT was released before the introduction of the X1650 PRO. :wtf: Aside from that, IIRC, the X1600 XT used GDDR3 whereas the X1650 PRO used GDDR2.

The standard x1650 always used the RV530; AFAIK it never switched to the RV535. The x1650XT was a completely different beast and has nothing to do with it. The series was in need of a change, but that has nothing to do with it either - the fact is that they rebadged the x1600Pro as the x1650. Nothing was changed with the cards; they even used the same PCBs. As for the x1600 series to x1650 series, yeah, for the most part there was a die shrink and the renaming was justified, I agree with that. However, rebadging the old RV530 x1600Pro's as the x1650 was exactly what nVidia is doing now, and most likely for the same reasons.

You also have my post confused. The x1650 Pro was not rebranded as the x1600XT; the GDDR2 x1600XT's were renamed to x1650 Pros. The x1650 Pros that had GDDR2 were originally x1600XT's that had GDDR2. Yes, both used GDDR2, and both used the RV530. The x1600XT, again, wouldn't sell as the x1600XT once the x1650 series was out, so they rebranded it. The x1600XT became the x1650 Pro.
 

vampire622003

New Member
Joined
Mar 6, 2007
Messages
135 (0.02/day)
Location
Austin
Processor AMD Phenom X4 @ 2.5GHZ
Motherboard AsRock
Memory 5GB DDR2-800MHz @ 891MHz
Video Card(s) ATI HD 4850 GDDR3 512MB w/ ZEROTherm GX815 Cooler
Storage 2X Western Digital SATAII 320GB (RAID 0)
Display(s) Acer 17" LCD Model AL1716
Audio Device(s) Sound Blaster Audigy 2 ZS
Power Supply ORION 585Watt PSU
Software Windows 7 x64
This is dumb. All it's done is given the ATi fanboys something to bitch about. Good on you guys - I guess you need to pick on every little thing nvidia does since they've been whooping ATi for over a year.

Like has already been pointed out, the name on the heatsink doesn't mean dick, it's what's under the heatsink that counts, and this card is a proven excellent price/performance budget gamer, so what's the problem?

Business is business, gaming is gaming, nothing has changed, life goes on.

-Wolf.
Wow. You have Nvidia printed all over you, don't you? OWNED.
 