
NVIDIA's shady trick to boost the GeForce 9600GT

W1zzard

When we first reviewed NVIDIA's new GeForce 9600 GT, we noticed a discrepancy between the advertised core speed and the frequency reported by the clock generator. After further investigation we can now explain what causes it.

 
I wouldn't call it a shady trick.

Is it not entirely possible that nVidia's engineers simply used the PCI-e bus speed as the base because it would mean the card would be slightly cheaper to manufacture? I mean, the standard PCI-e bus speed is 100 MHz, and only very few people actually change that speed. So, perhaps they just figured they could use that to derive the base 25 MHz frequency instead of actually putting a crystal on the board, which would increase costs (slightly, but every little bit counts).

LinkBoost does increase the PCI-e frequency but ONLY when a compatible graphics card is installed. AFAIK the 9600GT isn't a compatible graphics card, so it won't be affected. However, I might be wrong here.

I don't think it was really an attempt to pull the wool over our eyes by nVidia; I just think it was a move to make the product cheaper and easier to produce that wasn't totally thought through or properly explained to reviewers/end-users.
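To put rough numbers on the derivation described above (the 100 MHz PCI-e reference divided down to a 25 MHz base, per the review), here is a minimal Python sketch of how the effective core clock would scale with the PCI-e bus clock. The function name, the divider constant, and the 650 MHz stock clock are illustrative assumptions, not NVIDIA's actual firmware logic.

```python
# Minimal sketch of the clocking scheme described in the review
# (assumption: core clock is synthesized from the PCI-e reference
# clock divided by 4, not from an on-board crystal).

PCIE_NOMINAL_MHZ = 100.0  # standard PCI-e reference clock
BASE_DIVIDER = 4          # 100 MHz / 4 = 25 MHz base, per the review

def effective_core_clock(nominal_core_mhz: float, pcie_clock_mhz: float) -> float:
    """Core clock actually generated when the base is derived from the PCI-e bus.

    The multiplier is chosen so the card hits its advertised clock at a
    100 MHz reference; any change to the bus clock scales the core clock
    by the same ratio.
    """
    base_mhz = pcie_clock_mhz / BASE_DIVIDER                       # 25 MHz at stock
    multiplier = nominal_core_mhz / (PCIE_NOMINAL_MHZ / BASE_DIVIDER)
    return base_mhz * multiplier

if __name__ == "__main__":
    # 650 MHz is the 9600 GT reference clock; 125 MHz is a LinkBoost-style bus clock
    for pcie in (100, 110, 125):
        print(f"PCIe {pcie} MHz -> core {effective_core_clock(650, pcie):.1f} MHz")
```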
 
Now if you could change the voltage in the BIOS for the video card, this would really rock. No software needed for a simple core clock.

When you do this, do the shaders stay the same? They are not linked to the core clock the same way, correct?
 
I wouldn't call it a shady trick. [...] I just think it was a move to make the product cheaper and easier to produce that wasn't totally thought through or properly explained to reviewers/end-users.

Well, I would agree with you if they didn't flat out deny or refuse to answer questions about it. I mean, once it becomes an issue/concern, why not give an accurate explanation?
 
Don't know why they want to hide it... but the fact that they do is wrong. I mean, is it OK to lie about clock speeds in drivers?
 
...it would mean the card would be slightly cheaper to manufacture?

A 27 MHz crystal is already on the board for the memory clock, so why not use that like we did for the last 25 or so years?
 
Something tells me there is more to it! Although I think it's a very good find; someone has a lot of time to spare. Well done W1zzard. Why they didn't tell you I don't know! I wonder if other new NVIDIA cards will end up the same?!
 
will the same apply to a 9800GT?
 
So that's how they did it!
I was going crazy on another thread wondering why the 9600GT is so close to the 8800GT even though it has half the shaders. It just didn't add up.
Well, now it does!

Thanks W1zzard for clearing that out :)
 
Good tech... bad execution. I agree with Wizz.
 
Nice find and good digging up of subject matter! :D
 
Depends on how they implement the PCB design.
If they do, that may be the only reason to get a 9800GT; to the 1337 it makes no difference, I guess.
 
Depends on how they implement the PCB design.

With that note, do you think this will only affect reference design 9600GT cards?
 
I would think they would boast about something like that, or at least make users aware?
You guys are pure genius.
 
A 27 MHz crystal is already on the board for the memory clock, so why not use that like we did for the last 25 or so years?

I don't know. Why use 25 MHz as the base at all? All very good questions.
 
What if someone just pressed the wrong key when coding the drivers? :roll:
 
Wow, just wow. This is an excellent find. This brings up 2 questions:
#1 Will people with Intel chipsets lose out on this added boost (without LinkBoost)?
#2 What is the probability of the card failing/malfunctioning because of this?
 
1) If you crank up the PCIe clock on Intel you will see the same clock increase.
2) That's a good question. If a card can fail from overclocking (I have never seen evidence for that), and a manufacturer can detect it (no evidence for that either), and you didn't even know that the funky NVIDIA chipset overclocked your card and made it go boom, you should get a new card from the manufacturer or find a lawyer :)
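As a worked example of point 1 (a sketch under the same assumptions as above; the 125 MHz value is a LinkBoost-style bus clock used purely for illustration), the real core clock would scale with the bus on any chipset, Intel included, while the driver keeps reporting the stock value:

```python
# Sketch, assuming the PCI-e-derived clocking described in the review:
# the driver reports the stock clock while the real clock tracks the bus.

STOCK_CORE_MHZ = 650  # 9600 GT reference core clock

def real_core_clock(pcie_clock_mhz: float) -> float:
    """Core clock the card would actually run at for a given PCI-e bus clock."""
    return STOCK_CORE_MHZ * pcie_clock_mhz / 100.0

for pcie in (100, 110, 125):  # stock, mild bump, LinkBoost-style
    real = real_core_clock(pcie)
    print(f"PCIe {pcie} MHz: real {real:.1f} MHz, reported {STOCK_CORE_MHZ} MHz "
          f"({(real / STOCK_CORE_MHZ - 1) * 100:+.0f}%)")
```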
 
That's what I call a professionally done investigative report. Great stuff W1zzard.
 
That's a bit odd, why they would mislead like that...

Shady a bit I would say...

Definitely very odd.
 
Well, I would agree with you if they didn't flat out deny or refuse to answer questions about it. I mean, once it becomes an issue/concern, why not give an accurate explanation?

Because perhaps they don't wanna give the competition ideas... this has got to be a good thing, the method that is, not being secretive. I am all for performance enhancements, I just don't like it being shady.
 
I agree, Tatty. If they fudged the numbers or something, I might understand, but a flat-out refusal or secretive notion is just wrong.
 
Kudos W1zz... and that proves the 8800GT is still > 9600GT, however.

It's a great boost for 9600GT users, and I agree, why did they have to be so shady with this? Why didn't they just tell us in the first place?
 
Nice read, Wiz.

Can the crystal be changed on the G92 (8800GTS), and would it give a big boost in performance?
 