
NVIDIA's shady trick to boost the GeForce 9600GT

Wrong guess. I still read the clocks from the PLL; I just use a hardcoded 25MHz crystal clock for G94-based display adapters now.
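(For anyone wondering what "reading the clocks from the PLL" actually involves, here is a rough sketch of the idea. This is not RivaTuner's real code; the coefficient values and the simple N/M/P formula are illustrative assumptions. The point is that the tool reads PLL coefficients from the hardware and multiplies them against an assumed crystal frequency, which is why the assumed crystal matters so much.)

```python
# Illustrative sketch only, not RivaTuner source code. The coefficient
# values and the simple N/M/P formula below are assumptions.

CRYSTAL_HZ_G94 = 25_000_000  # hardcoded 25 MHz crystal assumed for G94 boards

def pll_to_clock(n: int, m: int, p: int, crystal_hz: int = CRYSTAL_HZ_G94) -> float:
    """Classic PLL relationship: output = crystal * N / (M * 2^P)."""
    return crystal_hz * n / (m * (1 << p))

# Coefficients that would yield 650 MHz from a 25 MHz reference:
print(pll_to_clock(n=26, m=1, p=0) / 1e6)   # 650.0

# The same coefficients read against a wrongly assumed 27 MHz crystal
# would be reported as 702 MHz, which is why the crystal value is hardcoded.
print(pll_to_clock(n=26, m=1, p=0, crystal_hz=27_000_000) / 1e6)   # 702.0
```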

How did you manage to get it to read correctly with a 27MHz crystal?
 
How honorable. I'm sure nVidia appreciates it. :shadedshu

Thanks for "pleasant" comments, mate. I guess that next time I'll simply never share results of my findings with anyone and simply won't allow such reviews to appear. It is damn sad to kill 2 weeks on investigating G94 PLL clock internals then to read such comments from ATI fans. Community doesn't deserve sharing technical details with it.
 
Why are there two Unwinders here? Don't let one guy get you down, mate. 99% of the people here love RivaTuner, definitely including myself. I would be nowhere without it.:( Thank you very much for your hard work and dedication to this wonderful program.:toast:
 
=D Wow, share it, man.
 
Why are there two Unwinders here? Don't let one guy get you down, mate. 99% of the people here love RivaTuner, definitely including myself. I would be nowhere without it.:( Thank you very much for your hard work and dedication to this wonderful program.:toast:

I second that. RivaTuner has probably been the single most important piece of overclocking software I have ever used.
 
Please don't listen to a few AZZholes

Thanks for "pleasant" comments, mate. I guess that next time I'll simply never share results of my findings with anyone and simply won't allow such reviews to appear. It is damn sad to kill 2 weeks on investigating G94 PLL clock internals then to read such comments from ATI fans. Community doesn't deserve sharing technical details with it.

RivaTuner is a fantastic program. I for one would pay for it, and I want to thank you very, very much for making it. It and ATITool are the best overclocking tools ever made for video cards. I'm sorry if some nimrod insulted you; you are truly one of the video card overclocking GODs :respect: I cannot believe anyone would do such a dumb thing!
 
Why are there two Unwinders here?

My bad, I use different logins when posting here from home and from the office. And I do not mean RivaTuner, I mean publishing technical details about hardware internals, like this G94 clocking issue. After discovering that the PCIe bus is used as the source of the 25MHz reference clock, I discussed it with the Guru3D staff and we decided to post the investigation results just in our forum, and w1zzard decided to create a review about it. And even with his ideal, newbie-friendly review, after seeing the reaction of the community, after seeing how many people misunderstand it, treat it wrongly or start seeing conspiracy theories in it, I'm more and more certain that such info shouldn't be provided to the public. The fewer people know about it, the less headache the developers have.
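(To put some numbers on this: since the reference clock is the PCIe bus clock divided by four, anything that raises the PCIe clock raises the GPU core clock with it. A quick back-of-the-envelope sketch follows; the divide-by-four and the multiplier of 26 come from the investigation above, while the function name and the 125 MHz figure are just an example of what a board that overclocks the PCIe bus might run.)

```python
# Back-of-the-envelope sketch of the G94 clocking behaviour described above.
# The "PCIe clock / 4 * multiplier" relationship comes from the investigation;
# the function name and the 125 MHz example are purely illustrative.

def g94_core_clock_mhz(pcie_clock_mhz: float, multiplier: int = 26) -> float:
    """Reference clock = PCIe clock / 4; core clock = reference * multiplier."""
    return pcie_clock_mhz / 4 * multiplier

print(g94_core_clock_mhz(100))  # 650.0 -> stock card on a standard 100 MHz PCIe bus
print(g94_core_clock_mhz(125))  # 812.5 -> the same card if the board runs PCIe at 125 MHz
```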
 
My bad, I use different logins when posting here from home and from the office. And I do not mean RivaTuner, I mean publishing technical details about hardware internals, like this G94 clocking issue. After discovering that the PCIe bus is used as the source of the 25MHz reference clock, I discussed it with the Guru3D staff and we decided to post the investigation results just in our forum, and w1zzard decided to create a review about it. And even with his ideal, newbie-friendly review, after seeing the reaction of the community, after seeing how many people misunderstand it, treat it wrongly or start seeing conspiracy theories in it, I'm more and more certain that such info shouldn't be provided to the public. The fewer people know about it, the less headache the developers have.

No, you were right to tell us, because I for one couldn't understand how it was beating an 8800 GS until your info came out.
 
I don't agree. I personally think such things should be released to the public. Not to totally dismiss your point, but information like this helps the people who understand it, and they in turn can teach the people who are interested. As for everyone else... well, Unwinder, as you know, there are always going to be a couple of @zzholes and people trying to disprove you. I'm sure you get enough of that as a developer. I know this info helped me. After I got my 9600GT, w1zz and I were talking quite often, for long stretches, about the card. Since I got it before release, I wanted a working copy of GPU-Z, and that's when I noticed the discrepancy, so w1zz and I talked about it for hours, going over theories about why this was happening, dual oscillators, the whole works. I'm thankful for the article; it covered or cleared up a few things w1zz and I didn't finish discussing.
 
Hi, I'm new on this forum too and just want to post an opinion on some things said here.

First, when I read

Thanks for the "pleasant" comments, mate. I guess that next time I'll simply never share the results of my findings with anyone, and I simply won't allow such reviews to appear. It is damn sad to spend two weeks investigating G94 PLL clock internals and then read such comments from ATI fans. The community doesn't deserve to have technical details shared with it.

I think there is something wrong here. I'm what you could call an ATI fan. I really love ATI, not because they always make the fastest cards, but because I like the way they build them. I personally liked my Rage 128 Fury because it was great for video and showed better image quality, and I loved my 8500, my 9800 and my X1950. Those are good cards. They did a crappy thing with the 2900, OK, I won't deny it, and I do have a GeForce2 at home.
The real issue is just "stupid fans" who don't even know why they are fans. They always expect to beat the other side, and there is no point in that here, because I use RivaTuner on my old Radeon. I don't even think that guy is an ATI fan at all... So what's the meaning of this? Take a break; you can't make stupid guys listen to reason ;)

Overall, this thread is really great and has been reported on many reviewers' websites, even in my country (France), and it truly helps buyers. Even if I'm not planning to buy NVIDIA cards because of their politics, just like this one (and ATI has started doing the same shady things :/), I will change my state of mind when building computers for friends or even at work, recommending this card for what it is and not for the tricky way it gets its boost.

So again, thanks for it, and keep digging into such things; you're the ones who "stop" the marketing world from destroying everything else ^^
 
Unwinder, though I'm not a writer or reviewer, this sort of thing you've done is ESSENTIAL. No one should shoot the message nor the messenger. How many people in the community would still be scratching their heads without this knowledge? And what kind of firestorm would come of THAT, not knowing what's going on? You've done us all a great service with this, and of course with RivaTuner. Thank you, John
 
This explains why there are reviews at Newegg that say the 9600 series will BSOD on them when the 8800 won't. And I think this might answer a lot of other questions *goes to EVGA forums for more info*

Plus, AlexUnwinder: you are a god. RivaTuner is a great program; it's way better than nTune, which IMO is crap.
 
My bad, I use different logins when posting here from home and from the office. And I do not mean RivaTuner, I mean publishing technical details about hardware internals, like this G94 clocking issue. After discovering that the PCIe bus is used as the source of the 25MHz reference clock, I discussed it with the Guru3D staff and we decided to post the investigation results just in our forum, and w1zzard decided to create a review about it. And even with his ideal, newbie-friendly review, after seeing the reaction of the community, after seeing how many people misunderstand it, treat it wrongly or start seeing conspiracy theories in it, I'm more and more certain that such info shouldn't be provided to the public. The fewer people know about it, the less headache the developers have.

I completely understand where you're coming from on this, and looking back over the last x number of pages in this thread, it entirely validates and supports your reasoning without a doubt. But I also feel that the public has a right to know, to an extent. There are those that understand the technical aspects of it and want to know, versus the "general public"... and it's typically the "general public" that f* things up for everyone in all walks of life :shadedshu

TBH, though, I'm entirely appreciative of the info. As much as I'm ATI loyal, and have been for years, I try to stay as up to date as possible on what nVidia is up to. Even as a red camp loyalist, I'll be the first to admit ATI has been beaten senseless over the last few years, and my claims to loyalty don't stop me from recommending nVidia hardware now and then, either.
 
My bad, I use different logins when posting here from home and from the office. And I do not mean RivaTuner, I mean publishing technical details about hardware internals, like this G94 clocking issue. After discovering that the PCIe bus is used as the source of the 25MHz reference clock, I discussed it with the Guru3D staff and we decided to post the investigation results just in our forum, and w1zzard decided to create a review about it. And even with his ideal, newbie-friendly review, after seeing the reaction of the community, after seeing how many people misunderstand it, treat it wrongly or start seeing conspiracy theories in it, I'm more and more certain that such info shouldn't be provided to the public. The fewer people know about it, the less headache the developers have.


The more we know, the better informed our purchase decisions are. The last people we can bank on are neutral reviewers, of which very few are left. Hence we need articles like this from W1z and you. Neutral technologists are a dying breed in a world of cash.
 
Putting faster crystals on electronic devices to boost performance is not a new practice.
Maybe there was a shortage of 25MHz crystals...
 
You don't get it. ^

Try again, and this time read it all.
 
After I read all of the statements above, I have a small question.

For the stock core clock of the 9600 GT (650 MHz): that clock is generated as PCIe/4 * 26. But according to this link http://www.bfgtech.com/bfgr96512gtoce.aspx I'm a little confused about how they can generate a 675 MHz GPU clock. Does BFG change the multiplier (26) of the clock generator?

Can anyone explain this?
 
Thanks for "pleasant" comments, mate. I guess that next time I'll simply never share results of my findings with anyone and simply won't allow such reviews to appear. It is damn sad to kill 2 weeks on investigating G94 PLL clock internals then to read such comments from ATI fans. Community doesn't deserve sharing technical details with it.

Don't let one idiot fanboy throw you off, and yes, he is an idiot fanboy; he has been on my ignore list for as long as I can remember because of his idiotic comments. Don't judge the whole community by one idiot's comments. It would be like judging the entire human race by one idiot racist's comments; it just isn't fair. There are those of us in the community who love to know the technical details behind what is going on. And it can prove to be helpful, too. We might see this information become useful later on down the road when people are having problems with their G94-based cards and can't figure out why they are unstable when they aren't overclocking, and it could come down to the fact that their PCI-E bus is running too fast and they didn't even realize that could be a problem.

After I read all of the statements above, I have a small question.

For the stock core clock of the 9600 GT (650 MHz): that clock is generated as PCIe/4 * 26. But according to this link http://www.bfgtech.com/bfgr96512gtoce.aspx I'm a little confused about how they can generate a 675 MHz GPU clock. Does BFG change the multiplier (26) of the clock generator?

Can anyone explain this?

That is essentially how all overclocking is achieved; you are just changing the integers that the reference clock is divided and multiplied by.

So to get 675MHz, the number 26 is changed to 27.
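(As a quick worked example, using the PCIe/4 relationship discussed earlier; the variable names and loop are just illustrative:)

```python
# Worked example of the multiplier change described above (illustrative values).
PCIE_CLOCK_MHZ = 100   # standard PCIe clock
DIVIDER = 4            # PCIe clock / 4 gives the ~25 MHz reference

for multiplier in (26, 27):
    core_mhz = PCIE_CLOCK_MHZ / DIVIDER * multiplier
    print(f"multiplier {multiplier}: {core_mhz:.0f} MHz")

# multiplier 26: 650 MHz  (stock 9600 GT)
# multiplier 27: 675 MHz  (the factory-overclocked BFG card in the link above)
```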
 
So to get 675MHz, the number 26 is changed to 27.
Does this mean that the integer (26) value can be edited in the BIOS?
 
Does this mean that the integer (26) value can be edited in the BIOS?

No. When you set the desired frequency in an overclocking utility, it changes the numbers the drivers use to set the frequency; you don't change the multipliers manually.
 
No. When you set the desired frequency in an overclocking utility, it changes the numbers the drivers use to set the frequency; you don't change the multipliers manually.

ok thanks a lot bro...
 
So, without having to read so many pages on this, I would just like to know: is there any way to "disable" this "feature", rerun it against the pack, and get some real, true performance scores for it?

I'd like to know how well it would stand on its own if they hadn't done this behind everyone's back to influence buyers and sell more units because they lied about the "stock" performance of the card.

This has me all upset and thinking back to the FX days, when they used a driver trick to get performance gains. If ATI knew about this and implemented something of their own, how far apart would the scores be then?

Please find a way to disable the feature and retest them all again.
 
Sounds like a shady trick to me. People will see a review done with LinkBoost tech without knowing about it, and then when they buy the card for their own system, which may not have a LinkBoost motherboard, the performance isn't there. Kind of cheating to win the benchmarks, I guess.
 
G92 overclocking

RivaTuner and increasing the PCI-E bus speed both work, but the best option in my case (with two eVGA-brand 8800 GTs) was to replace the video card BIOS on both cards with the BIOS from a higher-clocked model.

I turned two ordinary cards into "Super Super Clocked" models running at 700-1725-1000, with no problems and no other form of overclocking needed. (My cards were able to go a little higher from there with RivaTuner, but there wasn't much of a performance gain.)
 