Wednesday, October 3rd 2012
NVIDIA Forces EVGA to Pull EVBot Support from GTX 680 Classified
According to an Overclockers.com report, NVIDIA forced EVGA to remove voltage control, specifically support for its EVBot accessory, from its GeForce GTX 680 Classified graphics card. EVBot, apart from real-time monitoring, gives users the ability to fine-tune voltages, a feature NVIDIA doesn't want users to have access to. The design change was communicated by EVGA's Jacob Freeman, in response to a forum question from a user who found his new GTX 680 Classified card to lack the EVBot header.
"Unfortunately newer 680 Classified cards will not come with the EVBot feature. If any questions or concerns please contact us directly so we can offer a solution," said Freeman. Hinting that NVIDIA is behind the design change, he said "Unfortunately we are not permitted to include this feature any longer," later adding "It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products, no voltage control is allowed, even via external device." To make matters worse, Freeman said that EVGA has no immediate plans to cut prices of the GTX 680 Classified.
Source:
Overclockers.com
"Unfortunately newer 680 Classified cards will not come with the EVBot feature. If any questions or concerns please contact us directly so we can offer a solution," said Freeman. Hinting that NVIDIA is behind the design change, he said "Unfortunately we are not permitted to include this feature any longer," later adding "It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products, no voltage control is allowed, even via external device." To make matters worse, Freeman said that EVGA has no immediate plans to cut prices of the GTX 680 Classified.
99 Comments on NVIDIA Forces EVGA to Pull EVBot Support from GTX 680 Classified
I think Nvidia has the card as fast as it can go on the setup it is on. More volts may well fry the thing, and they don't want that bad PR (and I don't blame them).
Also, as Ben says, the 690 decimated any partner's chance of making a viable dual-680 card.
It's not like Apple though. This Kepler voltage situation is being controlled to stop damage being done to hardware that's already pushing the boat out.
I say Nvidia are looking after us this time (well not me, I'm with red).
Also, for those who say it is not as dangerous anymore... well, maybe you are not trying to overclock as high as you can to make it dangerous. Just because cards are able to reach higher clocks before it is considered dangerous doesn't mean that it is not dangerous anymore.
As for the fail-safe features that protect a product from a bad overclock... I think this is a good feature, and like I said, there comes a point in time anyway when you push a product so high that even fail-safe features won't protect you any longer.
Nvidia removed voltage control and it's shocking again. Looks like fanboys will never learn.
And for those saying "back to AMD":
Enthusiasts, and most people who value their money, overclock their GPU to get the maximum performance out of it. It is very important in high-end cards like the 680.
Intel locked overclocking behind its K series and people still pay the premium for it. But things are different here in the GPU arena, where both players are very close in performance.
They couldn't deliver on GK110 and instead sold a $300 chip at a $500 price point, they complicated overclocking with their "GPU Boost" nonsense, and they screwed enthusiasts by not allowing voltage tuning on any of their cards.
I feel bad for uninformed folks who bought something like the Classified or the MSI Lightning 680.
I generally like Nvidia's cards, but the GTX 6xx series is a load of crap and a huge step backwards from the GTX 5xx series IMO.
sneaky sneaky.
This is Bullshit
Go. You, sir, hit a raw nerve and said a lot of important things that people really need to understand. As a guy that grew up with the ABIT boards that started the initial craze, it really does make me sad. The reality, as you know and as many people don't seem to understand, is that overclocking for all intents and purposes is dead or dying... killed by vendors wanting to spur upgrades. Every major player is guilty.
Remember when AMD and Nvidia, over the course of a couple of generations, started limiting TDP per product for segmentation and sold it as a feature? Remember when they started doing set voltages and turbo modes (which in the case of the 7900, for example, actually makes it a WORSE overall product that they sell for more money)? Remember when Intel essentially made a product that doesn't really work worth a damn above the typical best perf/voltage sweet spot of ~1.175 V (Ivy Bridge)?
As you infer by stating 'everyone gets the same results', the diluted form of overclocking we have today is more of a product feature: it allows lower and looser binning, lets them skimp on the BOM of lower-end SKUs, and is built into an inflated price on upper-end ones... exactly the opposite of the initial market and purpose. I was fine with the advent of software voltage control (which took advantage of the clock/voltage potential of a process with realistic cooling) versus soldering to change the properties of a resistor... less release of the magic smoke that way. What pisses me off are things like strict conformance to the PCI-E (etc.) power specs, or even more bullshit, limiting TDPs below them. Not doing the latter, for instance, saved GF104/GF110 from being a colossal failure and made those products quite appealing. If you want proof, look at the stock TDP and power consumption of those products, how many PCI-E connectors they had, how well they clocked... and their power consumption after overclocking. Now that Kepler doesn't suck... gone.
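To put rough numbers on the power-spec point: the PCI-E spec budgets 75 W from the slot, 75 W per 6-pin connector, and 150 W per 8-pin connector. A back-of-envelope sketch (Python, using approximate vendor-published reference TDPs; illustrative only, not figures from the post above) shows how much headroom each board has before the spec itself becomes the wall.

# Back-of-envelope: how close reference TDPs sit to the PCI-E power budget.
# Spec limits: 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin connector.
CONNECTOR_W = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_budget(aux_connectors):
    """Total spec power budget in watts: the slot plus the listed aux connectors."""
    return CONNECTOR_W["slot"] + sum(CONNECTOR_W[c] for c in aux_connectors)

# Approximate reference TDPs and connector configurations (vendor-published figures).
cards = {
    "GTX 460 1GB (GF104)": (["6-pin", "6-pin"], 160),
    "GTX 580 (GF110)":     (["6-pin", "8-pin"], 244),
    "HD 7850":             (["6-pin"],          130),
}

for card, (aux, tdp) in cards.items():
    budget = board_budget(aux)
    print(f"{card}: {tdp} W TDP vs {budget} W spec budget "
          f"({budget - tdp} W of headroom before the spec is the wall)")

On those rough numbers, the Fermi boards left tens of watts of spec headroom above their TDPs, while the 7850's 130 W cap sits only about 20 W under its single-connector budget, which is the kind of below-spec TDP limiting being complained about here.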
Another example,
AMD knows that if you buy a 7870, you are likely going to run it at ~1200+ MHz... the max the chip will do, limited by process tech, not TDP.
They also know you'd like to buy a 7850 and get 7870 performance, so they institute TDP restrictions (130 W) and BIOS locks (1050 MHz) to keep its average clock under the former's stock ability (a 7850 @ 1075 MHz ~ a 7870 @ 1 GHz). That way the bottom-line price difference is preserved while the upper SKU still appears worthy at stock.
Even worse is when you can smell WHY they do these things on the grand scale. In this case it's obvious the 8000 series will have 150 W / 225 W max-TDP parts (the 7850 is 130 W, the 7950 200 W... 10% less enabled logic and ~20% less TDP than the x70 SKUs...).
Why would anyone buy a 150 W part that outperforms a stock 7870 by 10% if a 7850 already could a year earlier for the same price? They wouldn't. Now they can sell it as a huge efficiency improvement (more shaders than a 7870 at a lower initial clock, with the overclocking potential granting a net gain on paper but no real change in power consumption beyond the scant extra transistors for the CU difference), which again... is bullshit. While GK104 is obviously well designed, no doubt its TDP, just like the 7950's and 8950's, is set so the GTX 770 will look justifiably better at 225 W, again with more units at a lower sweet clock/voltage spot.
In short...it's all a big fucking ruse. Any average Joe that thinks they are an overclocker anymore is either lying to themselves or extremely delusional.
So, ya got what ya wanted, but didn't consider the consequences....
Now, I am not directing this at anyone specifically; this is just my general feeling. But then, when I've tried to tell people how to OC to get a bit extra, they were doing something COMPLETELY different.
OC is not dead. It's just more cleverly hidden. If you don't own a soldering iron and you think you're an overclocker, you're sorely mistaken. You still need that iron.
Perfect example... all the claims of IVB running hot... no, actually it doesn't. You just failed to give it proper cooling. :laugh:
Obviously the strategy of allowing people free voltage control, even if maxing the slider out will likely kill the card, didn't work, because idiots will just max the slider out and then bitch when the card pops. So now we have nVidia limiting voltages again. Thank the idiots that overvolted their cards too high and then bitched.
I've never returned a card for OCing it, because I never overvolt badly, but this is my understanding of it.
If I am correct, why bother restricting voltage tweaking? Just let people fry their cards and put them back in the market for more! :laugh:
But again we don't know WHY NVIDIA asked EVGA to remove it. Might be for safety reasons. Who knows.
most of these people sit around and contribute NOTHING to society. what gives them the right to be critical of anybody?? boggles my mind...