# NVIDIA Forces EVGA to Pull EVBot Support from GTX 680 Classified



## btarunr (Oct 3, 2012)

According to an Overclockers.com report, NVIDIA forced EVGA to remove voltage control, more specifically support for its EVBot accessory, from its GeForce GTX 680 Classified graphics card. EVBot, apart from real-time monitoring, gives users the ability to fine-tune voltages, a feature NVIDIA doesn't want users to have access to. The design change was communicated by EVGA's Jacob Freeman, in response to a forum question from a user who found that his new GTX 680 Classified card lacked the EVBot header.

"Unfortunately newer 680 Classified cards will not come with the EVBot feature. If any questions or concerns please contact us directly so we can offer a solution," said Freeman. Hinting that NVIDIA is behind the design change, he said "Unfortunately we are not permitted to include this feature any longer," later adding "It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products, no voltage control is allowed, even via external device." To make matters worse, Freeman said that EVGA has no immediate plans to cut prices of the GTX 680 Classified. 





*View at TechPowerUp Main Site*


----------



## [H]@RD5TUFF (Oct 3, 2012)

The fuck... WHY!?!?!?!?


----------



## cdawall (Oct 3, 2012)

What a crock of shit :shadedshu


----------



## DannibusX (Oct 3, 2012)

Damn.

Sucks that you aren't allowed to modify something that you paid hundreds of dollars for.


----------



## hardcore_gamer (Oct 3, 2012)

DannibusX said:


> Sucks that you aren't allowed to modify something that you paid hundreds of dollars for.



Just like Apple


----------



## [H]@RD5TUFF (Oct 3, 2012)

Absolute horse shit, I expect a full refund for my card and EVBot!!!!


----------



## Maban (Oct 3, 2012)

Quick, somebody post a lolwut meme!


----------



## NeoXF (Oct 3, 2012)

XFX leaving nVidia-like fiasco in 3... 2... 1...


----------



## Maban (Oct 3, 2012)

I would think that EVGA is big enough to stand up to them.


----------



## TheLostSwede (Oct 3, 2012)

In related news, Asus wasn't allowed to make their dual GPU MARS card either, as Nvidia wouldn't let them...
So apparently the chip makers are now controlling what their "partners" are allowed to do, or not to do with the chips they buy from them.
This industry is clearly going down the toilet, as the manufacturers don't have the balls to stand up to the chip makers. :shadedshu


----------



## hardcore_gamer (Oct 3, 2012)

Maybe nVidia is thinking of releasing K series GPUs and charging us extra for tweakability, like Intel does.


----------



## eidairaman1 (Oct 3, 2012)

I see EVGA launching an AMD lineup that's fully balls to the wall


----------



## buggalugs (Oct 3, 2012)

Not surprised. Nvidia has a long history of strong-arm tactics.

In a way, I don't blame them though; if they're getting a lot of returned cards that have been killed by too much voltage, they have to do something. Too many guys like to play the overclocking game with no real understanding of what they're doing and bump the voltage to dangerous levels for 24/7 use.


----------



## eidairaman1 (Oct 3, 2012)

buggalugs said:


> Not surprised. Nvidia has a long history of strong-arm tactics.
> 
> In a way, I don't blame them though; if they're getting a lot of returned cards that have been killed by too much voltage, they have to do something. Too many guys like to play the overclocking game with no real understanding of what they're doing and bump the voltage to dangerous levels for 24/7 use.



same with flashing, but you see cards with dual bios on them now


----------



## HammerON (Oct 3, 2012)

Horse pucky


----------



## Sinzia (Oct 3, 2012)

Do not like.


----------



## buggalugs (Oct 3, 2012)

eidairaman1 said:


> same with flashing, but you see cards with dual bios on them now



Ya, but at least with flashing it can be repaired, a GPU killed with voltage is well and truly dead.


----------



## eidairaman1 (Oct 3, 2012)

buggalugs said:


> Ya, but at least with flashing it can be repaired, a GPU killed with voltage is well and truly dead.



i know that. AMD seems to give more leeway now


----------



## SIGSEGV (Oct 3, 2012)

i smell something fishy


----------



## eidairaman1 (Oct 3, 2012)

SIGSEGV said:


> i smell something fishy



yup and its puke green


----------



## remixedcat (Oct 3, 2012)

the hell nvidia? why? makes me worry about other things....


----------



## HammerON (Oct 3, 2012)

Control freaks:shadedshu


----------



## Benetanegia (Oct 3, 2012)

TheLostSwede said:


> In related news, Asus wasn't allowed to make their dual GPU MARS card either, as Nvidia wouldn't let them...



Do you have a proof link or something? It seems to me that the only reason they didn't release it is that a GK104-based MARS is simply redundant, as the GTX 690 is already based on the full chip and clocked like the GTX 680. Afaik Asus told TPU that the card was never meant to be released:



> We received an update from ASUS, clarifying that this card will not be released. The design was only displayed during a factory tour, to show ASUS craftmanship.



http://www.techpowerup.com/171202/ASUS-ROG-MARS-III-Dual-GTX-680-PCB-Pictured.html


----------



## Animalpak (Oct 3, 2012)

buggalugs said:


> Not surprised. Nvidia has a long history of strong-arm tactics.
> 
> In a way, I don't blame them though; if they're getting a lot of returned cards that have been killed by too much voltage, they have to do something. Too many guys like to play the overclocking game with no real understanding of what they're doing and bump the voltage to dangerous levels for 24/7 use.



Exactly, and as for the warranties, they vary widely... further problems of a different nature would create a big mess for both the partners and Nvidia.


----------



## bogami (Oct 3, 2012)

WTF, no voltage control? First they throw a fat three-slot brick in your face, and now you shouldn't overclock with the voltage regulator's help! Yeah, so they can sell us overclocked cards at abnormal prices!


----------



## Benetanegia (Oct 3, 2012)

I don't think it's a big loss anyway. I lost interest in OCing a long time ago*. The reasons are that 1) it became too easy, 2) it is allowed, and 3) it involves fewer and fewer risks every generation.

That's not OCing. To me, OC only really made sense when it was something a select few people were capable or brave enough to perform; when it meant going against the norm, and when it meant a performance advantage for those few brave enough to perform a hard mod or use a BIOS coming from a forum member on an "obscure" website.

Now OC is widespread; it's allowed and encouraged by the vendors themselves, it's even covered by warranty, and you are given full control over every single thing related to OCing your PC. What's the point? Use the tools and you'll achieve the same results as the rest of the world (with slight differences based on luck). And if it all comes down to that, is this really such a loss? Not in my eyes, but maybe I'm alone.

* I still OC to safe 24/7 clocks, but I don't do it for fun or to find the limits, etc. Just simple, practical OCing now.


----------



## the54thvoid (Oct 3, 2012)

The Kepler design is pretty well tied to its strict clock/voltage manipulation to achieve the clock boosts it gets.  As the dynamic clocks are hardware-related, it would be very hard to safely allow voltage adjustments.

I think Nvidia have the card as fast as it can go on the setup it's on.  More volts may well fry the thing, and they don't want that bad PR (and I don't blame them).

Also, as Ben says, the 690 decimated any partner's chance of making a viable dual-680 card.

It's not like Apple though.  This Kepler voltage situation is being controlled to stop damage being done to hardware that's already pushing the boat out.

I say Nvidia are looking after us this time (well not me, I'm with red).


----------



## 20mmrain (Oct 3, 2012)

Alright, this sucks... time to switch back to AMD/ATI next round. There goes another $2,000 purchase you won't get from me, Nvidia (especially if this rule carries over to the next gen).

Also, for those who say it is not as dangerous anymore: maybe you are just not trying to overclock as high as you can, which is what makes it dangerous. Just because cards can reach higher clocks before it is considered dangerous doesn't mean it is not dangerous anymore.
As for the fail-safe features that protect a product from a bad overclock, I think that's a good feature, but like I said, there comes a point when you push a product so high that even fail-safe features won't protect you any longer.


----------



## tacosRcool (Oct 3, 2012)

That is so lame!


----------



## hardcore_gamer (Oct 3, 2012)

This is not "protecting". This is dumbing down. Overclockers have been around for a long time. Why didn't they introduce this "feature" back when people were damaging 6600 GTs by overclocking? I see this as a move to take away tweakability from the hardware.


----------



## Recus (Oct 3, 2012)

First: Geforce GTX 680 classified power consumption with overvoltage revealed, shocking.

Nvidia removed voltage control and it's shocking again. Looks like fanboys will never learn.

And for those "back to AMD":


----------



## Bjorn_Of_Iceland (Oct 3, 2012)

remixedcat said:


> the hell nvidia? why? makes me worry about other things....



because nvidia loves money.


----------



## hardcore_gamer (Oct 3, 2012)

Recus said:


> Looks like fanboys will never learn.



Yes, fanboys will never learn. They always try to justify a company even when it takes away an important feature from its users.

Enthusiasts and most people who value their money overclock their GPUs to get the maximum performance out of them. It is very important in high-end cards like the 680.

Intel took away the overclocking feature and people still pay a premium for the K series. But things are different here in the GPU arena, where both players are very close in performance.


----------



## BigMack70 (Oct 3, 2012)

The GTX 6xx series cards have always been a giant "F U" to consumers...

They couldn't deliver on GK110 and instead sold a $300 chip at a $500 price point, they complicated overclocking with their "GPU Boost" nonsense, and they screwed enthusiasts by not allowing voltage tuning on any of their cards.

I feel bad for uninformed folks who bought something like the Classified or the MSI Lightning 680.

I generally like Nvidia's cards, but the GTX 6xx series is a load of crap and a huge step backwards from the GTX 5xx series IMO.


----------



## Solaris17 (Oct 3, 2012)

Knowing EVGA though, in future releases they will probably do something like MSI and provide voltage read pads and pads you can attach trim pots to.

sneaky sneaky.


----------



## TheLostSwede (Oct 3, 2012)

Benetanegia said:


> Do you have a proof link or something? It seems to me that the only reason they didn't release it is that a GK104 based MARS is simply redundant as GTX 690 is already based on the full chip and clocked like the GTX680. Afaik Asus told TPU that the card was never meant to release:
> 
> 
> 
> http://www.techpowerup.com/171202/ASUS-ROG-MARS-III-Dual-GTX-680-PCB-Pictured.html




Not as such, but I spoke to a friend of mine at Asus, at MOA of all things, who told me this. I'm pretty sure he knows what he's talking about, but of course the official excuse would be something that doesn't make Nvidia look bad...


----------



## 3870x2 (Oct 3, 2012)

There might be a reason other than money and control that NV is doing this.


----------



## [H]@RD5TUFF (Oct 3, 2012)

This really pisses me off, but it's not like AMD is a real alternative with the shittiest drivers in history.


----------



## Easy Rhino (Oct 3, 2012)

TheLostSwede said:


> In related news, Asus wasn't allowed to make their dual GPU MARS card either, as Nvidia wouldn't let them...
> So apparently the chip makers are now controlling what their "partners" are allowed to do, or not to do with the chips they buy from them.



this has always been the case.


----------



## hv43082 (Oct 3, 2012)

Wow the pro AMD people are all over this thread like flies over poop!


----------



## dj-electric (Oct 3, 2012)

Because of Nvidia, my Lightning 680s can't overvolt past 1.175v

*This is Bullshit*


----------



## alwayssts (Oct 3, 2012)

*You want unbiased?  I'll flame them all.  Ready? 

 Go.*



Benetanegia said:


> What's the point? Use the tools and you'll achieve the same results as the rest of the world (with slight differences based on luck).



You, sir, hit a raw nerve and said a lot of important things that people really need to understand.  As a guy that grew up with the ABIT boards that started the initial craze, it really does make me sad.  The reality, as you know and as many people don't seem to understand, is that overclocking is for all intents and purposes dead or dying... killed by the desire to spur upgrades.  Every major player is guilty.

Remember when AMD and nvidia, over the course of a couple of generations, started limiting TDP per product for segmentation and sold it as a feature?  Remember when they started doing set voltages and turbo modes (which in the case of the 7900, for example, actually makes it a WORSE overall product that they sell for more money)?  Remember when Intel essentially made a product that doesn't really work worth a damn above the typical best perf/clock ratio of ~1.175 volts (Ivy Bridge)?

As you imply by stating "everyone gets the same results", the diluted form of overclocking we have today is more of a product feature: it allows for lower and looser binning, skimps on the BOM of lower-end SKUs, and is built into an inflated price on upper-end ones... exactly the opposite of the initial market and purpose.  I was fine with the advent of software voltage control (which took advantage of the clock/voltage potential of a process with realistic cooling) versus soldering to change the properties of a resistor... less release of the magic smoke that way.  What pisses me off are things like strict conformance to the PCI-E (etc.) power specs, or even more bullshit, limiting TDPs below them.  Not doing the latter, for instance, saved GF104/110 from being a colossal failure and made the products quite appealing.  If you want proof, look at the stock TDP and power consumption of those products, how many PCI-E connectors they had, how well they clocked... and their power consumption after overclocking.  Now that Kepler doesn't suck... gone.

Another example, 

AMD knows that if you buy a 7870, you are likely going to run it at ~1200+ MHz... the max the chip will do, limited by process tech, not TDP.
They also know you'd like to buy a 7850 and get 7870 performance, so they institute TDP restrictions (130 W) and BIOS locks (1050 MHz) to limit the average clock to under the former's stock performance (7850 @ 1075 MHz ~ 7870 @ 1 GHz). That way the bottom-line price difference is preserved while the upper SKU still appears worthy at stock.

Even worse is when you can smell WHY they do these things on the grand scale.  In this case it's obvious the 8000 series will have 150 W / 225 W max TDP parts (the 7850 is 130 W, the 7950 200 W... 10% less used logic and ~20% less TDP than the 70 SKUs...).

Why would anyone buy a 150 W part that outperforms a stock 7870 by 10% if the 7850 already could a year earlier for the same price?  They wouldn't.  Now they can sell it as a huge efficiency improvement (more shaders than the 7870 at a lower initial clock, with the overclocking potential granting a net gain on paper but not in actual power consumption, beyond the scant extra transistors for the CU difference), which again... is bullshit.  While GK104 is obviously well designed, no doubt its TDP, just like the 7950's and 8950's, is set so that the GTX 770 will look justifiably better at 225 W, again with more units at a lower sweet clock/voltage spot.

In short...it's all a big fucking ruse.  Any average Joe that thinks they are an overclocker anymore is either lying to themselves or extremely delusional.


----------



## cadaveca (Oct 3, 2012)

alwayssts said:


> In short...it's all a big fucking ruse. Any average Joe that thinks they are an overclocker anymore is either lying to themselves or extremely delusional.



Really, I think the complaining about it is just as stupid though, since "overclockers" wanted OCing to go mainstream and pushed the marketing reps, and then it happened. And now that it's happened, everyone wants things back the way they were, since companies have taken steps to ensure that offering these features doesn't bankrupt them.


So, ya got what ya wanted, but didn't consider the consequences....



Now, I am not directing this at anyone specifically; this is just my general feeling. But then, when I've tried to tell people how to OC to get a bit extra, it turned out they were doing something COMPLETELY different.


OC is not dead. It's just more cleverly hidden. If you don't own a soldering iron and you think you're an overclocker, you're sorely mistaken. You still need that iron.

Perfect example: all the claims of IVB running hot... no, actually it doesn't. You just failed to give it proper cooling.


----------



## newtekie1 (Oct 3, 2012)

Well, after the last generation with idiots pushing 1.2v+ through 4-phase cards, then bashing nVidia when the VRMs popped, I can see why nVidia wants to limit voltage control now...

Obviously the strategy of allowing people free voltage control, even if maxing the slider out will likely kill the card, didn't work, because idiots will just max the slider out and then bitch when the card pops.  So now we have nVidia limiting voltages again. Thank the idiots that overvolted their cards too high and then bitched.


----------



## linoliveira (Oct 3, 2012)

The cards are made to run at stock voltage and frequencies, so if you push 0.001v and 1MHz more, you should void the warranty of the product and not get a replacement straight away. Am I wrong?
I've never returned a card for OCing it because I never overvolt badly, but this is my understanding of it.
If I am correct, why bother blocking voltage tweaking? Just let people fry their cards and sell them more!


----------



## MxPhenom 216 (Oct 3, 2012)

newtekie1 said:


> *Well, after the last generation with idiots pushing 1.2v+ through 4-phase cards, then bashing nVidia when the VRMs popped, I can see why nVidia wants to limit voltage control now...*
> Obviously the strategy of allowing people free voltage control, even if maxing the slider out will likely kill the card, didn't work, because idiots will just max the slider out and then bitch when the card pops.  So now we have nVidia limiting voltages again. Thank the idiots that overvolted their cards too high and then bitched.



I was just about to say that....


----------



## Jstn7477 (Oct 3, 2012)

newtekie1 said:


> Well, after the last generation with idiots pushing 1.2v+ through 4-phase cards, then bashing nVidia when the VRMs popped, I can see why nVidia wants to limit voltage control now...
> 
> Obviously the strategy of allowing people free voltage control, even if maxing the slider out will likely kill the card, didn't work, because idiots will just max the slider out and then bitch when the card pops.  So now we have nVidia limiting voltages again. Thank the idiots that overvolted their cards too high and then bitched.



I must say the same about all the people still wanting to unlock HD 6950s long after their release, as if the cards now are exactly the same as the ones that originally unlocked. I'm tired of people bricking their cards without even trying to back up the original BIOS, then coming on here and crying because they think upgrading a video card's BIOS is always necessary and always yields hidden performance improvements or something.


----------



## TheMailMan78 (Oct 3, 2012)

3870x2 said:


> There might be a reason other than money and control that NV is doing this.



Everyone is always quick to bash without looking at the reasons why someone did something. I would like to know the reasoning behind the removal instead of assuming it was out of "greed"... and even if it was greed, NVIDIA is well within its rights. They designed the chip. They make the drivers. It's their property. EVGA is just a mfg plant making money off NVIDIA's coattails.

But again we don't know WHY NVIDIA asked EVGA to remove it. Might be for safety reasons. Who knows.


----------



## Easy Rhino (Oct 3, 2012)

TheMailMan78 said:


> Everyone is always quick to bash without looking at the reasons why someone did something. I would like to know the reasoning behind the removal instead of assuming it was out of "greed"... and even if it was greed, NVIDIA is well within its rights. They designed the card. They make the drivers. It's their property. EVGA is just a mfg plant making money off NVIDIA's coattails.
> 
> But again we don't know WHY NVIDIA asked EVGA to remove it. Might be for safety reasons. Who knows.



i see a lot of people (not just on TPU) criticize business decisions by amd,intel,nvidia, whomever. 

most of these people sit around and contribute NOTHING to society. what gives them the right to be critical of anybody?? boggles my mind...


----------



## TheMailMan78 (Oct 3, 2012)

Easy Rhino said:


> i see a lot of people (not just on TPU) criticize business decisions by amd,intel,nvidia, whomever.
> 
> most of these people sit around and contribute NOTHING to society. what gives them the right to be critical of anybody?? boggles my mind...



Common sense is not so common anymore.


----------



## Easy Rhino (Oct 3, 2012)

TheMailMan78 said:


> Common sense is not so common anymore.



i don't even think it has to do with common sense. these people just consume. they constantly consume and judge other people's work without contributing anything themselves.


----------



## cadaveca (Oct 3, 2012)

Easy Rhino said:


> i don't even think it has to do with common sense. these people just consume. they constantly consume and judge other people's work without contributing anything themselves.



Meh. I grew up in a church, so this applies:


Judge not, lest ye be judged.


Which isn't as wholesome as most think... it's telling you to turn a blind eye, rather than to not judge.






Someone needs to tell people "NO!", clearly; if someone did, things like the economy wouldn't be as bad as they are. People should be MORE critical.

Personally, I think nVidia is doing the right thing here. It might not be immediately evident why, but I think they have a plan.

I just hope they follow through on it.


----------



## ironwolf (Oct 3, 2012)

Maban said:


> Quick, somebody post a lolwut meme!



I concur!


----------



## brandonwh64 (Oct 3, 2012)

Maybe, like Dave said, there has come a time where OCing is mainstream because people wanted it to be, and Nvidia is just trying to save the customer, and themselves factory time and headaches, from people that "think" they can OC anything when in turn they just burn the card to the ground.

If you need that much voltage to OC a card, then learn some entry-level electronic engineering and build a vmod like it was done in the old days.


----------



## radrok (Oct 3, 2012)

I personally think Nvidia did the right thing by locking voltage on reference boards, because the VR circuitry is meant to operate more or less at stock, and excessive stress could be fatal.

On the other hand, I do not agree with their choice to impose a lock on partners' custom boards, which are engineered to withstand higher voltages.

To solve this issue, AIBs should use a BIOS switch on custom boards which should only be accessible after removing something like a smartly positioned sticker that can't be reapplied without it being noticed.
You flip that switch? Well, you can have your unlocked voltage at the cost of your entire warranty period.
You want to keep your warranty? Buy reference, or don't touch that sticker-protected switch.

There you have your solution.


----------



## TheMailMan78 (Oct 3, 2012)

brandonwh64 said:


> Maybe like dave said, it has come a time were OCing is mainstream cause people wanted it to be and Nvidia is just trying to save the customer and factory time and headaches with people that "Think" they can OC anything when in turn the just burn the card to the ground.
> 
> If you need that much voltage to OC a card then learn some entry level electronic engineering and build a Vmod like it was done in the old days.



I can also see where OC would be a lot of fun in the older days when the gains were so noticeable. Today with all the ports and the hardware so far ahead of the software I don't see a point in OC anymore. But I can appreciate people will still wanna do it. I think a true overclocker today doesn't use "autotune" software or push button OC. They still do it old school via bios tweaks and hard volt mods. 

Hell I remember trying to do the pencil mod when I first started......then I realized I didn't know WTF I was doing and stopped. I think NVIDIA and the like are just protecting themselves from guys like me that don't come to the realization that "I dunno WTF I am doing" BEFORE they blow the hardware and have to RMA it.


----------



## erocker (Oct 3, 2012)

There's still 3rd party support for overvolting your Nvidia cards that isn't dictated by Nvidia. However, the 680 Classified is a product that is aimed towards overclocking/overvolting and is not a reference design so I have no idea why Nvidia stepped in.


----------



## TheMailMan78 (Oct 3, 2012)

erocker said:


> so I have no idea why Nvidia stepped in.



Just to troll you I think.


----------



## cadaveca (Oct 3, 2012)

erocker said:


> is not a reference design so I have no idea why Nvidia stepped in



Maybe they want to release their own reference OC platform?


I mean, after all, everybody's doing it...


----------



## erocker (Oct 3, 2012)

TheMailMan78 said:


> Just to troll you I think.



Just the answer I'm looking for. Thanks. 

You're wrong though as I don't really care and it doesn't affect me.


----------



## radrok (Oct 3, 2012)

erocker said:


> There's still 3rd party support for overvolting your Nvidia cards that isn't dictated by Nvidia. However, the 680 Classified is a product that is aimed towards overclocking/overvolting and is not a reference design so I have no idea why Nvidia stepped in.



Kepler is a clocking monster. I would not be surprised if, among Nvidia's reasons to lock voltages, there's a "performance gain" to be avoided.


----------



## erocker (Oct 3, 2012)

radrok said:


> Kepler is a clocking monster. I would not be surprised if, among Nvidia's reasons to lock voltages, there's a "performance gain" to be avoided.



I can agree, however these cards were advertised to do these things. People who bought these cards with this feature in mind have now been stolen from. Regardless, as I said there's still 3rd party software to do it. So really, it isn't an issue.


----------



## brandonwh64 (Oct 3, 2012)

erocker said:


> Just the answer I'm looking for. Thanks.



I seriously just spit sweet tea all over my work screen.

My theory is that the number of RMAs on this card has jumped quite high and Nvidia stepped in to intervene?


----------



## TheMailMan78 (Oct 3, 2012)

erocker said:


> Just the answer I'm looking for. Thanks.
> 
> You're wrong though as I don't really care and it doesn't affect me.



Always here to help troll.  *edited by erocker


----------



## erocker (Oct 3, 2012)

brandonwh64 said:


> I seriously just spit sweet tea all over my work screen.
> 
> My theory is that the number of RMA's on this card have jump quite high and Nvidia stepped in to intervene?



Maybe, but it still really isn't right. 

Example:

Here, buy this item. (It has feature x, feature y, feature z) 

It sounds good, you buy the item.

After purchase they take away "feature z"

Did you get what you paid for?


----------



## brandonwh64 (Oct 3, 2012)

erocker said:


> Maybe, but it still really isn't right.
> 
> Example:
> 
> ...



Yeah, I hear ya. If I paid that amount of money to do what I wanted with it (within its limits) and they took away a feature that was a selling point, I would be sending the card back for a full refund.


----------



## EarthDog (Oct 3, 2012)

Two other vendors we work with (I'm an editor at Overclockers.com, FYI) also had trouble getting Nvidia's blessing for more voltage and increased power limits (a top-end 680 from one AIB, and a 660 from another).



erocker said:


> There's still 3rd party support for overvolting your Nvidia cards that isn't dictated by Nvidia.


Yes and no...

They are all still limited to 1.175v, or 1.21v with a modded BIOS. The point is going PAST what other cards can do, since they all hit 1.175v.


----------



## Easy Rhino (Oct 3, 2012)

if somebody has this evga card, can't they just use another party's voltage tool to overclock? also, can't we just ask w1zzard to solve this issue for us?


----------



## EarthDog (Oct 3, 2012)

Technically you can add the part yourself... and use the EVBot.

http://www.digikey.com/product-detail/en/87230-3/A26593-ND/353085


(Thanks to Bobnova at ocf for that link/information)


----------



## brandonwh64 (Oct 3, 2012)

Easy Rhino said:


> if somebody has this evga card, can't they just use another party's voltage tool to overclock? also, can't we just ask w1zzard to solve this issue for us?



I don't think W1zzard would risk some of his ties with Nvidia to make a tool that would allow you to go over that voltage.

Just thinking out loud here.


----------



## TheMailMan78 (Oct 3, 2012)

brandonwh64 said:


> I don't think wizzard would risk some of his ties with nvidia to make a tool that would allow you to go over that said voltage.
> 
> Just thinking out loud here.



I would stop thinking out loud if I were you. It hurts peoples ears. 

Doesnt Trixx do this already?


----------



## brandonwh64 (Oct 3, 2012)

TheMailMan78 said:


> I would stop thinking out loud if I were you. It hurts peoples ears.
> 
> Doesnt Trixx do this already?



No, it only goes to 1.175V, or 1.2V if you're lucky.


----------



## PatoRodrigues (Oct 3, 2012)

Want some easy overvoltage and take some risks?

Buy the MSI GTX670 Power Edition. haha.... 9.3 volts of FUN. 



Just kidding. A bizarre error from MSI (or Richtek) though.


----------



## TheMailMan78 (Oct 3, 2012)

brandonwh64 said:


> No it only goes to 1.175V or if lucky 1.2V



Ah ok.....nevermind then (scurries off to a corner)


----------



## sneekypeet (Oct 3, 2012)

EarthDog said:


> Technically you can add the part yourself... and use the evbot.
> 
> http://www.digikey.com/product-detail/en/87230-3/A26593-ND/353085
> 
> ...



This is what I was thinking, since they left the placement (at least in the image). So EVGA didn't lock out the EVBot's functionality, they just removed the link for connectivity?


----------



## EarthDog (Oct 3, 2012)

Not sure... it's wired for it, and the EVBot works on its own, not off the card (it has its own FW, not associated with the GPU; you update the EVBot's FW via the motherboard, not the GPU).

I would GUESS, yes.


----------



## radrok (Oct 3, 2012)

erocker said:


> I can agree, however these cards were advertised to do these things. People who bought these cards with this feature in mind have now been stolen from. Regardless, as I said there's still 3rd party software to do it. So really, it isn't an issue.



You mean the modified bios?

Also it is partly the AIBs' fault, because they promised what they couldn't deliver.


----------



## Benetanegia (Oct 3, 2012)

http://www.brightsideofnews.com/new...proving-quality-or-strangling-innovation.aspx



> We support overvoltaging up to a limit on our products, but have a maximum reliability spec that is intended to protect the life of the product. We don’t want to see customers disappointed when their card dies in a year or two because the voltage was raised too high.
> 
> Regarding overvoltaging above our max spec, we offer AICs two choices:
> 
> ...





> Yes, you’ve seen some cases of boards getting out into the market with OV features only to have them disabled later. This is due to the fact that AICs decided later that they would prefer to have a warranty. This is simply a choice the AICs each need to make for themselves. How, or when they make this decision, is entirely up to them.
> 
> With regards to your MSI comment below, we gave MSI the same choice I referenced above -- change their SW to disable OV above our reliability limit or not obtain a warranty. They simply chose to change their software in lieu of the warranty. Their choice. It is not ours to make, and we don’t influence them one way or the other.



I don't know if this is 100% true, but if it is, it looks very reasonable to me. It paints quite a different picture than what has been, for the most part, assumed in this thread and other forums. For instance, Nvidia wouldn't be forcing anything. I do understand that maybe before now they'd get the warranty from Nvidia no matter what, but personally I see no reason for Nvidia to offer a warranty if their specs are not met. From the manufacturer's perspective, not getting the warranty might be scary enough that it really forces them into the only option, but it's not like Nvidia is aiming a gun at their heads.

I think that when Jacob Freeman said "It was removed in order to *100% comply* with NVIDIA guidelines for selling GeForce GTX products" he might actually be saying "If we want to get the warranty...".


----------



## Hilux SSRG (Oct 3, 2012)

Sucks for people who bought the "Classified" model card from EVGA who were interested in its feature set and were willing to pay top dollar.


----------



## radrok (Oct 3, 2012)

Hilux SSRG said:


> Sucks for people who bought the "Classified" model card from EVGA who were interested in its feature set and were willing to pay top dollar.



Indeed, the only way to get voltage control now is to flash a 680 Lightning with the old BIOS or to hotwire it with an Asus card and motherboard.


----------



## m1dg3t (Oct 3, 2012)

Hate to say I told ya so...


----------



## HumanSmoke (Oct 3, 2012)

Benetanegia said:


> I don't know if this is 100% true...


Sounds reasonable. MSI look like they've been playing fast and loose with the specification, and there is no way that the anti-Nvidia crowd or the casual tech reader/Googling card upgrader is going to parse the information solely as being about MSI. The inference will be, especially on forum threads, that Nvidia is producing an inferior/unstable product. Once the affected users and trolls start bombing the Newegg reviews and the like, you'll end up with a PR problem out of proportion to the situation. Remember when EVGA recalled a batch of GTX 680 SuperClocked cards? It took about five minutes before some forums managed to spin that into _"the GTX 680 as a design is borked"_.


----------



## Benetanegia (Oct 3, 2012)

Reading about this topic on another forum I came across this:

http://www.tomshardware.co.uk/MSI-GTX-660-670-overvolting-PowerEdition,news-40278.html

edit: lol, it's the same one posted by HumanSmoke (kinda)



> A small component completely superfluous to the normal circuit in one of the ground connections causes major overvoltage in the PWM chip in question – instead of the 5 volts specified by Richtek, the chip is hit with up to 9.3 volts.





> We find it hard to believe in a design accident here, since the circuit in question is a standard design – if implemented correctly.



I think there's much more to the voltage story than we can know by just looking at a single source.

Right now, to me the whole thing just looks like the kids have been misbehaving, toying with things they shouldn't have, and now daddy is angry.


----------



## eidairaman1 (Oct 3, 2012)

alwayssts said:


> In short...it's all a big fucking ruse. Any average Joe that thinks they are an overclocker anymore is either lying to themselves or extremely delusional.



That right there I agree with; people taking an Intel SB or IVB up to 4.5 GHz isn't anything special anymore.


----------



## MxPhenom 216 (Oct 4, 2012)

Like Erocker said, this is bullshit for people who bought either the MSI Lightning or EVGA Classified card with voltage tweaking in mind. But on the other hand, I can see why Nvidia is doing this: so that people who buy these cards thinking they know what they're doing won't throw voltages at the cards and grenade them, like so many GTX 570s did, which is why the power limit and voltage limit were on those cards in the first place.


----------



## eidairaman1 (Oct 4, 2012)

MxPhenom 216 said:


> Like Erocker said, this is bullshit for people who bought either the MSI Lightning or EVGA Classified card with voltage tweaking in mind. But on the other hand, I can see why Nvidia is doing this: so that people who buy these cards thinking they know what they're doing won't throw voltages at the cards and grenade them, like so many GTX 570s did, which is why the power limit and voltage limit were on those cards in the first place.



Well, a price slash should be due then, 'cause technically you're paying for their software too.


----------



## MxPhenom 216 (Oct 4, 2012)

TheMailMan78 said:


> Common sense is not so common anymore.


----------



## radrok (Oct 4, 2012)

Found an interesting read about the matter on another forum

http://www.brightsideofnews.com/new...proving-quality-or-strangling-innovation.aspx


----------



## m1dg3t (Oct 4, 2012)

I'm going to start patenting the shit I say! It's starting to become contagious now, lol. Imitation is the greatest form of flattery, they say.

Thank you all! You know who you are


----------



## eidairaman1 (Oct 4, 2012)

MxPhenom 216 said:


> http://blog.tmcnet.com/on-rads-rada...mmon_sense_super_power-thumb-300x412-8243.jpg



Ow my kidney


----------



## HossHuge (Oct 4, 2012)

brandonwh64 said:


> No it only goes to 1.175V or if lucky 1.2V



I'm not sure what version of Trixx you're using, brandonwh64, but every one I've ever used goes to 1.3 V.


----------



## xBruce88x (Oct 4, 2012)

Don't worry guys... Freeman will just whack nVidia with his crowbar and save the day!

seriously though... this is a bad move on nVidia's part.


----------



## EarthDog (Oct 4, 2012)

HossHuge said:


> I'm not sure what version of Trixx you're using brandonwh64, but every one I've ever used goes to 1.3.
> 
> http://img.techpowerup.org/121003/voltage control.jpg


This is an Nvidia card, not an AMD card...


----------



## brandonwh64 (Oct 4, 2012)

EarthDog said:


> This is an Nvidia card not an AMD card...



I was about to say the EXACT same thing. Hoss, Nvidia's BIOS limits are set to 1.175 V or SOMETIMES 1.2 V, but the EVBot let you go above that.


----------



## TheoneandonlyMrK (Oct 4, 2012)

Benetanegia said:


> I don't know if this is 100% true, but if it is, it looks very reasonable to me. It paints quite a different picture than what has been, for the most part, assumed in this thread and other forums. For instance, Nvidia wouldn't be forcing anything. I do understand that maybe before now they'd get the warranty from Nvidia no matter what, but personally I see no reason for Nvidia to offer a warranty if their specs are not met. From the manufacturer's perspective, not getting the warranty might be scary enough that it really forces them into the only option, but it's not like Nvidia is aiming a gun at their heads.
> 
> I think that when Jacob Freeman said "It was removed in order to 100% comply with NVIDIA guidelines for selling GeForce GTX products" he might actually be saying "If we want to get the warranty...".



Not very fair or reasonable IMHO (by Nvidia or EVGA); customers should at least be repaid the premium they paid on these cards.


----------



## Benetanegia (Oct 4, 2012)

theoneandonlymrk said:


> Not very fair or reasonable IMHO (by Nvidia or EVGA); customers should at least be repaid the premium they paid on these cards.



I mean it's reasonable for Nvidia to impose those conditions. Either you respect the specs or you lose the warranty. If you buy a phone and use it as a frisbee and it breaks, you don't expect the warranty to cover it, do you?

MSI pushed a part spec'd for a maximum operating voltage of 5 V way up to 8 V; why is Nvidia supposed to cover that graphics card with a warranty?

EVGA is not as bad, or maybe it is, since they are allowing users (even idiots) to use whatever voltage they want, way higher than the maximum safe voltage specified by Nvidia. It's EVGA who has to take responsibility if it breaks, not Nvidia. That's why specifications are set in the first place.

Nvidia just said, "we quit covering parts that do not meet our specifications; if you want to go beyond that, you are on your own", and MSI and EVGA simply preferred to keep the warranty. It's all their fault.

EDIT: Of course, EVGA should keep the feature and suck it up if they get a very high return rate. The problem is they probably know full well they'd end up taking big losses. All the related stories make me think that Nvidia was covering those faulty chips until now, which is unthinkable in any industry that I know of. Anywhere else, if you push something beyond its specifications, you're SOL.


----------



## Xzibit (Oct 4, 2012)

I think Nvidia has every right to do product control.

The way it's doing it, or at least the way the info is being reported, is bad.

If Project Greenlight was established as product quality control for AIB designs, common sense would make you believe all designs go through Nvidia. So they had to approve the design at some point, as per the BSN report.

So what changed?

Did Nvidia strong-arm MSI and EVGA to change those designs or suffer?

The reasonable thing to do, if the designs were approved by Nvidia at some point, is just to phase them out. They were advertised and sold as such, and if approved, Nvidia and the AIB should absorb the RMAs; Nvidia shouldn't strong-arm anyone into dropping "card X" or force an AIB to risk the warranties on all its other cards.


----------



## Benetanegia (Oct 4, 2012)

Xzibit said:


> or AIB to risk warranties on all others cards.



From what I read they don't risk losing warranties on other cards, only the ones affected.

And the cards are still approved. Greenlight seems to be about what AIBs can do, not about what is covered by Nvidia's own warranty. They are, and have always been, free to create such products; they still are, but Nvidia doesn't cover the warranty. And they shouldn't. Why should they cover warranties on products that exceed their safety limits? Overvolting is allowed and covered by warranty up to a point; extreme overvolting is not covered. Makes 100% sense.

What doesn't make any sense is the implication that before now, Nvidia actually covered designs that went beyond their specifications and limits. At least to me, that's completely unheard of in any industry, and if AMD is doing that too, they'd better change that model, since it's not reasonable, nor fair for them. Nvidia has typically offered more extreme OC cards, though, and maybe that's why.


----------



## EarthDog (Oct 4, 2012)

Then perhaps it's time for the AIBs to step up and drop the crutches... I don't mind paying more of a premium for such cards (being an 'extreme' overclocker). I can see why most wouldn't want to pay for that, but most who know anything wouldn't be buying such a card for 'ambient' overclocking in the first place. There will always be people 'not in the know' who buy a Lightning or a Matrix just for ambient use, so AIBs get the best of both worlds that way.


----------



## HossHuge (Oct 5, 2012)

TheMailMan78 said:


> I would stop thinking out loud if I were you. It hurts people's ears.
> 
> Doesn't Trixx do this already?





brandonwh64 said:


> I was about to say the EXACT same thing. Hoss, Nvidia's BIOS limits are set to 1.175 V or SOMETIMES 1.2 V, but the EVBot let you go above that.



MM mentioned Trixx, so that's what I thought you were talking about.


----------

