
ASUS GeForce GTX 590 3 GB


Because it's a better-designed card.

That's what you think, and I'm not trying to change your mind, but I haven't heard of one dying at stock clocks. It trades blows with the 6990 very well; admittedly slightly slower on average, but shorter and quieter.

Obviously we have different definitions of what makes a gfx card fail or not.

580 = good/great! Wish I owned one!
590 = piece of shit. How many blown review cards does it take for people to get this?!

How would you feel if you bought a car and the first time you red-lined it the engine shot out and killed your dog? Would you deem that car a piece of shit?
 
Because it's a better-designed card.

AMEN

The UnderDOG (AMD) always has to fight for what it gets; if that means making a better-designed, longer-lasting, more diverse product lineup, then so be it. I would rather they take their time and have a better product than have $750 go up in flames - literally.
 
Because it's a better-designed card.

You have to keep in mind the large difference in performance and range; overvolting a monstrous dual-GPU card is completely different from overvolting a more power-efficient single-GPU card. The risks are there, and the risks move further up the ladder when you want to push the most out of an already beefy multi-GPU card.
 
Great review as always Wiz.

Sorry if this has been asked already, but next time could you please go with the 11.4 drivers for AMD? I realize that this time around it probably came in close to the time that you were reviewing the 590 and you couldn't get around to it, but I feel that the performance differences are probably significant enough to test with their newest driver for next time. Thanks.
 
It's not an opinion, it's fact. This card was rushed out the door to beat ATI. In the process they cut corners, and BOOM went the dynamite. It has a fundamental flaw in the design. Remember, W1zz's blown card is not an isolated incident. You just have to accept that the green team dropped the ball. They didn't even ship it with the right drivers. End of story.

No, the only corner they cut was in the BIOS in not locking down the voltage to lower levels.

You can't say that because the card can't handle 1.2v it is a shitty design. The fact is that nVidia designed the card to run at 0.94v, and it does that just fine. Raising the voltages beyond that puts it out of the area it was designed for.

And 1.2v certainly isn't a mild overvolt, not on a Fermi card. Remember, the maximum you could even go on the original Fermi cards was 1.087v (without modding the BIOS). So yes, on a Fermi 1.2v is a huge voltage bump.
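To put those numbers in perspective, here is a quick back-of-the-envelope check in Python, using the 0.94 V design voltage and the 1.087 V Fermi software cap mentioned above:

```python
# How far 1.2 V sits above the GTX 590's design point and the old Fermi cap.
stock_v = 0.94     # design voltage mentioned above
fermi_cap = 1.087  # software ceiling on the original Fermi cards
tested_v = 1.2     # voltage people have been pushing

bump_over_stock = (tested_v - stock_v) / stock_v * 100
bump_over_cap = (tested_v - fermi_cap) / fermi_cap * 100
print(f"1.2 V is {bump_over_stock:.1f}% above stock")      # ~27.7%
print(f"1.2 V is {bump_over_cap:.1f}% above the old cap")  # ~10.4%
```

A nearly 28% bump would be aggressive on a single-GPU card, let alone a dual-GPU one.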
 
You have to keep in mind the large difference in performance and range; overvolting a monstrous dual-GPU card is completely different from overvolting a more power-efficient single-GPU card. The risks are there, and the risks move further up the ladder when you want to push the most out of an already beefy multi-GPU card.

It doesn't have to be if more/better power delivery components are in the design of the card. Then again, add those things and price goes up. Nvidia has a rather expensive design with Fermi. That being said, it won't take much for a 3rd party to improve on the design a bit. It will obviously cost more.

No, the only corner they cut was in the BIOS in not locking down the voltage to lower levels.

You can't say that because the card can't handle 1.2v it is a shitty design. The fact is that nVidia designed the card to run at 0.94v, and it does that just fine. Raising the voltages beyond that puts it out of the area it was designed for.

And 1.2v certainly isn't a mild overvolt, not on a Fermi card. Remember, the maximum you could even go on the original Fermi cards was 1.087v (without modding the BIOS). So yes, on a Fermi 1.2v is a huge voltage bump.

100% correct. I think what people are saying, though, is that it could have been designed a little better. Enthusiasts like to push their enthusiast cards.
 
but next time could you please go with the 11.4 drivers for AMD

The latest AMD driver is Catalyst 11.2; I don't waste my time on betas, except for the reviewed product.

Both ATI and NVIDIA send out magical new beta drivers at the time their competition launches, and those changes may not even make it into the WHQL build.
 
100% correct. I think what people are saying, though, is that it could have been designed a little better. Enthusiasts like to push their enthusiast cards.

I agree, but enthusiasts also know what happens when you push enthusiast cards too far. Or rather they used to know, now it seems they just assume that since the voltage slider goes all the way to 11, that there is no problem with putting it there...:shadedshu

I'm an enthusiast, I have an enthusiast CPU, if I go in the BIOS of my motherboard right now I have the option to pump some stupidly high voltage through my CPU that would surely fry it. I also have the option to pump some stupidly high voltage through the RAM as well. If I decided to do that, and things started to pop, is it eVGA's fault? Should I blame Intel for having a "shitty" processor design that couldn't handle the voltage? Should I be upset at Corsair for having a "rushed" RAM product that pops under voltages completely out of spec from the RAM designed voltage? No. It is my fault for messing around with voltages, I knew the risks. So why is it suddenly different with GPUs? They give you the option to use that voltage, you are the one that is actually deciding to use the voltage. A real enthusiast knows the risk involved with messing with voltages, and a real enthusiast knows that they themselves are the only ones to blame for blowing something up from overclocking/overvolting.
 
I agree, but enthusiasts also know what happens when you push enthusiast cards too far. Or rather they used to know, now it seems they just assume that since the voltage slider goes all the way to 11, that there is no problem with putting it there...:shadedshu

So much this. People are messing with powers they don't understand!

I was actually surprised when w1z bumped it all the way to 1.2 at once.
 
So much this. People are messing with powers they don't understand!

I was actually surprised when w1z bumped it all the way to 1.2 at once.

Maybe because they advertised he could on the box?
 
Maybe because they advertised he could on the box?

Does it say how much he could increase it? I only see "Voltage Tweak!" and "Up to 50% faster clock speed!", which I don't think means you're supposed to increase the voltage by 20%.

Now I realize it IS a bad thing indeed, especially with that protection thing turned on, but I don't think it's as bad as everyone says either.
 
Testing voltage tuning on the MSI HD 6950 Twin Frozr III now...

So guys... where should I stop? After 15-25 mV like NVIDIA recommends? Or go as far as the slider lets me?
 
Testing voltage tuning on the MSI HD 6950 Twin Frozr III now...

So guys... where should I stop? After 15-25 mV like NVIDIA recommends? Or go as far as the slider lets me?

DO IT! Blow that bitch!

I agree, but enthusiasts also know what happens when you push enthusiast cards too far. Or rather they used to know, now it seems they just assume that since the voltage slider goes all the way to 11, that there is no problem with putting it there...:shadedshu

I'm an enthusiast, I have an enthusiast CPU, if I go in the BIOS of my motherboard right now I have the option to pump some stupidly high voltage through my CPU that would surely fry it. I also have the option to pump some stupidly high voltage through the RAM as well. If I decided to do that, and things started to pop, is it eVGA's fault? Should I blame Intel for having a "shitty" processor design that couldn't handle the voltage? Should I be upset at Corsair for having a "rushed" RAM product that pops under voltages completely out of spec from the RAM designed voltage? No. It is my fault for messing around with voltages, I knew the risks. So why is it suddenly different with GPUs? They give you the option to use that voltage, you are the one that is actually deciding to use the voltage. A real enthusiast knows the risk involved with messing with voltages, and a real enthusiast knows that they themselves are the only ones to blame for blowing something up from overclocking/overvolting.

See, the problem is W1zz knows what he's doing. He blew the card. He's also not alone. Other reviewers blew the card as well. It's junk. Just accept it. Relax and push out. It won't hurt as much.
 
Does it say how much he could increase it? I only see "Voltage Tweak!" and "Up to 50% faster clock speed!", which I don't think means you're supposed to increase the voltage by 20%.

Now I realize it IS a bad thing indeed, especially with that protection thing turned on, but I don't think it's as bad as everyone says either.

Plus, even at 1.0v W1z was able to get 815MHz out of the card (so something like a 30% overclock); that is a damn good clock speed. It would have been nice to see what happened at 1.05v or 1.1v; I bet 900MHz might have been possible, and still probably been safe from killing the card. And 900MHz on a 512-shader Fermi would be one hell of a beast...
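For what it's worth, the math on that 1.0 V result, as a Python sketch. The 607 MHz stock clock is the GTX 590's reference spec and isn't stated in this thread:

```python
# Overclock gained at only 1.0 V, per the figures in the post above.
stock_mhz = 607  # GTX 590 reference core clock (assumed, not from this thread)
oc_mhz = 815     # maximum stable clock reported at 1.0 V
gain = (oc_mhz - stock_mhz) / stock_mhz * 100
print(f"{gain:.0f}% over stock")  # prints "34% over stock"
```

So a bit over 34%, in the same ballpark as the ~30% estimate above.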

Testing voltage tuning on the MSI HD 6950 Twin Frozr III now...

So guys... where should I stop? After 15-25 mV like NVIDIA recommends? Or go as far as the slider lets me?

Go as far as you feel is safe. It will be different from card to card. Personally, I'm glad I have sites like yours that give me an idea of what is safe and sometimes what isn't.:D

Personally, ever since I heard about the other GF110 cards popping, I don't think I'd go over 1.1v on any Fermi card; that is just my safety maximum. I think having a feel for what is safe, and sometimes learning the hard way, is part of the enthusiast game.

See, the problem is W1zz knows what he's doing. He blew the card. He's also not alone. Other reviewers blew the card as well. It's junk. Just accept it. Relax and push out. It won't hurt as much.

Yes, he does know what he is doing, and I'm sure he knows the risks of it before he even did it. However, I'm sure he will also agree that just because the option is there, that doesn't mean everyone should use it, just like with the options to raise voltages on anything else in your system. Just because it blew because people are putting too much voltage through it, that doesn't make it junk. You go max out the voltage on your CPU with the stock cooler, and when things start to pop, I'll make you admit the CPU and motherboard were junk.

See, the real problem is that people have become way too complacent with GPU overclocking (and to an extent with CPU overclocking as well). It has become so easy that everyone seems to think there is nothing to it, and they don't really know what is going on. I remember when raising the voltage on a GPU required that you know how to solder, and there were some real risks involved. Now that a simple piece of software can be used, everyone seems to think the risks are gone. Well, this and the other GF110 cards show us the risks aren't gone.
 
Testing voltage tuning on the MSI HD 6950 Twin Frozr III now...

So guys... where should I stop? After 15-25 mV like NVIDIA recommends? Or go as far as the slider lets me?

It's an ATI GPU; what would NVIDIA know about what that can take safely?

I always thought ATI GPUs were better at taking extra millivolts than NVIDIA GPUs.
 
So guys... where should I stop? After 15-25 mV like NVIDIA recommends? Or go as far as the slider lets me?

Possibly start with smaller steps? You are after all the OC guru, so I am in no position to advise you on the dark art of overclocking.
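That step-wise approach could be sketched like this in Python. `apply_voltage` and `is_stable` are hypothetical placeholders standing in for the tuning tool and a stress/artifact test, not a real API; the point is just: small increments, stop at the first failure, fall back to the last known-good setting.

```python
def find_safe_voltage(start_mv, ceiling_mv, step_mv, apply_voltage, is_stable):
    """Raise the voltage in small steps and return the last setting
    that passed the stability test (never exceeding ceiling_mv)."""
    last_good = start_mv
    v = start_mv + step_mv
    while v <= ceiling_mv:
        apply_voltage(v)
        if not is_stable():
            break              # first failure: back off to last_good
        last_good = v
        v += step_mv
    apply_voltage(last_good)   # settle on the last known-good voltage
    return last_good

# Stand-in callbacks for illustration: pretend the card is stable up to 1150 mV.
applied = []
result = find_safe_voltage(1100, 1250, 25,
                           apply_voltage=applied.append,
                           is_stable=lambda: applied[-1] <= 1150)
print(result)  # 1150
```

The same loop works for 15 mV or 25 mV steps; the smaller the step, the closer you land to the real limit before something gives.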
 
Testing voltage tuning on the MSI HD 6950 Twin Frozr III now...

So guys... where should I stop? After 15-25 mV like NVIDIA recommends? Or go as far as the slider lets me?

MAX IT OUT!

GO BIG or send it to me and I will LOL
 
Testing voltage tuning on the MSI HD 6950 Twin Frozr III now...

So guys... where should I stop? After 15-25 mV like NVIDIA recommends? Or go as far as the slider lets me?

Go for all she's got!
I need warp speed now
 

Attachments

  • ENTERPRISE.jpg (5.3 KB)
Yes, he does know what he is doing, and I'm sure he knows the risks of it before he even did it. However, I'm sure he will also agree that just because the option is there, that doesn't mean everyone should use it, just like with the options to raise voltages on anything else in your system. Just because it blew because people are putting too much voltage through it, that doesn't make it junk. You go max out the voltage on your CPU with the stock cooler, and when things start to pop, I'll make you admit the CPU and motherboard were junk.

See, the real problem is that people have become way too complacent with GPU overclocking (and to an extent with CPU overclocking as well). It has become so easy that everyone seems to think there is nothing to it, and they don't really know what is going on. I remember when raising the voltage on a GPU required that you know how to solder, and there were some real risks involved. Now that a simple piece of software can be used, everyone seems to think the risks are gone. Well, this and the other GF110 cards show us the risks aren't gone.
Well, according to his review it wasn't extreme at all:

As a first step, I increased the voltage from 0.938 V default to 1.000 V; maximum stable clock was 815 MHz - faster than GTX 580! Moving on, I tried 1.2 V to see how much could be gained here, at default clocks and with NVIDIA's power limiter enabled. I went to heat up the card and then *boom*, a sound like popcorn cracking, the system turned off and a burnt electronics smell started to fill up the room. Card dead! Even with NVIDIA power limiter enabled. Now the pretty looking, backlit GeForce logo was blinking helplessly and the fan did not spin, both of which indicate an error with the card's 12V supply.
After talking to several other reviewers, this does not seem to be an isolated case, and many of them have killed their cards with similar testing, which is far from being an extreme test.
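One reason the jump to 1.2 V is harsher than it sounds: CMOS dynamic power scales roughly with voltage squared at fixed clocks, so the voltage bump alone balloons the load on the VRMs. A rough Python estimate using the voltages from the excerpt above; the 365 W board power figure is an assumption (NVIDIA's official spec, not quoted in this thread):

```python
# Dynamic power ~ f * V^2, so at fixed clocks power grows with V squared.
stock_v, tested_v = 0.938, 1.2  # voltages from the review excerpt above
power_ratio = (tested_v / stock_v) ** 2
print(f"~{(power_ratio - 1) * 100:.0f}% more power through the VRMs")  # ~64%
# Assuming the official 365 W board power at stock, the same clocks at 1.2 V
# would push the card toward roughly:
print(f"~{365 * power_ratio:.0f} W")  # ~597 W
```

And that ignores leakage, which also climbs with voltage, so the real figure would be even higher.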
 
What would be extreme then?
 
Being able to overclock it at that voltage?
 
Maybe because they advertised he could on the box?

No, it isn't.


See, the problem is W1zz knows what he's doing. He blew the card. He's also not alone. Other reviewers blew the card as well. It's junk. Just accept it. Relax and push out. It won't hurt as much.

Obviously he doesn't, at least in terms of the 590. He jacked the thing up to 1.2v without understanding what the card's limit is. NVIDIA clearly states the cards are not supposed to be run anywhere near that voltage.

You aren't even supposed to run a 580 at 1.2v; what made him think you could do that to two sandwiched together is baffling.

http://www.tweaktown.com/news/19192...590_why_some_have_gone_up_in_smoke/index.html

Calling a card junk because someone ran it well over voltage specification and blew it up is idiotic.
 
You aren't even supposed to run a 580 at 1.2v, what made him think you could do that to two sandwiched together is baffling.


It's not two GPUs sandwiched; they're side by side. I don't think NVIDIA has used the sandwich design since the 7950 GX2.
 