
ASUS GeForce GTX 590 3 GB

Really, you stuck the stock cooler on your processor, selected the maximum voltage available for everything in the BIOS, and then ran a stress testing program, and nothing blew up? I doubt it.

I don't doubt him; thermal throttling generally works on a CPU, so it doesn't blow up.
 
Really, you stuck the stock cooler on your processor, selected the maximum voltage available for everything in the BIOS, and then ran a stress testing program, and nothing blew up? I doubt it.

My friend ran a 16-hour stress test on his GTX 570 at around 1.15v and it blew the card up.
 
Wow, REALLY? You're the first to ever make my Ignore list, for extreme ignorance.
I'm with MM, this card is garbage... It's an Enthusiast card... EN-THOO-ZZ-EE-ASS-TAH!
It's not a mid-level card, it's not a workstation card, it's supposed to be HIGH END!!

GARBAGE!!!
If the 6990 did the same thing I'd call it garbage too!

Re-read what you quoted and try again. Talk about extreme ignorance.
 
Please stop the fighting, or there will be infractions, tears, and closed comments for this review.
 
No, it isn't.

Obviously he doesn't, at least in terms of the 590. He jacked the thing up to 1.2v without understanding what the card's limit is. Nvidia clearly states the cards are not supposed to be run anywhere near that voltage.

You aren't even supposed to run a 580 at 1.2v; what made him think you could do that to two GPUs sandwiched together is baffling.

http://www.tweaktown.com/news/19192...590_why_some_have_gone_up_in_smoke/index.html

Calling a card junk because someone ran it well over its voltage specification and blew it up is idiotic.

The thing is, Nvidia have overdraw protection in their cards. This is what failed: you should be able to set the voltage to anything and the card would protect itself. This was not the case, and this is the problem.
 
Please stop the fighting, or there will be infractions, tears, and closed comments for this review.

You might as well just close the thread.
 
The thing is, Nvidia have overdraw protection in their cards. This is what failed: you should be able to set the voltage to anything and the card would protect itself. This was not the case, and this is the problem.

Logic isn't a welcome guest upon a fanboy's ears.
 
You might as well just close the thread.

Agree with that one; this thread's not going anywhere, just fans of either side arguing the same point over and over again.
 
Don't close it, W1zz. I'll leave, so the fanboys can discuss how the Illuminati destroyed the 590 on your test bench with their mind bullets.
 
Agree with that one; this thread's not going anywhere, just fans of either side arguing the same point over and over again.

The only problem with this thread is this: people miss the fact that Nvidia's top enthusiast card isn't really for enthusiasts; it's more for people that want a PnP card so they can claim they have the best... The reference design is flawed, and when Nvidia's partners come out with a better design we will get to see what this card can really do.
 
The thing is, Nvidia have overdraw protection in their cards. This is what failed: you should be able to set the voltage to anything and the card would protect itself. This was not the case, and this is the problem.

Correct, that is a flaw. As I said, I would guess the problem is that the power limiter doesn't lower the voltage, it only lowers the clocks. So if you set the voltage too high, even if it tries to save the card by lowering the clocks, it won't succeed, because the voltage is still too high. But this is likely something that can be fixed with a driver, and it certainly doesn't mean the card's design itself is flawed. Either way you look at it, it is a software problem, not a design flaw.
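To make the guess above concrete, here is a minimal back-of-the-envelope sketch in Python. It assumes dynamic switching power follows the first-order CMOS approximation P ≈ C·V²·f; the 0.94v/607MHz stock state comes from this thread, the 500MHz throttled figure is purely illustrative, and the capacitance constant is arbitrary since only the ratios matter.

```python
# Why a limiter that only drops clocks can't rescue an overvolted card.
# First-order CMOS approximation: dynamic power P ~ C * V^2 * f.
# C is arbitrary here; only the ratios between the three states matter.

def dynamic_power(voltage, clock_mhz, c=1.0):
    """Relative dynamic switching power, P ~ C * V^2 * f."""
    return c * voltage ** 2 * clock_mhz

stock     = dynamic_power(0.94, 607)  # GTX 590 stock voltage and core clock
overvolt  = dynamic_power(1.20, 607)  # slider maxed, clocks untouched
throttled = dynamic_power(1.20, 500)  # limiter pulls clocks, voltage stays

for name, power in (("stock", stock), ("overvolt", overvolt), ("throttled", throttled)):
    print(f"{name:9s} {power / stock:.2f}x stock power")
# overvolt is ~1.63x stock, and even throttled to 500 MHz it is still ~1.34x,
# so cutting clocks alone never brings power back inside the stock budget.
```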

The only problem with this thread is this: people miss the fact that Nvidia's top enthusiast card isn't really for enthusiasts; it's more for people that want a PnP card so they can claim they have the best... The reference design is flawed, and when Nvidia's partners come out with a better design we will get to see what this card can really do.

The design is not flawed; the software that supports the card is. You can still overvolt and overclock the card, things enthusiasts like to do.
 
How would you feel if you bought a car and the first time you red-lined it the engine shot out and killed your dog? Would you deem that car a piece of shit?

Not even close to the same comparison, but I did enjoy the part about my dog :laugh:

No, the only corner they cut was in the BIOS, by not locking the voltage down to lower levels.

You can't say that because the card can't handle 1.2v it's a shitty design. The fact is that nVidia designed the card to run at 0.94v, and it does that just fine. Raising the voltage beyond that puts it outside the envelope it was designed for.

And 1.2v certainly isn't a mild overvolt, not on a Fermi card. Remember, the maximum you could even go to on the original Fermi cards was 1.087v (without modding the BIOS). So yes, on a Fermi, 1.2v is a huge voltage bump.

This. The example drawn with a 6900 series card is completely different; look at the difference in stock voltages, which accounts for how much voltage each card can handle. Fermi cards not only don't scale well after 1.1v, they don't like it either.
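For scale, a couple of quick numbers using only the voltages quoted in this thread (the 590's 0.94v stock, the 1.087v ceiling of the original Fermi cards, and the 1.2v this card was pushed to):

```python
# Voltage headroom, using only figures quoted in this thread.
stock, fermi_max, pushed = 0.94, 1.087, 1.20

print(f"1.087v is {100 * (fermi_max / stock - 1):.0f}% above 590 stock")      # ~16%
print(f"1.200v is {100 * (pushed / stock - 1):.0f}% above 590 stock")         # ~28%
print(f"1.200v is {100 * (pushed / fermi_max - 1):.0f}% above the Fermi max") # ~10%
# 1.2v overshoots even the single-GPU Fermi ceiling by ~10%, on a board with
# two GPUs sharing one power budget; not a mild overvolt by any measure.
```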

100% correct. I think what people are saying, though, is that it could have been designed a little better. Enthusiasts like to push their enthusiast cards.

I completely grant this is a letdown for the overclocking crowd, but the card is not a failed product. Overclocking is always a temptation, but the fact is none have blown up in stock trim, at stock clocks.

I agree, but enthusiasts also know what happens when you push enthusiast cards too far. Or rather, they used to know; now it seems they just assume that since the voltage slider goes all the way to 11, there is no problem with putting it there... :shadedshu

Also this. A few review sites have had the card clock down and save itself just fine, and that's well before getting to 1.2v. You just need to be patient, clock/volt the card up slowly, and quit while you're ahead; heck, ~670MHz core and 3700MHz memory is already 10% faster, which is a lot to gain from a card that is already such a beast. I think people got their hopes up with the specs and wanted two fully fledged GTX 580s on one PCB. While that's a nice thought, at least the reviews with popped cards have shown us it ain't gonna happen.
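For anyone who wants to follow that advice methodically, here is a minimal sketch of the "step up slowly and quit while you're ahead" loop. The helpers set_core_clock() and run_stability_test() are hypothetical stand-ins for whatever overclocking tool and stress loop you actually use; they are not a real API, and the step size and ceiling are illustrative.

```python
STOCK_MHZ = 607     # GTX 590 stock core clock
STEP_MHZ = 10       # small steps make the exact failure point visible
CEILING_MHZ = 700   # self-imposed limit, well short of "slider to 11"

def find_stable_clock(set_core_clock, run_stability_test):
    """Raise the core clock in small steps at stock voltage, backing off
    to the last known-good clock as soon as a stress run fails."""
    good = STOCK_MHZ
    while good + STEP_MHZ <= CEILING_MHZ:
        candidate = good + STEP_MHZ
        set_core_clock(candidate)
        if not run_stability_test():   # e.g. a 30-minute looped benchmark
            set_core_clock(good)       # fall back to last stable clock
            break
        good = candidate
    return good
```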
 
Correct, that is a flaw. As I said, I would guess the problem is that the power limiter doesn't lower the voltage, it only lowers the clocks. So if you set the voltage too high, even if it tries to save the card by lowering the clocks, it won't succeed, because the voltage is still too high. But this is likely something that can be fixed with a driver, and it certainly doesn't mean the card's design itself is flawed. Either way you look at it, it is a software problem, not a design flaw.

Problem is that people judge these cards very quickly; it's gonna be like Crysis 2, where everyone moans about it because it's missing a feature for the first few weeks.
 
Problem is that people judge these cards very quickly; it's gonna be like Crysis 2, where everyone moans about it because it's missing a feature for the first few weeks.

Sadly, this is all too often true; Vista would be another great example.
 
Don't close it, W1zz. I'll leave, so the fanboys can discuss how the Illuminati destroyed the 590 on your test bench with their mind bullets.

The Rosicrucians have mind bullets, not the Illuminati :shadedshu
 
...and down goes another Nvidia-related thread. lol
 
Vista would be another great example.

Still using Vista, don't see the point of upgrading. Was gonna get a 6000 or 500 series gfx card, but they've been really disappointing as well, so I'm still on G92 for now. Still running DDR2 as it's not worth the cost of upgrading; the only thing that's been good lately is Core i, and they're far too expensive for me.
 
Still using Vista, don't see the point of upgrading. Was gonna get a 6000 or 500 series gfx card, but they've been really disappointing as well, so I'm still on G92 for now. Still running DDR2 as it's not worth the cost of upgrading; the only thing that's been good lately is Core i, and they're far too expensive for me.

If I didn't get Win7 for free, I'd still be on Vista.

If I didn't get a super amazing deal on the mobo and CPU in my main rig, I'd still be running the 780i and Q9650 (X3370) w/ DDR2. The only reason I upgraded was because selling the X3370 paid for the new CPU and mobo.
 
If I didn't get Win7 for free, I'd still be on Vista.

I can get a deal which means 7 Business for £40 and Ultimate for £60, but I still can't be bothered to buy it.
 
Wow, this thread became crazy.
Here's my 2 cents on the topic. When the 480 was released it was a huge letdown, especially considering the time it took Nvidia to come up with an answer to the 5xxx series, and the fact that the 5970 still owned it. But fanboys were arguing that it's single GPU vs dual GPU, which isn't fair. Fine, although I would argue that it doesn't matter how many GPUs a card has: as long as it is priced and positioned as the company's flagship, it is valid to put it against the opponent's solution. So anyway, now, in 2011, Nvidia released their dual-GPU solution, which is still slower (yes, yes, barely, but atm a victory is a victory no matter how small the difference is) than ATI's solution. NOW the Nvidia fandom has no excuse to say that it's unfair to compare both solutions, and NOW they are pushing such "important" aspects of the flagship card as... SIZE, POWER CONSUMPTION and NOISE. Did that matter when the GTX 280 was released? Or did that matter when the 9800 GX2 was out? Or when the 295 was released? Oh wait... those were holding the position of fastest card atm, so whenever ATI fanboys pointed out those flaws in the cards mentioned above, they were dismissed for the same reason.

P.S. I'm not an ATI fanboy btw, as I had 480s in SLI (and actually it was pretty good, after I'd played around with the airflow to keep them at 70-80c under load).


To W1zzard: what were the clocks on the 6990, the 830 or the 880?


Oh, btw, I also think that 1.2v is a bit too much, and it seems that the card can reach 580 clocks with voltages at or below 1.05. But the fact that they released a card with the voltage limiter working incorrectly is shameful, as if they never tested it to see that it actually works. Question is: will it pass the OCCT error test...
 
Wow, this thread became crazy.
Here's my 2 cents on the topic. When the 480 was released it was a huge letdown, especially considering the time it took Nvidia to come up with an answer to the 5xxx series, and the fact that the 5970 still owned it. But fanboys were arguing that it's single GPU vs dual GPU, which isn't fair. Fine, although I would argue that it doesn't matter how many GPUs a card has: as long as it is priced and positioned as the company's flagship, it is valid to put it against the opponent's solution. So anyway, now, in 2011, Nvidia released their dual-GPU solution, which is still slower (yes, yes, barely, but atm a victory is a victory no matter how small the difference is) than ATI's solution. NOW the Nvidia fandom has no excuse to say that it's unfair to compare both solutions, and NOW they are pushing such "important" aspects of the flagship card as... SIZE, POWER CONSUMPTION and NOISE. Did that matter when the GTX 280 was released? Or did that matter when the 9800 GX2 was out? Or when the 295 was released? Oh wait... those were holding the position of fastest card atm, so whenever ATI fanboys pointed out those flaws in the cards mentioned above, they were dismissed for the same reason.

AMEN

You read my mind: now that the Radeon is faster, now they care about noise and other stuff besides perf.
No matter what happens, Nvidia always wins lol

There is a saying for that in my country: "Juan Zapata, si no la GANA, la EMPATA!" (Juan Zapata, if he doesn't WIN it, he TIES it!)
 
You read my mind: now that the Radeon is faster, now they care about noise and other stuff besides perf.
No matter what happens, Nvidia always wins lol

There is a saying for that in my country: "Juan Zapata, si no la GANA, la EMPATA!" (Juan Zapata, if he doesn't WIN it, he TIES it!)

From what I've seen, most users here fully accept that the Radeon is the faster card; I certainly have no reservations about that. It was Nvidia themselves that worked so hard to give it other attributes like quietness and length, because they probably knew it would be too hard to outright beat the 6990 for speed while staying below 375w. And good on them for doing it; they are capitalizing on the card's strengths, as they should.
 
From what I've seen, most users here fully accept that the Radeon is the faster card; I certainly have no reservations about that. It was Nvidia themselves that worked so hard to give it other attributes like quietness and length, because they probably knew it would be too hard to outright beat the 6990 for speed while staying below 375w. And good on them for doing it; they are capitalizing on the card's strengths, as they should.

Fair enough, but I'm talking about people who, when Nvidia has the perf lead, don't care about acoustics, temps, or power, because it's the absolute best and "enthusiasts" only care about perf; but when ATI has the lead, oh! They magically prefer the Nvidia card because it's quieter, etc, etc. Such hypocrisy...
I've had both camps, from the GF MX440 to the HD4870, and both camps have had their good and not-so-good cards; fanboys prefer Nvidia or ATI no matter how full of crap they may be...
The only way that kind of people would accept losing is if, for example, the 6990 were like 500% faster than the GTX 590 AND they still preferred the GTX 590 because it's smaller and quieter. LOL, hilarious :roll::roll:
 