
ASUS GeForce GTX 590 3 GB

consider them additional free data points; don't look at them if you don't care. they do provide some insight for people looking at this from a non-consumer perspective.



nvidia's power capping reduces the clock speeds, which reduces power consumption and performance, effectively leaving performance per watt the same.
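
to put rough numbers on that claim, here's a toy model. every figure below is illustrative, not a measurement from the review; it just shows that if performance and power both follow the clock, their ratio barely moves:

```python
# toy model, not measured data: assume performance scales linearly with
# clock, and dynamic power scales roughly linearly with clock at a fixed
# voltage. all numbers are illustrative placeholders.
base_clock_mhz = 607.0   # gtx 590 stock core clock
base_fps = 60.0          # hypothetical framerate at stock clock
base_power_w = 365.0     # hypothetical board power at stock clock

for cap in (1.00, 0.90, 0.80):    # power cap forcing the clock down
    clock = base_clock_mhz * cap
    fps = base_fps * cap          # performance follows clock
    power = base_power_w * cap    # power follows clock at fixed voltage
    print(f"{clock:5.0f} MHz: {fps:4.1f} fps, {power:5.1f} W, "
          f"{fps / power:.3f} fps/W")
```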

your precious hardocp measures power consumption at the wall and subtracts the idle system wattage measured without a graphics card. ask yourself where, in that measurement, the extra power goes that the cpu/memory/hdd/motherboard/psu inefficiency consume when going from idle without a graphics card to 3d gaming load.
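
here's a toy sketch of that accounting problem. every wattage below is a made-up placeholder, not a measurement of any real card or system:

```python
# toy numbers only: this shows the accounting problem with the
# wall-minus-idle method, not real data for any card or system.
idle_wall_no_gpu_w = 90.0   # wall draw at idle with the graphics card removed
gaming_wall_w = 480.0       # wall draw during 3d gaming load

# the method attributes all of this difference to the graphics card:
naive_gpu_w = gaming_wall_w - idle_wall_no_gpu_w

# but going from idle to gaming also raises cpu/memory/hdd/board power,
# and the psu wastes a slice of everything passing through it:
psu_efficiency = 0.85            # assumed psu efficiency at this load
cpu_and_board_increase_w = 70.0  # assumed extra dc draw from the rest of the system
true_gpu_w = (gaming_wall_w - idle_wall_no_gpu_w) * psu_efficiency \
             - cpu_and_board_increase_w

print(f"attributed to the gpu by wall-minus-idle: {naive_gpu_w:.0f} W")
print(f"plausible actual gpu draw in this toy:    {true_gpu_w:.0f} W")
```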



i know of someone working on a 3x 8-pin card without a power limit that is designed like a tank

Exactly. W1z's reviews read more like well-written scientific research papers. They provide the most raw data of any reviews on the web and are the most analytical. On top of that, he regularly goes above and beyond the call of duty, thoroughly recording voltage and frequency scaling, giving an insight into the quality of the silicon itself and the inner workings of the architecture.

Now, on the 590: exactly what was expected from the leaked clocks. It's just not as efficient as Cayman. And it also backs up my crazy paranoia theory that Nvidia purposefully did not allow EVGA and Galaxy to make a dual 560 card and limited them to using 460 chips, because they knew it would step all over the 590.

A dual 560 card OCed to a 900 MHz core at roughly 350 W can easily play with the big boys; see Guru3D's 560 SLI review for reference.

Oh, and remember everybody: Digg the review, it takes just a few seconds.
 
check this video out
I guess w1z isn't the only one.

smoke came out of the 24-pin power connector for the mobo, not the card.

this card seems to be bringing back a lot of the original Fermi issues. Now don't get me wrong, my 480s run just fine, but we should have had the 580 the first time around. It almost seems like NV needs to bring out a revision of the 590 already, perhaps fixing the voltage limiter and upping the clocks; it seems like at 700 MHz this card would actually beat the 6990.

all in all it has been a long time since I've seen W1z put up a 7.0 on a flagship card.
 
Sniff sniff... I smell a massive recall coming...
 
smoke came out of the 24-pin power connector for the mobo, not the card.

this card seems to be bringing back a lot of the original Fermi issues. Now don't get me wrong, my 480s run just fine, but we should have had the 580 the first time around. It almost seems like NV needs to bring out a revision of the 590 already, perhaps fixing the voltage limiter and upping the clocks; it seems like at 700 MHz this card would actually beat the 6990.

all in all it has been a long time since I've seen W1z put up a 7.0 on a flagship card.

I think that's the first time. The lowest I see are 8.0s, for the GTX 480 and the HD 3870 X2.
 
So Nvidia and ATI end up sharing the performance crown? Not sure when that's happened before, or if it even has. The limited safe overclocking potential could be an issue if they can't sort out the power limiter in the drivers, but overall this is actually pretty great. Now you can pick the brand you prefer without worrying about giving up any performance. If only it were always so even.
 
a single GTX 580 can't take 1.2 V very well at all, so why think a dual board can?

it can at stock, but i wouldn't try to increase the frequency...
 
Looks like a decent card; performance is similar but does seem to lag behind the 6990 at times.

Very unfortunate to hear about the power-limiting system affecting the card in such a way. Then again, the card is already very beefy, so I personally wouldn't worry about not being able to volt it higher to get more clocks out of it.
 
a single GTX 580 can't take 1.2 V very well at all, so why think a dual board can?

The GTX 580 can take 1.2 V just fine; it was the GTX 570 that had the problems.

smoke came out of the 24-pin power connector for the mobo, not the card.

It is hard to tell from the video, but at 720p, when you slow it down, the smoke definitely comes from the card, specifically from the fuse that W1z highlighted in the review. It is located on the back of the PCB, right at the corner by the PCI-E plugs, directly over the 24-pin. The smoke that seems to rise from the 24-pin is from the little spark that flies down when the "fuse" pops.
 
Why oh why do you insist on reviewing these monsters at such ridiculously low resolutions? These cards are made to drive multiple monitors, or at the VERY LEAST a single monitor at 1920x1080.

Second, how do you calculate performance per watt when you know that the Nvidia cards use power protection to attain lower power usage than the 6990?

If anyone wants to see a more accurate review, head over to [H]ardOCP.

you must be joking. hardOCP only wishes their reviews were half of what w1zzard does. seriously, wtf!! :nutkick:
 
You know, I got crucified when I said on EVGA's forums that "the GTX 590 will be almost as powerful as the 6990... but it will lose to it because of the clocks dropping". I guess I was right... Man oh man, it is interesting when you make a prediction about video cards and people act like you're taking away their birthday when you're talking about their favorite brand.
It still is a nice card, and as long as the price is right... I could totally see buying one!!!! Because Nvidia still has one more thing that ATI, for the most part, does not have: better drivers!

Nice review! Keep up the great work!
 
Why oh why do you insist on reviewing these monsters at such ridiculously low resolutions? These cards are made to drive multiple monitors, or at the VERY LEAST a single monitor at 1920x1080.

Man, it is a good thing W1z includes these higher resolutions and breaks down all the important information by them, otherwise you might have just had a point. But sadly, you don't.

Second, how do you calculate performance per watt when you know that the Nvidia cards use power protection to attain lower power usage than the 6990?

ATi cards use power protection as well. However, that doesn't really matter, because if the power protection does kick in, and it probably doesn't, then the performance would suffer as well, balancing things out. The lower power consumption numbers would go hand in hand with lower performance numbers as well.

But, like I said, the power protection probably doesn't kick in anyway, since W1z uses the average power consumption number (not the unrealistic Furmark peak number) for his performance per watt figures. Since the average is measured during normal gameplay scenarios, the power protection never really kicks in.
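
For what it's worth, here is a minimal sketch of that calculation; the fps and wattage figures below are placeholders, not numbers from the review:

```python
# illustrative sketch of a performance-per-watt metric built from average
# gaming power rather than a synthetic furmark peak; all numbers made up.
avg_gaming_fps = 75.0         # average framerate across a benchmark suite
avg_gaming_power_w = 350.0    # average card power during real gameplay
furmark_peak_power_w = 450.0  # synthetic worst case, throttling territory

print(f"perf/W from the gaming average: "
      f"{avg_gaming_fps / avg_gaming_power_w:.3f} fps/W")
print(f"perf/W from the furmark peak:   "
      f"{avg_gaming_fps / furmark_peak_power_w:.3f} fps/W")
# the second figure pairs a gaming numerator with a non-gaming denominator,
# which is why the gaming average is the fairer basis for perf/W.
```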

If anyone wants to see a more accurate review, head over to [H]ardOCP.

Accurate? [H]ardOCP? You mean the site that makes minor, hard-to-spot tweaks between graphics cards in the same benchmark? Like enabling 2xAA for one card while leaving it off for the rest in their Crysis Warhead test? Or upping the shaders one notch higher on another card, also in Crysis Warhead?
 
Darn it W1zz, why'd you have to kill the card? Willing to bet that the results at an 815 MHz core would've been quite impressive, although I think the heat and power draw would have been as well...

As for everyone complaining that the card blew: there is a reason why a video card manufacturer's warranty is voided by overclocking, and this is it. I agree 100% that nVidia's (alleged) power-throttling hardware and drivers should've prevented this, and of course they should have done proper internal testing before releasing. (Even a disclaimer, something along the lines of "if you give the cores more than 1 V you *will* kill the card", would've been better than nothing.)
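
Conceptually, the throttling everyone expected is just a feedback loop. The sketch below is purely hypothetical and not NVIDIA's actual implementation; the sensor model, thresholds, and step sizes are all invented for illustration:

```python
import random

# hypothetical driver-side power limiter, NOT nvidia's implementation;
# sensor model, limits, and step sizes are all invented for illustration.
POWER_LIMIT_W = 365.0    # board power budget
CLOCK_STEP_MHZ = 13.0    # one throttle step
MIN_CLOCK_MHZ = 405.0    # clock floor so the card stays usable
MAX_CLOCK_MHZ = 607.0    # stock clock

def read_board_power(clock_mhz: float) -> float:
    """stand-in for a power sensor: scales with clock, plus noise."""
    return clock_mhz * 0.62 + random.uniform(-10.0, 10.0)

clock = MAX_CLOCK_MHZ
for _ in range(20):                  # one iteration per sensor poll
    power = read_board_power(clock)
    if power > POWER_LIMIT_W:        # over budget: throttle down
        clock = max(MIN_CLOCK_MHZ, clock - CLOCK_STEP_MHZ)
    elif power < POWER_LIMIT_W * 0.95:   # comfortably under: recover
        clock = min(MAX_CLOCK_MHZ, clock + CLOCK_STEP_MHZ)
    print(f"power {power:5.1f} W -> clock {clock:5.1f} MHz")
```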

I think the dual-GPU 560 cards already in development, plus any potential dual-GPU 570s, will be better products than this. IMO nVidia felt compelled to use full-fledged GF110s because ATI's 6990 has standard 6970 GPUs, and it's a decision that has backfired on them. I foresee many RMAs on these cards in the near future, and I think nVidia will regret releasing the GTX 590 without making it the performer it should have been. ATI wins this round, no doubt about it.
 
So those video clips of helicopters dumping water to cool down reactors were live video of W1zzard trying to put out the fire in the test lab. Shame on you, NVidia, for not truly testing a card out fully before letting someone like W1zzard expose your flaws for the entire world to see, posted on the web.

I say that CrossFire OC'd 6950s or 6970s would be a killer setup; just skip the 6990 and 590 altogether.
 
haha, the card blew up. :laugh: is that covered by the warranty? :)
 
Why oh why do you insist on reviewing these monsters at such ridiculously low resolutions? These cards are made to drive multiple monitors, or at the VERY LEAST a single monitor at 1920x1080.

Second, how do you calculate performance per watt when you know that the Nvidia cards use power protection to attain lower power usage than the 6990?

If anyone wants to see a more accurate review, head over to [H]ardOCP.
i smell a fanboy troll :nutkick:
 
There is an Easter egg in the review. First to find gets a cookie.
 
 
Accurate? [H]ardOCP? You mean the site that makes minor, hard-to-spot tweaks between graphics cards in the same benchmark? Like enabling 2xAA for one card while leaving it off for the rest in their Crysis Warhead test? Or upping the shaders one notch higher on another card, also in Crysis Warhead?

The tests are different. HardOCP tests maximum playable settings (meaning if one card has 2x antialiasing and the other one doesn't, it is because the fps would drop below playable, which I think is obvious to understand, especially for someone as well educated in this stuff as you), while here W1zz tests apples to apples: same resolution, antialiasing, etc.

I like both; they're different points of view.
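
The difference is easy to show in a quick sketch; the cards, setting names, and fps numbers below are all invented for illustration:

```python
# invented fps table: fps[card][settings_level]; higher level = prettier.
SETTINGS = ["low", "medium", "high", "very high", "very high + 2xAA"]
PLAYABLE_FPS = 30
fps = {
    "card_a": [95, 70, 48, 31, 22],
    "card_b": [90, 72, 55, 38, 26],
}

# hardocp-style: the highest settings level each card holds above 30 fps
for card, results in fps.items():
    best = max(i for i, f in enumerate(results) if f >= PLAYABLE_FPS)
    print(f"{card}: max playable = {SETTINGS[best]} ({results[best]} fps)")

# tpu-style apples to apples: the same fixed settings for every card
fixed = 2  # everyone runs "high"
for card, results in fps.items():
    print(f"{card} at {SETTINGS[fixed]}: {results[fixed]} fps")
```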
 
There is an Easter egg in the review. First to find gets a cookie.

Awww man, I went through it twice and didn't find anything. I give up, what is it?! :D
 
I believe this is at least the second site that has had a card fail. I would be concerned about spending my $700 on this.
 
So, W1zz, oh fearless leader, did the fuse blow, or was it some other death?

:shadedshu


I wonder what the limit is. :confused:
 
did the fuse blow, or was it some other death?

i measured resistance across the fuse and it was blown. after fixing it, the card blew again, which means some different underlying issue caused the fuse to blow as protection in the first place.

nobody at asus or nvidia knows, and neither do i. i sent nvidia instructions to reproduce the issue, but this will take a while. and i'm being careful now with my 2nd card.
 