# ASUS GeForce GTX 590 3 GB



## W1zzard (Mar 22, 2011)

Today NVIDIA releases their new GeForce GTX 590 flagship, which is based on two GTX 580 GPUs working on a single card. AMD released their dual-GPU HD 6990 design just two weeks ago. With power and heat being the decisive limit on performance, it will be a tough fight between those cards.

*Show full review*


----------



## razaron (Mar 24, 2011)

Interesting. I thought that at stock it would be a bit faster than a 6990.


----------



## brandonwh64 (Mar 24, 2011)

Overall 4% faster than the 6990. Nice


----------



## treboRR (Mar 24, 2011)

brandonwh64 said:


> Overall 4% faster than the 6990. Nice



I hoped for a lot more. But hey.


----------



## pr0n Inspector (Mar 24, 2011)

brandonwh64 said:


> Overall 4% faster than the 6990. Nice



You mean slower.


----------



## brandonwh64 (Mar 24, 2011)

pr0n Inspector said:


> You mean slower.



Looks like 4% *FASTER* than the 6990 overall


----------



## Spectrum (Mar 24, 2011)

Holy shit, you managed to kill it? 
Did you try again on the other card to make sure it wasn't a bad card?


----------



## pr0n Inspector (Mar 24, 2011)

brandonwh64 said:


> Looks like 4% *FASTER* than the 6990 overall
> 
> http://tpucdn.com/reviews/ASUS/GeForce_GTX_590/images/perfrel.gif



Would you buy this and game at anything lower than 1920x1200? The first chart is useless. Only the last two are relevant.


----------



## Over_Lord (Mar 24, 2011)

Word has spread that NVIDIA dropped the card's price by $100. So they are selling two cherry-picked, underclocked GTX 580s for $350 each (GTX 570 price). That makes it dang good VFM... (relatively speaking)

I still think 2x HD 6950 CFX for $500 is the best deal you can probably get.



> As a result GTX 590 and HD 6990 are roughly the same performance when averaged. In lower resolutions GTX 590 wins, at 1920x1200 both are even, and at 2560x1600 AMD's HD 6990 wins by 3%. Since 2560x1600 is the primary resolution that both of those cards should be used for, my conclusion is that GTX 590 is slower than HD 6990, but not by much. It is still disappointing to see that NVIDIA could not turn their single-GPU winning GF110 into a dual-GPU design win too.



IMO this is the ultimate must-read conclusion of W1zz's review..

EDIT:
The last two lines clearly show W1zz's disapproval of Crysis 2's console-port sting and it being a DX9 game at launch.


----------



## mdm-adph (Mar 24, 2011)

So, it's about dead equal at the resolutions that matter most to us.  Honestly, I'm happy -- now there is no more "fastest card."  Everybody can now shut up. 

Honestly, though, are they selling the card at a loss?  That's a lot of tech they're trying to pack into there, and they can't possibly sell it for more than a HD 6990...


----------



## brandonwh64 (Mar 24, 2011)

pr0n Inspector said:


> Would you buy this and game at anything lower than 1920x1200? The first chart is useless. Only the last two are relevant.



I wouldn't buy it; my 5850 still games quite well, so I really don't know who would buy this unless they were a heavy bencher.


----------



## qubit (Mar 24, 2011)

I don't have time to read the review while I'm at work, so just from the conclusion it doesn't look too good, does it? It was only worth a 7 rating, too. Shame.

And oh god, this looks damning:



> Card blew up during testing, power limiting system does not work reliably


 

It looks like the move to 28nm can't come soon enough, can it? Graphics card performance is becoming seriously held back by power issues now.


----------



## Kreij (Mar 24, 2011)

The fact that the Asus 6990 4GB got a TPU score of 9.1 and this got a 7.0 does not bode well for this card.
I was hoping for a beat down on ATI so the GPU wars would continue. :/


----------



## Yellow&Nerdy? (Mar 24, 2011)

No difference with the 6990 @1920x1080? Then why does it cost 50€ more? Even though it does overclock better (when it does not blow up), the 6990 seems like the better card.


----------



## Frick (Mar 24, 2011)

It's close, and here the prices are identical (except the Gigabyte one with the mouse). And it's only a few percent difference, so I'd say get whichever one is cheaper. That it blew up during overvolting I don't care about.


----------



## theeldest (Mar 24, 2011)

Yea!! My prediction was spot on. I predicted that the two cards would be within 5% of each other due to needing to fit within the power envelope.

I'll admit, though, I did *not* predict explosions ...


----------



## MoonPig (Mar 24, 2011)

Awesome, ATi have kept the performance title.

Unlucky nVidia, once again you've taken longer to release your slower card.


----------



## MxPhenom 216 (Mar 24, 2011)

MoonPig said:


> Awesome, ATi have kept the performance title.
> 
> Unlucky nVidia, once again you've taken longer to release your slower card.



How do you figure that? The GTX 590 is 4% faster overall, and with overclocking it's even faster. The ASUS TOP card is clocked at 900 MHz; that will smash the 6990. Also, drivers come into play here, as they did with the 6990 when it released.

The 6990 is also priced at $699.

The low factory clocks are what is keeping the 590 from winning in every situation. If they clocked it at 725 MHz or so, it would win.


----------



## Deleted member 24505 (Mar 24, 2011)

Personally I would take W1z's advice and buy a 580 or stick with what I have; these cards are only for rich guys who need a bigger e-peen.

From W1z's conclusion, I would say MoonPig is correct. The 6990 was better in more tests than the 590, so overall I would say the 6990 is better, and it did not blow up.


----------



## Razi3l (Mar 24, 2011)

nvidiaintelftw said:


> How do you figure that? The GTX 590 is 4% faster overall, and with overclocking it's even faster. The ASUS TOP card is clocked at 900 MHz; that will smash the 6990. Also, drivers come into play here, as they did with the 6990 when it released.



What do you mean? The 6990 is still on the launch driver as far as I know, so there is nothing "off" about the drivers.

I think this card isn't bad, considering it is close to the 6990, but the (slightly) higher price is something that would put you off, and considering this one blew up, I personally wouldn't buy one. But then again, I wouldn't buy any dual-GPU card.


----------



## Deleted member 67555 (Mar 24, 2011)

LOL the card blew up!!
That must have made a certain body part pucker up...


----------



## MoonPig (Mar 24, 2011)

nvidiaintelftw said:


> How do you figure that? The GTX 590 is 4% faster overall, and with overclocking it's even faster. The ASUS TOP card is clocked at 900 MHz; that will smash the 6990. Also, drivers come into play here, as they did with the 6990 when it released.
> 
> The 6990 is also priced at $699.
> 
> The low factory clocks are what is keeping the 590 from winning in every situation. If they clocked it at 725 MHz or so, it would win.



That's what I said: the 6990 is a better card.

It consumes a fair amount less in games, it doesn't blow up when you try to clock it, and it's faster at the resolutions that are used now (1200p and 1600p).

ATi hasn't released any new drivers (officially) since they released the 6990 (IIRC), so drivers aren't a comparison here.

It will be interesting to see the SLI review. Wonder if the 590 will manage more than 20% scaling. Or are we once again limited by the console generation and 1600p being the max resolution in these reviews?

Maybe W1zz should invest in a 1600p tri-monitor setup to see if that's the case. Surely TPU has the power to request such kit.


----------



## Mussels (Mar 24, 2011)

> Card blew up during testing, power limiting system does not work reliably




ahahahahaah!


me and deathmorew are having a good laugh over that in teamspeak right now.


----------



## Jack Doph (Mar 24, 2011)

Great review, thanks W1zzard 
I was hoping the GTX590 would beat the HD6990 (because I'm not camp-faithful - in fact, I dislike both camps equally), but I'll be investing in the Red camp for sure now.
Pity.. I was hoping the Green team had got it right.. Seems they pushed too little too late, for what can only be called a dismal result in the end.

That said.. I'm sure this card would be great for folding


----------



## mamisano (Mar 24, 2011)

Why oh why do you insist on reviewing at such ridiculously low resolutions for these monsters? These cards are made to drive multiple monitors or at the VERY LEAST a single monitor at 1920x1080.

Second, how do you calculate performance per watt when you know that the NVIDIA cards use power protection to attain lower power usage than the 6990?

If anyone wants to see a more accurate review, head over to [H]ardocp.


----------



## alucasa (Mar 24, 2011)

Well, duh, so much for the hype, lol


----------



## BraveSoul (Mar 24, 2011)

great review W1zzard, thank you
i wonder if custom PCBs will make it to market with stronger MOSFETs


----------



## the54thvoid (Mar 24, 2011)

Let's all be adults and call it a performance stalemate. If I were asked which one I'd like for free, I'd go with the quieter one, given the few % difference means feck all in real-world gaming.

I'm sure the AIB/AICs will do their own versions, quite likely with 3 power connectors to alleviate the voltage/power issues and maybe super-dooper cooling.

The base summary is clear: AMD have the edge going into 28nm because they have much better efficiency and design. I think the 590 is surprisingly good at these low clock speeds.

But I wouldn't buy either one. Total waste of money. Let's face it, if you have cash to buy one of these, you have cash for a good mobo that would allow SLI/CrossFire. Give me two 6970s or two 580s any day.


----------



## MxPhenom 216 (Mar 24, 2011)

MoonPig said:


> Thats what i said, the 6990 is a better card.
> 
> It consumes a fair amount less in games, it doesn't blow up when you try to clock it, it's faster at the resolutions a that are used now (1200p and 1600p).
> 
> ...



fixed!


----------



## Frizz (Mar 24, 2011)

590 prices in AUD: 1080-1090

http://www.staticice.com.au/cgi-bin/...gtx+590&spos=3

6990 prices in AUD: 790-860

http://www.staticice.com.au/cgi-bin/...hd+6990&spos=1

Can someone explain why Nvidia's prices are so appalling compared to AMD's in Australia? Judging from W1zzard's review there is not much that would justify a 590 to be bought over a 6990 in our country except for CUDA, PhysX, 3D surround. An extra 200+ AUD for those features just doesn't seem worth it at all.


----------



## SammyHayabuza (Mar 24, 2011)

*The Big winner is the 6970 in Crossfire!!!*


----------



## Razi3l (Mar 24, 2011)

randomflip said:


> 590 Prices in AUD
> 
> http://www.staticice.com.au/cgi-bin/...gtx+590&spos=3
> 
> ...



NVIDIA tend to always overprice their products. The price of a 590 here is also somewhat higher than a 6990, yet it is slower and likes to go ka-boom!


----------



## Harlequin_uk (Mar 24, 2011)

So the 6950 in CF is faster in one or two games for a lot less money.

BTW, what was ASUS's reaction when you told them you blew the card up? Do you have to pay for it now?


----------



## Yellow&Nerdy? (Mar 24, 2011)

nvidiaintelftw said:


> How do you figure that? The GTX 590 is 4% faster overall, and with overclocking it's even faster. The ASUS TOP card is clocked at 900 MHz; that will smash the 6990. Also, drivers come into play here, as they did with the 6990 when it released.
> 
> The 6990 is also priced at $699.
> 
> The low factory clocks are what is keeping the 590 from winning in every situation. If they clocked it at 725 MHz or so, it would win.



Did you even read the review? The card BLEW UP when W1zz tried to take it beyond 815 MHz. I doubt that 900 MHz can be reached unless the VRM components are improved drastically and water cooling is used, which would lead to the card costing like $900. Also, it's overall 4% faster. At resolutions above 1920x1080, the 6990 is as fast or faster.


----------



## Mussels (Mar 24, 2011)

mamisano said:


> Why oh why do you insist on reviewing at such ridiculously low resolutions for these monsters? These cards are made to drive multiple monitors or at the VERY LEAST a single monitor at 1920x1080.
> 
> Second, how do you calculate performance per watt when you know that the NVIDIA cards use power protection to attain lower power usage than the 6990?
> 
> If anyone wants to see a more accurate review, head over to [H]ardocp.



... you realise he tests multiple resolutions to make the cards comparable to their lower-end brethren? You can read and/or scroll down in your browser, yes?

Performance per watt is very easy to calculate... you measure the power they use vs the performance they give out. How can you get any more accurate than that? If the NVIDIA drivers/power protection hamper the performance by limiting the power, then that is going to be taken into account. It's completely insane to think you'd measure it any other way, when that's how it's going to behave for people gaming/benchmarking on the cards.

Your argument for going over to [H] is... well, it's not strong. lol.


----------



## horik (Mar 24, 2011)

SammyHayabuza said:


> *The Big winner is the 6970 in Crossfire!!!*



nah, 6950 CF


----------



## Razi3l (Mar 24, 2011)

Mussels said:


> performance per watt is very easy to calculate... you measure the power they use vs the performance they give out. how can you get any more accurate than that? If the nvidia drivers/power protection hamper the performance by limiting the power, then that is going to be taken into account. its completely insane to think you'd measure it any other way, when thats how its going to behave for people gaming/benchmarking on the cards.



Well if he's worried about the power limiter throttling the card, he can always disable it and watch the fireworks.


----------



## W1zzard (Mar 24, 2011)

mamisano said:


> Why oh why do you insist on reviewing at such ridiculously low resolutions for these monsters? These cards are made to drive multiple monitors or at the VERY LEAST a single monitor at 1920x1080.



consider them additional free data points; don't look at them if you don't care. they do provide some insight to some people looking at this from a non-consumer perspective.



> Second, how do you calculate performance per watt when you know that the NVIDIA cards use power protection to attain lower power usage than the 6990?



nvidia's power capping reduces the clock speeds, which reduces power consumption and performance, effectively leaving performance per watt the same.

your precious hardocp measures power consumption at the wall and subtracts the idle system wattage measured without a graphics card. ask yourself where in that measurement the power goes that the cpu/memory/hdd/motherboard/psu inefficiency consume when moving from idle without a graphics card to 3d gaming load.
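Both points above are simple arithmetic. Here is a minimal sketch with made-up wattages and frame rates (none of the figures below are from the review; they only illustrate the two effects described):

```python
# Point 1: clock-capping cuts performance and power by roughly the same
# factor, so performance per watt is left effectively unchanged.
def perf_per_watt(fps: float, watts: float) -> float:
    """Frames per second delivered per watt drawn."""
    return fps / watts

uncapped = perf_per_watt(fps=100.0, watts=365.0)
capped = perf_per_watt(fps=100.0 * 0.9, watts=365.0 * 0.9)  # limiter cuts clocks ~10%
assert abs(uncapped - capped) < 1e-9  # same efficiency either way

# Point 2: the "at the wall minus idle baseline" method. Subtracting the
# idle system's wattage still attributes to the graphics card all the extra
# power the CPU/memory/HDD/motherboard/PSU inefficiency draw under game load.
idle_system = 120.0        # wall watts, idle, no graphics card load
gaming_system = 520.0      # wall watts under gaming load
card_only_true = 350.0     # hypothetical true card draw
attributed = gaming_system - idle_system  # 400 W blamed on the "card"
assert attributed > card_only_true        # overstates the card by 50 W
```

With these toy numbers the subtraction method pins an extra 50 W of platform load on the card, which is exactly the criticism being made.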



BraveSoul said:


> i wonder if custom PCBs will make it to market with stronger MOSFETs



i know of someone working on a 3x 8-pin card without power limit that is designed like a tank


----------



## Razi3l (Mar 24, 2011)

W1zzard said:


> i know of someone working on a 3x 8-pin card without power limit that is designed like a tank



I guess they are ASUS? Because none of their DCII cards have the power limiter. I bet it will also have a pretty big price tag; I would imagine it will be like the ARES, so.. £1000, I bet. Anyway, great review. I had a feeling these might have trouble with voltage + overclocking.


----------



## 2DividedbyZero (Mar 24, 2011)

So how long will we see arguments along the lines of:

nVidia - yeah but our card rocks your poor Radeon effort

AMD - until it blows up


----------



## Jack Doph (Mar 24, 2011)

W1zzard said:


> i know of someone working on a 3x 8-pin card without power limit that is designed like a tank



I'm sure that will stay within the ATX specs 
Seriously though.. for a card to blow up like it did, are consumers at this end of the market truly expected to pay this price when they cannot even be guaranteed their card will survive, let alone beat the rival? And to what end?


----------



## Frizz (Mar 24, 2011)

Razi3l said:


> NVIDIA tend to always overprice their products. The price of a 590 here is also somewhat higher than a 6990 yet it is slower and likes to go ka-boom!



I just heard from a local PC store rep; they say the price is much higher in Aus due to the limited stock..?

It doesn't seem like PCcasegear.com.au have bought a batch of these cards to sell either. They usually have a card up before release marked "coming soon", or put it up for sale on release date, so I guess that makes sense then.


----------



## W1zzard (Mar 24, 2011)

Jack Doph said:


> I'm sure that will stay within the ATX specs



/care      as long as it works, you can sell it and make money with it

there is no official spec beyond 2x 8 pin as far as i know


----------



## Jack Doph (Mar 24, 2011)

W1zzard said:


> /care
> 
> there is no official spec beyond 2x 8 pin as far as i know



Not the issue. Point 2 is though..


----------



## mdm-adph (Mar 24, 2011)

Kreij said:


> The fact that the Asus 6990 4GB got a TPU score of 9.1 and this got a 7.0 does not bode well for this card.
> I was hoping for a beat down on ATI so the GPU wars would continue. :/



Well, it's not a bad card -- it matches the 6990 in terms of raw performance.  It's just not a _good buy_, because of the extra power it has to draw to get there, hence the lower score.


----------



## Bjorn_Of_Iceland (Mar 24, 2011)

> Price-wise both HD 6990 and GTX 590 are tied around $700 which is a lot of money to spend on a graphics card. My recommendation would be to go with a single GPU GTX 580 or 6970 and wait what the future brings in terms of games - most games are console ports, Crysis 2 is DX9. Developers! The PC needs more love from you.


 This is what matters

edit: Fixed


----------



## Jack Doph (Mar 24, 2011)

Bjorn_Of_Iceland said:


> This is what matters



Agreed!
If only the price-point was the same the world over..


----------



## Frizz (Mar 24, 2011)

Jack Doph said:


> Agreed!
> If only the price-point was the same the world over..



+1 amen to that..


----------



## Delta6326 (Mar 24, 2011)

Holy crap, lots of views! Great review! I can't believe how close these two are, but IT blew up in testing?! (sorry, had to)


----------



## catnipkiller (Mar 24, 2011)

155 users viewing, lol. I never thought it would blow up. Poor nvidia.


----------






## Jack Doph (Mar 24, 2011)

Delta6326 said:


> Holy crap, lots of views! Great review! I can't believe how close these two are, but IT blew up in testing?! (sorry, had to)



It appears to me that nVidia could have blown AMD away with this offering, but (for one reason or another - academic now, at the least) did not, or chose not to.
For us Aussies, there's no viable or logical reason to go the nVidia way when AMD can blow nVidia out of the water with a simple flick of the switch, aside from the usual OC capabilities on either side of the camps.
An extra $200 for .. well, for what?


----------



## ERazer (Mar 24, 2011)

ahhh gotta love the smell of burned pcb in the morning, great review w1z as always


----------



## Jack Doph (Mar 24, 2011)

ERazer said:


> ahhh gotta love the smell of burned pcb in the morning




Good call XD


----------



## W1zzard (Mar 24, 2011)

catnipkiller said:


> 155 users viewing, lol. I never thought it would blow up. Poor nvidia.



those are really just the people viewing the comments thread in the forums, the number of people reading the review on the website is much higher than that


----------



## Bjorn_Of_Iceland (Mar 24, 2011)

Jack Doph said:


> It appears to me that nVidia could have blown AMD away with this offering,..


Yep, definitely did not. The only thing it blew away was the MOSFETs.


----------



## Jack Doph (Mar 24, 2011)

W1zzard said:


> those are really just the people viewing the comments thread in the forums, the number of people reading the review on the website is much higher than that



Kewlies.
You, sir, are doing an awesome job then


----------



## Frick (Mar 24, 2011)

But it happened when overvolting. So I mean it's not like every card in the world will spontaneously blow up.


----------



## Over_Lord (Mar 24, 2011)

nvidiaintelftw said:


> How do you figure that? The GTX 590 is 4% faster overall, and with overclocking it's even faster. The ASUS TOP card is clocked at 900 MHz; that will smash the 6990. Also, drivers come into play here, as they did with the 6990 when it released.
> 
> The 6990 is also priced at $699.
> 
> The low factory clocks are what is keeping the 590 from winning in every situation. If they clocked it at 725 MHz or so, it would win.



4% isn't really a figure to boast of, especially when we are talking 116fps and 120fps...



brandonwh64 said:


> I wouldn't buy it; my 5850 still games quite well, so I really don't know who would buy this unless they were a heavy bencher.



Exactllyyyy. If a game does demand more, just OC it to HD 5870 clocks and voila, playable (more than that) frame rates again. I see no reason to upgrade myself. That's why my card only changes if it dies (under warranty, then RMA) or when the HD 7000 series shows up (28nm juice).


----------



## Jack Doph (Mar 24, 2011)

Frick said:


> But it happened when overvolting. So I mean it's not like every card in the world will spontaneously blow up.



True, but by how much?
If you wish to compare pound-for-pound, would you trust your hard-earned investment to a card that's already borderline by default, *without* it even beating the opponent hands-down?


----------



## entropy13 (Mar 24, 2011)

Frick said:


> But it happened when overvolting. So I mean it's not like every card in the world will spontaneously blow up.



They have a safety feature that didn't work. That was the point; how it happened is inconsequential. It's like saying that an airbag not working in a car running at 200 mph is not an issue because it happened at 200 mph, and not all cars reach 200 mph.


----------



## AlienIsGOD (Mar 24, 2011)

The card fried  

After reading the review, IMO it seems this card wasn't ready to be released. If the power limiting system isn't reliable at launch, even though in W1z's review NVIDIA states "*NVIDIA promises that their power capping will avoid such conditions by constantly monitoring the card's power draw and reducing clocks if necessary - in all applications*", then this should not have been released yet. They should have worried less about getting the card out the door and more about providing a stable, COMPLETE, and fully working driver.



Bjorn_Of_Iceland said:


> Yep, definitely did not. The only thing it blew away was the MOSFETs.



roflcopter


----------



## KainXS (Mar 24, 2011)

lol, so it blew as soon as the power limiter was disabled (oh, it was still enabled) . . . :/ Give it a proper burial.

Nice card, but if they all die as soon as you try to volt mod, then it's not for me.


----------



## overclocking101 (Mar 24, 2011)

Too bad it blew up. It's been a while since we've seen a card that pops so easily; even the 480s didn't burn that fast. I would have spent more, made the card longer, and added better VRM circuitry, because they obviously skimped on that.


----------



## entropy13 (Mar 24, 2011)

KainXS said:


> lol so it blew as soon as power limiter was disabled, . . . . :/ give it a proper burial
> 
> nice card but if they all die as soon as you try to volt mod, then it's not for me.



It blew even with power limiter enabled.

http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_590/26.html



> As a first step, I increased the voltage from 0.938 V default to 1.000 V, maximum stable clock was 815 MHz - faster than GTX 580! Moving on, I tried 1.2 V to see how much could be gained here, at *default clocks* and with *NVIDIA's power limiter enabled*. I went to heat up the card and then *boom*, a sound like popcorn cracking, the system turned off and a burnt electronics smell started to fill up the room. Card dead! Even with NVIDIA power limiter enabled. Now the pretty looking, backlit GeForce logo was blinking helplessly and the fan did not spin, both indicate an error with the card's 12V supply.
> 
> After talking to several other reviewers, this does not seem to be an isolated case, and many of them have killed their cards with similar testing, which is far from being an extreme test.


----------



## EastCoasthandle (Mar 24, 2011)

check this video out
I guess w1z isn't the only one.


----------



## alucasa (Mar 24, 2011)

Jack Doph said:


> An extra $200 for .. well, for what?



Fanboyism AKA brand loyalty


----------



## KainXS (Mar 24, 2011)

> the supplied Geforce Drivers 267.52 for Geforce GTX 590 will not stop the card from overheating when overclocking. Please use newer versions from the Nvidia website and stay away from 267.52. Otherwise this may happen ...



hmm, did you do that wiz

PS I edited it fast huh


----------



## newtekie1 (Mar 24, 2011)

Kind of a disappointment really.  I was hoping for more from the card, especially at higher resolutions.

Now I hope they get off their asses and design a dual-GF114 card to fill the gap between the GTX580 and GTX590.

As for the card popping, from my understanding, the way nVidia's power limiter works is that it just lowers the clocks when it senses overcurrent. The problem is that it doesn't, AFAIK, lower the voltage at the same time. So it is entirely possible that if the user overvolts the card, the card will still be in an overcurrent state, even with the lowered clocks. So by overvolting, the limiter becomes useless.
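That reasoning lines up with the usual CMOS dynamic-power approximation, P ≈ C·V²·f: power scales linearly with clock but quadratically with voltage. A toy calculation under stated assumptions (only the 0.938 V default voltage and the 1.2 V attempt come from the review; the 607 MHz clock and the 25% throttle factor are illustrative, and this is not NVIDIA's actual limiter logic):

```python
# Toy dynamic-power model: P ≈ C * V^2 * f. Illustrates that cutting
# clocks (f) without cutting voltage (V) may not undo an overvolt,
# because f enters linearly while V enters squared.
def dynamic_power(v: float, f_mhz: float, c: float = 1.0) -> float:
    """Relative dynamic power for voltage v (volts) and clock f_mhz (MHz)."""
    return c * v * v * f_mhz

stock = dynamic_power(v=0.938, f_mhz=607.0)           # default voltage per the review
overvolted = dynamic_power(v=1.2, f_mhz=607.0)        # the 1.2 V W1zzard tried
throttled = dynamic_power(v=1.2, f_mhz=607.0 * 0.75)  # limiter drops clocks 25% (made up)

assert overvolted > stock   # 1.2 V alone raises power by roughly 64%
assert throttled > stock    # even a 25% clock cut doesn't get back below stock
```

So in this simple model the limiter's clock reduction cannot fully compensate for the overvolt, which is consistent with the card dying even with the limiter enabled.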


----------



## Lionheart (Mar 24, 2011)

Bwhahahaha, it blew up. Sorry nvidia, nice card, but nothing mind-blowing...


----------



## v12dock (Mar 24, 2011)

I think AMD will mark this one in their books as a win


----------



## VulkanBros (Mar 24, 2011)

KainXS said:


> hmm, did you do that wiz
> 
> PS I edited it fast huh



He used 267.71.......


----------



## alexsubri (Mar 24, 2011)

Minimum power requirement is 750 watts? My God, does this come with an extra nuclear plant to power those 8-pin rails?  ...The HD 6990 is confirmed at 830 MHz, only ~5% off the HD 6970's clocks. The GTX 590 will need a core clock at least 10% slower than its single-GPU variant, or it will blow your fuses or set your PC on fire. And HD 6970 CF is already faster than GTX 580 SLI at the higher resolutions due to the RAM advantage (2 GB vs 1.5 GB). This will be an easy victory for the HD 6990.


----------



## Wrigleyvillain (Mar 24, 2011)

mamisano said:


> If anyone wants to see a more accurate review, head over to [H]ardocp.


----------



## Animalpak (Mar 24, 2011)

I hope that you are smart enough to accept criticism.

Why does nobody notice and say that the drivers are young and premature...?

You only say that when an AMD graphics card is reviewed. Huh?

*The worst review I've ever read for a graphics card with such power and engineering; sorry, I do not like it at all.*


----------



## HalfAHertz (Mar 24, 2011)

W1zzard said:


> consider them additional free data points; don't look at them if you don't care. they do provide some insight to some people looking at this from a non-consumer perspective.
> 
> 
> 
> ...



Exactly. W1z's reviews read more like well-written scientific research papers. They provide the most raw data of all reviews on the web and are the most analytical. On top of that, he regularly goes beyond the line of duty, thoroughly recording voltage and frequency scaling, giving insight into the quality of the silicon itself and the inner workings of the architecture.

Now on the 590: exactly what was expected from the leaked clocks. It's just not as efficient as Cayman. And it also backs up my crazy paranoia theory that NVIDIA purposefully did not allow EVGA and Galaxy to make a dual-560 card and limited them to using 460 chips, because they knew that it would step all over the 590.

A dual-560 card OCed to a 900 MHz core at roughly 350 W can easily play with the big boys; see Guru3D's 560 SLI review for reference.

Oh, and remember everybody: digg the review, it takes just a few seconds.


----------



## yogurt_21 (Mar 24, 2011)

EastCoasthandle said:


> check this video out
> I guess w1z isn't the only one.



The smoke came out of the 24-pin power connector for the mobo, not the card.

This card seems to be bringing back a lot of the original Fermi issues. Now don't get me wrong, my 480s run just fine, but we should have had the 580 the first time around. It almost seems like NV needs to bring out a revision for the 590 already, perhaps fixing the voltage limiter and upping the clocks; it seems like at 700 MHz this card would actually beat the 6990.

All in all, it has been a long time since I've seen W1z put up a 7.0 on a flagship card.


----------



## alexsubri (Mar 24, 2011)

Sniff sniff... I smell a massive recall coming...


----------



## entropy13 (Mar 24, 2011)

yogurt_21 said:


> The smoke came out of the 24-pin power connector for the mobo, not the card.
> 
> This card seems to be bringing back a lot of the original Fermi issues. Now don't get me wrong, my 480s run just fine, but we should have had the 580 the first time around. It almost seems like NV needs to bring out a revision for the 590 already, perhaps fixing the voltage limiter and upping the clocks; it seems like at 700 MHz this card would actually beat the 6990.
> 
> All in all, it has been a long time since I've seen W1z put up a 7.0 on a flagship card.



I think that's the first time. The lowest I see are 8.0s, for the GTX 480 and the HD 3870 X2.


----------



## Harlequin_uk (Mar 24, 2011)

A single GTX 580 can't take 1.2 V very well at all - so why think a dual board can?


----------



## LAN_deRf_HA (Mar 24, 2011)

So nvidia and ati end up sharing the performance crown? Not sure when that's happened before, or if it even has. The limited safe overclock potential could be an issue if they can't sort out the power limiter in the drivers, but overall this is actually pretty great. Now you can pick the brand you prefer without concern of trading down any performance. If only it were always so even.


----------



## HalfAHertz (Mar 24, 2011)

Harlequin_uk said:


> A single GTX 580 can't take 1.2 V very well at all - so why think a dual board can?



It can at stock, but I wouldn't try to increase the frequency...


----------



## CDdude55 (Mar 24, 2011)

Looks like a decent card; performance is similar, but it does seem to lag behind the 6990 at times.

Very unfortunate to hear about the power limiting system failing the card in such a way. Then again, such a card is already very beefy, so I personally wouldn't worry about not being able to volt the card higher to get more clocks out of it.


----------



## newtekie1 (Mar 24, 2011)

Harlequin_uk said:


> A single GTX 580 can't take 1.2 V very well at all - so why think a dual board can?



The GTX 580 can take 1.2 V just fine; it was the GTX 570 that had the problems.



yogurt_21 said:


> smoke came out of the 24 pin power connector for the mobo, not the card.



It is hard to tell from the video, but at 720p when you slow it down, it definitely comes from the card.  Specifically from the fuse that W1z highlighted in the review, which is located on the back of the PCB right at the corner by the PCI-E plugs, which is right over the 24 pin.  The smoke that seems to rise from the 24 pin is from the little spark that flies down when the "fuse" pops.


----------



## Athlon2K15 (Mar 24, 2011)

mamisano said:


> Why oh why do you insist on reviewing at such ridiculously low resolutions for these monsters? These cards are made to drive multiple monitors or at the VERY LEAST a single monitor at 1920x1080.
> 
> Second, how do you calculate performance per watt when you know that the NVIDIA cards use power protection to attain lower power usage than the 6990?
> 
> If anyone wants to see a more accurate review, head over to [H]ardocp.



You must be joking. HardOCP only wishes their review was half of what W1zzard does. Seriously, wtf!!


----------



## 20mmrain (Mar 24, 2011)

You know, I got crucified when I said on EVGA's forums that "The GTX 590 will be almost as powerful as the 6990... but it will lose to it because of the clocks dropping." I guess I was right... Man o man, it is interesting though when you make a prediction about video cards and people act like you're taking away their B-day when you're talking about their favorite brand.
It still is a nice card, and as long as the price is right... I could totally see buying one!!!! Because Nvidia still has one thing that ATI for the most part does not have. Better drivers!

Nice review Keep up the great work!


----------



## newtekie1 (Mar 24, 2011)

mamisano said:


> Why oh why do you insist at reviewing at such ridiculously low resolutions for these monsters? These cards are made to drive multiple monitors or the VERY LEAST a single monitor at 1920x1080.



Man, it is a good thing W1z includes these higher resolutions and breaks down all the important information by them, otherwise you might have just had a point.  But sadly, you don't.



mamisano said:


> Second, how do you calculate performance per watt when you know that the Nvidia cards use power protection to attain lower power usage than the 6990.



ATi cards use power protection as well.  However, that doesn't really matter, because if the power protection does kick in, and it probably doesn't, then the performance would suffer as well, balancing things out.  The lower power consumption numbers would go hand in hand with lower performance numbers.

But, like I said, the power protection probably doesn't kick in anyway, since W1z uses the average power consumption number (not the unrealistic Furmark peak number) for his performance per watt figures.  Since the average number is measured during normal gameplay scenarios, the power protection never really kicks in.
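To make that concrete, here's a rough sketch of the perf-per-watt math using average gaming power, as described. All numbers below are made up for illustration, not the review's figures:

```python
# Sketch of a performance-per-watt calculation: relative performance
# divided by *average gaming* power draw, then normalized against one
# card as the baseline. All numbers are hypothetical placeholders.

def perf_per_watt(relative_perf: float, avg_gaming_watts: float) -> float:
    """Performance points delivered per watt of average gaming power."""
    return relative_perf / avg_gaming_watts

# Hypothetical relative performance (%) and average gaming power (W)
cards = {
    "GTX 590": perf_per_watt(100.0, 365.0),
    "HD 6990": perf_per_watt(97.0, 350.0),
}

baseline = cards["GTX 590"]  # normalize so the GTX 590 = 100%
for name, ppw in cards.items():
    print(f"{name}: {100.0 * ppw / baseline:.1f}%")
```

The point being: a card that draws less power during the averaged gaming runs can come out ahead in perf/W even while being slightly slower overall.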



mamisano said:


> If anyone want's to see a more accurate review, head over to [H]ardocp



Accurate?  [H]ardocp?  You mean the site that makes minor tweaks that are hard to spot between each graphics card in the same benchmark?  Like enabling 2xAA for one card while leaving it off for the rest in their Crysis Warhead test?  Or upping the shaders on another card one notch higher, also in Crysis Warhead?


----------



## Assimilator (Mar 24, 2011)

Darn it W1zz, why'd you have to kill the card? Willing to bet that results at 815MHz core would've been quite impressive, although I think the heat and power draw would be as well...

As for everyone complaining that the card blew - there is a reason why a video card's manufacturer's warranty is voided by overclocking, and this is it. I agree 100% that nVidia's (alleged) power-throttling hardware and drivers should've prevented this, and of course they should have done proper internal testing before releasing. (Even a disclaimer, something along the lines of "if you give the cores more than 1v you *will* kill the card", would've been better than nothing).

I think the dual-GPU 560 cards already in development, plus any potential dual-GPU 570s, will be better products than this - IMO nVidia felt compelled to use full-fledged GF110s because ATI's 6990 has standard 6970 GPUs, and it's a decision that's backfired on them. I foresee many RMAs on these cards in the near future and I think nVidia will regret releasing the GTX 590 without making it the performer it should have been. ATI wins this round, no doubt about it.


----------



## MicroUnC (Mar 24, 2011)

overclockers.ru received a bad sample GTX 590.

No matter how hard they tried, they couldn't get past 650 MHz stable.

http://www.overclockers.ru/images/lab/2011/03/23/2/31_590oc.gif

http://www.overclockers.ru/lab/40930/Obzor_videokarty_NVIDIA_GeForce_GTX_590.html#6


----------



## DanTheMan (Mar 24, 2011)

So those video clips of helicopters dumping water to cool down reactors were live video of W1zzard trying to put out the fire in the test lab. Shame on you NVidia for not truly testing a card out fully before letting someone like W1zzard expose your flaws for the entire world to see posted on the web.

I say that Crossfire OC'd 6950 or 6970's would be a killer setup and just skip the 6990 and 590 all together.


----------



## mtosev (Mar 24, 2011)

haha-haha card blew up.  is that covered in the warranty?


----------



## catnipkiller (Mar 24, 2011)

mamisano said:


> Why oh why do you insist at reviewing at such ridiculously low resolutions for these monsters? These cards are made to drive multiple monitors or the VERY LEAST a single monitor at 1920x1080.
> 
> Second, how do you calculate performance per watt when you know that the Nvidia cards use power protection to attain lower power usage than the 6990.
> 
> If anyone want's to see a more accurate review, head over to [H]ardocp.


i smell a fan boy troll


----------



## btarunr (Mar 24, 2011)

There is an Easter egg in the review. First to find gets a cookie.


----------



## mmaakk (Mar 24, 2011)




----------



## N3M3515 (Mar 24, 2011)

newtekie1 said:


> Accurate?  [H]ardocp?  You mean the site that does minor tweaks that are hard to spot between each graphics card in the same benchmark?  Like enabled 2xAA for one card, while leaving it off for the rest in their Crysis Warhead test?  Or upping the shaders on another card one notch higher, also on Crysis Warhead?



The tests are different. HardOCP tests *maximum playable settings* (meaning if one card has 2x antialias and the other one doesn't, it is because fps would drop beyond playable, which I think is obvious to understand, especially for you who are well educated in this stuff), and here W1zz tests apples to apples: same resolution, antialias, etc.

I like both, they're different points of view.


----------



## HalfAHertz (Mar 24, 2011)

btarunr said:


> There is an Easter egg in the review. First to find gets a cookie.



Awww man I went through it twice and didn't find anything. I give up what is it?!


----------



## dccmadams (Mar 24, 2011)

I believe this is at least the 2nd site that had a card fail. I would be concerned about spending my $700 on this.


----------



## Harlequin_uk (Mar 24, 2011)

newtekie1 said:


> The GTX580 can take 1.2v just fine, it was the GTX570 that had the problems.



then the ppl on xtreme are wrong when they are blowing up 580`s with 1.2v then.....


----------



## cadaveca (Mar 24, 2011)

So, W1zz, oh fearless leader, did the fuse blow, or was it other death?

:shadedshu


I wonder what the limit is.


----------



## W1zzard (Mar 24, 2011)

cadaveca said:


> did the fuse blow, or was it other death?



i measured resistance across the fuse and it was blown. after fixing it the card blew again which means that there was some different issue causing the fuse to blow in the first place as protection.

nobody at asus/nvidia/me knows. i sent nvidia instructions to reproduce but this will take a while. and i'm careful now with my 2nd card


----------



## TheMailMan78 (Mar 24, 2011)

KaBOOM!


----------



## Bjorn_Of_Iceland (Mar 24, 2011)

Going to be a while until we see his SLi review then


----------



## Cliffro (Mar 24, 2011)

VulkanBros said:


> He used 267.71.......



According to the GPU-Z screenshot, he did not use 267.71

http://tpucdn.com/reviews/ASUS/GeForce_GTX_590/images/gpuz_oc.gif






According to the image he posted in the review, it was running 267.52

All of that said, I am still disappointed, but only slightly. I never see myself owning a $600+ card; if I were to come close, it would be 2 GTX 570's or 2 HD 6950's, or future equivalents. But I still need to upgrade the rest of my system before I would worry about it.


----------



## cadaveca (Mar 24, 2011)

W1zzard said:


> i measured resistance across the fuse and it was blown. after fixing it the card blew again which means that there was some different issue causing the fuse to blow in the first place as protection.



Ah, OK. Very odd, to say the least...in the vid ECH linked to, you can see the fuse blow, but it looks to be the opposite one....very interesting indeed.

They dunno whut happened, yet the "fuse" is there.


----------



## mmaakk (Mar 24, 2011)

It's kinda funny that the ASUS 590 box says big and loud "*Voltage Tweak*" up to 50% faster


----------



## yogurt_21 (Mar 24, 2011)

mmaakk said:


> Is kinda funny that the ASUS 590 box says big and loud "*Voltage Tweak*" up to 50% faster



I think that might be the easter egg bta was referring to.


----------



## the54thvoid (Mar 24, 2011)

mmaakk said:


> Is kinda funny that the ASUS 590 box says big and loud "*Voltage Tweak*" up to 50% faster explosion



fixed.


----------



## btarunr (Mar 24, 2011)

Nope, keep trying.


----------



## mmaakk (Mar 24, 2011)

yogurt_21 said:


> I think that might be the easter egg bta was referring to.



bta, where's my cookie?

EDIT:


----------



## W1zzard (Mar 24, 2011)

Cliffro said:


> According to the GPU-Z screenshot, he did not use 267.71
> 
> http://tpucdn.com/reviews/ASUS/GeForce_GTX_590/images/gpuz_oc.gif
> 
> ...



interesting find, must have happened at some point during switching cards back and forth. i retested max oc on the correct driver, got the same results and uploaded the screenshot.

the card blew on another rig which definitely had 267.71 installed (i double checked while on the phone with nvidia)


----------



## erocker (Mar 24, 2011)

One of these things is not like the others. Is that what blew up?


----------



## W1zzard (Mar 24, 2011)

erocker said:


> http://i403.photobucket.com/albums/pp112/erocker414/590wtf.jpg
> 
> One of these things is not like the others. Is that what blew up?



nope, you can look at the closeup shots, they are from the dead card, i couldnt find any visible damage at that time


----------



## DanTheMan (Mar 24, 2011)

btarunr said:


> There is an Easter egg in the review. First to find gets a cookie.




Is it on page four (bottom photo) - the match above the chip


----------



## W1zzard (Mar 24, 2011)

DanTheMan said:


> Is it on page four (bottom photo) - the match above the chip



no easter egg, it has been in the reviews for ages to illustrate size. nice find though, didn't think of it in that context


----------



## yogurt_21 (Mar 24, 2011)

entropy13 said:


> I think that's the first time. The lowest I see are 8.0's for the GTX 480 and the HD 3870X2.



yeah you're right, closest ones are the X550 at 7.5 and X1300 at 7.8, neither of which are flagships. 

so it is the first time a flagship has scored this low on tpu.


----------



## Kreij (Mar 24, 2011)

Nope, that's Asus' tag line. They have used it for awhile.


----------



## yogurt_21 (Mar 24, 2011)

btarunr said:


> Nope, keep trying.








pic of crysis 2 with reference to dx 11?



Kreij said:


> Nope, that's Asus' tag line. They have used it for awhile.



yeah figured that out, still odd


off topic ff4 isn't liking this whole multi quote thing.


----------



## DanTheMan (Mar 24, 2011)

W1zzard said:


> no easter egg, it has been in the reviews for ages to illustrate size. nice find though, didn't think of it in that context



Well, is it on page 21, where the Peak Power (356 W) is higher than the Maximum Power (334 W)?


----------



## Zubasa (Mar 24, 2011)

TheMailMan78 said:


> KaBOOM!
> 
> http://lh5.ggpht.com/_n3LFno1UoeI/SdQwVHm07MI/AAAAAAAAI64/lo_LYSuc0tU/happy.jpg


Don't forget Bada*BOOM*. :shadedshu


----------



## yogurt_21 (Mar 24, 2011)

DanTheMan said:


> Well is it Page 21, that the Peak Power (356 W) is higher than the Maximum Power (334 W)



interesting, must have been swapped inadvertently


----------



## cadaveca (Mar 24, 2011)

yogurt_21 said:


> interesting, must have been swapped inadvertently



Uh, no, you guys just looked at pics, didn't read the text that explains why those figures are where they are. 

Rather than telling you why, I'll let you go and check it out for yourself.


----------



## MicroUnC (Mar 24, 2011)

Can my rig feed this card?

<= Specs


----------



## H82LUZ73 (Mar 24, 2011)

SammyHayabuza said:


> *The Big winner is the 6970 in Crossfire!!!*


----------



## DanishDevil (Mar 24, 2011)

MicroUnC said:


> Can my rig feed this card?
> 
> <= Specs



I would recommend upping your power supply if you're going to be overclocking either the card or the CPU.


----------



## DanTheMan (Mar 24, 2011)

cadaveca said:


> Uh, no, you  guys just looked at pics, didn't read the text that explains why those figures are where they are.
> 
> Rather than telling you why, I'll let you go and check it out for yourself



Actually I did read the text above the graphs and did not "just look at the pics", but the wording in the sentence and what is used in the graphs is not the same, so it was a little confusing.


----------



## yogurt_21 (Mar 24, 2011)

cadaveca said:


> Uh, no, you  guys just looked at pics, didn't read the text that explains why those figures are where they are.
> 
> Rather than telling you why, I'll let you go and check it out for yourself


but I likes pictures. lol

ah



> Due to NVIDIA's power limiting system being active in our "Furmark Maximum Test", and no way to disable that feature, this single data point should not be considered the maximum possible power consumption, as the normal 3D Maximum test shows higher power draw than 334 W.
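The quoted explanation boils down to a simple cap, roughly like this (the 334 W / 356 W figures are from the review; the cap logic itself is just an illustration, not NVIDIA's actual implementation):

```python
# Toy model of why the power-limited Furmark reading (334 W) can come in
# below the unthrottled "3D Maximum" reading (356 W): the driver caps the
# draw only for workloads it recognizes as stress tests. The two wattages
# are from the review; the cap logic here is purely illustrative.

FURMARK_CAP_W = 334.0

def reported_draw(actual_draw_w: float, limiter_active: bool) -> float:
    """Power the card actually pulls once the limiter (if any) kicks in."""
    return min(actual_draw_w, FURMARK_CAP_W) if limiter_active else actual_draw_w

print(reported_draw(450.0, limiter_active=True))   # Furmark: capped at 334.0
print(reported_draw(356.0, limiter_active=False))  # normal 3D peak: untouched
```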


----------



## MicroUnC (Mar 24, 2011)

DanishDevil said:


> I would recommend upping your power supply if you're going to be overclocking either the card or the CPU.



I'll be oc'ing only CPU!


----------



## temp02 (Mar 24, 2011)

How embarrassing... and that's really all I have to say about this fai... erm... card.


----------



## DanishDevil (Mar 24, 2011)

MicroUnC said:


> I'll be oc'ing only CPU!



I would still suggest something closer to 1000W. My personal favorites are Corsair or Enermax. Feel free to make a new thread, though.


----------



## MicroUnC (Mar 24, 2011)

DanishDevil said:


> I would still suggest something closer to 1000W. My personal favorites are Corsair or Enermax. Feel free to make a new thread, though.



nah.... that's it!

thanks man


----------



## Animalpak (Mar 24, 2011)

soo rumors about a revision ??


----------



## H82LUZ73 (Mar 24, 2011)

yogurt_21 said:


> smoke came out of the 24 pin power connector for the mobo, not the card.
> 
> this card seems to be bringing back alot of the fermi original issues. Now don't get me wrong, my 480's run just fine but we should have had the 580 the first time around. It almost seems like nv needs to bring out a revision for the 590 already. perhaps fixing the voltage limiter and upping the clocks, seems like at 700MHZ this card would actually beat the 6990.
> 
> all in all it has been a long time since I've seen W1z put up a 7.0 on a flagship card.



No, it came from one of the 8-pin power plugs on the card. If it was the mobo, the CPU fan, memory, and all that area would be up in smoke like W1zz's 590.


----------



## DanishDevil (Mar 24, 2011)

The video linked from the Swedish overclocking site also suggests that the resistor that sits right next to the 8-pin plugs blew. W1zz took it one step further and blew out a mosfet as well, though.


----------



## alexsubri (Mar 24, 2011)

*Wizz's PSU before testing GTX 590*






*Wizz's PSU after testing GTX 590*


----------



## Frick (Mar 24, 2011)

entropy13 said:


> They have a safety feature that didn't work. That was the point, how it happened is inconsequential. It's like saying that an airbag not working for a car running at 200mph is not an issue because it happened at 200mph, and not all cars reach 200mph.



No it isn't. I get your point though, and I still don't see it as that bad.


----------



## MicroUnC (Mar 24, 2011)

What happened to Nvidia could happen to anyone. This doesn't make the GTX 590 a bad card, or a failure for the company. Besides, the GTX 590 was produced and will be shipped in limited quantities.

Intel had a faulty P67 chipset, so what?

And I admit that ATi is the winner here. But I'm still getting the card! 'cause I support them.

And both companies are great.


----------



## H82LUZ73 (Mar 24, 2011)

Wizz, could the problem be a bad batch of mosfets and resistors? Like they had to use old stock from somewhere? If that is the case, would ASUS and Nvidia wait until Japan is back up and running... minus the extra radiation power? Very interesting card to say the least. It performs about as I was expecting, but I was also expecting it to overclock better.


----------



## Kreij (Mar 24, 2011)

@W1zz ... Why on the Power Consumption page does the text for idle still say "Windows *Vista* Aero sitting at desktop" ?
Should be 7, no?


----------



## newtekie1 (Mar 24, 2011)

N3M3515 said:


> The tests are different, hardocp tests *maximun playable settings*(meaning if one card has 2x antialias and the other one doesn't, it is because fps will lower beyond playable, which i think it's obvious to understand, specially for you that are well educated at this stuff), and here w1zz tests apples to apples, same resolution, antialias, etc.
> 
> I like both, they're different points of view.



The problem is that they are inconsistent even if we assume the reasoning is true.

Again, I'll use the Crysis Warhead benchmark.

For the GTX580 SLI, they raised the shaders to Enthusiast.  But for the HD6990 they left the shaders at Gamer and instead raised the AA.  If GTX580 SLI can run with Enthusiast shaders and get 30 FPS, which they consider "playable", then the HD6990 should have been able to do the same, considering they raised AA with that card and it still got 35 FPS.  Why didn't they increase the shaders instead of AA?


----------



## Deleted member 84940 (Mar 24, 2011)

1.2v is taking the pee a bit, ain't it?


----------



## TheMailMan78 (Mar 24, 2011)

alexsubri said:


> *Wizz's PSU before testing GTX 590*
> http://www.youfrenzy.com/wp-content/uploads/2010/11/Nuclear-Plant.jpg
> 
> *Wizz's PSU after testing GTX 590*
> http://guyananewstoday.com/wp-content/uploads/2011/03/Explosion-at-Japan-nuclear-plant.jpg



That's in horrible taste, but damn it's funny!


----------



## laszlo (Mar 24, 2011)

nice review; i suspect nvidia clocked the cards to this "stock" speed just to avoid the kind of damage that occurred during oc.


----------



## Frick (Mar 24, 2011)

laszlo said:


> nice review;i suspect nvidia clocked the cards to this "stock" speed just to avoid the kind of damage occurred  during oc.



But there was no damage when overclocking; it occurred when changing voltage.


----------



## W1zzard (Mar 24, 2011)

Kreij said:


> @W1zz ... Why on the Power Consumption page does the text for idle still say "Windows *Vista* Aero sitting at desktop" ?
> Should be 7, no?



wow .. you're the first to notice this since i changed my test system to windows 7 .. fixed! thanks



Dumachi said:


> 1.2v is taking he pee abit aint it?



i dont consider it an unrealistic setting, people are going to tweak this, and with power capping technology they (and i) think "i'll be safe anyway"


----------



## laszlo (Mar 24, 2011)

Frick said:


> But there was no damage when overclocked, it occured when changing voltage.



lol, and why would you change voltage if not for oc?


----------



## D4S4 (Mar 24, 2011)

hahahhaha i loled so hard when i read *boom*


----------



## 6626 (Mar 24, 2011)

Seems like it is slower than 2x570 SLI and probably 2x560 SOC SLI but costs more.

No point in 6990 and 590 cards unless you plan to run two of them for quad SLI.


----------



## MxPhenom 216 (Mar 24, 2011)

Guys, out of the millions Nvidia is making, just because this one card blew up on W1zz doesn't mean every other card will. I'll take the 590 over the 6990, sacrificing the 2-3% for a cooler running, a little more efficient, and quieter card.


----------



## rflair (Mar 24, 2011)

nvidiaintelftw said:


> guys out of the the million nvidia is making. just because this one card blew up on wizz doesnt mean every other card will. Ill take the 590 over the 6990 in the sacrafice of the 2-3% for the cooler running card, a little bit more efficient, and quieter card.



The 6990 is the more efficient and cooler card; its fan, however, makes it noisier.


----------



## N3M3515 (Mar 24, 2011)

newtekie1 said:


> The problem is that they are inconsistant even if we assume the reasoning is true.
> 
> Again, I'll use the Crysis Warhead benchmark.
> 
> For the GTX580 SLi, they raised the shaders to Enthusiast.  But for the HD6990 they left the shaders at Gamer and instead raised the AA.  If GTX580 SLI can run with enthusiast shaders, and get 30FPS which they consider "playable" then the HD6990 should have been able to do the same, considering they raised AA with that card and it still have 35FPS.  Why didn't they increase the shaders instead of AA?



Because increasing the shaders would have caused a larger performance hit than raising AA (as I understand it, AMD and Nvidia have different architectures, so what affects one may not affect the other).
imho ('cause I didn't do the tests)


----------



## mlee49 (Mar 24, 2011)

So no Quad SLI review? 

Nice job Wiz, I personally enjoy it when a well seasoned veteran blows up some expensive gear.


----------



## Kreij (Mar 24, 2011)

nvidiaintelftw said:


> guys out of the the million nvidia is making. just because this one card blew up on wizz doesnt mean every other card will. Ill take the 590 over the 6990 in the sacrafice of the 2-3% for the cooler running card, a little bit more efficient, and quieter card.



You are correct in saying that not all cards may fry, but remember that the people buying the highest end stuff are the speed freaks. They are the most likely to overclock and overvolt things to get max performance. 
If there is an inherent weakness in the circuitry or drivers that can cause the card to nuke, these are the people who will find it ... and not be pleased when they brick their cards.


----------



## W1zzard (Mar 24, 2011)

mlee49 said:


> So no Quad SLI review?



no quad sli drivers


----------



## MxPhenom 216 (Mar 24, 2011)

Kreij said:


> You are correct in saying that not all cards may fry, but remember that the people buying the highest end stuff are the speed freaks. They are the most likely to overclock and overvolt things to get max performance.
> If there is an inherent weakness in the circuitry or drivers that can cause the card to nuke, these are the people who will find it ... and not be pleased when they brick their cards.



Well, I've been reading up on threads on other forums, and people have been saying reviews that didn't get ASUS 590s were all completely fine. And other people who got ASUS cards for reviews probably didn't ramp up the fan speed.


----------



## mlee49 (Mar 24, 2011)

W1zzard said:


> no quad sli drivers



No two working cards either 


Quad SLI will be up in a week, quote me.


----------



## qubit (Mar 24, 2011)

I'd take this card over a 6990 and here's why:

It's faster.
It's considerably quieter.
One card failed on overvolting. And so? Defective cards happen, that's what a warranty is for. If it's a design glitch as suggested in the review, they'll recall them and issue ones with improved power circuits. All in all, this incident isn't such a big deal. I'm sure that idiot Charlie Demerjian will jump on it as "proof" that he was right all along though.  <sigh>

I think this is the real killer ace up its sleeve: the 6990 is running at nearly full speed at stock. From what I can see, even at full speed with the BIOS switch set to turbo it doesn't win, or just equals. However, the 590 wins the benchies when _considerably_ underclocked at stock speed. This is unheard of in performance races. How much untapped performance does the 590 actually have then? Significant, I reckon. The 6990 can't overclock much though, can it?

I'll bet you non-reference boards with non-reference coolers come out that make this beast run at full GTX 580 speeds and beyond. The 6990 will then be a distant second in the performance race. Yes, it will use enormous amounts of power. However, that's to be expected for a card that's gonna be used by top-end enthusiasts. These guys are gonna have the rigs to cope with such a card.

Finally, as I've said in a couple of other places now, both these advanced GPU architectures will only fully realize their potential when they're made with 28nm technology. The current 40nm tech has run into the power wall now.


----------



## Frizz (Mar 24, 2011)

nvidiaintelftw said:


> well ive been reading up on other threads on other forums and people have been saying reviews that didnt get asus 590s were all completely fine. and other people who got asus cards for reviews probalby didnt ramp up on the fan speed.





TPU 1 dead card
Dem Crazy Swedes.. 1 dead card
Lab501 2 dead cards (non-OC'd, default)
T-Break 1 dead card..

Not all were from the same manufacturers, there is obviously a design flaw that needs attention.


----------



## Kreij (Mar 24, 2011)

W1zzard said:


> no quad sli drivers



I would've thought you had written your own by now.


----------



## qubit (Mar 24, 2011)

randomflip said:


> TPU 1 dead card
> Dem Crazy Swedes.. 1 dead card
> Lab501 2 dead cards (non -OC'd, default)
> T-Break 1 dead card..
> ...



Ouch, that's a lot of dead cards. I can feel that product recall coming on.


----------



## Maban (Mar 24, 2011)

I no longer want one...


----------



## HalfAHertz (Mar 24, 2011)

qubit said:


> I'd take this card over a 6990 and here's why:
> 
> It's faster.
> It's considerably quieter.
> ...



or better yet choose neither because both dual GPUs of this gen have their own little flaws and quirks.


----------



## qubit (Mar 24, 2011)

HalfAHertz said:


> or better yet choose neither because both dual GPUs of this gen have their own little flaws and quirks.



You mean wait for 28nm? Yeah, I'll roll with that.


----------



## yogurt_21 (Mar 24, 2011)

qubit said:


> You mean wait for 28nm? Yeah, I'll roll with that.



well yeah I'm pretty sure you can *suffer* along with your 580 until then lol


----------



## Fourstaff (Mar 24, 2011)

Kreij said:


> I would've thought you had written your own by now.



There's plenty for W1z to do, you know: reviews to write, a site to manage, MailMans to ban, and he's also working on bankrupting Fraps with TPU Capture. Also, no point writing one since he killed one of the pair he received.


----------



## qubit (Mar 24, 2011)

yogurt_21 said:


> well yeah I'm pretty sure you can *suffer* along with your 580 until then lol



It's an excellent card. It's really fast and doesn't make too much fan noise about it, either.  40nm tech is good enough for one GPU on a card.

I avoid getting dual GPU cards anyway, because of their inherent issues, such as scaling, only half the RAM being usable, and high cost.


----------



## MxPhenom 216 (Mar 24, 2011)

Yeah, Nvidia needs to recall these and fix the VRMs and add more phases. 4+1 phase power is not enough for 150-200 W GPUs.


----------



## jamsbong (Mar 24, 2011)

The GTX590 is a real disappointment. It just lacks speed. In hindsight, I realise the 6990 is a great achievement.

One thing you'll notice is that the 590 runs hotter than the 6990, which led to a noisier 6990. If you need to o/c these cards, you'll want to cool them really well to reduce the power consumption (leakage). I think a watercooled 590 vs a watercooled 6990 would see the 590 winning.

As I am not going to spend that much money on these cards, I won't even bother with either of them. I'd rather wait for a GPU two generations down the track or get a SLI/crossfire setup.

If I were going to get one of these cards, I would water cool it and overclock it. That would release the card's full potential and thus make it worthwhile.


----------



## bogie (Mar 24, 2011)

You wanna graphics card with a bang for your buck?! Then buy the 590!!

If I could choose I think I'd take the AMD 6990 with a water block or aftermarket cooler. But it's out of my price range anyway!

2 modded 6950's is the way forward for me for now, methinks!

2 x modded HD6950's = £420
2 x HD6970's = £560
2 x 580's = £710
1 x 6990 = £545
1 x 590 = £570
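For fun, a quick £-per-performance-point sort of that list. Only the prices are from the post; the relative performance numbers below are made-up placeholders, not benchmark data:

```python
# Rough £-per-performance-point comparison of the setups listed above.
# Prices are the GBP figures from the post; the relative performance
# numbers are hypothetical placeholders, not review results.

setups = {
    # name: (price_gbp, hypothetical relative performance)
    "2 x modded HD6950": (420, 95),
    "2 x HD6970": (560, 105),
    "2 x GTX580": (710, 115),
    "1 x HD6990": (545, 97),
    "1 x GTX590": (570, 100),
}

def pounds_per_point(price_gbp: float, perf: float) -> float:
    """Lower is better: cost of each relative-performance point."""
    return price_gbp / perf

# List setups from best to worst value
for name, (price, perf) in sorted(setups.items(),
                                  key=lambda kv: pounds_per_point(*kv[1])):
    print(f"{name}: £{pounds_per_point(price, perf):.2f} per point")
```

With any plausible performance numbers, the modded 6950 pair comes out as the value pick, which is the post's point.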


----------



## MxPhenom 216 (Mar 24, 2011)

jamsbong said:


> The GTX590 is a real disappointment. It just lack speed. On hindsight, I realise the 6990 is a great achievement.
> 
> One thing you'll notice is that the 590 runs hotter than the 6990 which led to a noisier 6990. If you need to o/c these cards, you'll want to cool it really well to reduce the power consumption (leakage). I think a watercooled version of the 590 vs watercooled 6990 will see that the 590 winning.
> 
> ...



Uhh, in all the reviews I've seen, the 590 runs cooler on the stock fan profile.


----------



## erocker (Mar 24, 2011)

nvidiaintelftw said:


> yeah nvidia needs to recall and fix the VRMs and add more phases. 4+1 phase power is not enough for 150-200w GPUs



Will never happen. Never, ever. I doubt they are making much money on these cards to begin with.


----------



## cadaveca (Mar 24, 2011)

erocker said:


> Will never happen. Never, ever. I doubt they are making much money on these cards to begin with.






No, Gigabyte is making NOTHING on an $1100 card... nothing at all


----------



## MxPhenom 216 (Mar 24, 2011)

erocker said:


> Will never happen. Never, ever. I doubt they are making much money on these cards to begin with.



so much for the best card they have ever made


----------



## qubit (Mar 24, 2011)

erocker said:


> Will never happen. Never, ever. I doubt they are making much money on these cards to begin with.



I wouldn't bank on that. The kind of people rich enough to buy a couple of them are more likely to file a lawsuit if it breaks and they don't honour the warranty. Or, if they do honour it, the replacements keep breaking the same way. Or sometimes too much bad publicity will do it as their sales start to tank.

It's all grey areas here though, so I'm not going to bet on any outcome.


----------



## MxPhenom 216 (Mar 25, 2011)

I would like to see what MSI and Asus do with their Twin Frozr/Lightning cards and DirectCU non-reference cards. I have a feeling they will beef up the VRMs and circuitry.


----------



## mlee49 (Mar 25, 2011)

Looks like Nvidia is addressing the overvolting 'problem'

http://nvidia.custhelp.com/cgi-bin/nvidia.cfg/php/enduser/std_adp.php?p_faqid=2947


----------



## MxPhenom 216 (Mar 25, 2011)

mlee49 said:


> Looks like Nvidia is addressing the overvolting 'problem'
> 
> http://nvidia.custhelp.com/cgi-bin/nvidia.cfg/php/enduser/std_adp.php?p_faqid=2947



What's the max overclock then, like 675??


----------



## CDdude55 (Mar 25, 2011)

mlee49 said:


> Looks like Nvidia is addressing the overvolting 'problem'
> 
> http://nvidia.custhelp.com/cgi-bin/nvidia.cfg/php/enduser/std_adp.php?p_faqid=2947



They basically just told everyone to suck it up and run the cards at bone stock unless you're throwing a water block on it. I hope the non reference models will have stronger VRM circuitry as well as better coolers for those that want to push their 590.


----------



## MxPhenom 216 (Mar 25, 2011)

CDdude55 said:


> They basically just told everyone to suck it up and run the cards at bone stock unless you're throwing a water block on it. I hope the non reference models will have stronger VRM circuitry as well as better coolers for those that want to push their 590.



Asus DIRECTCU and MSI Lightning!


----------



## WarEagleAU (Mar 25, 2011)

YES the 6990 remains king!! HOOHAH! ok me being silly aside, I really expected more from the card. Huge OC potential even with the crashes. I saw the HD 6990 for like $750 on the egg. I expect this to be the same


----------



## Sasqui (Mar 25, 2011)

I just read the whole review and the overclock results.







Magic smoke...

W1zzard, you didn't say... If it wasn't Furmark that was used when the card fried, what was it???


----------



## MxPhenom 216 (Mar 25, 2011)

WarEagleAU said:


> YES the 6990 remains king!! HOOHAH! ok me being retarded aside, I really expected more from the card. Huge OC potential even with the crashes. I saw the HD 6990 for like  750 on the egg. I expect this to be the same



it's actually $699 on the egg for the Asus card. and the 6990 remains king by a massive 2-3%. even then they trade blows, so i wouldnt say the 6990 is still king. it's a stalemate at this point till the 28nm GPUs come out


----------



## btarunr (Mar 25, 2011)

*any overclocking/overvoltaging can void your manufacturer's product warranty.*

Do ya think? 

Now NV is hiding behind the warranty-void bullshit instead of admitting this card is unfit to be an "enthusiast" card, and is more of a card for people too faint-hearted to overclock, who pay for their overclocking incompetence by buying a faster card that can't be overclocked.


----------



## TheMailMan78 (Mar 25, 2011)

Animalpak said:


> I hope that you are smart enough to accept criticism
> 
> Why nobody notice and say the drivers are young and premature ...
> 
> ...



Someones butt hurt? Its OOOOTAY!


----------



## MxPhenom 216 (Mar 25, 2011)

btarunr said:


> http://img.techpowerup.org/110324/bta9886.jpg
> 
> *any overclocking/overvoltaging can void your manufacturer's product warranty.*
> 
> ...



the enthusiast now will either get a 6990 or wait for non reference GTX590 cards to release knowing of these issues



Animalpak said:


> I hope that you are smart enough to accept criticism
> 
> Why nobody notice and say the drivers are young and premature ...
> 
> ...



sorry that you lost your boner over this review, but honestly, don't flame wizz about something he forgot to mention. These issues with the 590's VRM/circuitry cannot really be fixed with drivers, it's a hardware issue. 

Wizz did a good job on the review, i guess some people can't wrap their head around the idea that Nvidia produces cards that can sometimes fail.


----------



## erocker (Mar 25, 2011)

As an enthusiast I love waiting. I'm going to wait for at least three years, then I'll get a card that makes these look like a low end mobile GPU.


----------



## btarunr (Mar 25, 2011)

yogurt_21 said:


> http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_590/images/package2.jpg
> 
> pic of crysis 2 with reference to dx 11?



Nope. Keep trying. 

First spotter gets a custom title. If you already have a custom title, you can animate your avatar as long as it doesn't pose an epileptic hazard.


----------



## newtekie1 (Mar 25, 2011)

btarunr said:


> http://img.techpowerup.org/110324/bta9886.jpg
> 
> *any overclocking/overvoltaging can void your manufacturer's product warranty.*
> 
> ...



No, they are giving the standard line that is, and always has been, true.  People have gotten too comfortable with overclocking and overvolting, to the point that they expect to be able to just max out the voltage on a graphics card and not have any problems.  Well, that isn't the case.

If you are going to overclock, there is a risk.  If you are going to overvolt, there is a huge risk.  I have always worked under the rule: if I can't afford to replace it, I don't overvolt it.



CDdude55 said:


> They basically just told everyone to suck it up and run the cards at bone stock unless you're throwing a water block on it. I hope the non reference models will have stronger VRM circuitry as well as better coolers for those that want to push their 590.



Or more accurately, they told everyone what should have been obvious from the beginning.

Personally, I'd leave the voltage at stock, and be happy with the fact that it overclocks to GTX580 clocks.


----------



## Sasqui (Mar 25, 2011)

btarunr said:


> Nope. Keep trying.
> 
> First spotter gets a custom title. If you already have a custom title, you can animate your avatar as long as it doesn't pose an epileptic hazard.



"... voltage tweek... up to 50%* faster clock speed ..."

... *is that with or without popcorn?


----------



## TheMailMan78 (Mar 25, 2011)

btarunr said:


> Nope. Keep trying.
> 
> First spotter gets a custom title. If you already have a custom title, you can animate your avatar as long as it doesn't pose an epileptic hazard.



hint?


----------



## ShogoXT (Mar 25, 2011)

I'm curious about this easter egg as well. Don't even know what to look for.


----------



## JATownes (Mar 25, 2011)

The first pic on the first page of the review is NOT a pic of the ASUS card.  Is that it?


----------



## ShogoXT (Mar 25, 2011)

JATownes said:


> The first pic on the first page of the review is NOT a pic of the ASUS card.  IS that it?



I figured they always have a reference design pic. 

Did see this on that page: 


> We have today with us a GeForce GTX 590 by ASUS, which sticks to NVIDIA's reference design, and combines it with ASUS' high quality packaging and bundle. ASUS' Voltage Tweak technology and SmartDoctor software that lets you up voltage is very much part of the package, ready to enhance your GTX 590 with overclocking. *Boy oh boy.*


----------



## JATownes (Mar 25, 2011)

ShogoXT said:


> I figured they always have a reference design pic.



Yea, I am just grasping at straws trying to figure this out now.  :shadedshu

Is it the freaky weird Call of Juarez 2 1680x1050 score?  That's all wacky.


----------



## ShogoXT (Mar 25, 2011)

Im not surprised, I quit using crossfire and such for this exact reason. Power issues, and non linear performance.

Also I agree with the final Crysis 2 comment on the review. We need to be taken seriously.


----------



## JATownes (Mar 25, 2011)

> The overclocks of our card are 775 MHz core (26% overclock) and 1080 MHz Memory (17% overclock).



This quote does not match the screenshot provided.  The screenshot reflects 1000 MHz memory, not 1080 MHz.  


And why do the clock profiles reflect a memory speed of 1710 MHz?





Any of these the proposed Easter Egg?  Those are all the guesses I have.


----------



## Deleted member 67555 (Mar 25, 2011)

It's the pop in this pic
http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_590/images/package2.jpg
at the bottom left by the certificates it says POP either that or it says PnP but pop would be funny as hell


----------



## JATownes (Mar 25, 2011)

jmcslob said:


> It's the pop in this pic
> http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_590/images/package2.jpg
> at the bottom left by the certificates it says POP either that or it says PnP but pop would be funny as hell



That is on all the boxes that I have seen so far.    But I think it is funny that all the Asus GTX590s say "Pop" right on the box.  At least they are warning you.


----------



## Deleted member 67555 (Mar 25, 2011)

I'm rereading this review and it's actually funny to go over it again...
It's funny that W1z used a match to show the size of the GPU....
http://www.techpowerup.com/reviews/ASUS/GeForce_GTX_590/images/gpu2.jpg

Do all dual GPU cards read the shader count as a single GPU?


----------



## Akrian (Mar 25, 2011)

Well. That was a great review. Although I'm scratching my head over the fact that Guru3d.com have their GTX 590 SLI review up too. How did they manage to do that?

Anyway. I'm glad that nvidia failed this time around, two times in a row to be exact (last year the 5970 had the crown, this year the 6990). As mr. Torento said in Fast n Furious - it doesn't matter if you win by 100 miles or by 1 inch - a victory is a victory. And based on all the reviews that are up atm, the 6990 looks to be winning by that "1 inch".

What killed me was the fact that Nvidia made a video on youtube announcing that they would pull out a monster card, that they'd been working on it for 2 years, and that it's their best creation yet. 
Nvidia Teaser

P.S. I'm not an ATI fanboy, although I admit that I have a soft spot for the "Red" team; my current rig has 580s in SLI and it rocks  ( well it does till you try to play RIFT on ultra settings with all bars set to max... and have 15 fps during massive rift invasions with 100+ players running around ).


----------



## XxAtlasxX (Mar 25, 2011)

I found an interesting video about the issue :shadedshu

http://www.youtube.com/watch?v=sRo-1VFMcbc&feature=player_embedded


----------



## Akrian (Mar 25, 2011)

Edit.
Nevermind I've finally looked at 720p -> it is the card. lol


----------



## fullinfusion (Mar 25, 2011)

Good going Green Goblin!!!! Oh and great review as always W1zz 

The Green team is rushing again to catch up with AMD (ATI) :shadedshu...


----------



## fullinfusion (Mar 25, 2011)

XxAtlasxX said:


> I found an interesting video about the issue :shadedshu
> 
> http://www.youtube.com/watch?v=sRo-1VFMcbc&feature=player_embedded



WOW is all I gota say!


----------



## Goodman (Mar 25, 2011)

XxAtlasxX said:


> I found an interesting video about the issue :shadedshu
> 
> http://www.youtube.com/watch?v=sRo-1VFMcbc&feature=player_embedded



Already posted by Eastcoasthandle on page 3, so please try to go through all the pages before posting...

http://www.techpowerup.com/forums/showpost.php?p=2234319&postcount=66


----------



## Airbrushkid (Mar 25, 2011)

From what I am reading on other sites, the GTX 590 is underclocked. So I do believe it is faster than the 6990. And on power draw, other sites did a comparison between the GTX 590 and GTX 570s in SLI: the GTX 570s in SLI took more power than the GTX 590. I believe the GTX 590 uses 2 GTX 570 chips.

One other thing: only one other site had a problem with the fire thing. So out of all the reviews, only 2 had the problem.


----------



## HalfAHertz (Mar 25, 2011)

Airbrushkid said:


> From what I am reading on other sites the GTX 590 is under clocked. So I do beleive it is faster then the 6990. And the power draw from other sites did a comparison between the GTX 590 and the GTX 570 in SLI. The GTX 570's In SLI took more power then the GTX 590. I believe the GTX 590 uses 2 GTX 570 chips.
> 
> One other thing only one other site had a problem with the fire thing. So out of all the other reviews only 2 had the problem.



Nope on all of the above. Go back and do more reading 

Ok I am back in the game. Is this image the easter egg? Because we have never before seen a glimpse of Wiz's test system in a review.


----------



## DeathByTray (Mar 25, 2011)

Well guys, you can mock the card all you want but fact is, the GTX 590 gives the best bang for the buck.


----------



## Airbrushkid (Mar 25, 2011)

You don't know? That is what the new GTX 590's have now: they light up when there's not enough power.





HalfAHertz said:


> Nope on all of the above. Go back and do more reading
> 
> Ok I am back in the game. Is this image the easter egg? Because we have never before seen a glimpse of Wiz's test system in a review.
> http://tpucdn.com/reviews/ASUS/GeForce_GTX_590/images/glow_small.jpg


----------



## Fourstaff (Mar 25, 2011)

DeathByTray said:


> Well guys, you can mock the card all you want but fact is, the GTX 590 gives the best bang for the buck.



It's quite hard to detect sarcasm on the internet, so I think we could all do with less. And if it's not sarcasm, what kind of pot are you smoking to think that the GTX 590 gives the best bang for buck?


----------



## Bjorn_Of_Iceland (Mar 25, 2011)

JATownes said:


> Yea, I am just grasping at straws trying to figure this out now.  :shadedshu
> 
> IS it the freaky weird Call of Juarez 2 1680x1050 score.  That's all wacky.
> 
> http://img.techpowerup.org/110324/Call of Juarez 2.jpg


Probably a divisible-by-8 bug. Normally happens on console ports. Bulletstorm had this effect wherein it runs like crap at 1680x1050, because 1050 is not divisible by 8. Making a custom resolution of 1680x1048 would probably fix this.
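The workaround described above can be sketched in a few lines (a hypothetical illustration; the idea that render targets get padded to 8-pixel multiples is the post's assumption, not confirmed behaviour of these games):

```python
# Hypothetical sketch of the "divisible by 8" workaround: some console
# ports misbehave when the vertical resolution isn't 8-pixel aligned,
# so round the height down to the nearest multiple of 8.
def align_down(value: int, multiple: int = 8) -> int:
    """Round value down to the nearest multiple."""
    return value - (value % multiple)

print(1050 % 8)          # 2, so 1050 is not divisible by 8
print(align_down(1050))  # 1048, the custom height suggested above
```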


----------



## H82LUZ73 (Mar 25, 2011)

Easter egg: Wizz got a new display toy? 

	LG Flatron W3000H 30" 2560x1600, sponsored by Zotac. Does this mean we get a GPU-Z contest too?
 Also, drivers had nothing to do with the card going pop; he used the 267.71 driver.


----------



## DeathByTray (Mar 25, 2011)

Fourstaff said:


> Its quite hard to detect sarcasm in the internet, so I think we all can do with less. And if its not, what kind of pot are you smoking to think that GTX590 gives the best bang for buck?


It's called pun.
Best *bang *for buck

Didn't think I'd have to explain that one.


----------



## Deleted member 67555 (Mar 25, 2011)

Fourstaff said:


> Its quite hard to detect sarcasm in the internet, so I think we all can do with less. And if its not, what kind of pot are you smoking to think that GTX590 gives the best bang for buck?



Northern lights? I dunno 
I read that sarcasm from across the obvious room...
I'll always remember this card as the Bang for your buck LuLzzz.


----------



## Jonap_1st (Mar 25, 2011)

bring those goddamn 460 2win already !!


----------



## wolf (Mar 25, 2011)

beast of a card but it seems so limited for such a high end product. I'd like to see non reference cards with more power phases and perhaps even another pci-e power connecter just to be sure.

for the time being though, I think there is a lot of potential for a GF114 SLi on a stick card, preferably with 2gb each. from a few reviews where they are also tested, theyre not far behind the 590 in stock trim, clock them @ 900mhz, give them 2gb each and it will likely still consume less power.


----------



## oblivionfall (Mar 25, 2011)

*Bada Boom!*

 Milla Jovovich telling to Bruce Willis how the Nvidia GTX 590 debut-test went:
http://www.youtube.com/watch?v=j8WLYzA0lCs


----------



## Fourstaff (Mar 25, 2011)

DeathByTray said:


> It's called pun.
> Best *bang *for buck
> 
> Didn't think I'd have to explain that one.



All puns and sarcasm are hard to detect in the interwebs, so please add a /pun or /sarcasm tag to it next time for the benefit of linguistically handicapped people


----------



## rflair (Mar 25, 2011)

Couple of thermal pics.

Nvidia 590

http://www.hardware.fr/articles/825-4/dossier-nvidia-repond-amd-avec-geforce-gtx-590.html

AMD 6990

http://www.hardware.fr/articles/747-22/maj-dossier-cartes-graphiques-degagement-thermique.html


----------



## Harlequin_uk (Mar 25, 2011)

so, you shove 1.2v up its ass, it blows up, and then you blame nvidia? 1.2 is a lot for a stock 580 (they blow up on XS with that) but again you're blaming nv? bs.


----------



## Mussels (Mar 25, 2011)

Harlequin_uk said:


> so , you shove 1.2v up its ass , it blows up and then you blame nvidia? 1.2 is alot for a stock 580 (they blow up on XS with that) but again your blaming nv? bs.



the card is advertised as that being a supported feature. Also, nvidia claimed to have drivers with power throttling to prevent that exact situation.


----------



## brendonmc (Mar 25, 2011)

qubit said:


> I don't have time to read the review while I'm at work, so just from the conclusion it doesn't look too good, does it? It was only worth a 7 rating, too. Shame.
> 
> And oh god, this looks damning:
> 
> ...



It looks like graphics card performance is being held back more by game developers.  Probably be able to dust off the old Geforce 4200 soon.....if only it were PCI-E!!

Interesting too that these cards don't overload one PCI-E slot.


----------



## brendonmc (Mar 25, 2011)

Or do they???  Perhaps this is why they are so close....


----------



## Bjorn_Of_Iceland (Mar 25, 2011)

Good thing theyve migrated from the TnT naming, and 'detonator' drivers.

I guess all those who preordered will have a _blast_ with their new toy.


----------



## qubit (Mar 25, 2011)

brendonmc said:


> It looks like graphics card performance is being held back more by game developers.  Probably be able to dust off the old Geforce 4200 soon.....if only it were PCI-E!!
> 
> Interesting too that these cards don't overload one PCI-E slot.





brendonmc said:


> Or do they???  Perhaps this is why they are so close....



You're right, game devs are holding back PC graphics, unfortunately. Also, the 40nm process is running into the brick wall of power dissipation and heat. We need 28nm asap.

And no, these cards cannot overload the PCI-E slot, or they'd burn out your mobo. Therefore, they take all the extra juice from the PCI-E power connectors.

Oh and welcome to TPU.


----------



## W1zzard (Mar 25, 2011)

Harlequin_uk said:


> so , you shove 1.2v up its ass , it blows up and then you blame nvidia? 1.2 is alot for a stock 580 (they blow up on XS with that) but again your blaming nv? bs.



i run a lot more voltage through other cards during testing and this never happens .. on gtx 590 with nvidia's power capping feature which is designed for that purpose it doesnt work. i think it's my obligation to tell you, no ?


----------



## MxPhenom 216 (Mar 25, 2011)

W1zzard said:


> i run a lot more voltage through other cards during testing and this never happens .. on gtx 590 with nvidia's power capping feature which is designed for that purpose it doesnt work. i think it's my obligation to tell you, no ?



could that be fixed through drivers?? i feel like the 590 could be an amazing card if the VRMs were actually good and if they didn't run the two 150-200W GPUs on 4+1 phase power. that's obviously not enough. they could have added half an inch or so of PCB and added some phases. Price the card a tiny bit higher, but then it would be king.


----------



## Airbrushkid (Mar 25, 2011)

Why have others not had the same thing happen during their testing?


----------



## TheMailMan78 (Mar 25, 2011)

nvidiaintelftw said:


> could that be fixed through drivers?? i feel like the 590 could be an amazing card if the VRMs were actually good and if they didn't run the two 150-200W GPUs on 4+1 phase power. that's obviously not enough. they could have added half an inch or so of PCB and added some phases. Price the card a tiny bit higher, but then it would be king.



Its not a driver issue. Its a design flaw.


----------



## qubit (Mar 25, 2011)

nvidiaintelftw said:


> could that be fixed through drivers?? i feel like the 590 could be an amazing card if the VRMs were actually good and if they didn't run the two 150-200W GPUs on 4+1 phase power. that's obviously not enough. they could have added half an inch or so of PCB and added some phases. Price the card a tiny bit higher, but then it would be king.



Yeah, my Gigabyte GTX 560 Super Overclock I recently bought (and returned) had 7 or 8 phases onboard and that card is very much weaker than the 590. It looks like nvidia cheaped out there. btw, I dunno how many phases my new Zotac GTX 580 has, I'd have to look it up. I do know that it doesn't squeal too loudly though, if I let it freewheel at 500fps+ (but not for long, to avoid damage).



TheMailMan78 said:


> Its not a driver issue. Its a design flaw.



Yup, exactly.


----------



## cowie (Mar 25, 2011)

Mussels said:


> the card is advertised as that being a supported feature. Also, nvidia claimed to have drivers with power throttling to prevent that exact situation.



cards are not supported for 1.2 anything, just 1.05v in AB.
 some have said this setting is 1.10v with a DMM.
still, it's better to have guys that don't pay for them blow them up, to give a heads up to anyone that does buy these cards.
so far not one paying customer has reported killing their card....... yet


----------



## W1zzard (Mar 25, 2011)

cowie said:


> cards are not supported for 1.2 anything just 1.05v in ab.
> some have said this setting is 1.10v with dmm
> still its better to have guys that dont pay for them blow them up to give a heads up to anyone that does buy these cards.
> sofar not one paying customer has reported killing there card.......yet



nobody has paid for a gtx 590 yet .. what would you say if those cards started dying left and right and i had said in my review "oh it will work great with 1.2v, it's perfectly safe" ?


----------



## cowie (Mar 25, 2011)

W1zzard said:


> nobody has paid for a gtx 590 yet .. what would you say if those cards started dying left and right and i have said in my review "oh it will work great with 1.2v, it's perfectly safe" ?



wait, do they make you pay for those? i hope not.
you're doing your job to find these things out for us, that's why i wait to read your reviews.
 you would never say that because that would be a foolish thing to guarantee. we all (should) know that overclocking and overvolting is risky.
ab is limited for the end users to 1.05v, how did you get more, or why did you try more? if you did, i mean.


----------



## Mussels (Mar 25, 2011)

Airbrushkid said:


> Why have others not had the same thing happen during there testing?



it has. the youtube video has been linked a few times by now.


----------



## Airbrushkid (Mar 25, 2011)

What do you mean linked? If they cannot make a video of their own findings and have to use others', then I cannot believe it.


----------



## wahdangun (Mar 25, 2011)

great review wizz, and shame on nvidia, WTF, they cheaped out on this premium extreme high end card??? 

btw wizz, do you think you will get another card for a quad SLI review??


----------



## W1zzard (Mar 25, 2011)

wahdangun said:


> btw wizz do you thing you will get another card for quad SLi review ??



nvidia is sending me a second card on monday, just got the info from them


----------



## qubit (Mar 25, 2011)

W1zzard said:


> nvidia is sending me a second card on monday, just got the info from them



And after benching SLI performance, will you try to overvolt it again and see if it's any better, or is that off limits now?


----------



## W1zzard (Mar 25, 2011)

qubit said:


> And after benching SLI performance, will you try to overvolt it again and see if it's any better, or is that off limits now?



i could offer nvidia to do some new testing with their new driver, i'll look into that after the quad sli review (not gonna break another card before the quad sli review is posted)


----------



## wahdangun (Mar 25, 2011)

W1zzard said:


> nvidia is sending me a second card on monday, just got the info from them





W1zzard said:


> i could offer nvidia to do some new testing with their new driver, i'll look into that after the quad sli review (not gonna break another card before the quad sli review is posted)



wow, that's great. i can't wait for your quad SLI review, and please do more testing, like upping the frequency to the limit and checking its stability, and also check if the new driver really solves the popping problem lol


----------



## HXL492 (Mar 25, 2011)

just wondering if this is the easter egg


----------



## laszlo (Mar 25, 2011)

TheMailMan78 said:


> Its not a driver issue. Its a design flaw.



sorry but i disagree 

in my opinion, due to the GPU design (too high leakage, which results in too much power draw) at stock frequency, plus the other components used specifically for this card in order to minimize cost, the result is a card with a fragile balance between components/clocks/power draw

if you break that balance too much, some component fails and bye bye warranty 

as i saw, nvidia doesn't encourage overclocking this card as they knew it from the beginning, but they don't care since no warranty claim for these blow-ups is accepted; i'm sure all cards work flawlessly without o/c; basically: buy it, but if you o/c, it's not our problem, it's your wallet


----------



## ERazer (Mar 25, 2011)

Airbrushkid said:


> What do you mean linked? If they cannot make video of there own findings and have to use others. Then I cannot believe it.



tell me, do you video every bench you do? nobody knew some of these cards were gonna blow. if wiz knew his was gonna blow, he'd prolly have made an awesome 3d hd movie out of it, i'd have paid to see that


----------



## newtekie1 (Mar 25, 2011)

TheMailMan78 said:


> Its not a driver issue. Its a design flaw.



I disagree, it is a driver issue preventing the power limiter from working properly.

I also think it is a driver issue that prevents the power limiter from lowering the voltages as well as the clocks.  Which means that, when a user overvolts, even at the "safety" clocks the card is pulling too much power.

I think nVidia should just limit the maximum voltage to 1.00v or 1.10v via the BIOS, like they did with the GTX400 cards, which were limited to 1.087v.   That way the likelihood of someone popping a card is a lot lower, because it seems like 1.2v is the point where these tend to pop.  If someone wants to increase the voltage beyond 1.10v they can edit the BIOS, and then it is on them for raising the maximum, not nVidia's fault.
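The point about lowering voltage as well as clocks can be illustrated with a toy model (a speculative sketch only — the P ~ f·V² scaling law, the numbers, and the function names are illustrative assumptions, not NVIDIA's actual driver logic; the 365 W cap, 607 MHz stock clock, ~0.95 V stock voltage, and 775 MHz / 1.2 V overclock are figures mentioned in this thread):

```python
# Toy model of a driver-side power limiter. Dynamic power scales roughly
# with frequency times voltage squared, so clock-only throttling leaves
# most of an overvolt's extra draw in place.
BOARD_POWER_CAP_W = 365.0  # GTX 590 board power limit (thread figure)

def estimated_power(freq_mhz: float, vcore: float, base_power: float = 365.0,
                    base_freq: float = 607.0, base_vcore: float = 0.95) -> float:
    """Estimate board power assuming P ~ f * V^2 relative to stock."""
    return base_power * (freq_mhz / base_freq) * (vcore / base_vcore) ** 2

def throttle(freq_mhz: float, vcore: float, safety_freq: float = 607.0):
    """Clock-only throttling: drop to the stock clock when over the cap,
    but leave the user's voltage untouched."""
    if estimated_power(freq_mhz, vcore) > BOARD_POWER_CAP_W:
        freq_mhz = safety_freq
    return freq_mhz, vcore, estimated_power(freq_mhz, vcore)

# Overclocked and overvolted: 775 MHz at 1.2 V.
f, v, p = throttle(775.0, 1.2)
# Even throttled back to 607 MHz, 1.2 V still implies well over the 365 W
# cap in this model -- which is why lowering voltage too would matter.
```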


----------



## wahdangun (Mar 25, 2011)

newtekie1 said:


> I disagree, it is a driver issue preventing the power limitter from working properly.
> 
> I also think it is a driver issue that prevents the power limitter from lowering the voltages as well as the clocks.  Which means that, when a user overvolts, even at the "safety" clocks the card is pulling too much power.
> 
> I think nVidia should just limit the maximum voltage to 1.00v or 1.10v via the BIOS, like they did with the GTX400 cards which were limitted to 1.087v.   That way the likelyhood of someone popping a card is a lot lower because it seems like 1.2v is the point where these tend to pop.  If someone wants to increase the voltage beyond 1.10v they can edit the BIOS, and then it is on them for raising the maximum, not nVidia's fault.



the problem is some vendors (like asus and evga) encourage overvolting (hell, even on the boxes they say it can be increased up to 50%), so it's a design flaw (because of the weak VRM)


----------



## cowie (Mar 25, 2011)

I agree newtekie, 0.95v to 1.21v is over a 25% increase in voltage, that's a lot.
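A quick check of that arithmetic (using the 0.95 V stock and 1.21 V maximum figures from the post above):

```python
# Percentage increase from the stock voltage to the max overvolt.
stock_v, max_v = 0.95, 1.21
increase_pct = (max_v - stock_v) / stock_v * 100
print(round(increase_pct, 1))  # 27.4 -- indeed "over 25%"
```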


----------



## Airbrushkid (Mar 25, 2011)

ERazer said:


> tell me do you video every bench do you do? nobody knew some of this card gonna blow, if wiz knew his gonna blow hell prolly made an awesome 3d hd  movie out of it, id paid to see that



If I was running a review site and companies were sending me products to review, then I would be making videos of the reviews!


----------



## W1zzard (Mar 25, 2011)

Airbrushkid said:


> If I was running a review site and the companies where sending my product review. Then I would be making videos of the reviews!



you can do that when you have a review site, there are many reasons for and against video.

for tpu we chose to go the non video route, and apparently it's working well, going by the amounts of traffic we get


----------



## cadaveca (Mar 25, 2011)

Yeah, you guys don't want to see my ugly mug and mitts for mobo reviews...trust me on that one.

Doing video for reviews exponentially increases the amount of work required to do a review. Gotta do the whole review, plus video/audio editing... more hosting and bandwidth requirements, more possible problems, too.


----------



## newtekie1 (Mar 25, 2011)

wahdangun said:


> the problem is some vendor (like asus and evga) encourage overvolting (hell even in boxes they said it can be increased up to %50) so its design flaw (because the weak VRM)



They say "up to 50% Faster" they don't say you can increase the voltage 50%.  They can limit the voltage to 1.1v and still encourage overvolting.

However, that isn't the issue, the issue is that the powerlimitter in the _driver_ is broken.


----------



## the54thvoid (Mar 25, 2011)

nvidiaintelftw said:


> whats the max overclock then like 675??



Head over to Hexus next week and find out 

http://www.hexus.net/content/item.php?item=29753

The POV TGT Ultra Charged is clocked at 691 MHz core, substantially up from 607 MHz.


----------



## W1zzard (Mar 25, 2011)

newtekie1 said:


> However, that isn't the issue, the issue is that the powerlimitter in the driver is broken.



i dont think it's completely broken as in "it doesnt work at all". but apparently i managed to create a case where it didnt do what it was supposed to


----------



## the54thvoid (Mar 25, 2011)

W1zzard said:


> i dont think it's completely broken as in "it doesnt work at all". but apparently i managed to create a case where it didnt do what it was supposed to



Just fess up dude - you touched it inappropriately and it didn't like it.  You made it cry sparks.

You technophiles make me sick......


----------



## W1zzard (Mar 25, 2011)

the54thvoid said:


> Just fess up dude - you touched it inappropriately and it didn't like it.  You made it cry sparks.
> 
> You technophiles make me sick......



the 24 sided dice that i roll to get my review data fell on the card and caused a short circuit


----------



## entropy13 (Mar 25, 2011)

W1zzard said:


> the 24 sided dice that i roll to get my review data fell on the card and caused a short circuit



And what did you get from the roll?


----------



## TheMailMan78 (Mar 25, 2011)

Nvidia and Charlie Sheen hooked up to bring firey death!


----------



## Cold Storm (Mar 25, 2011)

Great Review W1zz. Makes me think twice about seeing that card in my system.

@cadaveca... Man, Mailman would love to have some more pron.. Seeing you or Rockz mug on youtube..


----------



## the54thvoid (Mar 25, 2011)

W1zzard said:


> the 24 sided dice that i roll to get my review data fell on the card and caused a short circuit



Icosikaitetragon eh?

I sense an old roleplayer in the midst.

Next time you try frying a +100 Attack GFX card, make sure you're wearing +100 Flame resistant rabbit hide boots.


----------



## entropy13 (Mar 25, 2011)

the54thvoid said:


> Icosikaitetragon eh?
> 
> I sense an old roleplayer in the mist.
> 
> Next time you try frying a +100 Attack GFX card, make sure you're wearing +100 Flame resistant rabbit hide boots.



Doesn't "Attack" only deal with melee and not element-based attacks?


----------



## alexsubri (Mar 25, 2011)

W1zzard said:


> the 24 sided dice that i roll to get my review data fell on the card and caused a short circuit



you mean rick rolled?


----------



## the54thvoid (Mar 25, 2011)

*Not such a bad card....*

After reading quite a lot of reviews, I think folk should take stock of what is actually a really good achievement.  Given that a year ago a 512-core single-GPU Fermi was a laughable myth (even to me!), to see it now as a dual card performing with (in the majority of reviews) a much quieter noise level than AMD's 6990, and for all intents and purposes tying with it for performance whilst being a smaller form factor too, is quite exceptional.

We laughed at the dust buster that was the early GTX 480 but seem to forgive the 6990.  Why?  The 590 is consistently far quieter across reviews than the 6990, which makes it a more tolerable stock card to use.

Yes, it's not as good as GTX 580 SLI, but it's clocked much lower to make it acceptable to industry standards (though only just).  There are plenty of reviews with overclocked cards (not overvolted) where performance is far better.

For a card as big and as noisy as the 6990, you'd want it to destroy the 590.  It doesn't.  Likewise, a card smaller and so much quieter you'd expect to be puny in comparison; it's not.

If you consider the 6990 to be a great card, then you have to acknowledge that the pair of heavily underclocked GF110s that is the 590 is also a great card.  End of.


----------



## mtosev (Mar 25, 2011)

W1zzard said:


> nvidia is sending me a second card on monday, just got the info from them



cool. the SLi *boom* will be interesting to see PLZ record the cards while testing. I wanna see them go *BOOOOOOOM*


----------



## DanTheMan (Mar 25, 2011)

W1zzard said:


> you can do that when you have a review site, there are many reasons for and against video.
> 
> for tpu we chose to go the non video route, and apparently it's working well, going by the amounts of traffic we get



W1zzard, don't change a thing on how you do your reviews. You do all TPU members a great service with your thorough reviews. The same fiasco happened when you reviewed the 480 card and it brought about you questioning yourself and the "April Fools joke about you leaving". Everyone normally is biased one way or the other. I am normally AN AMD fanboy, but I do read the reviews on the competition just to see if I am still getting a good setup for a fair price. Do not let "HARDCORE" fanboys piss you off this time. It's getting too close to April Fools for that.

All in all YOU ROCK W1ZZARD!!


----------



## Goodman (Mar 25, 2011)

DeathByTray said:


> Well guys, you can mock the card all you want but fact is, the GTX 590 gives the best *bang* for the buck.



Good one


----------



## alexsubri (Mar 25, 2011)

DanTheMan said:


> W1zzard, don't change a thing on how you do your reviews. You do all TPU members a great service with your thorough reviews. The same fiasco happened when you reviewed the 480 card and it brought about you questioning yourself and the "April Fools joke about you leaving". Everyone normally is biased one way or the other. I am normally an AMD fanboy, but I do read the reviews on the competition just to see if I am still getting a good setup for a fair price. Do not let "HARDCORE" fanboys piss you off this time. It's getting too close to April Fools for that.
> 
> All in all YOU ROCK W1ZZARD!!


----------



## CDdude55 (Mar 25, 2011)

DanTheMan said:


> Everyone normally is biased one way or the other. I am normally AN AMD fanboy,



I'm not, I couldn't care less about the companies, I just want performance (within my price range).

Fanboys need to die. lol


----------



## DanTheMan (Mar 25, 2011)

CDdude55 said:


> I'm not, I couldn't care less about the companies, I just want performance (within my price range).
> 
> Fanboys need to die. lol



Just 'cause I admit that I prefer AMD, don't think that I haven't tried NV. I've had an Nvidia chipset MB and NV video cards; it's just that they normally gave me a lot of trouble, so I prefer to stick with what works best for ME.


----------



## CDdude55 (Mar 25, 2011)

DanTheMan said:


> Just 'cause I admit that I prefer AMD, don't think that I haven't tried NV. I've had an Nvidia chipset MB and NV video cards; it's just that they normally gave me a lot of trouble, so I prefer to stick with what works best for ME.



Right, i have no issues with someone being a fan of particular products, if it works best for you and it's within your price range then go for it.

What works best for me is always looking at price and performance gain out of it .(though i am starting to lean towards AMD more since im poor lol.)


----------



## VulkanBros (Mar 25, 2011)

Quote from Nvidia forums: 
Posted Yesterday, 08:51 PM
In the web release driver of GeForce GTX 590, we have added some important enhancements to our overcurrent protection for overclocking. We recommend anyone doing overclocking or running stress apps to always use the latest web driver to get the fullest protection for your hardware. Please note that overcurrent protection does not eliminate the risks of overclocking, and hardware damage is possible, particularly when overvoltaging. We recommend anyone using the GTX 590 board with the reference aircooler stick with the default voltage while overclocking, and avoid working around overcurrent protection mechanisms for stress applications. This will help maintain GTX 590's great combination of acoustics, performance, and reliability. NVIDIA has worked with several watercooling companies to develop waterblocks for GTX 590, and these solutions will help provide additional margin for overclocking, but even in this case we recommend enthusiasts stay within 12.5-25mV of the default voltage in order to minimize risk.

These are guidelines only - any overclocking/overvoltaging can void your manufacturer's product warranty.
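For anyone wondering what that guidance works out to in numbers, the arithmetic of "stay within 12.5-25 mV of default" is easy to sketch. The helper below is purely illustrative (it is not the driver's actual overcurrent-protection logic), using the 0.938 V default core voltage reported in the review:

```python
# Toy illustration of NVIDIA's stated guideline: keep the core voltage within
# 12.5-25 mV of default when overclocking a GTX 590. This is NOT the driver's
# real overcurrent-protection code, just the arithmetic of the recommendation.

DEFAULT_MV = 938     # GTX 590 default core voltage from the review (0.938 V)
MARGIN_MV = 25       # upper end of NVIDIA's recommended 12.5-25 mV margin

def clamp_overvolt(requested_mv: int) -> int:
    """Clamp a requested core voltage (in mV) to the recommended ceiling."""
    return min(requested_mv, DEFAULT_MV + MARGIN_MV)

print(clamp_overvolt(1200))  # the review's 1.2 V bump would clamp to 963
print(clamp_overvolt(950))   # a mild bump inside the margin passes: 950
```

In other words, the 1.2 V setting that killed review samples sits roughly 240 mV above the ceiling NVIDIA is recommending here.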


----------



## wolf (Mar 25, 2011)

this thread is getting epic, a couple of popped cards and ppl go craaaaaazy. not to discount that a popped card is serious, and highlights the risk of overclocking a delicate card like this.

if you buy one just dont get overzealous


----------



## TheMailMan78 (Mar 25, 2011)

I think W1zz was too kind giving this card a 7.0. The damn thing blew up! I mean really, WTF does someone have to do to get a low score on TPU? lol

Awesome review W1zz! No doubt. Just wish you were more brutal with the scores.



wolf said:


> this thread is getting epic, a couple of popped cards and ppl go craaaaaazy. not to discount that a popped card is serious, and highlights the risk of overclocking a delicate card like this.
> 
> if you buy one just dont get overzealous



Dude, W1zz was not "overzealous" with the voltage bump. It was mild. This card sucks and that's the bottom line.



the54thvoid said:


> After reading quite a lot of reviews I think folk should take stock of what is actually a really good achievement.  A year ago, a 512-core single-GPU Fermi was a laughable myth (even to me!).  To see it now as a dual card performing with (in the majority of reviews) a much quieter noise level than AMD's 6990, and for all intents and purposes tying with it for performance whilst being a smaller form factor too, is quite exceptional.
> 
> We laughed at the dust buster of the early GTX 480 but seem to forgive the 6990.  Why?  The 590 consistently across reviews is far quieter than the 6990 which makes it a more tolerable stock card to use.
> 
> ...



I would agree with you if the card didnt blow the fuck up.


----------



## MxPhenom 216 (Mar 25, 2011)

the54thvoid said:


> After reading quite a lot of reviews I think folk should take stock of what is actually a really good achievement.  A year ago, a 512-core single-GPU Fermi was a laughable myth (even to me!).  To see it now as a dual card performing with (in the majority of reviews) a much quieter noise level than AMD's 6990, and for all intents and purposes tying with it for performance whilst being a smaller form factor too, is quite exceptional.
> 
> We laughed at the dust buster of the early GTX 480 but seem to forgive the 6990.  Why?  The 590 consistently across reviews is far quieter than the 6990 which makes it a more tolerable stock card to use.
> 
> ...



I have to agree completely with you. Way to tell it straight!


----------



## wolf (Mar 25, 2011)

TheMailMan78 said:


> Dude, W1zz was not "overzealous" with the voltage bump. It was mild. This card sucks and that's the bottom line.



I'm sorry but I flat out disagree with that: with a stock voltage around 940mv, 1200mv is a LOT. My GTX460's stock voltage is 975mv, and pumping 1087mv through it is a lot to me.

nice opinion you've got there tho.


----------



## TheMailMan78 (Mar 25, 2011)

wolf said:


> I'm sorry but I flat out disagree with that: with a stock voltage around 940mv, 1200mv is a LOT. My GTX460's stock voltage is 975mv, and pumping 1087mv through it is a lot to me.
> 
> nice opinion you've got there tho.



It's not an opinion. It's fact. This card was rushed out the door to beat ATI. In the process they cut corners and BOOM went the dynamite. It has a fundamental flaw in the design. Remember, W1zz's blown card is not an isolated incident. You just have to accept that the green team dropped the ball.  They didn't even ship it with the right drivers. End of story.


----------



## W1zzard (Mar 25, 2011)

why didnt the powercolor card blow up at 1.45 v ?!


----------



## wolf (Mar 25, 2011)

TheMailMan78 said:


> It's not an opinion. It's fact. This card was rushed out the door to beat ATI. In the process they cut corners and BOOM went the dynamite. It has a fundamental flaw in the design. Remember, W1zz's blown card is not an isolated incident. You just have to accept that the green team dropped the ball.  End of story.



that's what you think and I'm not trying to change your mind, but I haven't heard of one dying at stock clocks. It trades blows with the 6990 very well, admittedly slightly slower on average, while being shorter and quieter.

obviously we have different definitions of what makes a gfx card fail or not.


----------



## TheMailMan78 (Mar 25, 2011)

W1zzard said:


> http://tpucdn.com/reviews/Powercolor/HD_6970_PCS_Plus/images/voltagetuning.jpg
> 
> why didnt the powercolor card blow up at 1.45 v ?!



Because its a better designed card.



wolf said:


> thats what you think and Im not trying to change your mind, but I havent heard of one dying at stock clocks, it trades blows with the 6990 very well, admittedly slightly slower on average, but while being shorter and quieter.
> 
> obviously we have different definitions of what makes a gfx card fail or not.



580 = good/great! Wish I owned one!
590 = piece of shit. How many blown review cards does it take for people to get this?!

How would you feel if you bought a car and the first time you red-lined it the engine shot out and killed your dog? Would you deem that car a piece of shit?


----------



## DanTheMan (Mar 25, 2011)

themailman78 said:


> because its a better designed card.



AMEN

The UnderDOG (AMD) always has to fight for what it gets; if that means making a better-designed, longer-lasting, more diverse product lineup then so be it. I would rather they take their time and have a better product than have $750 go up in flames - literally.


----------



## CDdude55 (Mar 25, 2011)

TheMailMan78 said:


> Because its a better designed card.



You have to keep in mind the large difference in performance and range; overvolting a monstrous dual-GPU card is completely different from overvolting a more power-efficient single-GPU card. The risks are there, and they move further up the ladder when you want to push the most out of an already beefy multi-GPU card.


----------



## runevirage (Mar 25, 2011)

Great review as always Wiz. 

Sorry if this has been asked already, but next time could you please go with the 11.4 drivers for AMD? I realize that this time around it probably came in close to the time that you were reviewing the 590 and you couldn't get around to it, but I feel that the performance differences are probably significant enough to test with their newest driver for next time. Thanks.


----------



## D4S4 (Mar 25, 2011)

the54thvoid said:


> You technophiles make me sick......



says a guy who is registered on TPU and has almost 600 posts.


----------



## newtekie1 (Mar 25, 2011)

TheMailMan78 said:


> It's not an opinion. It's fact. This card was rushed out the door to beat ATI. In the process they cut corners and BOOM went the dynamite. It has a fundamental flaw in the design. Remember, W1zz's blown card is not an isolated incident. You just have to accept that the green team dropped the ball.  They didn't even ship it with the right drivers. End of story.



No, the only corner they cut was in the BIOS in not locking down the voltage to lower levels.

You can't say that because the card can't handle 1.2v it is a shitty design.  The fact is that nVidia designed the card to run at 0.94v, and it does that just fine.  Raising the voltages beyond that puts it out of the area it was designed for.

And 1.2v certainly isn't a mild overvolt, not on a Fermi card.  Remember, the maximum you could even go on the original Fermi cards was 1.087v(without modding the BIOS).  So yes, on a Fermi 1.2v is a huge voltage bump.
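To put some rough numbers on why that bump is so aggressive: dynamic power in CMOS logic scales roughly with f·V², so voltage increases hit power draw quadratically. A back-of-the-envelope sketch (a rule-of-thumb estimate only, not a measured figure) using the voltages from the review:

```python
# Rule-of-thumb estimate of how a voltage bump affects GPU power draw, using
# the common CMOS dynamic-power approximation P ~ f * V^2. Illustrative only;
# real power draw also depends on leakage, temperature, and load.

def power_multiplier(v_new: float, v_old: float, f_ratio: float = 1.0) -> float:
    """Power multiplier implied by P ~ f * V^2 (f_ratio = new clock / old clock)."""
    return f_ratio * (v_new / v_old) ** 2

stock_v = 0.938  # GTX 590 default core voltage from the review
bump_v = 1.200   # the overvolt that killed the card

print(f"voltage increase: {100 * (bump_v / stock_v - 1):.0f}%")                      # ~28%
print(f"power multiplier at same clocks: {power_multiplier(bump_v, stock_v):.2f}x")  # ~1.64x
```

At the same clocks, the 1.2 V setting alone implies roughly two-thirds more power flowing through VRMs specced around a 0.94 V operating point.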


----------



## erocker (Mar 25, 2011)

CDdude55 said:


> You have to keep in mind the large difference in performance and range; overvolting a monstrous dual-GPU card is completely different from overvolting a more power-efficient single-GPU card. The risks are there, and they move further up the ladder when you want to push the most out of an already beefy multi-GPU card.



It doesn't have to be if more/better power delivery components are in the design of the card. Then again, add those things and price goes up. Nvidia has a rather expensive design with Fermi. That being said, it won't take much for a 3rd party to improve on the design a bit. It will obviously cost more.



newtekie1 said:


> No, the only corner they cut was in the BIOS in not locking down the voltage to lower levels.
> 
> You can't say that because the card can't handle 1.2v it is a shitty design.  The fact is that nVidia designed the card to run at 0.94v, and it does that just fine.  Raising the voltages beyond that puts it out of the area it was designed for.
> 
> And 1.2v certainly isn't a mild overvolt, not on a Fermi card.  Remember, the maximum you could even go on the original Fermi cards was 1.087v(without modding the BIOS).  So yes, on a Fermi 1.2v is a huge voltage bump.



100% correct. I think what people are saying, though, is that it could have been designed a little better. Enthusiasts like to push their enthusiast cards.


----------



## W1zzard (Mar 25, 2011)

runevirage said:


> but next time could you please go with the 11.4 drivers for AMD



latest amd driver is catalyst 11.2, i dont waste my time on betas except for the reviewed product.

both ati and nvidia send out magical new beta drivers at the time their competition launches, and those changes may not even make it into the whql build


----------



## newtekie1 (Mar 25, 2011)

erocker said:


> 100% correct. I think what people are saying, though, is that it could have been designed a little better. Enthusiasts like to push their enthusiast cards.



I agree, but enthusiasts also know what happens when you push enthusiast cards too far.  Or rather they used to know, now it seems they just assume that since the voltage slider goes all the way to 11, that there is no problem with putting it there...:shadedshu

I'm an enthusiast, I have an enthusiast CPU, if I go in the BIOS of my motherboard right now I have the _option_ to pump some stupidly high voltage through my CPU that would surely fry it.  I also have the _option_ to pump some stupidly high voltage through the RAM as well.  If I decided to do that, and things started to pop, is it eVGA's fault?  Should I blame Intel for having a "shitty" processor design that couldn't handle the voltage?  Should I be upset at Corsair for having a "rushed" RAM product that pops under voltages completely out of spec from the RAM designed voltage?  No.  It is my fault for messing around with voltages, I knew the risks.  So why is it suddenly different with GPUs?  They give you the _option_ to use that voltage, you are the one that is actually deciding to use the voltage.  A real enthusiast knows the risk involved with messing with voltages, and a real enthusiast knows that they themselves are the only ones to blame for blowing something up from overclocking/overvolting.


----------



## Frick (Mar 25, 2011)

newtekie1 said:


> I agree, but enthusiasts also know what happens when you push enthusiast cards too far.  Or rather they used to know, now it seems they just assume that since the voltage slider goes all the way to 11, that there is no problem with putting it there...:shadedshu



So much this. People are messing with powers they don't understand!

I was actually surprised when w1z bumped it all the way to 1.2 at once.


----------



## TheMailMan78 (Mar 25, 2011)

Frick said:


> So much this. People are messing with powers they don't understand!
> 
> I was actually surprised when w1z bumped it all the way to 1.2 at once.



Maybe because they advertised he could on the box?


----------



## newtekie1 (Mar 25, 2011)

TheMailMan78 said:


> Maybe because they advertised he could on the box?



You show me where on the box it says he could go all the way up to 1.2v safely.


----------



## Frick (Mar 25, 2011)

TheMailMan78 said:


> Maybe because they advertised he could on the box?



Does it say how much he could increase it? I only see "Voltage Tweak!" and "Up to 50% faster clock speed!", which I don't think means you're supposed to increase the voltage by 20%.

Now I realize it IS a bad thing indeed, especially with that protection thing turned on, but I don't think it's as bad as everyone says either.


----------



## W1zzard (Mar 25, 2011)

testing voltage tuning on msi hd 6950 twin frozr iii now.. 

so guys .. where should i stop ? after 15 - 25 mv like nvidia recommends ? or go as far as the slider lets me ?


----------



## TheMailMan78 (Mar 25, 2011)

W1zzard said:


> testing voltage tuning on msi hd 6950 twin frozr iii now..
> 
> so guys .. where should i stop ? after 15 - 25 mv like nvidia recommends ? or go as far as the slider lets me ?



DO IT! Blow that bitch!



newtekie1 said:


> I agree, but enthusiasts also know what happens when you push enthusiast cards too far.  Or rather they used to know, now it seems they just assume that since the voltage slider goes all the way to 11, that there is no problem with putting it there...:shadedshu
> 
> I'm an enthusiast, I have an enthusiast CPU, if I go in the BIOS of my motherboard right now I have the _option_ to pump some stupidly high voltage through my CPU that would surely fry it.  I also have the _option_ to pump some stupidly high voltage through the RAM as well.  If I decided to do that, and things started to pop, is it eVGA's fault?  Should I blame Intel for having a "shitty" processor design that couldn't handle the voltage?  Should I be upset at Corsair for having a "rushed" RAM product that pops under voltages completely out of spec from the RAM designed voltage?  No.  It is my fault for messing around with voltages, I knew the risks.  So why is it suddenly different with GPUs?  They give you the _option_ to use that voltage, you are the one that is actually deciding to use the voltage.  A real enthusiast knows the risk involved with messing with voltages, and a real enthusiast knows that they themselves are the only ones to blame for blowing something up from overclocking/overvolting.



See, the problem is W1zz knows what he's doing. He blew the card. He's also not alone; other reviewers blew the card as well. It's junk. Just accept it. Relax and push out. It won't hurt as much.


----------



## newtekie1 (Mar 25, 2011)

Frick said:


> Does it say how much he could increase it? I only see "Voltage Tweak!" and "Up to 50% faster clock speed!", which I don't think means you're supposed to increase the voltage by 20%.
> 
> Now I realize it IS a bad thing indeed, especially with that protection thing turned on, but I don't think it's as bad as everyone says either.



Plus, even at 1.0v W1z was able to get 815MHz out of the card (something like a 30% overclock), which is a damn good clock speed.  It would have been nice to see what happened at 1.05v or 1.1v; I bet 900MHz might have been possible, and still probably safe from killing the card.  And 900MHz on a 512-shader Fermi would be one hell of a beast...



W1zzard said:


> testing voltage tuning on msi hd 6950 twin frozr iii now..
> 
> so guys .. where should i stop ? after 15 - 25 mv like nvidia recommends ? or go as far as the slider lets me ?



Go as far as you feel is safe.  It will be different from card to card.  Personally, I'm glad I have sites like yours that give me an idea of what is safe and sometimes what isn't.

Ever since I heard about the other GF110 cards popping, I don't think I'd go over 1.1v on any Fermi card; that is just my safety maximum.  I think having a feel for what is safe, and sometimes learning the hard way, is part of the enthusiast game.



TheMailMan78 said:


> See, the problem is W1zz knows what he's doing. He blew the card. He's also not alone; other reviewers blew the card as well. It's junk. Just accept it. Relax and push out. It won't hurt as much.



Yes, he does know what he is doing, and I'm sure he knew the risks before he even did it.  However, I'm sure he will also agree that just because the option is there, that doesn't mean everyone should use it, just like with the options to raise voltages on anything else in your system.  The fact that it blew because people put too much voltage through it doesn't make it junk.  Go max out the voltage on your CPU with the stock cooler, and when things start to pop, I'll make you admit the CPU and motherboard were junk.

See, the real problem is that people have become way too complacent with GPU overclocking (and to an extent with CPU overclocking as well).  It has become so easy that everyone seems to think there is nothing to it, and they don't really know what is going on.  I remember when raising the voltage on a GPU required that you knew how to solder, and there were some real risks involved.  Now that a simple piece of software can be used, everyone seems to think the risks are gone.  Well, this and the other GF110 cards show us the risks aren't gone.


----------



## Deleted member 24505 (Mar 25, 2011)

W1zzard said:


> testing voltage tuning on msi hd 6950 twin frozr iii now..
> 
> so guys .. where should i stop ? after 15 - 25 mv like nvidia recommends ? or go as far as the slider lets me ?



It's an ATI GPU; what would Nvidia know about what it can take safely?

I always thought ATI GPUs were better at taking extra millivolts than Nvidia GPUs.


----------



## Fourstaff (Mar 25, 2011)

W1zzard said:


> so guys .. where should i stop ? after 15 - 25 mv like nvidia recommends ? or go as far as the slider lets me ?



Possibly start with smaller steps? You are after all the OC guru, so I am in no position to advise you on the dark art of overclocking.


----------



## brandonwh64 (Mar 25, 2011)

W1zzard said:


> testing voltage tuning on msi hd 6950 twin frozr iii now..
> 
> so guys .. where should i stop ? after 15 - 25 mv like nvidia recommends ? or go as far as the slider lets me ?



MAX IT OUT! 

GO BIG or send it to me and i will LOL


----------



## DanTheMan (Mar 25, 2011)

w1zzard said:


> testing voltage tuning on msi hd 6950 twin frozr iii now..
> 
> So guys .. Where should i stop ? After 15 - 25 mv like nvidia recommends ? Or go as far as the slider lets me ?



go for all she's got!
I need warp speed now


----------



## TheMailMan78 (Mar 25, 2011)

newtekie1 said:


> Yes, he does know what he is doing, and I'm sure he knows the risks of it before he even did it.  However, I'm sure he will also agree that just because the option is there, that doesn't mean everyone should use it, just like with the options to raise voltages on anything else in your system.  Just because it blew because people are putting too much voltage through it, that doesn't make it junk.  You go max out the voltage on your CPU with the stock cooler, and when things start to pop, I'll make you admit the CPU and motherboard were junk.
> 
> See, the real problem is that people have become way too complacent with GPU overclocking (and to an extent with CPU overclocking as well).  It has become so easy that everyone seems to think there is nothing to it, and they don't really know what is going on.  I remember when raising the voltage on a GPU required that you knew how to solder, and there were some real risks involved.  Now that a simple piece of software can be used, everyone seems to think the risks are gone.  Well, this and the other GF110 cards show us the risks aren't gone.


Well, according to his review it wasn't extreme at all:



> As a first step, I increased the voltage from 0.938 V default to 1.000 V, maximum stable clock was 815 MHz - faster than GTX 580! Moving on, I tried 1.2 V to see how much could be gained here, at default clocks and with *NVIDIA's power limiter enabled*. I went to heat up the card and then *boom*, a sound like popcorn cracking, the system turned off and a burnt electronics smell started to fill up the room. Card dead! Even with NVIDIA power limiter enabled. Now the pretty looking, backlit GeForce logo was blinking helplessly and the fan did not spin, both indicate an error with the card's 12V supply.
> After talking to several other reviewers, this does not seem to be an isolated case, and many of them have killed their cards with similar testing, *which is far from being an extreme test.*


----------



## Frick (Mar 25, 2011)

What would be extreme then?


----------



## D4S4 (Mar 25, 2011)

being able to overclock it at that voltage?


----------



## thedude74 (Mar 25, 2011)

TheMailMan78 said:


> Maybe because they advertised he could on the box?



No, it isn't.




TheMailMan78 said:


> See, the problem is W1zz knows what he's doing. He blew the card. He's also not alone; other reviewers blew the card as well. It's junk. Just accept it. Relax and push out. It won't hurt as much.



Obviously he doesn't, at least in terms of the 590. He jacked the thing up to 1.2v without understanding what the card's limit is. Nvidia clearly states the cards are not supposed to be run anywhere near that voltage.

You aren't even supposed to run a 580 at 1.2v, what made him think you could do that to two sandwiched together is baffling. 

http://www.tweaktown.com/news/19192...590_why_some_have_gone_up_in_smoke/index.html

Calling a card junk because someone ran it well over voltage specification and blew it up is idiotic.


----------



## Deleted member 24505 (Mar 25, 2011)

thedude74 said:


> You aren't even supposed to run a 580 at 1.2v, what made him think you could do that to two sandwiched together is baffling.




It's not two GPUs sandwiched, it's side by side. I don't think Nvidia has used the sandwich design since the 7950GX2.


----------



## cheesy999 (Mar 25, 2011)

thedude74 said:


> Obviously he doesn't, at least in terms of the 590. He jacked the thing up to 1.2v without understanding what the cards limit is. Nvidia clearly states the cards are not supposed to be run anywhere near that voltage.
> 
> You aren't even supposed to run a 580 at 1.2v, what made him think you could do that to two sandwiched together is baffling.



That is a good point; even an iPhone will break if it's asked to do things it wasn't designed for http://www.youtube.com/watch?v=_S8sxpK4_iA&feature=player_detailpage#t=120s (yes, I'm making lots of Will It Blend references lately)


----------



## TheMailMan78 (Mar 25, 2011)

thedude74 said:


> No, it isn't.
> 
> 
> 
> ...


No, defending an obviously flawed design is idiotic. But hey, that's what fanboys do. Welcome to TPU, by the way.


----------



## cheesy999 (Mar 25, 2011)

TheMailMan78 said:


> No, defending an obviously flawed design is idiotic. But hey, that's what fanboys do. Welcome to TPU, by the way.



this person has no idea what he's talking about; taking him seriously is much less fun than just going along with it


----------



## TheMailMan78 (Mar 25, 2011)

cheesy999 said:


> this person has no idea what he's talking about; taking him seriously is much less fun than just going along with it



Problem is, I'm right on this one. NVIDIA's power limiter failed, or it was a bum fuse/resistor. Could be both. Either way, flawed design.

Again, W1zz wasn't the only one. What don't you understand?


----------



## cheesy999 (Mar 25, 2011)

TheMailMan78 said:


> Problem is, I'm right on this one. NVIDIA's power limiter failed, or it was a bum fuse/resistor. Could be both. Either way, flawed design.
> 
> Again, W1zz wasn't the only one. What don't you understand?



I understand everything, but arguing with trolls is stupid, whereas pretending to take them seriously means you can have much more fun with them. Just pretend to agree with 74 and hope he goes away, if he's bothering you that much


----------



## TheMailMan78 (Mar 25, 2011)

cheesy999 said:


> I understand everything, but arguing with trolls is stupid, whereas pretending to take them seriously means you can have much more fun with them. Just pretend to agree with 74 and hope he goes away, if he's bothering you that much



Oh, I thought you were referring to me  

Honestly I don't think he's a troll. He's a butt-hurt Nvidiot.


----------



## thedude74 (Mar 25, 2011)

TheMailMan78 said:


> No, defending an obviously flawed design is idiotic. But hey, that's what fanboys do. Welcome to TPU, by the way.



I wouldn't buy a 590 on a bet, but I also wouldn't blame the card for user error. Let us all know when one blows up using the correct parameters and maybe you'd have a point.

Put it this way, my motherboard allows me to crank the voltage on my CPU well beyond what Intel recommends. If I do so and it fries the CPU, is that bad motherboard design, bad CPU design, or am I to blame?



TheMailMan78 said:


> Again W1zz wasn't the only one. What don't you understand?



Does it really matter how many people it happened to if they ALL ran the cards outside the limits of what they were designed to do?

What part of, "the cards aren't designed to be run with that much voltage" don't you understand?


----------



## thedude74 (Mar 25, 2011)

cheesy999 said:


> i understand everything, but arguing with trolls is stupid where pretending to take them seriously means you can have much more fun with them, just pretend to agree with 74 and hope he goes away if he bothering you that much



How am I the troll here?


----------



## TheMailMan78 (Mar 25, 2011)

thedude74 said:


> I wouldn't buy a 590 on a bet, but I also wouldn't blame the card for user error. Let us all know when one blows up using the correct parameters and maybe you'd have a point.
> 
> Put it this way, my motherboard allows me to crank the voltage on my CPU well beyond what Intel recommends. If I do so and it fries the CPU, is that bad motherboard design, bad CPU design, or am I to blame?
> 
> ...



It hurts, doesn't it? So many little hearts were broken by the 590... poof, your dreams are gone.

These are not just "people". These are professional reviewers with YEARS of experience overclocking GPUs.

Push out. It won't hurt as bad.


----------



## MxPhenom 216 (Mar 25, 2011)

TheMailMan78 said:


> Oh, I thought you were referring to me
> 
> Honestly I don't think hes a troll. Hes a butt hurt Nvidiot.



You're such an ATI fanboy it's disgusting. We could just as easily call you a Radiot, so stop referring to everyone who owns or buys an Nvidia card as an Nvidiot; it makes you sound like a little ignorant bitch.


----------



## TheMailMan78 (Mar 25, 2011)

nvidiaintelftw said:


> You're such an ATI fanboy it's disgusting. We could just as easily call you a Radiot, so stop referring to everyone who owns or buys an Nvidia card as an Nvidiot; it makes you sound like a little ignorant bitch.



And heres butt hurt #2! And call me whatever you want. My dreams and hope don't ride on the back of a flawed GPU.


----------



## thedude74 (Mar 25, 2011)

TheMailMan78 said:


> It hurts doesn't it. So many little hearts were broken by the 590....poof your dreams are gone.



The 6990 is a better card...so what? What I run is better than either one of them. That has nothing to do with this discussion. I was never in the running to buy either one.

Blaming Nvidia or the card because a few testers decided to overvolt the card well beyond what it's designed to handle is what fanboys do.

And yes, they are just people. Yes, they have years of experience, but they had NONE with this card and clearly didn't know what voltages were acceptable.


----------



## Bjorn_Of_Iceland (Mar 25, 2011)

W1zzard said:


> the 24 sided dice that i roll to get my review data fell on the card and caused a short circuit



A crit miss most probably


----------



## MxPhenom 216 (Mar 25, 2011)

TheMailMan78 said:


> And heres butt hurt #2! And call me whatever you want. My dreams and hope don't ride on the back of a flawed GPU.



Oh, so it's a flawed GPU just because it will blow up if you overvolt it, which already voids the warranties of some vendors?? NO

We all thought we would never see a dual-GF110 card released on the market. However, to our surprise Nvidia was able to pull it off. "Oh wow, I overvolted my GPU and it blew up" doesn't make it a design flaw. Cards are set to a default voltage and you can tweak it at your own risk. Nowhere did Nvidia say running it at 1.2v, or whatever voltage people are pumping through it, is safe. The 590 is a hell of a card and I know for a fact people will agree with this. Look at the stuff other than performance: noise, temperature, and efficiency. It's better in each one of those aspects, and it's only slower than the 6990 by at most 5%, and even then it can rise up and beat it.

You can keep on bashing this card, but if you knew how to make it better, why aren't you an engineer at nvidia??


----------



## cheesy999 (Mar 25, 2011)

TheMailMan78 said:


> And here's butthurt #2! And call me whatever you want. My dreams and hopes don't ride on the back of a flawed GPU.



i don't think he supports nvidia, as he's switched sides since making the username; he's running a 6950 if i remember correctly


----------



## TheMailMan78 (Mar 25, 2011)

nvidiaintelftw said:


> Oh, so it's a flawed GPU just because if you overvolt it, which already voids the warranties of some vendors, it will blow up?? NO



The BIOS should have had the correct voltage limit. It didn't ship with the correct drivers either. Should I go on, NvidiaFTLose?


----------



## MxPhenom 216 (Mar 25, 2011)

cheesy999 said:


> don't think he supports nvidia as he's switched sides since the username, running a 6950 if i remember correctly



no i havent gone to a 6950 yet. still debating it


----------



## newtekie1 (Mar 25, 2011)

TheMailMan78 said:


> Well according to his review it wasn't extreme at all



For a Fermi card, it is extreme.  Again, I point out the fact that the previous Fermi cards limited the voltage to 1.087v, and they did it for a reason.



tigger said:


> Its not two gpu's sandwiched, its side by side. I dont think Nvidia have used the sandwich design since the 7950gx2.



9800GX2 and GTX295 were both sandwich designs, the GTX295 was later revised to a single PCB.



TheMailMan78 said:


> No defending an obviously flawed design is idiotic. But hey thats what fan boys do. Welcome to TPU by the way.



You haven't set the max voltage on your CPU and RAM to make sure the CPU/Mobo and RAM aren't flawed designs yet.

Calling something a flawed design because it fails when it is run way out of spec is idiotic.


----------



## TheMailMan78 (Mar 25, 2011)

nvidiaintelftw said:


> no i havent gone to a 6950 yet. still debating it



You should get a 590! I hear they have the best bang for the buck.



newtekie1 said:


> You haven't set the max voltage on your CPU and RAM yet to make sure the CPU/Mobo and RAM aren't flawed designs yet.



I have.....and guess what......they didn't blow up.


----------



## W1zzard (Mar 25, 2011)

thedude74 said:


> my motherboard allows me to crank the voltage on my CPU well beyond what Intel recommends. If I do so and it fries the CPU



not gonna happen unless you use like 50% more (educated guess; probably needs even more), and not within ~2 minutes like it happened on the 590


----------



## cheesy999 (Mar 25, 2011)

nvidiaintelftw said:


> no i havent gone to a 6950 yet. still debating it



i see your problem, changing the gfx would mean changing your username to amdintelftw and that would be seriously confusing


----------



## MxPhenom 216 (Mar 25, 2011)

cheesy999 said:


> i see your problem, changing the gfx would mean changing your username to amdintelftw and that would be seriously confusing



no i dont care about my name.


----------



## cheesy999 (Mar 25, 2011)

nvidiaintelftw said:


> no i dont care about my name.



fair point


----------



## newtekie1 (Mar 25, 2011)

TheMailMan78 said:


> I have.....and guess what......they didnt blow up.



Really, you stuck the stock cooler on your processor, selected the maximum voltage available for everything in the BIOS, and then ran a stress-testing program, and nothing blew up?  I doubt it.



TheMailMan78 said:


> BIOS should have had the correct voltage limit.



This I somewhat agree with. Users should know that just because you can doesn't mean you should, but nVidia should have made the BIOS limit the voltage to 1.087v like the previous Fermi cards; they should have done the same thing with the GTX580/570.

However, just because they didn't put the limit in the BIOS doesn't mean the card's design is flawed.



W1zzard said:


> not gonna happen unless you use like 50% more (educated guess, probably needs more). and not within ~2 minutes like it happened on the 590



Yeah, but if I select the 2.0v option on my RAM, I'll probably kill the IMC in the processor, and probably the RAM too...


----------



## Deleted member 67555 (Mar 25, 2011)

thedude74 said:


> The 6990 is a better card...so what? What I run is better than either one of them.



Wow, REALLY? You're the first to ever make my ignore list, for extreme ignorance 
I'm with MM, this card is garbage....It's an Enthusiast Card...EN-THOO-ZZ-EE-ASS-TAH
It's not a mid-level card, it's not a workstation card, it's supposed to be HIGH END!!

GARBAGE!!! 
If the 6990 did the same thing I'd call it Garbage too!


----------



## cheesy999 (Mar 25, 2011)

newtekie1 said:


> Really, you stuck the stock cooler on your processor, selected the maximum voltage available for everything in the BIOS, and then ran a stress-testing program, and nothing blew up? I doubt it.



i don't doubt him; thermal throttling generally works on a CPU, and then it doesn't blow up


----------



## MxPhenom 216 (Mar 25, 2011)

newtekie1 said:


> Really, you stuck the stock cooler on your processor, selected the maximum voltage available for everything in the BIOS, and then ran a stess testing program, and nothing blew up?  I doubt it.



my friend ran a 16-hour stress test on his GTX570 at like 1.15v and it blew the card up.


----------



## thedude74 (Mar 25, 2011)

jmcslob said:


> Wow, REALLY? You're the first to ever make my ignore list, for extreme ignorance
> I'm with MM, this card is garbage....It's an Enthusiast Card...EN-THOO-ZZ-EE-ASS-TAH
> It's not a mid-level card, it's not a workstation card, it's supposed to be HIGH END!!
> 
> ...



Re-read what you quoted and try again. Talk about extreme ignorance.


----------



## W1zzard (Mar 25, 2011)

please stop the fighting or there will be infractions, tears and closed comments for this review


----------



## pantherx12 (Mar 25, 2011)

thedude74 said:


> No, it isn't.
> 
> 
> 
> ...



The thing is, nvidia have overdraw protection in their cards, and this is what failed. You should be able to set the voltage to anything and the card would protect itself; this was not the case, and that is the problem.


----------



## MxPhenom 216 (Mar 25, 2011)

W1zzard said:


> please stop the fighting or there will be infractions, tears and closed comments for this review



you might as well just close the thread.


----------



## TheMailMan78 (Mar 25, 2011)

pantherx12 said:


> The thing is, nvidia have overdraw protection in their cards, and this is what failed. You should be able to set the voltage to anything and the card would protect itself; this was not the case, and that is the problem.



Logic isn't a welcome guest upon a fanboy's ears.


----------



## cheesy999 (Mar 25, 2011)

nvidiaintelftw said:


> you might as well just close the thread.



agree with that one, this thread's not going anywhere, just fans of either side arguing the same point over and over again


----------



## TheMailMan78 (Mar 25, 2011)

Don't close it W1zz. I'll leave, so the fanboys can discuss how the Illuminati destroyed the 590 on your test bench with their mind bullets.


----------



## Deleted member 67555 (Mar 25, 2011)

cheesy999 said:


> agree with that one, this threads not going anywhere, just fans of either side arguing the same point over and over again



The only problem with this thread is this: people miss the fact that Nvidia's top enthusiast card isn't really for enthusiasts, it's more for people that want a PnP card so they can claim they have the best. The reference design is flawed, and when Nvidia's partners come out with a better design we will get to see what this card can really do.


----------



## newtekie1 (Mar 25, 2011)

pantherx12 said:


> the thing is nvidia have a overdraw protection in thier cards . this is what failed you should be able to set voltage to anything and the card would protect itks self this was not the case. this is the problem.



Correct, that is a flaw.  As I said, I would guess the problem is that the power limiter doesn't lower the voltage, it only lowers the clocks.  So if you set the voltage too high, even if it tries to save the card by lowering the clocks, it won't, because the voltage is still too high.  But this is likely something that can be fixed with a driver, and it certainly doesn't mean the card's design itself is flawed.  Either way you look at it, it is a software problem, not a design flaw.



jmcslob said:


> The only problem with this thread is this: people miss the fact that Nvidia's top enthusiast card isn't really for enthusiasts, it's more for people that want a PnP card so they can claim they have the best. The reference design is flawed, and when Nvidia's partners come out with a better design we will get to see what this card can really do.



The design is not flawed; the software that supports the card is.  You can still overvolt and overclock the card, things enthusiasts like to do.
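The clocks-only limiter guess above can be sketched as a toy model. This is purely illustrative: it assumes the rough CMOS rule that dynamic power scales with f·V², and every constant in it (the 50MHz clock bins, the 10% cap, the 0.5 factor) is made up for the sketch, not NVIDIA's actual firmware behaviour:

```python
# Toy model of a power limiter that reacts to overdraw by cutting clocks
# only, never touching the user-set voltage. Assumes P ~ f * V^2 (rough
# CMOS dynamic-power rule); all constants are illustrative.

def power_draw(clock_mhz, voltage):
    """Rough dynamic-power estimate in arbitrary units (P ~ f * V^2)."""
    return 0.5 * clock_mhz * voltage ** 2

def limiter_step(clock_mhz, voltage, power_cap):
    """Drop clocks in 50 MHz bins until under the cap; voltage is untouched."""
    while power_draw(clock_mhz, voltage) > power_cap and clock_mhz > 300:
        clock_mhz -= 50
    return clock_mhz

cap = power_draw(607, 0.94) * 1.1       # cap set ~10% above stock draw

print(limiter_step(607, 0.94, cap))     # 607: stock volts need no throttling
print(limiter_step(607, 1.2, cap))      # 407: at 1.2 V the clocks collapse...
print(power_draw(407, 1.2) > power_draw(607, 0.94))  # True: draw still above stock
```

The point of the sketch: even after this kind of limiter has done all it can, the silicon still sits at the overvolted 1.2V, so throttling clocks alone cannot rescue a card whose voltage is already past what it was designed for.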


----------



## wolf (Mar 25, 2011)

TheMailMan78 said:


> How would you feel if you bought a car and the first time you red-lined it the engine shot out and killed your dog? Would you deem that car a piece of shit?



not even close to the same comparison, but I did enjoy the part about my dog 



newtekie1 said:


> No, the only corner they cut was in the BIOS in not locking down the voltage to lower levels.
> 
> You can't say that because the card can't handle 1.2v it is a shitty design.  The fact is that nVidia designed the card to run at 0.94v, and it does that just fine.  Raising the voltages beyond that puts it out of the area it was designed for.
> 
> And 1.2v certainly isn't a mild overvolt, not on a Fermi card.  Remember, the maximum you could even go on the original Fermi cards was 1.087v(without modding the BIOS).  So yes, on a Fermi 1.2v is a huge voltage bump.



This. The example drawn with a 6900-series card is completely different; look at the difference in stock voltages, which can account for the amount of voltage the card can handle. Fermi cards not only don't scale well after 1.1v, they don't like it either.



erocker said:


> 100% correct. I think what people are saying though, is that it could have been designed a little better. Enthusiasts like to push their enthusiast cards.



I completely grant this is a letdown for the overclocking crowd, but the card is not a failed product. Overclocking is always a temptation, but the fact is none have blown up in stock trim, at stock clocks.



newtekie1 said:


> I agree, but enthusiasts also know what happens when you push enthusiast cards too far.  Or rather they used to know, now it seems they just assume that since the voltage slider goes all the way to 11, that there is no problem with putting it there...:shadedshu



Also this. A few review sites have had the card clock down and save itself just fine, and this is way before getting to 1.2v. You just need to be patient, clock/volt the card up slowly, and quit while you're ahead; heck, ~670MHz core and 3700MHz memory is already 10% faster, and that's a lot to gain from a card that is already such a beast. I think people got their hopes up with the specs and wanted two fully fledged GTX580s on one PCB. While that's a nice thought, at least the reviews with popped cards have shown us it ain't gonna happen.


----------



## cheesy999 (Mar 25, 2011)

newtekie1 said:


> Correct, that is a flaw.  As I said, I would guess the problem is that the power limiter doesn't lower the voltage, it only lowers the clocks.  So if you set the voltage too high, even if it tries to save the card by lowering the clocks, it won't, because the voltage is still too high.  But this is likely something that can be fixed with a driver, and it certainly doesn't mean the card's design itself is flawed.  Either way you look at it, it is a software problem, not a design flaw.



problem is that people judge cards very quickly; it's gonna be like Crysis 2, where everyone moans about it because it's missing a feature for the first few weeks


----------



## newtekie1 (Mar 25, 2011)

cheesy999 said:


> problem is that people judge the cards very quickly, it's gonna be like crysis 2 where everyone moans about it because its missing a feature for the first few weeks



Sadly this is all too often true; Vista would be another great example.


----------



## sneekypeet (Mar 25, 2011)

TheMailMan78 said:


> Don't close it W1zz. Ill leave. So the fanboys can discuss how the Illuminati destroyed the 590 on your test bench with their mind bullets.



The Rosicrucians have mind bullets, not the Illuminati:shadedshu


----------



## CDdude55 (Mar 25, 2011)

...and down goes another Nvidia related thread. lol


----------



## W1zzard (Mar 25, 2011)

TheMailMan78 said:


> how the Illuminati destroyed the 590 on your test bench with their mind bullets



we dont do that


----------



## wolf (Mar 25, 2011)

W1zzard said:


> we dont do that



must be crab people then


----------



## cheesy999 (Mar 25, 2011)

newtekie1 said:


> Vista would be another great example.



still using vista, don't see the point of upgrading. i was gonna get a 6000 or 500 series gfx card but they've been really disappointing as well, so i'm still on g92 for now. still running DDR2 as it's not worth the cost of upgrading; the only thing that's been good lately is core i, and they're far too expensive for me


----------



## newtekie1 (Mar 25, 2011)

cheesy999 said:


> still using vista, don't see the point of upgrading, was gonna get a 6000 or 500 series gfx card but they've been really disappointing as well so i'm still on g92 for now, still running DDR2 as its not worth the cost upgrading, the only thing thats been good lately is core i and their far too expensive for me



If I didn't get Win7 for free, I'd still be on Vista.

If I didn't get a super amazing deal on the Mobo and CPU in my main rig, I'd still be running the 780i and Q9650(X3370) w/ DDR2.  The only reason I upgraded was because selling the X3370 paid for the new CPU and Mobo.


----------



## cheesy999 (Mar 25, 2011)

newtekie1 said:


> If I didn't get Win7 for free, I'd still be on Vista.



i can get a deal which means 7 Business for £40 and Ultimate for £60 but i still can't be bothered to buy it


----------



## Akrian (Mar 25, 2011)

wow this thread became crazy.
Here's my 2 cents on the topic. When the 480 was released it was a huge letdown, especially considering the time it took Nvidia to come up with an answer to the 5xxx series, and the fact that the 5970 still owned it. But fanboys were arguing that single GPU vs dual GPU isn't fair. Fine, although I would argue that it doesn't matter how many cores a card has; as long as it is priced and positioned as the company's flagship, it is valid to put it against the opponent's solution. So anyway, now, in 2011, Nvidia released their dual-GPU solution, which is still slower (yes, yes, barely, but atm a victory is a victory no matter how small the difference is) than ATI's solution. NOW the Nvidia fandom has no excuse to say that it's unfair to compare both solutions, and NOW they are pushing such "important" aspects of a flagship card as ... SIZE, POWER CONSUMPTION and NOISE. Did that matter when the GTX 280 was released? Or did that matter when the 9800GX2 was out? Or when the 295 was released? Oh wait... they were holding the position of the fastest cards at the time, so whenever ATI fanboys were pointing out those flaws in the cards mentioned above, they were dismissed for the same reason.

P.S. I'm not an ATI fanboy btw, as I had 480s in SLI (and actually it was pretty good, after I'd played around with airflow to keep them at 70-80c during load).


To W1zzard: what were the clocks on the 6990? The 830 or 880?


Oh btw, I also think that 1.2v is a bit too much, and it seems the card can reach 580 clocks with voltages at or below 1.05. But the fact that they released a card with the voltage limiter working incorrectly is shameful, as if they never tested it to see that it actually works. Question is: will it pass the OCCT error test....


----------



## N3M3515 (Mar 25, 2011)

Akrian said:


> wow this thread became crazy.
> Here's my 2 cents on the topic. When the 480 was released it was a huge letdown, especially considering the time it took Nvidia to come up with an answer to the 5xxx series, and the fact that the 5970 still owned it. But fanboys were arguing that single GPU vs dual GPU isn't fair. Fine, although I would argue that it doesn't matter how many cores a card has; as long as it is priced and positioned as the company's flagship, it is valid to put it against the opponent's solution. So anyway, now, in 2011, Nvidia released their dual-GPU solution, which is still slower (yes, yes, barely, but atm a victory is a victory no matter how small the difference is) than ATI's solution. NOW the Nvidia fandom has no excuse to say that it's unfair to compare both solutions, and NOW they are pushing such "important" aspects of a flagship card as ... SIZE, POWER CONSUMPTION and NOISE. Did that matter when the GTX 280 was released? Or did that matter when the 9800GX2 was out? Or when the 295 was released? Oh wait... they were holding the position of the fastest cards at the time, so whenever ATI fanboys were pointing out those flaws in the cards mentioned above, they were dismissed for the same reason.



*AMEN*

You read my mind: now that the Radeon is faster, suddenly they do care about noise and other stuff besides perf.
No matter what happens, nvidia always wins lol

There is a saying for that in my country: "Juan Zapata, si no la GANA (wins), la EMPATA! (draws)"


----------



## wolf (Mar 26, 2011)

N3M3515 said:


> You read my mind: now that the Radeon is faster, suddenly they do care about noise and other stuff besides perf.
> No matter what happens, nvidia always wins lol
> 
> There is a saying for that in my country: "Juan Zapata, si no la GANA (wins), la EMPATA! (draws)"



from what I've seen most users here fully accept that the radeon is the faster card, I certainly have no reservations about that. It was Nvidia themselves that worked so hard to give it other attributes like quietness/length, because they probably knew it would be too hard to straight out beat the 6990 for speed while staying below 375w. and good on them for doing it, they are capitalizing on the cards strengths, as they should.


----------



## N3M3515 (Mar 26, 2011)

wolf said:


> from what I've seen most users here fully accept that the radeon is the faster card, I certainly have no reservations about that. It was Nvidia themselves that worked so hard to give it other attributes like quietness/length, because they probably knew it would be too hard to straight out beat the 6990 for speed while staying below 375w. and good on them for doing it, they are capitalizing on the cards strengths, as they should.



Fair enough, but i'm talking about people who, when nvidia has the perf lead, don't care about acoustics, temps, or power, because it is the absolute best and "enthusiasts" only care about perf; but when ati has the lead, oh! they magically prefer the nvidia card because it's quieter, etc, etc. Such hypocrisy....
i've had cards from both camps, from the gf mx440 to the hd4870, and both camps have had their good and not-so-good cards; fanboys prefer nvidia or ati no matter how full of crap they may be....  
The only reason that kind of people will accept losing is if, for example, the 6990 were like 500% faster than the gtx 590 AND they still preferred the gtx 590 because it's smaller and quieter. LOL, hilarious


----------



## erocker (Mar 26, 2011)

N3M3515 said:


> Fair enough, but i'm talking about people who, when nvidia has the perf lead, don't care about acoustics, temps, or power, because it is the absolute best and "enthusiasts" only care about perf; but when ati has the lead, oh! they magically prefer the nvidia card because it's quieter, etc, etc. Such hypocrisy....
> i've had cards from both camps, from the gf mx440 to the hd4870, and both camps have had their good and not-so-good cards; fanboys prefer nvidia or ati no matter how full of crap they may be....
> The only reason that kind of people will accept losing is if, for example, the 6990 were like 500% faster than the gtx 590 AND they still preferred the gtx 590 because it's smaller and quieter. LOL, hilarious



I don't see how any of it matters or is relevant.


----------



## wolf (Mar 26, 2011)

N3M3515 said:


> Fair enough, but i'm talking about people that when nvidia has the perf lead... *snip*



I can definitely see that what you're saying is true for some people, but screw them, they're idiots anyway. If I were to buy either of the two today it would be a 6990; however, for a single-GPU card I'd take the GTX580, go figure


----------



## Kreij (Mar 26, 2011)

Before this thread gets closed I would just like to say ...
 "omgz lul why so old drivers wizz you noob?" 

This is an inside joke, and a pre-emptive strike for the pain we are going to endure during this year's April Fools' joke.

That being said, this thread is being watched closely so be aware that if you incur infractions from being out of line, you brought it upon yourself. You've been warned by W1zz already.


----------



## qubit (Mar 26, 2011)

thedude74 said:


> No, it isn't.
> 
> 
> 
> ...



I've just read that article. Well, it seems fairly conclusive then that putting 1.2v on the GPU is gonna kill something. No wonder the poor 590 expired.



tigger said:


> Its not two gpu's sandwiched, its side by side. I dont think Nvidia have used the sandwich design since the 7950gx2.



The first revision of the GTX 295 was made with a sandwich design, and so was the 9800 GX2 before it.


----------



## Deleted member 24505 (Mar 26, 2011)

What about these? Were these before the GTX295 and 9800GX2?

http://www.firingsquad.com/media/hirez.asp?file=/hardware/geforce_7950_gx2_preview/images/03.jpg


----------



## cowie (Mar 26, 2011)

tigger said:


> What about these? were these before the Gtx295 and 9800 gx2?
> 
> http://www.firingsquad.com/media/hirez.asp?file=/hardware/geforce_7950_gx2_preview/images/03.jpg



The 7900GX2 and 7950GX2 (better) were more of a lol "layer cake" with two PCBs, but not sharing the same heatsink, and the cores were not face to face.  
The GX2 and first-revision 295 have a sandwich design and shared a common heatsink.
Later, as said, NV came out with what I think was called the 295+, built on a single PCB with side-by-side cores.
Oh, and other dual-core cards in the (Giga) 6 series were made too (66 for sure, 68? limited to none); they were sorta side by side with huge PCBs.
Darn, I almost forgot the ASUS 78GT dual side-by-side core card.
OK, back to pictures of stuff blowing up or fighting, depending on where I am lol


----------



## Akrian (Mar 26, 2011)

Oh, almost forgot. Before they close the thread, I just have to put this out (old, but hey, it's always relevant): Bitchin Fast 2000


----------



## JATownes (Mar 26, 2011)

btarunr said:


> There is an Easter egg in the review. First to find gets a cookie.





btarunr said:


> Nope. Keep trying.
> 
> First spotter gets a custom title. If you already have a custom title, you can animate your avatar as long as it doesn't pose an epileptic hazard.



Before the thread is closed, did anyone ever find the Easter Egg?

Custom title on the line and no one has spotted it? 

Oh and BTW, the 6990 and 590 are WAY OVERKILL for any games out today.  But IMHO, both are pretty bad ass cards.


----------



## erocker (Mar 26, 2011)

This thread will not be closed. People posting in this thread will be accountable for their actions. Keep on topic.

Thank you.


----------



## entropy13 (Mar 26, 2011)

JATownes said:


> Before the thread is closed, did anyone ever find the Easter Egg?
> 
> Custom title on the line and no one has spotted it?



This is actually the only reason I still read this thread: to see if someone has made a guess and if they're correct. lol

I'm still looking though (although none of the guesses have been deemed "wrong" yet? I think?)


----------



## yogurt_21 (Mar 26, 2011)

entropy13 said:


> This is actually the only reason I still read this thread, to see if someone have made a guess and if they're correct. lol
> 
> I'm still looking though (although none of the guesses have been deemed "wrong" yet? I think?)



yeah, I'm still curious as well, but bta hasn't been back since the infighting escalated.


I still find it funny that everyone focused on the blown-card issue and failed to see that even if that had worked, the card is still slower than the 6990 and much, MUCH slower than 580 SLI. I mean, shoot, my 480 SLI is tearing the crap outta this card. 

I'm sure nv will address the overvolt limiter issue; I just don't think it matters, as for $700, 570 SLI will tear the crap outta this card.


----------



## entropy13 (Mar 26, 2011)

Some WMG here, but maybe, just maybe, the "stock" clocks are actually *overclocked* clocks already?


----------



## newtekie1 (Mar 26, 2011)

yogurt_21 said:


> yeah, I'm still curious as well, but bta hasn't been back since the infighting escalated.
> 
> 
> I still find it funny that everyone focused on the blown card issue and failed to see that even if that had worked the card is still slower than the 6990 and much MUCH slower than 580sli. I mean shoot my 480sli is tearing the crap outta this card.
> ...



Well of course, but when was the last time a dual-GPU card was actually better performance for the cash than two single-GPU cards?  I mean, GTX570/580 SLI also outperforms the HD6990; that won't stop people from buying it.

And considering it hit GTX580 clocks without a voltage bump, it isn't as bad as the stock clocks make it seem.  That is a roughly 25% overclock on stock voltage; what did the HD6990 manage with stock voltage?  Only a 10% overclock.
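The headroom arithmetic above checks out in a couple of lines. The stock clocks are the published ones (607MHz for the GTX 590, 830MHz for the HD 6990); the 915MHz HD 6990 figure is an assumed achieved clock picked only to match the quoted ~10%:

```python
def oc_percent(stock_mhz, achieved_mhz):
    """Overclock headroom as a rounded percentage over stock."""
    return round((achieved_mhz / stock_mhz - 1) * 100)

# GTX 590 stock (607 MHz) raised to GTX 580 clocks (772 MHz):
print(oc_percent(607, 772))  # 27, i.e. the "roughly 25%" cited above

# HD 6990 stock (830 MHz) to an assumed ~915 MHz achieved clock:
print(oc_percent(830, 915))  # 10
```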


----------



## btarunr (Mar 26, 2011)

I will reveal the easter egg in an hour. Last chance, keep trying.


----------



## Maban (Mar 26, 2011)

btarunr said:


> I will reveal the easter egg in an hour. Last chance, keep trying.



QC Passed? Torx screws? 94V-0 rating?


----------



## entropy13 (Mar 26, 2011)

btarunr said:


> I will reveal the easter egg in an hour. Last chance, keep trying.



It's the card with the lowest score in TPU?


----------



## btarunr (Mar 26, 2011)

entropy13 said:


> It's the card with the lowest score in TPU?



You're close.


----------



## Maban (Mar 26, 2011)

Lowest score of anything on TPU?


----------



## btarunr (Mar 26, 2011)

Answer:

http://www.generalnonsense.net/showthread.php?p=113340#post113340

Thanks for playing.


----------



## Maban (Mar 26, 2011)

Now that's just silly.


----------



## entropy13 (Mar 26, 2011)

btarunr said:


> Answer:
> 
> http://www.generalnonsense.net/showthread.php?p=113340#post113340
> 
> Thanks for playing.



NOOOOOOOOOOO I was too late LOL


----------



## pantherx12 (Mar 26, 2011)

newtekie1 said:


> Really, you stuck the stock cooler on your processor, selected the maximum voltage available for everything in the BIOS, and then ran a stress-testing program, and nothing blew up?  I doubt it.




Ran 1.7-1.8 volts through an E5200 under an Arctic Freezer 7 Pro (4.24GHz).

Not only did it not die, it's still alive and kicking to this day. (I did this around when I first joined TPU; I got into the 4GHz club safely but wanted to see how far I could push the chip, and I was trying to beat JRacingfan I think, because he pretty much taught me how to fiddle with that chip himself XD)

Hell, it could boot into windows; hell, I could open CPU-Z and take a screenshot.

If I ran a bit of a heavier program, though, the system just crashed. No damage.



Has to be said, that was a good "Easter egg"


----------



## qubit (Mar 26, 2011)

Oh, I'd found the easter egg and was simply giving everyone else a chance.


----------



## Mussels (Mar 26, 2011)

LOL epic fail was well done.


----------



## cheesy999 (Mar 26, 2011)

look what i just found - AMD is challenging NVidia to beat its score in 3DMark

http://blogs.amd.com/play/2011/03/25/2056/



> Yesterday our competitor also issued a press release, announcing the launch of what they claim to be the “World’s Fastest Graphics Card”– the Nvidia GTX 590. We combed through their announcement to understand how it was that such a claim could be made and why there was no substantiation based on industry-standard benchmarks, similar to what AMD did with industry benchmark 3DMark 11, the latest DirectX 11 benchmark from FutureMark. So now I issue a challenge to our competitor: prove it, don’t just say it. Show us the substantiation. Because as it stands today, leading reviewers agree with us here, here and here that the AMD Radeon HD 6990 sits on the top as the world’s fastest graphics card.


----------



## Mussels (Mar 26, 2011)

cheesy999 said:


> look what i just found - AMD is challenging NVidia to beat its score in 3DMark
> 
> http://blogs.amd.com/play/2011/03/25/2056/



ohhh yeah, i wanna see how this unfolds.


----------



## cowie (Mar 26, 2011)

W1zzard said:


> testing voltage tuning on msi hd 6950 twin frozr iii now..
> 
> so guys .. where should i stop ? after 15 - 25 mv like nvidia recommends ? or go as far as the slider lets me ?



Well, since you asked:
I would think you would go as far as an end user is able to using normal AMD software.
And if you use more, then you would disclose that you did, so as not to skew end-user overclocking results.

It would end up sorta like the 460 Hawk reviews otherwise.
In that case, reviewers had +200mv in a special Afterburner build but end users only had +100mv to use. That was done maybe for safety, but it did mess up end-user results.


----------



## Harlequin_uk (Mar 26, 2011)

teutonic humour - who would have thought  hats off guys nice one


----------



## Kreij (Mar 26, 2011)

Mussels said:


> ohhh yeah, i wanna see how this unfolds.



That's almost worthy of its own thread. Should be interesting, to say the least.


----------



## cheesy999 (Mar 26, 2011)

Kreij said:


> That's almost worthy of it's own thread. Should be interesting to say the least.



i'd have made one, but i think it counts as news and the mods generally don't like us normal people posting news


----------



## Kreij (Mar 26, 2011)

We don't mind if you post an occasional news item you find (in the appropriate section) as long as you don't spam news or duplicate something already posted as a headline in the news section.
We already have the world's foremost news spammer (Bta) and to try to compete with him would be sheer folly.


----------



## Harlequin_uk (Mar 26, 2011)

reading previous reviews - the 6850 in CF and the gtx 460 (even the 768MB one, e.g.) in sli are within 5% of this card - for less than 1/2 of the cost oO


----------



## cheesy999 (Mar 26, 2011)

Kreij said:


> We don't mind if you post an occasional news item you find (in the appropriate section) as long as you don't spam news or duplicate something already posted as a headline in the news section.
> We already have the world's foremost news spammer (Bta) and to try to compete with him would be sheer folly.



so where would this go - general hardware or just straight in the news section


----------



## Kreij (Mar 26, 2011)

I would put it in the Graphic Card section. (You don't have access to post in the news section I believe)
That way Mussels, Paulieg and BP have to deal with the drama.


----------



## wahdangun (Mar 26, 2011)

is that the easter egg ???


----------



## Kreij (Mar 26, 2011)

Yes.


----------



## newtekie1 (Mar 26, 2011)

cheesy999 said:


> look what i just found - AMD is challenging NVidia to beat it score in 3dmark
> 
> http://blogs.amd.com/play/2011/03/25/2056/



Simple answer to AMD's request: nVidia will simply pick one random test, probably a real-world benchmark that carries more weight than the synthetic 3DMark 11, and show the GTX590 beating the HD6990.

nVidia will probably pick HAWX2 as "an accurate representation of real world DX11 gaming."


----------



## cadaveca (Mar 26, 2011)

newtekie1 said:


> Nvidia will probably pick HAWX2 as "an accurate representation of real world DX11 gaming."



You know that's exactly what went through the heads of those responsible for such @ AMD....

And yes, in my opinion, the card is an EPIC FAIL....I really push my stuff to the limit, and I haven't killed anything really since Core 2 Duo. I guess I've made a whole bunch of 5870's develop a cold bug, but they didn't melt down and send molten bits onto the rest of my rig.

Nice easter egg, W1zz!


----------



## MxPhenom 216 (Mar 26, 2011)

look at what Linus was able to do. He didn't compare it to the 6990, but he was able to max out the voltage on the beta AB version he got from MSI, and it didn't blow up at 1.050v and 809MHz. There's also a guy over at OCN who has a 590 HC at 850MHz with 1.050v and it's running great for him.

http://www.youtube.com/watch?v=XRhne0odjJA&feature=player_embedded#at=325


----------



## N3M3515 (Mar 26, 2011)

Harlequin_uk said:


> reading previous reviews - the 6850 in CF and the GTX 460 (even the 768 MB one) in SLI are within 5% of this card - for less than 1/2 of the cost oO



If you're going to play at 1080p I agree, but since these cards are for 1600p and beyond, the extra cash is justified.

HD 6970/6950 CF, well that's another story (not mentioning SLI GTX 570 because of low memory)


----------



## newtekie1 (Mar 26, 2011)

nvidiaintelftw said:


> look at what Linus was able to do. He didn't compare it to the 6990, but he was able to max out the voltage on the beta AB version he got from MSI, and it didn't blow up at 1.050v and 809MHz. There's also a guy over at OCN who has a 590 HC at 850MHz with 1.050v and it's running great for him.
> 
> http://www.youtube.com/watch?v=XRhne0odjJA&feature=player_embedded#at=325



Yeah, like I said, I would guess these cards are safe at 1.1v or below.

What I find interesting is that Linus says he "maxed out the slider", which is only 1.05v.  So it seems nVidia has set the maximum there; only via BIOS editing or bypassing that BIOS limit in some way can you go higher.

So all the people saying 1.2v isn't extreme, obviously it is.  And everyone saying that it is a bad card because it popped at 1.2v, well nVidia limits you to 1.05v just like I said they should (well, I said 1.087v), and you have to bypass that limit and that is why it pops.

In fact the GTX590 BIOS I just looked at definitely has the voltage limited to 1.063v, which is what software is _supposed_ to follow and why AfterBurner's slider stops at 1.05v.  It is possible to edit the BIOS and go higher, or find software that ignores that limit, but obviously there is a reason it is limited to 1.063v, and an enthusiast should know this.

So, conclusion: the card will not pop at stock, and it will more than likely not pop even if you max out the voltage slider either.
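The BIOS-limit behaviour described in that post can be sketched in a few lines. This is purely an illustration: the 1.063v figure comes from the post above, while the clamping function itself is hypothetical and not NVIDIA's actual driver code.

```python
# Illustration only: a well-behaved tuning tool reads the BIOS voltage cap
# and refuses to apply anything above it. The 1.063 V cap is from the post;
# the function is a hypothetical sketch, not NVIDIA's real logic.
BIOS_LIMIT_V = 1.063  # voltage cap stored in the card's BIOS

def clamp_voltage(requested_v, bios_limit_v=BIOS_LIMIT_V):
    """Clamp a requested GPU voltage to the BIOS limit."""
    return min(requested_v, bios_limit_v)

print(clamp_voltage(1.2))    # 1.063 -- a compliant tool never applies 1.2 V
print(clamp_voltage(0.94))   # 0.94 -- stock voltage passes through unchanged
```

On this picture, a card only sees 1.2v when software ignores the cap or the BIOS itself is edited, which is exactly the bypass the post describes.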


----------



## MxPhenom 216 (Mar 26, 2011)

newtekie1 said:


> Yeah, like I said, I would guess these cards are safe at 1.1v or below.
> 
> What I find interesting is that Linus says he "maxed out the slider", which is only 1.05v.  So it seems nVidia has set the maximum there; only via BIOS editing or bypassing that BIOS limit in some way can you go higher.
> 
> So all the people saying 1.2v isn't extreme, obviously it is.  And everyone saying that it is a bad card because it popped at 1.2v, well nVidia limits you to 1.05v just like I said they should (well, I said 1.087v), and you have to bypass that limit and that is why it pops.



so basically im not considering this card a fail anymore


----------



## CDdude55 (Mar 26, 2011)

nvidiaintelftw said:


> so basically im not considering this card a fail anymore



It was never a ''fail card'' as some have suggested, those are just people overreacting.

The only issues arise when it is severely overvolted; the actual card itself is fine. I don't understand how people can deem the whole card ''fail'' based on an uncommon issue.


----------



## Harlequin_uk (Mar 26, 2011)

N3M3515 said:


> If you're going to play at 1080p I agree, but since these cards are for 1600p and beyond, the extra cash is justified.
> 
> HD 6970/6950 CF, well that's another story (not mentioning SLI GTX 570 because of low memory)



well TPU hasn't got a review of a multi-screen setup for this.... the highest they go to is 2560x1600


----------



## Regenweald (Mar 26, 2011)

A 700 dollar 'enthusiast' card that can't handle overclocking by some of the best in the world. Nah, that's not a fail....it's just that Nvidia as a company has had a change in design direction, and lower voltages, temps and acoustics are now what the enthusiast crowd is calling for, right?


----------



## HalfAHertz (Mar 26, 2011)

Anyway, it wasn't the core that failed, it was an electrical component. So if anyone is to blame, it should be the AIB partners for cheaping out. Don't forget that ATI and Nvidia's cards are manufactured at the same place, using the same methods and similar silicon.

So if ATI's cores can scale to 1.3 and 1.4v then probably so can Nvidia's. And a lot of people seem to forget that the important frequency on Nvidia cards is the shader one, the little bits that actually do all the hard work, which is twice the core frequency. So they do scale nicely with voltage as well.


----------



## newtekie1 (Mar 26, 2011)

HalfAHertz said:


> Anyway, it wasn't the core that failed, it was an electrical component. So if anyone is to blame, it should be the AIB partners for cheaping out. Don't forget that ATI and Nvidia's cards are manufactured at the same place, using the same methods and similar silicon.
> 
> So if ATI's cores can scale to 1.3 and 1.4v then probably so can Nvidia's. And a lot of people seem to forget that the important frequency on Nvidia cards is the shader one, the little bits that actually do all the hard work, which is twice the core frequency. So they do scale nicely with voltage as well.



It isn't really the AIB's fault that the power circuitry failed when they followed nVidia's reference design using nVidia's reference parts, and it fails because nVidia's GPU pulled too much power.

The reason AMD cores can handle 1.3-1.4+v is because of how much power the core actually pulls.  Power is measured in watts, which is volts times amps.  AMD's cores pull fewer amps, so they can handle higher volts.

Though, as it seems now, there really is no fault.  Unless ASUS shipped W1z cards without the BIOS limited to 1.05v, then that would be ASUS's fault.  Otherwise, whatever method W1z used to raise the voltage past the 1.05v limit is to blame, as the cards shouldn't be pushed past this limit, which is why nVidia put it there.
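The watts/volts/amps point above can be put as a quick back-of-the-envelope sketch. All numbers here are hypothetical, just to show that at the same power draw, a core running at lower voltage pulls more amps:

```python
# Back-of-the-envelope only: P = V * I, so I = P / V. At a fixed power
# budget, lower core voltage means higher current through the VRM.
# Both the 300 W figure and the voltages below are hypothetical examples.
def core_current_a(power_w, core_voltage_v):
    """Current (amps) a GPU core pulls at a given power and core voltage."""
    return power_w / core_voltage_v

print(round(core_current_a(300, 0.94)))  # ~319 A at 0.94 V
print(round(core_current_a(300, 1.17)))  # ~256 A at 1.17 V, same 300 W
```

That is the shape of the argument: a core that runs its workload at fewer amps leaves more headroom before the power circuitry becomes the limit when voltage is raised.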


----------



## Lionheart (Mar 26, 2011)

http://www.youtube.com/watch?v=sRo-1VFMcbc

Thought this was intermaresting


----------



## newtekie1 (Mar 26, 2011)

Lionheart said:


> http://www.youtube.com/watch?v=sRo-1VFMcbc
> 
> Thought this was intermaresting



Yeah, it's been posted 3 times in this thread now...

Sweclockers' cards died because they were still using the faulty original 267.52 driver.

From their article on their issues:



> The first video card gave up the ghost when I overclocked by raising the voltage to the GPU. I did not think much more about it; that is, after all, what might happen, and there are always Monday specimens, especially when it comes to early "samples".
> 
> Shortly thereafter, two more cards suffered the same misfortune, and we decided to investigate it all together with Nvidia.
> 
> Another video card was sacrificed in order to reach the conclusion that the 267.52 driver is the culprit. The test was repeated with the newer 267.71 and Nvidia's safety mechanisms worked.



Roughly translated.


----------



## gaximodo (Mar 26, 2011)

And can someone explain to me why there isn't a voltage tuning section for the 6990?
The 590 has far higher OC potential even at stock, which is perfectly safe, and while providing the option to overvolt it also runs quieter and is shorter than the 6990. AMD's new DX11 lineup has forced lots of people to change their cases, while NV never went above 11 inches.
Why should I consider a 6990 over this?


----------



## cowie (Mar 26, 2011)

newtekie1 said:


> It isn't really the AIB's fault that the power circuitry failed when they followed nVidia's reference design using nVidia's reference parts, and it fails because nVidia's GPU pulled too much power.
> 
> The reason AMD cores can handle 1.3-1.4+v is because of how much power the core actually pulls.  Power is measured in watts, which is volts times amps.  AMD's cores pull fewer amps, so they can handle higher volts.
> 
> Though, as it seems now, there really is no fault.  Unless ASUS shipped W1z cards without the BIOS limited to 1.05v, then that would be ASUS's fault.  Otherwise, whatever method W1z used to raise the voltage past the 1.05v limit is to blame, as the cards shouldn't be pushed past this limit, which is why nVidia put it there.



The card was either: 1) not retail, 2) the BIOS was modded, or 3) a pro version of AB was used (even a BIOS cap at 1.063 would limit voltage adjustment through the API).
Plus there are reports of AB overvolting using the 590.
I just went through the BIOS myself.


----------



## newtekie1 (Mar 26, 2011)

gaximodo said:


> And can someone explain to me why there isn't a voltage tuning section for the 6990?



I could be wrong, but I believe it is because, at the time of the review's writing, there was no way to adjust voltages on the HD6990 besides flipping the BIOS switch, which gave a slight voltage boost.

I'm not sure if there is even software available now that allows the voltage on the HD6990 to be adjusted, but I haven't exactly gone hunting for it, so it could be out there.


----------



## SetsunaFZero (Mar 26, 2011)

the question is "Will it blend" emm burn  
http://www.youtube.com/watch?v=sRo-1VFMcbc&feature=player_embedded


----------



## DanishDevil (Mar 26, 2011)

SetsunaFZero said:


> the question is "Will it blend" emm burn
> http://www.youtube.com/watch?v=sRo-1VFMcbc&feature=player_embedded



Now 4th time posted in the same thread...


----------



## Saabjock (Mar 27, 2011)

What's with this sense of entitlement guys? I don't see anywhere that you're allowed to over-volt or over-current any videocard...AMD or Nvidia. The fact that it's been done with various levels of success in the past, does not dictate that it should always be accomplished with the same result. You know the risk, you assume the risk and you do so on your own accord. If it works...great. If it doesn't...you knew the risk going in. That's why you are considered an overclocker. Operating the thing outside of the design specifications is entirely up to the buyer...as are the consequences. Try over volting your cellphone or anything else, I can almost guarantee you...the results will be the same.


----------



## Kreij (Mar 27, 2011)

Except in this case, Saab, Asus is promoting doing just that on the outside of their box.


----------



## thedude74 (Mar 27, 2011)

Regenweald said:


> A 700 dollar 'enthusiast' card that can't handle overclocking by some of the best in the world. Nah, that's not a fail....it's just that Nvidia as a company has had a change in design direction, and lower voltages, temps and acoustics are now what the enthusiast crowd is calling for, right?



The card overclocks just fine. In fact, it overclocks much better than the 6990 does, it just needs less voltage. "Some of the best in the world" failed to do their homework and find out what the max voltage on this card is, so they blew it up.

I've still yet to see a single explanation of why the reviewers thought 1.2v would be acceptable for this card when it's higher than even a 580 should be volted.


----------



## Mussels (Mar 27, 2011)

Saabjock said:


> What's with this sense of entitlement guys? I don't see anywhere that you're allowed to over-volt or over-current any videocard...AMD or Nvidia. The fact that it's been done with various levels of success in the past, does not dictate that it should always be accomplished with the same result. You know the risk, you assume the risk and you do so on your own accord. If it works...great. If it doesn't...you knew the risk going in. That's why you are considered an overclocker. Operating the thing outside of the design specifications is entirely up to the buyer...as are the consequences. Try over volting your cellphone or anything else, I can almost guarantee you...the results will be the same.





Kreij said:


> Except in this case, Saab, Asus is promoting doing just that on the outside of their box.



^ what he said.


This was not a regular 590 here, but an ASUS model which specifically stated it was designed for overvolting and overclocking.


----------



## senninex (Mar 27, 2011)

Overall!!!.... HD6990 is the choice of user & the King!


----------



## wolf (Mar 27, 2011)

Mussels said:


> ^ what he said.
> 
> 
> This was not a regular 590 here, but asus model which specifically stated it was designed for overvolting and overclocking.



but it absolutely is a regular GTX590. Yes, of course Asus states voltage tweak on the box, and yes, that is misleading, but 99% of people buying this $699 card aren't deciding their purchase based on stickers on the box. I would hope...


----------



## Mussels (Mar 27, 2011)

wolf said:


> but it absolutely is a regular GTX590. Yes, of course Asus states voltage tweak on the box, and yes, that is misleading, but 99% of people buying this $699 card aren't deciding their purchase based on stickers on the box. I would hope...



if those stickers said "comes with a fan" and it didn't come with a fan, you'd be pissed, yes?


the thing you don't get is that we aren't talking about the other cards, we're talking about this one. and doing what it says you can do, on the box... kills it. it's like buying a car that self destructs if you put it into reverse while the air con is on.


----------



## wolf (Mar 27, 2011)

Mussels said:


> if those stickers said "comes with a fan" and it didn't come with a fan, you'd be pissed, yes?
> 
> 
> the thing you don't get is that we aren't talking about the other cards, we're talking about this one. and doing what it says you can do, on the box... kills it. it's like buying a car that self destructs if you put it into reverse while the air con is on.



just keep in mind the box says _up to_ 50%, anyone with some sense should know that's just marketing hype, you might only get 5% more out of it.

I am still a firm believer that far too much voltage was used on this card, and others that have popped. but maybe that's just me.

I also disagree with your car analogy, it's more like buying a car that is advertised with a top speed of 300 km/h, you do 300 km/h in it and crash, I wouldn't be surprised at all.


----------



## Saabjock (Mar 27, 2011)

If ASUS is advocating overvolting the cards in anyway...then they should 'eat' the cost of replacement. It's just that simple.


----------



## Kreij (Mar 27, 2011)

I agree completely, Saab.


----------



## Kreij (Mar 27, 2011)

wolf said:


> I also disagree with your car analogy, it's more like buying a car that is advertised with a top speed of 300 km/h, you do 300 km/h in it and crash



No, it's more like buying a car that says top speed 300 and when you try that the engine blows up.
If they make claims, they'd better support them with facts, or pay for replacements.

Oops ... sorry about double post. I can't merge them here. DOH!


----------



## wolf (Mar 27, 2011)

Kreij said:


> No, it's more like buying a car that says top speed 300 and when you try that the engine blows up.
> If they make claims, they'd better support them with facts, or pay for replacements.



depending on how long you drive at 300 for, that just might happen. in any case I don't want to argue with you guys about your point of view on this one, I just believe that because the slider goes up to 11 doesn't mean it should be set there. yes they advocate overvolting and should assume responsibility for the dead cards, but the user should assume _some_ also for actually doing it, that's all.

lol I was gunna say, mod double posting!


----------



## Deleted member 24505 (Mar 27, 2011)

If they didn't want people overvolting, it would not be an option in the drivers. Also, these are all enthusiast cards, which will be modded/overvolted etc. A normal user will not pay $700 for a video card; people who buy these know the risks and do take them.

If it states it can do whatever on the box, then it should be able to do it.


----------



## TheMailMan78 (Mar 27, 2011)

Kreij said:


> No, it's more like buying a car that says top speed 300 and when you try that the engine blows up.
> If they make claims, they'd better support them with facts, or pay for replacements.
> 
> Oops ... sorry about double post. I can't merge them here. DOH!



I already used that analogy but some people are blinded by the green light.


----------



## newtekie1 (Mar 27, 2011)

Mussels said:


> if those stickers said "comes with a fan" and it didnt come with a fan, you'd be pissed, yes?
> 
> 
> the thing you dont get is that arent talking about the other cards, we're talking about this one. and doing what it says you can do, on the box... kills it. its like buying a car that self destructs if you put it into reverse while the air con is on.



Nowhere on the box does it say you can pump 1.2v through the card.  The box says "Voltage Tweak", meaning you can adjust the voltage, and "up to 50% faster", meaning you can increase the voltage (and you can safely up to 1.06v) and when you do that and overclock the card they managed to find one specific scenario that showed a 50% performance gain with the overclock.

Now, ASUS might have disabled the 1.06v BIOS limit on this card that nVidia recommends, we don't know since W1z hasn't told us.  However, assuming they didn't, then the card can still be considered a "Voltage Tweak" card because you can still increase the voltage from the stock 0.94v.

Also, the box clearly says "Voltage Tweak technology allows you to boost GPU voltage via ASUS SmartDoctor to achieve up to 50%* faster clock speeds." *something about results may vary; extreme cooling is required to achieve 50% faster.



Kreij said:


> No, it's more like buying a car that says top speed 300 and when you try that the engine blows up.
> If they make claims, they'd better support them with facts, or pay for replacements.
> 
> Oops ... sorry about double post. I can't merge them here. DOH!



If you read the box, that analogy is wrong.  I think the best analogy would be buying a car that says you can increase the horsepower from 200HP to 300HP by adjusting the boost using the boost controller buttons on the dash, with the fine print saying you have to install a better radiator first or the engine will die from the extra heat.  Then a user buys the car, ignores the fine print, maxes out the boost using the buttons the day they bought the car, then somehow manages to bypass the boost limit and goes even higher than the boost controller was limited to by the manufacturer, and then the engine pops.


----------



## Kreij (Mar 27, 2011)

wolf said:


> lol I was gunna say, mod double posting!



My internet connection is so slow I figured someone would sneak in a post before mine hit the boards.


----------



## CDdude55 (Mar 27, 2011)

TheMailMan78 said:


> I already used that analogy but some people are blinded by the green light.



That analogy is incorrect, they are not advertising overvolting at all on the box.

What he said basically v



newtekie1 said:


> Nowhere on the box does it say you can pump 1.2v through the card.  The box says "Voltage Tweak", meaning you can adjust the voltage, and "up to 50% faster", meaning you can increase the voltage (and you can safely up to 1.06v) and when you do that and overclock the card they managed to find one specific scenario that showed a 50% performance gain with the overclock.
> 
> Now, ASUS might have disabled the 1.06v BIOS limit on this card that nVidia recommends, we don't know since W1z hasn't told us.  However, assuming they didn't, then the card can still be considered a "Voltage Tweak" card because you can still increase the voltage from the stock 0.94v.


----------



## TheMailMan78 (Mar 27, 2011)

newtekie1 said:


> Nowhere on the box does it say you can pump 1.2v through the card.  The box says "Voltage Tweak", meaning you can adjust the voltage, and "up to 50% faster", meaning you can increase the voltage (and you can safely up to 1.06v) and when you do that and overclock the card they managed to find one specific scenario that showed a 50% performance gain with the overclock.
> 
> Now, ASUS might have disabled the 1.06v BIOS limit on this card that nVidia recommends, we don't know since W1z hasn't told us.  However, assuming they didn't, then the card can still be considered a "Voltage Tweak" card because you can still increase the voltage from the stock 0.94v.



It doesn't matter. You could have put 3.0v in it and it shouldn't have blown. The damn overdraw protection is flawed. The "protection" kicked in and the rig rebooted. When it rebooted, the "protection" didn't load and blew the fucking card. It's a piece of shit.



CDdude55 said:


> That analogy is incorrect, they are not advertising overvolting at all on the box.
> 
> What he said basically v



You cannot get anywhere close to 50% faster without overvolting the card. Its common sense.


----------



## W1zzard (Mar 27, 2011)

guys, even if the card hadn't died it wouldn't have gotten a much better score. 

as i said in the conclusion it works great out of the box when you're not tweaking it, but what it delivers still isn't impressive or anywhere near it. why not stop the discussion about card died, card won't die, card not designed for overvolt, and talk about what else the card brings to the table, and whatnot?


----------



## CDdude55 (Mar 27, 2011)

TheMailMan78 said:


> You cannot get anywhere close to 50% faster without overvolting the card. Its common sense.



That's just marketing, it's common sense.


----------



## TheMailMan78 (Mar 27, 2011)

W1zzard said:


> guys, even if the card hadn't died it wouldn't have gotten a much better score.
> 
> as i said in the conclusion it works great out of the box when you're not tweaking it, but what it delivers still isn't impressive or anywhere near it. why not stop the discussion about card died, card won't die, card not designed for overvolt, and talk about what else the card brings to the table, and whatnot?



So in your opinion, would they have been better off with two 570's in one card rather than two 580's clocked low?



CDdude55 said:


> That's just marketing, it's common sense.



It really doesn't matter what Asus said. The card's overdraw protection failed.


----------



## Kreij (Mar 27, 2011)

W1zzard said:


> and talk about what else the card brings to the table



Good idea. What exactly does this card "bring to the table"?
Anything other than Nvidia being able to say "we have one too"? (dual GPU)


----------



## newtekie1 (Mar 27, 2011)

TheMailMan78 said:


> You cannot get anywhere close to 50% faster without overvolting the card. Its common sense.



And it also says on the box that you can't get anywhere near 50% without extreme cooling.

The card can be overvolted safely even with the stock cooler, just not overvolted to 1.2v, as we've been trying to tell you.  So there is nothing misleading on the box.


----------



## TheMailMan78 (Mar 27, 2011)

Kreij said:


> Good idea. What exactly does this card "bring to the table"?
> Anything other than Nvidia being able to say "we have one too"? (dual GPU)



It will bring the smell of burnt electronics to your room.



newtekie1 said:


> And it also says on the box that you can't get anywhere near 50% without extreme cooling.



Cooling would have made zero difference. Unless you put an ice pack on the fuse/resistor.


----------



## qu4k3r (Mar 27, 2011)

wahdangun said:


> http://img.techpowerup.org/110326/btajsjbc874d.jpg
> 
> is that the easter egg ???


no, it's alphabet soup.


----------



## W1zzard (Mar 27, 2011)

TheMailMan78 said:


> So in your opinion, would they have been better off with two 570's in one card rather than two 580's clocked low?



i would have designed the card with 3x 8 pin or 8+8+6, a VRM that can handle this load + 25% safety margin and two higher clocked GTX 580 GPUs for "in your face AMD" effect


----------



## Deleted member 24505 (Mar 27, 2011)

Now that card i would like to see you test W1zz.


----------



## TheMailMan78 (Mar 27, 2011)

W1zzard said:


> i would have designed the card with 3x 8 pin or 8+8+6, a VRM that can handle this load + 25% safety margin and two higher clocked GTX 580 GPUs for "in your face AMD" effect



I understand the higher VRM but the use of 3x 8pin connectors? At draw it didn't seem like it needed more power. Am I correct? Why another 8 pin?


----------



## W1zzard (Mar 27, 2011)

TheMailMan78 said:


> I understand the higher VRM but the use of 3x 8pin connectors? At draw it didn't seem like it needed more power. Am I correct? Why another 8 pin?



nvidia is limited in their TDP and power draw by the input power configuration.

so now they have 375 W and need to adjust their whole card to that limit, read my conclusion for a longer explanation


----------



## Kreij (Mar 27, 2011)

3x8-pin + PCI-E bus gives you a max power draw of 525 watts.
That's almost 1/3 of the max output from a standard (15A) US household circuit for the graphics card alone.
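That math checks out in a couple of lines, using the PCI-E spec limits of 75 W for the slot and 150 W per 8-pin connector, and assuming a 120 V / 15 A US circuit:

```python
# Max board power for a hypothetical 3x 8-pin card, per PCI-E spec limits.
PCIE_SLOT_W = 75      # PCI-E x16 slot delivers up to 75 W
EIGHT_PIN_W = 150     # each 8-pin PCI-E connector delivers up to 150 W

card_max_w = 3 * EIGHT_PIN_W + PCIE_SLOT_W
print(card_max_w)     # 525 W

circuit_w = 120 * 15  # standard US 15 A household circuit at 120 V
print(round(card_max_w / circuit_w, 2))  # 0.29 -- almost a third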

W1zz, remind me to update my house wiring and breakers if you release your own GC. 

Oh I also wanted to add that if you release a TPU GPU, the word fanboy will have to be redefined in the dictionary.


----------



## W1zzard (Mar 27, 2011)

Kreij said:


> 3x8-pin + PCI-E bus gives you a max power draw of 525 watts.
> That's almost 1/3 of the max output from a standard (15A) US household circuit for the graphics card alone.
> 
> W1zz, remind me to update my house wiring and breakers if you release your own GC.



if the TPU GPU is too powerful for you to handle, you can always go with the much weaker hd 6990


----------



## (FIH) The Don (Mar 27, 2011)

EPICFAIL


----------



## Kreij (Mar 27, 2011)

W1zzard said:


> if the TPU GPU is too powerful for you to handle, you can always go with the much weaker hd 6990



Rofl ... Nah, I'll update to 1200A service and put 480 lines in the computer room.
Is the TPU GPU going to be three-phase? lol


----------



## TheMailMan78 (Mar 27, 2011)

I would just have the power company run a 3 phase line to my house. Problem solved  More juice at a constant stream = COOLING WIN!

All joking aside if you hooked up with a GPU vendor to produce a TPU certified GPU I would buy one in a second. I want the TPU troll edition!


----------



## Kreij (Mar 27, 2011)

So W1zz, are you taking pre-orders for the TPU GPU yet?

Anyway, in a previous question I asked if the 590 actually brought anything new to the table.
Did it?


----------



## HalfAHertz (Mar 27, 2011)

Kreij said:


> So W1zz, are you taking pre-orders for the TPU GPU yet?
> 
> Anyway, in a previous question I asked if the 590 actually brought anything new to the table.
> Did it?



You can officially haz 3d surround sega mega drive vision with a dash of physx without the need of a 2nd card? EVGA and Galaxy's twin 460 cards were advertised earlier but are not sold anywhere yet..

You win a pci-e slot, nvidia wins your m00nies ergo 2win Boombastic edition.


----------



## Frick (Mar 27, 2011)

BTW, if a 7 is considered a very bad score, the scoring system is broken imo.


----------



## W1zzard (Mar 27, 2011)

Frick said:


> BTW, If a 7 is considered a very bad score the scoring system is broken imo.



i never said "very bad"


----------



## Frick (Mar 27, 2011)

W1zzard said:


> i never said "very bad"



How about Epic Fail then? I know it was a joke, but it implies that's how you see it.


----------



## newtekie1 (Mar 27, 2011)

TheMailMan78 said:


> Cooling would have made zero difference. Unless you put an ice pack on the fuse/resistor.



When you use extreme cooling, the entire card is cold, even with insulation.  And people using extreme cooling are killing the cards anyway.

But again, that doesn't change the fact that nowhere on the box does it say 1.2v is safe.  All it says is that you can raise the voltage and get higher clocks, which you can do safely with the stock cooling, and you can do to the extreme with extreme cooling.

So the argument "well it says you can do it on the box" doesn't fly, because it doesn't say you can pump 1.2v through the card on the box, and the BIOS limits the voltage to a very safe 1.06v*.

*Again, I'm assuming that ASUS' BIOS had that limit, since it's the limit nVidia has obviously set for the reference design.  If ASUS' BIOS didn't have this limit, then that is ASUS' fault and this particular card deserves the grief it gets for popping, but it still doesn't make the GTX590 design in general bad.



W1zzard said:


> i never said "very bad"



What I don't get is why give this card a 7 and the HD6990 a 9?  So the GTX590 doesn't bring anything really new; well, what did the HD6990 bring that was new?  So you can't overvolt the GTX590 that much; you couldn't overvolt the HD6990 at all.  The HD6990 overclocked worse than this card.  The HD6990 was louder than this card.  The only things worse about this card really are the higher power draw and the weaker display output configuration; is that worth dropping the score all the way down to a 7?


----------



## Harlequin_uk (Mar 27, 2011)

it would appear that nvidia actually emailed reviewers before the NDA was lifted about keeping voltage below 1.05v. was this before or after W1zzard blew this one up?


----------



## TheMailMan78 (Mar 27, 2011)

newtekie1 said:


> When you use extreme cooling, the entire card is cold, even with insulation.  And people using extreme cooling are killing the cards anyway.
> 
> But again, that doesn't change the fact that nowhere on the box does it say 1.2v is safe.  All it says is that you can raise the voltage and get higher clocks, which you can do safely with the stock cooling, and you can do to the extreme with extreme cooling.
> 
> ...



Maybe because the 6990 didn't detonate W1zz's test bench and place his whole office into a thermonuclear winter?

Anyway, like I said, you could have set the card volts at 6.5v and it shouldn't have mattered. The overdraw protection is flawed.


----------



## MxPhenom 216 (Mar 27, 2011)

Kreij said:


> So W1zz, are you taking pre-orders for the TPU GPU yet?
> 
> Anyway, in a previous question I asked if the 590 actually brought anything new to the table.
> Did it?



yeah, pretty good performance for being pretty quiet, it runs pretty cool and more efficient than the 6990, and it's within ~3% of 6990 performance


----------



## CDdude55 (Mar 27, 2011)

TheMailMan78 said:


> Maybe because the 6990 didn't detonate W1zz's test bench and place his whole office into a thermonuclear winter?



Overreacting..


----------



## newtekie1 (Mar 27, 2011)

TheMailMan78 said:


> Maybe because the 6990 didn't detonate W1zz's test bench and place his whole office into a thermonuclear winter?
> 
> Anyway, like I said, you could have set the card volts at 6.5v and it shouldn't have mattered. The overdraw protection is flawed.



No, the BIOS limit of 1.06v should have saved the card.  Once you start going past that, no enthusiast should expect protection on the card to save you, especially not protection that we know only works by downclocking the card.

Talk all you want, the only thing that was flawed about the card was the initial driver; beyond that the BIOS was fine with a 1.06v limit, and the overcurrent limit was fine as well, as long as the voltage limit set by nVidia is actually adhered to.


----------



## TheMailMan78 (Mar 27, 2011)

newtekie1 said:


> No, the BIOS limit of 1.06v should have saved the card.  Once you start going past that, no enthusiast should expect the protection on the card to save you, especially not protection that we know only works by downclocking the card.
> 
> Talk all you want, the only thing that was flawed about the card was the initial driver.  Beyond that, the BIOS was fine with a 1.06v limit, and the overcurrent limit was fine as well, as long as the voltage limit set by nVidia is actually adhered to.



If the overcurrent protection was fine it wouldn't have popped. I know you have problems accepting that, but the many reviews on the Internet support my view, whereas you sound like you're making excuses. Boo hoo. It's everyone else's fault but Nvidia's. Sometimes a duck is a duck.


----------



## newtekie1 (Mar 27, 2011)

TheMailMan78 said:


> If the overcurrent protection was fine it wouldn't have popped. I know you have problems accepting that, but the many reviews on the Internet support my view, whereas you sound like you're making excuses. Boo hoo. It's everyone else's fault but Nvidia's. Sometimes a duck is a duck.



Once you bypass nVidia's set limit, the overcurrent protection literally can't do its job, and you know the risk.  Doesn't make the card's design flawed in any way.


----------



## MxPhenom 216 (Mar 27, 2011)

newtekie1 said:


> Once you bypass nVidia's set limit, the overcurrent protection literally can't do its job, and you know the risk.  Doesn't make the card's design flawed in any way.



yeah, it seems he doesn't understand that. Going to 1.2v is easily past Nvidia's set 1.050v limit. If you're at 1.050v the card should be fine


----------



## Harlequin_uk (Mar 27, 2011)

What he is saying, though, is: if that limit is so crucial, why does the ASUS card have the ability to go past it? If it was set in stone, then the BIOS should not have any option to go to 1.2v.

So the limit in fact came into effect AFTER cards started blowing up. Sweclockers and XS already said as much, since they had the review done and dusted (and dead cards) before the email came in with the new limit anyway.


----------



## W1zzard (Mar 27, 2011)

Harlequin_uk said:


> it would appear that nvidia actually emailed before the nda was lifted about keeping voltage below 1.05v , was this before or after wizzard blew this one up?



after launch, after getting off the phone with the editors who blew up their cards


----------



## Harlequin_uk (Mar 27, 2011)

tweaktown and xs got the email the day before


----------



## erocker (Mar 27, 2011)

Conspiracy? Alien abduction, perhaps? We'll never know.  Or, apparently some didn't get the email.


----------



## Harlequin_uk (Mar 27, 2011)

erocker said:


> Conspiracy? Alien abduction, perhaps? We'll never know.  Or, *apparently some didn't get the email*.



More than likely - but Hardware Canucks are adding their 10 cents' worth:

http://www.hardwarecanucks.com/news/video/nvidia-responds-complaints-newly-released-gtx-590-frying/



> In addition, here is an extract from ASUS’ GTX 590 reviewer’s guide:
> 
> _It is not advised to exceed the 1.050 to 1.065 vcore range as this begins to meet the limits for the OCP/OVP mechanism on the card. Exceeding these values without disabling OCP/OVP or having superior cooling could affect the lifespan and functionality of the card/gpu._
> 
> The lesson of the day is this folks: ignoring manufacturer’s recommendations and overclocking your card way beyond its limitations (especially when  using beta drivers and beta software) can in fact lead to the unfortunate killing of hardware. Fancy that eh?



Implication being, of course, that *some people* ignored what was recommended........


Still, when the GTX 460 in SLI is near enough as quick as this `monster`, it begs the question - why should anyone actually buy it?


----------



## HalfAHertz (Mar 27, 2011)

erocker said:


> Conspiracy? Alien abduction, perhaps? We'll never know.  Or, apparently some didn't get the email.



WTF, you mean to tell me it was aliens that sabotaged the 590 and not North Korea???
Damn, guess I wasted my $ on that Korean attack survival gear


----------



## newtekie1 (Mar 27, 2011)

W1zzard said:


> after launch, after getting off the phone with the editors who blew up their cards



But what was the board maximum set to in the BIOS of your test card?


----------



## TheMailMan78 (Mar 27, 2011)

newtekie1 said:


> Once you bypass nVidia's set limit, the overcurrent protection literally can't do its job, and you know the risk.  Doesn't make the card's design flawed in any way.



Overcurrent protection is designed to do exactly that: to protect the card from excess power. If it can't do that past the set limit (which, by the way, is apparently pretty much the max the board can take anyway), then it's flawed. Plain and simple. You should be able to pump it with 3 volts and not fry the card. You have to understand W1zz didn't do anything magical. He just tried to do what the box stated: "up to 50%" more power. At no time should the card have blown up attempting to do what it's advertised to do.

Now you will say, "Oh, but it said power, not volts." Well, have you ever heard of a card producing anywhere near 50% more power without upping the volts? Volts aren't a major issue in OC. You know this. Heat is. The only thing volts can do, if the GPU is well cooled, is produce electromigration, which can cut the GPU's lifespan down....and even that's very debatable. But in this case the GPU was fine. It was the damn board that blew. Why? Because the overcurrent protection was inadequate. The damn thing should never have gone snap, crackle, pop. This is why the 6990 got a higher score, I am willing to bet. The 6990 is just a better designed card. Is it better in performance? Debatable. But it's built better, with its own limitations in mind.

Anyway, W1zz just stated the "limit" was given to him well after he blew up the card....along with other review sites. Gee, someone dropped the ball.....I wonder who.


----------



## Harlequin_uk (Mar 27, 2011)

TheMailMan78 said:


> Anyway W1zz just stated the "limit" was given to him well after he blew up the card....along with other review sites. Gee someone dropped the ball.....I wonder who



`Other review` sites got the email before the NDA was lifted, maybe after they blew the cards or not - but before the NDA was lifted.


----------



## newtekie1 (Mar 27, 2011)

TheMailMan78 said:


> Over current protection Is designed to do that exactly. To protect the card from excess power. If it can't do that past the set limit (which by the way is pretty much the max the board can take anyway apparently.) then its flawed. Plain and simple. You should be able to pump it with 3 volts and it not fry the card.



Except that is not what the protection is designed to do.  It is designed to stop the card from pulling excess power when run within spec.  It is not designed to protect people that want to ignore the other safety features put in place and bypass nVidia's set maximums.  The card and the power protection were not designed to handle 1.2v, period.



TheMailMan78 said:


> You have to understand W1zz didn't do anything magical. He just tried to do what the box stated "up to 50%" more power. In no time should have the card blown up attempting to do whats its advertised.



Yes, what W1z did was magical.  He, I assume since he hasn't answered what the limit in the BIOS on his cards was set to, bypassed the limit set by nVidia, made a 0.2v voltage jump in one go (what experienced overclocker does that?), and grossly exceeded the limit nVidia had set for the card.

I've asked you at least once to show me where on the box it says 50% more power.  It is not advertised anywhere on the box that the card can handle 1.2v or 50% more power.  Until you show me where on the box exactly it says that 1.2v is safe, or that the card can handle 50% more power, you need to stop making this argument.  Furthermore, it clearly does say on the box that extreme cooling is needed to get the up to 50% more speed, and W1z wasn't using extreme cooling.


----------



## TheMailMan78 (Mar 27, 2011)




----------



## qubit (Mar 27, 2011)

Ok people, never mind whether the card has an inherent design flaw; I'm wondering, given its high cost, whether it'll be worth repairing by the manufacturer or will it just be scrapped?


----------



## newtekie1 (Mar 27, 2011)

TheMailMan78 said:


> http://img.techpowerup.org/110327/package1.jpg



I don't see anywhere on there where it says 1.2v is safe, or up to 50% more power.  Care to show me exactly where those two things are shown?

Perhaps you are confusing the meaning of "faster" with the meaning of "more power"?  You know those aren't the same thing, right?  Apparently not...


----------



## TheMailMan78 (Mar 27, 2011)

newtekie1 said:


> I don't see anywhere on there where it says 1.2v is safe, or up to 50% more power.  Care to show me exactly where those to things are shown?



Ok, 50% more speed and with voltage tweak. So I guess you expected to get 50% more speed with pixie dust? Cool.


----------



## newtekie1 (Mar 27, 2011)

TheMailMan78 said:


> Ok 50% more speed and with voltage tweak. So I guess you expected to get 50% more speed with pixie dust? Cool.



No, I expect to get it with extreme cooling, like the box says.  But I guess, since you are so into what the box says, you felt it ok to ignore where the box says extreme cooling is necessary...wait that doesn't make sense...


----------



## erocker (Mar 27, 2011)

Is it even possible to "extreme cool" the part that blew up or the part that caused the part to blow up?


----------



## TheMailMan78 (Mar 27, 2011)

newtekie1 said:


> No, I expect to get it with extreme cooling, like the box says.  But I guess, since you are so into what the box says, you felt it ok to ignore where the box says extreme cooling is necessary...wait that doesn't make sense...



And do you think extreme cooling would have made any difference in this case, considering the GPU didn't melt down?


----------



## newtekie1 (Mar 27, 2011)

erocker said:


> Is it even possible to "extreme cool" the part that blew up or the part that caused the part to blow up?



Extreme cool mosfets?  Yeah, but I'm not sure it will help with the popping, though adding better cooling to mosfets definitely will help them handle more current.  Extreme cooling the GPU, meanwhile, will help achieve the "up to" 50% higher GPU clocks.


----------



## TheMailMan78 (Mar 27, 2011)

newtekie1 said:


> Extreme cool mosfets?  Yeah, but I'm not sure it will help with popping.  However, extreme cooling the GPU will help achieve the "up to" 50% higher GPU clocks.



Not without a massive voltage jump.


----------



## CDdude55 (Mar 27, 2011)

erocker said:


> Is it even possible to "extreme cool" the part that blew up or the part that caused the part to blow up?



True.

I thought he pumped too much voltage into the GPUs themselves, which then ended up frying it, and in that case ''extreme cooling'' should easily solve the problem.


----------



## DaedalusHelios (Mar 27, 2011)

TheMailMan78 said:


>



Its marketing, *up to 90%* of marketing from all companies including AMD and Nvidia is bullshit or propaganda.


----------



## newtekie1 (Mar 27, 2011)

TheMailMan78 said:


> And do you think extreme cooling would have made any difference in this case, considering the GPU didn't melt down?



Extreme cooling doesn't mean just the GPU.  Cooling the VRMs better might save them.  Better cooling certainly will allow a mosfet to handle more current without frying.



TheMailMan78 said:


> Not without a massive voltage jump.



And with extreme cooling a massive voltage bump will be possible.  Of course with extreme cooling, you don't exactly expect the card to last for more than half an hour or so anyway.  Unless you are running cards on liquid nitrogen 24/7?


----------



## TheMailMan78 (Mar 27, 2011)

CDdude55 said:


> True.
> 
> I thought he pumped too much voltage into the GPUs themselves, which then ended up frying it, and in that case ''extreme cooling'' should easily solve the problem.


But that's not what blew. What blew was on the very edge of the card.



DaedalusHelios said:


> Its marketing, *up to 90%* of marketing from all companies including AMD and Nvidia is bullshit or propaganda.


That's irrelevant. They still have to provide proof when making such a claim or face false advertising charges. Even if the circumstances are wacky, like they tested it on the surface of Pluto, they still have to provide proof if questioned.



newtekie1 said:


> Extreme cooling doesn't mean just the GPU.  Cooling the VRMs better might save them.  Better cooling certainly will allow a mosfet to handle more current without frying.
> 
> And with extreme cooling a massive voltage bump will be possible.  Of course with extreme cooling, you don't exactly expect the card to last for more than half an hour or so anyway.  Unless you are running cards on liquid nitrogen 24/7?


So I guess you need to submerge the entire card in LN2?






According to even Nvidia, "This should not have happened":



> According to NVIDIA this should not happen. In their official reviewer driver (which I used), the NVIDIA Power limit is designed to be active for all applications, not only Furmark.



So yeah, something failed here. It failed in other reviews. Again, this is not an isolated incident, and the card sucks in its current revision; blown cards across a few sites prove this. Now maybe when the new one W1zz talked about hits the streets you might have something worth buying.


----------



## Frick (Mar 27, 2011)

So a driver failure means the card itself is badly designed?


----------



## qubit (Mar 27, 2011)

Frick said:


> So a driver failure means the card itself is badly designed?



Ya know, the more I see people make their points about this card and what it should tolerate, the less sure I am whether or not it has a hardware design flaw.

Both newtekie1 and mailman78 are making good arguments, so it's hard to say.

It seems to me that it's a combination of hardware and software factors that come together to make this go pop. Also, it seems that ASUS's claim on the box is in error: nVidia themselves say not to push this much voltage at it (see my earlier post), and it's rock solid at its stock voltage. Also, it could possibly do with more phases on the regulator, given how much power those GPUs pull, further muddying the water.

And I tell ya what, my shiny new GTX 580 has the same power circuitry as the 590 (just x1, obviously), so I ain't never gonna increase the volts on my baby!


----------



## CDdude55 (Mar 28, 2011)

I am curious as to what really caused the problem. You can't just say ''the card is badly designed'', because something had to trigger the card not to protect itself from heavy voltage like it's supposed to. I have heard that some of the drivers being used cut off the protection, so overvolting causes it to screw up; if that's the case, it's just a matter of swapping drivers.

Or is the power regulation circuitry just not strong enough to handle that amount of voltage?.. Or maybe it's cooling; maybe once you slap an aftermarket cooler or waterblock on there, everything is smooth sailing when it comes to heavy overclocks/overvolting (which Nvidia has said recently themselves).


----------



## cowie (Mar 28, 2011)

qubit said:


> Ya know, the more I see people make their points about this card and what it should tolerate, the less sure I am whether or not it has a hardware design flaw.
> 
> Both newtekie1 and mailman78 are making good arguments, so it's hard to say.
> 
> ...



No, you have one more power phase: the 590 has 5 per core, the 570 has 4, so the 590 sits in between the 570 and the 580 (6).

I have found that the 6990 had similar issues with some reviewers.... looking for dead 590 cards, of all things, but they were too scared to mention it in reviews.
They have good PWM, so let's face it, the dual cards from both sides are trash.

http://forums.overclockersclub.com/index.php?showtopic=183386&st=36
BOSCO's posts
page 4-6
Ya, the 6990's are not faring much better. I know of 4 cards that are dead so far, with 2 more having issues. One of ours died as well..... shakes head

http://www.neoseeker.com/Articles/Hardware/Reviews/AMD_HD_6990_Antilles/18.html


So how did the overvolting go with the 6990, W1zz - any issues?
I don't see any voltage tweaks in that ASUS 6990 review; maybe they weren't available at the time? Any plans for a follow-up review with 20%+ voltage??


----------



## qubit (Mar 28, 2011)

cowie said:


> No, you have one more power phase: the 590 has 5 per core, the 570 has 4, so the 590 sits in between the 570 and the 580 (6).



So, that raises the question of why they chose to use fewer phases for a card that inherently consumes more, doesn't it? Are the phases more potent than on the 580, perhaps, so they can use fewer of them? Did they cheap out? Was space too tight on the circuit board? (unlikely). I think it's a good question.

At least I can rest a bit more easily about my card, hehe.


----------



## damric (Mar 28, 2011)

wahdangun said:


> http://img.techpowerup.org/110326/btajsjbc874d.jpg
> 
> is that the easter egg ???



That shit is hilarious.


----------



## DaedalusHelios (Mar 28, 2011)

TheMailMan78 said:


> Thats irrelevant. They still have to provide proof making such a claim or face false advertisement charges. Even if the circumstanse are wacky like they tested it on the surface of Pluto. They still have to provide proof if questioned.



You think Kingpin or the like won't be doing just that with nicely binned chips? They always do, but I know I couldn't get their results even if I tried. Kind of like the theoretical bandwidth of things like SATA and USB 2.0 that nobody actually achieves, yet it is advertised like that is what you can get. The PC market is used to wild claims over the years without proof. Remember all those DX10 and DX10.1 compatible cards that are super low end and cannot handle the DX10 games, yet are marketed for "extreme DX10 gaming"? Do they even meet the minimum system requirements? If marketing was honest we wouldn't need reviewers, for the most part.


----------



## newtekie1 (Mar 28, 2011)

TheMailMan78 said:


> So I guess you need to submerge the entire card in LN?
> 
> According to even Nvidia, "This should not have happened"
> 
> ...



That is a fuse that blew when the mosfet failed. Your point?

And again, the power limiter won't work at 1.2v, and nVidia says you should never go that high.  What part of that don't you understand?

I haven't seen a single other review that blew the card at or below the nVidia-set maximum of 1.06v, have you?  The exception being Sweclockers, which blew their card because they were using drivers that didn't have the power limiter in place.

What is interesting is that I just read another review that used SmartDoctor, and went a little bit more in depth about SmartDoctor with this card.  It seems that SmartDoctor actually does bypass the BIOS limit on the card, allowing voltages up to 1.213v.  In this case, it seems ASUS' software is to blame for bypassing that limit without even a warning to the user.  Also, the same review actually contacted nVidia _before_ just jacking up the voltages and clocks, and here is what nVidia told them:



> It is not advised to generally exceed the 1.050 to 1.065 vcore range as *this begins to meet the limits for the OCP/OVP mechanism* on the card. Exceeding these values without disabling OCP/OVP or having superior cooling could affect the lifespan and functionality of the card/gpu.



I don't really feel like this is going anywhere anymore, so I'm done discussing it.


----------



## sLowEnd (Mar 28, 2011)

TheMailMan78 said:


> http://img.techpowerup.org/110327/package1.jpg



FYI, that is a generic logo they use for all their recent cards that support overvolting.

...etc

If you read the fine print, it also mentions that it's "*up to* 50%"


----------



## thedude74 (Mar 28, 2011)

CDdude55 said:


> I am curious as to what really caused the problem. You can't just say ''the card is badly designed'', because something had to trigger the card not to protect itself from heavy voltage like it's supposed to. I have heard that some of the drivers being used cut off the protection, so overvolting causes it to screw up; if that's the case, it's just a matter of swapping drivers.
> 
> Or is the power regulation circuitry just not strong enough to handle that amount of voltage?.. Or maybe it's cooling; maybe once you slap an aftermarket cooler or waterblock on there, everything is smooth sailing when it comes to heavy overclocks/overvolting (which Nvidia has said recently themselves).



Nvidia said this:



> The few press reports on GTX 590 boards dying were *caused by overvoltaging to unsafe levels* (as high as 1.2V vs. default voltage of 0.91 to 0.96V), *and using older drivers that have lower levels of overcurrent protection*.



Which basically says it all. 

Additionally, anyone still clinging to the idea that the protection should have kicked in at these ridiculous voltages should also read this.



> Please note that overcurrent protection *does not eliminate* the risks of overclocking, and *hardware damage is possible*, particularly *when overvoltaging*.



Which basically means: run the card in unintended ways and you can end up with unintended consequences. 


Now, when a rash of 590s burns up at stock voltages, or even at overclocked voltages that are within design parameters, I'll start agreeing with the people calling this card junk; until then they don't have a leg to stand on.

*The 590 in the review was clocked at 612MHz stock with a voltage of 0.938v. At 1.0v it hit 815MHz on stock cooling! That's a 33% overclock and is faster than a 580! What a horrible card!* 

Finally, where is the similar outrage over the 6990, which as far as I can tell is the ONLY one of these two cards that is failing under proper use? Why no mention of these cards dying, and why is it a big secret? Why no crazy overvolt tests to see how far a 6990 could be pushed? Seems like a double standard to me.
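The overclock arithmetic in that post checks out; a quick back-of-the-envelope script (purely illustrative, using the clocks quoted above):

```python
# Sanity check of the overclock figures quoted in the post above.
stock_mhz = 612   # GTX 590 stock core clock from the review
oc_mhz = 815      # clock reached at 1.0v on stock cooling

gain = oc_mhz / stock_mhz - 1  # fractional overclock
print(f"Overclock gain: {gain:.0%}")  # ~33%, as the post says
```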


----------



## gaximodo (Mar 28, 2011)

The card is quieter, smaller, and as strong as the HD 6990 for most of us (1920x1200/1080), while providing more than twice the overclocking potential at stock voltages, at the same price.

wrong thread.. OOOPS


----------



## cowie (Mar 28, 2011)

thedude74 said:


> Nvidia said this:
> 
> 
> 
> ...



They can't be honest about any AMD card shortcomings... AMD will not send them any more review cards.
We should let karma run its course; it will come back to bite them in the ass somehow, someway.
Maybe in a month or so, when/if these 6990s start piling up in droves, the unwarned public will wake up on the AMD card reviews and be more of a skeptic, or just stop reading them altogether.
Remember, only two brands will not void the warranty if the BIOS switch is moved!!! That alone is enough for me to only recommend two brands of that card.


----------



## Harlequin_uk (Mar 28, 2011)

http://www.nvidia.com/object/win7-winvista-64bit-267.91-driver.html

new driver out - guessing it's fixed now


----------



## the54thvoid (Mar 28, 2011)

*When a Tweak isn't a Tweak.*

The actual word 'tweak' means to 'fine-tune'.  A 25% adjustment in voltage isn't a tweak.  So if you want to go on semantics, what some reviewers have done is a voltage 'boost'.
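To put rough numbers on that point: dynamic power in CMOS logic scales approximately with frequency times voltage squared, so a 25% voltage "tweak" is anything but fine-tuning. A minimal sketch, assuming that standard first-order scaling rule and the voltages discussed in this thread (~0.96v stock, 1.2v on the fried cards):

```python
# First-order CMOS dynamic power approximation: P ~ f * V^2.
def relative_power(v_new, v_old, f_ratio=1.0):
    """Power draw relative to baseline, for a voltage change at a given clock ratio."""
    return f_ratio * (v_new / v_old) ** 2

voltage_gain = 1.2 / 0.96 - 1               # 25% more voltage
power_gain = relative_power(1.2, 0.96) - 1  # ~56% more power at the same clock
print(f"+{voltage_gain:.0%} voltage -> +{power_gain:.0%} power (same clock)")
```

Even before raising the clock at all, the card would be asked to dissipate roughly half again its rated power.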

The Hardware Canucks article also has this paragraph:



> In addition, here is an extract from *ASUS’ GTX 590 reviewer’s guide*:
> 
> It is not advised to exceed the 1.050 to 1.065 vcore range as this begins to meet the limits for the OCP/OVP mechanism on the card. Exceeding these values without disabling OCP/OVP or having superior cooling could affect the lifespan and functionality of the card/gpu.



I have read through the posts, so I don't think this has been posted already, but it should serve as an indicator that if cards fried, it's because even our beloved W1zz wanted to push it (unless that's what the e-mails were all about, but my reading was that it was an NV email, not the actual ASUS reviewer's guide?).

Other review sites have pushed the clock up and didn't bother overvolting.  Reviewing the reviews, most that OC'd got 690 core, even without voltage changes.

The argument shouldn't be about fried cards.  The cards fried due to:
a) wrong drivers,
b) in the ASUS case, reviewers not following guidelines,
c) MOST IMPORTANTLY, Nvidia failing to tell the reviewers exactly what drivers to use, or not having them on time.

The card doesn't suck.  In almost all reviews it gets praise for what it delivers: performance without the associated drawbacks of noise and size.  The fact some folk killed it is NV's fault for poor marketing and technical advice.  

Dare I say it, W1zzard's review is one of the most scathing.  Most places give it 8 or 9 out of 10.

If the new driver doesn't allow the same cock-ups (and no more cards fry), then this whole argument is irrelevant.


----------



## Mussels (Mar 28, 2011)

Reviewers not following guidelines? Those guidelines were given out AFTER the reviewers ended up with dead cards and informed everyone of the problem.

the rest of your post i agree with.


----------



## TheMailMan78 (Mar 28, 2011)

Mussels said:


> reviewers not following guidelines? those guidelines were given out AFTER the reviewers ended up with dead cards, and informed everyone of the problem.
> 
> the rest of your post i agree with.



It's like the reviewers were the beta testers. Sad day for Nvidia.


----------



## Harlequin_uk (Mar 28, 2011)

Well, so far there's an increasing number of dead 6990s as well.


Too much from both `teams`.


----------



## cadaveca (Mar 28, 2011)

Harlequin_uk said:


> well so far theres an increasing amount of dead 6990`s as well.
> 
> 
> too much from both `teams`.



Um, links?


----------



## Harlequin_uk (Mar 28, 2011)

cadaveca said:


> Um, links?




scroll up?


http://www.techpowerup.com/forums/showpost.php?p=2238253&postcount=479


----------



## CDdude55 (Mar 28, 2011)

Harlequin_uk said:


> well so far theres an increasing amount of dead 6990`s as well.
> 
> 
> too much from both `teams`.



I have not seen any dead 6990's from any reviews, so that's a lie unless I see some links. The only issues I've seen with the 6990 are that it's loud as fuck and super expensive. (And performance is very similar to the GTX 590.)




			
				TheMailMan78 said:
			
		

> Its like the reviewers were the beta testers. Sad day for Nvidia.



The reviewers are always the ''beta testers'', and of course, as thoroughly explained before, I find it hard to blame Nvidia for such an uncommon issue. I guarantee 98% of people buying this card won't have the same problem, as there are different standards for reviews and consumers. Those that want to overvolt know the risks on such a card, but it's far from a mainstream problem.


----------



## cadaveca (Mar 28, 2011)

Harlequin_uk said:


> scroll up?
> 
> 
> http://www.techpowerup.com/forums/showpost.php?p=2238253&postcount=479





CDdude55 said:


> I have not seen any dead 6990's from any reviews.




Neoseeker, apparently.



> After our initial testing of the HD 6990 we moved the graphics card over to a backup system that we were using to test new games for our benchmarking suite. We were able to complete testing with the HD 6990 in some of our new benchmarks including H.A.W.X 2, Lost Planet 2 and DiRT 2, however, when we were testing the performance of Dragon Age II the HD 6990 died on us. At the time of it's demise the card was set at the stock 830MHz setting and the BIOS switch was in the default position. The fact that it died could have been that we tested the graphics card at both the 375W and 450W settings, but since the review we have left the settings at default level.


----------



## wahdangun (Mar 28, 2011)

the54thvoid said:


> The actual word 'tweak' means to 'fine tune'.  A 25% adjustment in voltage isn't a tweak.  So if you want to go on semantics, what some reviewers have done is voltage 'boost'.
> 
> The Hardware Canucks article also has this paragraph:
> 
> ...



Yes it is; some cards died at default clock and voltage:




> ^^ This , i know already of 5 dead cards from my friends that died at stock voltages stock clock running heaven and Vantage benchmark , add 2 more testing on LN2 and 2 more on H2O = 9 cards already , how many dead 590s have not been reported ? this card cannot even handle 100% full load on GPU for long periods of time , yes it will blow up in smoke even at stock voltages and stock clocks. Nvidia should make a recall on this cards me thinks :/ , so many cards 590s have died and since has not been reported it looks thats just a few cards , the truth is , its going to happen soon or later , its like a time bomb , this card its a total failure and im sad i had high expectations on this card. Ill wait for the next 28mm cards to be release , as of right now 590 or 6990 are not apealing to me at all



linky

chipsy is a member of HWbot; if you don't trust me, just go to the guru3d forum and ask everyone about chipsy.


----------



## EarthDog (Mar 28, 2011)

Question for you, W1zzard... and I didn't check this thread for this answer...

Did you intentionally put EPIC FAIL in the 'thumbs down' section of your review summary? (first letter of each sentence)


----------



## the54thvoid (Mar 28, 2011)

Mussels said:


> reviewers not following guidelines? those guidelines were given out AFTER the reviewers ended up with dead cards, and informed everyone of the problem.
> 
> the rest of your post i agree with.





TheMailMan78 said:


> Its like the reviewers were the beta testers. Sad day for Nvidia.



Yeah, that's kinda my point - NV do know how to massively cock things up.  How much better would the PR be if they'd waited a few more days for the appropriate guidance/testing or drivers?


----------



## TheMailMan78 (Mar 28, 2011)

wahdangun said:


> yes it is some card dead with default clock and voltage
> 
> 
> 
> ...



Most of these Nvidia users don't even want to believe W1zz. Do you think they will believe this guy?

Moderators---> <--- Mailman.


----------



## Harlequin_uk (Mar 28, 2011)

And you are still in denial that 6 6990s in reviewers' hands have either failed or are failing....


----------



## cadaveca (Mar 28, 2011)

Harlequin_uk said:


> and you are still in denial that 6x 6990`s in reviewers hands have either failed or are failing....



I don't think so. Deaths on one side do not justify the cards failing from the other camp. It shouldn't happen at all, as far as I am concerned, and AMD's cards dying has NOTHING to do with the GTX 590 deaths, other than that these GPUs are made by the same company, on the same process. It's the PCBs that have failed, not the GPUs.


----------



## Harlequin_uk (Mar 28, 2011)

This is the first `real` foray past the PCI-SIG limits, and it's obvious those limits are there for a good reason - whilst the GPUs themselves might be up for it, they are failing at the weakest point.


And any cards failing, from either camp, is not good at all.


----------



## cadaveca (Mar 28, 2011)

What I don't understand is why, when these cards are really only wired to provide 375w(slot plus 2x8-pin), people expect them to exceed that power draw. nVidia states 365w for GTX590, so to expect much overclock, if any, seems a bit silly.
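That 375W figure is just the sum of the in-spec limits of the slot and the two auxiliary connectors. A small sketch of the budget (connector limits per the PCI-SIG specs; the helper function is purely for illustration):

```python
# PCI Express power-delivery limits (per the PCI-SIG electromechanical specs).
PCIE_SLOT_W = 75    # x16 slot
SIX_PIN_W = 75      # per 6-pin auxiliary connector
EIGHT_PIN_W = 150   # per 8-pin auxiliary connector

def in_spec_budget(eight_pin=0, six_pin=0):
    """Maximum in-spec board power for a given auxiliary-connector layout."""
    return PCIE_SLOT_W + eight_pin * EIGHT_PIN_W + six_pin * SIX_PIN_W

print(in_spec_budget(eight_pin=2))  # 375 W: the GTX 590 / HD 6990 layout
```

Anything past that total has to come from somewhere the connectors aren't rated to supply.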

However, it is very concerning that these cards can even go past that, nevermind things like OCP and OVP, in my opinion, should be hardware-based, and not software, nor should it be able to be affected by drivers, and in that alone, I consider these cards a failure.

I appreciate the reviewers killing these cards though...that leaves clear lines drawn in the sand as to what people can expect, no matter how far over specification these cards were run. It's been a long time since we've seen VRM failures, yet DR-MOS tech on VGAs is relatively new...and quite cost-effective. I do not beleive cost-effective solutions should be used on "flagship" cards, either, so I'm left looking at nVidia for an explanation.
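The 375 W budget described above works out directly from the PCI-SIG connector ratings. A minimal sketch of the arithmetic (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin - the connector layout passed in is whatever the card actually carries):

```python
# PCI-SIG auxiliary power ratings (watts)
SLOT_W = 75
CONNECTOR_W = {"6pin": 75, "8pin": 150}

def board_power_limit(connectors):
    """Spec power limit for a card with the given aux connectors."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

# GTX 590 / HD 6990: slot plus two 8-pin plugs
print(board_power_limit(["8pin", "8pin"]))  # 375
```

NVIDIA's stated 365 W board power sits only 10 W under that ceiling, which is why there is essentially no headroom left for overclocking within spec.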


----------



## Harlequin_uk (Mar 28, 2011)

i know W1zzard said something about another power plug - but that's way outside PCI-SIG spec, since the spec only allows for two power plugs.

http://www.pcisig.com/specifications/pciexpress/specifications/

either AMD and NV should make the cards compliant or not - pretending to do it isn't working.


----------



## EarthDog (Mar 28, 2011)

What about the molex adapters on some boards (most notably any quad-SLI boards)? I believe that is what he was talking about...

the slot can deliver up to 150 W, I believe, but sits only at 75 W...?


----------



## cowie (Mar 28, 2011)

cadaveca said:


> What I don't understand is why, when these cards are really only wired to provide 375 W (slot plus 2x 8-pin), people expect them to exceed that power draw. NVIDIA states 365 W for the GTX 590, so to expect much overclock, if any, seems a bit silly.
> 
> However, it is very concerning that these cards can even go past that. Never mind that OCP and OVP, in my opinion, should be hardware-based, not software, and should not be affected by drivers; in that alone, I consider these cards a failure.
> 
> *I appreciate the reviewers killing these cards, though... that leaves clear lines drawn in the sand as to what people can expect, no matter how far over specification these cards were run. It's been a long time since we've seen VRM failures, yet DR-MOS tech on VGAs is relatively new... and quite cost-effective. I do not believe cost-effective solutions should be used on "flagship" cards, either, so I'm left looking at NVIDIA for an explanation*.



i like it better when they are honest and outright: not highlighting issues with certain cards while covering up others.

the same sites that killed the 590, some with over 20% added voltage, added 0% (besides the switch) to the 6990 in their reviews. that's not odd?

Then we have to read a comparison of the two cards from a reviewer and expect him to be non-biased?
w/e, i think it's best to know the facts before someone (Joe Public) goes off and says the power circuits are too weak for stock clocks/overclocking @ 1.06 V, as a lot are doing without having a clue..


----------



## Sin (Mar 28, 2011)

Harlequin, check this out (an online Google translation should be good enough): 
http://lab501.ro/placi-video/nvidia-geforce-gtx-590-studiu-de-overclocking

and then this:
http://lab501.ro/placi-video/asus-radeon-hd-6990-studiu-de-overclocking

making it ~400 W compliant (PCIe + 2x 8-pin) does not translate into making it utter sh*t.


----------



## cadaveca (Mar 28, 2011)

cowie said:


> the same sites that killed the 590, some with over 20% added voltage, added 0% (besides the switch) to the 6990 in their reviews. that's not odd?


LoL. The way i see it, the GTX 590 came with voltage-boost guidelines for review, so reviewers were expected to increase volts. Meanwhile, maybe the 6990 came with a recommendation to not adjust volts?

I mean, you can insinuate all you want; it doesn't change the fact that very often reviewers are given specific requests as to what they cover in a review. Many sites covered the GTX 590, and most adjusted voltage. Many sites covered the 6990, and very few adjusted voltage.

So, no, to me, it is not odd, at all.


----------



## cowie (Mar 28, 2011)

Well, why did they bother to adjust voltage at all?
you did not have to; it's also a better gauge of what most people will get in overclocking, rather than an unobtainable result for an end user.

now if, say, a reviewer did get a crazy clock out of the card, would he have said how much voltage he used? or just the max 1.06 (in the two BIOSes i've read)? or just the headline "1000 on just stock volts"?

i do agree this is not the time and place for the lack of reporting on the 6990 cards dying; it's just hard for me to look at certain card reviews anymore.
The reek of bias has no room on an honest review site.

where did you hear that BS that they had to use voltage with the 590?? because that's what it is.


----------



## cadaveca (Mar 28, 2011)

I didn't say they HAD to use voltage. I did not review this card, so I am not privy to any info given to reviewers. I am just jumping to conclusions, as you are.

However, OC testing is standard. *The 6990 features the AUSUM OC BIOS; no need to adjust volts, IMHO, as the PCIe spec is already broken by the AUSUM BIOS.*

The GTX 590 offers voltage adjustment, and specifically, ASUS cards have the "Voltage Tweak" stuff. Seems pretty basic to me. This feature was explored, and that testing led to the card's death. End of story.

You seem to imply that W1zz is biased in his reviews, while personally, i think he is critical of all things and has no brand loyalty.

Look at it this way... both cards are high-level GPUs that are "downclocked" to meet PCIe spec. The 6990 reaches the spec of 2x 6970, no problem, via the AUSUM BIOS. The GTX 590 is the same; however, it does not always seem to meet the spec of 2x GTX 580, and does not have a dual BIOS. Testing to each spec - the spec of whatever cards were squished into one - required no adjustment other than flipping the BIOS on the 6990, but on the GTX 590, voltage increases were needed... I don't see any issues with how that was done, or why the 6990 didn't get the same voltage boost.


----------



## W1zzard (Mar 28, 2011)

if you think i'm biased, then ask yourself why i dont do 3 tests, say "great card" and be done with my testing so nvidia is happy, users are happy. problem solved.


----------



## cowie (Mar 28, 2011)

no, i love W1zz; i was talking in general.

anyways, I wanted to see how fast i could kill one of these POSs with stock-volt overclocks lol
i can't even find one?


----------



## cowie (Mar 28, 2011)

no W1zz, read above
asmof i like your reviews the best; you run '03, '05 and '06


----------



## newtekie1 (Mar 28, 2011)

cadaveca said:


> GTX590 offers voltage adjustment, and specifically, ASUS card have "Voltage Tweak" stuff. Seems pretty basic to me. This feature was explored, and that testing lead to card death. End story.



Agreed, the particular problem actually seems to be with the SmartDoctor software, not the card.  It seems ASUS made a rather costly mistake in allowing their software to ignore the limit set in the BIOS, and that is what caused the death of this card.

Now, this is a review of the ASUS card, so scoring it badly because the ASUS software provided with it allowed the voltage to be adjusted way beyond limits seems fair to me.  However, I would have liked to see a little more differentiation regarding the actual reference card.  Perhaps even a separate review write-up on the card, minus the shiny ASUS SmartDoctor software, as a general GTX 590 review.


----------



## cowie (Mar 28, 2011)

well no, i thought that too, that it was only ASUS, but... i have found one reviewer (tbreak) that killed it with Afterburner. yes, he did go 1.21 V on the wrongly-sent driver; it was a Zotac card.
i did look at a retail Zotac card's BIOS; it was limited to 1.06 V.
won't put up a link; that's a kinda tacky thing to do in another site's 590 review.
some c/p action of what he said.
this guy manned up about it,
sorta what i did with my made-for-OC'ing 460 Hawk. he got "noobcaked greedy":
1.21 V set
_That’s a good amount of scaling with a 77MHz overclock. Now, given that the GTX 590 has the ability for voltage tweaking, I got really brave and decided to over-voltage the card as well as the speeds and see how far I could go. Using MSI Afterburner I increased the voltage by 75mV and bumped up the Core speed to 753MHz, which resulted in an incredible score of 9816 in 3DMark 11 (P). Hoping to hit the 10k mark, I decided to bump the voltage to 125mV with a Core speed of 804MHz. As 3DMark 11 was coming to a close, the whole system shut down and I could see smoke coming out of the power cable connectors.
As the smell of burnt plastic and metal faded, I contacted Nvidia to try and see what happened. The GTX 590 should have been able to either handle the voltage increase, or simply fail with a BSOD or a system crash. I certainly didn’t expect it to die on me. Turns out the drivers provided by Zotac were 267.52, and the latest drivers from Nvidia, 267.71, have some fail-safe measures to protect the GTX 590 against over-voltage. So remember folks, a simple driver update can be the deciding factor between the life and death of a graphics card when you’re pushing it to the limits. Always update your drivers._

although he might have had an abx for use.
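As a rough illustration of why the +125 mV run quoted above is so dangerous: dynamic power scales roughly with frequency times voltage squared (P ≈ f·V²). This is only a back-of-the-envelope sketch, not NVIDIA's actual power model; the 365 W TDP comes from earlier in the thread, 607 MHz is the reference core clock, and the ~0.94 V stock voltage is an assumed illustrative figure (it varies per card), with leakage ignored entirely:

```python
def scaled_power(p0, f0, v0, f1, v1):
    """Rough dynamic-power estimate: P scales with f * V^2 (leakage ignored)."""
    return p0 * (f1 / f0) * (v1 / v0) ** 2

# Assumed stock figures, for illustration only: 365 W, 607 MHz, ~0.94 V
P0, F0, V0 = 365.0, 607.0, 0.94

# The tbreak run: 804 MHz core at +125 mV
est = scaled_power(P0, F0, V0, 804.0, V0 + 0.125)
print(round(est))  # lands far past the 375 W connector budget
```

Even with these rough inputs, the estimate comes out well over one and a half times the spec power limit, which is consistent with connectors and VRMs giving up before the GPU does.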


----------



## Steevo (Mar 29, 2011)

I run my 5870 at 1.35 volts, water-cooled core, and yes, a cooler for the phases. But even with just paltry aluminum sinks on the phases it would run that voltage OK.


Nvidia made a card, then killed it, by providing an enthusiast card to enthusiasts and not the standard epeen nvidiot. They did the equivalent of saying: here, buy our twin-turbo car that can go amazingly fast, that we built for racing.


Oh, and if you ever race it, it will blow up, and your warranty will be void.


----------



## DaedalusHelios (Mar 29, 2011)

Steevo said:


> I run my 5870 at 1.35 volts, water-cooled core, and yes, a cooler for the phases. But even with just paltry aluminum sinks on the phases it would run that voltage OK.
> 
> 
> Nvidia made a card, then killed it, by providing an enthusiast card to enthusiasts and not the standard epeen nvidiot. They did the equivalent of saying: here, buy our twin-turbo car that can go amazingly fast, that we built for racing.
> ...



Supercars are notorious for having parts break when driven hard, so I think you need a different analogy. 

I think both camps have been rushing their products to market equally. Nvidia has just shipped what should be considered more of a beta as the final product. It was a bad move, but unless they feel it financially they couldn't care less, TBH. The same goes for AMD and any company that answers to stockholders rather than the end user. I think they need new upper management at Nvidia, and perhaps then they will be more competitive on the high end without compromising overclockability.


----------



## wahdangun (Mar 29, 2011)

cadaveca said:


> LoL. The way i see it, the GTX 590 came with voltage-boost guidelines for review, so reviewers were expected to increase volts. Meanwhile, maybe the 6990 came with a recommendation to not adjust volts?
> 
> I mean, you can insinuate all you want; it doesn't change the fact that very often reviewers are given specific requests as to what they cover in a review. Many sites covered the GTX 590, and most adjusted voltage. Many sites covered the 6990, and very few adjusted voltage.
> 
> So, no, to me, it is not odd, at all.



no, the reason W1zz didn't adjust the HD 6990's voltage was that there was no program that supported the new voltage controller on the HD 6990.


----------



## cadaveca (Mar 29, 2011)

wahdangun said:


> no, the reason W1zz didn't adjust the HD 6990's voltage was that there was no program that supported the new voltage controller on the HD 6990.



Pretty sure he could have whipped one up if needed.


Oh wait...it works now, doesn't it.


----------



## gaximodo (Mar 29, 2011)

why would ppl compare apples to bananas? different hardware takes different voltages. AMD's CPUs carry much higher voltage configurations than Intel's and overclock worse.
Same story here: generally, NV's high end overclocks waaay better than AMD's, overvolted or not.
AMD pushed their GPU/CPU clocks way too high (concluded from their smaller overclocking headroom), which will result in a higher failure rate than their competitors' (theoretically). 
Not to mention those HD 6990s died without any tweaks, while 590s only died on excessive overclocks (and faulty drivers; with the new drivers, the card's OCP works just fine, but NV shouldn't have let this problem happen in the first place).
My 4870X2 died within, like, what, 2 weeks, and it came with an Arctic Xtreme-something cooler, so heat was never a problem. I'm not basing my opinion just on one card dying on me, but AMD's dual cards do have some reliability problems. I thought they were getting better with the 5970, but this one's probably as bad as the 4870X2, or even worse.
Also, if NV made their cards as long as the 6990, the extra heatsink NV could fit in the huge spaces here and there would definitely lower the card's overall heat output and make their cards quieter.


----------



## Dr. K6 (Mar 29, 2011)

gaximodo said:


> Same story here: generally, NV's high end overclocks waaay better than AMD's, overvolted or not.


My unlocked 6950 @ 1030MHz and my previous 5850 @ 1080MHz disagree.  Try that with a GTX 570 and it will blow up. 


gaximodo said:


> AMD pushed their GPU/CPU clocks way too high (concluded from their smaller overclocking headroom), which will result in a higher failure rate than their competitors' (theoretically).


Do you really think you understand how to clock hardware better than the company's own engineers?  Think about that for a second and how ridiculous it sounds.  Also, building false logic on incorrect statements just leads to a bunch of nonsense - AMD's cards, despite being clocked higher, are still much more power efficient than NVIDIA's and don't require near the circuitry or the stress.  If anyone's trying to overclock their hardware to (barely) compete, it's NVIDIA.  That'd explain why their GTX 570's failed and why the GTX 590's are failing now.  With that track record, nevermind the 8800's from back in the day, I wouldn't bet a nickel on NVIDIA's quality control.


gaximodo said:


> Not to mention those HD 6990s died without any tweaks, while 590s only died on excessive overclocks (and faulty drivers; with the new drivers, the card's OCP works just fine, but NV shouldn't have let this problem happen in the first place).
> My 4870X2 died within, like, what, 2 weeks, and it came with an Arctic Xtreme-something cooler, so heat was never a problem.


There are multiple reports from several different review sites of GTX 590's dying at stock.  The only documented 6990 death I've seen is the one from Neoseeker.


gaximodo said:


> I'm not basing my opinion just on one card dying on me, but AMD's dual cards do have some reliability problems. I thought they were getting better with the 5970, but this one's probably as bad as the 4870X2, or even worse.


Again, after the 8800's, the GTX295's (I had three die on me), GTX 570 reports, and now the GTX 590, I have no faith in NVIDIA's quality control.


gaximodo said:


> Also, if NV made their cards as long as the 6990, the extra heatsink NV could fit in the huge spaces here and there would definitely lower the card's overall heat output and make their cards quieter.


Maybe if they made the card longer they might actually have a card that works.  As it is, they had to release a driver to downclock the cards just so they don't blow at stock.  I'd like to see reviewers retest with the newest driver to see how much of a performance hit there is when the card actually runs safely.


----------



## the54thvoid (Mar 29, 2011)

I'm going to go out on a limb here.

The 590 isn't an enthusiast card for overclockers.  It's an enthusiast card for really rich folk or people with fairly reasonable disposable incomes.  It doesn't come with two BIOSes and it doesn't have specific points for voltage measurements like the MSI Lightning range or the Gigabyte SOC card.

It is in fact just a really fast and relatively quiet card.  It isn't designed for overvolting to the extent other cards may be.

Just because it's expensive doesn't make it enthusiast.  An enthusiast will buy a product known for its 'character', not based on its price.  The argument that this is enthusiast because of its price is misleading on all fronts.  In other words, having cash does not an enthusiast make!

Let's face it, if you want to get maximum graphics scores you'll probably go for 4-way GTX 580 on LN2, not two of these (likewise, if you're an AMD OC'er you'd use 4-way 6970s).  Dual graphics cards are nice little speed devils that don't ask to be tampered with.

We seem to have forgotten the whole 6990 AMD switch warranty issue (I know XFX will honour that warranty now).  Can that voltage be adjusted?  Will it melt the card if it's bypassed?  WHO CARES?  It's not designed for it.

Price doesn't equal enthusiast.  It's an expensive, well-performing (but not best) dual GPU.  The 6990 pretty much takes the crown in vendor-neutral games but is way noisier and needs a better cooler.

Give me a 6970 or a 580 Lightning over either dual card and I'd be much happier.


----------



## Frick (Mar 29, 2011)

Agreed with that, but I think this card can be interesting for overclockers and the people wanting to set records with single cards.


----------



## qubit (Mar 29, 2011)

I'd like to ask all the people criticising the 590: who _really_ wouldn't want one?  Yeah, it's pretty nice, regardless.


----------



## W1zzard (Mar 29, 2011)

i wouldn't want one .. using an ASUS 580 DirectCU II and extremely happy with it


----------



## Mussels (Mar 29, 2011)

qubit said:


> I'd like to ask all the people criticising the 590: who _really_ wouldn't want one?  Yeah, it's pretty nice, regardless.



me. terribly power inefficient. too loud.


----------



## Frizz (Mar 29, 2011)

qubit said:


> I'd like to ask all the people criticising the 590: who _really_ wouldn't want one?  Yeah, it's pretty nice, regardless.



I would dislike having one, especially given the price difference here in my country; I would rather spend the money on 2x 580s.


----------



## qubit (Mar 29, 2011)

W1zzard said:


> i wouldn't want one .. using an ASUS 580 DirectCU II and extremely happy with it



Well, you don't deserve one as you went and broke your new toy. You'll have to save up your pocket money for another one now.


----------



## Mussels (Mar 29, 2011)

qubit said:


> Well, you don't deserve one as you went and broke your new toy. You'll have to save up your pocket money for another one now.



this is w1zzard. he has BOXES of spare video cards.


----------



## qubit (Mar 29, 2011)

Mussels said:


> this is w1zzard. he has BOXES of spare video cards.



I can imagine, hehe.  I'll bet it's a right Aladdin's cave there. Just imagine the hoard of classic and unusual cards that must be there. _< envy >_


----------



## Mussels (Mar 29, 2011)

qubit said:


> I can imagine, hehe.  I'll bet it's a right Aladdin's cave there. Just imagine the hoard of classic and unusual cards that must be there. _< envy >_



IIRC, he bins them once they reach a certain age, or gives them away.


----------



## Deleted member 24505 (Mar 29, 2011)

Maybe W1zzard should have a competition every now and again to relieve himself of all his "junk"


----------



## Mussels (Mar 29, 2011)

tigger said:


> Maybe W1zzard should have a competition every now and again to relieve himself of all his "junk"



i've suggested it, but his lordliness is a very busy man.


shall we get back on topic now, of... well, nvidia failing, nvidiots fanboy hatin, ati fanboys lolin, etc etc.


----------



## Frick (Mar 29, 2011)

Just wanted to say that I really hate the term "nvidiots". What's wrong with "fanboy"? It's like people using the term M$ or something; it's just stupid imo.


----------



## TheMailMan78 (Mar 29, 2011)

qubit said:


> I'd like to ask all the people criticising the 590: who _really_ wouldn't want one?  Yeah, it's pretty nice, regardless.



I would rather have a 580 by FAR. No SLI issues and no risk. Plus the current state of PC gaming is nothing but ports. 6990 and 590 are a waste of money.



Frick said:


> Just wanted to say that I really hate the term "nvidiots". What's wrong with "fanboy"? It's like people using the term M$ or something; it's just stupid imo.



I'm going to use the term "Nvidiot" strictly in your honor Frick.


----------



## DaedalusHelios (Mar 29, 2011)

Frick said:


> Just wanted to say that I really hate the term "nvidiots". What's wrong with "fanboy"? It's like people using the term M$ or something; it's just stupid imo.



It is mainly used by trolls or fanboys of the AMD/ATi camp. I just wish people looked at things more objectively rather than all the bullshit drama. Those that lack intellect will fill the void with emotion.

Power consumption isn't an issue for those of us with really cheap power rates like I have. Therefore an option to turn off all power throttling would be great on all new Nvidia and AMD offerings. That way higher OCs will be more stable with adequate cooling. Except on a design like the 590 when overvolting, but I had no intention of doing that anyway.


----------



## TheMailMan78 (Mar 29, 2011)

DaedalusHelios said:


> It is mainly used by trolls or fanboys of the AMD/ATi camp. I just wish people looked at things more objectively rather than all the bullshit drama. Those that lack intellect will fill the void with emotion.



I guess W1zz is a troll/fanboy because I've heard him say it before. "Nvidiot" is a term for people who blindly buy anything Nvidia without ever looking at the facts. OMGWTFBBQ NVIDIA FTW!


----------



## DaedalusHelios (Mar 29, 2011)

TheMailMan78 said:


> I guess W1zz is a troll/fanboy because I've heard him say it before. "Nvidiot" is a term for people who blindly buy anything Nvidia without ever looking at the facts. OMGWTFBBQ NVIDIA FTW!



Forum owners can troll. It is his forum. He made some comment about badaboom and porn on a bus too. You have done your share of trollin good man.

Now I drive to work.


----------



## W1zzard (Mar 29, 2011)

TheMailMan78 said:


> I guess W1zz is a troll/fanboy



and that's why, out of like 100 cards i could use, i picked an nvidia card for my work pc ? clearly ati fanboy


----------



## Harlequin_uk (Mar 29, 2011)

first cases of retail cards blowing up - a Zotac GTX 590, that is, with everything at stock, in a well-vented machine, running 3DMark 11 - lulz, then when NV actually do the 3DM challenge

http://forums.overclockers.co.uk/showpost.php?p=18778252&postcount=84

clocked at 650MHz on stock volts and it blew

http://img845.imageshack.us/img845/2889/quemado.jpg



well holy shit


----------



## Frick (Mar 29, 2011)

TheMailMan78 said:


> I guess W1zz is a troll/fanboy because I've heard him say it before. "Nvidiot" is a term for people who blindly buy anything Nvidia without ever looking at the facts. OMGWTFBBQ NVIDIA FTW!



that it is but it still looks stupid.


----------



## TheMailMan78 (Mar 29, 2011)

W1zzard said:


> and that's why, out of like 100 cards i could use, i picked an nvidia card for my work pc ? clearly ati fanboy



Your logic. IT BURNS! I mean, of course. You didn't give the 590 an 11 score even though it blew up in your rig. Clearly you are an AMD fanboy.



Harlequin_uk said:


> first cases of retail cards blowing up - a Zotac GTX 590, that is, with everything at stock, in a well-vented machine, running 3DMark 11 - lulz, then when NV actually do the 3DM challenge
> 
> http://forums.overclockers.co.uk/showpost.php?p=18778252&postcount=84
> 
> ...



Ok now it blows at stock volts?!? lol FAIL! But this is an isolated incident correct?


----------



## Frick (Mar 29, 2011)

Well, that is fail. I would like to know more about it, though. Couldn't it have been the PSU or something as well?

(I'm not trying to sound like a fanboy, but a card that blows at stock settings is not common and it should be looked into more deeply)


----------



## CDdude55 (Mar 29, 2011)

TheMailMan78 said:


> But this is an isolated incident correct?



Yes, his seems to be the only consumer case I have seen where that has happened.


----------



## TheMailMan78 (Mar 29, 2011)

CDdude55 said:


> Yes, he seems to be the only consumer case i have seen that has had that happen.



Well a few others have been listed also earlier in this thread.

http://forums.guru3d.com/showpost.php?p=3930980&postcount=74


----------



## N3M3515 (Mar 29, 2011)

Fail card this is.
"there is no worse blind man than the one who does not want to see"


----------



## CDdude55 (Mar 29, 2011)

TheMailMan78 said:


> Well a few others have been listed also earlier in this thread.



I've only seen cases mainly from reviewers, and only that one from a mainstream user. (ahh, didn't see the link you posted, nvm. lol)

Personally, I wouldn't worry about it; I honestly don't think this is much of a mainstream problem.


----------



## TheMailMan78 (Mar 29, 2011)

CDdude55 said:


> I've only seen cases mainly from reviewers, and only that one from a mainstream user.
> 
> Personally, I wouldn't worry about it; I honestly don't think this is much of a mainstream problem.



Who knows. Time will tell. All I know is if I had a choice right now it would be a 580 or a 6970.


----------



## CDdude55 (Mar 29, 2011)

TheMailMan78 said:


> Well a few others have been listed also earlier in this thread.
> 
> http://forums.guru3d.com/showpost.php?p=3930980&postcount=74



It's turning out to be a bigger issue than I thought.


----------



## newtekie1 (Mar 29, 2011)

TheMailMan78 said:


> All I know is if I had a choice right now it would be a 580 or a 6970.



If I was buying right now, I'd still buy a GTX470.  I think they are probably the best value right now for the money, and still offer a monster of a performer.


----------



## TheMailMan78 (Mar 29, 2011)

newtekie1 said:


> If I was buying right now, I'd still buy a GTX470.  I think they are probably the best value right now for the money, and still offer a monster of a performer.



I already own a 5850. 470 is not worth it to me. But I agree the 470 is an epic bang for the buck card.


----------



## CDdude55 (Mar 29, 2011)

newtekie1 said:


> If I was buying right now, I'd still buy a GTX470.  I think they are probably the best value right now for the money, and still offer a monster of a performer.



I agree, my GTX 470 is still a great performing card.


----------



## qubit (Mar 29, 2011)

Hi people

You seem to be pretty clued up on what card to get at a particular price point, so can you please help bokou choose a decent card for $100?

www.techpowerup.com/forums/showthread.php?p=2240152#post2240152


----------



## the54thvoid (Mar 29, 2011)

I figure it's just a matter of time before the idiots at AMD allow their partners to redo the cooler, and the 6990 will become the most perfect overkill gfx card around.  Just think: an ASUS DirectCU II 6990 variant.  Possible?

I know this is a 590 thread, but it's only fair I argue for the 6990 now.  Once its noise problem is fixed.


----------



## newtekie1 (Mar 29, 2011)

TheMailMan78 said:


> I already own a 5850. 470 is not worth it to me. But I agree the 470 is an epic bang for the buck card.



Oh I know, I'm not saying anyone with an HD5800/GTX400 series card should upgrade, in fact I wouldn't upgrade at all.

What I was saying is that if I was building a rig today from scratch, the GTX470 would be pretty high on the list of cards I would be considering.  The new HD6950 1GB would be up there too.


----------



## N3M3515 (Mar 29, 2011)

^
agreed on 6950 1GB


----------



## Initialised (Mar 30, 2011)

*Another one bites the dust*

I couldn't get a video, but here's the damage:

[damage photos: http://farm6.static.flickr.com/5135/5573788233_7113fcf29f_b.jpg]
Driver was 267.91 with SLi v6 running Crysis 2 at Extreme, 1920x1200.

Card was EVGA Classified, *no overclocking or overvoltage* was applied.


----------



## entropy13 (Mar 30, 2011)

Initialised said:


> I couldn't get a video, but here's the damage:
> 
> http://farm6.static.flickr.com/5135/5573788233_7113fcf29f_b.jpg
> 
> ...



"YOU LIE!" The Nvidiots would say regarding the bolded words.


----------



## Mussels (Mar 30, 2011)

ouch, that's some damage.


----------



## 2DividedbyZero (Mar 30, 2011)

that is bad news





for nVidia


----------



## N3M3515 (Mar 30, 2011)

I don't know why they would release such a fail card.....
Hitler


----------



## 2DividedbyZero (Mar 30, 2011)

N3M3515 said:


> I don't know why they would release such a fail card.....
> Hitler



goldie


----------



## bbmarley (Mar 31, 2011)

N3M3515 said:


> I don't know why they would release such a fail card.....
> Hitler



epic


----------



## Easy Rhino (Mar 31, 2011)

oh noes i am an nvidia fan and i cannot handle all of this fail card. oh noes what shall i do with myself as i suffer abuse from the amd fanatics!! QQ!


----------



## erocker (Mar 31, 2011)

Hey, I just feel a bit sorry for the folks who gave away $700 only to watch their video card blow up. Pretty unacceptable, no matter what company or product it is.


----------



## dumo (Mar 31, 2011)

They knew......http://www.gurufocus.com/news.php?id=126575


----------



## Mussels (Mar 31, 2011)

dumo said:


> They knew......http://www.gurufocus.com/news.php?id=126575



that's hilarious


----------



## Deleted member 67555 (Mar 31, 2011)

dumo said:


> They knew......http://www.gurufocus.com/news.php?id=126575



Isn't that illegal? :shadedshu
It's kind of obvious he knew.... They all knew..... Seriously, isn't that illegal?
OMG, this card isn't just bullshit, it's illegal... in a manner of speaking.
WHOA, this is clear-cut insider trading. Shame, shame.


----------



## dumo (Mar 31, 2011)

jmcslob said:


> Isn't that illegal? :shadedshu
> It's kind of obvious he knew.... They all knew..... Seriously, isn't that illegal?
> OMG, this card isn't just bullshit, it's illegal... in a manner of speaking.
> WHOA, this is clear-cut insider trading. Shame, shame.


Info here http://en.wikipedia.org/wiki/Insider_trading

"Rule 10b5-1 also created for insiders an affirmative defense if the insider can demonstrate that the trades conducted on behalf of the insider were conducted as part of a preexisting contract or written, binding plan for trading in the future.[3] For example, if a corporate insider plans on retiring after a period of time and, as part of his or her retirement planning, adopts a written, binding plan to sell a specific amount of the company's stock every month for the next two years, and during this period the insider comes into possession of material nonpublic information about the company, any subsequent trades based on the original plan might not constitute prohibited insider trading."

I'm pretty sure all aspects were considered and had to conform with the regulations before the sale.


----------



## Deleted member 67555 (Mar 31, 2011)

dumo said:


> Info here http://en.wikipedia.org/wiki/Insider_trading



So the headline "Nvidia GTX 590, World's Fastest Video Card" should be enough to prove they were blowing smoke up people's asses while dumping stock..... OUCH!! Well, I hope they actually get prosecuted.


----------



## erocker (Mar 31, 2011)

I highly doubt that selling stock is related to one of their least profitable products.


----------



## Easy Rhino (Mar 31, 2011)

even if it was, that is not considered insider trading. if you are an executive and you believe your company is manufacturing crap which will eventually lower its stock price because of low sales then naturally you are going to want to sell your stock.


----------



## erocker (Mar 31, 2011)

Easy Rhino said:


> even if it was, that is not considered insider trading. if you are an executive and you believe your company is manufacturing crap which will eventually lower its stock price because of low sales then naturally you are going to want to sell your stock.



Plus, he lost money. The stock has gone up since he sold it.


----------



## Easy Rhino (Mar 31, 2011)

yea he sold it at 17.75 and it is at 18.45.


----------



## Deleted member 67555 (Mar 31, 2011)

Easy Rhino said:


> yea he sold it at 17.75 and it is at 18.45.



For now....AMD releases good drivers that put them up a bit and rides the wave of new customers, while Nvidia does its best PR to recover from its flagship GFX quite literally going up in smoke...


----------



## dumo (Mar 31, 2011)

Probably they're already past the "launch and distribute" part of the product cycle...Now they're in "damage control" mode, which involves halted production, RMAs, new driver development, and new product planning.

Too bad though, imo their GTX 580 is a really good GPU


----------



## DaedalusHelios (Mar 31, 2011)

Easy Rhino said:


> yea he sold it at 17.75 and it is at 18.45.



That plus he basically admitted he was partially at fault for the miscommunication that led to the 480 not being what it was thought to be performance-wise (the 580). I think he is sinking the company through poor leadership. 

Like I said before, they will be better off with new management. Unfortunately I don't believe that will happen. This poorly engineered card will most likely result in a stagnation of prices on the high end. Good luck grabbing a 580 at $350, or a 6990 at $500, where the prices should be. 



jmcslob said:


> For now....AMD releases good drivers that put them up a bit and ride the wave of new customers while Nvidia does it's best PR to recover from it's Flagship GFX quite Literally going up in smoke...



Possibly what's killing it is the throttling of a PSU that isn't prepared for that much current load on the 12V rail. Isn't it the highest power consumption of any card under heavy load? Perhaps it could even kill some PSUs in the process.
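As a back-of-the-envelope check on that 12V rail concern: the board power figure below is an assumed, illustrative number (roughly what reviews measured for dual-GPU cards of this class), not an official spec:

```python
# Rough estimate of the current a dual-GPU card pulls from the 12 V rail(s),
# ignoring VRM conversion losses. 365 W board power is an assumed figure.
card_power_w = 365.0
rail_voltage_v = 12.0

card_current_a = card_power_w / rail_voltage_v
print(round(card_current_a, 1))  # ~30.4 A from the card alone
```

Add CPU and the rest of the system on top of that, and a PSU with a weak or split 12V rail could plausibly sag or trip protection under load.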


----------



## Easy Rhino (Mar 31, 2011)

nvidia is still a much better run company than amd's graphics card department. this is just one crap card in a lineup of extremely successful cards. amd makes some great stuff too but their management is terrible. and the stockholders know it.


----------



## Harlequin_uk (Mar 31, 2011)

DaedalusHelios said:


> Possibly what could be killing it is the throttling of a PSU that isn't prepared for that much current load on the 12V rail. Isn't it the highest power consumption of any card when under high work load. Perhaps it could even kill some PSU's in the process



not the PSUs that TPU or SweClockers are using; in fact, most people with the money or need to buy this card already have a top-end PSU, so if 1 kW PSUs are throttling then there is a huge problem.


----------



## DaedalusHelios (Mar 31, 2011)

Easy Rhino said:


> nvidia is still a much better run company than amd's graphics card department. this is just one crap card in a lineup of extremely successful cards. amd makes some great stuff too but their management is terrible. and the stockholders know it.



How was AMD's graphics card department badly run? Aside from using its profit to repay AMD's massive debt? Are they starving it of funding? I might have missed some big news thanks to my work/study schedule.


----------



## entropy13 (Mar 31, 2011)

Easy Rhino said:


> even if it was, that is not considered insider trading. if you are an executive and you believe your company is manufacturing crap which will eventually lower its stock price because of low sales then naturally you are going to want to sell your stock.



Not really relevant however because he's stupid. He also sold his shares (and lost money) when they launched the GTX 580 which was relatively successful, whereas there was no HD 6990 yet.


----------



## Easy Rhino (Mar 31, 2011)

DaedalusHelios said:


> How was AMD's graphics card department badly run? Aside from using it's profit to repay AMD's massive debt? Are they starving it from a lack of funding? I might have missed out on some big news thanks to my work/study schedule.





entropy13 said:


> Not really relevant however because he's stupid. He also sold his shares (and lost money) when they launched the GTX 580 which was relatively successful, whereas there was no HD 6990 yet.



Nvidia being the better run company is a fact of life. Their business model makes them far more profitable and the market knows it. AMD has juggled top executives for years now since buying ATI. They just don't have the leadership to manufacture both CPUs and GPUs while making a profit. Nor do they have proper strategy to compete against Intel and Nvidia at the same time. So all of the people laughing and making fun of the gtx590 are simply being trolls.


----------



## TheMailMan78 (Mar 31, 2011)

Easy Rhino said:


> Nvidia being the better run company is a fact of life. Their business model makes them far more profitable and the market knows it. AMD has juggled top executives for years now since buying ATI. They just don't have the leadership to manufacture both CPUs and GPUs while making a profit. Nor do they have proper strategy to compete against Intel and Nvidia at the same time. So all of the people laughing and making fun of the gtx590 are simply being trolls.



Not making cards that blow up sounds like a better strategy to me.


----------



## W1zzard (Mar 31, 2011)

http://www.techpowerup.com/reviews/MSI/GTX_580_Lightning/

1.35 V on GF110 and zero problems


----------



## Easy Rhino (Mar 31, 2011)

TheMailMan78 said:


> Not making cards that blow up sounds like a better strategy to me.



one bad card doesn't break an entirely successful and profitable franchise.


----------



## wolf (Mar 31, 2011)

W1zzard said:


> http://www.techpowerup.com/reviews/MSI/GTX_580_Lightning/
> 
> 1.35 V on GF110 and zero problems



I believe the PCB and VRM situation on the 580 Lightning facilitated a stable 1.35 V; try that on a reference GTX 580 if you dare. Seems logical to me that it's not the GPU but rather the rest of the card that held back the 590: heck, the GPU didn't pop, the power circuitry did.

perhaps a Lightning 590 is in order, MSI!



Easy Rhino said:


> Nvidia being the better run company is a fact of life. Their business model makes them far more profitable and the market knows it. AMD has juggled top executives for years now since buying ATI. They just don't have the leadership to manufacture both CPUs and GPUs while making a profit. Nor do they have proper strategy to compete against Intel and Nvidia at the same time. So all of the people laughing and making fun of the gtx590 are simply being trolls.





Easy Rhino said:


> one bad card doesnt break an entirely successful and profitable franchise.



*+1*


----------



## btarunr (Mar 31, 2011)

That just shows that GF110 itself can handle 1.35V fine. The VRM on GTX 590 is fail. 

There's a reason AMD spent that much on Volterra+CPL based PWM on HD 6990.


----------



## TheMailMan78 (Mar 31, 2011)

Easy Rhino said:


> one bad card doesnt break an entirely successful and profitable franchise.



No it doesn't. Fanboys will still buy crap in droves. Doesn't mean they are good products. Just means they have good marketing. (See Apple)

Now with that being said I like Nvidia  and I don't believe they have shitty products. BUT they have had a few black eyes lately. Re-branding, Fermi, and now this. ATI has had nothing but win cards since the 48xx series. To top that off AMD and ATI merged. That alone will take YEARS to work out the kinks, logistically and financially. You have to keep this in mind. Nvidia has one market to focus on and it's been dropping the ball IMO.

To me they would have been better off not releasing the 590 until they were positive it would destroy the 6990. As it stands now it's a PR nightmare. The 590 at its core is a launch fail and not something Nvidia needed right now. They need to focus more on the "bang for the buck" segment like the 470.


----------



## btarunr (Mar 31, 2011)

Easy Rhino said:


> Nvidia being the better run company is a fact of life.



How a company is run is completely irrelevant to the consumer as long as he's getting quality products, and AMD is honoring its quality and warranty commitments. 

Otherwise, this is a fact of life: http://www.techpowerup.com/139629/Q...-Slump-Competitors-Eat-into-Intel-s-Lead.html

When well-made products also translate into higher sales than NVIDIA, you know which is the better company.


----------



## entropy13 (Mar 31, 2011)

Easy Rhino said:


> Nvidia being the better run company is a fact of life. Their business model makes them far more profitable and the market knows it. AMD has juggled top executives for years now since buying ATI. They just don't have the leadership to manufacture both CPUs and GPUs while making a profit. Nor do they have proper strategy to compete against Intel and Nvidia at the same time. So all of the people laughing and making fun of the gtx590 are simply being trolls.



Why did you have to quote me though? I said Huang is stupid for selling his shares once a month since December regardless of changes in stock price. I made no mention of Nvidia, the company, itself.


Or maybe he's getting ready to step down....


----------



## W1zzard (Mar 31, 2011)

entropy13 said:


> Huang is stupid for selling his shares once a month since December regardless of changes in stock price.



that's what he has to do to avoid insider trading


----------



## Easy Rhino (Mar 31, 2011)

one bad card does not destroy a profitable and well run company. im not sure why people are making such a big deal out of this. 

nvidia can handle this because their business strategy is far superior to AMDs and investors know it. the proof is in the pudding. AMD makes some great products as well but they simply cannot get out of their own way. i won't buy anything from a company whose management ruins everything the engineers create.


----------



## dumo (Mar 31, 2011)

This is the way it should be built......http://www.techpowerup.com/reviews/MSI/GTX_580_Lightning/


----------



## btarunr (Mar 31, 2011)

Easy Rhino said:


> AMD makes some great products as well but they simply cannot get out of their own way. i won't buy anything from a company whose management ruins everything the engineers create.



What's wrong with that? They're continuing to deliver quality products at competitive prices, have a better quality record as far as GPUs go, and are honoring warranties. Why should anything else matter to the consumer?


----------



## entropy13 (Mar 31, 2011)

W1zzard said:


> that's what he has to do to avoid insider trading



He's only doing it now though...he did the same in April-May last year, but that was over two months, compared to one month of almost 700k shares and then a three-month period averaging 133k shares sold per month.


----------



## Easy Rhino (Mar 31, 2011)

btarunr said:


> What's wrong with that? They're continuing to deliver quality products at competitive prices, have a better quality record as far as GPUs go, and are honoring warranties. Why should anything else matter to the consumer?



i'm not your average consumer 



entropy13 said:


> He's only doing it now though...he did the same in April-May last year, but that's two months, compared to one month of almost 700k shares and a three month period of an average of 133k shares sold per month.



CEOs are paid in stock. Perhaps he wanted to buy a new private jet this year?


----------



## entropy13 (Mar 31, 2011)

Easy Rhino said:


> CEOs are paid in stock. Perhaps he wanted to buy a new private jet this year?



That still wouldn't explain the relatively bigger December stock sale. In context that would have been the more likely source of money for personal consumption since the previous sale was more than 6 months ago.


----------



## Easy Rhino (Mar 31, 2011)

entropy13 said:


> That still wouldn't explain the relatively bigger December stock sale. In context that would have been the more likely source of money for personal consumption since the previous sale was more than 6 months ago.



well since there is literally no evidence of insider trading i dont see how anyone can come to the conclusion that this is insider trading.


----------



## W1zzard (Mar 31, 2011)

Easy Rhino said:


> well since there is literally no evidence of insider trading i dont see how anyone can come to the conclusion that this is insider trading.



jen hsun has to set up a plan well ahead of time that is then executed independently of any insider information he may have.

if he could sell at any time, he could obviously sell at the best time for himself

the SEC has extensive documentation on this


----------



## CDdude55 (Mar 31, 2011)

TheMailMan78 said:


> Re-branding, Fermi, and now this. ATI has had nothing but win cards since the 48xx series.



Well the 48xx series cards were super inefficient (besides the 4850 on down), my old 4870 ran hotter than my current GTX 470, and the 4870 X2 is still the highest power-consuming card to date.

They did a great job with the HD 5800/HD 6000 series though.


----------



## btarunr (Mar 31, 2011)

Easy Rhino said:


> i'm not your average consumer



Does NVIDIA give you something it doesn't give "average consumers", or does it charge you less? If neither, you're an average consumer.


----------



## Easy Rhino (Mar 31, 2011)

W1zzard said:


> jen hsun has to setup a plan well ahead that is executed so that it is independent of any insider information he may have.
> 
> if he could sell at any time he could obviously sell at the best time for himself
> 
> the SEC has extensive documentation on this



exactly. the SEC does this with pretty much every executive paid in stock. and just about every executive with a good lawyer buys and sells stock using some sort of third party.



btarunr said:


> Does NVIDIA give you something it doesn't give "average consumers", or does it charge you less? If neither, you're an average consumer.



when given the choice in a mutual fund to choose a stock between AMD, NVDA and INTC I did some research and found nvidia to be a far better bet in the long run. i think from there you can put 2 and 2 together.


----------



## TheMailMan78 (Mar 31, 2011)

CDdude55 said:


> Well the 48xx series cards were super inefficient (besides the 4850 on down), my old 4870 ran hotter then my current GTX 470. and the 4870 X2 is still the highest power consuming card to date.
> 
> They did a great job with the HD 5800/HD 6000 series though.



Efficiency? When the hell has that mattered to the enthusiast market? I remember someone saying on here "I'll rape my dead grandmother as long as Fermi is faster than the 5970" or something to that effect. OC is all people care about. After that, heat, but few even care about that as long as Swiftech is in business.



Easy Rhino said:


> when given the choice in a mutual fund to choose a stock between AMD, NVDA and INTC I did some research and found nvidia to be a far better bet in the long run. i think from there you can put 2 and 2 together.


 So you bought stock in Nvidia? Moron. Didn't you learn anything from my mistake?! I was a retard when I bought AMD years ago. You know better Easy....or maybe you don't. STAY OUT OF THE TECH MARKET!


----------



## Easy Rhino (Mar 31, 2011)

TheMailMan78 said:


> So you bought stock in Nvidia? Moron. Didn't you learn anything from my mistake?! I was a retard when I bought AMD years ago. You know better Easy....or maybe you don't. STAY OUT OF THE TECH MARKET!



i bought at one of nvidias lowest points...


----------



## btarunr (Mar 31, 2011)

Easy Rhino said:


> when given the choice in a mutual fund to choose a stock between AMD, NVDA and INTC I did some research and found nvidia to be a far better bet in the long run. i think from there you can put 2 and 2 together.



Oh I see, this is a stock market debate, not a consumer debate. Sorry, I didn't read the older posts.


----------



## TheMailMan78 (Mar 31, 2011)

Easy Rhino said:


> i bought at one of nvidias lowest points...



God speed man. God speed............what you just said is what I thought at the time. Lost 14k on the deal.


----------



## Easy Rhino (Mar 31, 2011)

TheMailMan78 said:


> God speed man. God speed............what you just said is what I thought at the time. Lost 14k on the deal.



lol. that is not "winning"


----------



## CDdude55 (Mar 31, 2011)

TheMailMan78 said:


> Efficiency? When the hell has that mattered to the enthusiast market. I remember someone saying on here "Ill rape my dead grandmother as long as Fermi is faster then 5970" or something to that effect. OC is all people care about. After that heat but few even care about that as long as swifttech is in business.



I don't see how you can think that enthusiasts don't care about efficiency. Yes, the enthusiast market is more tolerant of high heat and power, but that doesn't mean it's ok for cards to be released like that and go unnoticed. I also don't understand how you can go after Nvidia's inefficient designs, but then defend ATI's inefficient designs by perpetuating the age-old argument that it's fine for the enthusiast community as long as it's from the brand you prefer. Whether a card ''blows up'' from a certain manufacturer or not is irrelevant; we need cards that work efficiently no matter if we're using stock or aftermarket cooling, which is something the GTX 590 failed to do, just like how the 4870s ran at 89C when overclocked. It needs to get fixed.


----------



## TheMailMan78 (Mar 31, 2011)

CDdude55 said:


> I don't see how you can think that enthusiasts don't care for efficiency, yes the enthusiast market is more tolerant of high heat and power, but that doesn't mean it's ok for cards to be released like that and go unnoticed, i also don't understand how you can go after Nvidias inefficient designs, but then defend ATI's inefficient designs by perpetuating the age old argument that it's fine for the enthusiast community as long as it's from the brand you prefer, whether a card ''blows up'' from a certain manufacturer or not is irrelevant, we need cards that work efficiently no matter if were using stock or aftermarket, which is something the GTX 590 failed to do, just like how the the 4870's ran at 89c when overclocked. It's needs to get fixed.



I never went after Nvidia's efficiency. I couldn't care less. You are the one who brought that up. Efficiency isn't a big deal. Cards blowing up is.



Easy Rhino said:


> lol. that is not "winning"



No thats before I was injected with TIGER BLOOD!


----------



## CDdude55 (Mar 31, 2011)

TheMailMan78 said:


> I never went after Nvidias efficiency. I could care less. You are the one who brought that up. Efficiency isnt a big deal. Cards blowing up is.



And there it is the second time: the argument that inefficient designs don't matter depending on the severity of the issue. Whether a card ''blows up'' is irrelevant; any internal hardware or external software inefficiency needs to be fixed and should be. You have again loosely said that whether a card runs super hot or consumes buckets of power isn't a concern because at least it hasn't ''blown up'', so you are willfully ignoring other widespread issues based on a line of cards you possibly prefer by using the blanket argument that it doesn't matter in those cases simply because ''it didn't blow up''. I love it.

Even in the Fermi review you said:



			
				TheMailMan78 said:
			
		

> Well that is true HOWEVER that GPU is WAY to hot for what it does.



But I don't get it, I thought inefficient designs didn't matter? Where was that argument in that thread? Oh, right, at least it didn't ''blow up''.


----------



## erocker (Mar 31, 2011)

But they run cooler though.


----------



## TheMailMan78 (Mar 31, 2011)

CDdude55 said:


> And there it is the second time, the argument that inefficient designs don't matter depending the severity of the issue. Whether a card ''blows up'' is irrelevant, any internal hardware or external software inefficiency needs and should to be fixed. You have again loosely said that whether a card runs super hot or consumes buckets of power isn't a concern because at least it hasn't ''blown up'', so you are willfully ignoring other widespread issues based on a line of cards you possibly prefer by using that blanket argument that it doesn't matter in those cases simply because ''it didn't blow up''. I love it.
> 
> Even in the Fermi review you said:
> 
> ...



You do know heat and efficiency are two different things? Also, heat is tolerable if the performance justifies it. When Fermi hit the market it was FAR hotter than AMD's offerings and gave a very minor boost. Not worth the heat it produced. Few cared. Most put them under water if they HAD to have an Nvidia product (mostly folders and crunchers). Smart people knew there were other choices (or were people who didn't fold). Fermi has come a LONG way since then. I wouldn't hesitate getting a 470 or a 580 now.

Anyway, you are the only person who brought up anything about efficiency. I never said anything about it. I also never said the 6990 was a great card. I just said it was a better designed card than the 590. I find both the 6990 and the 590 useless.

Oh, and I "prefer" what gives me the best deal when I am in the market for a GPU. So smoke that in your pipe.



erocker said:


> But they run cooler though.



Yeah well he's nerd raging right now. Facts like that are ignored.


----------



## CDdude55 (Mar 31, 2011)

TheMailMan78 said:


> You do know heat and efficiency are two different things? Also that heat is tolerable if the performance justifies it. When the fermi hit the market it was FAR hotter then AMD's offerings and gave a very minor boost. Not worth the heat it produced. Few cared. Most put them under water if they HAD to have an Nvidia product (mostly folders and crunchers). Smart people knew their was other choices (or people who didn't fold.). Fermi has come a LONG was since then. I wouldn't hesitate getting a 470 or a 580 now.
> 
> Anyway you are the only person who brought up anything about efficiency. I never said anything about it. I also never said 6990 was a great card. I just said it was a better designed card then the 590. I find both the 6990 and the 590 useless.
> 
> Oh and I "prefer" what gives me the best deal when I am in the market for a GPU. So smoke that in your pipe.



Yes, that Fermi point is true; that's why Fermi was an inefficient design: it pushed significantly more heat and power, but the performance didn't match that heat or power output. I never said heat purely determines efficiency; it's the scaling of heat and power against how much performance it pushes. My original argument was with your statements that efficiency in general is unimportant, which I consider untrue. You say it doesn't matter that the 4800 series was inefficient (yes, they output more heat and consume more power than their performance justifies), but then somehow you flip the argument over when talking about Fermi and the 590, saying that actual heat and power output does matter. Saying it's fine just as long as it doesn't ''blow up'' is false and disingenuous imo. It seems like you're saying that only certain design flaws warrant a card being looked at, and I'm saying all cards that are flawed in design warrant fixing no matter the severity of the issue.

That's all I was saying.

And I'm glad to hear you pick value over brand, which is what really matters.



			
				TheMailMan78 said:
			
		

> Yeah well hes nerd raging right now. Facts like that are ignored.



Didn't notice that post actually lol, but yes, again, no one is denying they're inefficient.


----------



## TheMailMan78 (Mar 31, 2011)

CDdude55 said:


> Yes, that Fermi point is true, that's why Fermi was an inefficient design, because it pushed significantly more heat and power but performance didn't match that heat or power output. I never said heat purely determines efficiency, it's the scaling of heat and power for how much performance it pushes. My original argument was your statements about how efficiency in general is unimportant which i consider to be untrue, but then you say that it doesn't matter that the 4800 series were inefficient(yes, they output more heat and consume more power then they push out performance wise) but then somehow you flip the argument over when talking about Fermi and the 590, saying that actual heat and power output does matter. Saying that just as long at it doesn't ''blow up''  is false and disingenuous imo, it seems like you're saying that depending on the actual design flaw warrants a card to be looked at and im saying all cards that are flawed in design warrant a fixing no matter the severity of the issue.
> 
> That's all i was saying.
> 
> ...



The 4870 X2 was inefficient. But that's to be somewhat expected in a dual-GPU card. The 4870 and 4850 in particular were very good for what they cost. The 4850 was probably the best bang for the buck at the time.

The 590's efficiency wasn't even expected. It's a dual-GPU card. They are all about power (or should be). Bragging about the 590's efficiency is like bragging about your Mustang's gas mileage. Who gives a shit. The problem is the damn thing blew up. I mean really? What good is a Mustang's gas mileage if the engine blows on the first run? Savvy?


----------



## erocker (Mar 31, 2011)

Bottom line: the power circuitry is insufficient for this card. There's no excuse for it, especially at this card's price.


----------



## CDdude55 (Mar 31, 2011)

TheMailMan78 said:


> 4870X2 was inefficient. But thats to be somewhat expected in a duel GPU card. The 4870 and 4850 in particular were very good for what they cost. 4850 was probably the best bang for the buck at the time.
> 
> The 590s efficiency wasn't even expected. Its a duel GPU card. They are all about power (Or should be.) Bragging about the 590 efficiency is like bragging about your Mustangs gas mileage. Who gives a shit. The problem is the damn thing blew up. I mean really? What good is a Mustangs gas mileage if the engine blows on the first run? Savvy?



I agree, I definitely think the 590 has some very serious issues, especially if people are having these issues with everything at bone stock.

Whether a card is efficient or not should matter, and both camps have/had issues with properly designing video cards, whether they blew up or not.

I just hope they fix this issue.


----------



## erocker (Mar 31, 2011)

CDdude55 said:


> I agree, i definitely think the 590 has some very series issues, especially if people are having these issues with everything at bone stock.
> 
> Whether a card is efficient or not should matter, and both camps have/had issues with properly designing video cards, whether they blew up or not.
> 
> I just hope they fix this issue.



At least one "camp" knows how to make a card without cheaping out on components. Perhaps Nvidia can fix the issue by redesigning their cards.


----------



## CDdude55 (Mar 31, 2011)

erocker said:


> At least one "camp" knows how to make a card without cheaping out on components.



Sure...



			
				erocker said:
			
		

> Prehaps Nvidia can fix the issue by redesigning their cards



I agree, and i hope they do it within a decent time frame.


----------



## bpgt64 (Mar 31, 2011)

Most of the reports I have read deal with volt modding causing the card to explode. There are a few reviews that show decent overclocks without volt modding.

To me it seems like there might have been a defect in that particular resistor, and a recall MIGHT be in order if slight overclocking causes the card to fail.


----------



## erocker (Mar 31, 2011)

CDdude55 said:


> I agree, and i hope they do it within a decent time frame.



Probably not, though their partners will most likely be able to come up with something adequate before the next series of cards.


----------



## thedude74 (Mar 31, 2011)

wolf said:


> I believe the pcb and vrm situation on the 580 lightning facilitated a stable 1.35V, try that on a reference GTX580 if you dare.



Exactly.

I love how he twists the words of people on the forums in his remarks to suit his point. A reference 580 is not supposed to be run at even 1.2 V, and everyone knows this. Pretending that's not true by using an extreme overclocker's card just makes them look disingenuous.

No one ever said that an upgraded version of a 580 couldn't do 1.2v, so showing the lightning doing that means nothing.


----------



## Deleted member 67555 (Mar 31, 2011)

mirror mirror Erocker......
Maybe there was an alternate reality where Nvidia's 590 was a success ...


----------



## Akrian (Apr 1, 2011)

DaedalusHelios said:


> the high end most likely. Good luck grabbing a 580 at $350, or 6990 at $500 where the prices should be.



Well I DID grab my 580s for $400. And after selling my 480s, the upgrade from 480s to 580s cost me $200 total, which is pretty good.   I wish the 6990 was at $500 though... my second PC would love it just as much as I would 

The thing that makes Nvidia stand out is their marketing politics; they simply go for the crowd, which in return produces those amazing myths that were true a loong time ago: Nvidia is always faster, and Nvidia has better drivers. Nvidia and ATI/AMD have both had their victories and fails in terms of their products, and ATI drivers have been good for quite some time now (at least they don't burn your cards lol).
And that's the same reason why the 590 will sell -> aggressive marketing.


----------



## wolf (Apr 1, 2011)

http://www.fudzilla.com/graphics/item/22274-gtx-590-bios-is-good-no-changes

the fail seems to be almost entirely that of Asus, not nvidia or the 590 itself.


----------



## entropy13 (Apr 1, 2011)

wolf said:


> http://www.fudzilla.com/graphics/item/22274-gtx-590-bios-is-good-no-changes
> 
> the fail seems to be almost entirely that of Asus, not nvidia or the 590 itself.



Someone posted a pic of his dead GTX 590 in this thread. It's from EVGA.

http://www.techpowerup.com/forums/showpost.php?p=2241298&postcount=558


----------



## wolf (Apr 1, 2011)

I find it hard to believe it died with no overclocking or overvolting, but if that _is_ the case the warranty replaces it anyway, no sweat except for a wait.


----------



## MoonPig (Apr 1, 2011)

wolf said:


> I find it hard to believe it died with no overclocking or overvolting, but if that _is_ the case the warranty replaces it anyway, no sweat except for a wait.



And if it's your only card? You're paying £550 to rent a card that you can use for a few days, then wait a month for it to be replaced, use it for a few days... and so on.


----------



## qubit (Apr 1, 2011)

MoonPig said:


> And if it's your only card? You're paying £550 to rent a card that you can use for afew days, then wait a month for it to be replaced, use for afew days... And continue.



No, they don't _keep_ breaking when run at stock.


----------



## MoonPig (Apr 1, 2011)

And how many people run their cards at stock? The first thing I did when I got this GTX 460 and my i7 950 was see what they could do...

The GTX 590 is aimed at the very top of the market, the people who overclock.


----------



## wolf (Apr 1, 2011)

MoonPig said:


> And how many people run their cards at stock.. First thing i did when i got this GTX460 and my i7 950 was see what they could do...
> 
> The GTX590 is aimed at the very top of the market, the people who overclock.



there's also a lot of users and reviews that overclock and overvolt within Nvidia's guidelines and have perfectly functional cards; his card will be replaced anyway.


----------



## MoonPig (Apr 1, 2011)

The fact is, it's happened to numerous reviewers... That's not good. There's obviously a fault.


----------



## newtekie1 (Apr 1, 2011)

MoonPig said:


> The fact is, it's happened to numerous reviewers... That's not good. There's obviously a fault.



Yes, the reviewers tried to push 1.2 V through a card with a 5-phase design (per GPU) when the 6-phase GTX 580 can barely handle it.  I have yet to see a reviewer pop a card with less than 1.6 V (the maximum set in the BIOS) or with proper drivers.
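A crude way to see why fewer phases hurt at high voltage: the current each phase carries scales inversely with the phase count. The ~250 W per-GPU figure below is an assumption for illustration only, and real VRM stress is measured on the low-voltage output side, so treat this strictly as a relative comparison:

```python
def per_phase_current(gpu_power_w, phases, input_v=12.0):
    # Approximate input-side current each VRM phase carries,
    # ignoring conversion losses; useful only for comparing designs.
    return gpu_power_w / input_v / phases

# Assumed ~250 W per GPU at elevated voltage (illustrative only)
six_phase = per_phase_current(250.0, 6)   # GTX 580-style VRM
five_phase = per_phase_current(250.0, 5)  # GTX 590-style, per GPU
print(round(six_phase, 2), round(five_phase, 2))  # ~3.47 vs ~4.17 A per phase
```

Same power through one fewer phase means roughly 20% more current (and heat) per phase, which is the core of the "the 590's VRM has less headroom than a 580's" argument.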



MoonPig said:


> The GTX590 is aimed at the very top of the market, the people who overclock.



And they can still overclock; they can even overvolt, something that AFAIK you still can't do on the HD6990.


----------



## TheMailMan78 (Apr 1, 2011)

Honestly guys, we've beaten this horse to death. Resurrected it. Released it on the villagers below and beaten it to death again.


----------



## qubit (Apr 1, 2011)

TheMailMan78 said:


> Honestly guys, we've beaten this horse to death. Resurrected it. Released it on the villagers below and beaten it to death again.



But I don't understand, is there some kind of problem with the GTX 590?  Not with an _nvidia_ card, surely?


----------



## newtekie1 (Apr 1, 2011)

qubit said:


> But I don't understand, is there some kind of problem with the GTX 590?  Not with an _nvidia_ card, surely?



There was a problem with ASUS' BIOS and software, but the card itself is sound.


----------



## Mussels (Apr 2, 2011)

TheMailMan78 said:


> Honestly guys, we've beaten this horse to death. Resurrected it. Released it on the villagers below and beaten it to death again.



Are we still talking about the 590, or about me and my April Fools' joke?


----------



## qubit (Apr 2, 2011)

newtekie1 said:


> There was a problem with ASUS' BIOS and software, but the card itself is sound.



I was just teasing.   I know all about it as I was quite active in this thread a few days ago.

Seeing all the explanations going back and forth over where the problem lies, I get the impression that the culprit for these failures isn't as clear cut as it could be. Essentially, it seems that all the following points play a part:

- The power circuitry isn't quite as beefy as it should be (it's less per GPU than a 580, apparently, due to fewer phases)
- Hardware protection that isn't quite robust enough
- An older driver that didn't protect it

However, the card is completely solid when run at stock clocks and voltage.

Bottom line with my GTX 580 is that I'm going to play it safe, so I'm not going to overclock or overvolt it. It's plenty fast, consumes enough power as it is, and produces enough heat.

You might be interested in my new thread: Is my GTX 580 really pulling 72A?


----------



## HalfAHertz (Apr 2, 2011)

qubit said:


> I was just teasing.   I know all about it as I was quite active in this thread a few days ago.
> 
> Seeing all the explanations going back and forth over where the problem lies, I get the impression that the culprit for these failures isn't as clear cut as it could be. Essentially, it seems that all the following points play a part:
> 
> ...



Then do the opposite: try undervolting it while keeping stock speeds to shave off a few watts.


----------



## qubit (Apr 2, 2011)

HalfAHertz said:


> Then do the opposite: try undervolting it while keeping stock speeds to shave off a few watts.



Now that's not a bad idea. If you look at W1zzard's review of it, it's only a bit faster than the standard one anyway, so I could probably get away with lowering the voltage and clocking it the same as a standard one. It would be interesting to see how much this tames it.
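For a rough sense of why a small undervolt helps, dynamic power scales roughly with voltage squared at a fixed clock (a simplified CMOS model; the voltages below are illustrative assumptions, not the card's actual VID steps):

```python
# Simplified dynamic-power model: P is proportional to V^2 * f,
# with switched capacitance and clock f held constant.
# Voltages are illustrative assumptions, not measured values.

def relative_power(v_new: float, v_old: float) -> float:
    """Fraction of original dynamic power after a voltage change, same clock."""
    return (v_new / v_old) ** 2

# Dropping an assumed 1.025 V stock voltage to 0.963 V at unchanged clocks:
savings = 1 - relative_power(0.963, 1.025)
print(f"~{savings:.0%} less dynamic power")
```

Even a ~60 mV undervolt yields a double-digit percentage cut in dynamic power under this model, which is why "shave off a few W" is realistic without touching clocks.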

Incidentally, this was the same price as a standard one when I bought it off Amazon on March 21st, so it was a no brainer really.


----------



## DaedalusHelios (Apr 2, 2011)

Well, I guess I jumped to conclusions assuming it was a faulty design. Good to hear it was an ASUS thing and not an NV issue. Not much of a surprise, considering my luck with ASUS NV 7-series chipset boards: a 3:2 failure rate.


----------



## Boilerhog (Apr 6, 2011)

FLAME ON !  




newtekie1 said:


> Yes,
> and they can still overclock, they can even overvolt, something that AFAIK you still can't do on the HD6990.





So, with proper mature drivers and some decent aftermarket cooling, such as a water or phase-change setup, in all likelihood the Radeon 6990 will get POWNT.


FLAME OFF !


----------



## bpgt64 (Apr 13, 2011)

Been running mine overclocked for the past few weeks and have had zero issues.


----------



## newtekie1 (Apr 13, 2011)

Boilerhog said:


> So, with proper mature drivers and some decent aftermarket cooling, such as a water or phase-change setup, in all likelihood the Radeon 6990 will get POWNT.



Well, even at stock voltages, with stock cooling, it is already overclocking better than the HD6990 W1z reviewed.


----------



## TheMailMan78 (Apr 13, 2011)

Mussels said:


> Are we still talking about the 590, or about me and my April Fools' joke?



Nobody cared when you were dying. What makes you think anyone would be talking about you a week after you died?


----------



## Boilerhog (Apr 13, 2011)

If the 590 is going to be the fastest card on the planet, and it is after tweaks, we can still talk about it until something faster comes along. Frankly, it seems NVIDIA could have given it a little more love, but the 6990 is pretty much tapped out from the get-go. So let's see the non-reference designs from both teams now and give us some more to talk about. You talk like they have all died.   W1zz, as far as I can tell, went straight for what he thought was a usable setting, and it was, under the correct circumstances. Maybe we could toast a 6990 in a similar fashion to show it can be done. Sacrifice, anyone? lol. I would still like to see some overclocking results, though.
How does it feel to have a top six-core proc barely able to keep up with a midrange quad-core, anyway?


----------



## T3kl0rd (Apr 14, 2011)

Wow, AMD made a GPU that actually comes close to what NVIDIA created.  I wonder how close we are to a single AMD GPU getting this close to an NVIDIA card?


----------



## Melvis (Apr 19, 2011)

Did anyone find the easter egg?

I found this on a different forum, but it doesn't line up with what it shows now.

I can't be bothered going through all the posts to find out.


----------



## Cold Storm (Apr 19, 2011)

Melvis said:


> Did anyone find the easter egg?
> 
> I found this on a different forum, but it doesn't line up with what it shows now.
> 
> I can't be bothered going through all the posts to find out.



Yeah, the Easter egg was given away by Bta since no one had found it... I think about halfway through the thread.


----------



## Melvis (Apr 24, 2011)

Cold Storm said:


> Yeah, the Easter egg was given away by Bta since no one had found it... I think about halfway through the thread.



So it was never found, huh? I wonder what it was, after all that. Hmmm.


----------



## Mussels (Apr 24, 2011)

Melvis said:


> So it was never found, huh? I wonder what it was, after all that. Hmmm.



It was given away. It was told. No one found it.


The letters at the conclusion spelled out 'EPIC FAIL'.


----------



## Melvis (Apr 24, 2011)

Mussels said:


> It was given away. It was told. No one found it.
> 
> 
> The letters at the conclusion spelled out 'EPIC FAIL'.



So I did find it? Well, there ya go. Thanks, Mussels.


----------



## qubit (Apr 24, 2011)

Mussels said:


> It was given away. It was told. No one found it.
> 
> 
> The letters at the conclusion spelled out 'EPIC FAIL'.



I just looked at it and it's not there any more. Not sure why W1zz took it out; I think he should have left it in.


----------

