Monday, March 14th 2011
GeForce GTX 590 Key Features Revealed
An alleged partner presentation slide leaked to the internet reveals quite a bit more about the GeForce GTX 590 than what we already know. To begin with, it lays to rest speculation surrounding the shader configuration: each of the two GF110 GPUs has all 512 CUDA cores enabled. Next, the full width of the memory interface is used, giving you 1536 MB per GPU, or 3 GB of total memory on the card.
The rest are fascinating features: a removable cooler shroud that lets you clean the card now and then for the best cooling performance, heatsinks that use vapor-chamber technology and do away with those pesky heat-pipes, a high-grade 12-layer PCB that uses 2 oz copper layers, and a 10-phase VRM (which looks like 4+1 phases per GPU). As expected, the final iteration of the card draws power from two 8-pin PCI-E power connectors. "Barely Street Legal"? Is it because they'll throw you out of LAN parties for having too much of a performance advantage? Hmmm.
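For a rough sense of what two 8-pin connectors and a 384-bit bus per GPU imply, here is a minimal sketch. It assumes the usual PCI-E power delivery limits (75 W from the slot, 150 W per 8-pin) and 1 Gb (128 MB) GDDR5 chips on 32-bit channels; these are back-of-the-envelope assumptions, not NVIDIA specifications.

# Rough board-power and memory math for the rumored GTX 590 configuration.
# Assumes standard PCI-E power limits and 128 MB GDDR5 chips (sketch, not spec).
SLOT_POWER_W = 75          # PCI-E x16 slot
EIGHT_PIN_POWER_W = 150    # per 8-pin PCI-E connector
connectors = 2

power_ceiling_w = SLOT_POWER_W + connectors * EIGHT_PIN_POWER_W
print(f"Max board power within spec: {power_ceiling_w} W")   # 375 W

bus_width_bits = 384       # per GF110 GPU
chip_width_bits = 32       # one GDDR5 chip per 32-bit channel
chip_capacity_mb = 128     # assumed 1 Gb chips
gpus = 2

memory_per_gpu_mb = (bus_width_bits // chip_width_bits) * chip_capacity_mb
print(f"Memory per GPU: {memory_per_gpu_mb} MB, total: {gpus * memory_per_gpu_mb} MB")  # 1536 MB / 3072 MB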
Source:
XtremeSystems Forums
49 Comments on GeForce GTX 590 Key Features Revealed
1920x1200 with AA?
Seriously, 3 GB of memory is too little! They need to load it up with 6 GB, which is 1 GB more than they need per GPU.
Their design is so inefficient...
I can't see them winning against AMD at this power envelope... not in profits at least. Those dies are frigging huge!
When I read the line about throwing you out due to performance... I was thinking more along these lines: they'd throw your ass out over the second mortgage they'd need on their house to supply the juice every time you come over, or to upgrade the electric grid in the local area...
So we are talking dual full-blown 580 cores then. Should offer quite a bit of bang; here's hoping for not too much buck.
Anyway, the (ASUS) GTX 590 was priced at 7,395 SEK (w/ 25% VAT) = ~1,167 USD, or ~934 USD w/o VAT. In comparison, at the same e-tailer, the (ASUS) HD 6990 is priced at 6,299 SEK (w/ 25% VAT) = ~995 USD, or ~796 USD w/o VAT.
Again, the price may not be final and these are Swedish prices from the same e-tailer. Just thought I'd share since there was some speculation on price in this thread. :toast:
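As a quick sanity check on those conversions, here is a small sketch that strips the 25% Swedish VAT and converts to USD. The 0.158 USD/SEK exchange rate is an assumption chosen to match the quoted figures, not an official rate.

# Back-of-the-envelope check on the quoted Swedish prices.
VAT_RATE = 0.25
USD_PER_SEK = 0.158  # assumed early-2011 exchange rate

def usd_prices(price_sek_incl_vat):
    """Return (USD incl. VAT, USD excl. VAT) for a SEK price that includes 25% VAT."""
    excl_vat_sek = price_sek_incl_vat / (1 + VAT_RATE)
    return price_sek_incl_vat * USD_PER_SEK, excl_vat_sek * USD_PER_SEK

for name, sek in [("GTX 590 (ASUS)", 7395), ("HD 6990 (ASUS)", 6299)]:
    incl, excl = usd_prices(sek)
    print(f"{name}: ~{incl:.0f} USD incl. VAT, ~{excl:.0f} USD excl. VAT")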
:shadedshu
Given that the 580 and 6970 are pretty close on performance per watt as single GPUs, I think we'll end up with about the same performance from the two dual-GPU cards, because both are trying to stay under 375 watts.
(Personally, I think AMD *could* come out on top because the 6970 does have slightly better performance per watt than the 580, but I think the end result will actually come down more to binning than anything else.)
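A toy illustration of that argument: if both dual-GPU cards are capped at the same 375 W board power, expected performance scales roughly with performance per watt. The perf/W figures below are hypothetical placeholders, not measured values.

# Projected relative performance = perf-per-watt x shared board power cap (toy model).
BOARD_POWER_CAP_W = 375

cards = {
    "GTX 590 (2x GF110)": 1.00,   # hypothetical relative perf per watt
    "HD 6990 (2x Cayman)": 1.05,  # hypothetical: slightly better perf per watt
}

for name, perf_per_watt in cards.items():
    projected = perf_per_watt * BOARD_POWER_CAP_W
    print(f"{name}: projected relative performance ~{projected:.0f} units")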
I obviously agree with you and think you are quite smart.
I will not buy it if it's 900 bills, damn it!
I bet this is also one of the reasons EVGA and Galaxy put 460 cores in their dual solutions - they simply weren't allowed to use proper 560s because those would have stepped all over the 590's heels at almost half the price...
I hope I'm wrong but things are starting to smell very fishy...
A pair of 560s OCed:
"Subjective obtained GPU power consumption = ~ 298 Watts"
A 6990:
Notice how they both produce 102 fps @ 19x12 in BC2. The only reason the 560s fall back at 25x16 is their 1 GB memory buffer.
Edit: I'd use W1z's charts, but he doesn't seem to have an overclocked SLI review.
ATI and NVIDIA have had very close average performance before.
It was only during the GeForce 8 era that NVIDIA pulled ahead, because ATI fucked up quite badly with their HD 2000 series.
Six months ago nobody would have guessed NVIDIA could pull this off, given the state GF100 was in: a single hot, hungry GPU with 480 SPs. This beast packs 1024, and yes, it will consume more power, but really, that's a feat unto itself.