Friday, March 11th 2011

EVGA Unveils GeForce GTX 460 2WIN Dual-GPU Graphics Card

It looks like EVGA isn't waiting for the GeForce GTX 590 and is releasing its own dual-GPU graphics card to challenge the Radeon HD 6990. Arriving about 12 days ahead of the GTX 590, EVGA's new GeForce GTX 460 2WIN could still be a tad late to the market, considering it was first shown to the world back in January at this year's CES. The EVGA GTX 460 2WIN is a dual-GPU graphics card that uses two GeForce GTX 460 GPUs with 1 GB of memory each, for an SLI-on-a-stick solution.

The EVGA GTX 460 2WIN (2WIN sounds like "twin") uses clock speeds of 700 MHz on the core and 900 MHz (3600 MHz GDDR5 effective) on the memory. Each GF104 chip has 336 CUDA cores enabled, bringing the total CUDA core count to 672. The card is cooled by an in-house EVGA cooler: a large heatsink ventilated by three 80 mm fans. Power is drawn from two 8-pin PCI-E power connectors. Display outputs include three DVI and one mini-HDMI, and you can run a 3-display NVIDIA 3D Vision Surround setup with just one of these cards, without needing a second one.
EVGA also released its own performance figures for the card, putting it through 3DMark 11 and Unigine Heaven, where it scored about 5~10% higher than the EVGA GeForce GTX 580. EVGA did not disclose pricing or availability.
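
For reference, the headline numbers work out as follows. This is a quick sketch, not EVGA data: the 256-bit memory bus per GPU is assumed from the reference GTX 460 1 GB design and is not stated in the release.

    # Quick sanity check of the quoted specs (Python).
    cuda_cores_per_gpu = 336
    gpus = 2
    print("Total CUDA cores:", cuda_cores_per_gpu * gpus)      # 672

    mem_clock_mhz = 900                # command clock quoted above
    effective_mhz = mem_clock_mhz * 4  # GDDR5 transfers 4x per clock -> 3600 MHz effective
    bus_width_bits = 256               # assumption: reference GTX 460 1 GB bus width
    bandwidth_gbs = effective_mhz * 1e6 * bus_width_bits / 8 / 1e9
    print("Per-GPU memory bandwidth: ~%.1f GB/s" % bandwidth_gbs)   # ~115.2 GB/s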

50 Comments on EVGA Unveils GeForce GTX 460 2WIN Dual-GPU Graphics Card

#26
sneekypeet
not-so supermod
(FIH) The Donno....can.....look at the pictures....why would there be an SLI finger if you couldn't SLI it?
I think post 11 covers it. Looks to me like they threw a bunch of older parts together to make money off of EOL cores. The PCB likely already had the finger, so they left it ;)

...and I got ninja'd ;)
#27
(FIH) The Don
sigh :shadedshu

very bad eVGA :nutkick::banghead:

kinda like Corsair making their top PSUs' 8-pin that can't reach the motherboard connector inside the 800D
#28
sneekypeet
not-so supermod
(FIH) The Donkinda like Corsair making their top PSUs' 8-pin that can't reach the motherboard connector inside the 800D
Easy:roll:
#29
{uZa}DOA
Razi3lAn EVGA rep confirmed on the forums that this does not have quad-SLI support, meaning you can't have 2 of these. The reason the SLI connector was left on was that they were eager to get this out. Sloppy.
Total fail in my opinion... No SLI /facepalm... :banghead:
#31
pentastar111
This card would be of no use to me...gotta have 2 for NV Surround. Strange decision over at EVGA's camp to put this cool card out and not have it support triple-screen gaming. :confused:
#32
Red_Machine
Dude, it has three friggen DVI connectors. Of course it supports 3 monitors!
#33
pentastar111
Red_MachineDude, it has three friggen DVI connectors. Of course it supports 3 monitors!
I am a tired moron. I read the article again; don't know how I missed that bit of info the first time. lol. I am going to sleep.
#34
JrRacinFan
Served 5k and counting ...
Good card, bad timing. My rebuttal to the card: the GTX 570. Another point to put out: a decent-quality 650 W+ power supply would be ideal with this, whereas with a stock GTX 570 a decent-quality 450 W+ power supply would be more than adequate (single-card use).
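
A rough sense check of that recommendation (all wattages below are assumed/typical figures, not EVGA's): a reference GTX 460 1 GB is rated around 160 W, so an overclocked pair plus the rest of a typical system still fits comfortably inside a quality 650 W unit.

    # Rough PSU headroom estimate; every number here is an assumption, not an official rating.
    gpu_board_power_w = 160   # assumed reference GTX 460 1 GB board power
    gpus = 2
    rest_of_system_w = 200    # assumed CPU, motherboard, drives and fans under load
    load_w = gpu_board_power_w * gpus + rest_of_system_w
    psu_w = 650
    print("Estimated load: %d W, about %.0f%% of a %d W unit" % (load_w, 100.0 * load_w / psu_w, psu_w))
    # -> Estimated load: 520 W, about 80% of a 650 W unit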
#35
oily_17
If it is easy to set up and get both cores Folding, without any bugs, then I will definitely be interested in these cards.
#36
newtekie1
Semi-Retired Folder
Everyone complaining about it being a GTX460 instead of a GTX560 is missing the point. The GTX560 only gives a very minor performance improvement when both are clocked the same (they are the exact same core with a few extra shaders enabled on the GTX560), and the GTX560 core is pin-compatible with the GTX460 core. So when they run out of GTX460 cores, they can just slap on some GTX560 cores and call it a new card. No re-engineering required on their end.
JrRacinFanGood card, bad timing. My rebuttal to the card: the GTX 570. Another point to put out: a decent-quality 650 W+ power supply would be ideal with this, whereas with a stock GTX 570 a decent-quality 450 W+ power supply would be more than adequate (single-card use).
Except this card will outperform a GTX570...

Unfortunately I know eVGA all too well (well, actually I know the graphics card industry all too well), so I know this card will likely cost more than a GTX580.
#37
br4dz
Card is ugly as sin.
#38
Crap Daddy
An EVGA rep said on their forums that the price will be under GTX580.
#39
ToTTenTranz
newtekie1Everyone complaining about it being a GTX460 instead of a GTX560 is missing the point. The GTX560 only gives a very minor performance improvement when both are clocked the same (they are the exact same core with a few extra shaders enabled on the GTX560), and the GTX560 core is pin-compatible with the GTX460 core. So when they run out of GTX460 cores, they can just slap on some GTX560 cores and call it a new card. No re-engineering required on their end.
The extra "SM" activated in the GTX560 and the higher clocks don't come for free, especially since both chips are made on the same 40 nm process.
Accordingly, the GTX560 has a higher TDP than the GTX460, so changing to dual GTX560s would require some re-engineering of the power regulation and probably the heat dissipation.

Nonetheless, with only 1GB per GPU available, this card will choke on memory quite often, given that its "main feature" is to run a multi-monitor setup, even more if it's doing 3D (twice the framebuffer memory needed).
Using the 24 ROP version with (768*2) 1536MB for each GPU would have been a smarter choice, IMO.
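
A very rough back-of-envelope supports that concern. Everything below (resolution, 4x MSAA, 32-bit color and depth, stereo rendering) is an illustrative assumption; real usage adds textures, geometry and driver overhead on top.

    # Approximate render-target footprint for a 3x 1920x1080 surround setup in stereo 3D.
    width, height = 3 * 1920, 1080
    bytes_per_pixel = 4 + 4        # assumed 32-bit color + 32-bit depth/stencil
    msaa_samples = 4               # assumed 4x MSAA
    eyes = 2                       # 3D Vision renders one view per eye
    mb = width * height * bytes_per_pixel * msaa_samples * eyes / (1024.0 ** 2)
    print("Render targets alone: ~%.0f MB of the 1024 MB per GPU" % mb)
    # -> roughly 380 MB before any textures are loaded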
#40
newtekie1
Semi-Retired Folder
ToTTenTranzThe extra "SM" activated in the GTX560 and the higher clocks don't come for free, especially since both chips are made on the same 40 nm process.
Accordingly, the GTX560 has a higher TDP than the GTX460, so changing to dual GTX560s would require some re-engineering of the power regulation and probably the heat dissipation.
Yes, but they are using overclocked GTX460 cores, and likely a power-regulation and heatsink design that could easily handle stock-clocked GTX560 cores. The extra shaders don't really add that much; it was the high stock clock speed, and the extra voltage needed to ensure that speed was stable, that resulted in the higher TDP and the need for better power regulation.
ToTTenTranzNonetheless, with only 1GB per GPU available, this card will choke on memory quite often, given that its "main feature" is to run a multi-monitor setup, even more if it's doing 3D (twice the framebuffer memory needed).
Using the 24 ROP version with (768*2) 1536MB for each GPU would have been a smarter choice, IMO.
I wouldn't call that its main feature; its main feature is being a fast-ass card. I wouldn't be surprised if we see 2GB on the GTX560 version though. ;)
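
The TDP point above follows the usual first-order dynamic-power relation, power scaling with frequency times voltage squared. The clocks and the voltage bump below are assumed round numbers for illustration, not measured values.

    # First-order dynamic power scaling: P ~ f * V^2 (all inputs are assumptions).
    f_460, v_460 = 675.0, 1.00   # assumed reference GTX 460 core clock (MHz), relative voltage
    f_560, v_560 = 822.0, 1.05   # assumed reference GTX 560 Ti clock and a ~5% voltage bump
    scale = (f_560 / f_460) * (v_560 / v_460) ** 2
    print("Clocks and voltage alone imply ~%.0f%% more dynamic power" % ((scale - 1) * 100))
    # -> ~34% from clocks and voltage, before counting the extra shaders at all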
#41
ToTTenTranz
newtekie1I wouldn't call that its main feature; its main feature is being a fast-ass card.
Well you wouldn't, but EVGA certainly does.
Check out the product's webpage:

4 Monitors. 3D Surround. 2 GPUs. One graphics card -> That's their tag line.

And then check the specs for those benchmarks they took against the GTX580.
3DMark11 is running with the "Performance" settings and the Unigine bench is running at 1440x900.
If they had to lower the quality settings that much to get a performance lead over a GTX580, then this 2WIN card is clearly lacking in memory.
newtekie1I wouldn't be surprised if we see 2GB on the GTX560 version though.;)
If that happens, then the card will definitely be more expensive than a single GTX580 (but also substantially faster in almost every situation, of course).
#42
ebolamonkey3
ToTTenTranzWell you wouldn't, but EVGA certainly does.
Check out the product's webpage:

4 Monitors. 3D Surround. 2 GPUs. One graphics card -> That's their tag line.

And then check the specs for those benchmarks they took against the GTX580.
3DMark11 is running with the "Performance" settings and the Unigine bench is running at 1440x900.
If they had to lower the quality settings that much to get a performance lead over a GTX580, then this 2WIN card is clearly lacking in memory.

If that happens, then the card will definitely be more expensive than a single GTX580 (but also substantially faster in almost every situation, of course).
Yea I really don't think EVGA thought this one through.

GTX 460 SLI is pretty even with a single GTX 580 in terms of performance. It's got faster avg. fps in most games, but also lower min fps, at around 1920x1200 resolution. The only plus is providing the same performance as a GTX 580 for substantially cheaper (~$350 vs $500).

The GTX 460 1GB has already been shown to be bottlenecked by its memory size at 2560x1600 (4 million pixels), so by not putting 2GB (per GPU) of RAM on this card and then marketing it as a solution for multi-monitor setups, EVGA has set this card up to fail. The most common surround setups are 3x 1680x1050 (5.3 million pixels) and 3x 1920x1200 (6.9 million pixels), so this card will definitely be limited by the RAM. That's not even counting 3x 30" setups; whatever cards power those monitors need to render over 12 megapixels! :eek:
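
The pixel counts quoted above check out; this is just the arithmetic, with the single 30" panel added for comparison.

    # Pixel counts for the setups mentioned above.
    setups = {
        "3x 1680x1050": 3 * 1680 * 1050,   # ~5.3 million
        "3x 1920x1200": 3 * 1920 * 1200,   # ~6.9 million
        "3x 2560x1600": 3 * 2560 * 1600,   # ~12.3 million
        "1x 2560x1600": 2560 * 1600,       # ~4.1 million
    }
    for name, pixels in sorted(setups.items()):
        print("%s: %.1f megapixels" % (name, pixels / 1e6))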

Also, since this card does not support quad-SLI, there is no upgrade path. This was the one thing that would have made the 2Win a win (har har) over a pair of regular GTX 460s in SLI. Right now a pair of GTX 460s will only cost you $320 (Cheapest on Newegg, not counting the ridiculous Galaxy pair that's $250 after rebate), and will probably perform better than this card. So unless someone doesn't have the PCI-E slot to spare, that would be a better solution. Though to be fair, it's probably mostly Nvidia's fault rather than EVGA's. The SLI connector's there so technically the capability's there, but Nvidia probably won't ever write the drivers for it :(

If I had to guess, EVGA probably designed this card with 2Purposes (hee hee): one, to get rid of excess GTX 460 cores, and two, to build a dual-GPU card that caters to a larger market than the GTX 590. With the GTX 590's limited supply and thin profit margin, EVGA is probably only making them due to their close relationship with Nvidia. MSI was supposed to be another partner producing the GTX 590, but turned it down because of low supply and low margins. So now we get EVGA, Asus, and Inno3D :rolleyes: But this card will be obtainable by a much bigger market than the GTX 590, and now EVGA has built a good foundation for a GTX 560 2Win in the future.
#43
cdawall
where the hell are my stars
Too bad it's too long for my mini-ITX case :/
#44
yogurt_21
Curious whether this will end up faster than the 5970, and if it is, by how much?

I mean, shoot, we were all crying for this card when the GF104 was released, and yet now it comes out with the 590 right on its heels.

I'd expect this to sell just about as well as the dual X1950 card did.
#45
newtekie1
Semi-Retired Folder
yogurt_21Curious whether this will end up faster than the 5970, and if it is, by how much?

I mean, shoot, we were all crying for this card when the GF104 was released, and yet now it comes out with the 590 right on its heels.

I'd expect this to sell just about as well as the dual X1950 card did.
Well two GTX460s in SLI are just under an HD5970, and this card is overclocking the GTX460s slightly, so I figure the card will end up being pretty close to dead on with the performance of the HD5970.

I think the performance gap, and hopefully the price gap, will be big enough that the GTX590 won't really hurt this card all that much. If it ends up being true that this card is priced below a GTX580, and it does end up performing better, it might be a winner. It is a little different from when the dual X1950 cards came out, because we really haven't made a huge leap in technology since. With the X1950 cards, we were transitioning from DX9 to DX10, so the DX10 cards made the dual X1950 cards obsolete not only performance-wise but technology-wise as well. In the case of this card, it isn't obsolete technology-wise; since the GF104 core is pretty much identical to the "new" GF114 core, there is really nothing new with the "next generation" cards other than a higher number in the name.
#46
Bjorn_Of_Iceland
erixxI like your pic much more now Bjorn, hahahahah what an irony.
I did it for teh lulz
Crap DaddyAn EVGA rep said on their forums that the price will be under GTX580.
wow just wow
#47
xXxBREAKERxXx
I can't understand it. Why are they bothering with old models?
#48
newtekie1
Semi-Retired Folder
xXxBREAKERxXxI can't understand it. Why are they bothering with old models?
Because there really is no difference between the old models and the new models technology-wise.
#49
wolf
Better Than Native
This looks like a very promising card at the right price! The GF104/114 core scales beautifully in tandem and gains very well from overclocking too.

I hope in the coming months to see this revamped with at least GTX560 Ti cores and clocks, if not higher clocks, and fingers crossed for 2+2 GB of memory, but I wouldn't count on it.
#50
Dave65
Too bad it doesn't support SLI...