Friday, March 11th 2011
Galaxy Designs its Own Dual GeForce GTX 460 Graphics Card
After EVGA's release of its GeForce GTX 460 2WIN dual-GPU graphics card earlier this week, Galaxy wasn't going to sit back. The company rushed in with pictures of its own dual GeForce GTX 460 graphics card. The card is still in the works, and Galaxy was only able to display its PCB. Galaxy chose a milky-white PCB that carries two GF104 cores in an internal SLI configuration, powered by strong VRM circuitry. Each GPU has 336 CUDA cores and is wired to 1 GB of GDDR5 memory over a 256-bit wide memory interface.
Further, both display outputs of each GPU are wired out as DVI connectors, making the card a single-piece 3D Vision Surround solution. Interestingly, Galaxy chose a bridge chip other than nForce 200 to run this SLI-on-a-stick solution. If Galaxy's implementation clicks, then every AIC partner with its own R&D could work on its own dual-GPU cards; currently, they're held back by the non-supply of nForce 200.
Source:
Expreview
48 Comments on Galaxy Designs its Own Dual GeForce GTX 460 Graphics Card
And because of the way SLI/Crossfire works, a dual GPU card with 2GB of RAM can only effectively use 1GB, because whatever is being stored in RAM has to be repeated for each GPU.
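A rough sketch of the point being made, assuming alternate-frame rendering where each GPU mirrors the same working set (the function name and numbers are illustrative, not from the article):

```python
# In SLI/CrossFire alternate-frame rendering, each GPU holds its own full
# copy of textures and buffers, so per-GPU memory does not add up.
def effective_vram(per_gpu_mb: int, num_gpus: int) -> int:
    """Usable VRAM is one GPU's pool, not the sum across GPUs."""
    # total_advertised = per_gpu_mb * num_gpus  # what the box says
    return per_gpu_mb  # data is mirrored on every GPU

print(effective_vram(1024, 2))  # a "2 GB" dual-GTX 460 card -> 1024 MB usable
```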
However, I think people are overreacting to the 1GB of RAM per GPU. There are only a few games on the market right now that hit that maximum (Metro 2033 @ 2560x1600+ w/AA is the only one I can think of), and it doesn't seem like we will see any games in the near future that will need more than 1GB of RAM, even with a multiple-monitor setup. The people already running SLI GTX 460/560s with 1GB of RAM don't really seem to be having much of an issue.
A basic rule of thumb is that the higher the resolution, the more memory is required. I'm not going to solve for the exact equation, but in general it applies.
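The rule of thumb above can be sketched with back-of-envelope numbers (my own assumptions, not from the comment): framebuffer cost grows linearly with pixel count, and MSAA multiplies the samples stored per pixel. Textures and geometry come on top of this, so real usage is much higher.

```python
# Illustrative framebuffer-memory estimate: linear in resolution,
# multiplied by the MSAA sample count.
def framebuffer_mb(width: int, height: int, msaa: int = 1,
                   bytes_per_pixel: int = 4, buffers: int = 3) -> float:
    """Estimate color+depth buffer memory in MB at a given resolution.

    buffers=3 assumes double-buffered color plus a depth buffer.
    """
    return width * height * msaa * bytes_per_pixel * buffers / (1024 ** 2)

print(round(framebuffer_mb(1920, 1080), 1))           # ~23.7 MB, 1080p no AA
print(round(framebuffer_mb(2560, 1600, msaa=4), 1))   # ~187.5 MB, 2560x1600 4xAA
```

Even the 4xAA 2560x1600 case is well under 1GB on its own; it's the textures that eat the rest of the pool.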
I don't care if this thing has 4GB of onboard memory, I'm not interested in buying last year's backfill (anything released after the initial new lineup). Especially since the new 590 will be out soon, which means a dual 500-series card is more appropriate than a dual 400.
I would love this if it were dual 560's.
And calling it last year's backfill is poor on your part; there is nothing new with the GTX 560. It uses the same GPU as the GTX 460, it just has all the shaders enabled.
Heck, I even thought of buying a Radeon X2 and I have an SLI board. Only reason is because at the time I didn't like the NV offerings. Being limited to SLI yet wanting CrossFire sucks. The X2s solved that issue. Of course I never got one; the price was always prohibitive for me. Heck, sometimes they'd be more expensive than two cards.
But If I was on a Crossfire board and wasn't satisfied with ATI's offerings, a dual 460 would seem like a good deal.
Ok, it's not backfill, but it's damn close. The 460s were great in SLI, but I just don't see the purpose of using older chips on the bleeding edge of hardware. Tell me it will be worth it now, and then hold true when the benchmarks roll in.
Oh, and plenty more games at lesser resolutions use 1 gigabyte of RAM.
And whatever game with whatever settings you are playing that uses more than 1GB of RAM wouldn't run at anything playable if you threw 3GB of RAM per GPU on the card.
Then what are the outputs connected to... GPU or Bridge or Something Else(like RAM)?
It perplexes me.
And, again, the GTX 560 is pin-compatible with the GTX 460. That means when they run out of GTX 460 cores, they can just start putting GTX 560 cores on, call it a new card without changing anything on the PCB, and suddenly add yet another SKU to the product list. The new card will show a performance increase over the old one using GTX 460 cores, more idiots will run out to buy this "new" awesome card, and it costs the manufacturer absolutely nothing in R&D.
It's not really outdated tech. The 460 was the best of the 2010 Fermis, IMHO. They can still hold their own against 560s if they are clocked up. But clocked 560s will of course blow them away.
1GB =
lol