Friday, March 11th 2011

Galaxy Designs its Own Dual GeForce GTX 460 Graphics Card

After EVGA's release of its GeForce GTX 460 2WIN dual-GPU graphics card earlier this week, Galaxy wasn't going to sit back. The company rushed in with pictures of its own dual-GeForce GTX 460 graphics card. The card is still in the works, and Galaxy was only able to display its PCB. Galaxy chose a milky-white PCB, which carries two GF104 cores in an internal SLI configuration, powered by strong VRM circuitry. Each GPU has 336 CUDA cores and is wired to 1 GB of GDDR5 memory over a 256-bit wide memory interface.

Further, both display outputs of each GPU are wired out in the form of DVI connectors, making the card a single-piece 3D Vision Surround solution. Interestingly, Galaxy chose a bridge chip other than the nForce 200 to run this SLI-on-a-stick solution. If Galaxy's implementation clicks, then every AIC partner with its own R&D could work on its own dual-GPU cards; currently, they're held back by the short supply of the nForce 200.
Source: Expreview

48 Comments on Galaxy Designs its Own Dual GeForce GTX 460 Graphics Card

#26
overclocking101
"Each GPU is wired to 1 GB GDDR5": that means the card has 2 GB on it. Has anyone read the HD 4870 X2 specs? Each GPU on that card is wired to 1 GB of RAM, and that card has 2 GB on it. I mean, seriously.
Posted on Reply
#27
newtekie1
Semi-Retired Folder
overclocking101"Each GPU is wired to 1 GB GDDR5": that means the card has 2 GB on it. Has anyone read the HD 4870 X2 specs? Each GPU on that card is wired to 1 GB of RAM, and that card has 2 GB on it. I mean, seriously.
Yes, but the HD 4870 X2 isn't a DX11 card, and is a generation old at this point (arguably two, but I won't say that).

And because of the way SLI/Crossfire works, a dual GPU card with 2GB of RAM can only effectively use 1GB, because whatever is being stored in RAM has to be repeated for each GPU.
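The mirrored-memory point above can be sketched in a few lines of Python. The function and numbers are purely illustrative (not from any real driver API); they just show advertised vs. usable VRAM under alternate-frame rendering:

```python
# Sketch of why a dual-GPU card's usable VRAM is half its advertised total.
# In SLI/CrossFire alternate-frame rendering, each GPU keeps its own full
# copy of textures and buffers, so memory is mirrored rather than pooled.

def board_vram_mb(per_gpu_mb: int, gpu_count: int) -> tuple[int, int]:
    """Return (advertised, effectively usable) VRAM in MB."""
    advertised = per_gpu_mb * gpu_count  # what the box says
    usable = per_gpu_mb                  # mirrored: capped at one GPU's pool
    return advertised, usable

advertised, usable = board_vram_mb(per_gpu_mb=1024, gpu_count=2)
print(f"advertised: {advertised} MB, usable: {usable} MB")
# prints "advertised: 2048 MB, usable: 1024 MB"
```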

However, I think people are overreacting to the 1 GB of RAM per GPU. There are only a few games on the market right now that hit that maximum (Metro 2033 at 2560x1600+ with AA is the only one I can think of), and it doesn't seem like we will see any games in the near future that will need more than 1 GB of RAM, even with a multiple-monitor setup. The people already running SLI with GTX 460/560s with 1 GB of RAM don't really seem to be having much of an issue.
Posted on Reply
#28
mlee49
GTA had adjustments to the settings that showed the memory usage. At 1200p it can easily eat 1 GB. My old 275s in SLI had a hard time at some spots.

A basic rule of thumb: the higher the resolution, the more memory required. I'm not solving for the exact equation, but in general it applies.
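As a rough illustration of that rule of thumb, here is a back-of-envelope Python estimate of framebuffer cost by resolution. The buffer counts are assumptions for the sketch; real VRAM usage is dominated by textures and varies per game and driver:

```python
# Rough framebuffer cost at various resolutions. Real VRAM usage is
# dominated by textures, so treat these numbers as a lower bound only.

def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4,
                   buffers: int = 3, msaa: int = 1) -> float:
    """Approximate MB for the render targets at a given resolution.
    buffers=3 loosely covers front/back/depth; msaa scales sample storage."""
    return width * height * bytes_per_pixel * buffers * msaa / (1024 ** 2)

for w, h in [(1280, 1024), (1920, 1200), (2560, 1600)]:
    print(f"{w}x{h}: ~{framebuffer_mb(w, h):.0f} MB plain, "
          f"~{framebuffer_mb(w, h, msaa=4):.0f} MB with 4x AA")
```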

I don't care if this thing has 4 GB of onboard memory, I'm not interested in buying last year's backfill (anything released after the initial new lineup). Especially since the new 590 will be out soon, which means a dual 500 series card is more appropriate than a dual 400.

I would love this if it were dual 560s.
Posted on Reply
#29
DarkOCean
GTA IV shows that it eats 1.6 GB even at 1280x1024; Crysis 1 eats 1.3 GB at 1080p.
Posted on Reply
#30
ebolamonkey3
_JP_Unless you can't read:
The card is still in the design stages right now. It's definitely possible to just put more RAM on there.
Posted on Reply
#31
ebolamonkey3
mlee49GTA had adjustments to the settings that showed the memory usage. At 1200p it can easily eat 1 GB. My old 275s in SLI had a hard time at some spots.

A basic rule of thumb: the higher the resolution, the more memory required. I'm not solving for the exact equation, but in general it applies.

I don't care if this thing has 4 GB of onboard memory, I'm not interested in buying last year's backfill (anything released after the initial new lineup). Especially since the new 590 will be out soon, which means a dual 500 series card is more appropriate than a dual 400.

I would love this if it were dual 560s.
Clock for clock, the 560 is only 10% faster though...
Posted on Reply
#32
newtekie1
Semi-Retired Folder
mlee49GTA had adjustments to the settings that showed the memory usage. At 1200p it can easily eat 1 GB. My old 275s in SLI had a hard time at some spots.

A basic rule of thumb: the higher the resolution, the more memory required. I'm not solving for the exact equation, but in general it applies.

I don't care if this thing has 4 GB of onboard memory, I'm not interested in buying last year's backfill (anything released after the initial new lineup). Especially since the new 590 will be out soon, which means a dual 500 series card is more appropriate than a dual 400.

I would love this if it were dual 560s.
Yes, but at those settings the card(s) would choke anyway regardless of how much RAM was available, so there isn't much point in that argument.

And calling it last year's backfill is poor on your part; there is nothing new with the GTX 560. They are using the same GPU as the GTX 460, it just has all the shaders enabled.
Posted on Reply
#33
NC37
DarkOCeanWho in their right mind will pay more for this than two separate GTX 460s? This should be like $300-350 at most to sell.
People that don't have an SLI board, or just boards with one x16 slot.

Heck, I even thought of buying a Radeon X2, and I've got an SLI board. The only reason is 'cause at the time I didn't like the NV offerings. Being limited to SLI yet wanting CrossFire sucks. The X2s solved that issue. 'Course I never got one; the price was always prohibitive for me. Heck, sometimes they'd be more expensive than two cards.

But if I was on a CrossFire board and wasn't satisfied with ATI's offerings, a dual 460 would seem like a good deal.
Posted on Reply
#34
mlee49
newtekie1Yes, but at those settings the card(s) would choke anyway regardless of how much RAM was available, so there isn't much point in that argument.

And calling it last year's backfill is poor on your part; there is nothing new with the GTX 560. They are using the same GPU as the GTX 460, it just has all the shaders enabled.
The heatsink's new[er]

Ok, it's not backfill, but it's damn close. The 460s were great in SLI, but I just don't see the purpose of using older chips at the bleeding edge of hardware. Tell me it will be worth it now, and then let it hold true when the benchmarks roll in.

Oh, and plenty more games, at lesser resolutions, use 1 gigabyte of RAM.
Posted on Reply
#35
newtekie1
Semi-Retired Folder
mlee49The heatsink's new[er]

Ok, it's not backfill, but it's damn close. The 460s were great in SLI, but I just don't see the purpose of using older chips at the bleeding edge of hardware. Tell me it will be worth it now, and then let it hold true when the benchmarks roll in.

Oh, and plenty more games, at lesser resolutions, use 1 gigabyte of RAM.
Again, they aren't any older than the chips on the GTX560 cards.

And whatever game with whatever settings you are playing that uses more than 1GB of RAM wouldn't run at anything playable if you threw 3GB of RAM per GPU on the card.
Posted on Reply
#36
balanarahul
ToTTenTranzA PCI-Express bridge has always been necessary to connect multiple GPUs on one card. It's not there to make the GPUs communicate with each other; it's there to receive the data from both GPUs and transmit it over a single PCI-Express connection (to the motherboard).
Kind of a multiplexer, or mux, if you're familiar with electronics terms.

Besides, the "aesthetics of the PCB" is hardly a main concern, nor should it ever be. That thing is supposed to go inside a case, for god's sake. Want aesthetics? Buy a painting, or a designer lamp.
Hmm...
Then what are the outputs connected to: the GPU, the bridge, or something else (like the RAM)?
Posted on Reply
#37
HalfAHertz
newtekie1Again, they aren't any older than the chips on the GTX560 cards.

And whatever game with whatever settings you are playing that uses more than 1GB of RAM wouldn't run at anything playable if you threw 3GB of RAM per GPU on the card.
Didn't the GF114 get some extra loving from its GF110 daddy? Like advanced Z-culling and optimized 64-bit/FP16 texturing?
Posted on Reply
#38
Chewers
Pretty much the same as the EVGA GTX 460 2WIN (shown at CES and PAX East 2011), only the EVGA can run SLI, and it's so sweet.
Posted on Reply
#39
newtekie1
Semi-Retired Folder
HalfAHertzDidn't the GF114 get some extra loving from its GF110 daddy? Like advanced Z-culling and optimized 64-bit/FP16 texturing?
I don't believe so. I believe GF104 already had the special sauce tweaks to Fermi that nVidia was going to implement.
ChewersPretty much the same as the EVGA GTX 460 2WIN (shown at CES and PAX East 2011), only the EVGA can run SLI, and it's so sweet.
Unfortunately the eVGA card can't run in SLI with another. It was posted on the eVGA forums that the SLI connector was just there from initial designs, but it doesn't actually work.
Posted on Reply
#40
HalfAHertz
newtekie1I don't believe so. I believe GF104 already had the special sauce tweaks to Fermi that nVidia was going to implement.



Unfortunately the eVGA card can't run in SLI with another. It was posted on the eVGA forums that the SLI connector was just there from initial designs, but it doesn't actually work.
If that's the case, some brave and fearless soul should try to put a 560 bios on a 460 and see if all the shaders are there :p
Posted on Reply
#41
mlee49
newtekie1Again, they aren't any older than the chips on the GTX560 cards.

And whatever game with whatever settings you are playing that uses more than 1GB of RAM wouldn't run at anything playable if you threw 3GB of RAM per GPU on the card.
Then why didn't they release it as a dual 560 card? If it's common knowledge the 460 and 560 chips are practically the same, why would anyone release a 400 series card at this point?

It perplexes me.
Posted on Reply
#42
HalfAHertz
mlee49Then why didn't they release it as a dual 560 card? If it's common knowledge the 460 and 560 chips are practically the same, why would anyone release a 400 series card at this point?

It perplexes me.
It was said already. Not only can they clear old stock like this, but they can market it above MSRP because it is a unique niche product; ergo the 2WIN: they win twice :D
Posted on Reply
#43
mlee49
HalfAHertzIt was said already. Not only can they clear old stock like this, but they can market it above MSRP because it is a unique niche product; ergo the 2WIN: they win twice :D
To me they lose twice...
Posted on Reply
#44
HalfAHertz
mlee49To me they lose twice...
This is why you don't run a multi-million-dollar electronics manufacturing company :p
Posted on Reply
#45
newtekie1
Semi-Retired Folder
mlee49Then why didn't they release it as a dual 560 card? If it's common knowledge the 460 and 560 chips are practically the same, why would anyone release a 400 series card at this point?

It perplexes me.
Because the people in the know, who will be buying this card, know what the GTX 460 is still capable of, and that the GTX 500 series is nothing more than a revision of the previous GPUs that shouldn't have been named as a new generation. This card will more than likely outperform a GTX 580, and might actually outperform an HD 5970.

And, again, the GTX 560 is pin-compatible with the GTX 460. That means when they run out of GTX 460 cores, they just start putting GTX 560 cores on, call it a new card without changing anything on the PCB, and suddenly add yet another SKU to the new product list. The new card will show a performance increase over the old one using GTX 460 cores, more idiots will run out to buy this "new" awesome card, and it costs the manufacturer absolutely nothing in R&D.
Posted on Reply
#46
NC37
Yep, although I'd like to see them slash the price of the 460s at least once more before they completely turn them all into twins.

It's not really outdated tech. The 460 was the best of the 2010 Fermis, IMHO. They can still hold their own against 560s if they're clocked up. But overclocked 560s will of course blow them away.
Posted on Reply
#48
alucasa
I wanna see GTS 530 X2.

lol
Posted on Reply