Wednesday, January 20th 2010

Galaxy Designs Dual-Core GeForce GTS 250 Graphics Accelerator

Whereas NVIDIA is inching towards releasing its DirectX 11 compliant GF100 GPU, from which it has high expectations, some of its partners haven't given up on the two-generation-old G92. Galaxy has almost finished designing a graphics card that uses two GeForce GTS 250 GPUs on one PCB, in an SLI-on-a-card setup. Galaxy refers to the card as the "DualCore GTS250". It consists of a long blue PCB holding two G92-426 GPUs (the G92-426 is used on low-power variants of the GTS 250).

Each GPU is wired to 1 GB of GDDR3 memory across its 256-bit wide memory interface (2 GB total on board). The two GPUs connect to the system through an nForce 200 (BR-03) bridge chip. The multi-GPU system is contained within the board, with no SLI fingers for expansion, even though G92 chips are Quad SLI capable.
The card draws power from two 6-pin PCI-E power connectors, regulated by a 9-phase digital PWM circuit. From a technical standpoint, this isn't the first implementation of its kind: the two-year-old NVIDIA GeForce 9800 GX2 uses two 65 nm G92 chips in a similar arrangement, and is Quad SLI capable. There is no word on when Galaxy will release this card.
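For context on the memory figures above, peak bandwidth per GPU follows directly from bus width and effective data rate. A minimal sketch of that arithmetic; the 2,200 MT/s GDDR3 data rate is the reference GTS 250 spec and is assumed here, since Galaxy has not published clocks for this card:

```python
def mem_bandwidth_gbps(bus_width_bits: int, data_rate_mtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return (bus_width_bits / 8) * data_rate_mtps * 1e6 / 1e9

# 256-bit bus at 2,200 MT/s effective (assumed reference GTS 250 memory clock)
per_gpu = mem_bandwidth_gbps(256, 2200)
print(f"{per_gpu:.1f} GB/s per GPU")  # 70.4 GB/s per GPU
```

With two GPUs the board's aggregate is double that, though each GPU can only touch its own 70.4 GB/s pool.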
Source: Expreview

60 Comments on Galaxy Designs Dual-Core GeForce GTS 250 Graphics Accelerator

#1
mlee49
Not bad: one for primary, one for PhysX.

Wonder how well it could fold?
#2
Imsochobo
This is essentially a 9800GX2.

8800GTX (384-bit) -> 9800GTX (256-bit, higher core/shader clock) -> 9800GTX+ (higher clocks) -> GTS 250

Doesn't this mean we've got just another 9800GX2 on our hands, since it's based on the 9800GTX(+)?
#3
newtekie1
Semi-Retired Folder
Wish they would have put an SLI finger on there for quad-SLI.
ImsochoboThis is essentially a 9800GX2.

8800GTX (384-bit) -> 9800GTX (256-bit, higher core/shader clock) -> 9800GTX+ (higher clocks) -> GTS 250

Doesn't this mean we've got just another 9800GX2 on our hands, since it's based on the 9800GTX(+)?
It is close, but the big difference is the low-power cores, which let this card use only two 6-pin connectors instead of the 6-pin + 8-pin the 9800GX2 required.
#4
BarbaricSoul
newtekie1Wish they would have put an SLI finger on there for quad-SLI.

It is close, but the big difference is the low-power cores, which let this card use only two 6-pin connectors instead of the 6-pin + 8-pin the 9800GX2 required.
Also, the 9800GX2 had two PCBs.
#5
DirectorC
If they could drop this at under $300, maybe even at $250, it would be a huge competitor. I'd pimp myself out for one, that's for sure. I just shipped out my fake GTS250 and I'm already missing it. Played COD4 last night with mightily reduced visual effects for smooth(ish) gameplay :ohwell:
#6
Tatty_Two
Gone Fishing
This "should" be about the same speed as the 285 but 10-15% cheaper (UK prices). I read an article about 2 months ago in a PC magazine that said both Galaxy and Zotac were looking at doing this possibly because it is alledged that there is a big surplus of G92 GPU's and they can be had on the "cheap" by board partners, the article was partly speculation though to be fair, if Zotac does come up with one, perhaps that will come with the option to SLi, certainly 2 of these would do a nice fold.
#7
Helper
mlee49Not bad, one for primary one for physx.

Wonder how well it could fold?
I don't think you would be able to split it into two GPUs; it would probably appear to the system as a single-slot, dual-GPU card. That won't change whether you activate SLI or not, because the GPUs are directly connected to each other through the NF200 chip on the PCB itself. PhysX only runs on a single GPU, BTW.

Thinking of two G92 cores (like the 9800 GX2), it should fold very well.
ImsochoboThis is essentially a 9800GX2.

8800GTX (384-bit) -> 9800GTX (256-bit, higher core/shader clock) -> 9800GTX+ (higher clocks) -> GTS 250

Doesn't this mean we've got just another 9800GX2 on our hands, since it's based on the 9800GTX(+)?
The 8800 GTX has nothing to do with any other card you listed there. It's a completely different core (G80), with 32 pipelines, a 384-bit bus, and so on. In theory it's a superior core to G92; it has more capabilities than any single-GPU G92 card.

But yeah, this is another 9800GX2 with the newest G92 cores. Actually, the 9800GX2 already used 65 nm G92 cores, so this is just those last low-power cores combined. I own a 9800GX2 myself, and this won't be a bad performer overall. On the other hand, this should have been released two weeks ago (Christmas), and it should have had fully functional G200B cores, which would mean a dual GTX 285. As it stands, this looks more like an attempt to get rid of G92 cores: SLIing them and putting them on a cheap PCB... I think it needs to be priced really low to be an alternative to the HD 5800 series. People would still go and buy an HD 5850 instead of this. Then it could only be a good folding card...
#8
Imsochobo
HelperI don't think you would be able to split it into two GPUs; it would probably appear to the system as a single-slot, dual-GPU card. That won't change whether you activate SLI or not, because the GPUs are directly connected to each other through the NF200 chip on the PCB itself. PhysX only runs on a single GPU, BTW.

Thinking of two G92 cores (like the 9800 GX2), it should fold very well.

The 8800 GTX has nothing to do with any other card you listed there. It's a completely different core (G80), with 32 pipelines, a 384-bit bus, and so on. In theory it's a superior core to G92; it has more capabilities than any single-GPU G92 card.
128 shaders.
64 texture units.
24 ROPs.

That's what both the 9800GTX and 8800GTX share; they are quite similar...
The differences the 9800GTX has over the 8800GTX:
lower memory bandwidth, due to 256-bit vs 384-bit.
higher processing power (both 9800GTX and 9800GTX+).
built on 65 nm and 55 nm versus 90 nm on the 8800GTX (hence higher clocks).
higher shader clock.
higher memory clock.
higher core clock.
Otherwise I can't say I find very big differences; it's a slight refresh of the core with a stripped memory bus.
It's not that far off though :P
(almost the same as 2900XT vs 3870: same performance, a refresh, except ATI added features.)
But yeah, this is another 9800GX2 with the newest G92 cores. Actually, the 9800GX2 already used 65 nm G92 cores, so this is just those last low-power cores combined. I own a 9800GX2 myself, and this won't be a bad performer overall. On the other hand, this should have been released two weeks ago (Christmas), and it should have had fully functional G200B cores, which would mean a dual GTX 285. As it stands, this looks more like an attempt to get rid of G92 cores: SLIing them and putting them on a cheap PCB... I think it needs to be priced really low to be an alternative to the HD 5800 series. People would still go and buy an HD 5850 instead of this. Then it could only be a good folding card...
No kidding: lower power usage, higher performance, HDMI with bitstreaming (love that! :rockout:), DX11, and it's very quiet; I haven't heard mine yet, in an HTPC case!
#9
theubersmurf
Tatty_OneI read an article about two months ago in a PC magazine that said both Galaxy and Zotac were looking at doing this, possibly because it is alleged that there is a big surplus of G92 GPUs that can be had on the "cheap" by board partners.
It seems pretty clear, by the way they try to sell off 8800GTS 512 variants now, that there was a surplus of 65 nm G92 cores. I bought a 9800GTX+ a while back expressly for PhysX, and they used the smaller cooler designed for the 55 nm cores; the thing was ridiculously hot. I get the impression they way overproduced those cores because of the success of the G80, expecting the G92 would do as well. I get the feeling they're still trying to get rid of them. I think they're selling them as keychains in the NVIDIA store too.

My bad, it's a G98 core.
#10
arnoo1
Useless. Who cares nowadays about G92b GPUs?

"Keep waiting for Fermi."
#11
3volvedcombat
arnoo1Useless. Who cares nowadays about G92b GPUs?

"Keep waiting for Fermi."
Look, Fermi is looking worse every day, my sir. Today I'm reading this article from SemiAccurate, and Fermi is using 280 watts, 20 watts away from the PCI-E limit of 300 W.

Now that's pretty crazy, but it is from SemiAccurate.

I just can't imagine Fermi getting all these predictions and negatives everywhere by accident. I think Fermi, or GF100, might be too hot to handle and too big to produce; I've seen 20 of these threads and predictions from sites already.

BUT WE WILL SEE.

This is a nice card: 256 stream processors of love on a single PCB with 2 GB of TOTAL memory. It'll do the job for every game out there, and it'll be faster than a 285, because these G92 GPUs will overclock very high and the RAM should hit silly speeds, offering 200-250 GB/s of total bandwidth. I want to see what the heatsink is going to look like. I don't want any wack Gigabyte creativity as a heatsink. It'd be nice if it had the same 295 heatsink :D
#12
Helper
Imsochobo128 shaders.
64 texture units.
24 ROPs.

That's what both the 9800GTX and 8800GTX share; they are quite similar...
The differences the 9800GTX has over the 8800GTX:
lower memory bandwidth, due to 256-bit vs 384-bit.
higher processing power (both 9800GTX and 9800GTX+).
built on 65 nm and 55 nm versus 90 nm on the 8800GTX (hence higher clocks).
higher shader clock.
higher memory clock.
higher core clock.
Otherwise I can't say I find very big differences; it's a slight refresh of the core with a stripped memory bus.
It's not that far off though :P
(almost the same as 2900XT vs 3870: same performance, a refresh, except ATI added features.)
No dude, most of this is very wrong. I mean you got it all wrong except the higher clock speeds and the smaller manufacturing process on the 9800 GTX+.

You are thinking of 8800 GTS to 9800 GTX/+ and GTS 250. That's what it MAY look like if you think of it as HD 2900 to HD 3870. Hell, even that's wrong. I owned both cards; they are based on the same core, and I would use the HD 2900 over the HD 3870 any day. Digitally controllable voltage regulation on memory and core, a 512-bit memory bus, higher quality components and much more. The HD 3870 was a cheaper way to build an HD 2900. The HD 2900 was still selling for a few hundred dollars more after the release of the HD 3870. It did run hotter and it needed twice as much juice, but it was a more capable card overall. That's why people used to put R600 cores instead of RV670 under sub-zero cooling, give them all the voltage, and break 3DMark records at something like 1200-1400 MHz on the core...

Now let's come to the Green camp: G92 only had 16 ROPs, unlike 24 on G80. The 8800 GTS doesn't have higher processing power... the 8800 GTX/Ultra does. The G80 core performs better than G92 at the same speeds; it's a faster core. It's not a rehash like 8800 GTS to 9800 GTX, or HD 2900 to HD 3870; it's a totally different card.
#13
qubit
Overclocked quantum bit
Hey, I'd like to have one just for the novelty factor.
#14
douglatins
Design a decent single-slot cooler and make it a folder's wet dream.
#15
Nemo~
Correct me if I am wrong, but the 9800GX2 was a dual-PCB card and this one is a single-PCB variant.
#17
crazyeyesreaper
Not a Moderator
The 8800GTS 512 was renamed and shrunk to become the 9800GTX. In terms of power, the 9800GTX is rivaled by the 8800GTX, and they're on par due to their few differences, in terms of renaming.

The 8800GTS 512 became the 9800GTX, which then became the 9800GTX+, which was once again renamed to the GTS 250.

This is a new 9800GX2 for sure: it's a single PCB using two GTS 250s of the low-power variation. The difference with this reincarnation of the 9800GX2 is that it has 2 GB of RAM, not 1 GB. As everyone knows, if you have a 2 GB card only 1 GB is usable, thus the 9800GX2 only really had 512 MB of RAM, which is why it suffered as time went on. This new version should prove to be a very good card in its own right, and with two low-power cores, quad SLI with these Galaxy cards should prove a fairly competent setup.
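The point about usable memory can be made concrete. Assuming the usual alternate-frame-rendering SLI behaviour described here, each GPU mirrors the same textures and buffers, so the usable pool equals one GPU's share of the advertised total. A trivial sketch of that arithmetic:

```python
def usable_vram_mb(per_gpu_mb: int) -> int:
    """SLI mirrors data across GPUs: the usable VRAM pool is one GPU's
    memory, not the sum advertised on the box."""
    return per_gpu_mb

print(usable_vram_mb(512))    # 9800GX2: 2 x 512 MB advertised -> 512 MB usable
print(usable_vram_mb(1024))   # DualCore GTS250: 2 x 1024 MB -> 1024 MB usable
```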
#18
cauby
Nice! It will probably perform very close to the GTX 295, except with a much lower price tag...

As for Fermi, well, I gave up waiting for NVIDIA to finally release the goddamned thing. By the time it is actually released, ATI will have had DX11 cards at every price point for almost six months, and by the looks of GF100 it will be hot, expensive and power hungry (just rumours though, so we'll have to wait like everyone else).
#19
DirectorC
caubyNice! It will probably perform very close to the GTX 295, except with a much lower price tag...
No way. 275s kill 250s.
#20
Imsochobo
Helperread over.
ROPs aren't processing power; the 9800GTX stays faster in processing power. Graphical power is in the 8800GTX's favour in many, many games, due to the 384-bit bus and the ROPs, which help.
The 9800GTX was the replacement for the 8800GTX: slightly weaker raw hardware specs, clocked higher to compensate for the loss, so NVIDIA could make more money.

They share huge amounts of architectural design, and for the end user it's a polished architecture, which is totally fine by all means, until it's 4 years old.

All I was trying to say is, this card is like a dual card from the 8-series days. Jeez, let's make a 5670 X2, guys :)
#21
Solaris17
Super Dainty Moderator
i want these. total rocks in all my riggings.
#22
erocker
*
I guess Nvidia's partners need to do something with their time. I like the card but the price will most likely be laughable. I wonder if ATi will counter with a 4850x2? Oh, wait...
#24
Helper
ImsochoboROPs aren't processing power; the 9800GTX stays faster in processing power. Graphical power is in the 8800GTX's favour in many, many games, due to the 384-bit bus and the ROPs, which help.
The 9800GTX was the replacement for the 8800GTX: slightly weaker raw hardware specs, clocked higher to compensate for the loss, so NVIDIA could make more money.

They share huge amounts of architectural design, and for the end user it's a polished architecture, which is totally fine by all means, until it's 4 years old.
ROPs set the top amount of processing power a graphics card puts out at the end of a draw. If you cut an HD 5870 down to 2 ROPs, it will be bottlenecked by them and won't perform any better than an X1950-series card, despite its sheer amount of shader and core power. Most recent cards, including ATi's higher HD 4800 series and the G92s, are short on them; they would perform much better with more ROPs, especially at high resolutions with anti-aliasing. 16 just doesn't cut it for today's cards, but 32 does, as we see on the highest-end cards.

G80 is a huge GPU, like G200 is. It's far more advanced and complex than G92... G92, on the other hand, is a weaker, mainstream GPU. The only way to make it a higher-end card is putting two of them on one PCB, like the card here; it bases its power on SLI. Compare two 8800 Ultras SLIed against a 9800GX2 at 2048x1536 with every possible setting on in a game, and you'll see the difference. It's not only because of memory bandwidth; it's also because of the ROPs and the GPUs... So G80 and G92 aren't similar, except for the DirectX 10 shader architecture overall...
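The ROP argument above can be put in numbers: theoretical pixel fill rate is ROP count times core clock, so a wider ROP array can beat a higher clock. A rough sketch using reference clocks (738 MHz for the GTS 250 and 575 MHz for the 8800 GTX, both assumed here, not taken from this thread):

```python
def pixel_fill_gpps(rops: int, core_mhz: float) -> float:
    """Theoretical pixel fill rate in gigapixels/s: ROPs times core clock."""
    return rops * core_mhz * 1e6 / 1e9

print(pixel_fill_gpps(16, 738))  # GTS 250 (G92):  11.808 Gpixel/s
print(pixel_fill_gpps(24, 575))  # 8800 GTX (G80): 13.8 Gpixel/s
```

Despite its much higher clock, the 16-ROP G92 still trails the 24-ROP G80 in raw fill rate, which matches the observation about high resolutions with anti-aliasing.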
#25
PP Mguire
crazyeyesreaperThe 8800GTS 512 was renamed and shrunk to become the 9800GTX. In terms of power, the 9800GTX is rivaled by the 8800GTX, and they're on par due to their few differences, in terms of renaming.

The 8800GTS 512 became the 9800GTX, which then became the 9800GTX+, which was once again renamed to the GTS 250.

This is a new 9800GX2 for sure: it's a single PCB using two GTS 250s of the low-power variation. The difference with this reincarnation of the 9800GX2 is that it has 2 GB of RAM, not 1 GB. As everyone knows, if you have a 2 GB card only 1 GB is usable, thus the 9800GX2 only really had 512 MB of RAM, which is why it suffered as time went on. This new version should prove to be a very good card in its own right, and with two low-power cores, quad SLI with these Galaxy cards should prove a fairly competent setup.
The 8800GTS became the 9800GTX, but the 9800GTX+ was different: it has a G92b core, which is 55 nm, and the GTS 250 is that with 1 GB of RAM.

They should have made this dual-PCB: one for the two GPUs and the other for separate cooling for the NF200 chip :banghead:

This card would prolly be decent though, if the PCB wasn't blue.