Wednesday, January 20th 2010
Galaxy Designs Dual-Core GeForce GTS 250 Graphics Accelerator
Whereas NVIDIA is inching towards releasing its DirectX 11 compliant GF100 GPU, from which it has high expectations, some of its partners don't seem ready to give up on the two-generation-old G92. Galaxy has almost finished designing a graphics card that uses two GeForce GTS 250 GPUs on one PCB, in an on-card SLI setup. It refers to the card as the "DualCore GTS250", which consists of a long blue PCB holding two G92-426 GPUs (the G92-426 is the variant used on low-power versions of the GTS 250).
Each GPU is wired to 1 GB of GDDR3 memory across its 256-bit wide memory interface (2 GB total on board). The two GPUs are connected to the system over an nForce 200 BR-03 bridge chip. The multi-GPU system is contained within the board, with no SLI fingers for expansion, even though G92 chips are Quad SLI capable. The card draws power from two 6-pin PCI-E power connectors, regulated by a 9-phase digital PWM circuit. From a purely technical standpoint, this isn't the first implementation of its kind: the two-year-old NVIDIA GeForce 9800 GX2 uses two 65 nm G92 chips in a similar implementation, and is Quad SLI capable. There is no word on when Galaxy will release this.
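For a sense of scale, the memory bandwidth follows directly from the bus width and memory clock. A minimal sketch, assuming the board keeps the reference GTS 250 GDDR3 clock of 1100 MHz (2200 MT/s effective); the actual clocks of this Galaxy card have not been announced:

```python
# Rough memory-bandwidth estimate, assuming reference GTS 250 memory clocks (not confirmed for this card).
bus_width_bits = 256                  # per-GPU memory interface
mem_clock_mhz = 1100                  # assumed reference GDDR3 clock
effective_mtps = mem_clock_mhz * 2    # GDDR3 transfers data twice per clock -> 2200 MT/s

per_gpu_gbps = effective_mtps * 1e6 * (bus_width_bits / 8) / 1e9
print(f"Per GPU: {per_gpu_gbps:.1f} GB/s")                   # ~70.4 GB/s
print(f"Both GPUs combined: {2 * per_gpu_gbps:.1f} GB/s")    # ~140.8 GB/s
```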
Source:
Expreview
60 Comments on Galaxy Designs Dual-Core GeForce GTS 250 Graphics Accelerator
Wonder how well it could fold?
8800GTX (384-bit) -> 9800GTX (256-bit, higher core/shader clock) -> 9800GTX+ (higher clocks) -> GTS 250
Doesn't this mean we've got just another 9800GX2 on our hands, since it's based on the 9800GTX(+)?
Thinking about two G92 cores (9800 GX2), it should fold very well. The 8800 GTX has nothing to do with any other card you listed there. It's a completely different core (G80) with 32 pipelines, a 384-bit bus and things like that. In theory, it's a superior core to G92. It has more capabilities than any single-GPU G92 card.
But yeah, this is another 9800GX2 with the newest G92 cores. Actually the 9800GX2 has G92B (65 nm revision) already, so this is just those last low-power cores combined. I own a 9800GX2 myself and this won't be a bad performer overall. On the other hand, this should have been released a few weeks ago (Christmas), and it should have had fully functional G200B cores, which means a DUAL GTX 285. Now this looks more like some attempt to get rid of G92 cores by SLIing them and putting them on a cheap PCB... I think it needs to be priced really low in order to be an alternative to the HD 5800 series. People would still go and buy an HD 5850 instead of this. Then it could only be a good folding card...
64 texture units.
24 ROPS.
That's what both the 9800GTX and the 8800GTX share; they are quite similar...
The differences:
What the 9800GTX has compared to the 8800GTX:
Lower memory bandwidth, due to 256-bit vs. 384-bit.
Higher processing power (both 9800GTX and 9800GTX+).
Built on 65 nm and 55 nm versus 90 nm on the 8800GTX (hence higher clocks).
Higher shader clock.
Higher memory clock.
Higher core clock.
Otherwise I can't say I find very big differences; it's a slight refresh of the core with a stripped-down memory bus.
It's not that far off though :P
(Almost the same as 2900XT vs 3870: same performance, a refresh, except ATI added features.) No kidding: lower power usage, higher performance, HDMI with bitstream (love that! :rockout:), DX11, and it's very quiet; I haven't heard mine yet, in an HTPC case!
my bad, it's a g98 core.
"Keep waiting for Fermi"
Now that's pretty crazy, but it's from SemiAccurate.
Now I just couldn't imagine Fermi getting all these predictions and negatives everywhere as just an accident. I think Fermi, or GF100, might be too hot to handle and too big to produce; I've seen 20 of these threads and predictions from sites already.
BUT WE WILL SEE.
This is a nice card: 256 stream processors of love on a single PCB with 2 gigs of TOTAL memory. It'll do the job for every game out there and it'll be faster than a 285, because these G92 GPUs will overclock very high, and the RAM should hit silly speeds, offering 200-250 GB/s of total bandwidth. I want to see what the heatsink is going to look like. I don't want any wacky Gigabyte creativity as a heatsink. It'd be nice if it had the same 295 heatsink :D
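That 200-250 GB/s figure is worth a quick sanity check. A short sketch, assuming the same 256-bit bus per GPU and simply adding both GPUs' bandwidth together; the target numbers are the ones from the post above, not announced specs:

```python
# Effective GDDR3 data rate (per GPU, 256-bit bus) implied by a given combined bandwidth.
def required_data_rate_mtps(target_gbps_total, gpus=2, bus_width_bits=256):
    bytes_per_transfer = bus_width_bits / 8          # 32 bytes per transfer on a 256-bit bus
    per_gpu_gbps = target_gbps_total / gpus
    return per_gpu_gbps * 1e9 / (bytes_per_transfer * 1e6)

for target in (140, 200, 250):                       # GB/s combined across both GPUs
    print(f"{target} GB/s total -> ~{required_data_rate_mtps(target):.0f} MT/s per GPU")
# 140 -> ~2188 MT/s (roughly stock), 200 -> ~3125 MT/s, 250 -> ~3906 MT/s
```

So 200-250 GB/s combined would need the GDDR3 running at roughly 3100-3900 MT/s effective, a very steep overclock over the stock ~2200 MT/s.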
You are thinking of 8800 GTS to 9800 GTX/+ and GTS 250. That's what it MAY look like when you think of it as HD 2900 to HD 3870. Hell, even that's wrong. I owned both cards; they are based off the same core, and I would use the HD 2900 over the HD 3870 any day. Digitally controllable voltage regulation on memory and core, a 512-bit memory bus, higher quality components and much more. The HD 3870 was a cheaper way to build an HD 2900. The HD 2900 was still selling for a few hundred dollars more after the release of the HD 3870. It did run hotter and it needed twice as much juice, but it was a more capable card overall. That's why people used to put R600 cores instead of RV670 under sub-zero cooling, give them all the voltage, and break 3DMark records with something like 1200-1400 MHz on the core...
Now let's come to the green camp: G92 only had 16 ROPs, unlike 24 on G80. The 8800 GTS doesn't have higher processing power... the 8800 GTX/Ultra does. The G80 core performs better than G92 at the same speeds. It's a faster core. It's not a rehash like 8800 GTS to 9800 GTX or HD 2900 to HD 3870; it's a totally different card.
The 8800 GTS 512 became the 9800 GTX, which then became the 9800 GTX+, and then was once again renamed to the GTS 250.
This is a new 9800 GX2 for sure: it's one PCB using two GTS 250s of the low-power variation. The difference with this reincarnation of the 9800 GX2 is the fact that it has 2 GB of RAM, not 1 GB. As everyone knows, if you have a 2 GB dual-GPU card only 1 gig is usable, thus the old 9800 GX2 only really had 512 MB of RAM, which is why it suffered as time went on. This new version should prove to be a very good card in its own right, and with two low-power cores, Quad SLI with these cards by Galaxy should prove to be a fairly competent setup.
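On the usable-memory point: in alternate-frame-rendering SLI each GPU keeps its own full copy of textures and buffers, so the marketed total doesn't all count as usable frame buffer. A tiny illustrative sketch (the sizes are just the examples from this thread):

```python
# Effective frame buffer when each GPU mirrors the same data (AFR-style SLI).
def effective_vram_mb(total_vram_mb, num_gpus):
    return total_vram_mb // num_gpus

print(effective_vram_mb(2048, 2))   # dual GTS 250: 2 GB on board -> 1024 MB usable
print(effective_vram_mb(1024, 2))   # 9800 GX2: 1 GB on board -> 512 MB usable
```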
And as for Fermi, well, I gave up waiting for NVIDIA to finally release the goddamned thing. By the time it is actually released, ATI will have had DX11 cards at every price point for almost 6 months, and by the looks of GF100, it will be hot, expensive and power hungry (just rumours though, so we will have to wait like everyone else).
The 9800GTX was the replacement for the 8800GTX: it had slightly weaker raw hardware specs but was clocked higher to compensate for the loss, so NVIDIA could make more money.
They share huge amounts of architectural design, and for the end user it's a polished architecture, which is totally fine by all means, until it's 4 years old.
All I was trying to say is, the card is like a dual card from the 8-series days. Jeez, let's make a 5670 X2, guys :)
I'm quoting "We know that the 9800GX2 has 2 printed circuit boards (PCBs)"
G80 is a huge GPU, like G200 is. It's far more advanced and complex than G92... G92, on the other hand, is a weaker, mainstream GPU. The only way to make it a higher-end card is to put two of them on one PCB, like the card here. It bases its power on SLI. Compare two 8800 Ultras SLIed against a 9800GX2 at 2048x1536 with every possible setting on in a game. You'll see the difference. It's not only because of memory bandwidth; it's also because of the ROPs and GPUs... So G80 and G92 aren't similar, except for the DirectX 10 shader architecture overall...
They should have made this a dual-PCB card: one for the two GPUs, and the other for separate cooling for the NF200 chip :banghead:
This card would prolly be decent though, if the PCB wasn't blue.