Thursday, June 3rd 2010

Galaxy Designs GeForce GTX 480 with Dedicated PhysX GPU

While every NVIDIA GPU since the GeForce 8 series with more than 256 MB of memory supports the CUDA GPU compute technology, and with it PhysX GPU acceleration, the prospect of a dedicated GPU for PhysX processing interests many, especially enthusiasts, since NVIDIA stopped production of the PhysX processor from the erstwhile Ageia. EVGA was first to devise a graphics card that pairs a high-end GPU for graphics with a mainstream GPU dedicated to PhysX processing, with its GeForce GTX 275 Co-op. Building on the same principle, Galaxy designed an enthusiast-grade graphics card that uses an NVIDIA GeForce GTX 480 for DirectX 11 compliant graphics processing, while its sidekick on board is a 40 nm GeForce GT 240 GPU.

The GeForce GTX 480 sticks to reference specifications, complete with 1536 MB of GDDR5 memory across a 384-bit wide memory interface, while the GT 240 has its own 512 MB of GDDR5 memory across a 128-bit wide interface. Both GPUs share the system bus over an NVIDIA nForce 200 bridge chip, which gives each GPU a PCI-Express 2.0 x16 link. The GTX 480 GPU packs 480 CUDA cores and features the latest GPU technologies, including 3D Vision Surround. Since it is independent of the GT 240, the GTX 480 can pair with three more of its kind for 4-way SLI, without affecting the functionality of the GT 240 in any way.
The GT 240 packs 96 CUDA cores and is DirectCompute 4.1 compliant. It has enough compute power to offload PhysX processing from the main GPU. Since the GT 240 in its standalone card form is energy-efficient enough not to require auxiliary power, its presence does not affect the card's power design in any big way. The GTX 480 component appears to be powered by a 6+2 phase digital PWM circuit, while the GT 240 uses a simpler 2+1 phase PWM circuit. Power inputs remain one 8-pin and one 6-pin connector. More details as they emerge.
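For a rough sense of how tight the power budget is with both GPUs on one board, here is a back-of-the-envelope sum. This is only a sketch: the connector limits are the standard PCI-SIG figures, the TDPs are NVIDIA's reference numbers (about 250 W for the GTX 480 and roughly 70 W for the GT 240), and Galaxy has not published board power figures.

```python
# Back-of-the-envelope power budget using assumed reference figures,
# not Galaxy's own specifications.
PCIE_SLOT_W = 75     # PCI-Express x16 slot, per the PCI-SIG specification
SIX_PIN_W   = 75     # 6-pin PCIe power connector
EIGHT_PIN_W = 150    # 8-pin PCIe power connector

budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W in-spec ceiling

GTX_480_TDP = 250    # NVIDIA reference TDP
GT_240_TDP  = 70     # roughly; the retail GT 240 needs no auxiliary power

worst_case = GTX_480_TDP + GT_240_TDP            # ~320 W if both GPUs hit their TDPs

print(f"In-spec budget: {budget} W, worst-case draw: ~{worst_case} W")
```

On paper the worst case slightly exceeds the 300 W the slot and connectors are rated for, which ties into the PCI-E power question raised in the comments below; in practice the two GPUs are unlikely to hit their rated TDPs at the same time.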

44 Comments on Galaxy Designs GeForce GTX 480 with Dedicated PhysX GPU

#26
Disparia
Taskforce: I don't need PhysX, what am I going to do with it?

Where dah games at to run it?
Fold while you game.
Posted on Reply
#27
Steevo
Play all 16 hardware-accelerated games, pair it with the killer 5770, and have an entirely overblown computer that will get spanked by a machine costing half as much.
Posted on Reply
#28
DarthCyclonis
OnBoard: It's not G92 anymore. GT215, 40 nm and DX10.1.
So they really should have rebranded it the G92.1 Midget. LOL
Posted on Reply
#29
RONX GT
A bit more interesting than the regular GTX 480. Nice one, Galaxy. :)
Posted on Reply
#30
Yukikaze
They really can't find a way to sell them G2XX leftovers, eh? This is beyond silly. A GTX 480 doesn't need the add-on GPU for PhysX. PhysX is a sham anyway (there's what, 15 games that actually do it on the GPU?), and this card is pointless.

Now, if this second chip was a Hydra chip like wahdangun said, then it would've been quite an intriguing card.
Posted on Reply
#31
bobzilla2009
Yukikaze: They really can't find a way to sell them G2XX leftovers, eh? This is beyond silly. A GTX 480 doesn't need the add-on GPU for PhysX. PhysX is a sham anyway (there's what, 15 games that actually do it on the GPU?), and this card is pointless.
There's no doubt that PhysX will never amount to much in the grand scheme of things, but adding chips they would otherwise not sell, recouping some money on them, and getting sales by putting 'OMG PhysX!' on the side is good business practice. People will buy anything if it has enough exclamation marks on it.
Posted on Reply
#32
Yukikaze
bobzilla2009: There's no doubt that PhysX will never amount to much in the grand scheme of things, but adding chips they would otherwise not sell, recouping some money on them, and getting sales by putting 'OMG PhysX!' on the side is good business practice. People will buy anything if it has enough exclamation marks on it.
"Good business practice" doesn't change the fact the card itself is silly and the fact that people will buy anything with enough exclamation marks on it doesn't mean it is a good product.
Posted on Reply
#33
Bjorn_Of_Iceland
The GTX 480 is enough to run PhysX on its own. All that second GPU adds is more heat and power draw.
Posted on Reply
#34
HalfAHertz
Doesn't it like go over the pci-e power standard? The 480 is already close to 300W on its own.
Posted on Reply
#35
stupido
Steevo: They should shut off the 480 core and memory when in 2D clocks and save a shitload of power. Then it would be worth it.
That is a very good remark.

Imagine this card in a gaming/HTPC setup...
Posted on Reply
#36
newtekie1
Semi-Retired Folder
Steevo: They should shut off the 480 core and memory when in 2D clocks and save a shitload of power. Then it would be worth it.
That would be totally sweet, but sadly I don't think it is possible with the way the display logic is integrated into the GPU. It might have worked on the older GT200, with its separate display logic chip, though.
Posted on Reply
#37
HalfAHertz
What if they used an NVIDIA Optimus type of solution? The PC still sees the two separate GPUs, right? And each of them has its own power circuitry...
Posted on Reply
#38
newtekie1
Semi-Retired Folder
HalfAHertz: What if they used an NVIDIA Optimus type of solution? The PC still sees the two separate GPUs, right? And each of them has its own power circuitry...
Actually, that might work, a Hybrid SLI type of setup. Not sure if it would or not, but I'm sure some smart engineer could figure something out.
Posted on Reply
#39
a_ump
Wasn't there a benchmark with a GTX 470 and a PhysX GPU that showed no difference? What would be the difference here with a GTX 480? Pointless IMO.
Posted on Reply
#40
3volvedcombat
Well, the GT 240 GPU is just stuck on there now; it will sell, and it's alright.

The GTX 480 is already plenty strong in PhysX, so the extra GPU won't really change performance figures. Hopefully something will happen.

I'm pissed, frankly. I wanted a 512-core GTX 480 that was a beast, just so I could look forward to buying a beast NVIDIA card in the future. That's not going to happen, for now at least. :mad:
Posted on Reply
#41
Imsochobo
newtekie1: I think what he is asking is if the GT 240 is fast enough to calculate PhysX and not hold back the GTX 480. The same way a weak CPU would hold back a strong GPU.

The issue is that people think PhysX takes a lot of computing power, when it really doesn't, so a very basic card can handle it easily without limiting the more powerful graphics card.
And likewise the CPU can do this, although NVIDIA doesn't like that :P

I used the hack once, back when you couldn't do NV for PhysX, but it was a total waste of power consumption I think; I played maybe one game per year that would look better with PhysX...
Make it OpenCL-based and then both the CPU and GPU can do it, based on preference...
Posted on Reply
#42
newtekie1
Semi-Retired Folder
Imsochobo: And likewise the CPU can do this, although NVIDIA doesn't like that :P

I used the hack once, back when you couldn't do NV for PhysX, but it was a total waste of power consumption I think; I played maybe one game per year that would look better with PhysX...
Make it OpenCL-based and then both the CPU and GPU can do it, based on preference...
There is the option to have the CPU do PhysX in the PhysX control panel. With my X3220 (Q9650) @ 3.6 GHz, the framerates are terrible when doing it on the CPU. That is only using one core of the quad-core processor, though; it probably could handle it if game developers went back and re-worked their games to use the multi-threaded capabilities NVIDIA added to PhysX recently (a rough sketch of that kind of split follows this post). However, while someone with a quad core like me might be able to handle PhysX on the CPU, I still don't think many of the dual-core users would.

I really have to agree with you on PhysX not being worth the power consumption of a second card. My HD 4890 + 9600 GT was only used for Batman, and after that I removed the 9600 GT and it sits on my desk... If it weren't for the fact that I keep the 9600 GT as a back-up card (which is why I bought it; I would not have bought it for PhysX), I would sell it.
Posted on Reply
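The comment above touches on why single-threaded CPU PhysX struggles: one core does all the simulation work, while a multi-threaded CPU path simply splits each physics step across cores. Below is a toy sketch of that split in plain Python; it is not the PhysX SDK, and the particle model, function names, and figures are made up purely for illustration.

```python
# Toy illustration of splitting one physics step across CPU cores.
# NOT the PhysX SDK -- a hypothetical example of how a multi-threaded
# CPU physics path could divide per-frame work.
import math
from concurrent.futures import ProcessPoolExecutor

DT = 1.0 / 60.0     # one 60 fps frame
GRAVITY = -9.81     # m/s^2

def integrate_chunk(chunk):
    """Advance a chunk of (height, velocity) particles by one time step."""
    out = []
    for pos, vel in chunk:
        vel += GRAVITY * DT
        pos += vel * DT
        out.append((pos, vel))
    return out

def step_parallel(particles, workers=4):
    """Split the particle list into one chunk per worker and integrate the chunks in parallel."""
    size = math.ceil(len(particles) / workers)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        results = pool.map(integrate_chunk, chunks)
    return [p for chunk in results for p in chunk]

if __name__ == "__main__":
    particles = [(10.0, 0.0)] * 100_000   # 100k particles dropped from 10 m
    particles = step_parallel(particles)
    print(particles[0])                   # roughly (9.9973, -0.1635)
```

Whether that kind of split helps a given title still depends on the developer actually wiring the game up to use it, which is exactly the caveat raised above.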
#43
swaaye
Looks kinda like a Voodoo2. The only difference is that this can heat your home.
Posted on Reply