Thursday, October 22nd 2009

EVGA and NVIDIA Design Unique Multi-GPU Graphics Accelerator

EVGA and NVIDIA are readying a unique multi-GPU graphics accelerator this Halloween, slated for October 30. To celebrate its launch, the two have organized a launch party for 300 lucky participants who will go to the NVIDIA Plaza in Santa Clara, CA and witness the launch of the new GeForce product. The accelerator packs two GPUs: a G200b, and a G92b. That's right, a GeForce GTX 200 series GPU, with a GeForce GTS 250 GPU. This is perhaps the first graphics accelerator to pack two entirely different GPUs. How it works, however, is interesting: the G200b GPU handles graphics, while the G92b is dedicated to PhysX processing. The accelerator could have 896 MB of graphics memory, with 512 MB of dedicated memory for the PhysX GPU. You can sign up for the event here.
Source: Bright Side of News
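For readers wondering what the graphics/PhysX split means in software terms: GPU PhysX runs on CUDA, so to an application the second GPU is simply another CUDA device that compute work can be pinned to. Below is a minimal sketch of that idea; the device index and the toy physics kernel are illustrative assumptions, not details from the announcement.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Toy "physics" kernel: advance particle positions one timestep.
__global__ void integrate(float *pos, const float *vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        pos[i] += vel[i] * dt;
}

int main()
{
    int count = 0;
    cudaGetDeviceCount(&count);
    if (count < 2) {
        std::printf("Two GPUs needed for a render + physics split.\n");
        return 1;
    }

    // Leave device 0 to the renderer; pin compute work to device 1
    // (on this card, that would be the G92b).
    cudaSetDevice(1);

    const int n = 1 << 20;
    float *pos = nullptr, *vel = nullptr;
    cudaMalloc(&pos, n * sizeof(float));
    cudaMalloc(&vel, n * sizeof(float));
    cudaMemset(pos, 0, n * sizeof(float));
    cudaMemset(vel, 0, n * sizeof(float));

    // One thread per particle, 256 threads per block.
    integrate<<<(n + 255) / 256, 256>>>(pos, vel, 0.016f, n);
    cudaDeviceSynchronize();

    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
```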

80 Comments on EVGA and NVIDIA Design Unique Multi-GPU Graphics Accelerator

#51
Mussels
Freshwater Moderator
Interesting article, but a few points:

1. Only one game was tested. That's poor sampling, so it's hard to prove it's going to be the same in all games.

2. Approximately a 10 FPS gain was seen using the second card for PhysX. I'd not pay AU$150 and another 50-100 W of power draw for 10 FPS.

3. That was December 2008 - I'm sure performance on single-card (and SLI) setups has improved since then.
#52
ShadowFold
Instead of this pointless Frankensteinish monstrosity of a card, I would much rather they 40nm'ify the GT200 and get DX11 built into it. They obviously don't have GT300 ready, might as well get some GT200 stuff with DX11 and 40nm, right?
#53
Steevo
This is the new Fermi: they take some old dies, duct tape them together, throw in a leaf blower, and charge $599.99. Extension cord for the leaf blower not included.

This is almost not funny; without competition, ATI is free to dilly-dally on new GPUs and on cards like the 2 GB Eyefinity card I need.
#54
KainXS
It's more than likely a 1 GB GTX 285 with a 1 GB GTS 250 on two PCBs (similar to the 9800 GX2, but not in SLI); that would be the easiest way to make a card like this.

But you can pretty much guarantee this will be nothing but a high-priced waste of money.

At least they are being innovative, though. It's a good idea to me, just too late.
#55
newtekie1
Semi-Retired Folder
Wow, what a retarded card. I'm sure there's a niche few out there who want a GTX 285 and a GTS 250 for dedicated PhysX to play the few PhysX games, but this card is still retarded.

Though maybe this is just a stepping stone to a version with a GT300 and a GT200 core?

I don't know; there are always these retarded cards coming out at the end of a generation of cards, sometimes they lead to innovation, sometimes not. It is always a good thing to see something new, though.
#57
AsRock
TPU addict
Mine was 2 days ago lol.. Pretty common really, as more ppl do it when the weather's colder :P..

Happy Bdays to all :)..

DonInKansas: "They can send it to me; October 30th is mah birthday. :roll:"

FordGT90Concept: "If it were like Mac Pro laptops where the weaker one was used in 2D apps while the stronger one would work in 3D apps (saving power), I would like it, but if the second GPU is solely intended for PhysX, which few games require, I'd say they are wasting their time. And let me guess, they're going to market it as 'Frankenstein' or 'Monster.'"

NV, think of doing that? You're kidding, right? They're so full of themselves with CUDA.

I'd expect it of ATI though, and I'm surprised it's not been done already; but then again, they've got the idle power usage down real low already.

There aren't enough games for this yet for it to be as good as it could be. People don't want this, they want the 300 series already lol.
#58
Darksaber
Senior Editor & Case Reviewer
FYI:

Signed up for the event ^^. I live 9,000 km away from the event, but hey, maybe they'll choose me lol ^^.

cheers
DS
#59
btarunr
Editor & Senior Moderator
But they only give you soft drinks :(
#60
Bjorn_Of_Iceland
I know something that would be scary: stick a printout of the electric bill from using this card on the HSF.
#61
DaMulta
My stars went supernova
The WTF Futuremark card.
#62
mdm-adph
Kinda neat from an engineering perspective, but newtekie is right -- these kinds of crap cards always come out at the end of a generation just to fill time.

Or need I remind anyone of the "HD 2600 XT Dual"?

"$400 card, $150 performance."
#63
erocker
*
I like the card; combining two different GPUs onto a single PCB is no small feat. Unfortunately it's about 8 months too late.
#64
newtekie1
Semi-Retired Folder
erocker: "I like the card; combining two different GPUs onto a single PCB is no small feat. Unfortunately it's about 8 months too late."
The more I think about it, the more I suspect it's actually easier than we think.

I'm obviously not a graphics card engineer, but I would think designing this card would be easier than designing something like a GTX 295 with two of the same cores.

Think about it: with two of the same cores, you have to design the card so the two cores communicate to share the work. You have to design SLI communication into the PCB, along with communication with the PCI-E bus for each core.

With this card, all you need to do is put the two cores on the same PCB and connect them both to an NF200 bridge chip, and that's it. No need to design communication paths between the two GPUs. To the system, they are just two cards sharing the same PCI-E bus.

And on a different note, this card will probably benefit people with motherboards that only have one PCI-E x16 slot, since it gives them a dedicated PhysX card. The only problem is that the price will likely be so outrageous they might as well buy a new motherboard...
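newtekie1's "just two cards on one bus" point is easy to picture in code: under CUDA, whatever sits behind the bridge chip shows up as independent devices. A minimal enumeration sketch using the CUDA runtime API; the printed names are whatever the driver reports:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// List every CUDA device the driver exposes. On this card the G200b and
// G92b would appear as two independent entries, exactly as if two
// separate boards were sharing the PCI-E bus.
int main()
{
    int count = 0;
    cudaGetDeviceCount(&count);
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        std::printf("Device %d: %s, %d SMs, %zu MB\n", i, prop.name,
                    prop.multiProcessorCount,
                    prop.totalGlobalMem / (1024 * 1024));
    }
    return 0;
}
```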
#65
Roph
Wonderful, I always wanted a card like this that would cost me way too much and be aimed at utilising a dead (or soon to die) proprietary API, PhysX. :)

Isn't it possible to get the onboard IGP of an NVIDIA motherboard to do the PhysX whilst your real GPU does the graphics anyway?
#66
Marineborn
*blinks* *stares a little more*...*blinks again* wait....what.......wai...huh.....STUPID!


I don't see the point in this. I'm guessing they need to make some more money on a gimmick they hope people will buy, to offset all the money they're losing.
#67
Mussels
Freshwater Moderator
Roph: "Wonderful, I always wanted a card like this that would cost me way too much and be aimed at utilising a dead (or soon to die) proprietary API, PhysX. :) Isn't it possible to get the onboard IGP of an NVIDIA motherboard to do the PhysX whilst your real GPU does the graphics anyway?"
Possible? Yes. Easy? No.

NV ramped up the requirements for PhysX recently, so most of the onboard GPUs can't handle the more modern PhysX titles.
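For context on those requirements: the commonly cited minimum for GPU PhysX in this era was a GeForce 8-series or newer with at least 32 shader cores and 256 MB of memory; treat those exact thresholds as an assumption here. A quick CUDA check against that rule of thumb (the 8-cores-per-SM factor holds for the compute 1.x parts being discussed):

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Check device 0 against the commonly cited GPU PhysX minimums
// (32+ shader cores, 256+ MB). Compute 1.x SMs carry 8 cores each.
int main()
{
    cudaDeviceProp prop;
    cudaGetDeviceProperties(&prop, 0);

    int cores = prop.multiProcessorCount * 8;      // valid for G8x/G9x/G200
    size_t mb = prop.totalGlobalMem / (1024 * 1024);

    bool capable = (cores >= 32) && (mb >= 256);
    std::printf("%s: ~%d cores, %zu MB -> %s for GPU PhysX\n",
                prop.name, cores, mb, capable ? "enough" : "not enough");
    return 0;
}
```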
#68
Meizuman
www.evga.com/articles/00512/
" EVGA and NVIDIA would like to invite you to a new graphics card launch at the NVIDIA campus in Santa Clara, CA. No it’s not based on the new Fermi architecture… we are still finalizing those details, so stay tuned. It’s a rocking new graphics card designed by NVIDIA and EVGA to take PhysX to the next level.
The bolded part made me... :roll:

And at the launch event they can test that card vs. an HD 5870 + GTS 250 with the hacked drivers that allow PhysX... I would really wanna see that. :D
#69
Animalpak
amschip: "This only proves that NVIDIA has no Fermi ready yet if they are desperate enough to waste money on product development like that. Oh my..."
Remember that the 8800 GTX dominated the graphics card world for almost a year without any response from ATI.

Fermi is close to completion and there is no delay from NVIDIA.
#70
zithe
It would be interesting to see this card fold.
#71
zCexVe
Heh, another G92? teh suk! Might as well make a keychain again to buy time for G300 :P
#72
mdm-adph
Animalpak: "Remember that the 8800 GTX dominated the graphics card world for almost a year without any response from ATI. Fermi is close to completion and there is no delay from NVIDIA."
No delay, eh?

www.techpowerup.com/tags.php?tag=GT300

First it was Q4 2008, then Q1 2009, then Q4 2009 (that's right now) with demos in September -- this is what you call "no delay"?
#73
newtekie1
Semi-Retired Folder
mdm-adph: "No delay, eh?
www.techpowerup.com/tags.php?tag=GT300
First it was Q4 2008, then Q1 2009, then Q4 2009 (that's right now) with demos in September -- this is what you call 'no delay'?"
The Q4 2008 and Q1 2009 dates were nothing more than extremely early speculation based on nothing; they were never official from NVIDIA. It was just a bunch of news reporters saying it "could" be out by Q4 2008, or that it was coming out "as early as" Q1 2009.

Q4 2009 has been the official release schedule since NVIDIA announced it back in December 2008.

So, no, it was never Q4 2008, then Q1 2009. It has always been Q4 2009.
#74
Imsochobo
newtekie1: "The Q4 2008 and Q1 2009 dates were nothing more than extremely early speculation based on nothing; they were never official from NVIDIA. It was just a bunch of news reporters saying it 'could' be out by Q4 2008, or that it was coming out 'as early as' Q1 2009. Q4 2009 has been the official release schedule since NVIDIA announced it back in December 2008. So, no, it was never Q4 2008, then Q1 2009. It has always been Q4 2009."
NVIDIA has had delays on everything compared to what was planned, and there was no official word on an exact date or quarter either.
Their 40 nm parts were roughly 3 months delayed.

NVIDIA and ATI are the best thing that ever happened to this world; the bad part is that NVIDIA is turning out to be arrogant pricks now. But they have had very, very good performance growth with every generation change, and have pushed themselves hard.
Surely GPGPU is good, but without a common API for Intel, ATI, and NVIDIA it isn't a compelling feature.
I wouldn't want to make an app just for NVIDIA customers, and I definitely don't want to make it for four different APIs: one for Intel, one for ATI, one for NVIDIA, and one for x86!

And I wouldn't make a GPGPU card for consumers yet; I'd increase GPGPU performance steadily rather than dedicate a whole generation to it, going from a gaming card one generation to a GPGPU card the next.
Stuff doesn't happen overnight; just look at 64-bit: 32-bit worked and people didn't switch.
#75
araditus
Haven't seen a fun/successful card from the green camp since my 8800 Ultra SLI or my current 275s. I like the 275: within 5% of the 285, minus 20% on the price :)

On topic, this Frankenstein card has scared me...

I will be purchasing ATI next time if this card is released, as my religion does not approve of the Doctor's work.