Thursday, August 28th 2008

Radeon and GeForce Share Work, PhysX Applications Win

The functionality of CUDA and its implementation of GPU-accelerated PhysX processing has benefited many a GeForce user. Users of ATI accelerators, lacking this incentive, either use an Ageia PhysX card or avoid hardware physics altogether. Hardspell has verified that in an environment where a Radeon accelerator handles graphics processing, a GeForce accelerator can be used standalone to process PhysX. Hardspell used a Radeon HD 3850 along with a GeForce 9600 GT in the same system, with the display connected to the Radeon. Although no form of multi-GPU graphics connection existed, the GeForce card partnered the Radeon well, processing physics while the Radeon did graphics. Results of oZone3D FluidMark, a benchmark that includes routines to evaluate the machine's capability in processing physics, showed the score nearly quadrupling, proof that the GeForce accelerator is doing its job.
This was further borne out by game testing with Unreal Tournament 3. Provided are screenshots from the game along with those of the FluidMark windows. The first window shows a score of 759 o3Marks with the Radeon working alone, while in the second window, with the GeForce processing PhysX, the score jumped to 2909 o3Marks.
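As a quick sanity check on those numbers, here is a minimal sketch (plain Python, independent of FluidMark) that derives the relative gain from the two published scores:

```python
# Relative gain between the two published FluidMark scores
baseline = 759     # o3Marks: Radeon HD 3850 handling both graphics and physics
with_physx = 2909  # o3Marks: GeForce 9600 GT dedicated to PhysX

speedup = with_physx / baseline
print(f"{speedup:.2f}x the baseline ({(speedup - 1):.0%} increase)")
# -> 3.83x the baseline (283% increase)
```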
Source: Hardspell

144 Comments on Radeon and GeForce Share Work, PhysX Applications Win

#76
Jambul_Er
wolf2009: the img is a hidden link to another site
That's our Russian site, where the images are hosted ... :toast:
#77
btarunr
Editor & Senior Moderator
Jambul_Er: I have Windows XP SP3 :cool:
Shut down, connect the monitor to the 9600 GSO, start up, and let the OS detect the display and configure it as a second display-head; then shut down, connect the monitor back to the HD 4850, and start up. The OS is thus fooled into thinking display-heads are configured for both adapters.
#78
Jambul_Er
Wshlist: Thanks for trying, Jambul. Did you install CUDA, or the cuda.dll? In other words, did you do some experimentation? If so, keep us informed please :)
I already did all of that ... and even danced with a tambourine over it - nothing helped :confused:
#79
SPAWN
Does anyone know when drivers like that will come out for the Radeon 4870? )))
#80
Wshlist
Jambul_Er: I already did all of that ... and even danced with a tambourine over it - nothing helped :confused:
Oh well :/
Someone will discover how it works eventually.
#81
Hayder_Master
everyone here thinks this is a good thing for ati. i think not. maybe good for users, but not for ati, cuz it means every pc must have an nvidia card in it in some role, primary or physics. ati must develop a software solution to this big problem
#82
eidairaman1
The Exiled Airman
hence the Havok engine. Havok has been around a lot longer, so it's further ahead in the physics department.
#83
Wshlist
eidairaman1: hence the Havok engine. Havok has been around a lot longer, so it's further ahead in the physics department.
Havok's original engine only used the CPU for physics; only much later, after AGEIA came around, did they start on an engine that used the GPU. However, that was a separate licence that game developers had to opt for (and pay for), so adoption wasn't very big, I think. It's hard to say, because when people or companies say 'Havok physics', nobody knows whether it means the old CPU licence/SDK, like HL2 uses for instance, or the (relatively) newer GPU one. Plus I think their GPU one was partly non-interactive, mostly just visual, wasn't it? I'm not sure about the details.

But either way, if a game uses the one developed by AGEIA, PhysX, like Unreal Engine 3 does, then it doesn't matter if your card has great Havok support, since it's PhysX that the game requires. And right now I bet lots of developers are opting for PhysX, since it suddenly has a lot of people who can use it, unless of course they are smart like the Crysis makers and just write their own physics engine and bypass all the hassle :) Although doing that on the GPU might be harder than you think *shrug*
#84
Fitseries3
Eleet Hardware Junkie
quick question.....

i have a board with 3 full length pcie slots, 4870x2 and 4870 in CFX and i am thinking about getting a 9800gt for physx. the reason for the 9800gt is because it's the fastest single slot card i can think of and it will work perfectly in between my 2 ATI beasts.

is this a good move for me or should i do something else with my time/money?
#85
btarunr
Editor & Senior Moderator
fitseries3: quick question.....

i have a board with 3 full length pcie slots, 4870x2 and 4870 in CFX and i am thinking about getting a 9800gt for physx. the reason for the 9800gt is because it's the fastest single slot card i can think of and it will work perfectly in between my 2 ATI beasts.

is this a good move for me or should i do something else with my time/money?
The performance of PhysX isn't all that proportional to GPU computational power beyond maybe an 8800 GS 384 MB. IIRC the third long slot is PCI-E x4, isn't it?
#86
Fitseries3
Eleet Hardware Junkie
x4 when 3 cards are in, yes. x8 when only 2 are used.

i can get a 9800gt for like $40 so it's not a price thing... just availability.
#87
btarunr
Editor & Senior Moderator
fitseries3: x4 when 3 cards are in, yes. x8 when only 2 are used.

i can get a 9800gt for like $40 so it's not a price thing... just availability.
It also becomes a heat and power-draw thing :)
#88
DarkMatter
btarunr: It also becomes a heat and power-draw thing :)
With three RV770s in his system, I don't think the power/heat of a 9800 GT is an issue for him.
#89
btarunr
Editor & Senior Moderator
DarkMatter: With three RV770s in his system, I don't think the power/heat of a 9800 GT is an issue for him.
Point is, those RV770s crunch graphics, but I don't think choosing a 9800 GT over an 8800 GS would translate to anything more than a higher power draw than what it already is.
#90
DarkMatter
btarunr: Point is, those RV770s crunch graphics, but I don't think choosing a 9800 GT over an 8800 GS would translate to anything more than a higher power draw than what it already is.
I agree, but the extra power required to go from the GS to the GT is NOTHING compared to the power draw he already has. Sure, it's pointless if he gets the same performance, but we don't really know how much power will be required in the next 6 months = 50+ new titles.

And then there's the $40 argument. If he can get a 9800 GT for that money, I would never tell him to get something slower. He could even end up paying more for a GS!!
#91
btarunr
Editor & Senior Moderator
I don't think those 50+ titles have PhysX content that would make a dedicated PhysX unit such as the 8800 GS sweat. 96 NVIDIA SPs is still a huge amount of rated shader compute power. The 192-bit memory bus doesn't matter; the card isn't transferring large chunks of data (such as textures), it's just crunching lots of math in real-time. To look at it another way, if a PhysX title did have a physics load that makes a dedicated 8800 GS sweat, a machine running a single GTX 280 for both graphics and PhysX would be in for a significant graphics performance hit.

If he can't get an 8800 GS for less than $40, the 9800 GT is cool.
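To put "rated shader compute power" into rough numbers, here's a back-of-the-envelope sketch; the SP counts, shader clocks, and the 3-FLOPs-per-clock dual-issue assumption for G92-class SPs are my own figures, not something from this thread:

```python
# Theoretical shader throughput of G92-class cards,
# assuming each SP dual-issues a MAD + MUL = 3 FLOPs per clock.
def shader_gflops(sp_count, shader_clock_mhz, flops_per_clock=3):
    return sp_count * shader_clock_mhz * flops_per_clock / 1000

print(f"8800 GS: {shader_gflops(96, 1375):.0f} GFLOPS")   # ~396 GFLOPS
print(f"9800 GT: {shader_gflops(112, 1500):.0f} GFLOPS")  # ~504 GFLOPS
```

Either way, both cards have hundreds of GFLOPS to throw at physics alone.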
#92
Fitseries3
Eleet Hardware Junkie
btarunr: Point is, those RV770s crunch graphics, but I don't think choosing a 9800 GT over an 8800 GS would translate to anything more than a higher power draw than what it already is.
i can get a 9800gt new in box for $40 but i'd have to pay retail for a 8800.

heat.... fuck heat... who cares anyway? it's for benching and i have some 130cfm fans i use to cool the vid cards anyway.
#93
DarkMatter
btarunr: I don't think those 50+ titles have PhysX content that would make a dedicated PhysX unit such as the 8800 GS sweat. 96 NVIDIA SPs is still a huge amount of rated shader compute power. The 192-bit memory bus doesn't matter; the card isn't transferring large chunks of data (such as textures), it's just crunching lots of math in real-time. To look at it another way, if a PhysX title did have a physics load that makes a dedicated 8800 GS sweat, a machine running a single GTX 280 for both graphics and PhysX would be in for a significant graphics performance hit.

If he can't get an 8800 GS for less than $40, the 9800 GT is cool.
I think we both know each other's points; we've just approached the thing in different ways. This is how I see it, in order of importance:

- Price: IMHO, a 9800 GT for $40 is a must-have. Period. :D

- Power consumption: the difference between the two cards is 10 W. The X2 alone consumes 300 W. Add 150 W for the HD 4870 and 120 W for the rest of the system, as well as 75 W for the baseline card (the GS), and we are talking about 645 W under full load. 645 or 655, who cares?

- Performance: probably in the next year a GT will not get you better performance than the GS, maybe not even in two years? Who cares? You won't need to change the card. IMO you can't apply the same criteria as with graphics, where a slightly underpowered card makes more sense because you will need to upgrade it soon anyway. And taking the price into account, are you really going to risk future performance, or the possibility of needing to upgrade the card a lot sooner, just to get a cheaper or slightly less power-hungry card?
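Writing that arithmetic out as a quick sketch (all wattages are the rough estimates above, not measurements):

```python
# DarkMatter's rough full-load estimate, in watts
hd4870_x2      = 300  # Radeon HD 4870 X2
hd4870         = 150  # Radeon HD 4870
rest_of_system = 120  # CPU, board, drives, etc.
physx_card     = 75   # baseline PhysX card (8800 GS)
gt_delta       = 10   # assumed extra draw of a 9800 GT over the GS

total = hd4870_x2 + hd4870 + rest_of_system + physx_card
print(f"{total} W with the GS, {total + gt_delta} W with the GT "
      f"(+{gt_delta / total:.1%})")
# -> 645 W with the GS, 655 W with the GT (+1.6%)
```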
#94
Fitseries3
Eleet Hardware Junkie
im also looking at resell value here. i can get the 9800gt and use it for a few months and still get more outta it than i paid.

not trying to dog on you, btarunr
#95
btarunr
Editor & Senior Moderator
fitseries3: i can get a 9800gt new in box for $40 but i'd have to pay retail for a 8800.

heat.... fuck heat... who cares anyway? it's for benching and i have some 130cfm fans i use to cool the vid cards anyway.
Then price is the only issue, go for it.
#96
btarunr
Editor & Senior Moderator
DarkMatter: - Performance: probably in the next year a GT will not get you better performance than the GS, maybe not even in two years? Who cares? You won't need to change the card. IMO you can't apply the same criteria as with graphics, where a slightly underpowered card makes more sense because you will need to upgrade it soon anyway. And taking the price into account, are you really going to risk future performance, or the possibility of needing to upgrade the card a lot sooner, just to get a cheaper or slightly less power-hungry card?
Two years from now, you think the industry will let you use a 9800 GT for PhysX? You'll be hit by the standards syndrome: they'll come up with "the latest PhysX engine requires a CUDA <insert advanced version here>-supporting graphics card", and naturally all existing hardware will become 'obsolete'. Anyway, that's Fits we're talking about. His hardware changes like the weather :laugh:
#97
Fitseries3
Eleet Hardware Junkie
what i really want to know is if it's worth the $40, or if i should get something that has nothing to do with gfx/physx.
#98
btarunr
Editor & Senior Moderator
fitseries3: what i really want to know is if it's worth the $40, or if i should get something that has nothing to do with gfx/physx.
Something that can put those cards to use: a game. :) If not, though, for $40 nothing beats a 9800 GT.
#99
Fitseries3
Eleet Hardware Junkie
i dont game though.
#100
DarkMatter
fitseries3: i dont game though.
Then IMO that money would be better spent buying some drinks for your friends or something. Or maybe send it to DarkMatter; he will surely put it to good use... :rolleyes: