Thursday, March 5th 2015
NVIDIA Frees PhysX Source Code
After Epic's Unreal Engine 4 and Unity 5 game engines went "free," with their source code put up by their makers for anyone to inspect, NVIDIA decided to join the bandwagon of showering game developers with technical empowerment by putting the entire source code of PhysX 3.3.3, including its cloth and destruction physics code, on GitHub. The move appears to be linked to the liberation of the Unreal Engine 4 code.
NVIDIA PhysX has been the principal physics component of Unreal-driven game titles for several years now. There's a catch, though: NVIDIA is only freeing the CPU-based implementation of PhysX, and not its GPU-accelerated one, which leverages NVIDIA's proprietary CUDA GPU compute technology. There should still be plenty for game devs and students in the field to chew on. In another interesting development, the PhysX SDK has been expanded from its traditional Windows roots to cover more platforms, namely OS X, Linux, and Android. Find instructions on how to get your hands on the code at the source link.
Source:
NVIDIA
56 Comments on NVIDIA Frees PhysX Source Code
The x87 FPU was 80-bit precision. Strictly speaking, that's a 64-bit significand, with the rest going to exponent and sign. But it still had all sorts of problems/bugs.
SSE was 32-bit with some 64-bit internal calcs, but it wasn't consistent in what and how, so reversing the order of a calculation could give different results. SSE2 was better with its consistently 64-bit arithmetic. That is still a far cry from 80-bit extended or 128-bit quad precision. For gaming graphics, 64-bit is more than enough. For calculating interest-rate options, doing experimental science, calculating pi, or doing "critical" calculations (example: en.wikipedia.org/wiki/Cluster_(spacecraft)), you'd better look for a better FP unit.
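To make the precision gap concrete, here's a tiny C++ demo (my own illustration, not from the thread): 2^60 + 1 fits exactly in x87's 64-bit significand, but an SSE2 double's 53-bit significand rounds it away. This assumes a compiler where long double maps to the 80-bit x87 format (gcc/clang on x86; MSVC treats long double as a plain 64-bit double, so there you'd see no difference).

```cpp
#include <cstdio>

int main() {
    // 2^60 + 1 needs a 61-bit significand. It fits in the x87 extended
    // format's 64-bit significand, but a double near 2^60 has an ulp of
    // 2^8 = 256, so the +1 is rounded away.
    long double ext = 1152921504606846977.0L; // 2^60 + 1, kept exactly
    double      dbl = 1152921504606846977.0;  // rounds to 2^60

    printf("80-bit extended: %.0Lf\n", ext);
    printf("64-bit double:   %.0f\n", dbl);
    return 0;
}
```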
I remember looking at the Ageia/PhysX board about 10 years ago for credit-derivative calculations. It was interesting at the time for its speed and price, and also as a way to develop a semi-proprietary hardware/software combo that wouldn't be easily copied by the target client. We didn't go with it because, while it was fast, the FP wasn't accurate enough. Remember, physics modelling (millions of objects doing approximately the right thing) is different from complex computation (tens, hundreds, or thousands of calculations requiring absolute accuracy).
www.nextpowerup.com/news/18839/nvidia-disables-mobile-gpu-overclocking-via-bios.html
PhysX is far more competent with particle interaction, but regardless of that, it's still very limited.
Only full involvement from devs would let PhysX make more of an impact, but that won't happen.
PhysX is a gimmick, even less useful per se than AMD's Mantle. Both are proprietary and both have had their day. DX12 and CPU physics will replace them both, probably more pragmatically.
Scripted effects, no matter how impressive, don't really mean anything.
I know the skyscraper in BF4 is fully scripted; no dynamic physics going on. Not sure about the rest.
Dynamic physics requires much more processing power and is harder to implement, which is why most developers go the scripted route for more complex effects.
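For the sake of illustration, here's a toy C++ sketch (my own, not from any shipping engine) of why scripted effects are cheap while dynamic physics eats cycles: playback just copies pre-baked keyframes, while a dynamic step has to integrate and test every body for contacts on every frame.

```cpp
#include <vector>

struct Body { float x, y, vx, vy; };

// Scripted: play back pre-baked positions -- one copy per object per frame.
void scriptedStep(Body& b, const std::vector<Body>& baked, int frame) {
    b = baked[frame];
}

// Dynamic: integrate forces and resolve contacts every frame. This toy only
// tests against the ground; real engines also test body-vs-body pairs,
// which is where the cost explodes without broad-phase culling.
void dynamicStep(std::vector<Body>& bodies, float dt) {
    for (auto& b : bodies) {
        b.vy += -9.81f * dt;      // gravity
        b.x  += b.vx * dt;        // semi-implicit Euler integration
        b.y  += b.vy * dt;
        if (b.y < 0.0f) {         // crude ground contact
            b.y  = 0.0f;
            b.vy *= -0.5f;        // bounce, losing half the velocity
        }
    }
}
```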
The Batman: Arkham games have some of the best dynamic physics effects out there.
It all comes down to the developers of a game to properly utilize a physics engine. If a game's dynamic physics effects are lacking or bad, that doesn't mean the engine itself is bad.
There isn't much difference on the CPU side.
By particles in inFamous, do you mean the particles produced by the protagonist when he uses his powers? How do you know they're not scripted? If they're not, there's no reason they couldn't be done with any other CPU physics engine.
Here you can find a series of tests performed with PhysX and Bullet. Publishing Havok's test results is forbidden (how curious). The author talks about Havok on the conclusion page.
physxinfo.com/news/11297/the-evolution-of-physx-sdk-performance-wise/
So what about GPU PhysX? When the number of effects on-screen is too much for the CPU to handle, developers use PhysX's GPU-accelerated modules. It's not that CPU PhysX cannot run those effects; it can, but it's simply not possible to run such heavy effects on the CPU with good performance. The FPS will tank to single digits.
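For anyone curious what that split looks like in code: in the PhysX 3.x SDK, a scene is pointed at a CPU dispatcher and, optionally, a GPU one. A rough sketch from memory of the 3.3-era API follows, so treat the exact calls and the helper function as approximate:

```cpp
#include <PxPhysicsAPI.h>
using namespace physx;

// foundation and physics are assumed to be created elsewhere via
// PxCreateFoundation()/PxCreatePhysics().
PxScene* createScene(PxPhysics& physics, PxCudaContextManager* cudaMgr) {
    PxSceneDesc desc(physics.getTolerancesScale());
    desc.gravity      = PxVec3(0.0f, -9.81f, 0.0f);
    desc.filterShader = PxDefaultSimulationFilterShader;

    // CPU path: works on any hardware, no GeForce required.
    desc.cpuDispatcher = PxDefaultCpuDispatcherCreate(4); // 4 worker threads

    // Optional GPU path: only taken when a valid CUDA context exists,
    // i.e. on NVIDIA cards.
    if (cudaMgr && cudaMgr->contextIsValid())
        desc.gpuDispatcher = cudaMgr->getGpuDispatcher();

    return physics.createScene(desc);
}
```

The point being: the CPU dispatcher is always there, and the GPU dispatcher is strictly an add-on, which is why CPU PhysX runs on anything.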
You might think the GPU physics effects in, say, Batman: Arkham Asylum or Arkham City aren't better than inFamous, but remember that they are ALL dynamic. No scripted effects here.
This is from arkham asylum. Jump to 1:51.
When Crane is moving his hand through those rigid bodies, every single collision is dynamic and calculated, and there's a huge number of them. This just cannot be done with a good FPS on the CPU, no matter the engine.
I remember running this game with GPU effects on, and this scene brought even my 8800 GTS to a crawl.
If you have a GeForce, you can download the dedicated PhysX benchmark FluidMark and test its performance for yourself. You can also use it if you don't have a GeForce, but only the CPU tests will run, obviously.
...after 6-7 years, is it? :rolleyes:
Did you even care to read all those links provided? Guess not. PhysX is just as fast as or faster than other physics engines in CPU mode, as has been shown again and again. Are you ignoring the provided links on purpose?
Just to clarify: CPU PhysX does NOT require a GeForce card to be present. It runs entirely on the CPU, so it's hardware agnostic.
Well, PhysX is here to stay. It's even Unreal Engine's default, built-in physics engine.
Surely those guys at Epic are insane for choosing such an inefficient engine. /s
Yes, GPU physics is probably the way of the future. It's just taking too long.
AFAIK, PhysX is so far the only physics engine with optional GPU acceleration, and since that's locked to NVIDIA GPUs, it will not become widely supported.
Either NVIDIA should port it to OpenCL and/or DirectCompute, or Havok should come up with a universal GPU-accelerated solution, which would probably force NVIDIA's hand too.
For some reason you're fixated on Havok, as if only it has the right to exist. There are even some game engines, like CryEngine and Frostbite, that use in-house physics and don't bother with Havok. If you mean GPU-accelerated effects, yes. It's not logical to implement those effects in the gameplay if they're limited to GeForce cards; that would break the gameplay on non-GeForce cards.
That won't change until NVIDIA opens up GPU PhysX and allows it to run on non-NVIDIA cards, but they seem unwilling, unfortunately.
It wouldn't hurt if Havok made an open GPU module and kicked NVIDIA's rear.
It's just like how only 8% of internet browsers used are Internet Explorer. If it wasn't for MS shoehorning IE support into every one of their devices, NO ONE WOULD USE IT. That's because other browsers like Chrome and Firefox are flat-out superior.
P.S. I only mentioned Havok because that's the one I remembered off the top of my head. There are plenty of physics engines out there worth using. However, PhysX is not one of them unless NVIDIA forces you.
Now I see that you're trolling. There was no point in replying to you at all; I'd assumed you were a reasonable and knowledgeable person.
You also think IE's market share is 8%?! WOOWW. Even StatCounter says it's higher than that.
Some people just can't be helped.
www.w3schools.com/browsers/browsers_stats.asp
Just like IE, PhysX is being phased out because it's simply inferior to its competitors' products. If you want to talk about a game, how about you mention a specific one?
gs.statcounter.com/
Really? So NVIDIA is phasing out PhysX by introducing newer versions?! Didn't know people phased out software by continuing its development. Countless tests have been done showing PhysX is just as fast, and I posted links too, and yet you repeat the same thing without bothering to read any of them.
I don't have to mention a specific one. You said if a game, any game, uses PhysX, then NVIDIA forced them to. Here:
www.physxinfo.com/index.php?p=gam&f=all
That's the list of all 581 PhysX games. Show everyone that ALL of them were forced by NVIDIA. MAANN.
If you mean GPU-accelerated, then that would be the case (NVIDIA should really open that up; a universal GPU path for physics would make games more realistic, since right now physics is used merely for eye candy, not real physics), but the CPU path has its future.
No, not ALL PhysX titles were forced, but we can all agree that a big reason PhysX has noteworthy market share is NVIDIA's money-hatting.