# NVIDIA PhysX 5.0 Coming in 2020, Supports FEM for Deformable Physics



## P4-630 (Dec 26, 2019)

NVIDIA PhysX 5.0 Coming in 2020, Supports FEM for Deformable Physics

NVIDIA announced PhysX 5.0, a new version of its physics middleware SDK, coming in 2020. It supports the finite element method (FEM) for deformable physics.

Source: wccftech.com


----------



## rtwjunkie (Dec 26, 2019)

Very cool! Stuff like that will really increase the realism factor and I am all for it.  

Now they just need to not make it proprietary and PhysX would really take off, finally!


----------



## Steevo (Dec 26, 2019)

Their version is "open source" in the same way that if you bought a Ford, sure, it would drive on public roads, but only if you run Ford-brand gas, tires, oil, jacket, gloves, and air in the tires.


----------



## dj-electric (Dec 26, 2019)

rtwjunkie said:


> Now they just need to not make it proprietary and PhysX would really take off, finally!


After 12 years of owning this technology and its licensing, they just might.

Who am I kidding? It's mostly gonna stay in the realm of cool demos for another year.


----------



## eidairaman1 (Dec 26, 2019)

Steevo said:


> Their version is "open source" in the same way that if you bought a Ford, sure, it would drive on public roads, but only if you run Ford-brand gas, tires, oil, jacket, gloves, and air in the tires.



I call it panic mode since AMD announced it before NV did.


----------



## TheoneandonlyMrK (Dec 26, 2019)

rtwjunkie said:


> Very cool! Stuff like that will really increase the realism factor and I am all for it.
> 
> Now they just need to not make it proprietary and PhysX would really take off, finally!


It's Nvidia's implementation of FEM, which AMD released a PR piece on too, prior.
Seems like Nvidia making another land grab of an open, Microsoft-derived standard, no? Are these two different FEMs, or, like DXR under DX12, is this Nvidia's implementation?


----------



## ShrimpBrime (Dec 26, 2019)

theoneandonlymrk said:


> It's Nvidia's implementation of FEM, which AMD released a PR piece on too, prior.
> Seems like Nvidia making another land grab of an open, Microsoft-derived standard, no? Are these two different FEMs, or, like DXR under DX12, is this Nvidia's implementation?



Nvidia PhysX is pure hardware, rendered in real time.

Everything else is software, rendered on a CPU, not in real time.

Not sure how we can actually do good comparisons here.


----------



## rtwjunkie (Dec 26, 2019)

theoneandonlymrk said:


> It's Nvidia's implementation of FEM, which AMD released a PR piece on too, prior.
> Seems like Nvidia making another land grab of an open, Microsoft-derived standard, no? Are these two different FEMs, or, like DXR under DX12, is this Nvidia's implementation?


That’s the question we need answered.


----------



## Grog6 (Dec 26, 2019)

Hardware in an FPGA is still software; not sure of your argument.

If it takes a BIOS update and it changes stuff, it's software, in my opinion.

I'm learning VHDL, so it all looks like software to me.


----------



## Steevo (Dec 26, 2019)

ShrimpBrime said:


> Nvidia PhysX is pure hardware, rendered in real time.
> 
> Everything else is software, rendered on a CPU, not in real time.
> 
> Not sure how we can actually do good comparisons here.




Nah, one of the actual defining characteristics of PhysX was that it used a lot of precooked data.

Also, PhysX is a "brand" of physical interaction engine. It runs on the CPU or GPU; either provides real-time physical interaction calculations. The GPU can run some of them faster than the CPU because things like a constrained mesh are more parallel than serial in computation.

Nvidia bought this and then ported the PhysX engine to run on their GPUs, claiming that only their GPUs were "good enough" after years of their compute power being bested by ATI (compression and rounding in Nvidia hardware). And they pulled another version of tessellation. Meanwhile, only a few games used the GPU implementation (and one so badly that customers were refunded), and AMD won the consoles, so here we are.


----------



## TheoneandonlyMrK (Dec 26, 2019)

ShrimpBrime said:


> Nvidia PhysX is pure hardware, rendered in real time.
> 
> Everything else is software, rendered on a CPU, not in real time.
> 
> Not sure how we can actually do good comparisons here.


Not in real time, as if one's more real time than the other? Let's leave it at you're making assumptions here regarding FEM and Nvidia's implementation. I'm not arguing a point on an unknown; I'll just wait until we know.


----------



## ShrimpBrime (Dec 26, 2019)

theoneandonlymrk said:


> Not in real time, as if one's more real time than the other? Let's leave it at you're making assumptions here regarding FEM and Nvidia's implementation. I'm not arguing a point on an unknown; I'll just wait until we know.



Yes, hardware-rendered PhysX is real time.

Software-rendered is pre-defined physics outcomes.

So here is an NVidia FLEX demo (SDK 4.0) I ran personally. This is what real-time rendering looks like:
the rendering happens as you see it happening.
(edit) GPU specs: GTX 980, 1335/1800 clocks, about 96% usage.









A quick Google search shows real-time liquid rendering has been around since 2013, even though this was invented closer to 10 years prior by Ageia.
This is actual liquid rendering versus liquid particles with a texture overlay.





Nvidia’s PhysX has finally cracked realistic, real-time rendered water | ExtremeTech (www.extremetech.com)

No matter how great a video game heroine's hair looks, or how many individual furs are articulated on an anthropomorphic ...




Looks like real water when wireframe mode is enabled, don't it?? (video at 1:25)


----------



## TheoneandonlyMrK (Dec 26, 2019)

ShrimpBrime said:


> Yes, hardware rendered physx is real time.
> 
> Software rendered is pre-defined physics outcomes.
> 
> ...


Yes, but this is not that, is it?

And unless I'm wrong, you have not tried PhysX 5's FEM on Nvidia or AMD.

I asked a question, btw. If you want a debate, have a go at answering that, or continue to prove how much you know about something else that's been out a while.


----------



## ShrimpBrime (Dec 26, 2019)

theoneandonlymrk said:


> Yes, but this is not that, is it?
> 
> And unless I'm wrong, you have not tried PhysX 5's FEM on Nvidia or AMD.



I'm not interested in software physics rendered on a CPU with predefined end points.

Look, the difference is...

Software always has a predefined fall action and end point, for example: the brick falls and hits the ground in 5 certain ways.

Hardware on a GPU will be a random occurrence of where the brick falls. All calculations are rendered in that moment. The brick will never land the same way twice.

A mix of hardware and software physics is a good thing, though. I'm all for seeing the same bullet impact over and over if I can have a high particle count involved with it.
While a software bullet impact will have the same particle count each time, a hardware-rendered one will be a random amount each time, and that will depend on a lot of variables, whereas the software sees the impact and renders it how it was written.

I do not expect SDK 5.0 to be too much different from 4.0. Nvidia has the rendering down pat pretty well.

The above mentioned that NV wanted to hold onto it... well, any good company would like to make some money. If NV PhysX sells more cards, well, the company is there to make money. Some people don't like that and shun PhysX, but totally fail to see and understand the differences between real-time rendering and software pre-defined rendering.

For example, my video above shows constant randomness. That fluid does not act the same way twice.

No, a CPU doesn't have the horsepower over CUDA cores.

No, AMD doesn't render anything special on a CPU... at least not yet.

Question: when was the last time you played any game that was totally rendered on a CPU (without physics)? Most of you young ones (not MrK; general public) probably never.


----------



## cucker tarlson (Dec 26, 2019)

ShrimpBrime said:


> I'm not interested in software physics rendered on a CPU with predefined end points.
> 
> Look, the difference is...
> 
> ...


exactly.


----------



## TheoneandonlyMrK (Dec 26, 2019)

ShrimpBrime said:


> I'm not interested in software physics rendered on a CPU with predefined end points.
> 
> Look, the difference is...
> 
> ...


Sorry, but I'm not here to debate any of that. I respect your points, but you don't know what I have played.


I asked if this is Nvidia's implementation of FEM physics, and if it was an implementation of a STANDARD such that AMD and Nvidia are using similar code; admittedly not exactly like that, but close.

Is it just a land grab by Nvidia?

Nothing like your debate.


----------



## ShrimpBrime (Dec 26, 2019)

theoneandonlymrk said:


> Sorry, but I'm not here to debate any of that. I respect your points, but you don't know what I have played.
> 
> 
> I asked if this is Nvidia's implementation of FEM physics, and if it was an implementation of a STANDARD such that AMD and Nvidia are using similar code; admittedly not exactly like that, but close.
> ...



It's about making money.

Can't look down on any company trying to make money. 

Is that your debate then?


----------



## R-T-B (Dec 26, 2019)

Steevo said:


> Their version is "open source" as much as if you bought a Ford sure it would drive on public roads, but only if you run Ford brand gas, tires, oil, jacket, gloves and air in the tires.



It's literally licensed BSD-3. You can argue it doesn't have the features you want in open source sure, but you can't whine about it not being open source because you are too lazy to add them yourself. That goes against the whole concept of open source.



dj-electric said:


> After 12 years of owning this technology and its licensing, they just might.
> 
> who am i kidding, its mostly gonna stay in the realm of cool demos for another year.



It's used in every Unity game ever.

Everyone seems to conveniently forget that.



ShrimpBrime said:


> Software always has a predefined fall action and end point, for example: the brick falls and hits the ground in 5 certain ways.



This is an implementation issue more than what runs it.
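A toy illustration of that point (my own sketch, nothing from the actual PhysX source): whether a simulated outcome repeats is a choice made in the code, via seeding and timestep policy, independent of whether a CPU or a GPU runs it.

```python
import random

# Toy illustration (hypothetical, not PhysX code): a "falling brick" whose
# landing spot varies only because of how the implementation seeds its
# randomness. Repeatability is a code choice (fixed seed, fixed timestep),
# not a property of running on a CPU versus a GPU.
def drop_brick(seed=None):
    rng = random.Random(seed)        # fixed seed -> identical run every time
    x, vx = 0.0, rng.uniform(-1, 1)  # random sideways nudge at release
    for _ in range(100):             # fixed-timestep integration of motion
        x += vx * 0.01
    return round(x, 6)               # where the brick lands

# Same seed reproduces the exact same landing spot; seed=None varies per run.
assert drop_brick(seed=42) == drop_brick(seed=42)
```

So the "brick always lands 5 ways" behaviour is just an engine that chose canned outcomes; the same CPU could run the unseeded version and never repeat.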


----------



## cucker tarlson (Dec 26, 2019)

ShrimpBrime said:


> This is what real time rendering looks like.
> The rendering happens as you see it happening.
> (edit) Gpu specs GTX 980 1335/1800 clocks about 96% usage.



Meanwhile, the AMD demo looked like it ran at 30 fps, with pre-defined physics, on god knows how many cores.
A 980 at 1300 MHz is what, a stock GTX 1060? A three-and-a-half-year-old bottom-mid-range card.


----------



## ShrimpBrime (Dec 26, 2019)

cucker tarlson said:


> Meanwhile, the AMD demo looked like it ran at 30 fps, with pre-defined physics, on god knows how many cores.
> A 980 at 1300 MHz is what, a stock GTX 1060? A three-and-a-half-year-old bottom-mid-range card.



Not sure about the AMD scaling on CPU threads. I'd like to think at least a single core, two at best.
A GTX 980 would be somewhere about 5-10% faster than a 1060, but a 1060 would render PhysX just fine.

The push for PhysX could become big if it were implemented or pushed on game developers more.

We also see NV PhysX in movies; PhysX is not restricted to just gaming.
Here's a neat video clip showing a few movies utilizing HW NV PhysX.
Not the best, but it gets the point across. (2009)


----------



## Space Lynx (Dec 26, 2019)

Nice, another 30 fps hit I am going to have to turn off so I can enjoy my smooth high frame rates. So let's see: that's RTX I have to turn off, ambient occlusion to medium, shadows to medium, and PhysX to off, and I can play all my games at 120+ fps at 1440p with a 1080 Ti. Nice.


----------



## ShrimpBrime (Dec 26, 2019)

lynx29 said:


> Nice, another 30 fps hit I am going to have to turn off so I can enjoy my smooth high frame rates. So let's see: that's RTX I have to turn off, ambient occlusion to medium, shadows to medium, and PhysX to off, and I can play all my games at 120+ fps at 1440p with a 1080 Ti. Nice.



So you run 90 fps instead of 120 fps, and you could tell the difference how?

Your personal experience should differ. The guy with a 2080 Ti won't have your issues....


----------



## Space Lynx (Dec 26, 2019)

ShrimpBrime said:


> So you run 90 fps instead of 120 fps, and you could tell the difference how?
> 
> Your personal experience should differ. The guy with a 2080 Ti won't have your issues....



I can tell the difference between 144 and 165, actually. My eyes are very sensitive to smoothness. And yes, the guy with the 2080 Ti will: you can't even run Witcher 3 at 1440p, 165 Hz, 165 fps, everything maxed, with a 2080 Ti.


----------



## TheoneandonlyMrK (Dec 26, 2019)

ShrimpBrime said:


> It's about making money.
> 
> Can't look down on any company trying to make money.
> 
> Is that your debate then?


My question stands unanswered.

Debate? I'm just being specific; you're meandering.


----------



## ShrimpBrime (Dec 26, 2019)

theoneandonlymrk said:


> Seems like Nvidia making another land grab of an open, Microsoft-derived standard, no?



OK then.
Your answer is no.

Thanks for participating. lol.

PhysX isn't new. NVidia didn't just pop into the now with PhysX.

More like posturing for competition, and that's healthy in a race. (edit: It drives people to develop.)

Since the SDK is open source, why don't some of you NVidia users play around with the stuff?

You can DL it here.









NVIDIA PhysX 4.5 and 5.0 SDK: Latest features and libraries. (developer.nvidia.com)


----------



## TheoneandonlyMrK (Dec 26, 2019)

ShrimpBrime said:


> OK then.
> Your answer is no.
> 
> Thanks for participating. lol.
> ...


Sweet. Now, the first question?

That answer reeks of opinion too. PhysX is much newer than physics; you're making out that Nvidia invented something rather than adapted it for their use.

I get business reality and PhysX, so patronising me isn't going to help.

Go PhysX 5, really. It's just that I'm after decent games and experiences; who gives a shit how, eh? I was just prying for details, not a versus debate.

Playing fair, it's clear I'm sick of Nvidia's exclusive features; I've said so many times that closed, proprietary tech is not ideal to me. I get it, but don't want it. I think moving past walls, and competition on performance, is the future of GPUs, not exclusive versions of the same features.


----------



## ShrimpBrime (Dec 27, 2019)

theoneandonlymrk said:


> Sweet. Now, the first question?
> 
> That answer reeks of opinion too. PhysX is much newer than physics; you're making out that Nvidia invented something rather than adapted it for their use.
> 
> ...



Your comments are based more on your own feelings than facts.

No, NVidia did not invent HW PhysX; they purchased a very early release of it and have been building on it for more than a decade (closer to two decades).

Yes, I understand the frustration that you need CUDA cores to render their PhysX. That doesn't take away from the facts, nor should you let your feelings get in the way of seeing it for what it really is.

I know proprietary tech is not appealing. But if NVidia hadn't purchased it, AMD and NVidia users would have to purchase a PPU, or PhysX processing unit, just like when it was first released. The 3rd-party PPU released at around $250.
With NV purchasing Ageia way back when, since the 8000-series GPUs, NV PhysX was simply included with the card via the drivers. THIS is what made it difficult to push on game developers, because they understand exactly your point of view and also share the same feelings on proprietary hardware/software... which every manufacturer has a bit of, in one form or another.

I've done the AMD/ATI card with a dedicated PhysX GPU. It's fun and interesting, sometimes a pain in the butt to get working because of driver clashing; something I haven't done in a long time, actually.

I wish physics on a Cpu could be a thing and just be as amazing as Physics rendered on a dedicated gpu or ppu card.


----------



## TheoneandonlyMrK (Dec 27, 2019)

ShrimpBrime said:


> Your comments are based more on your own feelings than facts.
> 
> No, NVidia did not invent HW PhysX; they purchased a very early release of it and have been building on it for more than a decade (closer to two decades).
> 
> ...


I have no feelings on these matters; history is what it is. I was born in '76, so again, I lived it too, bro. I have and will have a go with any version of tech I can buy or borrow, just to see stuff.
Get off the us-vs-them bandwagon and back to the OP conversation: FEM physics. Looks promising; I have no hope of learning anything more here, apparently.

You mistook an opinion for a feeling; my opinions are based on facts, not other opinions or others' opinions.

It's tech. I get frustrated by problems with tech, not at all when it's working; how it works interests me too, though.

As I said, I'm after information on this new tech, not what's out now or opinions on who does PhysX best. I'm trying to be on topic, bro.


----------



## ShrimpBrime (Dec 27, 2019)

Sorry, late reply; was off to chow.
Firstly, FEM isn't exactly physics; it's the mathematics involved with the joining or disjoining of objects with overlapping components of simple geometry.

FEM is short for _Finite Element_ Method.
You're talking about FEMFX. Here you go.








FEMFX: A multithreaded CPU library for deformable material physics, using the Finite Element Method (FEM). (gpuopen.com)




This is all CPU-based physics computation. Experience this and pit it against GPU physics, and there's pretty much a night-and-day difference.

If we blend CPU and GPU PhysX, things start getting real, or more surreal if you're my age (which you are); this stuff is out of this world in comparison to computing 20 years ago.

There's nothing wrong with NV PhysX supporting FEMFX-style physics. It can only make things better, especially since all this PhysX and physics code is open source. We just need people to get on board with implementing this technology into far more than just gaming.

edit: FEMFX uses the Unreal Engine; it's not a standalone engine like NVidia PhysX. Just a little FYI there.


----------



## ShurikN (Dec 27, 2019)

R-T-B said:


> It's used in every Unity game ever.
> 
> Everyone seems to conveniently forget that.


And with every right. Unity games tend to be the worst optimized pieces of shit ever conceived. A well running (3D) Unity game comes once in a blue moon.


----------



## Ferrum Master (Dec 27, 2019)

ShurikN said:


> And with every right. Unity games tend to be the worst optimized pieces of shit ever conceived. A well running (3D) Unity game comes once in a blue moon.



Which one is good? Never seen such.

Subnautica with the latest patch needs to be manually assigned to see only 3 CPU cores; otherwise it is a 40 fps stutter fest.

Advanced PhysX engine, lol.


----------



## Space Lynx (Dec 27, 2019)

Ferrum Master said:


> Witch one is good? Never seen such.
> 
> Subnautica with the latest patch needs manually to assign seeing only 3 CPU cores, otherwise it is a 40fps stutter fest.
> 
> Advanced physx engine lol.



Which*


----------



## ShrimpBrime (Dec 27, 2019)

Ferrum Master said:


> Which one is good? Never seen such.
> 
> Subnautica with the latest patch needs to be manually assigned to see only 3 CPU cores; otherwise it is a 40 fps stutter fest.
> 
> Advanced PhysX engine, lol.



Subnautica does not use the advanced PhysX engine.

Here's the list of games that do utilize NV PhysX.





GameWorks PhysX Overview (www.geforce.com)

PhysX is a scalable multi-platform game physics solution supporting a wide range of devices, from smartphones to high-end multicore CPUs and GPUs. PhysX is already integrated into some of the most popular game engines, including Unreal Engine (versions 3 and 4), Unity3D, and Stingray. New! PhysX...




The thread is about NVidia PhysX SDK 5.0 and FEMFX. It has nothing to do with the Unity game engine.


----------



## Vayra86 (Dec 27, 2019)

Liking the recent physics push. I think it's clear this is where games, and also VR applications, need to work towards. Achieving realism is much more than just the picture; it's about how we can manipulate it. Otherwise it's just a glorified painting.


----------



## Ferrum Master (Dec 27, 2019)

ShrimpBrime said:


> The thread is about NVidia Physx SDK 5.0 and FEMFX. Has nothing to do with Unity game engine.



Unity's beta uses PhysX SDK 4.1, and eventually the 5.0. In its current sad state it is a hot mess in PhysX stability too; Havok is more stable and faster. Adding something more to a pile of turd will not help anything; the performance is already quite bad.

@R-T-B could explain even better.


----------



## Solid State Soul ( SSS ) (Dec 27, 2019)

Very cool. With next-gen consoles equipped with 8-core, 16-thread Zen 2 processors, devs shouldn't feel too restrained on the CPU, which hopefully means more interest in real-time particle solutions like PhysX or the AMD equivalent.

Remember that games are always optimized for consoles first, so with the quality of console games increasing, so does that of PC games.


----------



## TheoneandonlyMrK (Dec 27, 2019)

ShrimpBrime said:


> Sorry, late reply; was off to chow.
> Firstly, FEM isn't exactly physics; it's the mathematics involved with the joining or disjoining of objects with overlapping components of simple geometry.
> 
> FEM is short for _Finite Element_ Method.
> ...


I give up. You just put words in my mouth, dodged my question, then patronised the shit out of the situation. Loads of that's arguable, again, but I'm not here for that. Goodbye.
Not least that last bit: both can be added to any game engine.


----------



## Vayra86 (Dec 27, 2019)

theoneandonlymrk said:


> I give up. You just put words in my mouth, dodged my question, then patronised the shit out of the situation. Loads of that's arguable, again, but I'm not here for that. Goodbye.
> Not least that last bit: both can be added to any game engine.



I think the disconnect between you both is that @ShrimpBrime is talking about things 'as is' and you are thinking about things that 'may happen' or 'should happen'.

The reality is as it stands, and it kinda boils down to who made the bigger investment. It's not about a red-vs-green debate as much as these are proof-of-concept discussions. We have seen that GPU acceleration makes for night-and-day differences in complexity and performance. There is no coding around that; it's one of those things about physics, there are no tricks for it. It needs brute force to a degree, and GPUs are designed much better for that purpose, as they are much wider than a CPU.

So the direction forward is what we don't see answered here, and what you are really asking for. The direction forward is, and has been, that the most readily available AND applicable solution gets used. Here is the kicker though: applicable relates to hardware compatibility, but also hardware capability. And on the latter, CPU physics is an uphill battle, a PPU for physics is economically not viable, and GPU physics is often vendor-specific.

It's a major problem and dilemma for every stakeholder, and probably the main reason none of it has truly taken off yet and most of it is meh Havok or CPU PhysX...


----------



## notb (Dec 27, 2019)

OMG. One can get a serious headache reading some of the comments here.
"I dare you. I double dare you! Say _FEM physics_ one more time!"

*Clearing things up:*

FEM stands for Finite Element Method: a numerical technique for solving partial differential equations (PDEs) on computers.
Deformations of materials can be described with PDEs and can be solved using FEM.
FEM isn't used just for problems in physics and, clearly, not just for computing flexible objects in games.

*Now, what has happened:*

1) AMD launched FEMFX which is a *CPU* library for solving deformable objects. It's nothing new. We've been doing that for decades. And it's something a decently trained physics or material engineering student can do.
Of course this library may simplify game development, especially taking into account that some smaller studios may not employ people trained in advanced numerical methods.

2) Nvidia announced that FEM for deformable objects will be implemented in the next PhysX.
Now, this could mean one of two things:
- it's a *CPU* implementation, which makes it just a quick answer to FEMFX,
- it's a mixed *CPU+GPU* implementation, which, assuming a decent performance boost, would make it fairly game-changing.

Solving PDEs on GPUs is not a new idea and Nvidia as well as other entities are working on efficient implementations.
FEM provides a beautifully parallel problem, since it divides the system into cells that are calculated independently in each step.
However, after each step all cells have to be updated with new boundaries, which makes a GPU implementation somewhat challenging.
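That parallel-then-synchronise structure can be shown with a toy example. This is my own illustration (an explicit 1D heat-equation stepper, not FEMFX or PhysX code): every cell updates independently from the old state, and then all cells must see the new state before the next step.

```python
# Toy sketch (hypothetical, not PhysX/FEMFX code): explicit time-stepping of
# a 1D heat equation. Each interior cell's new value depends only on the *old*
# values of its neighbours, so the inner loop is embarrassingly parallel
# (GPU-friendly); the catch is the synchronisation point, where every cell
# must see the updated state before the next step begins.
def step(u, alpha=0.1):
    new = u[:]                      # next state, built from the old state only
    for i in range(1, len(u) - 1):  # each cell is independent -> parallelisable
        new[i] = u[i] + alpha * (u[i-1] - 2*u[i] + u[i+1])
    return new                      # swapping in `new` is the per-step sync

u = [0.0] * 11
u[5] = 1.0                          # a hot spot in the middle of the rod
for _ in range(50):
    u = step(u)                     # heat spreads symmetrically outward
```

The per-cell work maps cleanly to GPU threads; the step-boundary synchronisation is the part that makes a fast GPU implementation non-trivial.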


----------



## Vya Domus (Dec 27, 2019)

R-T-B said:


> It's used in every Unity game ever.



Uhm, no it isn't.

It *can* be used. Very different.


----------



## ShrimpBrime (Dec 27, 2019)

theoneandonlymrk said:


> I give up, you just put words in my mouth , dodge my question then patronise the shit out the situation, loads of that's arguable again but im not here for that ,goodbye.
> Not least that last bit, both can be added to any game engine.


None of that happened. I provided information with links to support it, and nothing more.

If an admin or mod felt I was being anything like what you are calling me, they would have stepped in and infracted me for it.

So quit the name calling, open your mind, and perhaps reword some of your questions so maybe we will understand them better.

notb pretty much recapped all I've said to you. Perhaps read his post.


----------



## TheoneandonlyMrK (Dec 27, 2019)

Vayra86 said:


> I think the disconnect between you both is that @ShrimpBrime is talking about things 'as is' and you are thinking about things that 'may happen' or 'should happen'.
> 
> The reality is as it stands, and it kinda boils down to who made the bigger investment. It's not about a red-vs-green debate as much as these are proof-of-concept discussions. We have seen that GPU acceleration makes for night-and-day differences in complexity and performance. There is no coding around that; it's one of those things about physics, there are no tricks for it. It needs brute force to a degree, and GPUs are designed much better for that purpose, as they are much wider than a CPU.
> 
> ...


Exactly. I'm after information on this new tech; he is invested in PhysX as it is now. Boring.

@ShrimpBrime, what name calling? Take it to PMs if you're bothered by what I said, but do go on and retake the last word if you must.


----------



## ShrimpBrime (Dec 27, 2019)

theoneandonlymrk said:


> Exactly. I'm after information on this new tech; he is invested in PhysX as it is now. Boring.
> 
> @ShrimpBrime, what name calling? Take it to PMs if you're bothered by what I said, but do go on and retake the last word if you must.



OK, PMs it is, but to recap...

I tried to provide you with information to help you understand how PhysX is NOW, so you can look forward and kinda know what to expect.

Like I was saying earlier: FEMFX being implemented, or "supported" (better word), by NVidia is a good thing, and NOT NVidia trying to steal land from someone. It's all open source now, shared technology that even YOU can dabble and play with right at your very own desktop. And you don't need CUDA (a workstation or Tesla card, for example) to run the SDK on your desktop, because the SDK is also in D3D.

I was also trying to explain, without going into vast technical detail, the difference between CPU physics rendering and GPU/PPU PhysX rendering.
(This is what I meant by "open your mind".)

That's all I'm trying to do here.
So with the links I've provided: read. Investigate.

Heck, you wanna try NV PhysX? I got a GTX 580 with a full-cover block on it I'll lend you.
Check your PM box.


----------



## notb (Dec 27, 2019)

ShrimpBrime said:


> Notb pretty much re-capped all Ive said to you. Perhaps read his post.


Yeah... that's what happens when you start a post at home and click send during a lunch break.


theoneandonlymrk said:


> Exactly, im after information on this new tech , he is invested in physx as it is now, boring.


We've given you enough information. You don't want to learn. Or you're just unable to accept some stuff.

*So once again, slower* (and maybe this time you'll ask some questions...)

Normally you have a problem that you want to solve. There's a system (that represents something real, like some bodies and dynamics). You have to find equations (a model) that describe what's going on, implement them and make the processor find a solution.

Some problems are modeled with PDEs (partial differential equations) and there are different ways to solve these using a computer.
FEM is the popular simulation approach (fairly simple and numerically intensive, but still far from brute force).

That's the general workflow.

Here we're talking about a specific case: deformable bodies.
So the system is known: there's an object (represented by a 3D mesh and material properties) and there are some forces.
The FEMFX and PhysX authors have chosen some equations that will provide a solution, and they've chosen the numerical technique to solve them: FEM.

You no longer have to spend a lot of time on development. You have a library - someone did it for you. And if you don't like the results or the speed, you still can write it yourself.
We've had libraries like these for a long time. It's just that this time they're optimized for games, not engineering.

That's it. There's no magic, no special tech, no great new inventions.


I'm not sure what else you'd like to know.
Both FEMFX and PhysX are open source. I can recommend some literature about computational physics if you're really fascinated by this topic...
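The workflow described above (object as mesh plus material properties, plus forces, discretised and solved) can be sketched in miniature. This is a hypothetical 1D chain-of-springs example of my own, not FEMFX or PhysX code: assemble a global stiffness matrix from per-element contributions, clamp one node, and solve K u = f for the displacements.

```python
# Toy sketch (hypothetical, not FEMFX/PhysX code): the classic static-FEM
# pipeline in 1D. Discretise into elements, assemble a global stiffness
# matrix K from per-element 2x2 blocks, apply loads f, solve K u = f.
def assemble_stiffness(n_elems, k=1.0):
    n = n_elems + 1                       # number of nodes
    K = [[0.0] * n for _ in range(n)]
    for e in range(n_elems):              # each element adds a 2x2 block
        for (i, j, v) in ((e, e, k), (e, e+1, -k), (e+1, e, -k), (e+1, e+1, k)):
            K[i][j] += v
    return K

def solve(K, f):                          # Gaussian elimination, partial pivoting
    n = len(f)
    A = [row[:] + [fi] for row, fi in zip(K, f)]  # augmented matrix [K | f]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]           # pivot for numerical stability
        for r in range(c + 1, n):
            m = A[r][c] / A[c][c]
            for j in range(c, n + 1):
                A[r][j] -= m * A[c][j]
    u = [0.0] * n
    for r in range(n - 1, -1, -1):        # back substitution
        u[r] = (A[r][n] - sum(A[r][j] * u[j] for j in range(r + 1, n))) / A[r][r]
    return u

# Chain of 3 unit springs, left node clamped (drop its row/column), unit pull
# on the free end: displacements grow linearly along the chain.
K = assemble_stiffness(3)
Kc = [row[1:] for row in K[1:]]           # clamp node 0 (Dirichlet condition)
u = solve(Kc, [0.0, 0.0, 1.0])            # approximately [1.0, 2.0, 3.0]
```

The game-oriented libraries do the same assemble-and-solve dance, just in 3D, per timestep, and with far more careful numerics.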


----------



## TheoneandonlyMrK (Dec 27, 2019)

ShrimpBrime said:


> OK, PMs it is, but to recap....
> 
> I tried to provide you with information to help you understand how physx is NOW, so you can look forward and kinda know what to expect.
> 
> ...


I have a few Nvidia cards; ty for the offer, but I'm fine.

And as I said, I know what PhysX is and is not atm.


----------



## ShrimpBrime (Dec 27, 2019)

notb said:


> I'm not sure what else you'd like to know.
> Both FEMFX and PhysX are open source. I can recommend some literature about computational physics if you're really fascinated by this topic...



Lay it on me!

Working with PhysX, system specs:

Intel 8700K
Maximus X Hero
Fast DDR4
GTX 770
GTX 580 (has an issue, will replace; have 3 cards)
Ageia PhysX PPU P100

NV driver 331 works, while 391 will not install. Black screen issues; maybe the bad 580.
The GTX 580 reports 99% usage at idle, temps 35 C. Card is bad, needs pulling.

All kinds of physics stuff going on.

Major Ageia/NV driver conflicts.
Not having good testing software or a benchmark.

Currently in use: PhysX SDK 4.0/4.1.

Demos cut from NV Ageia PPU 9.12 drivers.
Ageia PPU 8.10.13 drivers are meh; demos don't work.
Latest PPU 7.11.13 drivers: must uninstall NV PhysX drivers.

W10 testing at this time. Similar to W7, but it seems to support the drivers better.


----------



## R-T-B (Dec 28, 2019)

Vya Domus said:


> Uhm, no it isn't.
> 
> It *can *be used. Very different.



It's a slight exaggeration perhaps, but I mean, it literally is the stock physics implementation... and few bother replacing it.



Ferrum Master said:


> @R-T-B could explain even better.



No, even I can't explain the mess that is Unity I am afraid...


----------



## R-T-B (Jan 5, 2020)

ShurikN said:


> And with every right. Unity games tend to be the worst optimized pieces of shit ever conceived. A well running (3D) Unity game comes once in a blue moon.



This is more on the devs than Unity itself, though. I can think of quite a few examples of awesome Unity performance in my library.

Unity is at once accessible and confusing. That leads to a lot of poorly performing titles.


----------

