@Benetanegia
You still don't understand. In the past we at least had 6 nodes and 6 polygons. Today we don't even have that. The object isn't even there in non-PhysX mode. That's just plain dumb.
We could talk about a normal version with 50 nodes and just as many polygons, and about hi-def objects with thousands of nodes and twice as many polygons. But there is no normal version alongside the hi-def one. That's the WTF for me.
Particles? I've seen them on a GeForce 2, in the Lightning demo. Or in Quake 3 Arena with the CorkScrew mod.
The railgun slug emitted up to 999 physically affected sparks upon hitting a wall. And that was around the year 2000. Glass breaking in Red Faction (2001) looked better than the PhysX glass in Mirror's Edge. Except the glass in Red Faction ran easily on an Athlon XP 2400+, while the PhysX glass can't run smoothly on a 4 GHz quad-core i7. If that doesn't raise an eyebrow, then I'm not sure what will.
Cloth simulation was already done in Unreal Engine (just look at the flags in UT99, they look freakin' awesome even today, and it's a game from 1999). Not to such an extent, but it was done. On 10 times weaker hardware. Today? The flags are just gone. Missing. Not there. Unless you have an "uber" PhysX gfx card. Pheh. Clothing was also simulated in the Hitman series and Splinter Cell. Again in a slightly simplified way, but it was there, running smoothly on an Athlon XP 2400+.
Don't tell me they couldn't do all that, scaled up tenfold, on today's hardware.
Instead they'd rather remove the object entirely than ship a normal-definition version of it.
I don't care how much horsepower you need today or how many clusters gfx cards have. That's completely irrelevant information. What matters is the relation between hardware performance increases over time and what we've actually been given, compared to what we saw in the past.
If we saw basic cloth simulation, advanced particles, destruction, ragdolls etc. in the year 2000 on comically weak hardware (by today's standards), one would expect something at least on that level, or improved tenfold, today on powerful quad cores, loads of memory and 10-times-faster graphics cards. But have we seen any of it? OK, partially, on PhysX-enabled graphics cards. But what about CPU physics? Flags are just gone, broken glass just fades out before it even hits the ground, environments are static, etc. Thank god at least ragdolls remained.
I wouldn't mind blocky flags like the ones in games from 2000. At least the flag was there and I could interact with it. But I don't even have the flag there, because I don't have a GeForce card. It's just gone. Entirely. LMAO.
And it's open, and it would have been open to Ati users if they had said yes when Nvidia offered PhysX to Ati for free with no conditions, or if they had supported the guy from ngohq.com when he made a modded driver that made it possible, like Nvidia did, instead of scaring the hell out of him with legal demands. But of course, back then Ati had nothing to compete in that arena and Nvidia cards would have destroyed theirs, so they said no, no, nooo! And now poor Ati users can't do anything but cry and say it's not that great.
You're not looking at the bigger picture. Sure, the guy at NGOHQ made the PhysX hack. But imagine all the crap users would be throwing at ATI if this hack failed to work properly in certain games, or if games were crashing. No one would blame NGOHQ; they would rush to blame ATI instead. Been there, seen that numerous times with Microsoft. It was an NVIDIA driver that crashed, and the users were spitting on Windows Vista and how crappily it was made. But it wasn't even Vista's fault. It was NVIDIA's driver (or any other driver, for that matter) that crashed.
So I perfectly understand why ATI refused to cooperate. They'd go the PhysX way if NVIDIA sent them the entire documentation, SDKs, everything, with the same full capability NVIDIA has. But supporting a hack made by a third party? That's just not logical. Sorry. Why do laptop companies refuse to support any graphics drivers other than the ones on their own webpage? For the exact same reason: troubleshooting, tech support, and the bad word that could spread about brand "X" because someone fried his graphics card with hacked drivers or something.
I don't mind PhysX, it's a great thing actually. I just hate the way NVIDIA pushes it around and places stupid restrictions on it. All these stupid restrictions are damaging the evolution of games and game physics. Imagine what would happen if PhysX were an open standard that anyone could implement and use on ANY graphics card. I'd bet 1000 EUR that we'd see at least 5 high-profile games with awesome physics effects that everyone could enjoy, not just GeForce users. It's really that simple.