Nvidia should know that it is hurting gamers with ATI cards who already own a dedicated Nvidia card for PhysX. Under Vista this setup wasn't possible (the OS would not load two different display drivers at once), so Nvidia didn't have to do anything about it; but Windows 7 does allow two different GPUs to operate at the same time, and Nvidia responded by adding code to its drivers that disables PhysX whenever an ATI card is detected.
Those who play Mirror's Edge or Batman: Arkham Asylum are left out in the cold if they have ATI cards.
Here's a note to you game developers who design such games:
No matter how much Nvidia is paying you to use PhysX instead of Havok or another capable physics engine that runs awesome physics on the CPU (Crysis, Ghostbusters, etc.), please do not take Nvidia's bait and leave nearly half of the gamers with a crippled version of your game.
PhysX already cripples the game in the first place, adding only a few gimmick effects. When a game uses PhysX, it usually cannot ship with built-in physics as good as, say, Crysis's, because it still has to be playable without PhysX for the nearly one half of gamers who cannot run it. In other words, while the game will be somewhat crippled without those PhysX effects, you have to make sure the extra effects are never so integrated into the game that the gameplay depends on them.
A good game should be able to depend on its physics; usually, the better the physics, the more the game can lean on them. Half-Life 2 depended on physics to a great degree, and that earned it its reputation. Crysis also depended on integrated physics, in that the basic physics effects were always there no matter how far we dumbed down the settings. Games with engines that good never needed PhysX to begin with.
The gimmick PhysX effects could have been done with CPU-based algorithms producing nearly or exactly the same simulations, so that we could enjoy Mirror's Edge or Batman on both ATI and Nvidia cards, and with better performance than Nvidia's PhysX ever allowed even on top-end Nvidia cards. In the long run that would have meant greater sales revenue than Nvidia's TWIMTBP bribes. You would also never have been discouraged from building some of those physics effects into the game so that the gameplay depends on them, without having to worry about breaking the game for ATI owners. Mirror's Edge, for example, could have actually depended on its flags and breaking glass had they been done with CPU-based physics.
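To show this is not wishful thinking: a waving flag of the kind PhysX handles in Mirror's Edge is just a cloth simulation, and the standard CPU technique (Verlet integration over a grid of particles with distance constraints, described in Thomas Jakobsen's GDC 2001 "Advanced Character Physics" talk) was running in games years before GPU physics existed. Below is a minimal sketch of that technique; the grid size, wind function, and constants are my own illustrative assumptions, not anything taken from the actual games.

```cpp
// Minimal CPU cloth (flag) simulation: Verlet integration plus distance
// constraints, after Jakobsen's "Advanced Character Physics" (GDC 2001).
// Grid size, time step, wind, and iteration counts are illustrative guesses.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float length(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

constexpr int   W = 20, H = 12;     // particles across / down the flag
constexpr float REST = 0.1f;        // rest distance between neighbors (m)
constexpr float DT = 1.0f / 60.0f;  // fixed 60 Hz step

struct Particle { Vec3 pos, prev; bool pinned; };

int main() {
    std::vector<Particle> p(W * H);
    auto at = [&](int x, int y) -> Particle& { return p[y * W + x]; };

    // Lay the flag out flat; pin the left edge to the "pole".
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            Vec3 pos = {x * REST, -y * REST, 0.0f};
            at(x, y) = {pos, pos, x == 0};
        }

    for (int step = 0; step < 600; ++step) {  // simulate 10 seconds
        // 1. Verlet integration: velocity is implicit in (pos - prev),
        //    wind is crudely modeled as a time-varying acceleration.
        float t = step * DT;
        Vec3 wind = {0.0f, 0.0f, 2.0f + std::sin(t * 3.0f)};
        Vec3 gravity = {0.0f, -9.8f, 0.0f};
        for (auto& q : p) {
            if (q.pinned) continue;
            Vec3 accel = gravity + wind;
            Vec3 next = q.pos + (q.pos - q.prev) * 0.99f  // 1% damping
                        + accel * (DT * DT);
            q.prev = q.pos;
            q.pos = next;
        }
        // 2. Relax distance constraints between grid neighbors a few times
        //    so the cloth keeps its shape instead of stretching apart.
        for (int iter = 0; iter < 4; ++iter)
            for (int y = 0; y < H; ++y)
                for (int x = 0; x < W; ++x) {
                    auto solve = [&](Particle& a, Particle& b) {
                        Vec3 d = b.pos - a.pos;
                        float len = length(d);
                        if (len < 1e-6f) return;
                        Vec3 corr = d * ((len - REST) / len * 0.5f);
                        if (!a.pinned) a.pos = a.pos + corr;
                        if (!b.pinned) b.pos = b.pos - corr;
                    };
                    if (x + 1 < W) solve(at(x, y), at(x + 1, y));
                    if (y + 1 < H) solve(at(x, y), at(x, y + 1));
                }
    }
    // Print the free corner so the run produces visible output.
    Vec3 c = at(W - 1, H - 1).pos;
    std::printf("free corner after 10 s: (%.2f, %.2f, %.2f)\n", c.x, c.y, c.z);
}
```

A 20x12 grid relaxed four times per frame costs a few thousand floating-point operations per step, trivial for any CPU of that era, which is exactly why Jakobsen-style cloth shipped in Hitman: Codename 47 and countless other games without any GPU physics at all.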
There you go, dear developers.