Monday, October 5th 2009
Hack Released to Enable PhysX on Windows 7 with ATI GPU Present
NVIDIA's PhysX technology has been on a roller-coaster ride since NVIDIA acquired the technology and its makers. As quickly as PhysX became one of the important selling points of NVIDIA's consumer graphics line GeForce, it also had its small share of controversy, linked to market dynamics more than anything else. With the technology's port to the GeForce GPU, enthusiasts fancied the freedom of choosing a primary GPU dedicated to rendering 3D graphics, paired with a second GPU just powerful enough to serve as a dedicated PhysX card.
Although pairing a powerful ATI Radeon GPU with a less powerful NVIDIA GeForce GPU for PhysX was possible on Windows XP, the succeeding Windows Vista restricted this by preventing two active display drivers from coexisting. Windows 7 removed this restriction, but before you could rejoice, NVIDIA quickly released driver-level code with its 186-series drivers that disables NVIDIA PhysX altogether whenever a GPU from another vendor is present and enabled, even an IGP for that matter. If that wasn't bizarre enough, with the latest drivers you can't even pair an Ageia PhysX PPU card with an ATI Radeon GPU going about its business. To the rescue comes a soft-modder's nifty bit of software that overrides this restriction in NVIDIA's drivers, so you can once again use dedicated GeForce PhysX cards on machines with ATI Radeon primary GPUs. The corrective driver patch comes from GenL, a community member of tech portal NGOHQ.com.
The patch, which you can download here, has been successful so far, going by community members' feedback. It lays to rest any argument NVIDIA might make that using dedicated PhysX cards with primary GPUs of your choice (which happen to be ATI Radeons) would be the end of the world for any reason other than, of course, market dynamics.
Speaking of which, here's NVIDIA's statement on why dedicated PhysX accelerators ought not to work with GPUs from other vendors: "PhysX is an open software standard; any company can freely develop hardware or software that supports it. NVIDIA supports GPU accelerated PhysX on NVIDIA GPUs while using NVIDIA GPUs for graphics. NVIDIA performs extensive Engineering, Development, and QA work that makes PhysX a great experience for customers. For a variety of reasons - some development expense, some quality assurance and some business reasons - NVIDIA will not support GPU accelerated PhysX with NVIDIA GPUs while GPU rendering is happening on non-NVIDIA GPUs."
Source:
NGOHQ
111 Comments on Hack Released to Enable PhysX on Windows 7 with ATI GPU Present
Nvidia went their route and ATI went theirs.
The problem is that physics processing should be an open standard that works for everyone, on any card, in any game. We will see what unfolds in the future as the battle looks to be a lose/lose for anyone who demands a proprietary standard that only runs on their hardware.
Just my opinion :toast:
And I'm not too fond of the idea of having two heat-producing graphics cards inside the computer.
I think GPUs either need a cheaper extra chip to do the physics calculations, so it's always available even if you purchase just one card, or physics acceleration will only take off once multi-CPU systems are common.
Side question, can people still use AGEIA made PhysX cards?
The GPU is well suited for doing physics, but not if everyone has their own proprietary standard and the game companies are coerced into using one method or another.
As I said, we will see how this all works out.
You put a product on the shelves that is meant to do a specific task/mission and advertise it. You list its minimum requirements on the box and on your website.
You can't come back a few months/years later and introduce new restrictions via software updates that weren't listed in the minimum requirements, after a lot of people have already bought the product.
This is absolutely illegal. Nvidia made a terrible mistake and opened its front door to lawsuits from its customers and its rivals. They will probably revert it and remove these restrictions - for their own good.
If you bought a DVD player and it suddenly stopped working for no reason, you'd be entitled to a refund/replacement. This is covered by countless legal precedents all around the globe.
Hell, PhysX was born out of Ageia!! :banghead:
Technically (and legally) they can. If they push a product into end-of-life status they are no longer required by law to support it with "new and improved" updates.
They do, however, have to announce the ending of support for the product. Whether they have done this or not, I cannot say as it could be buried in some small print tattooed to some lawyer's a$$.
The law protects consumers from such abusive behavior.
And oh... once my DVD player's warranty period is over, a refund is the last thing I'd expect from its manufacturer if it abruptly fails. :)
All I know is that in this case, the manufacturer broke into my house and broke my DVD player.
Those who play Mirror's Edge or Batman are left out in the cold if they have ATI cards.
Here's a note to you game developers who design such games:
No matter how much Nvidia is paying you to use PhysX instead of Havok or some other great engine that allows for awesome physics on the CPU (Crysis, Ghostbusters, etc.), please do not take Nvidia's bait and then leave nearly half of the gamers with a crippled version of your game.
PhysX already cripples the game in the first place with a few gimmick effects. Usually, when PhysX is used, the game is not allowed to have overall built-in physics as good as Crysis's, for example, and it has to be designed to be playable without PhysX for nearly half of the gamers out there. In other words, while the game will be somewhat crippled without those PhysX effects, you have to make sure the additional effects are not so integrated into the game that gameplay depends on them.
A good game should depend on its physics. Usually, the better the physics, the better the game can be by depending on them. Half-Life 2 depended on physics to a great degree, which earned it its reputation. Crysis also depended on integrated physics, in that the basic physics effects were always there no matter how much we dumbed down the settings. Games with such good engines never needed PhysX to begin with.
The gimmick PhysX effects could have been done using CPU-based algorithms producing nearly or exactly the same simulations, so that we could enjoy Mirror's Edge or Batman on both ATI and Nvidia cards, with much better performance than Nvidia's PhysX would ever have allowed even on top-end Nvidia cards. It would have resulted in greater sales revenue in the long run than Nvidia's TWIMTBP bribes. Also, you would never have been discouraged from building some of the physics effects into the game so that the game depends on them, without having to worry about breaking the game for those with ATI cards. For example, Mirror's Edge could have actually depended on the flags and breaking glass if they had been done with CPU-based physics.
There you go, dear developers.
Besides, the additional PhysX effects in Batman are a joke to me anyway. I'm still waiting for a game that would really convince me it's worthwhile to spend extra $$$ on physics-capable hardware.
I truly hope that this will all wash out with an open standard for physics that works for everyone. Especially me. ;)
And NVIDIA will HAVE to comply with Intel/AMD physics rules. Why? Who the heck wants to program for ONE GPU vendor when there are a bunch more, like ATi, S3 Graphics, Intel and Matrox, that basically make up most of the market?
One more thing: DirectCompute and OpenCL will arrive soon with Windows 7, and PhysX will then be dead. Most new games will use DirectCompute, as it's part of DirectX 11 and an open, Windows-only standard; the same goes for OpenCL, which is a fully open standard.