Friday, May 28th 2010

NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

NVIDIA has reportedly removed the driver-level code which prevented users from having an NVIDIA GeForce GPU process PhysX while an ATI Radeon GPU takes the lead, processing graphics. Version 257.15 Beta of the GeForce drivers brought about this change. Commercial interests may have played a part in NVIDIA's previous decision to block the combination: users could buy an inexpensive GeForce GPU to pair with a high-end DirectX 11 compliant Radeon GPU, thereby reducing NVIDIA's margins, though officially NVIDIA maintained that the restriction was in place to ensure quality assurance. The present move also seems to have commercial interests in mind, as it lets NVIDIA clear inventories of older GeForce GPUs, at least to users of ATI Radeon GPUs. NVIDIA replenished its high-end offering recently with the DirectX 11 compliant GeForce 400 series GPUs.

Update (28/05): A fresh report by AnandTech says that the ability to use a GeForce for PhysX in systems with graphics led by Radeon GPUs under the 257.15 beta driver is just a bug and not a feature. This ability may therefore be a one-off for this particular driver version, and future drivers may not retain it.
Source: NGOHQ.com
Add your own comment

276 Comments on NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

#51
GotNoRice
zAAmThe Vista limitation will be there forever as far as I know, since it uses WDDM 1.0. This ensures that only one display driver can be used at a time, so unless ATI cards start using nVidia drivers (or vice versa) or Microsoft brings out a WDDM 1.1 patch for Vista, it will never work on Vista. ;)
Windows Vista was updated to WDDM1.1 at the same time it got DirectX11, in the platform update:

en.wikipedia.org/wiki/Windows_Vista_Platform_Update#Platform_Update
Posted on Reply
#52
cadaveca
My name is Dave
GotNoRiceWindows Vista was updated to WDDM1.1 at the same time it got DirectX11, in the platform update:

en.wikipedia.org/wiki/Windows_Vista_Platform_Update#Platform_Update
Hmmm...
For example, even though DXGI 1.1 update introduces support for hardware 2D acceleration featured by WDDM 1.1 video drivers, only Direct2D and DirectWrite will employ it and GDI/GDI+ will continue to rely on software rendering.[citation needed] Also, even though Direct3D 11 runtime will be able to run on D3D9-class hardware and WDDM drivers using "feature levels" first introduced in Direct3D 10.1, Desktop Windows Manager has not been updated to use either Direct3D 10.1 or WARP software rasterizer
Posted on Reply
#53
dadi_oh
TRIPTEX_MTLWhat's does a decent physx GPU run for now? If I could pick up a card for less that $50 I'd consider it.
The GT 240 is a good bet. Got mine in a Newegg Shell Shocker deal for less than $50. It has really low power consumption (40 nm process) and needs no extra power connector. And with 96 shaders, I have yet to see it exceed 50% GPU usage. If that's the case, then even a lowly GT 220 would do (48 shaders).
Posted on Reply
#54
xtremesv
LOL, sadly I just recently sold my GT 220 PPU, which I initially bought to play BAA. I would've kept it if only I could've also used it as a dedicated CUDA card to enable the special filters in Just Cause 2.
Posted on Reply
#55
Unregistered
I knew Nvidia was going to do this, I am not sure why this was a shocker. This is just another reason why I will still buy Nvidia products.
Posted on Edit | Reply
#56
GotNoRice
cadavecaHmmm...
That quote about the Desktop Windows Manager not using DX10.1 is just about Aero. In Vista, Aero uses DX9; in 7, Aero uses DX10.1 (or 9 via feature-level support). All it's saying is that Vista still uses DX9 for Aero and was never updated to use 10.1.

It's not talking about game support or anything like that.
Posted on Reply
#57
EastCoasthandle
qubitFinally they see sense. :rolleyes: If it's to get rid of old cards, then does that suggest they'll reintroduce the restriction in a later driver, all in the name of "Quality Assurance"??
You hit the nail on the head. There is no telling when they will re-activate this policy.
Posted on Reply
#58
dadi_oh
GotNoRiceWindows Vista was updated to WDDM1.1 at the same time it got DirectX11, in the platform update:

en.wikipedia.org/wiki/Windows_Vista_Platform_Update#Platform_Update
I couldn't get it to work on my son's machine with Vista Home Premium 32 bit. Trying to find the actual "platform update" in the myriad of updates at download.microsoft was impossible. The machine was fully updated via windows update but it still didn't work. Couldn't find the actual "platform update" even though the wikipedia article gives the date of the release.

Has anyone ACTUALLY got Vista to run with ATI and Nvidia drivers at the same time?
Posted on Reply
#59
DaedalusHelios
air_iiLet's hope they don't take that back later (say, after you buy a 9800gt) ;).
You definitely need a G92 8800GT or better for PhysX in games like Batman: AA and others. The Shellshocker right now is not good enough if that's what you might be looking at, as the 128-bit memory bus cripples that card. :toast:
EastCoasthandleYou hit the nail on the head. There is no telling when they will re-activate this policy.
They won't, because the cat is now out of the bag. Let's not let pessimism and paranoia infect the thread. We have no reason to think they would be OK with committing reputation suicide by revoking it. It would mess up marketing campaigns and similar things in the retail segment. The only "what if" is whether they will enable its acceleration on ATi GPUs; it would work just as well there, and there is no denying that.
Posted on Reply
#60
kenkickr
EastCoasthandleYou hit the nail on the head. There is no telling when they will re-activate this policy.
But at least we know it's breakable if they do go back to "blocking" out ATI.
Posted on Reply
#61
Mussels
Freshwater Moderator
kenkickrBut at least we know it's breakable if they do go back to "blocking" out ATI.
it was cracked/broken before Nv reversed it anyway.
Posted on Reply
#62
Phxprovost
Xtreme Refugee
DaedalusHeliosThey won't, because the cat is now out of the bag. Let's not let pessimism and paranoia infect the thread. We have no reason to think they would be OK with committing reputation suicide by revoking it. It would mess up marketing campaigns and similar things in the retail segment. The only "what if" is whether they will enable its acceleration on ATi GPUs; it would work just as well there, and there is no denying that.
O rite cause a company that gives the CEO a "working sample" held together by wood screws clearly cares what the public thinks :rolleyes:
Posted on Reply
#63
DannibusX
I'll keep my eye on this; if nVidia is seriously going to allow their cards to be used as PPUs, I will buy a new card for PhysX. One of the 200 series. I'm using an 8800GT at the moment.
Posted on Reply
#64
RejZoR
Musselscool.... now i can slap in my 8600GT and... uhh.. well actually i have no games that use PhysX, lol. might acquire metro 2033 just for this.
Well, if you have a Core i7, you don't even need HW PhysX. I tried it with Advanced PhysX enabled and it worked pretty well on the CPU itself. Maybe a slight slowdown, but still pretty much playable. As opposed to Mirror's Edge, where it lags like insane with Advanced PhysX with only one physics-affected object moving around. Heh.
Posted on Reply
#65
Mussels
Freshwater Moderator
PhxprovostO rite cause a company that gives the CEO a "working sample" held together by wood screws clearly cares what the public thinks :rolleyes:
i want details on whatever it is you are discussing. you have aroused my curiosity.
Posted on Reply
#66
cadaveca
My name is Dave
He's talking about the developer conference where JH first showed Fermi, which was a GTX480 with the end sawn off.
Posted on Reply
#67
Mussels
Freshwater Moderator
cadavecaHe's talking about the developer conference where JH firist showed Fermi, which was a GTX480 with the end sawn off.
i wanna seeeeee it
Posted on Reply
#68
DaedalusHelios
PhxprovostO rite cause a company that gives the CEO a "working sample" held together by wood screws clearly cares what the public thinks :rolleyes:
Yeah man, they are like evil incarnate and eat babies. Seriously, real babies. They aren't even dead yet when they start eating them. A comic book villain runs Nvidia and he will never stop until the whole world hates him.

OR

Nvidia operates like any other business and profit is their only concern. AMD, Intel, and Nvidia all answer to stockholders. AMD is not the Messiah, and Nvidia is not run by Satan.
Posted on Reply
#69
Phxprovost
Xtreme Refugee
DaedalusHeliosYeah man, they are like evil incarnate and eat babies. Seriously, real babies. They aren't even dead yet when they start eating them. A comic book villain runs Nvidia and he will never stop until the whole world hates him.

OR

Nvidia operates like any other business and profit is their only concern. AMD, Intel, and Nvidia all answer to stockholders. AMD is not the Messiah, and Nvidia is not run by Satan.
:wtf: 'Cause I ever made that claim, right? I'm simply saying it would not surprise me at all if maybe 3 months from now Nvidia decides to turn this off in a driver update... you know, kinda like they did in the past? Or are we just choosing to ignore that? :rolleyes:
Posted on Reply
#70
theubersmurf
BarbaricSoulis it really worth it? I mean really, how many games actually use PhysX

I had a 8800gt running with my 5870 at one point, wasn't worth the trouble to set it up though.
The Unreal Engine is heavily licensed and uses PhysX; that alone may make it worthwhile.
Posted on Reply
#71
crow1001
WTF has PhysX got to offer apart from some crappy effects that have no effect on gameplay whatsoever? Very limited support in games, current and future, with the majority being complete balls. Oh yeah, expect a 50% drop in FPS in PhysX hardware-accelerated games. Havok FTW. :rockout:
Posted on Reply
#72
MilkyWay
Too late, because the number of games that use PhysX is limited and the popularity it had has dropped; they kinda dropped it when people were buying second cards for PhysX.

To me it's a waste of, say, £20-£40.

But it's a nice gesture if you have one lying around.
Posted on Reply
#73
DaedalusHelios
Phxprovost:wtf: 'Cause I ever made that claim, right? I'm simply saying it would not surprise me at all if maybe 3 months from now Nvidia decides to turn this off in a driver update... you know, kinda like they did in the past? Or are we just choosing to ignore that? :rolleyes:
I am just tired of the cartoon-like paranoia. People circumvented it anyway, and that is why, IMO, they just opened it up and left it for all to use. You know they did hold it back a while, and now the two reasons for holding back are gone. The other reason was a lack of high-end offerings to combat ATi in the time gap before release, thanks to fab issues. They also desperately need to move G92 derivatives, and the best way to do so is SLI support and/or PhysX-friendly cards. You don't remove support unless you have a damn good reason, because it is a business.
Posted on Reply
#75
newtekie1
Semi-Retired Folder
kenkickrThe minimum I would recommend would be a 9600GS/GSO or 8800GS. Anything below that is just uncivilized :D

PhysXinfo.com is a great site to follow when it comes to PhysX-supported games.
DaedalusHeliosYou definitely need a G92 8800GT or better for PhysX in games like Batman: AA and others. The Shellshocker right now is not good enough if that's what you might be looking at, as the 128-bit memory bus cripples that card. :toast:
A 9600GT with 64 shaders was more than enough to handle Batman: AA at high PhysX.

And the Shellshocker would work perfectly, as it is shader power that matters; the memory bus is relatively unimportant for PhysX performance, which is why cards like the GT 240/GT 220 make great PhysX cards.
crow1001WTF has PhysX got to offer apart from some crappy effects that have no effect on gameplay whatsoever? Very limited support in games, current and future, with the majority being complete balls. Oh yeah, expect a 50% drop in FPS in PhysX hardware-accelerated games. Havok FTW. :rockout:
PhysX and Havok, in a non-hardware-accelerated environment, offer pretty much the same effects for the same performance hit; everything runs on the CPU. So really, there is no reason to say "Havok FTW", as it offers nothing over PhysX, while PhysX offers the option of hardware acceleration to add more effects. Yes, more effects on the screen means more for the GPU to render, and if you have a single GPU doing PhysX as well, the performance hit can be rather noticeable. However, if you don't like it, just turn hardware-accelerated PhysX off, and the game won't be any different from a Havok implementation.
Posted on Reply