# NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX



## btarunr (May 28, 2010)

NVIDIA has reportedly removed the driver-level code that prevented users from having an NVIDIA GeForce GPU process PhysX while an ATI Radeon GPU, in the lead, processes graphics. Version 257.15 Beta of the GeForce drivers brought about this change. Commercial interests may have driven NVIDIA's previous decision to block the use of GeForce GPUs to process PhysX alongside ATI Radeon GPUs: users could buy an inexpensive GeForce GPU to go with a high-end DirectX 11 compliant Radeon GPU, thereby reducing NVIDIA's margins, though officially NVIDIA maintained that the restriction was in place to ensure quality assurance. The present move also seems to have commercial interests in mind, as NVIDIA could clear inventories of older GeForce GPUs by selling them to users of ATI Radeon GPUs. NVIDIA recently replenished its high-end lineup with the DirectX 11 compliant GeForce 400 series GPUs.

*Update (28/05):* A fresh report by Anandtech says that the ability to use a GeForce for PhysX in systems with graphics led by Radeon GPUs in the 257.15 beta driver is just a bug, not a feature. In other words, this ability is a one-off for this particular driver version, and future drivers may not include it.

*View at TechPowerUp Main Site*


----------



## lyndonguitar (May 28, 2010)

That's nice, too bad I don't have a dual PCIe mobo. 

next rig baby.


----------



## roast (May 28, 2010)

I'm actually shocked nVidia went ahead and did this. I always hoped they would, but never thought they would go through with it. That's cool.


----------



## air_ii (May 28, 2010)

Let's hope they don't take that back later (say, after you buy a 9800gt) .


----------



## TheMailMan78 (May 28, 2010)

Well I'm buying a Nvidia GPU now!


----------



## AlienIsGOD (May 28, 2010)

Ima wait a cpl days to see if this pans out... I got 3 PCI-E slots so I could def throw one of these in there...


----------



## H82LUZ73 (May 28, 2010)

Why do I get this feeling it's only for the GTX 470/480 line...


----------



## qubit (May 28, 2010)

Finally they see sense.  If it's to get rid of old cards, then does that suggest they'll reintroduce the restriction in a later driver, all in the name of "Quality Assurance"??


----------



## RejZoR (May 28, 2010)

Well, this restriction was only hurting them, so it makes sense...
They want to sell as many new cards as possible while also getting rid of old stock.
New users won't buy old crap and AMD users won't either. However, with this restriction lifted, many AMD users will probably buy a cheaper low-to-mid-end GeForce to run PhysX. This is also the point where PhysX starts to make sense for everyone. I just wonder if there is any way to get around the Vista limitation, so you could run an HD 5850 as primary and a mid-end GeForce for PhysX. Coz as far as I know this can only be done on Win7...


----------



## zAAm (May 28, 2010)

RejZoR said:


> Well, this restriction was only hurting them, so it makes sense...
> They want to sell as much new cards as possible while also get rid of old stock.
> New users won't buy old crap and AMD users won't either. However with this restrictions lifted, many AMD users will probably buy some cheaper low-mid end GeForce to run PhysX. This is also a point where PhysX is starting to make sense for everyone. I just wonder if there is any way to get around Vista limitation, so you could run HD5850 as primary and a mid end GeForce for PhysX. Coz as far as i know this can only be done on Win7...



The Vista limitation will be there forever as far as I know, since Vista uses WDDM 1.0, which ensures that only one display driver can be used at a time. So unless ATI cards start using nVidia drivers or vice versa, or Microsoft brings out a WDDM 1.1 patch for Vista, it will never work on Vista.


----------



## crow1001 (May 28, 2010)

LoL, Nvidia are hurting bad from ATI's current popularity and sales. Fermi sales are weak and profits marginal, so they now open up PhysX to ATI card owners in an attempt to capitalize on ATI's popularity. Be in no doubt Nvidia are in pain financially; they didn't do this out of the kindness of their hearts..


----------



## WSP (May 28, 2010)

I'm waiting until the WHQL driver release to see what happens.


----------



## Loosenut (May 28, 2010)

I'm out of PCIe slots, but I do have a PCIe x1 available. You guys think these new Nvidia drivers will work with my Ageia card?


----------



## theubersmurf (May 28, 2010)

Is there any sort of official statement on this? Or are they just going to leave us up in the air? I too get the feeling they may pull the rug out from under everyone later on...I'd like to hear them say something like "We've decided to make it available to you...blah blah" Whatever, even if it's a very corporate response, just so long as I know this isn't a mistake in the current driver version.


----------



## Phxprovost (May 28, 2010)

Ohh good, so nvidia kicks sand in my face and tells me no back when I actually had an nvidia gpu laying around that I wanted to use in tandem with my ati gpu... but now that I got rid of it I'm suddenly allowed to use it with my ati gpu? Go f*ck yourself nvidia :shadedshu


----------



## douglatins (May 28, 2010)

WOW, great, now I need an NVIDIA card and a triple-slot mobo



Phxprovost said:


> ohh good so nvidia kicks sand in my face and tells me no back when i actually had an nvidia gpu laying around that i wanted to use in tandem with my ati gpu...but now that i got rid of it im suddenly aloud to use it with my ati gpu? go f*ck yourself nvidia :shadedshu



That's hardly their fault


----------



## RejZoR (May 28, 2010)

zAAm said:


> Vista limitation will be there forever as far as I know since it uses WDDM1.0. This ensures that only one display driver can be used at a time so unless ATI cards start using nVidia drivers or vice versa or Microsoft brings out a WDDM1.1 patch to Vista, it will never work on Vista.



I know, but SLI and Crossfire can be done, so it's not entirely locked. Unless it's limited to one model only, even if you have more of them. Unlike PhysX, where you'll have a different kind of adapter.


----------



## Phxprovost (May 28, 2010)

douglatins said:


> Thats hardly their fault



It's hardly their fault that one day, out of the blue, they decided to be dicks and block ATI cards at the driver level? Ohh right, I forgot that code just magically appears in drivers sometimes


----------



## BarbaricSoul (May 28, 2010)

Is it really worth it? I mean really, how many games actually use PhysX?

I had an 8800GT running with my 5870 at one point; wasn't worth the trouble to set it up though.


----------



## dadi_oh (May 28, 2010)

Well, everyone who really wanted PhysX was able to easily apply a patch to do it (I am running a GT 240 for PhysX with my 5870 Eyefinity). This just opens it to the masses who are too nervous to try a hack.

I think it is a smart move by Nvidia to allow them to start moving low-end cards into ATI machines, although it would have been smarter if they hadn't been jerks in the first place by adding the restriction (in the name of some lame-ass excuse about "quality"). How transparent was that....

Now watch the price of these low-power GT 220/GT 240/GT 250 cards take a bump as people jump on the bandwagon... No extra PCIe power connector required and more than enough shader power for dedicated PhysX cards....


----------



## dadi_oh (May 28, 2010)

BarbaricSoul said:


> is it really worth it? I mean really, how many games actually use PhysX
> 
> I had a 8800gt running with my 5870 at one point, wasn't worth the trouble to set it up though.



True enough. I did it because I am a HW junkie, and if it can be done it should be done 

About the only game I have that can use it is UT3, and that is only in 3 dedicated PhysX maps. I have Mirror's Edge but I just can't get enthusiastic about that game. I am thinking about getting Batman: Arkham Asylum just to try out the eye candy, but I'll wait until the price drops a bit.

edit: Maybe this move will encourage developers to start introducing more viable PhysX games... but that could be months away in their development cycle...


----------



## BarbaricSoul (May 28, 2010)

dadi_oh said:


> . I did it because I am a HW junkie and if it can be done it shoudl be done
> .



That's why I did it also, because I could.


----------



## Marineborn (May 28, 2010)

They only did this 'cause they know all their rebranded shit won't sell, and all it's good for now is the garbage-ass PhysX chip on it. They're desperate.


----------



## dir_d (May 28, 2010)

This is a great marketing stunt for Nvidia. They were feeling the pressure and losing customers; now they still get some card sales, even if it's not the 300+ cards, and they get PhysX out... If Nvidia keeps these drivers out, I can see the marketing next year: why buy 2 cards for physics when you can have just 1 card and get it all?


----------



## RejZoR (May 28, 2010)

Because it's easier to keep an existing Radeon and buy a cheap GeForce as an additional card, as opposed to buying a brand new super expensive GTX 470/GTX 480 card...


----------



## newtekie1 (May 28, 2010)

It isn't surprising, and btarunr pretty much hit the nail right on the head: this is done so they can clear out old, weaker GPUs as PhysX cards.  A quick trip over to eVGADIA...I mean eVGA...and they have huge adverts on the main page that say "GT 240 Makes a great Dedicated PhysX card!" and "Maximize your gaming experience with a PhysX card!"  They aren't even calling them graphics cards at this point, they are simply referring to them as PhysX cards...



lyndonguitar said:


> thats nice, too bad i don't have a dual pcie Mobo.
> 
> next rig baby.



You've got a PCI-E x1 slot... buy a dremel and a super cheap $50 9800GT and make yourself a PCI-E x1 PhysX card.  I recently chopped a PCI-E x16 card down to fit in a board with no x16 slot, and I was actually surprised at how easy it really was.  I was afraid at first and thought it would be hard, but really it wasn't.


----------



## RejZoR (May 28, 2010)

I think GeForce 8400GS 512MB should do the trick. I mean, was the actual PhysX PPU any faster than this thing?


----------



## dadi_oh (May 28, 2010)

RejZoR said:


> I think GeForce 8400GS 512MB should do the trick. I mean, was the actual PhysX PPU any faster than this thing?



Does the 8400GS have enough shaders though? I thought Physx needed 32 shaders minimum and I think the 8400GS only has 16?


----------



## ShRoOmAlIsTiC (May 28, 2010)

32 minimum; the 8400GS is 16. A 9600 or better should be used. The Ageia PPU was actually a little better than a 9600GT; there was a review somewhere a while back.


----------



## kenkickr (May 28, 2010)

BarbaricSoul said:


> is it really worth it? I mean really, how many games actually use PhysX
> 
> I had a 8800gt running with my 5870 at one point, wasn't worth the trouble to set it up though.



There aren't a whole lot of games that use PhysX. Compared to Ghost Recon Advanced Warfighter (one of the first to support PPU PhysX, though it really didn't make much of a difference except lower avg framerates), there are games now that will blow your mind with GPU PhysX enabled, like Batman: Arkham Asylum and Metro 2033.  Mirror's Edge is cool, but I would definitely recommend BAA and Metro not just for the PhysX but for the tremendous gameplay and story.

I didn't think the 8400GS supported PhysX.


----------



## Mussels (May 28, 2010)

Cool.... now I can slap in my 8600GT and... uhh... well, actually I have no games that use PhysX, lol. Might acquire Metro 2033 just for this.


----------



## TRIPTEX_CAN (May 28, 2010)

I like the idea of getting higher-quality physics, but I'm not convinced on having to add another card to play the 6 games that actually use Nvidia PhysX. It's a desperation move from Nvidia on 2 fronts: it encourages more developers to use their PhysX, and it helps to move EOL GPUs off the shelves. I would not be surprised to see Nvidia drop support for this after it no longer fits their agenda. 

Still, with that said, the PhysX in Mafia II looks amazing. 

http://www.youtube.com/watch?v=0v1NgyzQ9tM http://www.youtube.com/watch?v=vcpEc6kC0HM

What does a decent PhysX GPU run for now? If I could pick up a card for less than $50 I'd consider it.


----------



## Mussels (May 28, 2010)

TRIPTEX_MTL said:


> I like the idea of getting higher quality physics but I'm not convinced on having to add another card to play the 6 games that actually use Nvidia physx. It's a desperation move from Nvidia on 2 fronts. It encourages more developers to use their physx and helps to move EOL GPUs off the shelves. I would not be surprised to see Nvidia drop support for this after it no longer fits their agenda.
> 
> Still with that said the Physx in Maffia 2 looks amazing.
> 
> ...



It's more than 6 now, with Metro 2033 and Mafia II (going by what you said, I've not heard of the game).

edit: from a link a few posts down, it's 14 games now.


----------



## kenkickr (May 28, 2010)

TRIPTEX_MTL said:


> What's does a decent physx GPU run for now? If I could pick up a card for less that $50 I'd consider it.



The minimum I would recommend would be a 9600GS/GSO or 8800GS.  Anything below is just uncivilized

Physxinfo.com is a great site to follow when it comes to physx supported games.


----------



## lyndonguitar (May 28, 2010)

newtekie1 said:


> It isn't surprising, and btarunner pretty much hit the nail right on the head, this is done so they can clear out old weaker GPUs as PhysX cards.  A quick trip over to eVGADIA...I mean eVGA...and they have huge adverts on the main page that say "GT 240 Makes a great Dedicated PhysX card!" and "Maximize your gaming experience with a PhysX card!"  They aren't even calling them graphics cards at this point, they are simply referring to them as PhysX cards...
> 
> 
> 
> You've got a PCI-E x1 slot...buy a dremel and a super cheap $50 9800GT and make yourselft a PCI-E x1 PhysX card.  I recently chopped PCI-E x16 card down to fit in a board with no x16 slot, and I was actually surprised at how easy it really was.  I was affraid at first, and though it would be hard, but really it wasn't.




Really? That's so cool. Will it work on an Asus P5Q? And what about losses? What's the advantage of x16 vs x1 when it comes to just PhysX?


----------



## Mussels (May 28, 2010)

lyndonguitar said:


> really? Thats so cool. will it work on an Asus P5Q? and what about loses? Whats the advantage of x16 vs x1 when it comes to just Physx.



zero difference for physX


----------



## bpgt64 (May 28, 2010)

Good strategy to move G92-based chips, IMO... I wonder if this is the beginning of a shift in thinking between ATI and Nvidia, the realization that Intel is more of a problem than one another.


----------



## TRIPTEX_CAN (May 28, 2010)

kenkickr said:


> The minimum I would recommend would be a 9600GS/GSO or 8800GS.  Anything below is just uncivilized
> 
> Physxinfo.com is a great site to follow when it comes to physx supported games.



lol, I haven't heard of 90% of the titles on that list, but there are a few winners.


----------



## Mussels (May 28, 2010)

http://www.physxinfo.com/data/vreview.html

direct link to the list of hardware accelerated titles.


----------



## kenkickr (May 28, 2010)

I just hope Aliens: Colonial Marines doesn't meet the same fate that Aliens vs Predator had.


----------



## VulkanBros (May 28, 2010)

Mussels said:


> cool.... now i can slap in my 8600GT and... uhh.. well actually i have no games that use PhysX, lol. might aquire metro 2033 just for this.



Can highly recommend Metro 2033... great visuals and a great game.

Take a look at this nVidia PhysX games list... it's soooooo looong...
http://www.nzone.com/object/nzone_physxgames_home.html


----------



## ShRoOmAlIsTiC (May 28, 2010)

This is pretty awesome news for me, actually. I just ordered an XFX GT 240 last night to replace my GTX 260 PhysX card; the GTX was just overkill and a power hog in my rig. Now I won't have to hack shit, and from what I read the 240 is a great PhysX card.


----------



## TRIPTEX_CAN (May 28, 2010)

Just talked to a coworker who has an extra 8800GT kicking around for me to test with. That should be sufficient, right?


----------



## Frick (May 28, 2010)

Phxprovost said:


> ohh good so nvidia kicks sand in my face and tells me no back when i actually had an nvidia gpu laying around that i wanted to use in tandem with my ati gpu...but now that i got rid of it im suddenly aloud to use it with my ati gpu? go f*ck yourself nvidia :shadedshu



Meh, better late than never. Stop whining.


----------



## kid41212003 (May 28, 2010)

You probably need to install their newest beta driver (257.15) too. It's the first driver that actually allows PhysX on a single GPU in a setup where the two GPUs aren't the same.


----------



## kenkickr (May 28, 2010)

TRIPTEX_MTL said:


> Just talked to a coworker who has an extra 8800GT kicking around for me to test with. That should be sufficient right?



Definitely.


----------



## johnnyfiive (May 28, 2010)

Awesome, I have a gtx 465 coming soon. Maybe now I can play Metro 2033 without wanting to scratch my eyeballs.


----------



## filip007 (May 28, 2010)

PhysX is slowly opening up for all. nVidia could do that, and it will if their market share continues to decline. Some games like Mirror's Edge can run PhysX in software, but that will slow down any Radeon. I must say: nVidia, good luck.

AMD recently got a man from the former Ageia, so the future is not that bad after all.


----------



## phanbuey (May 28, 2010)

wow... they finally pulled their collective heads out of their a**es.


----------



## kenkickr (May 28, 2010)

johnnyfiive said:


> Awesome, I have a gtx 465 coming soon. Maybe now I can play Metro 2033 without wanting to scratch my eyeballs.



That 465 will be more than enough to remedy some of the massive framerate drops in Metro.  At first I couldn't stand it with my 5870, but even when I put in an 8800GS it improved greatly.


----------



## Fourstaff (May 28, 2010)

Start of a series of events that will put Nvidia into my good books, or a calculated decision for publicity?


----------



## GotNoRice (May 28, 2010)

zAAm said:


> Vista limitation will be there forever as far as I know since it uses WDDM1.0. This ensures that only one display driver can be used at a time so unless ATI cards start using nVidia drivers or vice versa or Microsoft brings out a WDDM1.1 patch to Vista, it will never work on Vista.



Windows Vista was updated to WDDM 1.1 at the same time it got DirectX 11, in the Platform Update:

http://en.wikipedia.org/wiki/Windows_Vista_Platform_Update#Platform_Update


----------



## cadaveca (May 28, 2010)

GotNoRice said:


> Windows Vista was updated to WDDM1.1 at the same time it got DirectX11, in the platform update:
> 
> http://en.wikipedia.org/wiki/Windows_Vista_Platform_Update#Platform_Update



Hmmm...



> For example, even though DXGI 1.1 update introduces support for hardware 2D acceleration featured by WDDM 1.1 video drivers, *only Direct2D and DirectWrite will employ it* and GDI/GDI+ will continue to rely on software rendering.[citation needed] *Also, even though Direct3D 11 runtime will be able to run on D3D9-class hardware and WDDM drivers using "feature levels" first introduced in Direct3D 10.1, Desktop Windows Manager has not been updated to use either Direct3D 10.1 or WARP software rasterizer*


----------



## dadi_oh (May 28, 2010)

TRIPTEX_MTL said:


> What's does a decent physx GPU run for now? If I could pick up a card for less that $50 I'd consider it.



The GT 240 is a good bet. Got mine in a Newegg Shell Shocker for less than $50. It has really low power consumption (40nm process) and no extra power connector needed. And with 96 shaders, I have yet to see it go above 50% GPU usage. If that is the case, then even a lowly GT 220 would do (48 shaders).


----------



## xtremesv (May 28, 2010)

LOL, sad that I just recently sold my GT 220 PPU, which I initially bought to play BAA. I would've kept it if only I could've also used it as a dedicated CUDA card to enable the special filters in Just Cause 2.


----------



## $immond$ (May 28, 2010)

I knew Nvidia was going to do this, I am not sure why this was a shocker. This is just another reason why I will still buy Nvidia products.


----------



## GotNoRice (May 28, 2010)

cadaveca said:


> Hmmm...



That quote about the Desktop Window Manager not using DX10.1 is just about Aero. In Vista, Aero uses DX9; in 7, Aero uses DX10.1 (or 9 via feature-level support). All it's saying is that Vista still uses DX9 for Aero and was never updated to use 10.1.

It's not talking about game support or anything like that.


----------



## EastCoasthandle (May 28, 2010)

qubit said:


> Finally they see sense.  If it's to get rid of old cards, then does that suggest they'll reintriduce the restriction in a later driver, all in the name of "Quality Assurance"??



You hit the nail on the head.  There is no telling when they will re-activate this policy.


----------



## dadi_oh (May 28, 2010)

GotNoRice said:


> Windows Vista was updated to WDDM1.1 at the same time it got DirectX11, in the platform update:
> 
> http://en.wikipedia.org/wiki/Windows_Vista_Platform_Update#Platform_Update



I couldn't get it to work on my son's machine with Vista Home Premium 32-bit. Trying to find the actual "Platform Update" among the myriad of updates at download.microsoft was impossible. The machine was fully updated via Windows Update, but it still didn't work, and I couldn't find the standalone update even though the Wikipedia article gives its release date.

Has anyone ACTUALLY got Vista to run with ATI and Nvidia drivers at the same time?


----------



## DaedalusHelios (May 28, 2010)

air_ii said:


> Let's hope they don't take that back later (say, after you buy a 9800gt) .



You definitely need a G92 8800GT or better for PhysX in games like Batman: AA and others. The Shellshocker right now is not good enough, if that is what you might be looking at, as 128-bit memory cripples that card. 




EastCoasthandle said:


> You hit the nail on the head.  There is no telling when they will re-activate this policy.



They won't, because the cat is now out of the bag. Let's not let pessimism and paranoia infect the thread. We have no reason to think they would be OK with committing reputation suicide by revoking it; it would mess up marketing campaigns and similar things in the retail segment. The only "what if" is whether they will enable its acceleration on ATi GPUs. It wouldn't work as well, and there is no denying that, but it could still work.


----------



## kenkickr (May 28, 2010)

EastCoasthandle said:


> You hit the nail on the head.  There is no telling when they will re-activate this policy.



But at least we know it's breakable if they do go back to "blocking" out ATI.


----------



## Mussels (May 28, 2010)

kenkickr said:


> But at least we know it's breakable if they do go back to "blocking" out ATI.



it was cracked/broken before Nv reversed it anyway.


----------



## Phxprovost (May 28, 2010)

DaedalusHelios said:


> They won't because the cat is now out of the bag. Lets not let pessimism and paranoia infect the thread. We have no reason to think they would be ok with commiting reputation suicide by revoking it. It would mess up marketing campaigns and things that are similar in the retail segment. The only "what if" is will they enable its acceleration on ATi GPUs. It would work as well and there is no denying that, but it could still work.



O rite, 'cause a company that gives the CEO a "working sample" held together by wood screws clearly cares what the public thinks


----------



## DannibusX (May 28, 2010)

I'll keep my eye on this; if nVidia is seriously going to allow their cards to be used as PPUs, I will buy a new card for PhysX.  One of the 200 series.  I'm using an 8800GT at the moment.


----------



## RejZoR (May 28, 2010)

Mussels said:


> cool.... now i can slap in my 8600GT and... uhh.. well actually i have no games that use PhysX, lol. might aquire metro 2033 just for this.



Well, if you have a Core i7, you don't even need HW PhysX. I tried it with Advanced PhysX enabled and it worked pretty well on the CPU itself; maybe a slight slowdown, but still pretty much playable. As opposed to Mirror's Edge, where it will lag like insane with Advanced PhysX with only one physics-affected object moving around. Heh.


----------



## Mussels (May 28, 2010)

Phxprovost said:


> O rite cause a company that gives the CEO a "working sample" held together by wood screws clearly cares what the public thinks



I want details on whatever it is you are discussing. You have aroused my curiosity.


----------



## cadaveca (May 28, 2010)

He's talking about the developer conference where JH first showed Fermi, which was a GTX 480 with the end sawn off.


----------



## Mussels (May 28, 2010)

cadaveca said:


> He's talking about the developer conference where JH firist showed Fermi, which was a GTX480 with the end sawn off.



i wanna seeeeee it


----------



## DaedalusHelios (May 28, 2010)

Phxprovost said:


> O rite cause a company that gives the CEO a "working sample" held together by wood screws clearly cares what the public thinks



Yeah man, they are like evil incarnate and eat babies. Seriously, real babies. They aren't even dead yet when they start eating them. A comic book villain runs Nvidia and he will never stop until the whole world hates him.

OR

Nvidia operates like any other business and profit is their only concern. AMD, Intel, and Nvidia all answer to stockholders. AMD is not the Messiah, and Nvidia is not run by Satan.


----------



## Phxprovost (May 28, 2010)

DaedalusHelios said:


> Yeah man they are like evil incarnate and eat babies. Seriously, real babies. They aren't even dead yet when they start eating them. A comic book villian runs Nvidia and he will never stop until the whole world hates him.
> 
> OR
> 
> Nvidia operates like any other business and profit is their only concern. AMD, Intel, and Nvidia all answer to stock holders. AMD is not the Messiah, and Nvidia is not run by Satan.



'Cause I ever made that claim, right? I'm simply saying it would not surprise me at all if maybe 3 months from now Nvidia decides to turn this off in a driver update... you know, kinda like they did in the past? Or are we just choosing to ignore that?


----------



## theubersmurf (May 28, 2010)

BarbaricSoul said:


> is it really worth it? I mean really, how many games actually use PhysX
> 
> I had a 8800gt running with my 5870 at one point, wasn't worth the trouble to set it up though.


The Unreal Engine is heavily licensed and uses PhysX; that alone may make it worthwhile.


----------



## crow1001 (May 28, 2010)

WTF has PhysX got to offer apart from some crappy effects that have no effect on gameplay whatsoever? Very limited support in games, current and future, with the majority being complete balls. Oh yeah, expect a 50% drop in FPS in PhysX hardware-accelerated games. Havok FTW.


----------



## MilkyWay (May 28, 2010)

Too late, because the number of games that use PhysX is limited and the popularity it had has dropped; they kinda dropped it when people were buying second cards for PhysX.

To me it's a waste of, say, £20-£40.

But it's a nice gesture if you have one lying around.


----------



## DaedalusHelios (May 28, 2010)

Phxprovost said:


> cause i ever made that claim right?  Im simply saying it would not surprise me at all if maybe 3 months from now Nvidia decides to off this in a driver update....you know kinda like they did in the past? Or are we just choosing to ignore that?



I am just tired of the cartoon-like paranoia.  People circumvented it, and that is why they just opened it up and left it for all to use, IMO. You know they did a while back, and now the two reasons they were holding back are gone. The other reason was a lack of high-end offerings to combat ATi in the time gap before release, thanks to fab issues. They also desperately need to move G92 derivatives, and the best way to do so is SLI support and/or PhysX-friendly cards. You don't remove support unless you have a damn good reason, because it is a business.


----------



## cadaveca (May 28, 2010)

Mussels said:


> i wanna seeeeee it



http://www.youtube.com/watch?v=mJOv3VlkEjQ


----------



## newtekie1 (May 28, 2010)

kenkickr said:


> The minimum I would recommend would be a 9600GS/GSO or 8800GS.  Anything below is just uncivilized
> 
> Physxinfo.com is a great site to follow when it comes to physx supported games.





DaedalusHelios said:


> You definately need G92 8800gt or better for physX in games like Batman: AA and others. The Shellshocker right now is not good enough if that is what you might be looking at as 128bit memory cripples that card.



A 9600GT with 64 shaders was more than enough to handle Batman: AA at high PhysX.

And the shellshocker would work perfectly, as it is the shader power that matters, the memory bus is relatively unimportant for PhysX performance, which is why cards like the GT240/GT220 make great PhysX cards.



crow1001 said:


> WTF has physx got to offer apart from some crappy effects that have no effect on gameplay whatsoever, very limited support in games " current and in the future " with the majority being complete balls, oh yeah expect 50% drop in FPS with phsyx hardware accelerated games. Havok FTW.



PhysX and Havok, in a non-hardware-accelerated environment, offer pretty much the same effects for the same performance hit; everything runs on the CPU.  So really, there is no reason to say "Havok FTW", as it offers nothing over PhysX, while PhysX offers the option of hardware acceleration to add more effects.  Yes, more effects on the screen means more for the GPU to render, and if you have a single GPU doing PhysX also, then the performance hit can be rather noticeable.  However, if you don't like it, just turn hardware-accelerated PhysX off, and the game won't be any different from a Havok implementation.


----------



## DaedalusHelios (May 28, 2010)

newtekie1 said:


> A 9600GT with 64 shaders was more than enough to handle Batman: AA at high PhysX.
> 
> And the shellshocker would work perfectly, as it is the shader power that matters, the memory bus is relatively unimportant for PhysX performance, which is why cards like the GT240/GT220 make great PhysX cards.



Define "enough". It is a subjective statement.

If what you mean by "enough" is that it will not limit framerate when paired with a 5870, then no. I have seen people benchmark Batman: AA, and they have reported that the vanilla 8800GT 512MB got fewer frames operating as dedicated PhysX than the 8800GTS 512MB did, and even more so compared to a GTX 260, when used with a 5870. That means frames are limited by the card when running at max resolutions and settings. With lesser systems it may not be the case, though; I am assuming high-end, since this is a gaming forum, and I didn't explain that fully.


----------



## cdawall (May 28, 2010)

Only shitty part is my CH3 doesn't have room: with two 4850 X2s installed, the x1 slots are covered by the coolers. Would be OK if I went all water though... hmmm, TEC-cooled water sounds fun.


----------



## ShRoOmAlIsTiC (May 28, 2010)

My 240 will be here Monday. I'll do a bench with my 5850 and GTX 260 in Batman first, then another with the 5850 and the 240, just to see how big of a difference there is in PhysX.

Check this out:
http://www.elitebastards.com/index....rticle&id=842&catid=14&Itemid=27&limitstart=3

A 9600GSO and a 9800GT both come out equal in PhysX performance.


----------



## DaedalusHelios (May 28, 2010)

ShRoOmAlIsTiC said:


> My 240 will be here Monday. I'll do a bench with my 5850 and GTX 260 in Batman first, then another with the 5850 and the 240, just to see how big a difference there is in PhysX.



Please do. I haven't seen results with a 5850 yet and the only thing I have is 5870s on the ATi side. So no way for me to test that personally.


----------



## MilkyWay (May 28, 2010)

Buying a GTX 260 for PhysX is a waste since I have a GTX 260 as my main card; the price of a GTX 260 is just not worth those few extra effects in those very few games. I also don't see a lot of developers supporting it in the future.

I can't believe you need a powerful card to run PhysX. Well, not powerful, maybe mid-range these days, but those cards aren't cheap, so it's definitely not worth it at all. Sure, you can choose to do that if you like, but it doesn't mean it's value for money.


----------



## OnBoard (May 28, 2010)

Really hope it's not just a simple "we forgot to block ATI cards in the beta," seeing people buying GeForce cards.  If the coming WHQLs also allow this, then it should be good.

Never got why they wouldn't allow it, as it means more sales that wouldn't have happened otherwise.


----------



## ShRoOmAlIsTiC (May 28, 2010)

MilkyWay said:


> Buying a GTX 260 for PhysX is a waste since I have a GTX 260 as my main card; the price of a GTX 260 is just not worth those few extra effects in those very few games. I also don't see a lot of developers supporting it in the future.
> 
> I can't believe you need a powerful card to run PhysX. Well, not powerful, maybe mid-range these days, but those cards aren't cheap, so it's definitely not worth it at all. Sure, you can choose to do that if you like, but it doesn't mean it's value for money.



I got mine in a trade and kept it for PhysX.  Not a good idea to buy a card that expensive as a PhysX card unless you can find one used on the forums for 100 bucks or so, seeing as that's how much some of the 240s and 9800 GTs go for.


----------



## EastCoasthandle (May 28, 2010)

DaedalusHelios said:


> They won't, because the cat is now out of the bag. Let's not let pessimism and paranoia infect the thread. We have no reason to think they would be OK with committing reputation suicide by revoking it. It would mess up marketing campaigns and similar things in the retail segment. The only "what if" is whether they will enable its acceleration on ATi GPUs. It would work just as well, and there is no denying that it could.


Oh they will, they will.  This is Nv we are talking about here. 



> Yes, this is a bug in the latest build of PhysX that was packaged with the driver. We'll be fixing this issue ASAP - the WHQL driver launching in early June won't have this issue. -NVIDIA


----------



## newtekie1 (May 28, 2010)

DaedalusHelios said:


> Define "enough". It is a subjective statement.
> 
> If what you mean by enough is it will not limit framerate when paired with a 5870 then no. I have seen people benchmark Batman: AA and they have reported that the 8800gt 512mb vanilla had less frames operating as dedicated physX than when using 8800gts 512mb and even moreso than a GTX 260 when being used with a 5870. That means frames are limited by the card when running at max resolutions and settings. Now with lesser systems it may not be the case though. I am assuming highend since this is a gaming forum and I didn't explain it fully.



I haven't tested with an HD 5800 card, but with the HD 4890 there was no difference in performance between the 9600GT, my GTX260, and my GTX285 doing PhysX.  However, the HD 4890 might have been holding back the performance.

I don't have an HD 5850, but I do have a GTX 470, so I'll test the 9600GT vs. the GTX285 this weekend to see if there is a difference in performance between the two.



EastCoasthandle said:


> Oh they will, they will.  This is Nv we are talking about here.



Damn, that really sucks.


----------



## erocker (May 28, 2010)

I'm currently running two 5850's with a GT 240 for PhysX just fine. Going to pick up Batman tonight.  What tests do you guys need done?


----------



## poldo (May 28, 2010)

sorry guys, false alarm.

http://www.anandtech.com/show/3744/...terogeneous-gpu-physx-its-a-bug-not-a-feature



> Yes, this is a bug in the latest build of PhysX that was packaged with the driver. We'll be fixing this issue ASAP - the WHQL driver launching in early June won't have this issue. -NVIDIA



quoted from Anandtech:



> NVIDIA tells us that they will also be "fixing" the 257.15 beta driver on their site, so new downloads of that driver will have the restriction in place.


----------



## qubit (May 28, 2010)

poldo said:


> sorry guys, false alarm.
> 
> http://www.anandtech.com/show/3744/...terogeneous-gpu-physx-its-a-bug-not-a-feature



Phew, that's a relief. 

Sounded too good to be true, didn't it? No wonder no official announcement.


----------



## $immond$ (May 28, 2010)

I really hope this pisses off those f**king annoying ATI purists. 

You all know who you are.


----------



## DannibusX (May 28, 2010)

$immond$ said:


> I really hope this pisses off those f**king annoying ATI purists.
> 
> You all know who you are.



Huh?


----------



## EastCoasthandle (May 28, 2010)

$immond$ said:


> I really hope this pisses off those f**king annoying ATI purists.
> 
> You all know who you are.



I'm not sure what you mean.  They still get an updated driver to use for mods.


----------



## erocker (May 28, 2010)

Disregard their comment. It's been dealt with.


----------



## Phxprovost (May 28, 2010)

poldo said:


> sorry guys, false alarm.
> 
> http://www.anandtech.com/show/3744/...terogeneous-gpu-physx-its-a-bug-not-a-feature



............. sounds about right


----------



## christian27 (May 28, 2010)

I knew it was too good coming from Nvidia


----------



## slyfox2151 (May 28, 2010)

It's a bug, not a feature. Goodbye, PhysX with ATI, again. That was short-lived.


----------



## Loosenut (May 28, 2010)

slyfox2151 said:


> It's a bug, not a feature. Goodbye, PhysX with ATI, again. That was short-lived.



"Bug"? They made it sound like a disease. :shadedshu 

I'm sure someone has a copy somewhere if I ever want to try it.


----------



## enaher (May 28, 2010)

poldo said:


> sorry guys, false alarm.
> 
> http://www.anandtech.com/show/3744/...terogeneous-gpu-physx-its-a-bug-not-a-feature



Damn, that sucks... well, too bad. I still think it's better for nvidia to allow an ATI card to use an nvidia card for PhysX...


----------



## wahdangun (May 28, 2010)

EastCoasthandle said:


> Oh they will, they will.  This is Nv we are talking about here.



Wow, what a dic* move, nvidia.

And people who were considering buying an NV card for PhysX would be pissed right now.

EDIT: and btw, btarunr should update the news.


----------



## crow1001 (May 28, 2010)

Ah well no big loss, in fact no loss at all.


----------



## poldo (May 28, 2010)

wahdangun said:


> EDIT: and btw, btarunr should update the news.



The guy's fast; it's already updated. I would've preferred it this way: "Nvidia, trolling by removing restriction..."


----------



## DaedalusHelios (May 28, 2010)

EastCoasthandle said:


> Oh they will, they will.  This is Nv we are talking about here.



I wish they would have allowed it. I guess we will continue to just modify a few files to make it work. It is against their own interest, but I guess they are OK with that. Is it the choice of the CEO, or of the board of directors and majority stockholders? If a company chooses to go against pursuing profit, it hurts the company and its stockholders, which should ideally get the people who make those decisions fired. That is, unless they aren't pursuing profit in that segment anymore and are just making PhysX a value-added feature of their own graphics platform by not allowing card mismatching. Still, it is against their best interest, and their company will see decreased profits.

Maybe I thought their management was smarter than they in fact are. I do give people too much credit sometimes.


----------



## v12dock (May 28, 2010)

Good way to get rid of their massive G92 stock.


----------



## TRIPTEX_CAN (May 28, 2010)

So, being a dooshy ATI purist and all, I guess I won't be buying a new Nvidia GPU for PhysX, but I will still try this out with a used card. Intentional or not, Nvidia still gave us newer drivers to play with, and whether or not they planned this, there will be an increase in G92 sales, either new GPUs or second-hand. Nvidia still opened the door for us.


----------



## crow1001 (May 28, 2010)

LoL. Why are people getting a stiffy for PhysX? All it is good for is trying; then you will soon realize what a waste of time, energy, and heat a PhysX card is, and subsequently remove it from your system.


----------



## erocker (May 28, 2010)

TRIPTEX_MTL said:


> So, being a dooshy ATI purist and all, I guess I won't be buying a new Nvidia GPU for PhysX, but I will still try this out with a used card. Intentional or not, Nvidia still gave us newer drivers to play with, and whether or not they planned this, there will be an increase in G92 sales, either new GPUs or second-hand. Nvidia still opened the door for us.



By accident apparently. Either way "bug" or not, there's a way to do it.


----------



## WSP (May 28, 2010)

Haha...
Even before the next WHQL driver release, the news has already been updated to say this is a bug.
Too bad, NV, I already downloaded those beta drivers with the 'bug'. :lol:

NV is really just making more people mad at them.


----------



## Kitkat (May 28, 2010)

THEY HAD TO, U KNOW IT. NO SECRET SHOWS. It's brand rule number 1, bitches, lol.

Also... that open physics initiative is as serious as the 3D one. They can't stop it with this. OPEN OPEN OPEN. There's still too much (little) money in PhysX and gaming altogether for it to stop, sorry.


----------



## newtekie1 (May 28, 2010)

erocker said:


> I'm currently running two 5850's with a GT 240 for PhysX just fine. Going to pick up Batman tonight.  What tests do you guys need done?



You don't happen to have a more powerful nVidia card also?  Basically, the argument is that a card like the GT 240 or 9600GT is going to be slower than a more powerful card, and there seems to be a sticking point that a 128-bit vs. a 256-bit memory bus makes a difference in PhysX.  So if you had a 9800GT to compare with the GT 240, it would be perfect, since your graphics setup is one of the most powerful around.


----------



## TRIPTEX_CAN (May 28, 2010)

So who has these drivers and can post them on megaupload or similar?

edit. Nevermind I found them here. http://news.softpedia.com/news/Download-NVIDIA-GeForce-ION-257-15-Beta-Drivers-142852.shtml

Should still be the original version.


----------



## zads (May 28, 2010)

From your friends at Nvidia: NVidia shows a youtube of the PhysX restriction removal process


----------



## TRIPTEX_CAN (May 28, 2010)

zads said:


> From your friends at Nvidia: NVidia shows a youtube of the PhysX restriction removal process



This is why we should have a "no thanks" button. :shadedshu


----------



## qubit (May 28, 2010)

zads said:


> From your friends at Nvidia: NVidia shows a youtube of the PhysX restriction removal process



Dork. :shadedshu

It's a rickroll, everybody.


----------



## bogie (May 28, 2010)

So what's the best/affordable single-slot nvidia PhysX solution to put with my ATi HD4870X2?


----------



## dadi_oh (May 28, 2010)

bogie said:


> So whats the best/affordable single slot nvidia physx solution to put with my ATi HD4870x2?



GT240


----------



## ShRoOmAlIsTiC (May 28, 2010)

TRIPTEX_MTL said:


> So who has these drivers and can post them on megaupload or similar?
> 
> edit. Nevermind I found them here. http://news.softpedia.com/news/Download-NVIDIA-GeForce-ION-257-15-Beta-Drivers-142852.shtml
> 
> Should still be the original version.



Those come from an external mirror which is linked to nvidia, so those are probably the new ones now.

Can someone host the original drivers for us?  I think mine might be the newer ones too.


----------



## DigitalUK (May 28, 2010)

Yeah, just tested them and they have been updated as well; no PhysX. Seems hard to find the ones from earlier now.


----------



## TheMailMan78 (May 28, 2010)

Ok now I am out of the market. Damn you Nvidia. Such a cock tease.


----------



## driver66 (May 28, 2010)

For all of the bitching about how PhysX sucks and is "useless", these threads sure do generate A LOT of interest.


----------



## TheMailMan78 (May 28, 2010)

driver66 said:


> For all of the bitching about how PhysX sucks and is "useless", these threads sure do generate A LOT of interest.



Well, it pretty much is. But it's still something to f@#k with, and this is an enthusiast site after all.

You do understand that even if a game uses PhysX, it doesn't necessarily use hardware PhysX. Very few games do. Maybe 15? So yeah, it's pointless in practicality. It's just fun to play with in benches and such.


----------



## cadaveca (May 28, 2010)

driver66 said:


> For all of the bitching about how PhysX sucks and is "useless", these threads sure do generate A LOT of interest.



Human nature...we always want what we cannot have.


----------



## TheMailMan78 (May 28, 2010)

Actually, I nailed the number of 15 games that use hardware PhysX perfectly! Damn, I'm good!

http://www.nzone.com/object/nzone_physxgames_home.html


----------



## driver66 (May 28, 2010)

I was just saying...

It's just the irony.

MAILMAN!!!!!!!!!!! You stole my birthday!!!!!!!!!!!!!!!!!!!!!


----------



## Baam (May 28, 2010)

Bah. When I saw this story I figured I would grab an Nvidia card from Newegg to give PhysX a try. Oh well, thanks for the update, just saved me a few bucks. I guess if you use an AMD card as your primary card, Nvidia doesn't want your business.


----------



## ShRoOmAlIsTiC (May 28, 2010)

Baam said:


> Bah. When I saw this story I figured I would grab an Nvidia card from Newegg to give PhysX a try. Oh well, thanks for the update, just saved me a few bucks. I guess if you use an AMD card as your primary card, Nvidia doesn't want your business.



Which is kinda weird if you ask me; they make motherboards for AMD, why not make PhysX cards for AMD as well?


----------



## dadi_oh (May 28, 2010)

Baam said:


> Bah. When I saw this story I figured I would grab an Nvidia card from Newegg to give PhysX a try. Oh well, thanks for the update, just saved me a few bucks. I guess if you use an AMD card as your primary card, Nvidia doesn't want your business.



You can still do it. Very quick and easy.

See link.

Works great for me: GT 240 as a dedicated PhysX card with a 5870 as the main card.


----------



## EastCoasthandle (May 28, 2010)

I honestly don't think it was a mistake to leave out those locks in the latest update.  It appears they did this as a one-time thing, and I wouldn't be surprised to see it again in another non-WHQL release driver.

Think about it: if it really was a mistake, the file should have been replaced by now, and downloading it now should have the locks in place. Anyone want to test that theory?


----------



## dlpatague (May 29, 2010)

I use a GTS 250 on the x4 slot of my Rampage II Gene with 2 5850s in CF. It works great. If you're going to buy an Nvidia card for dedicated PhysX I suggest you buy a 9800GT or higher. 

If anyone wants more info about PhysX Hybrid setups I suggest you read this site: http://physxinfo.com/news/2789/hybrid-physx-mod-1-03-available/

I post regular comments on there under the name xDee xDee.

This is my rig: http://www.techpowerup.com/gallery/2634.html


----------



## erocker (May 29, 2010)

dlpatague said:


> I use a GTS 250 on the x4 slot of my Rampage II Gene with 2 5850s in CF. It works great. If you're going to buy an Nvidia card for dedicated PhysX I suggest you buy a 9800GT or higher.
> 
> If anyone wants more info about PhysX Hybrid setups I suggest you read this site: http://physxinfo.com/news/2789/hybrid-physx-mod-1-03-available/
> 
> ...



My GT 240 is working great in Batman with PhysX on high x8 AA. Plus, no external power connector to deal with. NGOHQ.com has all the things you need to get it running.


----------



## qubit (May 29, 2010)

driver66 said:


> For all of the bitching about how PhysX sucks and is "useless", these threads sure do generate A LOT of interest.



I for one never thought PhysX (or Havok, for that matter) sucks. It's just underused, reduced to useless eye candy rather than the fully destructible environments it's capable of. I believe a handful of games actually deliver this (Bad Company 2, is it?), and it seriously improves the game.

When I saw Havok first used in Half-Life 2, it was fantastic.


----------



## newtekie1 (May 29, 2010)

ShRoOmAlIsTiC said:


> which is kinda wierd if you ask me,  they make motherboards for amd,  why not make physx cards for amd as well.



ATi has always been a direct competitor to nVidia.

However, nVidia started making AMD chipsets long before AMD bought ATi.  It is only recently that AMD became a direct competitor by buying ATi, and it doesn't make sense for nVidia to just shut down their entire chipset division because of it.



qubit said:


> I for one never thought PhysX (or Havok, for that matter) sucks. It's just underused, reduced to useless eye candy rather than the fully destructible environments it's capable of. I believe a handful of games actually deliver this (Bad Company 2, is it?), and it seriously improves the game.
> 
> When I saw Havok first used in Half-Life 2, it was fantastic.



The hardware-accelerated parts of PhysX definitely are underused and reduced to useless eye candy.  

However, the software parts of PhysX, which run on the CPU like Havok, tend to be what makes the game playable and gives it anything movable that interacts with the player.

I would really like to see PhysX used to its full potential in games, with fully destructible environments, but sadly no developer will ever do that unless every gamer can use it.  This means we will never see it unless PhysX runs on ATi hardware, or at least runs on a cheap nVidia card with an ATi card as the main GPU.


----------



## ShRoOmAlIsTiC (May 29, 2010)

I get what you're saying, but it's still kind of the same thing.  They make chipsets for AMD to make money from their chipsets, so why not release the lock to make more money off ATI/AMD people who want PhysX?  PhysX isn't a big deciding point when buying the main rendering video card, so for the people who do decide on ATI, they could still make a buck by selling them another video card for PhysX.  It's a win-win situation for them, and they're not taking advantage of it.


----------



## erocker (May 29, 2010)

ShRoOmAlIsTiC said:


> I get what you're saying, but it's still kind of the same thing.  They make chipsets for AMD to make money from their chipsets, so why not release the lock to make more money off ATI/AMD people who want PhysX?  PhysX isn't a big deciding point when buying the main rendering video card, so for the people who do decide on ATI, they could still make a buck by selling them another video card for PhysX.  It's a win-win situation for them, and they're not taking advantage of it.



That would completely go against their marketing philosophy. PhysX and CUDA are the reasons they want you to buy their cards exclusively. They feel that blocking these features for users with a different graphics card makes those users have to buy their cards.


----------



## DailymotionGamer (May 29, 2010)

I guess I am the only person who thinks nvidia PhysX is stupid, right?


----------



## DannibusX (May 29, 2010)

u2konline said:


> I guess i am the only person that thinks nvidia physx is stupid right?



It's not stupid by any means; I think it's a cool technology.  The only problem is that nVidia keeps it locked from people who don't use their GPUs as the primary card.

Based on your system specs, you may not know the difference between hardware PhysX and the like.  The only title I can really comment on is Batman: AA, which makes excellent use of the technology, and it looks great too.

Check out this video for a comparison between PhysX and non-PhysX.

http://www.youtube.com/watch?v=6GyKCM-Bpuw


----------



## dlpatague (May 29, 2010)

PhysX itself isn't stupid. What's stupid is the way Nvidia is handling its usage. 

I really just have a very hard time understanding how they can justify disabling a feature that works 100% with non-Nvidia GPUs in the same system. Obviously, if people want PhysX they have to use an Nvidia GPU, so either way Nvidia would get the sale. It's not marketing; it's pigheadedness. To think someone paid their money for an Nvidia card and can't use it how they want, just because it is not their primary adapter, and Nvidia intentionally disables a working feature of the card: that's ridiculous.

Heck, I bet it would help them clear the shelves of their older cards, because people using ATI video cards would happily buy a 9800GT or even a newer card for PhysX, a sale that otherwise wouldn't have happened at all.


----------



## newtekie1 (May 29, 2010)

ShRoOmAlIsTiC said:


> I get what you're saying, but it's still kind of the same thing.  They make chipsets for AMD to make money from their chipsets, so why not release the lock to make more money off ATI/AMD people who want PhysX?  PhysX isn't a big deciding point when buying the main rendering video card, so for the people who do decide on ATI, they could still make a buck by selling them another video card for PhysX.  It's a win-win situation for them, and they're not taking advantage of it.



Personally, I agree with you.  However, I'm just stating why the PhysX and chipset comparison is flawed.

Stopping their chipset business would mean shutting down an entire division of the company, which would be stupid. And as I stated, AMD wasn't a competitor until very recently, when they acquired ATi.



dlpatague said:


> PhysX itself isn't stupid. *What's stupid is the way Nvidia is handling its usage.*
> 
> I really just have a very hard time understanding how they can justify disabling a feature that works 100% with non-Nvidia GPUs in the same system. Obviously, if people want PhysX they have to use an Nvidia GPU, so either way Nvidia would get the sale. It's not marketing; it's pigheadedness. To think someone paid their money for an Nvidia card and can't use it how they want, just because it is not their primary adapter, and Nvidia intentionally disables a working feature of the card: that's ridiculous.
> 
> Heck, I bet it would help them clear the shelves of their older cards, because people using ATI video cards would happily buy a 9800GT or even a newer card for PhysX, a sale that otherwise wouldn't have happened at all.



Yep, it is completely idiotic.  I think a lot of ATi users would pick up a cheap nVidia card to use PhysX (and CUDA in games like Just Cause 2).  Of course, the problem is that nVidia doesn't make as much money on the cheaper cards compared to the higher end, but making something is better than making nothing...


----------



## DannibusX (May 29, 2010)

newtekie1 said:


> Yep, it is completely idiotic.  I think a lot of ATi users would pick up a cheap nVidia card to use PhysX (and CUDA in games like Just Cause 2).  Of course, the problem is that nVidia doesn't make as much money on the cheaper cards compared to the higher end, but making something is better than making nothing...



I think they would get pure profit from opening it up, simply because ATI users are ATI users.  Someone buying an nVidia card for PhysX support only wasn't necessarily going to buy a high-end nVidia in the first place.

I think I said it before, they could totally run on the whole "Why pay more for PhysX?" campaign.


----------



## cadaveca (May 29, 2010)

nV just needs to start selling G92 cards without display connections and with a blank backplate, to be used specifically as PhysX cards. I fail to understand why they have not done this yet...


----------



## TRIPTEX_CAN (May 29, 2010)

DannibusX said:


> I think they would get pure profit from opening it up, simply because ATI users are ATI users.  *Someone buying an nVidia card for PhysX support only wasn't necessarily going to buy a high-end nVidia in the first place.* I think I said it before, they could totally run on the whole "Why pay more for PhysX?" campaign.



This is what I keep thinking. I would never buy an Nvidia GPU as a primary based on the minimal benefits of PhysX or CUDA, and I find it hard to believe many people actually would. 

Could you imagine ATI disabling Eyefinity on all systems with a secondary NV PhysX GPU? I couldn't either. 

Allowing everyone, regardless of primary GPU, to join the PhysX party would not only improve sales (which everyone knows they need) but also encourage more developers to bring more A-list PhysX titles to market. It's a win for everyone, including NV, despite how their flawed logic views the subject. 

Like I said before, not officially supporting this means NV will never receive my money for a new GPU dedicated to PhysX, but I'll still consider buying a used card.


----------



## xtremesv (May 29, 2010)

crow1001 said:


> LoL. Why are people getting a stiffy for PhysX? All it is good for is trying; then you will soon realize what a waste of time, energy, and heat a PhysX card is, and subsequently remove it from your system.



Happened here!!!

Unless 50% or more of the top PC games out there start relying on PhysX... and I doubt that is going to occur anyway. 

Bad move, Nvidia.


----------



## xtremesv (May 29, 2010)

newtekie1 said:


> (...) I think a lot of ATi users would pick up a cheap nVidia card to use PhysX (and CUDA in games like Just Cause 2).  Of course, the problem is that nVidia doesn't make as much money on the cheaper cards compared to the higher end, but making something is better than making nothing...



You cannot use an Nvidia card as a secondary dedicated CUDA card in JC2. The main renderer has to support CUDA in order to enable the special features.


----------



## newtekie1 (May 29, 2010)

DannibusX said:


> I think they would get pure profit from opening it up, simply because ATI users are ATI users.  Someone buying an nVidia card for PhysX support only wasn't necessarily going to buy a high-end nVidia in the first place.
> 
> I think I said it before, they could totally run on the whole "Why pay more for PhysX?" campaign.



Not pure profit exactly.  Giving the consumer the option to buy your competitor's high-end, high-profit product while still getting the benefits of your product through a low-end, low-profit card isn't always best.

Put it like this: you've got two high-end sports cars from two manufacturers.  Both are very similar in performance and price, say both are about $200K, and the profit is $150K per car.  Car A has cup holders, and you like cup holders but don't need them. Car B has sun visors, and you like sun visors but don't need them.  Then the manufacturer of Car B releases a very cheap $15K car that also has sun visors, and those visors fit perfectly in Car A, but the profit on this cheap car is only $2K.  So now, what are you, the consumer, going to do?  Buy Car A, give $150K in profit to that company, then buy the cheap car and give only $2K to the other company.  Do you see why the company making the cheap car would then change their sun visors so they won't work in Car A?  Yes, they are still making the $2K profit, but by giving the consumer an easy option to go with the competitor without losing any functionality, they are losing a potential $148K.  Yes, the consumer might have gone with the competitor's car anyway, but they might not have.
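The arithmetic behind the analogy, sketched out; every figure below is the post's own hypothetical number, not a real NVIDIA or AMD margin.

```python
# All figures are the hypothetical numbers from the car analogy above,
# not real margins for any company.
profit_high_end_car = 150_000  # assumed profit on either $200K sports car
profit_cheap_car = 2_000       # assumed profit on the $15K sun-visor car

# If the buyer takes competitor Car A plus Car B's cheap add-on car,
# Car B's maker books the small profit instead of the big one:
forgone_profit = profit_high_end_car - profit_cheap_car
print(forgone_profit)  # 148000
```

That 148,000 difference is the forgone profit the analogy is built around: the maker of the cheap car would rather remove the compatibility than hand the high-margin sale to the competitor.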



xtremesv said:


> You cannot use an Nvidia card as a secondary dedicated CUDA card in JC2. The main renderer has to support CUDA in order to enable the special features.



That's good, because I never said you could.

They can't use hardware-accelerated PhysX either, at least not officially, but my point was that they would buy a cheaper nVidia card if they could.


----------



## Mussels (May 29, 2010)

newtekie1 said:


> Not pure profit exactly.  Giving the consumer the option to buy your competitor's high-end, high-profit product while still getting the benefits of your product through a low-end, low-profit card isn't always best.
> 
> ...



Doesn't work. Profit from high-end GPUs is minimal; they make more money from bulk sales of low-end cards.

Let's put it another way: if ATI has a 30% market share, and PhysX were worth it and allowed in ATI systems, that could potentially be a lot of PCs running a secondary Nvidia card.

Putting in a more accurate example:

Nvidia sells a car which runs hot and chews fuel. It has a sunroof and cup holders.

ATI releases a car which is a tiny bit slower, but cheaper and far better on fuel, with no sunroof or cup holders.

You can optionally buy a kit Nvidia sells (say, a 9600GT) to add that sunroof and cup holder to your car... it fits. But Nvidia specifically forbids you to do so, even though they'd make money from it, because they'd rather you buy a new car than an optional product.


----------



## newtekie1 (May 29, 2010)

Mussels said:


> Doesn't work. Profit from high-end GPUs is minimal; they make more money from bulk sales of low-end cards.
> 
> Let's put it another way: if ATI has a 30% market share, and PhysX were worth it and allowed in ATI systems, that could potentially be a lot of PCs running a secondary Nvidia card.
> 
> ...



Overall profit on high-end cards is minimal, and in nVidia's case the profit per card is minimal too thanks to their gigantic die, but generally the profit per card is higher on high-end cards.  And if you want to be technical, desktop cards, low-end through high-end, only make up 1/3 of nVidia's graphics card profits but account for 2/3 of their sales volume.  Sales of their ultra-high-end Quadro cards make up 2/3 of their graphics card profits and 1/3 of the sales volume.  So... no, profit from high-end cards isn't lower...

However, the volume is relatively the same when talking about buying cards just for PhysX.


----------



## Mussels (May 29, 2010)

newtekie1 said:


> Overall profit on high end cards is minimal, and in nVidia's case the profit per card is minimal also thanks to their gigantic die, but generally the profit per card is higher on high end cards.
> 
> However, the volume is relatively the same when talking about buying cards just for PhysX.



I just see it as a dumb move, because I'd rather see my cards used as a feature booster than not at all.


----------



## newtekie1 (May 29, 2010)

Mussels said:


> i just see it as a dumb move, cause i'd rather see my cards used as a feature booster than not at all.



I agree entirely, as I've already pointed out. I'm just explaining what I believe their reasoning is, I'm not saying I agree with it.


----------



## Robert-The-Rambler (May 29, 2010)

*I will find a way regardless to use my just-purchased GT 240 as a PhysX PPU*

I have the room in my case and the slot on my motherboard, and a 4850 X2 just looking for some help in Batman: Arkham Asylum. I don't care if Nvidia wants to try to stop me. I will make it happen. For those of us who already have ATI graphics cards, their business strategy is to block the only friggin' reason for us to purchase any of their gear, because we don't need it for anything else. Nvidia is so dumb that they don't realize they are simply alienating a large crowd of gamers who already have ATI cards and frankly don't need to switch brands for normal graphics. WE WANT TO BUY YOUR PRODUCT. WHY ARE YOU TRYING TO STOP US? DID YOU EVER HEAR OF 3DFX? Yes, you probably did. Hell, you bought their SLI technology, and I hope you either wake up or suffer the same fate as 3DFX. No marketing strategy has ever included preventing sales. I guess Nvidia is looking to be inventive, but a wheel is round for a reason, and no possible sale should ever be turned away.


----------



## Benetanegia (May 29, 2010)

I don't know why so many people don't understand why Nvidia disables PhysX with a non-Nvidia card. It's just not profitable to ensure QA. Just because the hack (and in this case the unlocked beta drivers) works for the majority, that doesn't mean it works for everybody without a single problem (for instance, it won't work in Vista). Things that come from companies like Nvidia, ATI, Intel, etc. have to work 100%, or at least 99.9999999%, of the time. Plain and simple.

Someone somewhere will always be able to hack or mod something that works 99% of the time without spending excessive time and money on development, but they are free of responsibility if, for the 1% for whom it doesn't work as it should, it breaks their PC. Companies have to ensure it works in 100% of cases, and when it fails they have legal responsibility. It's that 1% that costs these companies (and this goes for any tech company, game developer, car vendor, whatever) a lot of money in QA, but they have to do it, because even something that seems as small as 1% is a very big number of people in real life, outside of enthusiast forums. A hack is used by very few people, which can literally translate to 99 people saying how well it works and only one person saying it broke his Windows installation. That person will be ignored, and people will think it works flawlessly, which in most cases is probably true, but not always. There's still the fact that it could NOT work in certain cases, because it has not been tested. If something untested were officially released and it didn't work for just 1% of people, that would still be more than a million failing cases, and that would make a lot of noise... class actions would be put in place, etc. I repeat: companies have to ENSURE it works flawlessly, and that costs a lot of money, not to mention requiring access to tech and IP that the company might not have (for example, for Nvidia, Southern Islands/Northern Islands). How are they supposed to ensure 100% interoperability when those cards are released? The average Joe will not understand if, for whatever reason, PhysX doesn't work on his shiny new card. Why should he wait two months for something he already had working before?

In a sense, that's what is good about PC gaming and modding. Someone can make something and you can try it at your own risk. When I say "you", I mean an enthusiast, because the average Joe will not download it, and that's the difference. The average Joe won't download such a hack, but he will download an official release, he will try it, and if it doesn't work he will blame the company and go as far as taking legal action, because the average Joe knows much more about class actions than he knows about tech. And that's all, really. No company is willing to spend so much money making something work when it won't even work on most systems out there (Vista). Try explaining to the average Joe why something official works on XP or 7 but doesn't work on Vista... try...


----------



## Robert-The-Rambler (May 29, 2010)

*C'mon now*



Benetanegia said:


> I don't know why so many people don't understand why Nvidia disables PhysX with a non-Nvidia card. It's just not profitable to ensure QA. Just because the hack (and in this case the unlocked beta drivers) works for the majority, that doesn't mean it works for everybody without a single problem (for instance, it won't work in Vista). Things that come from companies like Nvidia, ATI, Intel, etc. have to work 100%, or at least 99.9999999%, of the time. Plain and simple.
> 
> ...



Why is it any easier to make CUDA work with Nvidia cards than ATI? Really? When Physics/CUDA started, it was a separate entity from the GPU altogether, and there is no reason why a GPU can't be designed to run simply as a dedicated PPU. How hard can it be? Even if there are bugs, enthusiasts like us will find a way to make it work ourselves, more likely than some tech support yahoo at Nvidia, unless they really bunk up the whole process. I don't like it when a company is bullshitting me. In this case Nvidia is not telling the whole story, and they probably think that CUDA is enough of a selling point to convince idiots to buy their higher end gear when they already have strong enough CUDA-less gear. Flushing money down the toilet is not my style. It just ain't enough of a reason to change our whole graphics setup. Certain gamers like me only want to try out the Physics tech at a smaller premium than altering our already super expensive, super powerful gaming rigs. Nvidia is simply nuts and should have embraced the idea of a separate PPU instead of trying to integrate it solely into their own GPU configurations. I can't possibly think of any reason why somebody would purchase a higher end Nvidia card for CUDA when the ATI card they already have is fast enough. So why not sell CUDA for what it is: a separate entity from the GPU altogether? Why not have the option for both? I'm not buying the quality control aspect.


----------



## Mussels (May 29, 2010)

Robert-The-Rambler said:


> Why is it any easier to make CUDA work with Nvidia cards than ATI? Really? When Physics/CUDA started, it was a separate entity from the GPU altogether, and there is no reason why a GPU can't be designed to run simply as a dedicated PPU. How hard can it be?
> 
> ...



CUDA can't, and never will, run on ATI. It's a hardware part of the GPUs.

PhysX could be made to run on ATI Stream, but there is just no way in hell CUDA can run on ATI, nor Stream on Nvidia.


----------



## Robert-The-Rambler (May 29, 2010)

*Thats not what I'm asking for*



Mussels said:


> CUDA can't, and never will, run on ATI. It's a hardware part of the GPUs.
> 
> PhysX could be made to run on ATI Stream, but there is just no way in hell CUDA can run on ATI, nor Stream on Nvidia.



Give me a dedicated PPU like when it all started, so I can still use my ATI cards as I always have, and let the PPU do its own job. Make it work with your own video cards, or let specific video cards in your lineup, such as the GT 240, function as a PPU only, no matter what GPU you are using. At least try!!!


----------



## Mussels (May 29, 2010)

oh and just because i can...


if you two want to argue this stuff, at least get your facts right.

Please learn what these all are, and do:

CUDA
PhysX
Physics
Stream

once you've done that, come back, and make sure you don't screw things up. Nvidia doesn't do physics - they do PhysX. ATI doesn't do CUDA, and never will, and so on.


----------



## DannibusX (May 29, 2010)

Just to set the record straight, CUDA is part of nVidia's architecture, correct?  When they acquired PhysX they had it ported to CUDA so ATI couldn't use it, but it also made Ageia's PPUs obsolete.

CUDA is the reason you need an nVidia card, because that's the language PhysX speaks.

Is Stream ATI's version of, or answer to, CUDA?

Physics is math.

Just making sure it's all straight in my head.


----------



## Mussels (May 29, 2010)

DannibusX said:


> Just to set the record straight, CUDA is part of nVidia's architecture, correct?  When they acquired PhysX they had it ported to CUDA so ATI couldn't use it, but it also made Ageia's PPU's obsolete.
> 
> CUDA is the reason you need an nVidia card, because that's what language PhysX speaks.
> 
> ...



Stream and CUDA are both languages used to allow the GPU on a video card to perform non-3D tasks. ATI uses Stream, Nvidia uses CUDA.

Is it possible for a CUDA app to be ported to Stream, and vice versa? Yes, but no one's done it yet (probably due to legal reasons).


----------



## DannibusX (May 29, 2010)

Yar, I'm reading up on CUDA right now on Wikipedia, interesting stuff.

Ok, so no one will port CUDA to Stream, simply because nVidia would hammer them with lawsuits to protect their IP.  Using an nVidia card for CUDA isn't illegal, because that's the reason you bought it, and you own the product.

Maybe nVidia is trying to protect themselves from liability by not supporting the GTX-for-PhysX front, in case someone seriously messes up their computer.  They're not making it very hard to hack, though.


----------



## Mussels (May 29, 2010)

DannibusX said:


> Yar, I'm reading up on CUDA right now wikipedia, interesting stuff.
> 
> Ok, so no one will port CUDA to Stream, simply because nVidia would hammer them with lawsuits to protect their IP.  Using an nVidia card to use CUDA isn't illegal because that's the reason you bought it, and you own the product.
> 
> Maybe nVidia is trying to protect themselves from liability by not supporting the GTX for PhysX front, in case someone seriously messes up their computer.  They're not making it really hard for it to be hacked.




you *can't* port CUDA to Stream.

that's like saying I'm gonna port OSX to Windows, or run the GUI from an iPhone on my Samsung mobile phone. you can convert OSX apps to run in Windows (with a lot of work), but you can't make OSX itself run in Windows.


They're two different languages; maybe I should stick with that for examples.

You can translate a Japanese movie into English, and back and forth - but you can't make a movie in English and then expect Japanese speakers to understand it WITHOUT that translation.

You could write a translation layer that translates CUDA into Stream, allowing CUDA apps to run on Stream, but it wouldn't be a perfect lineup, just like spoken languages don't translate perfectly. Manual debugging would still need to be done, and performance would be far worse than doing it natively.


(ignore the whole VMware thing that someones going to throw up in response to this - emulation isnt the point at hand)
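The translation-layer point can be sketched in miniature. Everything below is invented for illustration (the token table, the dialect mappings); real source-level translation works on full parse trees rather than tokens, but the failure mode is the same one Mussels describes: some constructs have no mapping and need manual porting.

```python
# Toy illustration of a source-level translation layer between two
# GPU dialects. The keyword table is invented and deliberately
# incomplete: "__shfl" has no entry, so it survives untranslated
# and must be ported by hand.

CUDA_TO_STREAM = {
    "threadIdx": "get_local_id",   # hypothetical mapping
    "__global__": "__kernel",      # hypothetical mapping
}

def translate(source):
    """Translate token-by-token, collecting anything untranslatable."""
    out, unresolved = [], []
    for token in source.split():
        if token in CUDA_TO_STREAM:
            out.append(CUDA_TO_STREAM[token])
        elif token.startswith("__"):
            unresolved.append(token)   # dialect keyword with no mapping
            out.append(token)
        else:
            out.append(token)          # plain code passes through
    return " ".join(out), unresolved

translated, todo = translate("__global__ kernel uses threadIdx and __shfl")
print(translated)  # "__kernel kernel uses get_local_id and __shfl"
print(todo)        # ["__shfl"] -> needs manual porting
```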


----------



## DannibusX (May 29, 2010)

Mussels said:


> you *can't* port CUDA to Stream.
> 
> that's like saying I'm gonna port OSX to Windows, or run the GUI from an iPhone on my Samsung mobile phone. you can convert OSX apps to run in Windows (with a lot of work), but you can't make OSX itself run in Windows.
> 
> ...



I totally misread that post.  For some reason my eyes skipped right over the "app" after Cuda.

Oh, I totally get it.  Like I said, I misread, lol.


----------



## Mussels (May 29, 2010)

i think the language part at the bottom is easier to comprehend; i added that in after you quoted me.


----------



## Robert-The-Rambler (May 29, 2010)

*Whatever....*



Mussels said:


> oh and just because i can...
> 
> 
> if you two want to argue this stuff, at least get your facts right.
> ...



You did understand what I was saying, I hope. I know CUDA is what Nvidia uses to accelerate the physics calculations that have been called PhysX since Ageia was in the business, and ATI uses Stream to do stuff like assist with DVD interpolation in PowerDVD 9 Ultra. I've never seen Stream used in games thus far.

Physics is PhysX, right.

I think I'm done.


----------



## Mussels (May 29, 2010)

Robert-The-Rambler said:


> You did understand what I was saying, I hope. I know CUDA is what Nvidia uses to accelerate the physics calculations that have been called PhysX since Ageia was in the business, and ATI uses Stream to do stuff like assist with DVD interpolation in PowerDVD 9 Ultra. I've never seen Stream used in games thus far.
> 
> Physics is PhysX, right.
> 
> I think I'm done.



no, you're still either screwing things up in your head, or just when you type them.

PhysX is one physics engine, made by Ageia and now owned by Nvidia. the words "physics" and "PhysX" are NOT interchangeable.

ATI's Stream and Nvidia's CUDA are the same kind of thing for their different hardware: a way to use the power of a GPU to perform non-3D tasks. anything doable in CUDA is doable in Stream, so long as it's coded natively for that platform.


----------



## lyndonguitar (May 29, 2010)

is a 9500GT enough for a PhysX card when used in a PCI-E x1 slot? i'm using it with a 5850


----------



## DannibusX (May 29, 2010)

PhysX is really interesting, I've never seen anything like the difference in Batman AA.  Before it seemed like it was just a gimmick nVidia was trying to sling, but seeing it put into action so well, I like it a lot.  I still like my ATI card, but I'll continue to use the hack to keep my PhysX card running.  Of course, I'll have to say goodbye to it when I buy a second 5870 though.  I don't have enough PCI-E slots.

Robert, PhysX is nVidia's proprietary physics engine.  Physics is not PhysX.


----------



## Mussels (May 29, 2010)

DannibusX said:


> PhysX is really interesting, I've never seen anything like the difference in Batman AA.  Before it seemed like it was just a gimmick nVidia was trying to sling, but seeing it put into action so well, I like it a lot.  I still like my ATI card, but I'll continue to use the hack to keep my PhysX card running.  Of course, I'll have to say goodbye to it when I buy a second 5870 though.  I don't have enough PCI-E slots.
> 
> Robert, PhysX is nVidia's proprietary physics engine.  Physics is not PhysX.



Batman AA is a poor example. many of those 'hardware PhysX' features are available in the console versions... it's not so much that PhysX is used to make them work, it's more that they turned off non-PhysX things if a non-Nvidia card is detected.


----------



## DannibusX (May 29, 2010)

Lyndon, from what I've heard the 9500 is not a good card for PhysX.  I use an 8800GT, as it has 112 shaders and 512MB of RAM.  A lot of people recommend an 8800 as a minimum, but prefer a 9600 or a 9800.

Here's a thread with pretty in depth discussion on it:
http://forums.techpowerup.com/showthread.php?t=119217



			
Mussels said:


> Batman AA is a poor example. many of those 'hardware PhysX' features are available in the console versions... it's not so much that PhysX is used to make them work, it's more that they turned off non-PhysX things if a non-Nvidia card is detected.



Meh, I've never played it on the console.  I'm starting to become a PC gaming guy these days.  All I ever built my machines for in the past was WoW, and I overkilled it.  My Xbox gathers dust unless an exclusive title comes out that I really want to play.


----------



## Robert-The-Rambler (May 29, 2010)

*I still don't get what I'm missing*



DannibusX said:


> PhysX is really interesting, I've never seen anything like the difference in Batman AA.  Before it seemed like it was just a gimmick nVidia was trying to sling, but seeing it put into action so well, I like it a lot.  I still like my ATI card, but I'll continue to use the hack to keep my PhysX card running.  Of course, I'll have to say goodbye to it when I buy a second 5870 though.  I don't have enough PCI-E slots.
> 
> Robert, PhysX is nVidia's proprietary physics engine.  Physics is not PhysX.



I gotta get to sleep. Maybe I'm just punchy. I bought into this stupid PhysX thing right from the start with a BFG PPU, and I've sort of missed it since the PPU went poopoo. Something is lost in translation here. PhysX is a PHYSICS engine, a way to translate CUDA-accelerated, CPU-accelerated, or in our dreams Stream-accelerated explosions and whatnot to enhance the gameplay experience. BTW, Shadowgrounds: Survivor is an awesome CPU-PhysX-enhanced game that used to use the PPU and is real cheap on Steam. Knocking trees down is a lot of fun while shooting aliens.

But Mussels, I led you astray in my earlier post. I was merely trying to question why CUDA would be harder to run on an Nvidia GPU used as a PPU with an ATI card also present as the main GPU. Why isn't this being done? That is all I was trying to shout.

These posts have way too many acronyms.  Good night all.


----------



## Mussels (May 29, 2010)

yes, PhysX is a physics engine, but there is more than one physics engine out there. you've been swapping the words back and forth in your messages, confusing the issue.

even there you say something that makes no sense.



> PhysX is a PHYSICS engine, a way to translate CUDA-accelerated, CPU-accelerated, or in our dreams Stream-accelerated explosions and whatnot to enhance the gameplay experience.



PhysX is a software physics engine designed to run on the CPU primarily, or be accelerated by a PPU or Nvidia GPU. It has nothing to do with translating anything.


----------



## Robert-The-Rambler (May 29, 2010)

*Substitute Bring For Translate*



Mussels said:


> yes, PhysX is a physics engine, but there is more than one physics engine out there. you've been swapping the words back and forth in your messages, confusing the issue.
> 
> even there you say something that makes no sense.
> 
> ...



I think you were being a bit too literal. Anyway hope we all learned something......


----------



## wahdangun (May 29, 2010)

Mussels said:


> yes, PhysX is a physics engine, but there is more than one physics engine out there. you've been swapping the words back and forth in your messages, confusing the issue.
> 
> even there you say something that makes no sense.
> 
> ...



i think what robert meant was: why make it harder to use PhysX when an ATI card is present? robert even said he has a BFG PPU (Ageia), but he can't use it anymore because of this restriction.


and btw i think this is an illegal move, we bought the card because of this feature. i hope the EU can step in just like they did with Linux on the PS3


----------



## Mussels (May 29, 2010)

oh i agree that limiting them (especially the PPU's) is illegal.

but theres nothing we can do about that here.


----------



## Wile E (May 29, 2010)

newtekie1 said:


> It isn't surprising, and btarunr pretty much hit the nail right on the head: this is done so they can clear out old weaker GPUs as PhysX cards.  A quick trip over to eVGADIA... I mean eVGA... and they have huge adverts on the main page that say "GT 240 Makes a great Dedicated PhysX card!" and "Maximize your gaming experience with a PhysX card!"  They aren't even calling them graphics cards at this point, they are simply referring to them as PhysX cards...
> 
> 
> 
> You've got a PCI-E x1 slot... buy a Dremel and a super cheap $50 9800GT and make yourself a PCI-E x1 PhysX card.  I recently chopped a PCI-E x16 card down to fit in a board with no x16 slot, and I was actually surprised at how easy it really was.  I was afraid at first and thought it would be hard, but really it wasn't.


I'd rather just file the back of the PCIe slot to be open.



newtekie1 said:


> ATi has always been a direct competitor to nVidia.
> 
> However, nVidia started making AMD chipsets before AMD bought ATi, long before.  It has only been recently that AMD became a direct competitor by buying ATi, and it doesn't make sense for nVidia to just shut down their entire chipset devision because of it.
> 
> ...


Or they could port PhysX over to OpenCL or DirectCompute.



cadaveca said:


> nV just needs to start selling G92 without display connections, and a blank backplate, to be specifically used as a Phys-X card. I fail to understand why they have not done this yet...



I've been thinking that, too. Then, to eliminate all compatibility issues with gfx drivers, make it listed as a co-processor in the OS.


----------



## Wile E (May 29, 2010)

Oh, and can somebody PM me a link to the unblocked driver for Win 7 x64?


----------



## newfellow (May 29, 2010)

gonna just paste a link here check it out: 
Nice hybrid driver


----------



## zAAm (May 29, 2010)

RejZoR said:


> I know, but SLi and Crossfire can be done. So it's not entirely locked. Unless it's limited to one model only even if you have more of them. Unlike PhysX where you'll have a different kind of adapter.



No, it's locked. Like I said, only one *display driver*. SLI uses two or more *NVIDIA* cards and Crossfire uses two or more *ATI* cards - which means all of them use the same driver. Which is why it'll work on Vista. Once you start mixing brands with Vista = no go.


----------



## KainXS (May 29, 2010)

I can only hope Nvidia gets some brains and starts trying to sell their GT21x cards with no backplate or connections instead of the G92s, because they don't use much power at all and have better compute capabilities. They have the chance to make a lot of money and try to redeem PhysX, but for some reason I think they will lock everything up by the next release.


----------



## Helper (May 29, 2010)

wahdangun said:


> robert even said he have BFG PPU(AEGIA), but he can't use it anymore because of this restriction.



Actually, you can still use an Ageia PPU along with any other GPU, but you have to get the old PhysX pack from 2008, the one from before they put up the restriction and stopped supporting it. It'll have its own blue control panel and settings to test the PPU etc., better than Nvidia's integration.  Kinda cute, but doesn't really justify wasting a PCI slot IMO, LOL.

Anyway, it was obvious they didn't "remove" the restriction. They're Nvidia and that won't happen. The block was at the level of Nvidia's control panel, in the SLI & PhysX settings: when used with a non-Nvidia GPU, the PhysX option disappeared. Now that they changed that page to a new UI in 257.15, the block went away in the first-rev beta drivers. They forgot to put it back, LOL. Now I wonder if I can do SLI on Server 2003, without changing it to XP x64... maybe tri-way SLI, or how about quad SLI on an XP-based OS? Is there ANY reason why I can't, other than their policy? Man, Nvidia is stupid...


----------



## shevanel (May 29, 2010)

this is good and this is also hilarious.


----------



## newtekie1 (May 29, 2010)

Mussels said:


> cuda cant, and never will run on ATI. its a hardware part of the GPU's.
> 
> PhysX could be made to run on ATI stream, but there is just no way in hell CUDA can run on ATI, nor stream could run on Nv.



There isn't really a reason that CUDA can't run on ATI hardware.  CUDA is not a hardware part of the GPUs; CUDA is all software, built into the driver, that uses the standard unified shaders to do work.  In theory any GPU with unified shaders should be able to use CUDA.

CUDA came out well after several of the supported GPUs, and support was added via drivers and nothing more; there is nothing in the physical GPU that enables CUDA.

Now, at this point, it would probably be a better idea to port PhysX to Stream, as ATI definitely isn't going to add CUDA support to their drivers.



Wile E said:


> I'd rather just file the back of the PCIe slot to be open.



The problem with doing it like that is there might be components on the board behind the PCI-E x1 slots that would still interfere with the PCI-E connector on the card.  In the case of his P5Q, there are components that would prevent the card from going into either PCI-E x1 slot if you just filed the back of the slot open.



Wile E said:


> Or they port Physx over to OpenCL or DirectCompute.



That is probably an even better idea than porting it to Stream!  You hear that, nVidia?  Get on it.



Wile E said:


> I've been thinking that, too. Then, to eliminate all compatibility issues with gfx drivers, make it listed as a co-processor in the OS.



I've said they should do this since Vista first showed the problem with only allowing one type of graphics driver active at a time.

Though I would prefer that they still put display outputs on the card and make it a graphics card, and instead just put out special drivers that make it a co-processor in the OS.  So if you want to use it as a graphics card, you use the standard drivers; if you want to use it as a PPU only, you use the special drivers.


----------



## Mussels (May 29, 2010)

newtekie1 said:


> There isn't really a reason that CUDA can't run on ATi hardware.  CUDA is not a hardware part of the GPUs, CUDA is all software, built into the driver that uses standard unified Shaders to do work.  In theory any GPU with unified shaders should be able to use CUDA.
> 
> CUDA came out well after several of the supported GPUs, and support was added via drivers and nothing more, there is nothing in the physical GPU that enables CUDA.




CUDA is a language for translating program calls to work on Nvidia GPUs. ATI GPUs are not Nvidia GPUs. CUDA would never work there.

let's use a simpler example:

HD DVD and Blu-ray.

They both store on the same sized medium (the GPUs) and they both do the same thing (hold HD content). No matter what the hell you do, they're not compatible. you can convert the movie (the program) to work on the alternative format by re-burning it on the other disc type (recoding the program to work on OpenCL instead of CUDA, for example)... but no matter what you do, the language is keyed to that hardware.

people just don't seem to get that while you can code a CUDA app to work on another direct-compute-style system, YOU CAN'T RUN CUDA ITSELF ON ANYTHING BUT NVIDIA HARDWARE.


----------



## wahdangun (May 29, 2010)

Mussels said:


> CUDA is a language for translating program calls to work on Nvidia GPUs. ATI GPUs are not Nvidia GPUs. CUDA would never work there.
> 
> let's use a simpler example:
> 
> ...



but i kind of agree with newtekie, it's just like an ordinary CPU. just think about it: Windows can run on either AMD or Intel, or take an extreme example like Linux, which can run on a PS3 (with Other OS support) or an Intel CPU even though they have really different architectures (PPC vs x86)


----------



## newtekie1 (May 29, 2010)

Mussels said:


> CUDA is a language for translating program calls to work on Nvidia GPUs. ATI GPUs are not Nvidia GPUs. CUDA would never work there.
> 
> let's use a simpler example:
> 
> ...



You can use all the examples you want; I get what you are trying to say, you are just wrong in saying that CUDA is part of the GPU.  That was my point.

It is not part of the GPU, it is entirely software based; there is nothing special required in the GPU for CUDA to run on it.  Your original statement was that it is a hardware part of the GPUs, and it is not.

As it is right now, CUDA cannot run on ATI hardware.  However, there is no reason it couldn't, it just needs to be programmed to do so.  Granted, it is never going to happen, because ATI and nVidia could never work together to do it, but there is no reason beyond that, and the huge amount of time and development it would take, that it couldn't.

I'll even use an example of my own, or rather a more accurate version of your example from earlier:

It is like running OSX on non-Apple hardware.  OSX is entirely software; Apple's hardware doesn't have a special part built in that allows OSX to run, there isn't anything special about Apple's hardware at all.

But when you try to run OSX on non-Apple hardware, 9 times out of 10 it bails out a few seconds into the boot sequence.  However, you do a little work, a little massaging, and pretty soon you have OSX running on non-Apple hardware.


----------



## WSP (May 29, 2010)

If so, then is CUDA an application written exclusively for NVIDIA hardware?


----------



## human_error (May 29, 2010)

wahdangun said:


> But I kind of agree with newtekie; it's just like an ordinary CPU. Think about it: Windows can run on either AMD- or Intel-branded CPUs, or, to take an extreme example, Linux can run on a PS3 (with Other OS support) or on an Intel CPU, even though the architectures are really different (PPC vs. x86).



GPUs don't use the same architecture/instruction sets the way CPUs do. ATi and nvidia GPUs have different instruction sets and very different architectures which, although similar from a high-level point of view, are extremely different at the level where CUDA/Stream operate.

Your example of running Linux on the Cell BE processor in a PS3 and running it on an x86 processor is a bad one: if you were to compare the kernel of the operating system on a PS3 with that of Linux on x86, you would see it is completely different, to accommodate the different architecture. Your reply would no doubt be "yes, but at the level that matters, Linux runs on both systems". My response to that is: yes, but in the GPU world the kernel is equivalent to CUDA and Stream, and the Linux OS which sits on top of the kernel would be the application implementing CUDA and Stream calls.

If you wanted to code something which runs on both ATi and nvidia architectures, you would use OpenCL, as ATi and Nvidia have built their own driver-level translators to convert OpenCL calls to match their architectures; this is not a small task, and it removes the need to translate between CUDA and Stream. Saying nvidia could port CUDA to run on ATi hardware would be the same as saying Intel could port their GPU drivers to run on ATi hardware; the purpose of code at that level is to provide an interface to the hardware. (CUDA is a hardware API, as it gives access to hardware calls specific to certain hardware architectures, as opposed to a software API such as DirectX, which is intercepted by drivers that then make the hardware calls appropriate to the installed hardware.)
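
The software-API vs. hardware-API distinction above can be sketched in a few lines. This is a toy Python illustration, not real driver code; the backend functions, names, and dispatch table are all invented for the example:

```python
# Toy sketch of a vendor-neutral "software API" (in the spirit of
# OpenCL/DirectX): the application calls one interface, and a
# per-vendor "driver" layer translates the call for its hardware.
# All names here are invented for illustration.

def nvidia_backend(a, b):
    # stands in for a CUDA/PTX code path
    return [x + y for x, y in zip(a, b)]

def ati_backend(a, b):
    # stands in for a Stream/IL code path
    return [x + y for x, y in zip(a, b)]

# The "driver" layer: maps the installed hardware to its translator.
DRIVERS = {"nvidia": nvidia_backend, "ati": ati_backend}

def vector_add(vendor, a, b):
    """Vendor-neutral entry point, like an OpenCL kernel launch."""
    return DRIVERS[vendor](a, b)

# The same application code runs on either vendor's "driver":
print(vector_add("nvidia", [1, 2], [3, 4]))  # [4, 6]
print(vector_add("ati", [1, 2], [3, 4]))     # [4, 6]
```

The point of the sketch: the translation work lives in each vendor's backend, which is why supporting a new architecture means writing a new translator, not changing the application.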


----------



## Mussels (May 29, 2010)

The HARDWARE can do it, but you're forgetting that we're talking about the language used BETWEEN the software and the hardware.

CUDA apps converted to run on Stream = entirely possible.

Running CUDA *itself* is what's impossible, and what I'm getting sick of seeing people say.


----------



## newtekie1 (May 29, 2010)

Mussels said:


> The HARDWARE can do it, but you're forgetting that we're talking about the language used BETWEEN the software and the hardware.
> 
> CUDA apps converted to run on Stream = entirely possible.
> 
> Running CUDA *itself* is what's impossible, and what I'm getting sick of seeing people say.



Running CUDA itself is not impossible, just not likely. CUDA, being entirely software, can be re-written and modified to run on pretty much any hardware, it is just up to nVidia what they are willing to do.


----------



## W1zzard (May 29, 2010)

newtekie1 said:


> Running CUDA itself is not impossible, just not likely. CUDA, being entirely software, can be re-written and modified to run on pretty much any hardware, it is just up to nVidia what they are willing to do.



NVIDIA's definition of CUDA is that it is the hardware architecture enabling compute on their GPUs, but you are correct in a way: it exposes a software interface for which an emulator could be written.
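
To illustrate that last point: a CUDA-style launch (a grid of blocks of threads) can be walked sequentially on an ordinary CPU, which is roughly what a software emulator would do. A minimal Python toy, with the kernel and launch shape invented for the example:

```python
# Toy CPU "emulator" for a CUDA-style kernel launch: iterate over the
# whole grid and run the kernel body once per (block, thread) pair.
# Purely illustrative; not NVIDIA's actual emulation mode.

def vec_add_kernel(block_idx, thread_idx, block_dim, a, b, out):
    i = block_idx * block_dim + thread_idx  # the usual CUDA index math
    if i < len(out):                        # bounds check, as in real kernels
        out[i] = a[i] + b[i]

def launch(kernel, grid_dim, block_dim, *args):
    """Emulate a <<<grid_dim, block_dim>>> launch by looping on the CPU."""
    for block in range(grid_dim):
        for thread in range(block_dim):
            kernel(block, thread, block_dim, *args)

a, b = [1, 2, 3, 4, 5], [10, 20, 30, 40, 50]
out = [0] * 5
launch(vec_add_kernel, 2, 3, a, b, out)  # 2 blocks of 3 threads
print(out)  # [11, 22, 33, 44, 55]
```

An emulator like this runs the same kernel logic without any NVIDIA hardware, which is the sense in which the interface, as opposed to the silicon, could in principle be retargeted.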


----------



## KainXS (May 29, 2010)

It would probably be so slow that you might be better off trying to emulate a PS3 on an Xbox 360, or vice versa.

I think it might work, but it would take so much time to recode that you would be better off making a new standard entirely.


----------



## dlpatague (May 30, 2010)

UPDATE: Official BLOG from Nvidia: http://blogs.nvidia.com/ntersect/2010/05/update-on-release-256-physx-support-1.html


----------



## OnBoard (May 30, 2010)

dlpatague said:


> UPDATE: Official BLOG from Nvidia: http://blogs.nvidia.com/ntersect/2010/05/update-on-release-256-physx-support-1.html



Now, if it were true that the main reason for not allowing an ATI/NVIDIA combo is the QA work, then leave it enabled just in the beta drivers. Everyone would be happy enough; who needs support?

But it just isn't the truth. The only thing needed in the WHQL drivers would be an option in the control panel for hybrid support, with the default off. Turn it on and you get a message: "NVIDIA doesn't support AMD/NVIDIA hybrid GPU configurations; you use this option at your own risk."

Everyone wins and is happy, but no. They purposely make it not work, just because they can, and they even say themselves that you need to hack the drivers to use your NVIDIA card.


----------



## DailymotionGamer (May 30, 2010)

DannibusX said:


> Based on your system specs, you may not know the difference between hardware PhysX and the ilk.


That has nothing to do with anything. 



DannibusX said:


> Check out this video for a comparison between PhysX and non-PhysX.


After watching the video, it's nothing to get hyped about; I never really noticed it in games. The only kind of PhysX I noticed was in TimeShift, which looked great. But overall, PhysX, nvidia... I won't cry if the stuff is disabled in games. Nothing important to me.


----------



## newtekie1 (May 30, 2010)

OnBoard said:


> Now, if it were true that the main reason for not allowing an ATI/NVIDIA combo is the QA work, then leave it enabled just in the beta drivers. Everyone would be happy enough; who needs support?
> 
> But it just isn't the truth. The only thing needed in the WHQL drivers would be an option in the control panel for hybrid support, with the default off. Turn it on and you get a message: "NVIDIA doesn't support AMD/NVIDIA hybrid GPU configurations; you use this option at your own risk."
> 
> Everyone wins and is happy, but no. They purposely make it not work, just because they can, and they even say themselves that you need to hack the drivers to use your NVIDIA card.



I like this idea: disable it in the WHQL drivers, and leave it enabled in the beta drivers.  Sounds like a perfect solution; since they don't support beta drivers anyway, they don't have to worry about supporting something that _might_ not work.


----------



## bebbee (May 30, 2010)

I hate these overpriced new GPUs.

A GTS 250 is still a fast, bang-for-the-buck card.

I am against all these new cards.


----------



## TRIPTEX_CAN (May 30, 2010)

dlpatague said:


> UPDATE: Official BLOG from Nvidia: http://blogs.nvidia.com/ntersect/2010/05/update-on-release-256-physx-support-1.html



I really respect nvidia for this decision. Official support isn't a realistic expectation for ATI users, but the enthusiast community isn't afraid of beta software.


----------



## Robert-The-Rambler (May 30, 2010)

*As long as we have a chance to at least try GPU PhysX*



TRIPTEX_MTL said:


> I really respect nvidia for this decision. Official support isn't a realistic expectation for ATI users, but the enthusiast community isn't afraid of beta software.



I'm happy with a use at your own risk policy. Hell, that rule applies every time I eat out.


----------



## Loosenut (May 30, 2010)

Robert-The-Rambler said:


> I'm happy with a use at your own risk policy. *Hell, that rule applies every time I eat out.*



+1 Robert, you ain't ramblin' now...


----------



## TheMailMan78 (May 30, 2010)

Benetanegia said:


> I don't know why so many people don't understand why Nvidia disables PhysX with a non-Nvidia card. It's just not profitable to ensure QA. Just because the hack (and in this case the unlocked beta drivers) works for the majority, that doesn't mean it works for everybody without a single problem (for instance, it won't work in Vista). Things that come from companies like Nvidia, Ati, Intel, etc. have to work 100%, or at least 99.9999999%, of the time. Plain and simple.
> 
> Someone somewhere will always be able to hack or mod something that will work 99% of the time without spending excessive time and money on development, but they are free of responsibility if, for that 1% for which it doesn't work as it should, it breaks someone's PC. Companies have to ensure by law that it works in 100% of cases, and when it fails they have legal responsibility. It's that 1% that costs these companies (and this goes for any tech company, game developer, car vendor, whatever) a lot of money in QA, but they have to do it, because even something that seems as small as 1% is a very big number of people in real life, outside of enthusiast forums. A hack is used by very few people, which can literally translate to 99 people saying how well it works and only one person saying it broke his Windows installation. That person will be ignored, and people think it works flawlessly, which in most cases is probably true, but not always. There's still the fact that it could NOT work in certain cases, because it has not been tested. If something untested were officially released and it didn't work for just 1% of people, that would still make more than a million failing cases, and that would make a lot of noise... class actions would be put in place, etc. I repeat, companies have to ENSURE it works flawlessly, and that costs a lot of money, not to mention requiring access to tech and IP that the company might not have, like, in Nvidia's case, Southern Islands/Northern Islands. How are they supposed to ensure 100% interoperability when those cards are released? Average Joe will not understand if, for whatever reason, PhysX doesn't work with his shiny new card. Why is he supposed to wait 2 months in order to have something he already had working before?
> 
> In a sense, that's what is good about PC gaming and modding. Someone can make something and you can try it at your own risk. When I say "you", I mean an enthusiast, because average Joe will not download it, and that's the difference. Average Joe won't download such a hack, but average Joe will download an official release, average Joe will try such an official release, and if it doesn't work, average Joe will blame the company and go as far as taking legal action, because average Joe knows much more about class actions than he knows about tech. And that's all, really. No company is willing to spend so much money making something work when it won't even work on most systems out there (Vista). Try explaining to average Joe why something official works on XP or 7 but doesn't work on Vista... try...



You are too smart to buy into that PR crap from Nvidia.

First of all, name one thing in your experience that you have NEVER had a problem with in the computer world. Even my case panel sometimes doesn't close as it should, but never once did I think "I'm going to file a class-action lawsuit against Coolermaster because one screw doesn't ALWAYS line up!" I mean, really, both Nvidia and ATI have driver problems with SOMETHING in EVERY release. If they didn't, why do they keep releasing updates? Because something new or old isn't 100% compatible!

A lone joker in the hacking world created a decent mod that works 99% of the time. What do you think Nvidia's R&D crew could do? Please, this QA crap is just PR so they don't seem like greedy bastards. Not that it's a bad thing wanting to make money off of what you paid for, but don't insult people's intelligence.


----------



## xBruce88x (May 30, 2010)

What about with my 9600GT in the lead and the HD 3200 chipset? Well, I'll try the drivers out and let you guys know!


----------



## Benetanegia (May 30, 2010)

TheMailMan78 said:


> You are too smart to buy into that PR crap from Nvidia.
> 
> First of all, name one thing in your experience that you have NEVER had a problem with in the computer world. Even my case panel sometimes doesn't close as it should, but never once did I think "I'm going to file a class-action lawsuit against Coolermaster because one screw doesn't ALWAYS line up!" I mean, really, both Nvidia and ATI have driver problems with SOMETHING in EVERY release. If they didn't, why do they keep releasing updates? Because something new or old isn't 100% compatible!
> 
> A lone joker in the hacking world created a decent mod that works 99% of the time. What do you think Nvidia's R&D crew could do? Please, this QA crap is just PR so they don't seem like greedy bastards. Not that it's a bad thing wanting to make money off of what you paid for, but don't insult people's intelligence.



I don't buy any PR crap; I've been saying this for a long, long time, even before they said anything. Because I know for a fact that things work that way. Not in the GPU or driver business, but I've been there, so I know what it's about. It doesn't matter if the QA is that important in the end, or if it works at all; they have to do it, because in many countries it's obligatory. If they spent time and money and it doesn't work, no worries, but oh boy, if it doesn't work and no QA was done... be prepared.

And they just don't want to spend the money on QA for something that is not really in their hands. A lot of that QA has to be done on AMD's end, and they will just not do it. Even when only Nvidia cards are used, every PhysX driver update needs the latest GPU driver as well, or everything gets fucked up soon; that's something I have suffered from. So a mix between Ati and Nvidia is always going to be worse.

Now, the idea of allowing it in the beta... that could work, but there's still the fact that it would not work on Vista systems, and that's a nightmare to explain to average Joe, and it wouldn't be very different from the hack anyway. The hack probably has more support than the beta regarding an Ati+Nvidia setup.


----------



## Mussels (May 30, 2010)

Benetanegia said:


> I don't buy any PR crap; I've been saying this for a long, long time, even before they said anything. Because I know for a fact that things work that way. Not in the GPU or driver business, but I've been there, so I know what it's about. It doesn't matter if the QA is that important in the end, or if it works at all; they have to do it, because in many countries it's obligatory. If they spent time and money and it doesn't work, no worries, but oh boy, if it doesn't work and no QA was done... be prepared.
> 
> And they just don't want to spend the money on QA for something that is not really in their hands. A lot of that QA has to be done on AMD's end, and they will just not do it. Even when only Nvidia cards are used, every PhysX driver update needs the latest GPU driver as well, or everything gets fucked up soon; that's something I have suffered from. So a mix between Ati and Nvidia is always going to be worse.
> 
> Now, the idea of allowing it in the beta... that could work, but there's still the fact that it would not work on Vista systems, and that's a nightmare to explain to average Joe, and it wouldn't be very different from the hack anyway. The hack probably has more support than the beta regarding an Ati+Nvidia setup.




The QA was already done... AGEIA PPUs worked with ATI, nvidia, SiS, Matrox, etc.


----------



## Benetanegia (May 30, 2010)

Mussels said:


> The QA was already done... AGEIA PPUs worked with ATI, nvidia, SiS, Matrox, etc.



That was a looooong time ago. Yes, GPU drivers from 2005 *worked* on my Radeon 9600, and games from that era too, but try running them today... They may work in many cases, but you are surely going to find a lot of problems. As a company, Nvidia just wants to steer clear of any problems of that nature. Plain and simple.

The reason Nvidia is disabling PhysX is the same reason Ati discontinued the X1000 series of cards even while they were still selling them. There's a point at which a company doesn't want to spend more money on something that only a few people are going to use.


----------



## Mussels (May 30, 2010)

Benetanegia said:


> That was a looooong time ago. Yes, GPU drivers from 2005 worked on my Radeon 9600, and games from that era too, but try running them today... They may work in many cases, but you are surely going to find a lot of problems. As a company, Nvidia just wants to steer clear of any problems of that nature. Plain and simple.



So come up with a toggle in the driver options to switch a video card from GPU to PPU/CUDA card, so that the video drivers turn off and only CUDA (and apps that use it) remain.

Set in a safeguard so that it can't be used if a monitor is connected to the card, and away you go, back to the AGEIA days.


The only reason nvidia is doing this is because they've done so much dodgy shit disabling features in the name of PhysX (such as with Batman: AA) that people might find out *gasp* that, in fact, they just disable it on ATI even if PhysX is working.


----------



## TheMailMan78 (May 30, 2010)

Benetanegia said:


> I don't buy any PR crap; I've been saying this for a long, long time, even before they said anything. Because I know for a fact that things work that way. Not in the GPU or driver business, but I've been there, so I know what it's about. It doesn't matter if the QA is that important in the end, or if it works at all; they have to do it, because in many countries it's obligatory. If they spent time and money and it doesn't work, no worries, but oh boy, if it doesn't work and no QA was done... be prepared.
> 
> And they just don't want to spend the money on QA for something that is not really in their hands. A lot of that QA has to be done on AMD's end, and they will just not do it. Even when only Nvidia cards are used, every PhysX driver update needs the latest GPU driver as well, or everything gets fucked up soon; that's something I have suffered from. So a mix between Ati and Nvidia is always going to be worse.
> 
> Now, the idea of allowing it in the beta... that could work, but there's still the fact that it would not work on Vista systems, and that's a nightmare to explain to average Joe, and it wouldn't be very different from the hack anyway. The hack probably has more support than the beta regarding an Ati+Nvidia setup.


Yeah, and one small disclaimer on the box would cover them in most countries, if they are that chicken shit. Like I said, if one hacker can make it work great, then why can't a billion-dollar company? It's PR BS, plain and simple. If you still don't think so, then you should sue every game developer in the world for not making games 100% compatible with EVERY combination of hardware.



Mussels said:


> The QA was already done... AGEIA PPUs worked with ATI, nvidia, SiS, Matrox, etc.


 Yup. Now tell me AGEIA was a better funded company than Nvidia.


----------



## Wile E (May 30, 2010)

Benetanegia said:


> That was a looooong time ago. Yes, GPU drivers from 2005 worked on my Radeon 9600, and games from that era too, but try running them today... They may work in many cases, but you are surely going to find a lot of problems. As a company, Nvidia just wants to steer clear of any problems of that nature. Plain and simple.



No, they just wanted to force people to use only nVidia hardware, plain and simple. All they had to do was allow Physx to run in coprocessor mode, like the original PPU, if they were worried about conflicting video drivers.

Hell, they don't even have to go that far. They can simply say mixed gpu solutions are not officially supported, and they wash their hands of the imagined support costs.

And none of that explains why they let it go for so long before deciding to cut it out.

It has nothing to do with support at all. It's nVidia not happy with the situation, and taking their ball and going home.



Mussels said:


> So come up with a toggle in the driver options to switch a video card from GPU to PPU/CUDA card, so that the video drivers turn off and only CUDA (and apps that use it) remain.
> 
> Set in a safeguard so that it can't be used if a monitor is connected to the card, and away you go, back to the AGEIA days.
> 
> ...



Stop using Batman as an example. It's a poor one, and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work in ATI properly, even when you force it.


----------



## erocker (May 30, 2010)

Wile E said:


> Stop using Batman as an example. It's a poor one, and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work in ATI properly, even when you force it.



It works rather well with a physics card, both the AA and the PhysX. So is the AA being run exclusively through my GT 240?


----------



## TheMailMan78 (May 30, 2010)

Wile E said:


> No, they just wanted to force people to use only nVidia hardware, plain and simple. All they had to do was allow Physx to run in coprocessor mode, like the original PPU, if they were worried about conflicting video drivers.
> 
> Hell, they don't even have to go that far. They can simply say mixed gpu solutions are not officially supported, and they wash their hands of the imagined support costs.
> 
> ...


The AA works fine when you force it. I even made a thread on it.


----------



## Wile E (May 30, 2010)

erocker said:


> It does work with a Physics card rather well. Both AA and the PhysX. So is AA being run exclusively through my GT 240?





TheMailMan78 said:


> The AA works fine when you force it. I even made a thread on it.




Is it enabled thru the in game settings?


----------



## Benetanegia (May 30, 2010)

Mussels said:


> So come up with a toggle in the driver options to switch a video card from GPU to PPU/CUDA card, so that the video drivers turn off and only CUDA (and apps that use it) remain.
> 
> Set in a safeguard so that it can't be used if a monitor is connected to the card, and away you go, back to the AGEIA days.



Like I said, GPU drivers and PhysX drivers are closely tied; they can't do that.



TheMailMan78 said:


> Yeah, and one small disclaimer on the box would cover them in most countries, if they are that chicken shit. Like I said, if one hacker can make it work great, then why can't a billion-dollar company? It's PR BS, plain and simple. If you still don't think so, then you should sue every game developer in the world for not making games 100% compatible with EVERY combination of hardware.



Like I said, because that hacker has never QA'd it. Thousands of people on the internet have done it for free. If the hacker had to pay everyone who helped him make the mod stable, he would need to be a multibillion-dollar company, and have a real interest in spending that much on it too. Besides, they do a lot more reverse engineering at NGOHQ than is "socially acceptable" in the business world, if you know what I mean. They have much more (real, applicable) access to Ati hardware than any company will ever have.



> Yup. Now tell me AGEIA was a better funded company than Nvidia.



Yup, and Ageia went bankrupt... 

Nvidia struggles to stay in the green. Hell, even Ati struggles, and the reason Stream never took off is that they simply didn't want to put money into it. Same for OpenCL right now, or GPU-accelerated Havok, or countless other examples. Just because Nvidia has money doesn't mean they have to let it go down the drain.


----------



## Wile E (May 30, 2010)

Benetanegia said:


> *Like I said, because that hacker has never QA'd it.* Thousands of people on the internet have done it for free. If the hacker had to pay everyone who helped him make the mod stable, he would need to be a multibillion-dollar company, and have a real interest in spending that much on it too. Besides, they do a lot more reverse engineering at NGOHQ than is "socially acceptable" in the business world, if you know what I mean. They have much more (real, applicable) access to Ati hardware than any company will ever have.
> 
> 
> 
> ...


It was working fine before nV blocked it. We didn't even need the hacker.


----------



## Mussels (May 30, 2010)

Wile E said:


> Stop using Batman as an example. It's a poor one, and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work in ATI properly, even when you force it.



Not just AA; the items/debris that suddenly appear with PhysX on, as well. It's got nothing to do with PhysX, as IIRC it was in the console versions.


I use it as an example because I keep hearing crap about it from nvidia users... and AA works just fine in it on ATI if you force it via CCC.


----------



## Benetanegia (May 30, 2010)

Wile E said:


> It was working fine before nV blocked it. We didn't even need the hacker.



It was working *back then*, and it might work now, just like I can take my 9600 Pro out of the closet, take its driver CD, and play many games. Why the hell do they release GPU drivers every month? The previous ones work just as well...

They don't want to have to worry, EVER. Period. That's why you just cut it off. The other option is to enable it and see the web flooded with complaints. And don't say there would be no complaints, because there have been many complaints about far more irrelevant things than poor fps in a certain game, which is the first symptom that would be noticed.



Mussels said:


> Not just AA; the items/debris that suddenly appear with PhysX on, as well. It's got nothing to do with PhysX, as IIRC it was in the console versions.
> 
> 
> I use it as an example because I keep hearing crap about it from nvidia users... and AA works just fine in it on ATI if you force it via CCC.



When forced from CCC it's not Nvidia's AA; it's the normal supersampling AA that is *always* possible.


----------



## Mussels (May 30, 2010)

Benetanegia said:


> It was working *back then*, and it might work now, just like I can take my 9600 Pro out of the closet, take its driver CD, and play many games. Why the hell do they release GPU drivers every month? The previous ones work just as well...
> 
> They don't want to have to worry, EVER. Period. That's why you just cut it off. The other option is to enable it and see the web flooded with complaints. And don't say there would be no complaints, because there have been many complaints about far more irrelevant things than poor fps in a certain game, which is the first symptom that would be noticed.
> 
> ...



No one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place. (Why not allow ATI to have supersampling AA in the in-game options?)


----------



## Wile E (May 30, 2010)

Mussels said:


> No one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place. *(Why not allow ATI to have supersampling AA in the in-game options?)*



Doesn't work properly on the hardware. That was proven a long time ago with back-to-back screenshots with it enabled on ATI, when people first started complaining about it.

Or if you mean regular SSAA, it's because it's not in the engine at all. It would just be the equivalent of forcing it through the CCC anyway.


----------



## Mussels (May 30, 2010)

Wile E said:


> Doesn't work properly on the hardware. That was proven a long time ago with back-to-back screenshots with it enabled on ATI, when people first started complaining about it.



Screenshots aren't the best way to confirm AA is working; if it's done in post-processing, it doesn't show up in screenies. I've heard this many times before as the reason screenshots look worse than in-game.


----------



## Benetanegia (May 30, 2010)

Mussels said:


> No one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place. (Why not allow ATI to have supersampling AA in the in-game options?)



MY GOD!!! This has been discussed thousands of times. Unreal Engine 3 has no AA, and no other UE3 game besides Batman has an in-game AA option. If you want to enable it, it has to be done from CCC... why does Batman have to be any different?? Nvidia didn't block anything; they added their own AA. Period.


----------



## Wile E (May 30, 2010)

Mussels said:


> Screenshots aren't the best way to confirm AA is working; if it's done in post-processing, it doesn't show up in screenies. I've heard this many times before as the reason screenshots look worse than in-game.



Even the people running it said there was no difference.

They didn't disable anything for ATI. They added a feature for themselves. Two entirely different things. If they disabled shit for ATI, we would already be hearing about antitrust/anti-competitive lawsuits or investigations. Nothing supports your theory, Mussels.


----------



## TheMailMan78 (May 30, 2010)

Benetanegia said:


> yup, and Ageia went bankrupt...
> 
> Nvidia strugles to stay in green. Hell even Ati struggles and the reason that Stream never kicked off is because they simply didn't want to put money on it. Same for OpenCL right now, or GPU accelerated Havok or countless of other examples. Just because Nvidia has money doesn't mean they have to let it go down the drain.



Ageia went bankrupt because they had no financial backing to push PhysX into the mainstream, not because they couldn't support the QA monetarily. Nvidia bought them out because they could. So now you are basically saying PhysX is a waste of time, and that is why Nvidia doesn't do the "QA".

Come on, man, admit it! Nvidia blocked a feature so that you would buy their hardware exclusively. All that hacker did was re-enable it. This has nothing to do with QA and everything to do with investment. If Nvidia were smart and REALLY wanted PhysX to go mainstream, they would sell dedicated PPUs like AGEIA did. However, they won't. Why? Because they think they bought the golden goose with PhysX. Problem is, none of the developers seem to agree unless you toss a bucket of money at them with the TWIMTBP program.


----------



## Mussels (May 30, 2010)

I wasn't the one who brought up AA! I said Batman: AA, as in Batman: Arkham Asylum...

I was talking about the other stuff: the random debris and effects that get disabled without PhysX, when they don't need it for those effects.


----------



## Wile E (May 30, 2010)

Mussels said:


> I wasn't the one who brought up AA! I said Batman: AA, as in Batman: Arkham Asylum...
> 
> I was talking about the other stuff: the random debris and effects that get disabled without PhysX, when they don't need it for those effects.



So it's just an anti-PhysX post, then? Coming from someone who considers you a friend, that seems a bit trollish to me, Mussels.

PhysX is capable of much more. It doesn't have the market share for devs to use it for anything more, though. Non-nV users still need to want to play the games, and using PhysX too heavily counts them out of the super-advanced features. That IS nV's fault, however, and this blocking of PhysX on systems with ATI is one of the prime reasons.


----------



## Benetanegia (May 30, 2010)

Mussels said:


> I wasn't the one who brought up AA! I said Batman: AA, as in Batman: Arkham Asylum...
> 
> I was talking about the other stuff: the random debris and effects that get disabled without PhysX, when they don't need it for those effects.



PhysX is hardware accelerated on at least the PS3; maybe that's why they have certain features. The Xbox can handle 3 threads, so they might have one just for PhysX too. As much as you might disagree, NO, that cannot be done on the PC, because most people don't have quads. And games using PhysX, when running it on the CPU, do use 2 threads, although how many to use is something the developer decides, always based on the lowest common denominator. For comparison, Havok games use only one CPU core; I have tested that myself, with NFS: Shift, Batman and Mass Effect for PhysX, and Fallout 3, L4D and HL2: Ep2 for Havok.


----------



## lyndonguitar (May 30, 2010)

I have a question: which company is supporting Crysis 2 right now, Nvidia or ATI? I saw some demos of Crysis 2 run on Eyefinity hosted by ATI, and I'm wondering because if it's Nvidia again, PhysX will be there and I don't have a PhysX card (yet). If it's Nvidia, might as well buy a cheap Nvidia card.


----------



## Benetanegia (May 30, 2010)

TheMailMan78 said:


> If Nvidia was smart and REALLY wanted Phyisx to go mainstream they would sell dedicated PPU's like AGIEA.



They don't do that, because that business model was proven a failure. Ageia already did it, and everybody at the time agreed that a PPU could be a good idea as long as it was integrated, either into the motherboard or the GPU, and Nvidia did exactly that: integrated it into the GPU.



> Problem is none of the developers seem to agree unless you toss a bucket of money at them with TWIMTBP program.



That is not a problem with PhysX. That is a problem that affects PhysX, affects the inclusion of dedicated servers, affects the optimization of the PC port, affects the UI... Without the push from PC-centric companies, most games would lack dedicated servers, would look like a game from 2000 while running so badly they would seem to be running off a Pentium 2, and you would have to use the arrow keys to aim, press triangle to shoot and circle to jump (not that it isn't happening already... i.e. Dead Space? Street Fighter 4?).


----------



## newtekie1 (May 30, 2010)

Mussels said:


> not just AA, the items/debris that suddenly appears as well, with physX on. its got nothing to do with physX as iirc, it was on the console versions.
> 
> 
> I use it as an example because i keep hearing crap about it from nvidia users...and AA works just fine in it on ATI, if you force it via CCC.



No, it wasn't on the console versions, at least not the PS3 version.  The console versions looked the same as the PC version with PhysX turned off.

The AA forced through CCC is FSAA, not MSAA, which is why the AA enabled through CCC comes at a huge performance hit, and the AA enabled through the game menu doesn't.

Also, the AA that nVidia added to UE3 for Batman doesn't work on ATi hardware as it wasn't coded for ATi hardware.  This is identical to your argument about CUDA not working on ATi hardware.  It wasn't coded for ATi hardware, so it doesn't work on ATi hardware.


----------



## GenL (May 30, 2010)

Guys, just to let you know... about a month ago i made a fix for Batman which makes the game use its native MSAA code on any hardware.
You can find more info about it here: http://www.ngohq.com/graphic-cards/17716-batman-arkham-asylum-msaa-fix.html

Such thoughts as...





> doesn't work on ATi hardware as it wasn't coded for ATi hardware


are invalid.
This case has nothing to do with nvidia hardware/technology, just two VendorID checks. One in the launcher, another one in the game.

What about CUDA... that's just badly programmed applications. They are supposed to choose a CUDA GPU by themselves, but some apps just ignore anything but the primary GPU.
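To illustrate how small such a gate is (a made-up sketch, not the actual game code; the vendor IDs are the real PCI ones, everything else is mine):

```python
# Real PCI vendor IDs; the gating logic is an illustration of the two
# VendorID checks described above, not code from the game itself.
NVIDIA_VENDOR_ID = 0x10DE
ATI_VENDOR_ID = 0x1002

def msaa_option_visible(primary_gpu_vendor_id: int) -> bool:
    # Nothing here probes what the GPU can actually do; the option is
    # hidden purely based on who made the card. Removing this one
    # comparison is all a fix has to accomplish.
    return primary_gpu_vendor_id == NVIDIA_VENDOR_ID

if __name__ == "__main__":
    print(msaa_option_visible(ATI_VENDOR_ID))  # option hidden on an ATI card
```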


----------



## the54thvoid (May 30, 2010)

The QA issue is nonsense.  Nvidia does not guarantee 'safety' of its _own drivers_, as clearly stated at the very end of the product info for all its WHQL driver releases.  The following is a direct quote:

"*The software is provided 'as is', without warranty of any kind*, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and noninfringement.  *In no event shall the contributors or copyright holders be liable for any claim, damages or other liability*, whether in an action of contract, tort or otherwise, *arising from, out of or in connection with the software or the use or other dealings with the software*"

I'm sure ATI has the same disclaimer, effectively saying if drivers are dodgy, tough - you installed them.  It's this sort of small print at the end of the release notes that makes a mockery of the notion that, just because the drivers are official, you can use them with absolute certainty that you have recourse to legal action if things go wrong.

So this kind of nullifies any arguments about QA for mixing gfx cards and physx where system damage is the end result.


----------



## Benetanegia (May 30, 2010)

the54thvoid said:


> The QA issue is nonsense.  Nvidia does not guarantee 'safety' of its _own drivers_, as clearly stated at the very end of the product info for all its WHQL driver releases.  The following is a direct quote:
> 
> "*The software is provided 'as is', without warranty of any kind*, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and noninfringement.  *In no event shall the contributors or copyright holders be liable for any claim, damages or other liability*, whether in an action of contract, tort or otherwise, *arising from, out of or in connection with the software or the use or other dealings with the software*"
> 
> ...



That disclaimer is as useless as the EULA in games. Maybe it has legal weight in the US, but outside of the US it's useless. They can say as much as they want, but, at least in the EU, they have legal responsibility no matter what they say. Laws are always above any contract.

Those disclaimers and the EULA are put there in order to make people think they can't do anything if something goes wrong and from what I see, it works.


----------



## wahdangun (May 30, 2010)

Benetanegia said:


> That disclaimer is as useless as the EULA in games. Maybe it has legal weight in the US, but outside of the US it's useless. They can say as much as they want but, at least in the EU, they have legal responsability no matter what they say. Laws are always above any contract.
> 
> Those disclaimers and the EULA are put there in order to make people think they can't do anything if something goes wrong and from what I see, it works.



hmmm, so if Nvidia's QA is so great, why did they release a WHQL driver that shut off the fan and made cards overheat?

it seems to me they don't take QA seriously


----------



## TheMailMan78 (May 30, 2010)

wahdangun said:


> hmmm so if QA nvdia so great why they release WHQL driver that shut off the fan and make the card overheating.
> 
> it seems to me they don't take QA seriously



There is no point. He's blinded by PR rhetoric.


----------



## newtekie1 (May 30, 2010)

wahdangun said:


> hmmm so if QA nvdia so great why they release WHQL driver that shut off the fan and make the card overheating.
> 
> it seems to me they don't take QA seriously



Yeah, that is definitely proof that nVidia doesn't care at all about QA.

Or maybe it is a sign that they are already working with a QA department that is stretched thin enough to let the occasional thing slip through.  So adding more QA workload is probably a bad idea...

But yeah, the nVidia doesn't care about QA theory makes far more sense.


----------



## GenL (May 30, 2010)

Guys, just to let you know, i've made a fix for Batman to make the game use its native MSAA code about a month ago. You can find it on NGOHQ.

As for





> doesn't work on ATi hardware as it wasn't coded for ATi hardware


, you are wrong if you think so. It's all about 2 VendorID checks - one in the launcher, another one in the game.


----------



## TheMailMan78 (May 30, 2010)

GenL said:


> Guys, just to let you know, i've made a fix for Batman to make the game use its native MSAA code about a month ago. You can find it on NGOHQ.
> 
> As for, you are wrong if you think so. It's all about 2 VendorID checks - one in the launcher, another one in the game.



Link?


----------



## Loosenut (May 30, 2010)

Welcome to TPU GenL


----------



## wahdangun (May 30, 2010)

GenL said:


> Guys, just to let you know, i've made a fix for Batman to make the game use its native MSAA code about a month ago. You can find it on NGOHQ.
> 
> As for, you are wrong if you think so. It's all about 2 VendorID checks - one in the launcher, another one in the game.



wow, so that's true after all,

where is the link man ?


----------



## newtekie1 (May 30, 2010)

TheMailMan78 said:


> Link?



It doesn't work, there was a long thread about it and the people that tested it were able to enable the option in the options, but it didn't actually do anything.  No AA was applied, and I seem to remember it actually caused the game to crash sometimes.

Edit:  This specific post talks about the issues with forcing the vendor ID, which shouldn't be any different than removing the vendor ID checks.  Basically, AA doesn't get enabled, or at least not the way it should, with some things having some AA applied, but others none at all.  It just doesn't work right on ATi hardware.  AFAIK, no one has got it working properly.  You can enable the option in the settings, but it doesn't actually work.


----------



## Benetanegia (May 30, 2010)

wahdangun said:


> hmmm so if QA nvdia so great why they release WHQL driver that shut off the fan and make the card overheating.
> 
> it seems to me they don't take QA seriously



In fact that's the point. That driver didn't shut off the fan on every card, not at all, only on a very small number of them, and look what happened. Yet you can bet that someone got punished in the QA department anyway.

Besides, I never said they are great or serious about QA, not at all. I said they don't want to spend money QAing something that would not make them more money than it costs. It's that simple, and at this point I don't care if you all understand that simple concept or not.


----------



## wahdangun (May 30, 2010)

newtekie1 said:


> It doesn't work, there was a long thread about it and the people that tested it were able to enable the option in the options, but it didn't actually do anything.  No AA was applied, and I seem to remember it actually caused the game to crash sometimes.



so can u give me the link to the thread, newtekie?


----------



## newtekie1 (May 30, 2010)

wahdangun said:


> so can u give me the link to the thread newtikie?



I did even better, I linked to the specific post addressing the issue.

I was already looking for it when you posted.


----------



## wahdangun (May 30, 2010)

newtekie1 said:


> I did even better, I linked to the specific post addressing the issue.
> 
> I was already looking for it when you posted.



what is that link? it's not from NGOHQ?

btw i just want to see the original link, so maybe GenL has fixed the issue


----------



## GenL (May 30, 2010)

wahdangun said:


> where is the link man ?


http://www.ngohq.com/graphic-cards/17716-batman-arkham-asylum-msaa-fix.html

Or just google "batman msaa" - you'll get lucky.


----------






## xtremesv (May 30, 2010)

driver66 said:


> For all of the bitching about how physx sucks and is "useless" These threads sure do generate A LOT of interest



This thread's comment section is extensive by now. I think driver66 is right: we ATi owners are making a big deal of this stuff and handing Nvidia arguments about PhysX and CUDA. I wouldn't be surprised if Nvidia's marketing department started using some ATi users' comments to advertise something like "PhysX and CUDA make them envy you".

If you really want to try hybrid PhysX out, use the beta driver or the hack, and play.


----------



## wahdangun (May 30, 2010)

GenL said:


> Link: www.ngohq.com/graphic-cards/17716-batman-arkham-asylum-msaa-fix.html
> 
> Or just google "batman msaa" - you'll get lucky.





btw, why this "fix" was detected as virus with my avira?


----------



## GenL (May 30, 2010)

xtremesv said:


> I wouldn't be surprised that Nvidia's marketing department starts using some ATi users' comments to advertise something like "PhysX and CUDA makes them envy you".


And they would be wrong about this. It's not about envy, it's about the limitations they are creating to advertise their products. People just don't like these limitations. You know, fps dropping by half after breaking a glass in-game, or the inability to use in-game MSAA, look more like limitations to me.
People are still using nvidia GPUs to process PhysX, which is 100% legitimate since nvidia said their GPUs can process PhysX, don't you think? Why say they envy anyone, when they are just using a product they own for the task it's supposed to handle?



wahdangun said:


> btw, why this "fix" was detected as virus with my avira?


Because your avira doesn't know how to deal with some packers, i suppose.
Didn't you know that the Hybrid PhysX mod is also virus-tagged because of antivirus mistakes?..


----------



## newtekie1 (May 30, 2010)

wahdangun said:


> what is that link? it's not from NGHOK ?
> 
> btw i just want to see the original link, so maybe GenL have fix the issue



Sorry, that was the link to the original discussion when the Batman AA thing was big in the news.  They didn't remove the vendor ID checks, they just emulated the vendor ID.  The result was the same though: it allowed MSAA to be enabled in Batman, and it didn't work properly. It sort of worked, but not totally.



GenL said:


> Because your avira doesn't know how to deal with some packers, i suppose.
> Didn't you know that the Hybrid PhysX mod is also virus-tagged because of antivirus mistakes?..



Actually, if you believe Regeneration from NGOHQ, it is a big conspiracy by nVidia... because nVidia has so much power they can get anti-virus makers to put false positives in their software...


----------



## GenL (May 31, 2010)

newtekie1 said:


> It sort of worked, but not totally.


I don't quite understand this... For me it worked perfectly (the fix, not emulation); I mean the game was using its own MSAA without any problems. If by "not totally" someone means that this native MSAA doesn't look perfectly nice, that's because it's not a regular AA scheme, but a custom one implemented at the engine level with some limitations. That doesn't change the fact that this MSAA looks the same on nvidia and ati GPUs, though.
My screenshots of this MSAA can be found at the NGOHQ topic.



newtekie1 said:


> Actually, if you believe Regeneration from NGOHQ, it is a big conspiricy by nVidia...because nVidia has so much power they can get anti-virus makers to put false positives in their software...


Nope. Perhaps some people who screamed about this were somehow biased, but the whole false-positive thing was caused by the packer and bad antivirus logic only.

You know, it's just a trivial thing today. I remember the very first time i was browsing through the official Batman AA forums' content section, where users put their custom skins up for download, some helpful tool they were using was flagged as a virus too. Why didn't they scream about it? Because they understand it's not a big deal when some AV software makes a mistake.


----------



## Mussels (May 31, 2010)

avira is just a bad AV, thats why.

thanks for posting that fix man


----------



## newtekie1 (May 31, 2010)

GenL said:


> Nope, perhaps some people who screamed about this were somehow biased, but the whole false-positive thing was caused by the packer and bad antivirus logic only.
> 
> You know, it's just a trivial thing today. I remember the very first time i was browsing through official Batman AA forums' content section, where users put their custom skins for download, some helpful tool they were using was flagged as a virus too. Why don't they scream about it? Because they understand it's not a big deal when some AV software does a mistake.



You misunderstand me.  I totally agree that it is a false positive.  However, the last time something came from NGOHQ that was a false positive, several sites picked it up and reported that it contained a virus according to several anti-virus programs.  Yes, that is what tech sites do.  One site posts that a popular hack contains a virus, and the rest report the same.  Regeneration completely flew off the handle, and started going on rants about how it is a scam, that nVidia bribed all those sites to report it as a trojan...

These sites included far better sites than his, including HardOCP and Guru3D.  He even claims both sites are corrupt; in fact he claimed every site that reported the news was part of the scam and corrupt.

Of course he also blames eVGA, and by extension nVidia, because there was discussion about it on their forums by users with no affiliation with eVGA or nVidia...

I'm sure that now that your hack has been reported as a virus here, he will do the same and claim TPU is in nVidia's pocket and it is a big scam.

Of course, if a more legitimate packer was used for these mods, and not one generally used by viruses to avoid detection, this wouldn't have been a problem to begin with...


----------



## Mussels (May 31, 2010)

newtekie1 said:


> You mistunderstand me.  I totally agree that it is a false positive.  However, the last time something came from NGOHQ that was a false postive, several sites picked it up and reported that it contained a virus according to several anti-virus programs.  Yes, that is what tech sites do.  One site posts a popular hack contains a virus, and the rest report the same.  Regeneration completely flew off the handle, and started going on rants about how it is a scam that nVidia bribed all those sites to report it as a trojan...
> 
> These sites included far better sites then his, including HardOCP and Guru3d.  He even claims both sites are corrupt, in fact he claimed every site the reported the news was part of the scam and corrupt.
> 
> ...



his mod shows clean in kaspersky. its not the mod-makers faults that some people insist on using shitty heuristics based AV's.


----------



## newtekie1 (May 31, 2010)

Mussels said:


> his mod shows clean in kaspersky. its not the mod-makers faults that some people insist on using shitty heuristics based AV's.



It doesn't matter, that isn't the point, false positives happen with every AV.  

The point is both mods are packed in a way that is known to cause false positives.  The mod makers know this, yet they continue to do it.  There is no reason they can't pack their files in a way that isn't known to cause false positives.  It certainly is the mod-maker's fault that they use shitty non-standard packers.

And the main point is that when the false positives happened, and were reported, Regeneration went apeshit calling every site that reported the news corrupt and part of a scam against NGOHQ.


----------



## Wile E (May 31, 2010)

I want to see more comparison screen shots with and without in game AA, on both brands of cards at the same settings.


----------



## Wile E (May 31, 2010)

And GenL, you really need to use a different packing method. It comes up dirty as hell on VirusTotal.

http://www.virustotal.com/analisis/...bd01ed51dd3d6ffcd68715c5fb203c2fd6-1274120389


----------



## newtekie1 (May 31, 2010)

Wile E said:


> ANd GenL, you really need to use a different packing method. It comes up dirty as hell on VirusTotal.
> 
> http://www.virustotal.com/analisis/...bd01ed51dd3d6ffcd68715c5fb203c2fd6-1274120389



Shhhh...

It isn't the mod-maker's fault, all those anti-viruses are just shit.


----------



## Regeneration (May 31, 2010)

newtekie1 said:


> Actually, if you believe Regeneration from NGOHQ, it is a big conspiricy by nVidia...because nVidia has so much power they can get anti-virus makers to put false positives in their software...



That wasn't just a regular false positive! It was a gamer-virus false positive. Jeeeee!! What a coincidence.

Instead of checking up on those false positive reports, both sites rushed to censor the word NGOHQ from their sites and warn users not to download anything from our site.

Those sites seem to think they should report any news that hits any other site without investigation, yet they haven't reported the latest PhysX development - like all the other sites have - while they do report some random false-positive virus responses to known wrappers.

Just recently Gabriel Torres from Hardware Secrets got blacklisted by Nvidia because he refused to be their puppet.

So yes, I believe someone has intentionally tricked Symantec. Who? I don't know. But if it looks like a duck and sounds like a duck, good chances that... it is a duck.


----------



## Mussels (May 31, 2010)

newtekie1 said:


> Shhhh...
> 
> It isn't the mod-maker's fault, all those anti-viruses are just shit.



IMO... yeah, the ones that showed it as a virus ARE shit AV's. kaspersky, nod32, arent in there. neither is MSE for that matter.


still, with so many shit AV's on the market, it is best to pack things in a way they wont whine about.


----------



## Wile E (May 31, 2010)

Mussels said:


> IMO... yeah, the ones that showed it as a virus ARE shit AV's. kaspersky, nod32, arent in there. neither is MSE for that matter.
> 
> 
> still, with so many shit AV's on the market,* it is best to pack things in a way they wont whine about*.



Exactly. I trust that it's clean, but many wouldn't. Put that up on Demonoid, for instance, and it would get nuked the instant somebody posted those VT results.

I agree, most of them aren't very good AVs, but that many hits is going to cause controversy.


----------



## wahdangun (May 31, 2010)

so how exactly does this patch work?

and btw, will this patch affect my save game?


----------



## GenL (May 31, 2010)

newtekie1 said:


> You mistunderstand me.


I believe i understand you. Such rants can always be biased (i mean on all sides - fud is everywhere, you know!), but everyone has to agree that it came from HardOCP first. Yes, i disagree with some points Regeneration posted at NGOHQ about this, as i stated in my comments there, although they all seem valid for most users who don't deeply understand how AV software works. It's not a big deal.
Accusations without any proof are what i really refuse to understand. Moreover, the original author of those claims refused to even try to get any proof after i contacted him. So i'll just stick with the fact that all those tech-site relations are serious business, and i don't really need/want to understand their policies.



Wile E said:


> you really need to use a different packing method.





newtekie1 said:


> It isn't the mod-maker's fault, all those anti-viruses are just shit.


I suggest everyone who wants to understand my point about this, to spend some time and read a topic at Rage3D: www.rage3d.com/board/showthread.php?t=33962732
My standpoint is fully explained in my posts there.



wahdangun said:


> so how exactly this patch work?
> 
> and btw is this patch effect my save game?


How it works is explained just under the screenshots in the original topic: http://www.ngohq.com/graphic-cards/17716-batman-arkham-asylum-msaa-fix.html
No, it shouldn't affect your savegames.



Wile E said:


> I want to see more comparison screen shots with and without in game AA, on both brands of cards at the same settings.


I would like to see it too. I can't do it by myself, sorry.


----------



## newtekie1 (May 31, 2010)

Regeneration said:


> That wasn't just a regular false positive! It was a gamer-virus-false-positive. Jeeeee!! What a coincidence.
> 
> Instead of checking up those false positive reports, both sites rush to censor the word NGOHQ from their sites and warn users not to download anything from our site.
> 
> ...



Look at the god damn results, there are like 20 anti-viruses that report the false positive.  Use a different wrapper that doesn't trip AVs, and stop all this "nVidia is out to get us" crap.  Symantec certainly has no reason to bow down to nVidia and put false positives in their software because nVidia wanted them to.  It is completely idiotic to even suggest that the false positive was caused by anything other than your use of a poor wrapper.

And Gabriel Torres deserved to be blacklisted.  His reviews are complete shit; he wants nVidia to keep giving him free samples, but won't even put CUDA and PhysX in a feature list when asked.  I wouldn't give him free shit either.


----------



## Regeneration (May 31, 2010)

newtekie1 said:


> Look at the god damn results, there are like 20 anti-viruses that report the false positive.  Use a different wrapper that doesn't trip AVs, and stop all this BS nVidia is out to get us crap.  Symantec certainly has no reason to bow down to nVidia and put false positives in their software because nVidia wanted them to.  It is completely idiotic to even suggest that the false positive was caused by anything other then your use of a poor wrapper.



You twisted my post. False positives usually look like this:

http://www.virustotal.com/analisis/...4e442ccc23cedec214b355c04157061fdb-1274174744

It is pretty simple to take a binary, bundle it with a trojan and submit it to AV vendors. Anyone can do it.


----------



## Wile E (Jun 1, 2010)

Regeneration said:


> You twisted my post. False positives usually look like this:
> 
> http://www.virustotal.com/analisis/...4e442ccc23cedec214b355c04157061fdb-1274174744
> 
> It is pretty simple to take a binary, bundle it with a trojan and submit it to AV vendors. Anyone can do it.



If I didn't already know, test and trust the patch myself, and have a basic understanding of AV and packing, 11 hits would earn it a nuke in my book. I won't mess with an 11-hit file unless I have no other choice, and then I sandbox it first. 4 or 5 hits on the lesser AVs and I might take a chance; start getting into double digits and I'm looking for something else if at all possible.

Somebody donate him a better packer if need be, and the controversy disappears altogether. Shit, I'd donate to get him a better packer that doesn't throw a million false positives.
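Written out as code, my rule of thumb looks something like this (just a sketch of how I weigh scanner-hit counts; the thresholds are my own arbitrary ones, not a real malware test):

```python
def verdict(hit_count: int) -> str:
    # Rough personal heuristic: decide by the number of AV engines that
    # flag the file, with no actual analysis of the file itself.
    if hit_count >= 10:
        return "nuke"           # double digits: look for something else
    if hit_count >= 4:
        return "sandbox first"  # a handful of lesser-AV hits: maybe risk it
    return "probably clean"

if __name__ == "__main__":
    for hits in (0, 5, 11):
        print(hits, "->", verdict(hits))
```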


----------



## GenL (Jun 1, 2010)

Wile E said:


> Somebody donate him a better packer if need be, and the controversy disappears altogether. Shit, I'd donate to get him a better packer that doesn't throw a million false positives.


That's what i tried to explain at Rage3D - it really has nothing to do with money, as money won't help here.


----------



## Fourstaff (Jun 1, 2010)

Anyway, back on topic: I think Nvidia is less benevolent than we think:
http://www.anandtech.com/show/3744/...terogeneous-gpu-physx-its-a-bug-not-a-feature


----------



## GenL (Jun 1, 2010)

Fourstaff said:


> Anyway, back on topic: I think Nvidia is less benevolent than we think:
> http://www.anandtech.com/show/3744/...terogeneous-gpu-physx-its-a-bug-not-a-feature


Better source: http://blogs.nvidia.com/ntersect/2010/05/update-on-release-256-physx-support-1.html


----------



## TRIPTEX_CAN (Jun 1, 2010)

So I got some "new" shiny to play with. I'll be testing them tonight individually (rig in specs) but I'm a little worried about power draw. I think I'll most likely underclock the cards to around 500Mhz core. Do you guys think my system will be able to cope? Keep in mind my 5970 runs stock clocks. 

Also assuming both cards check out and run perfectly there is a chance one will go up for sale if anyone is interested they can PM me.


----------



## OnBoard (Jun 1, 2010)

TRIPTEX_MTL said:


> I think I'll most likely underclock the cards to around 500Mhz core. Do you guys think my system will be able to cope?



Underclock memory as low as it goes and if you underclock core make sure it's not linked with shaders. Although it might have plenty of performance on lower shaders too. And yes, should cope. PhysX doesn't peak the cards power consumption.


----------



## newtekie1 (Jun 1, 2010)

Wile E said:


> If I didn't already know, test and trust the patch myself, and have a basic understanding of AV and packing, 11 hits earns it a nuke in my book. I won't mess with an 11 hit file unless I have no other choice, then I sandbox it first. 4 or 5 hits on the lesser AVs, and I might take a chance, start getting into double digits, and I'm looking for something else if at all possible.
> 
> Somebody donate him a better packer if need be, and the controversy disappears altogether. Shit, I'd donate to get him a better packer that doesn't throw a million false positives.



You know, I would think it would be good practice to run any file you are going to distribute through that checker; if it generates false positives, fix it before making it public... and of course don't fly off the handle with conspiracy theories about everyone running a scam against you with nVidia at the top when reports start coming in about the false positives...

And isn't the 7-Zip packer free?
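Even without changing packers, publishing a checksum next to the download would let users confirm the flagged file is byte-for-byte the one the author scanned and vouched for. A minimal sketch (the function names are mine, nothing official):

```python
import hashlib

def sha256_bytes(data: bytes) -> str:
    """Hex SHA-256 of the exact bytes you are about to publish."""
    return hashlib.sha256(data).hexdigest()

def sha256_of(path: str) -> str:
    """Hash a release file in binary mode, chunked so large files are fine."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()
```

The author posts `sha256_of("mod.zip")` with the release; anyone whose AV flags a download can at least verify they got an unmodified copy before deciding whether to trust it.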


----------



## dadi_oh (Jun 1, 2010)

OnBoard said:


> Underclock memory as low as it goes and if you underclock core make sure it's not linked with shaders. Although it might have plenty of performance on lower shaders too. And yes, should cope. PhysX doesn't peak the cards power consumption.



My GT240 with only 96 shaders is used less than 60% even in Batman Arkham Asylum. In UT3 Physx levels it tops at about 25% GPU usage. Fluidmark benchmark only gets it to about 45% usage. That suggests to me that your 112 cores on the 8800GT would be just fine running at a lower clock to reduce temps and fan noise (those single slot 8800GT coolers are noisy). And as suggested I don't believe memory bandwidth plays much of a role at all so you could downclock that a bit too.
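For what it's worth, usage numbers like these are easy to reproduce. Assuming a driver new enough to ship `nvidia-smi`, a small sketch that polls per-GPU utilization (the parsing helper is my own) could look like:

```python
import subprocess

def parse_utilization(csv_text: str) -> list:
    """Turn `nvidia-smi --query-gpu=utilization.gpu --format=csv` output
    (a header row, then one '37 %'-style row per GPU) into integers."""
    return [int(line.split()[0]) for line in csv_text.strip().splitlines()[1:]]

def gpu_utilization() -> list:
    # Requires an NVIDIA driver with nvidia-smi on the PATH.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_utilization(out)

if __name__ == "__main__":
    print(gpu_utilization())  # one percentage per installed GPU
```

Polling this while a PhysX title runs would show directly how much headroom the dedicated card has.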


----------



## TRIPTEX_CAN (Jun 1, 2010)

OnBoard said:


> Underclock memory as low as it goes and if you underclock core make sure it's not linked with shaders. Although it might have plenty of performance on lower shaders too. And yes, should cope. PhysX doesn't peak the cards power consumption.



Thanks, I've never clocked an NV card before so I'm not really sure where to start. I have the feeling I can use EVGA precision with any NV GPU so I'll probably start with that. From what I understand physx power draw is pretty low and physx doesn't even saturate PCI-e x1.


----------



## newtekie1 (Jun 1, 2010)

dadi_oh said:


> My GT240 with only 96 shaders is used less than 60% even in Batman Arkham Asylum. In UT3 Physx levels it tops at about 25% GPU usage. Fluidmark benchmark only gets it to about 45% usage. That suggests to me that your 112 cores on the 8800GT would be just fine running at a lower clock to reduce temps and fan noise (those single slot 8800GT coolers are noisy). And as suggested I don't believe memory bandwidth plays much of a role at all so you could downclock that a bit too.



I just did an interesting quick write up on this exact subject.

Using a 9600GT LP, which is already underclocked from stock 9600GT specs, with only 64 shaders, there was actually no performance benefit to having it in my main rig as a dedicated PhysX card alongside a GTX470, tested with Batman:AA.

Underclocking the 9600GT as low as the sliders would let me go in MSI Afterburner dropped the FPS about 3-5FPS.

So even an 8800GT is probably overkill, even if everything is underclocked as low as possible.



TRIPTEX_MTL said:


> Thanks, I've never clocked an NV card before so I'm not really sure where to start. I have the feeling I can use EVGA precision with any NV GPU so I'll probably start with that. From what I understand physx power draw is pretty low and physx doesn't even saturate PCI-e x1.



I'd try using the latest version of MSI Afterburner (1.6.0 Beta 6); it works with any nVidia card like Precision does, but seems to work better with the latest drivers.  I had some problems getting the latest version of Precision to read the clock speeds properly and actually overclock my GTX470 when I moved the sliders.


----------



## TRIPTEX_CAN (Jun 1, 2010)

I wonder if it will conflict with my 5970 though. Unless the latest AB 1.6.0 has a drop-down to choose which card to configure... I use AB to clock my 5970 and I can't see how I would differentiate between GPUs in the program.

The older Precision should recognize the 8800GT regardless of drivers, no? Unless, in the case of your 470, Precision worked until you changed drivers; then that's another issue.


----------



## newtekie1 (Jun 1, 2010)

TRIPTEX_MTL said:


> I wonder if it will conflict with my 5970 though. Unless the latest AB 1.6.0 has a drop-down to choose which card to configure... I use AB to clock my 5970 and I can't see how I would possibly differentiate between GPUs in the program.
> 
> The older precision should recognize the 8800GT regardless of drivers no? Unless in the case of your 470 precision worked until you changed drivers then that's another issue.



I only ever tried it with the latest drivers, so I'm not sure; it could be the card or the drivers.

Afterburner has a place in the settings to switch GPUs, though I don't know if it will apply the different profiles/clocks to both cards at the same time, or if it will only work on one card at a time.


----------



## TRIPTEX_CAN (Jun 1, 2010)

newtekie1 said:


> I only ever tried with the latest drivers, so I'm not sure, it could be the card or the drivers.
> 
> Afterburner has a place in the settings to switch GPUs, though I don't know if it will apply the different profiles/clocks to both cards at the same time, or if it will only work on one card at a time.



OK, I'll look for it but I plan to keep the 5970 stock for most of the tests tonight so I think AB should work.


----------



## TRIPTEX_CAN (Jun 2, 2010)

One card smoked when I powered up the system, and when the other card was installed the system did boot but wouldn't display video.


----------



## OnBoard (Jun 2, 2010)

TRIPTEX_MTL said:


> One card smoked when I powered up the system, and when the other card was installed the system did boot but wouldn't display video.



Eew, now you have to merge them together to get one working card


----------



## dadi_oh (Jun 2, 2010)

TRIPTEX_MTL said:


> One card smoked when I powered up the system, and when the other card was installed the system did boot but wouldn't display video.



Probably not related, but I did need to make a dummy plug for the PhysX card. Otherwise the nvidia driver disables it.


----------



## erocker (Jun 2, 2010)

dadi_oh said:


> Probably not related, but I did need to make a dummy plug for the PhysX card. Otherwise the nvidia driver disables it.



If your monitor has two inputs you can just hook up the extra input to the PhysX card. Works for me.


----------



## TRIPTEX_CAN (Jun 2, 2010)

The card didn't post video from boot, so I don't think drivers were the issue. I believe the second card is functioning but I didn't try it in the primary slot. 

Meh, I might try again later.


----------



## dadi_oh (Jun 2, 2010)

erocker said:


> If your monitor has two inputs you can just hook up the extra input to the PhysX card. Works for me.



Now that's something I never thought of. I have a bunch of 75-ohm resistors, so I just make dummy plugs with DVI-to-VGA adapters.


----------

