# Details on Nvidia PhysX - please read



## Mussels (Mar 24, 2010)

I'm posting this here to help clarify some things I keep seeing around the place about PhysX - there seem to be some major misconceptions and misunderstandings relating to it.


PhysX is a physics engine made by Ageia and later bought by Nvidia. It supports both a software mode and hardware acceleration.

In pretty much every thread I see about Nvidia vs ATI (so, anything involving video cards, really) someone ALWAYS mentions PhysX - either that Nvidia is superior for having it, or that ATI can't do it (or can, with a second Nvidia card), and various other things.


Well, here's the part no one seems to mention: how many games actually USE it?

This Wikipedia page looks impressive - check out all those titles, even a few big names in there (Batman AA, Borderlands, Dragon Age, etc.).

Ok cool, so hurry up Mussels and get to the point.

The point is: which of those titles support HARDWARE acceleration - as in, which games actually benefit from having an Nvidia GPU?


Hmmm, 15 games. Let's count the big titles: Unreal Tournament 3, Mirror's Edge, Batman AA, and GRAW 2. Ummm, yeah. Less than impressive.

When you consider that in those games PhysX pretty much amounts to non-interactive, 'looks only' physics effects (waving cloth, or random added debris that hurts FPS rather than helping), PhysX starts to lose its shine. In fact, the only reason it seems to have taken off is a program not listed, and that's 3DMark Vantage. Vantage uses PhysX to calculate the CPU score, so Nvidia users get a massively higher CPU score in that program.


So after my wall of text, what's the point? Hardware PhysX is basically worthless - do NOT base any purchase on hardware PhysX support.


P.S. - this is not an Nvidia-bashing thread, which many of these turn into. Should that happen, I'll just lock the thread - at least then people can still read the truth about PhysX without getting yelled at by fanboys from both sides of the fence.


----------



## Grings (Mar 24, 2010)

I've always disabled it in the Nvidia driver anyway, but I didn't actually realise how few games used it.

I did try using my old 8800GTS as a dedicated PhysX card and saw no difference in any of the supporting games. Now I know why...


----------



## brandonwh64 (Mar 24, 2010)

I'm hoping to use it when my VisionTek 5850 gets here tomorrow. I'll have a 9600GT as a dedicated PhysX card.


----------



## Mussels (Mar 24, 2010)

brandonwh64 said:


> I'm hoping to use it when my VisionTek 5850 gets here tomorrow. I'll have a 9600GT as a dedicated PhysX card.



And uhh, what games do you intend to use it on?


----------



## surfingerman (Mar 24, 2010)

Ehh, not a bash thread, eh... could have fooled me. I didn't see ANY positive notes up there. Surely it has positive points - it would be fair and balanced to add some.


----------



## Mussels (Mar 24, 2010)

surfingerman said:


> Ehh, not a bash thread, eh... could have fooled me. I didn't see ANY positive notes up there. Surely it has positive points - it would be fair and balanced to add some.



Such as?

This is about hardware PhysX - which has very little benefit over software PhysX. If this were a bash thread, I'd be talking about how it tends to be buggy and crash-prone as well; I'm not. I'm trying to educate people about the fact that just because a game supports PhysX does NOT mean having an Nvidia card with hardware PhysX will provide any benefit whatsoever.


----------



## brandonwh64 (Mar 24, 2010)

Mussels said:


> And uhh, what games do you intend to use it on?



Borderlands, BC2, Metro 2033, and more.


----------



## r9 (Mar 24, 2010)

PC performance is GPU-bottlenecked 99% of the time, so somehow I always thought the CPU was the way to go for physics. Having spare, unused cores do the physics sounds more logical to me than doing it on something that's already being used to the maximum. And reading this, someone will probably bring up the ATI/NV demos showing that GPUs are superior at this - but CPUs are much easier to program for.
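The split described above is easy to picture, since a physics tick is embarrassingly parallel. Here is a toy Python sketch (purely illustrative, not real engine code - the particle format and worker count are made up) that hands slices of a particle list to spare worker threads each tick:

```python
# Toy sketch: idle CPU cores each integrate a slice of the particles.
# Note: in CPython the GIL limits true parallelism for pure-Python math;
# a real engine would use native threads or processes, but the
# partitioning idea is the same.
from concurrent.futures import ThreadPoolExecutor

GRAVITY = -9.81
DT = 1.0 / 60.0  # one 60 Hz physics tick

def step_slice(particles):
    """Integrate one slice of particles for a single tick (Euler)."""
    for p in particles:
        p["vy"] += GRAVITY * DT
        p["y"] += p["vy"] * DT
        if p["y"] < 0.0:                    # crude ground collision
            p["y"], p["vy"] = 0.0, -p["vy"] * 0.5

def step_world(particles, workers=4):
    """Split the particle list across 'workers' spare cores."""
    chunk = max(1, len(particles) // workers)
    slices = [particles[i:i + chunk] for i in range(0, len(particles), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(step_slice, slices))

debris = [{"y": 10.0, "vy": 0.0} for _ in range(1000)]
step_world(debris)  # every particle has now fallen slightly
```

Because each slice is independent, the work scales with however many cores are sitting idle while the GPU is busy rendering.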


----------



## qubit (Mar 24, 2010)

Thanks for clarifying this for everyone, Mussels. Indeed, current implementations of PhysX only bring in a bit of eye candy, which is a waste of this technology. I've run the PhysX demos from Nvidia, and it looks really awesome when done properly.

I still haven't figured out how to actually see it in action in UT3, either. I've enabled it and used the right map, but still didn't see it. I have no idea what I'm doing wrong and just gave up. 

I'd like to see an industry-standard physics API that's implemented properly in grade-A games. Instead we've got two proprietary systems (PhysX and Havok) that aren't being implemented properly, which sucks.

If this thread isn't yet a sticky, it should be. 

qubit - GTX 285 owner.


----------



## HookeyStreet (Mar 24, 2010)

Good read, thank you. I always felt PhysX was 'a bit of a gimmick'; you've cleared things up for me.


----------



## Mussels (Mar 24, 2010)

brandonwh64 said:


> Borderlands, BC2, Metro 2033, and more.



Good work on actually reading the first post, where I point out that it won't work >.<

BC2 and Borderlands don't support hardware PhysX. I'm not sure about Metro - I haven't played it, and it's new enough that it may not be on the list yet. There is a complete list on Nvidia's site, where only 15 games appear - those are the only titles that get hardware PhysX support (with the exception of some brand-new games that aren't listed yet, not that I'm aware of any).


----------



## brandonwh64 (Mar 24, 2010)

Here is a current list. I didn't realize it was so small.


SOURCE


----------



## Mussels (Mar 24, 2010)

Thanks, Brandon. I didn't realize people wouldn't click the links in my first post and that pictures would be necessary. I might as well add that image to the first post.


----------



## brandonwh64 (Mar 24, 2010)

Mussels said:


> Thanks, Brandon. I didn't realize people wouldn't click the links in my first post and that pictures would be necessary. I might as well add that image to the first post.



A lot of people just skim through a post to get the main idea (I'm guilty of it most of the time too). It's always good to have a picture up for us.


----------



## bobzilla2009 (Mar 24, 2010)

I've always thought of PhysX as just some marketing crap that didn't amount to anything significant. Sure, if Nvidia released an absolutely killer card (which they haven't since the 8 series) that was worth the price, then PhysX would be a nice extra on a very small selection of games.

But it's like buying a card based on how it looks. Sure, you'll feel fuzzy when you first get it, but after 10 minutes it'll be in the case and unlikely to be looked at again - in the end, real performance is what determines the card. PhysX is just some bells and whistles that Nvidia uses to try to justify its ridiculous prices.


----------



## Lionheart (Mar 24, 2010)

Thanks for posting this up, Mussels - physics never interested me anyway.


----------



## brandonwh64 (Mar 24, 2010)

Mussels!!!!!! High Five Post!!!!


----------



## phanbuey (Mar 24, 2010)

PhysX is an interesting idea... but the fact that it is proprietary will kill it.

Doesn't DX11 have some sort of GPU physics built in? I think once a standard adopts physics on graphics hardware it will really take off - it's a great idea, and considering AMD's cards have a bazillion simple shaders, it might work well.


----------



## Mussels (Mar 24, 2010)

phanbuey said:


> PhysX is an interesting idea... but the fact that it is proprietary will kill it.
> 
> Doesn't DX11 have some sort of GPU physics built in? I think once a standard adopts physics on graphics hardware it will really take off - it's a great idea, and considering AMD's cards have a bazillion simple shaders, it might work well.



DirectX 11 does not have a physics engine built in; however, it does have DirectCompute, which is an open equivalent of Nvidia's CUDA and ATI's Stream. If someone writes physics for DirectCompute, then bam - all DX10 (and up) cards have access to a unified physics engine.
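To illustrate the shape such vendor-neutral code takes: a compute-style physics kernel is just a stateless function applied once per element, plus a dispatch call. The sketch below emulates that on the CPU in Python (all names here are made up for illustration; the real thing would be an HLSL compute shader dispatched through Direct3D 11):

```python
# Toy model of a compute-style physics kernel. On real DX11 hardware the
# kernel body would be HLSL and dispatch() would be a GPU compute
# dispatch; here both are emulated on the CPU for illustration only.
DT = 1.0 / 60.0

def integrate_kernel(i, pos_y, vel_y):
    """One 'GPU thread' of work: integrate particle i under gravity."""
    vel_y[i] -= 9.81 * DT
    pos_y[i] += vel_y[i] * DT

def dispatch(kernel, count, *buffers):
    """Stand-in for a compute dispatch: run the kernel once per element."""
    for i in range(count):
        kernel(i, *buffers)

pos_y = [100.0] * 8    # y positions of 8 particles
vel_y = [0.0] * 8
dispatch(integrate_kernel, len(pos_y), pos_y, vel_y)
```

The point is that nothing in the kernel is tied to one vendor - which is exactly why a DirectCompute physics engine would run on any DX10+ card.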


----------



## brandonwh64 (Mar 24, 2010)

If they're going to do this PhysX thing, then instead of working in certain games it should work in EVERY game, no matter who makes it.


----------



## Bjorn_Of_Iceland (Mar 24, 2010)

Mussels said:


> Good work on actually reading the first post, where I point out that it won't work >.<
> 
> BC2 and Borderlands don't support hardware PhysX. I'm not sure about Metro - I haven't played it, and it's new enough that it may not be on the list yet. There is a complete list on Nvidia's site, where only 15 games appear - those are the only titles that get hardware PhysX support (with the exception of some brand-new games that aren't listed yet, not that I'm aware of any).


BC2 and Borderlands definitely do not. Metro 2033 does, though.

Here is a vid comparison:
http://www.youtube.com/watch?v=Bt8DEEEMTHw#movie_player
Nothing too dramatic, though... in fact you can't see a difference at all except for the 'pebbles'.

Still, the best implementation of hardware PhysX is Batman AA for me.


----------



## Mussels (Mar 24, 2010)

Bjorn_Of_Iceland said:


> BC2 and Borderlands definitely do not. Metro 2033 does, though.
> 
> Here is a vid comparison:
> http://www.youtube.com/watch?v=Bt8DEEEMTHw#movie_player
> ...



And as always, the PhysX in Batman AA has very little to do with physics itself - it's just eye candy, and most of it would work without PhysX; they literally just turn 'debris' off if you don't have PhysX.


----------



## phanbuey (Mar 24, 2010)

Bjorn_Of_Iceland said:


> BC2 and Borderlands definitely do not. Metro 2033 does, though.
> 
> Here is a vid comparison:
> http://www.youtube.com/watch?v=Bt8DEEEMTHw#movie_player
> ...



Yeah, me too - I really liked it in Batman... I think a unified DirectCompute (what Mussels mentioned) will make it worthwhile for game devs to actually use the technology, and more quality games will actually use it.

Right now it's not worth putting in the time and effort to render 'extra pebbles' while having to use the PhysX technology, adding another layer of complexity where more can go wrong. Extra pebbles and flowing cloth are nice and all...


----------



## Mussels (Mar 24, 2010)

Looking at that YouTube link, all I can see is this:

PhysX off seems to be deliberately made to look worse. There's a weird bubble-blur effect to simulate the 'gas' coming out of the end of the gun at one minute in, whereas the PhysX-on version gets a 'cloud'.

Now tell me, why on earth do we need super-accurate clouds of gas? The only other difference in the shot is that a few small bits of debris collect on the floor - and if they're anything like GRAW 2 and Mirror's Edge, they likely fade away in 20 seconds anyway.

1:30: grenades. All it does is add random debris to the explosions and a bigger, longer-lasting cloud of smoke (I'm seeing a trend here...).


If those were the differences between, say, DX9 and DX11 in a game, you can bet your ass people would be screaming about how useless it is. Since it's an Nvidia tech, people don't seem to care? That's the part that confuses me. Debris is hardly a feature I'd pay for in 16 games (the 15 on the list + Metro).


----------



## phanbuey (Mar 24, 2010)

Mussels said:


> Looking at that YouTube link, all I can see is this:
> 
> PhysX off seems to be deliberately made to look worse. There's a weird bubble-blur effect to simulate the 'gas' coming out of the end of the gun at one minute in, whereas the PhysX-on version gets a 'cloud'.
> 
> ...




The idea is that it's a different way of rendering debris... and yeah, people will pay for the smallest, most minute improvements in gaming quality - Nvidia tech or not. And for games like GTA, more debris would be awesome.

I mean, what the hell was DX10? To me the difference was fancy water and better ambient lighting. And look how many people bought DX10 cards way back when just because they were DX10. This is no different.


----------



## crazyeyesreaper (Mar 24, 2010)

I've got to agree with Mussels here: anything done by PhysX on the GPU could be done on the CPU if game engines were better multithreaded, and at this time that would give better frame rates. Think about it for a minute - how many users are going to buy two GTX 280s plus a 9600GT to run Cryostasis, for example?

GTX 280 triple SLI at 1600x1200 can only muster 56 fps in that game, because it makes such heavy use of PhysX, and the comparable ATI card gets 11 fps.

Now, if a game at medium-to-high settings using PhysX needs three GTX 280s just to run, tell me: are you going to spend that kind of cash to make it playable?

http://www.legitreviews.com/article/855/1/

http://www.firingsquad.com/hardware/cryostasis_techdemo_performance/page4.asp

And the fact is that when run in CPU mode, while it does use up to 8 threads, PhysX never uses the CPU enough to actually stress the damn thing.

Nvidia PhysX is a nice gimmick, nothing more. It can be handled on most modern-day CPUs without issue; it's a great step forward, but one that's hardly close to being usable at this time.

Cryostasis is the perfect example: to get 60 fps you're going to need two GTX 280s with a 9600GT at 1600x1200, just because of how PhysX is implemented. For hardware junkies like us that might not be a big deal, BUT it's suicide for a company to rely on PhysX alone, and only on GPUs, until it's an open standard adopted by all. And that's fact.

I'm not bashing PhysX here - I myself tried once upon a time to get it to work (4870X2 + Ageia PhysX PPU, later an 8800GTS 640MB) but found it not worth the hassle for a few interesting add-ons that didn't improve my gameplay at all. And isn't that what the key point SHOULD be - to improve physics in a game, to better simulate realism or put better emphasis on the gameplay?

Most games on that list posted above use PhysX, but guess what: a giant ****ing buttload of PS3 and Xbox 360 games use PhysX too, and guess what form it is? That's right - SOFTWARE. Whatever is used on consoles is what you will get on PC, so until consoles can do physics calculations on their GPUs, you won't see any usable hardware form on PC for a long while. PhysX is one physics engine of many:

Bullet Physics, Havok, PhysX, and more - and the first two can accomplish what PhysX can, and do it on the CPU... So again, what's the point of proprietary tech that has no real viable use today, tomorrow, or six months to two years from now?


----------



## brandonwh64 (Mar 24, 2010)

So I see the 9600GT as a PhysX card gave it 10.4 more FPS than the standard 3-way SLI.



> *Legit Bottom Line: A dedicated NVIDIA PhysX card improves performance across the board and a pair of GeForce GTX 280 video cards in SLI with a dedicated PhysX card performs better than a set of GeForce GTX 280 in 3-way SLI!*


----------



## Mussels (Mar 24, 2010)

brandonwh64 said:


> So I see the 9600GT as a PhysX card gave it 10.4 more FPS than the standard 3-way SLI.






brandonwh64 said:


> standard 3-way SLI




.... I never thought I'd see '3-way SLI' and 'standard' in the same sentence.


----------



## brandonwh64 (Mar 24, 2010)

Mussels said:


> .... I never thought I'd see '3-way SLI' and 'standard' in the same sentence.



What I meant by that was the game using the PhysX put out by the 3-way SLI cards, instead of having the 9600GT in there JUST for PhysX.


----------



## crazyeyesreaper (Mar 24, 2010)

Correct. At the time of that article, $1200 in GPUs and an i7 couldn't hit the 60 fps mark; it took SLI with a dedicated PhysX GPU to get the game playable. And we're talking enclosed spaces, not large open areas - so imagine applying PhysX to Crysis. I bet on the GPU you'd be lucky to see 20 fps with that much power, due to the open areas and long draw distances.


----------



## Mussels (Mar 24, 2010)

brandonwh64 said:


> What I meant by that was the game using the PhysX put out by the 3-way SLI cards, instead of having the 9600GT in there JUST for PhysX.



You miss my point: any situation where three GPUs is 'normal' is f**ked up.


----------



## crazyeyesreaper (Mar 24, 2010)

Yup. Not to mention, look at the ATI and Nvidia CPU PhysX results... roughly 11 fps across the board. That's unacceptable.


----------



## brandonwh64 (Mar 24, 2010)

What about a 5850 with a 9600GT as PhysX? I wish I could test that, but for some reason my copy of Cryostasis wouldn't work - it would start, but I couldn't move around with the keyboard, though I could use the mouse to look around.


----------



## phanbuey (Mar 24, 2010)

Mussels said:


> .... I never thought I'd see '3-way SLI' and 'standard' in the same sentence.



It's the little things that remind me I'm on TPU.


----------



## crazyeyesreaper (Mar 24, 2010)

http://www.firingsquad.com/news/newsarticle.asp?searchid=21623

Side-by-side PhysX on and off. Lol, they didn't even TRY on the PhysX-off version - another example where money talks, bullshit walks, and developers take the easy cash.

Again, my major point here is that the technology isn't ready, and it won't be for a long time. I really don't see why companies would willingly screw themselves by using GPU-accelerated physics when, at this time, it's Nvidia-only - telling the roughly 30% of gamers using ATI cards to get F'ed in the A isn't going to make you money.


----------



## brandonwh64 (Mar 24, 2010)

ATI should try to get Nvidia to let them make a driver that would allow an Nvidia card ONLY as PhysX, instead of using the stupid hacked driver.


----------



## crazyeyesreaper (Mar 24, 2010)

why?

OpenCL, DirectCompute, and DX11 basically tell Nvidia its proprietary standard is obsolete. Why should ATI/AMD pay for the right to use Nvidia's tech, when Nvidia can't even afford to make its own chipsets these days and has to stop manufacturing GPUs to keep afloat cash-wise?

With the two new standards above, there's no reason for ATI to work with Nvidia. ATI is already working with Havok and Bullet Physics on GPU acceleration, so again, no real point: two of the three major physics engines will run on ATI, and when they do, physics on the GPU MIGHT be viable; until then there's no real point. It would be nice for us consumers, yes, but again it's all about money, and that's all it's ever going to be about. The consumer will always take a back seat to profit.

That's why you won't see PhysX on an ATI GPU, or the two companies cooperating on that subject.


----------



## brandonwh64 (Mar 24, 2010)

Why doesn't ATI make their own PhysX design that would work with the current Nvidia PhysX?


----------



## Bjorn_Of_Iceland (Mar 24, 2010)

brandonwh64 said:


> ATI should try to get Nvidia to let them make a driver that would allow an Nvidia card ONLY as PhysX, instead of using the stupid hacked driver.


Well, they paid a lot of cash acquiring Ageia; I doubt they'll make it non-proprietary / open source any time soon...


----------



## crazyeyesreaper (Mar 24, 2010)

That's just it: the alternatives are already established. ATI IS working with Bullet Physics, plus there's Havok, and with OpenCL and DirectCompute both can, and eventually will, be made GPU-accelerated.

So why should ATI drop millions to make something that already exists, from partners they already have good business relations with? As I've said, physics today, tomorrow, or six months from now won't be making much headway. I'd say two years from now, when developers catch up to new hardware and the next set of consoles makes its debut, is when it will matter. Right now GPU vs CPU physics is a useless argument. Ooooh, wavy flags and extra particles - that's the best they've added in five years. That's right: in five years the SAME thing has been added, and not enough to even give a flying F... Example: Ageia Island in GRAW. Go compare that to what PhysX offers in other games - nothing has changed at all, period. Or Sacred 2: ooooh, floating leaves that swirl, YIPPEE. All junk filler.

Cryostasis does show what's possible, but it's not doable on today's hardware for the masses, and because of that it doesn't matter. We make up 1% of the community as a whole, and they don't care about the 1%; they care about the 99% who have much lower-end systems and no idea what any of this stuff means.

I had one person try to tell me that Metro 2033 only runs on Nvidia GPUs - that's how uninformed the public is. You want GPU-accelerated physics, you want it open and used in more games? Well then, you need to single-handedly teach the sheeple of the world what the jargon stands for.

In the computer world people are like sheep: they see one person lead the way and they all dumbly follow, unable to make an educated decision for themselves. Change some words, throw in some fancy colors, and bingo - the sheeple buy it and think they've got the next great thing, when in reality it's the same polished turd from the year before, and the year before that.


----------



## newtekie1 (Mar 24, 2010)

I kind of skimmed over the thread, only really reading the original post, so forgive me if I bring up something that has already been discussed.

The problem with any hardware acceleration based physics, is that for it to be really implemented for anything more than some nice eye candy, it has to run and work on any hardware setup.  If it doesn't work universally on hardware that gamers already have, then game developers can't really integrate it into the gameplay for anything more than eye candy.

Ageia of course had the idea that if they could get everyone to buy their cards, it would work.  Well we saw how that worked out, no one bought the cards, and no games used it.

Then nVidia bought Ageia, and some really cool and promising stuff started to happen.  First, they adapted PhysX to run on their graphics cards in an amazingly short amount of time, something like 3 months I believe.  But the really promising thing started when a "hacker" managed to get PhysX working on ATi hardware...and nVidia actually went along with it and was willing to support him.  This actually makes a lot of sense for nVidia.  The problem was that ATi completely shut the door on it, refusing to give any help, and actually threatening lawsuits if the "hacker" continued to modify their drivers.  Many people argue that nVidia would have likely crippled PhysX on ATi hardware, but that isn't really possible.  PhysX doesn't take a whole lot of power to run, so if they did that it would have been pretty obvious.  However, the bigger point is that nVidia knew that they needed to get PhysX running on both ATi and nVidia cards if it was going to be successful.  I was really hoping Mussels would touch on this more, especially since it is mentioned at the beginning of the post.  PhysX can run on ATi hardware, however ATi prevents it, and nVidia has taken it one step further and locked out running PhysX at all when an ATi card is being used to render the graphics.  One dick move leads to another...

Now, as for games that use PhysX, I can say that the only game that really benefits from PhysX is Batman.  Yes, the game is still entirely playable without PhysX, and the reason behind that I've already covered.  However, the PhysX really helps the game's atmosphere.  When papers fall off desks, they really are papers, instead of just large blocks that look like newspaper bundles.  Of course I would have liked to see, particularly in action games, some level destruction.  Breaking through walls, taking whole buildings to the ground.  Imagine COD:MW, but with buildings that deteriorate throughout the battle, or even collapse once they can no longer support themselves.  That is possible with PhysX, however it will never be implemented.  Why?  Because for that to work in game, everyone playing would have to be able to support it happening, and without ATi support, game developers aren't going to create a game that only works on nVidia hardware.


----------



## btarunr (Mar 24, 2010)

The reason I'll agree is that Battlefield: Bad Company 2 does a great job with destructible environments and general physics without any PhysX or Havok mumbo-jumbo - so it's not like game developers need a specific physics API for waving flags, Batman's cape, or even breaking walls.


----------



## human_error (Mar 24, 2010)

Thanks for posting this information up, Mussels - too many people see PhysX installed with a new game and immediately assume they can use hardware acceleration with it, when it's only enabled for software calculations.


----------



## crazyeyesreaper (Mar 24, 2010)

again sheeple


----------



## kid41212003 (Mar 24, 2010)

BFBC2 uses fixed, pre-programmed physics, meaning it looks the same every time you break something; the easiest place to see this is the destruction of the buildings.

I say the TS is biased. Voicing his own opinion in a supposedly "informative post" is not exactly informing people. Physics adds realism to games, eye candy or not.

Most dual-core users have slowdown problems during destruction, even though their video cards are high-end, while I'm playing the game on an 8800GT and don't get slowdowns. Why? Because the physics used in the game depends heavily on the CPU, and I have two extra cores compared to those people.

While it doesn't benefit AMD because it's Nvidia's physics engine, the fact that it supports hardware acceleration that runs on Nvidia's cards doesn't mean it sucks.


----------



## air_ii (Mar 24, 2010)

Mussels said:


> Good work on actually reading the first post, where I point out that it won't work >.<
> 
> BC2 and Borderlands don't support hardware PhysX. I'm not sure about Metro - I haven't played it, and it's new enough that it may not be on the list yet. There is a complete list on Nvidia's site, where only 15 games appear - those are the only titles that get hardware PhysX support (with the exception of some brand-new games that aren't listed yet, not that I'm aware of any).



Some early benchmarks show no drop in fps (literally zero) after enabling advanced physics in Metro 2033 on any card...


----------



## Steevo (Mar 24, 2010)

I like the idea, but think the implementation is crap. I never bought Batman for the farce that it is, infected with TWIMTBP coding.


----------



## Grings (Mar 24, 2010)

DirectCompute physics is a nice idea, but I still don't think dedicated physics hardware is necessary at all.

I'd like to see it all done on the CPU. Hardly any games use more than two CPU cores effectively, due to the difficulty of multithreading, yet physics is so much easier to run separately from the other game threads that it can be done on a piss-weak card on a PCI bus (the original Ageia card, for example).
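That decoupling is the key point: because physics only has to hand results back each tick, it can run on its own thread at a fixed rate while the rest of the game does its thing. A minimal, hypothetical Python sketch (toy code, not from any engine; the state layout and tick rate are made up):

```python
# Sketch: physics ticking at a fixed 120 Hz on its own thread,
# independent of the 'render loop' on the main thread.
import threading
import time

state = {"y": 50.0, "vy": 0.0, "ticks": 0}
lock = threading.Lock()
running = threading.Event()
running.set()

def physics_thread(dt=1.0 / 120.0):
    """Fixed-timestep integrator running independently of the game loop."""
    while running.is_set():
        with lock:
            state["vy"] -= 9.81 * dt   # gravity
            state["y"] += state["vy"] * dt
            state["ticks"] += 1
        time.sleep(dt)                 # wait for the next tick

t = threading.Thread(target=physics_thread, daemon=True)
t.start()
time.sleep(0.1)   # the main/render loop would do its own work here
running.clear()
t.join()
```

The render side only needs to grab the latest state under the lock, which is why even a weak dedicated device (or one spare core) was enough for Ageia's card to keep up.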


----------



## trickson (Mar 24, 2010)

So what is this stuff for, then? I mean, I have PhysX installed and enabled, but I really see no point if games and programs aren't using it. Sounds like a one-trick pony to me. What's the point of it anyway? Other than getting more points in 3DMark06 and some other benchmarks, it's worthless, right?


----------



## niko084 (Mar 24, 2010)

Good to know. I have played "supported" games but was never impressed.


----------



## suraswami (Mar 24, 2010)

I am using a dedicated PhysX PCI card for this purpose and have played Batman AA and now MOH: Airborne (which supports PhysX). I probably saw a 2 to 5 FPS gain, but didn't notice much of a difference in image quality - maybe because MOH is a really old game.

One thing that drove me nuts is the buggy drivers: Sleep would not work properly on W7. I didn't realise this until I updated the drivers to the latest. Stupid me even went through changing mobos. But I accidentally found it was all caused by the one-version-old drivers from the NV website.

Apart from consuming more watts and making more noise, it's useless to me.

But one day it might be put to good use in a really good game, so it stays, he he.


----------



## Deleted member 24505 (Mar 24, 2010)

As far as I'm concerned, PhysX was a fail when Ageia brought it out. I bet the Ageia company breathed a sigh of relief when Nvidia bought them out before it went downhill.


----------



## qubit (Mar 24, 2010)

suraswami said:


> One thing that drove me nuts is the buggy drivers: Sleep would not work properly on W7. I didn't realise this until I updated the drivers to the latest.



My Windows 7 wouldn't return from sleep properly either, screwing up the network until I rebooted. Made sleep mode useless. However, thanks to Nick89, that problem is solved. 

Let us know if it works for you.


----------



## suraswami (Mar 25, 2010)

qubit said:


> My Windows 7 wouldn't return from sleep properly either, screwing up the network until I rebooted. Made sleep mode useless. However, thanks to Nick89, that problem is solved.
> 
> Let us know if it works for you.



I don't remember the network settings, but it started to happen randomly. If I remember right, W7 on that machine worked flawlessly for about 5 months; then I installed the Batman AA game and PhysX drivers, and the sleep issues started. It wasn't constant, just random - sometimes the PC would work for 15 days without a reboot; sometimes it was like a cycle, every third day it wouldn't wake up. After much headache I gave up, decided maybe it was the board, and changed to an ECS 780G board - all the issues went away. I think I had also uninstalled all the crap I don't use and was left with only what I need (PhysX went on the don't-need list), but I didn't have time to test that. So I was about to RMA the Biostar board, but wanted to test it completely, simulated the whole thing, and found this bastard.


----------



## qubit (Mar 25, 2010)

So you had a different problem. I guess that won't apply then, lol.

If the PC wouldn't wake up, then it was likely to be a BIOS issue.* Updating to the latest *may* fix it, but I see you've nailed it anyway by changing the board.

*By that I don't mean that the BIOS was necessarily at fault. It could just be some weird interaction between them, perhaps caused by a Windows 7 bug - who knows.


----------



## newtekie1 (Mar 25, 2010)

suraswami said:


> Except consuming more watts and making more noise its useless to me.



I agree with this so much that I actually removed the 9600GT I was using for PhysX in my main rig, it now sits on my desk rather than running in the computer doing nothing...


----------



## Zubasa (Mar 25, 2010)

newtekie1 said:


> I agree with this so much that I actually removed the 9600GT I was using for PhysX in my main rig, it now sits on my desk rather than running in the computer doing nothing...


I guess that 9600GT is still good for folding?


----------



## suraswami (Mar 25, 2010)

qubit said:


> So you had a different problem. I guess that won't apply then, lol.
> 
> If the PC wouldn't wake up, then it was likely to be a BIOS issue.* Updating to the latest *may* fix it, but I see you've nailed it anyway by changing the board.
> 
> *By that I don't mean that the BIOS was necessarily at fault. Could just be some wierd interaction between them, perhaps caused by a Windows 7 bug, who knows.



I thought so, so I went back and forth between the latest BIOS and the version before. The version before gave more BSODs and hang-ups than the latest. But now, with the latest, it has never crashed for this reason (with updated PhysX), and I've OCed the crap out of the board - it's doing 272 x 14 with my X2 240, though not fully stable. Earlier it would only do 250 with an X2 5600 or X3 720BE. Don't know why; maybe the 240 is just a good clocker.


----------



## Mussels (Mar 25, 2010)

brandonwh64 said:


> why dont ATI make there own physx design that would work with the current nvidia physx?



PhysX is a closed, proprietary system. ATI just can't use it without paying tons of money... and who would, with how few games support it?

ATI has their own engine (Bullet Physics, or something - it was in the news) which will work on both ATI and Nvidia, meaning game devs will actually use it for something other than useless debris.



As for the comments on it working on ATI, remember two things. One: if ATI supported it, every game with it would have Nvidia branding. Two: Nvidia would NOT have allowed it for free - ATI would have had to pay for every video card with PhysX support, as well as every box with the logo on it.


----------



## DannibusX (Mar 25, 2010)

Mussels said:


> *as for the comments on it working on ATI: Remember two things. One, if ATI supported it, every game with it would have Nvidia branding. Two, Nvidia would NOT have allowed it for free - ATI would have had to pay for every video card that had the physX support, as well as every box that had the logo on it.*



Not to mention that ATI would not be paying for the licensing, we would.

And the blasphemy of ATI being splattered with nVidia logos... but then, you already said that.


----------



## btarunr (Mar 25, 2010)

kid41212003 said:


> BFBC2 use a fixed, pre-programed physics, it's mean it looks the same everytime you broke something, the easiest thing you can see is the destruction of the buildings.



So what? It does the job. Besides, it's not only the destruction of buildings but also walls and barricades. Players don't have the time to watch a building collapse 'accurately' to see their foes squished. The realism of BC2's physics is easily passable. You don't need PhysX to do that job. At least DICE's in-house physics work is better than PhysX's implementation in many games.


----------



## Mussels (Mar 25, 2010)

and it also doesn't matter, because unlike a scripted SP game, you're never looking from the same angle or from the same approach, and you don't have TIME to stop and stare: you're shitting yourself trying to escape the collapsing building, or in a fight with someone.


----------



## RejZoR (Mar 25, 2010)

But wouldn't it be a nicer gaming experience if all graphics cards could run such physics engines, so all games could incorporate them and all users could benefit? Gaming could evolve from the current, almost entirely static worlds into fully dynamic, destructible worlds that feel like the real thing.
Nothing beats the feeling of discharging your machine gun while the entire world in front of it collapses: wall holes, destructible furniture, ragdolls, objects that do damage by themselves.
There were a few games that did that, like Red Faction, Max Payne and Half-Life 2, but that's a very limited number of games that only partially deliver what we could enjoy in all games as a standard. And once you have an open standard and studios implementing it, all the others will follow for sure. I don't think all gamers know how to appreciate such a leap in world-interaction realism.
But i hope they will, before i turn 80...


----------



## shevanel (Mar 25, 2010)

after playing batman aa on a gtx 275 and bad company 2 on a 5870..

LOL.. physx sucks and is only beneficial in 3DMark Vantage


----------



## newtekie1 (Mar 25, 2010)

Mussels said:


> PhysX is a closed, proprietary system. ATI just cant use it without paying tons of money... and who would, with how few games support it?
> 
> ATI have their own engine (Bullet Physics or something, it was in the news) which will work on ATI and nvidia - meaning that game devs will actually use it for something other than useless debris.
> 
> ...



Every game having nVidia branding wouldn't matter, they already have nVidia branding anyway. And nVidia was not only allowing it for free, they were actually paying to develop it for ATi...


----------



## Mussels (Mar 25, 2010)

newtekie1 said:


> Every game having nVidia branding wouldn't matter, they already have nVidia branding anyway. And nVidia was not only allowing it for free, they were actually paying to develope it for ATi...



Yeah, develop the software side of things for them, and then charge them fees later.

Not once was it stated nvidia would do it for free.


----------



## burebista (Mar 25, 2010)

Mussels said:


> I'm not sure on metro - i've not played it[...]


It's hardware. Nice smoke and debris from grenades. At least, that's what I've heard. 
But (if you already have an ATI card) you can now, with modified drivers, buy a cheap nVidia card and put it to work on PhysX. Of course, that's only if you really want PhysX.


----------



## qubit (Mar 25, 2010)

Mussels said:


> looking at that youtube link, all i can help but see is this:
> 
> PhysX off seems to deliberately look worse. Theres a weird bubble-blur effect to simulate the 'gas' coming out the end of the gun at 1 minute in, whereas the physX on version gets a 'cloud'



Yup, you don't need PhysX for the cloud. UT2004 did this way back with good performance using volumetric fog, which looked very slick indeed and didn't nail performance to the wall.


----------



## bobzilla2009 (Mar 25, 2010)

Mussels said:


> Yeah, develop the software side of things for them, and then charge them fees later.
> 
> Not once was it stated nvidia would do it for free.



indeed, nvidia would not do it for free, and if they did, you can bet everything you own they'd hamper ATI performance in PhysX (ATI cards, apart from this generation with Fermi, have always had far more calculation grunt). Nvidia are just like that; they are the Intel of the GPU world. Sly gits, but at least Intel delivers.


----------



## qubit (Mar 25, 2010)

I thought nvidia had a free PhysX licence for everyone? If so, it doesn't really matter if it becomes the de facto standard.


----------



## Mussels (Mar 25, 2010)

qubit said:


> I thought nvidia had a free PhysX licence for everyone? If so, it doesn't really matter if it becomes the de facto standard.



free for game devs to use in their games. not free for video card makers to hardware-accelerate, and stick their logos in places.


Even the free one for games has requirements: has to be part of TWIMTBP, has to have nvidia branding, has to be given to nvidia for adjustments (read: Batman AA having no AA on ATI)


----------



## qubit (Mar 25, 2010)

Mussels said:


> free for game devs to use in their games. not free for video card makers to hardware accelerate, and stick their logos in places.
> 
> 
> Even the free one for games has requirements, has to be part of TWIMTBP, has to have nvidia branding, has to be given to nvidia for adjustments (read: batman AA having no AA on ATI)



Sheesh, no wonder AMD wouldn't touch it. How "generous" of nvidia to offer it to them.


----------



## Mussels (Mar 25, 2010)

qubit said:


> Sheesh, no wonder AMD wouldn't touch it. How "generous" of nvidia to offer it to them.



exactly. dirty marketing at its best.


----------



## pantherx12 (Mar 25, 2010)

I've always said PhysX is a gimmick.

As people have stated, CPU physics easily keeps up with PhysX.

Play Just Cause 2 to see no difference between GPU- and CPU-based physics.


----------



## newtekie1 (Mar 25, 2010)

Mussels said:


> Yeah, develop the software side of things for them, and then charge them fees later.
> 
> Not once was it stated nvidia would do it for free.





			
nvidia spokesman said:


> [PhysX] is available for use by any developer for inclusion in games for any platform - all free of license fees. There's nothing restrictive or proprietary about that.



Sorry, but PhysX is free to develop for any platform.  It is used in all three current consoles, without charge.  They know this is the only way it would be used; it had to run on ATi hardware to be successful, otherwise developers wouldn't use it.  ATi of course knew that also, which is why they didn't allow it.


----------



## lukesky (Mar 25, 2010)

You guys should really check out http://www.gamephys.com/category/game-physics/ for some real physics and physx action.

Another site that has the updated list of physx and hardware physx games http://physxinfo.com/


----------



## bobzilla2009 (Mar 25, 2010)

newtekie1 said:


> Sorry, but PhsyX is free to develope for any platform.  It is used in all three current consoles, without charge.  They know this is the only way it would be used, it had to run on ATi hardware to be sucessful, otherwise developers wouldn't use it.  ATi of course knew that also, which is why they didn't allow it.



PhysX run on a CPU is free to develop for. The GPU version is locked to Nvidia. Also, Nvidia would gimp ATI performance to gain an unfair lead. AMD would rather wait for an open standard to arrive, as should all gamers. No one company should have direct power over another's ability to perform well (I'm looking at you, The Way It's Meant To Be Paid Off).


----------



## newtekie1 (Mar 25, 2010)

bobzilla2009 said:


> physX ran on a cpu is free to develop. The gpu version is locked to nvidia. Also, nvidia would gimp ati performance to gain an unfair lead in performance. AMD would rather wait for an open standard to arrive, as should all gamers. No one company should have any direct power over another ability to perform well (i'm looking at you The way it's meant to be paid off).



It is free for GPU or CPU.  And I already addressed the issue of nVidia "gimping" performance; it simply isn't possible.

And TWIMTBP doesn't hinder ATi performance, it just improves nVidia's performance.


----------



## erocker (Mar 25, 2010)

newtekie1 said:


> It is free for GPU or CPU.  And I already addressed the issue of nVidia "gimping" performance, it simply isn't possible.
> 
> And TWIMTBP doesn't hinder ATi performance, it just improves nVidia's performance.



I still haven't seen PhysX being run on any GPU other than Nvidia or an Ageia PPU. I haven't seen any "hacks" to run PhysX on any other GPU. Are you sure it is fact that PhysX is open to run on any GPU? Is it purely software to make it work, or does there need to be something architectural in place for it to work? Why hasn't anyone from the "modding" community gotten PhysX to work on a non-Nvidia GPU? Does PhysX need to be run on a GPU? From what I've seen, PhysX isn't coded well enough to use a CPU properly, but it does a great job on Nvidia GPUs.. sometimes. With the way things are currently, it certainly seems as if PhysX isn't open at all.


----------



## Grings (Mar 25, 2010)

erocker said:


> I still haven't seen PhysX being run on any other GPU than Nvidia or an Ageia PPU. I haven't seen any "hacks" to run PhysX on any other GPU. Are you sure this is fact that PhysX is open to run on any GPU? Is it purely software to make it work or does there need to be some archetectural things in place for it to work? Why hasn't anyone from the "modding" community gotten PhysX to work on a non-Nvidia GPU? Does PhysX need to be run on a GPU? From what I've seen PhysX isn't coded well enough to use a CPU properly, but it does a great job on Nvidia GPU's.. sometimes. With the way things are currently, it certainly seems as if PhysX isn't open at all.



I've been wondering about the software side too. Back when Ageia were making it, games used to install the Ageia drivers, and those got updated by newer games also using PhysX. Since it's been Nvidia, however, I haven't seen anything add a PhysX driver on my ATI rig (and wouldn't on my Nvidia one anyway; it's inside ForceWare)


----------



## newtekie1 (Mar 25, 2010)

erocker said:


> I still haven't seen PhysX being run on any other GPU than Nvidia or an Ageia PPU. I haven't seen any "hacks" to run PhysX on any other GPU. Are you sure this is fact that PhysX is open to run on any GPU? Is it purely software to make it work or does there need to be some archetectural things in place for it to work? Why hasn't anyone from the "modding" community gotten PhysX to work on a non-Nvidia GPU? Does PhysX need to be run on a GPU? From what I've seen PhysX isn't coded well enough to use a CPU properly, but it does a great job on Nvidia GPU's.. sometimes. With the way things are currently, it certainly seems as if PhysX isn't open at all.



It was running wonderfully on the HD3800 series.  The ATi drivers needed to be modified to get it running; apparently the modifications were rather minor and easy.  However, the biggest hurdle was that the drivers would only work in Test Mode; ATi had to officially add support for CUDA to allow it to work in normal mode in Windows.  The hack was posted on ngohq.  Then nVidia offered the hacker, Eran Badit, editor-in-chief of ngohq.com, a spot on their development team.  The issue then came down not to adding support for ATi into the PhysX engine, but to adding support for CUDA into ATi's drivers.  That is where it ended: ATi never allowed CUDA support in the drivers, and Eran gave up trying, it seems.  ATi had to do very little to get it working, and there wouldn't really be any shady stuff, since development was actually being done by a 3rd party, someone who originally thought he was defying nVidia when he made his hack...

If you really want to learn who was responsible for ATi not having PhysX, this is an interesting article; it certainly wasn't nVidia's fault that PhysX doesn't run on ATi hardware. nVidia tried...


----------



## erocker (Mar 25, 2010)

newtekie1 said:


> It was running wonderfully on the HD3800 series.  The ATi drivers needed to be modified to get it running, apparently they modifications were rather minor and easy.  However, the biggest hurdle was that the drivers would only work in Test Mode, ATi had to officially add support for CUDA to allow it to work in normal mode in Windows.  The hack was posted on ngohq.  Then nVidia offered the hacker, Eran Badit editor-in-chief of ngohq.com, a spot on their developement team.  The issue then came down not to adding support for ATi into the PhysX engine, but adding support for CUDA into ATi's drivers.  That is where it ended, ATi never allowed CUDA support for the drivers, and Erin gave up trying it seems.  ATi had to do very little to get it working, and there wouldn't really be any shady stuff since developement was actually being done by a 3rd party, someone how originally though he was defying nVidia when he made his hack...
> 
> If you really want to lean who was responsible for ATi not having PhysX, this is an interesting article, it certainly wasn't nVidia's fault that PhysX doesn't run on ATi hardware, nVidia tried...



Now I remember the NGOHQ thing. I mean really, if CUDA is free and takes minimal effort to implement, why not make CUDA the open solution, since it works? PhysX aside, CUDA is more mature than ATi Stream, which is still in beta and still has very limited support. It seems to be a pride issue; thing is, if CUDA was being used as an open standard, nobody would even care who came up with it. ATi also may have contractual obligations to use things such as Havok and whatever else there is, but that isn't anything we could or would know about.


----------



## crazyeyesreaper (Mar 25, 2010)

lukesky said:


> You guys should really check out http://www.gamephys.com/category/game-physics/ for some real physics and physx action.
> 
> Another site that has the updated list of physx and hardware physx games http://physxinfo.com/



you fail to understand the difference between CPU PhysX and GPU PhysX.

19 games support the PPU; 26 in total support the GPU, including games still in development.

Total games that use PhysX: 247.

So out of 247 games, 19 support the PPU and 26 support the GPU.

Of the PPU games, 3 don't even NEED it to run; all those physics calculations (cloth tearing etc. in CellFactor) were easily done on a CPU and were just as fast.

Of the 26 GPU titles, only 15 are released, and of those 15, only 1 makes heavy use of it: Cryostasis. And since 3 GTX 280s can't even maintain 60 fps at 1600x1200, that means epic fail.

Batman AA is a perfect example of the ignorance: Nvidia users get AA in game plus a few extra particles and some extra effects, while ATi users get no AA in game. At the end of the day you get extra gimmicks and 30% of your target audience gets shafted.

Software PhysX is what's used 90% of the time. Out of the 26 GPU titles for PC (including those in development), only 15 are released, and only 1 must have PhysX to be playable properly.

So tell me: how many PC gamers are going to buy an Nvidia GPU, then buy another just for PhysX, when only 26 games over the course of almost 5 years support it?

Wake up and smell the green-tinted roses. PhysX is a nice gimmick, nothing more. I fell for it once but never again (I had an Ageia PPU the first day of release).

What I want to see is how tessellation + PhysX work together. Nvidia's tessellation uses the shaders, so as those get eaten up by tessellation, with more used for texture rendering etc., how much more of a frame rate hit are we going to see from PhysX on a GPU? People need to stop sniffing the green glue and look around. I have yet to see a KILLER title that makes me want to waste money on GPU PhysX; most of the games seem to be shooters, and I'm too damn busy trying NOT to die to worry about a few particles I'll only see for a split second. Wake me up when PhysX matters and isn't just marketing bullcrap.
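For what it's worth, the adoption shares implied by the counts in that post can be sanity-checked with a quick sketch (the figures are the poster's circa-2010 numbers, not verified data):

```python
# Rough sanity check of the PhysX adoption figures quoted above.
# All counts are the poster's claims (circa March 2010), not a verified database.
total_physx_games = 247   # titles using the PhysX SDK at all (mostly CPU/software mode)
ppu_games = 19            # titles supporting the old Ageia PPU
gpu_games_total = 26      # titles with GPU acceleration, including unreleased ones
gpu_games_released = 15   # GPU-accelerated titles actually shipped

ppu_share = ppu_games / total_physx_games * 100
gpu_share = gpu_games_total / total_physx_games * 100
released_share = gpu_games_released / total_physx_games * 100

print(f"PPU-accelerated:        {ppu_share:.1f}% of all PhysX titles")
print(f"GPU-accelerated:        {gpu_share:.1f}%")
print(f"GPU-accel and released: {released_share:.1f}%")
```

So even by these numbers, roughly 9 in 10 PhysX titles run it purely in software, which is the point the post is making.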


----------



## newtekie1 (Mar 25, 2010)

erocker said:


> Now I remember the NGHQ thing. I mean really, if CUDA is free and takes minimal effort to implement, why not make CUDA the open solution since it works! PhysX aside, CUDA is more mature than ATi Stream which is still in beta and still has very limited support. It seems to be a pride issue, thing is if CUDA was being used as an open standard nobody would even care who came up with it. ATi also may have contractual obligations to use things such as Havok and whatever else there is, but that isn't anything we could or would know about.



It is all politics my friend, all politics...:shadedshu


----------



## air_ii (Apr 11, 2010)

newtekie1 said:


> It is all politics my friend, all politics...:shadedshu



And business. I believe the folks at AMD are reluctant to support a proprietary standard, as it's a risk from a business perspective. If CUDA gained a dominant position, nVidia could suddenly change their minds and start charging AMD (or any other company) licence fees. AMD would be with their backs against the wall, as they would HAVE to support it to stay in the game.

EDIT: And it would be developed only by nVidia, which means it would never be optimised for another architecture. They would be in a position to do almost anything performance-wise.

So it's not that you can say they tried; the PhysX + CUDA deal combo is something anyone sane would reject...


----------



## eidairaman1 (Apr 11, 2010)

MS probably doesnt like what NV is doing either.


----------



## $immond$ (Apr 11, 2010)

All in all I find physx more useful than eyefinity.


----------



## newtekie1 (Apr 11, 2010)

air_ii said:


> And business. I believe that folks at AMD are reluctant to support a proprietary standard, as it's a risk from business perspective. If CUDA gained dominant position, nVidia could suddenly change their minds and start charging AMD (or any other company) licence fees. AMD would be with their backs against the wall, as they would HAVE to support it to stay in the game.
> 
> EDIT: And it would be developed only by nVidia, which means it would never be optimised for another architecture. They would be in position to do almost anything performance wise.
> 
> So it's not that you can say they tried, as it's something that anyone sane would reject (the PhysX - CUDA deal combo)...



Legally, at least in the US and likely the EU, nVidia could not do that.  If they agree that it is free to develop for in the beginning, they can't legally "change their mind" once it is popular and in use by other companies.  The trade courts would rip nVidia a new asshole bigger than the one they ripped Microsoft.

As for performance, the implementation was actually being done by a 3rd party, so while nVidia could optimize CUDA for their hardware, if it became popular ATi could actually do the same.  PhysX itself wouldn't likely matter, as it doesn't really take a whole lot to run, but other CUDA apps such as the video conversion tools likely benefit from optimizations; this would likely be left up to ATi to do if they really wanted the optimizations.



$immond$ said:


> All in all I find physx more useful than eyefinity.



Why do you keep comparing it to eyefinity?  This is like the 2nd or 3rd time I've seen you do it.  They aren't even similar technologies.  If you are going to make comments like this, do it with something like Streams, at least that is in the same category as PhysX/CUDA.

And really, as I asked in the last thread and you ignored, I'd like to know what you found it useful in.  Because of the 15 current and future* games that support hardware PhysX, Batman is the only one where I found PhysX actually made the game more enjoyable.  At least eyefinity works with essentially every game...  And it also seems to be a solution to a problem that _some_ people have been wanting solved for years.

*Are there even any future titles that plan to use it?


----------



## bobzilla2009 (Apr 12, 2010)

I suppose if you have one screen, then PhysX is more useful, but that's equivalent to saying you find radio better than TV overall, and are blind.


----------



## $immond$ (Apr 12, 2010)

bobzilla2009 said:


> I suppose if you have one screen, then physX is more useful  but thats equivalent to saying you find radio better than tv overall and are blind.



Because they are both marketing gimmicks. I honestly think Eyefinity is an eyesore if you have depth perception.


----------



## bobzilla2009 (Apr 12, 2010)

$immond$ said:


> Because they are both marketing gimmicks. I honestly think Eyefinity is an eyesore if you have depth perception.
> 
> http://www.forcatech.com/blog/wp-content/uploads/2009/09/AMD-Eyefinity-Graphics1.jpg



6 screens is terrible, but eyefinity definitely has a future with bezel-less panels. Plus, 3-screen setups work fine, as the game is not stretched across the screens the way you imagine (for example, that picture above is actually nothing like 3-screen eyefinity). The 2 extra screens give you a wider FOV that is more for immersion's sake than for staring at. For example, in L4D your gun is on the center screen as per usual, but the two side screens show more of the environment around you. The bezels don't get in the way because you're not staring at a stretched screen; you look at a central screen but have peripheral vision on the sides.

Plus, PhysX is dead, or may as well be. In the 2 years i owned an nvidia card i played 1 PhysX game (ME), and PhysX is not the reason i bought it, whereas eyefinity actually has a future (albeit in the high-end niche). Also, a lot more games run with eyefinity than with PhysX. Both are gimmicks, but one of them has some potential, whereas the other will be dead before the year is out.






now, as you can see, the game is not stretched across the screens at all. The two side screens provide extra viewing area to the game. You don't lose anything, and you should still be focusing on the central screen. The image looks a bit off at the seams due to the angling of the screens themselves though  

Still, i won't use eyefinity personally, i can't fit 3 monitors on my desk ^^ but i'm trying to clear up some of your misconceptions of how it is implemented as proven by your image.


----------



## $immond$ (Apr 12, 2010)

This still looks unappealing...


----------



## bobzilla2009 (Apr 12, 2010)

$immond$ said:


> http://media.bestofmicro.com/ati-catalyst-eyefinity,E-E-238982-13.jpg
> 
> This still looks unappealing...



Bezel correction in 10.3 (or was it 10.2?) solved that.


----------



## $immond$ (Apr 12, 2010)

I am not sure spending $250-300 per monitor is going to enhance my gaming experience over a 42" widescreen costing $700.


----------



## newtekie1 (Apr 12, 2010)

$immond$ said:


> http://media.bestofmicro.com/ati-catalyst-eyefinity,E-E-238982-13.jpg
> 
> This still looks unappealing...



Yes, and without it, you just get to see half a jeep, certainly better...

And you have yet to answer the simple question that I've been asking in both this and the other thread.  In what exactly did you find PhysX more useful?



$immond$ said:


> I am not sure spending $250-300 per monitor is going to enhance my gaming experience over 42" widescreen costing $700.



Yes, because despite that 42" being bigger, the resolution is still low, so you don't actually see anything more.  While eyefinity adds peripheral vision and actually increases the field of view in the game.  So spending $150 per monitor actually _will_ enhance the gaming experience, while a 42" screen won't do shit.


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> Yes, and without it, you just get to see half a jeep, certainly better...
> 
> And  you have yet to answer the simple question that I've asking in both this and the other thread.  In what exactly did you find PhysX more useful?



In the same way that OpenCL and DirectCompute will be much more useful: they do it for everyone ^^



$immond$ said:


> I am not sure spending $250-300 per monitor is going to enhance my gaming experience over 42" widescreen costing $700.



I find a single 22" monitor better than a 42" personally. Using a mouse and keyboard with one is a mega fail for the most part. So 3 19" monitors would beat a single 42" imo.


----------



## imperialreign (Apr 12, 2010)

erocker said:


> I still haven't seen PhysX being run on any other GPU than Nvidia or an Ageia PPU. I haven't seen any "hacks" to run PhysX on any other GPU. Are you sure this is fact that PhysX is open to run on any GPU? Is it purely software to make it work or does there need to be some archetectural things in place for it to work? Why hasn't anyone from the "modding" community gotten PhysX to work on a non-Nvidia GPU? Does PhysX need to be run on a GPU? From what I've seen PhysX isn't coded well enough to use a CPU properly, but it does a great job on Nvidia GPU's.. sometimes. With the way things are currently, it certainly seems as if PhysX isn't open at all.



I've been able to run with it enabled in both Metro and Cryostasis... and I take one helluva FPS hit with it on.

But, I wouldn't be surprised if this was all software side, being computed via the CPU, not the GPU.


----------



## TheMailMan78 (Apr 12, 2010)

Thanks for the thread mussels. I for one thought all games that supported Physx also supported hardware acceleration. How wrong I was. This was a real eye opener.


----------



## crazyeyesreaper (Apr 12, 2010)

lol yea: 247 games, only 26 use it; of the 26 that do, only 15 are real released games; of those 15, only 2-3 make real use of it; and of those 2-3, only 1 actually NEEDS it for the visual quality, as in the others it can all be done with regular physics calculations. As i said, wake me up when it's not marketing fluff. Oh, and wasn't there a post in the news threads about FluidMark or something making PhysX multi-threaded, and it ran better on a stock Q9400 than it did on a GTX 275? If that's the case, how well would an i7 handle PhysX on 8 threads compared to Nvidia's precious GPUs?


----------



## Jeffredo (Apr 12, 2010)

I actually own one of those hardware accelerated games - and an Nvidia card.  Yay.


----------



## air_ii (Apr 12, 2010)

newtekie1 said:


> Legally, at least in the US and likely the EU, nVidia could not do that.  If they agree that it is free to develope in the beginning, they can't legally "change your mind" once it is popular and in use by other companies.  The trade courts would rip nVidia a new asshole bigger than the one they ripped Microsoft.



I wasn't talking about such a situation. I was rather referring to extensions, new features, new versions that would not fall under the scope of free licensing anymore but would become vital due to CUDA's dominant position. I don't know that they couldn't do that...


----------



## Wile E (Apr 12, 2010)

Mussels said:


> PhysX is a closed, proprietary system. ATI just cant use it without paying tons of money... and who would, with how few games support it?
> 
> ATI have their own engine (Bullet Physics or something, it was in the news) which will work on ATI and nvidia - meaning that game devs will actually use it for something other than useless debris.
> 
> ...


nVidia offered to let ATI use it for free. They even offered help coding for it.

Newtekie covered this, so I don't really need to get into the details.

As for the argument that nVidia cripples ATI performance in TWIMTBP titles (specifically Batman AA), that is total and utter bullshit. ATI would have had nVidia in court for anti-competitive practices already if it were even remotely true. I mean, just give it a little thought. It's common sense.

Optimizing the game for their hardware is not the same as crippling ATI. 

Now, as for the Batman issue mentioned in this thread: the AA in Batman was completely broken on ATI when it released; even if you spoofed the game into enabling it on an ATI card, it didn't render the AA properly. NV took the time to help the devs code completely custom AA for an engine that has no native AA, so why would they then let ATI have it, when ATI put forth no effort in making it happen? Even if it wasn't broken on ATI hardware, blocking ATI hardware from that feature in that game was fully legit. If ATI wanted to have it, they should've offered help to the dev as well.

To be perfectly honest with you Mussels, this isn't an informative thread at all; it's a thread that serves no purpose but to bash a product. At least that's the way it comes across in its current wording.


----------



## HalfAHertz (Apr 12, 2010)

Um, from my understanding proprietary rights mean exactly that: Nvidia has complete control over it. They may charge for it later on, and they may prohibit access to the core code to anyone they desire.


----------



## TheMailMan78 (Apr 12, 2010)

Wile E said:


> nVidia offered ATI to use it for free. They even offered help coding for it.
> 
> Newtekie covered this, so I don't really need to get into the details.
> 
> ...



Well I didn't know that so few games supported hardware acceleration. I was under the impression all Physx supported games used hardware acceleration. Hell I was shopping for a 9800 to improve on these said games. Mussels just saved me some money.


----------



## bobzilla2009 (Apr 12, 2010)

TheMailMan78 said:


> Well I didn't know that so few games supported hardware acceleration. I was under the impression all Physx supported games used hardware acceleration. Hell I was shopping for a 9800 to improve on these said games. Mussels just saved me some money.



indeed, better to get a hd5770 or something when directcompute becomes popular


----------



## Mr McC (Apr 12, 2010)

I couldn't agree more with the original post. In summary, PhysX, as it stands, is a gimmick employed in a very limited number of games (16), with little or no future as it is not an open standard, whilst sucking up resources on the graphics card in a way that may adversely affect fps, all to produce effects that could easily be assigned to inactive or lightly taxed CPU cores. 

However, I would like someone to explain to me why every single review of an ATI card on this site includes a negative point in the conclusion for failure to support PhysX. Are the 16 titles that employ PhysX essential titles in any gamer’s library? Does Nvidia pay or otherwise coerce or persuade the site to include this point? I feel that the credibility of the site suffers when a forum moderator points out, and in my humble opinion correctly, that PhysX is not worth consideration and yet the main site actively criticises ATI hardware for its failure to implement this Nvidia proprietary technology at every available opportunity.

Certainly this post could be construed as inflammatory; however, I feel that in light of the original post, some explanation should be forthcoming to, at the very least, explain this glaring inconsistency.


----------



## HalfAHertz (Apr 12, 2010)

Mr McC said:


> I couldn't agree more with the original post. In summary, PhysX, as it stands, is a gimmick employed on a very limited number of games (16), with little or no future as it is not a open standard, whilst sucking up resources on the graphics card that may have an adverse effect on fps, all to produce effects that could easily be assigned to inactive or lowly taxed cpu cores.
> 
> However, I would like someone to explain to me why every single review of an ATI card on this site includes a negative point in the conclusion for failure to support PhysX. Are the 16 titles that employ PhysX essential titles in any gamer’s library? Does Nvidia pay or otherwise coerce or persuade the site to include this point? I feel that the credibility of the site suffers when a forum moderator points out, and in my humble opinion correctly, that PhysX is not worth consideration and yet the main site actively criticises ATI hardware for its failure to implement this Nvidia proprietary technology at every available opportunity.
> 
> Certainly this post could be construed as inflammatory; however, I feel that in light of the original post, some explanation should be forthcoming to, at the very least, explain this glaring inconsistency.



Because PhysX is right here, right now. If somebody bothers to add it, it is indeed a nice feature, just like AA or AF is a nice feature - it is not essential for enjoying the game and it decreases fps, but it makes things look better, which is and has always been the main driving force behind new hardware: getting better GFX.

Unlike ATI, who have had countless press releases about Havok, Bullet or whatever GPU-accelerated physics being the next big thing since sliced bread, Nvidia have actually delivered, even if in only 15 titles. Don't get me wrong, I hate closed standards as much as the next guy, but it's nice to have at least one option if everything else fails. It's something like hydrogen cars now: they exist but are not widely implemented because they would be commercial failures due to high costs, but who knows, some time in the future they might become a viable solution...

Edit: Just the other day I was doing research on commercial GPU-accelerated renderers. ATI currently have none, while Nvidia promotes no less than three, for both biased and unbiased paths... guess what my next hardware refresh will be unless things change.


----------



## Mr McC (Apr 12, 2010)

HalfAHertz said:


> Because PhysX is right here, right now. If somebody bothers to add it, it is indeed a nice feature, just like AA or AF is a nice feature - it is not essential for enjoying the game and it decreases fps, but it makes things look better, which is and has always been the main driving force behind new hardware: getting better GFX.
> 
> Unlike ATI, who have had countless press releases about Havok, Bullet or whatever GPU-accelerated physics being the next big thing since sliced bread, Nvidia have actually delivered, even if in only 15 titles. They shouldn't be bashed for promoting it; instead they should be encouraged to release it under a GPL/FOSS licence so that adoption could be sped up.



I strongly disagree. Physics acceleration should be an open standard. If we support the solution of a particular company, Nvidia in this case, or applaud advances made in this regard, we are helping to bring about a situation that will not benefit consumers - quite the contrary, in fact. I do not want to have to ask whether a game is an ATI or an Nvidia game before I purchase.

Take the Batman debacle: some people say that Nvidia paid for the development, whereby the ATI AA lock-out is justified. I cannot comment on that, but what I can say is that Eidos should never have given Nvidia the opportunity to take charge of this area of the game. Obviously this saved them development costs, but how many sales did it lose them because ATI owners eschewed what in their eyes was a game deliberately crippled for their system? If ATI takes this approach, and you seem to suggest that they should, we arrive at the aforementioned situation of having to ask whether a game is an "ATI game" or an "Nvidia game" prior to purchase. I think we have enough problems without that sort of bullshit.

Moreover, Nvidia seems to be their own worst enemy when it comes to PhysX: if you want universal support, why disable or hinder a supplementary PhysX card when an ATI card is detected? I am not aware of what percentage of gamers use an ATI 48xx/58xx, but it appears that sales have been good. Nvidia have alienated a large portion of the gaming community with this idea and have probably lost a considerable number of sales to users who just had to have PhysX in those 16 essential titles.

Should ATI do more to further physics acceleration? Undoubtedly, but as an open standard. I am in favour of neither ATI nor Nvidia, but I would be glad to see PhysX die off and be replaced by CPU physics acceleration. After all, all those wasted cores in our latest and greatest rigs might as well be put to some use when we play the most recent console port, don't you think?


----------



## Mussels (Apr 12, 2010)

Mr McC said:


> Moreover, Nvidia seems to be their own worst enemy when it comes to PhysX: if you want universal support, why disable or hinder a supplementary PhysX card when an ATI card is detected. I am not aware of what percentage of gamers use an ATI 48xx/58xx, but it appears that sales have been good. Nvidia have alienated a large portion of the gaming community with this idea and have probably lost a considerable number of sales to users who just had to have PhysX in those 16 essential titles.



*(Steam hardware survey screenshots: http://img.techpowerup.org/100412/552.jpg and http://img.techpowerup.org/100412/607.jpg)*

*ATI 4800 series is the most popular GPU in gamers systems at this moment in time*


----------



## Mr McC (Apr 12, 2010)

So, in broad terms, we have more or less a third of the gaming community - who may possibly have bought a supplementary PhysX card - either locked out or reduced to using hacks with outdated drivers that may cause headaches in other areas, to play a total of 16 games with added smoke swirls, and yet we should support PhysX and applaud Nvidia? Do they know how to do business? Clearly they had a lot of people convinced that they were on to a good thing with PhysX.

If we isolate the numbers to the most popular card at the moment, your figures suggest that there are more users of the ATI 4800 series. I guess Nvidia doesn't want to sell them any products, as they have clearly let them know that they will punish all traitors to the cause.

Thank you for writing a post that cuts through the crap and tells it like it is.


----------



## newtekie1 (Apr 12, 2010)

air_ii said:


> I wasn't talking about such a situation. I was rather referring to extensions, new features, new versions that would not fall under the scope of free licensing anymore, but would become vital due to CUDA's dominant position. I don't know if they couldn't do that...



Again, legally nVidia cannot do this either.  Once they have offered CUDA as a free development tool that any hardware, CPU or GPU, can support, they cannot legally change that later.  They can change it so any new hardware company coming in would have to pay licensing fees, but the hardware companies already supporting it would not have to pay those fees.


----------



## Bjorn_Of_Iceland (Apr 12, 2010)

I'd really like to see a physics scene where the calculation is done on the CPU vs a shared GPU vs a dedicated GPU - a scene where the number of objects is equal in all of the renders - to see if there is any framerate difference.


----------



## kid41212003 (Apr 12, 2010)

Mussels said:


> http://img.techpowerup.org/100412/552.jpg
> 
> http://img.techpowerup.org/100412/607.jpg
> 
> ...




That's quite misleading, combining all the DX10 NVIDIA GPUs and....


----------



## newtekie1 (Apr 12, 2010)

Bjorn_Of_Iceland said:


> I'd really like to see a physics scene where the calculation is done on the CPU vs a shared GPU vs a dedicated GPU - a scene where the number of objects is equal in all of the renders - to see if there is any framerate difference.



I swear someone did this back when Ageia was first hitting the market.  Well, it was a comparison of PhysX in GRAW or CellFactor running on the CPU and running on the Ageia PPU.  When it ran on the CPU the framerate was a slideshow; when it ran on the PPU it was smooth.

But the result is going to vary from physics engine to physics engine, it all depends on how the engine was optimized to run.


----------



## Mr McC (Apr 12, 2010)

kid41212003 said:


> That's quite mis-leading, combining all the DX10 NVIDIA GPUs and....



Let's not go down the road of disputing figures. How about we simply accept that there are considerable numbers of gamers who use ATI cards. Is it a good decision to lock out these potential consumers? Is it a good idea to hinder the attempts of someone who owns cards from both brands when they want to implement PhysX? This "my way or the highway" approach is neither good for consumers nor likely to further the implementation of PhysX; however, as I explained above, I'm quite happy about that.


----------



## HalfAHertz (Apr 12, 2010)

Mr McC said:


> So, in broad terms, we have more or less a third of the gaming community, who may possibly have bought a supplementary PhysX card, who are either locked out or reduced to using hacks with outdated drivers that may cause headaches in other areas, to play a total of 16 games with added smoke swirls and yet, we should support PhysX and applaud Nvidia? Do they know how to do business? Clearly they had a lot of people convinced that they were on to a good thing with PhysX.



Not one person has said that we should start applauding Nvidia, nor that you should go out and buy their product just because of PhysX. Don't bend the truth.

Physx is only a nice bonus. It is just an extra option, just like having x16 AA instead of x8 AA. It is and will continue to be the icing on top of the green cake. Now if you don't like the green icing, you can always have your cake without it.


----------



## Mr McC (Apr 12, 2010)

HalfAHertz said:


> Not one person has said that we should start applauding Nvidia, nor that you should go out and buy their product just because of PhysX. Don't bend the truth.
> 
> Physx is only a nice bonus. It is just an extra option, just like having x16 AA instead of x8 AA. It is and will continue to be the icing on top of the green cake. Now if you don't like the green icing, you can always have your cake without it.



Stating that PhysX is the icing is tantamount to applauding its implementation, unless you don't have a sweet tooth and dislike pastry.

"Bonus" also appears pretty positive to me, as does "nice" - perhaps we understand language differently?

If we are both stating that physics acceleration is the way forward, then we agree. If you continue to state that PhysX is a step forward, we do not. Then again, that's no big deal, we can agree to disagree.


----------



## kid41212003 (Apr 12, 2010)

I accept the facts that:

~8% of Steam users own an HD4800 series card.
~8% are still using the 8800 series, which is older than the HD4800 series.

If we combine the 8800, 9800 and GTX260, that's 18% of cards capable of running PhysX at playable framerates (not including the cards in that screenshot).

And overall, 61% are using NVIDIA cards.

And this does not include people who aren't on Steam.

NVIDIA did not lock out customers, because ATI users still can play games that have PhysX.


----------



## Mr McC (Apr 12, 2010)

kid41212003 said:


> I accepted the facts, that:
> NVIDIA did not lock out customers, because ATI users still can play games that have PhysX.



True, but Nvidia has forced them to jump through too many hoops, and there are reports of compatibility issues in other areas due to the fact that a hack has to be implemented and older Nvidia drivers have to be used. At least concede that Nvidia has done everything in their power to ensure that ATI users do not use Nvidia cards alongside their ATI hardware.

Moreover, the developers should not allow Nvidia to implement areas that ought to be open standards or essential components of the game (i.e. in-game AA). I am not sure who is to "blame" for this, but as the number of ATI users grows, the developers cannot fail to acknowledge that it may not be in their interests to continue to produce TWIMTBP games.


----------



## Mussels (Apr 12, 2010)

newtekie1 said:


> Again, legally nVidia cannot do this either.  Once they have offered CUDA as a free development tool that any hardware, CPU or GPU, can support, they cannot legally change that later.  They can change it so any new hardware company coming in would have to pay licensing fees, but the hardware companies already supporting it would not have to pay those fees.



However, what if they have terms such as "you can't put the PhysX logo on your hardware boxes without paying us"?

PhysX is free, the logos are not - people don't buy a video card for PhysX if it's not on the box. What looks free and happy on the surface is really just a marketing ploy.



Bjorn_Of_Iceland said:


> I'd really like to see a physics scene where the calculation is done on the CPU vs a shared GPU vs a dedicated GPU - a scene where the number of objects is equal in all of the renders - to see if there is any framerate difference.



They did that - multithreaded CPUs are faster than PhysX via CUDA.




kid41212003 said:


> I accepted the facts, that:
> 
> ~8% of Steam users own a HD4800 series card.
> ~8% still using 8800 series, which is an older series compare to HD4800 series.
> ...



We're not arguing that there's a good percentage of users who can use PhysX - I was just showing the numbers. ATI 4K cards are the most popular individual cards atm (and remember that the Nvidia cards are lumped together too - the 8800 series counts the 8800GT, GTX, GS and GTS (G80 and G92 variants), whereas the ATI entry is just four cards - the 4830, 50, 70 and 90).

Looking at it in its simplest terms: only 20% or so of those people have an Nvidia card powerful enough to run PhysX (since enabling it slows down 90% of the 15 games it's in). That's still 80% of the market who can't USE this feature, and a large percentage of those are ATI users.

Nvidia doomed PhysX when it locked out ATI users with Nvidia/Ageia cards, because it did the one thing you should never do: cut your own market share.


----------



## Mr McC (Apr 12, 2010)

Mussels said:


> Nvidia doomed PhysX when it locked out ATI users with Nvidia/Ageia cards, because it did the one thing you should never do: cut your own market share.



That's exactly what I have been trying to say.


----------



## newtekie1 (Apr 12, 2010)

Mussels said:


> however, what if they have things such as "you cant put the physX logo on your hardware boxes without paying us."
> 
> PhysX is free, the logos are not - people dont buy a video card for physX if its not on the box - what looks free and happy on the surface, is really just a marketing ploy.



Yes, there are a lot of "what ifs" that we won't know because the deal was never given a chance.

However, I don't really see how a logo on a box would matter, or why nVidia would push to get ATi to support the standard only to then limit market availability.  That kind of seems backwards doesn't it?


----------



## kid41212003 (Apr 12, 2010)

I believe if you use PhysX in a game with a sale price over $15, you have to contact NVIDIA so that they can put their logo on the box. It's still free of charge.


----------



## Mr McC (Apr 12, 2010)

newtekie1 said:


> Yes, there are a lot of "what ifs" that we won't know because the deal was never given a chance.
> 
> However, I don't really see how a logo on a box would matter, or why nVidia would push to get ATi to support the standard only to then limit market availability.  That kind of seems backwards doesn't it?



Isn't the lock out backwards to begin with? Do you think the "my way or the highway" approach is likely to build ATI's willingness or desire to support PhysX?


----------



## Mussels (Apr 12, 2010)

newtekie1 said:


> Yes, there are a lot of "what ifs" that we won't know because the deal was never given a chance.
> 
> However, I don't really see how a logo on a box would matter, or why nVidia would push to get ATi to support the standard only to then limit market availability.  That kind of seems backwards doesn't it?



Let me put it this way: do you really think Nvidia would allow the following for free, given a choice:

Their logos on ATI hardware boxes
ATI support/branding on games
driver/tech support to help them code cuda to work on STREAM
test games before release to make them work on ATI cards



kid41212003 said:


> I believe if you use PhysX in your games that have a sale price over $15, you will have to contact NVIDIA, so that they can put their logo on the box. It's still no charge.



Imagine ATI getting sued for putting a PhysX logo on a 5870 box, if Nvidia didn't get royalties. Free to use the software is not the same as 'free to sell hardware advertising it to work with our stuff'.


----------



## kid41212003 (Apr 12, 2010)

ATI refused PhysX in the first place. One of the guys from NGOHQ modified the driver to make PhysX cards work with ATI cards, and he even got support from NVIDIA. I believe he now has a job at NVIDIA.



Mussels said:


> imagine how ATI get sued for putting a physX logo on a 5870 box, if nvidia didnt get royalties. free to use the software, is not the same as 'free to sell hardware advertising it to work with our stuff'



Before they could do that, they would probably contact NVIDIA first - and that's if they really wanted to put a PhysX logo on their boxes, which I don't think they ever would.


----------



## Mussels (Apr 12, 2010)

kid41212003 said:


> ATI refused PhysX in first place, one of the guy from ngohq modified the driver that made PhysX cards work with ATI card, and he even got support from NVIDIA. I believe now he got a job from NVIDIA.



Please read my posts on the previous page: 'free to run' is not 'free to advertise' or 'without limitations'.


Look at it in another light: Mac vs Windows.

It's like Microsoft saying: sure thing man, you can run ALL Windows apps for free on Mac OS X.

However, our logo must be on all Macs that support this - and you've got to pay royalties for it. Oh, and YOU have to guarantee it works perfectly with all Windows programs, not us. Oh, and we make no assurances that it won't be optimized for our hardware over yours.


----------



## newtekie1 (Apr 12, 2010)

Mussels said:


> let me put it this way: do you really think nvidia would allow the following for free, given a choice:
> 
> Their logos on ATI hardware boxes



Absolutely, I'd love my logo on my competitor's product



> ATI support/branding on games



What does this have to do with PhysX?



> driver/tech support to help them code cuda to work on STREAM



And why does CUDA have to run on STREAM?  I mean, I know it doesn't, but I'd like to know why you seem to think it does.



> test games before release to make them work on ATI cards



The beauty of CUDA.  If CUDA works on ATi hardware, and PhysX works on CUDA, and PhysX works in the game, then there is no additional testing needed.  CUDA apps are rather hardware independent.



Mussels said:


> please read my posts on the previous page. 'free to run' is not 'free to advertise' or 'without limitations'
> 
> 
> look at it in another light: mac vs windows.
> ...



Ignoring the obvious error of assuming Microsoft controls all Windows apps...

Yeah, it is kind of like Apple advertising that Boot Camp can run Windows and all Windows apps for free... oh, you probably didn't mean it like that...


----------



## Mussels (Apr 12, 2010)

newtekie1 said:


> Absolutely, I'd love my logo on my competitor's product



Nvidia would love their logos on ATI hardware. ATI would not love it, nor would it be free.




newtekie1 said:


> What does this have to do with PhysX?



PhysX titles are TWIMTBP titles. If ATI supported PhysX and it became popular, they'd be killing the GITG program and basically giving Nvidia advertising rights to every single game.




newtekie1 said:


> And why does CUDA have to run on STREAM?  I mean, I know it doesn't, but I'd like to know why you seem to think it does.


I may have made a typo - I should have said PhysX needs to run on Stream.





newtekie1 said:


> The beauty of CUDA.  If CUDA works on ATi hardware, and PhysX works on CUDA, and PhysX works in the game, then there is no additional testing needed.  CUDA apps are rather hardware independent.



Alright, fair point. What about performance issues? What about limitations on how much can run at any one time on various GPUs? Let's say Nvidia make it so they can handle 500 PhysX items at once and ATI can only handle 400 - would Nvidia tell game devs to hold back for ATI's sake, or would they say "buy Nvidia because we do it better!"?


----------



## kid41212003 (Apr 12, 2010)

It's for NVIDIA cards - they bought the company, they hired people to make it work with their GPUs.

It's like complaining that Eyefinity doesn't work on NVIDIA cards.

I always think of it this way:

I get this when I choose this brand, and I get another "different" thing when I choose a different brand.

But still, it's not like you can't play games that have PhysX when you have an ATI card.

Adobe wrote a translator to translate program code (not originally for iStuffs) to work with Apple, and Apple updated their iPad so it will not work with programs using that translator. That's what I call BS.


----------



## bobzilla2009 (Apr 12, 2010)

Mussels said:


> Alright, fair point. What about performance issues? What about limitations on how much can run at any one time on various GPUs? Let's say Nvidia make it so they can handle 500 PhysX items at once and ATI can only handle 400 - would Nvidia tell game devs to hold back for ATI's sake, or would they say "buy Nvidia because we do it better!"?



The latter, by a long way, although ATi GPUs have always been good with regard to computing. So the only way they could force an HD5 series card to run terribly would be to make it so only the GTX470 and GTX480 could run the game at all, or to make sure the PhysX code just doesn't run well on ATi cards. But it is Nvidia after all, so they would do that.

DirectCompute is the way to go, maybe OpenCL if they get their act together a bit quicker.


----------



## newtekie1 (Apr 12, 2010)

Mussels said:


> Nvidia would love their logos on ATI hardware. ATI would not love it, nor would it be free.



You say it wouldn't be free, I say it would.  We'll just have to agree to disagree, because there isn't anything really supporting either side, since the deal never got that far along.




> PhsyX titles are TWIMTBP titles. if ATI supported physX and it became popular, they'd be killing the GITG program and basically giving NVidia advertising rights to every single game.



They've already killed the GITG program, I can't even remember the last game that was part of the GITG program...  ATi has taken a strong stance on not helping developers to optimize games, and instead optimizing the drivers for the games themselves.

You are correct, nVidia would have been able to advertise PhysX for every game that uses it, but ATi could have advertised they supported it...well you say they wouldn't have...



> I may have made a typo - i should have said PHYSX needs to run on stream



Still don't see what Stream has to do with any of this...



> Alright, fair point. What about performance issues? What about limitations on how much can run at any one time on various GPUs? Let's say Nvidia make it so they can handle 500 PhysX items at once and ATI can only handle 400 - would Nvidia tell game devs to hold back for ATI's sake, or would they say "buy Nvidia because we do it better!"?



I've already talked about the performance issues that many bring up, so I'm not going to rehash that.  And now you are just trying to make up illogical reasons to make nVidia look bad; there is nothing even suggesting that nVidia would hinder performance on ATi cards.  Plus, as already mentioned, a 3rd party was going to be doing the developing, so nVidia couldn't do any funny business even if they wanted to.


----------



## human_error (Apr 12, 2010)

newtekie1 said:


> You say it wouldn't be free, I say it would.  We'll just have to agree to disagree, because there isn't anything really supporting either side, since the deal never got that far along.
> 
> 
> They've already killed the GITG program, I can't even remember the last game that was part of the GITG program...  ATi has taken a strong stance on not helping developers to optimize games, and instead optimizing the drivers for the games themselves.



Before people start saying "nvidia would let ati use physx" or "ati don't help developers at all", I suggest everyone reads http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1 where it is claimed Nvidia refused to let ATI run PhysX on ATI hardware at all. It also goes into detail on how ATI helps developers ensure compatibility with ATI hardware, but some developers (usually TWIMTBP game developers) do not talk to ATI at all, so ATI has to release driver patches/updates to ensure compatibility after the game has launched.


----------



## Mussels (Apr 12, 2010)

newtekie: ATI Stream is ATI's version of CUDA.

They can't run CUDA, so anything running would have to run on Stream...

Are you seriously this deep into the discussion with no idea what ATI Stream is, or how it's different to CUDA?


----------



## bobzilla2009 (Apr 12, 2010)

The GITG program = AMD: The future is fusion


----------



## newtekie1 (Apr 12, 2010)

Mussels said:


> newtekie: ATI stream is ATI's version of cuda.
> 
> They cant run cuda, so anything running, would have to run on stream...
> 
> are you seriously this in-depth in the discussion, with no idea what ATI stream is, or how its different to CUDA?



I know exactly what Stream is, yet I still fail to see what it has to do with PhysX.  Why exactly does PhysX have to run on Stream is my question?


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> I know exactly what Streams is, yet I still fail to see what it has to do with PhysX.  Why exactly does PhysX have to run on Streams is my question?



Because Stream is the ATI version of CUDA - it's how such programs interact with the GPU.


----------



## Mussels (Apr 12, 2010)

newtekie1 said:


> I know exactly what Stream is, yet I still fail to see what it has to do with PhysX.  Why exactly does PhysX have to run on Stream is my question?



Because otherwise ATI can't run it?


----------



## newtekie1 (Apr 12, 2010)

Mussels said:


> because otherwise ATI cant run it?



So you're just ignoring the 3rd party adapting ATi's drivers to natively support CUDA then?


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> So you're just ignoring the 3rd party adapting ATi's drivers to natively support CUDA then?



Until I actually see that proven, I have to dismiss it. All the evidence for the PhysX adaptation was one picture - one that could, and certainly would, have been photoshopped. If it was so effective it WOULD have been 'leaked' somehow; no one can throw around a lawsuit if they don't know who did it.

Also, the excuse was that they couldn't get an HD4870, and that's why it fell apart. O RLY? If they wanted to, they could have just bought one. It reeks of general nonsense and veiled propaganda imo. Unless of course it's a new team doing it now, and not the one from 2-3 years ago?

I read a lot about that so-called team trying it before and it all seemed like bull to me, especially the 'nvidia were very eager to help us port it to ati cards for free'...


----------



## Mussels (Apr 12, 2010)

newtekie1 said:


> So you're just ignoring the 3rd party adapting ATi's drivers to natively support CUDA then?



There is simply no way CUDA ran natively on ATI.

Emulated, ported to Stream, run on some early beta of DirectCompute or OpenCL - whatever it was, it was NOT run natively.


----------



## human_error (Apr 12, 2010)

newtekie1 said:


> So you're just ignoring the 3rd party adapting ATi's drivers to natively support CUDA then?



Using an abstraction layer to translate calls in one language/syntax to another is emulation, not native support (plus it was PhysX that was translated, not CUDA). They were essentially emulating CUDA support by making Stream calls that were the equivalent of the CUDA calls. However, this was never released, and it is likely that there is not an exact Stream equivalent for every CUDA command, and vice versa. This means the abstraction layer would have to add extra calls to emulate CUDA well enough for PhysX to run, and due to the extra work needed it would not have been very efficient or fast compared to running on a native CUDA platform.

Plus, if it is unreleased with no comprehensive proof then it means nothing (didn't Nvidia hire the guy who apparently made the abstraction layer on the condition that he never release it, to prevent any possibility of ATI PhysX support?).
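For what it's worth, the "abstraction layer" idea described above can be sketched in a few lines. This is a hypothetical illustration only - none of these class or function names correspond to the real CUDA or ATI Stream APIs - but it shows the shape of the thing: a shim exposes CUDA-style entry points and re-expresses each one as backend calls, and the per-call bookkeeping is where the emulation overhead comes from.

```python
# Hypothetical sketch only: names are invented for illustration and do not
# correspond to the real CUDA or ATI Stream APIs.

class StreamBackend:
    """Stand-in for the native runtime (think "Stream"), with its own calls."""
    def alloc_buffer(self, n):
        return bytearray(n)

    def run_kernel(self, buf, op):
        # the "native" compute call
        for i in range(len(buf)):
            buf[i] = op(buf[i]) & 0xFF


class CudaShim:
    """Translation layer: exposes CUDA-style entry points and re-expresses
    each one as one or more backend calls."""
    def __init__(self, backend):
        self.backend = backend
        self.translated_calls = 0   # per-call bookkeeping = overhead

    def cuda_malloc(self, n):
        self.translated_calls += 1
        return self.backend.alloc_buffer(n)

    def cuda_launch(self, buf, op):
        self.translated_calls += 1
        self.backend.run_kernel(buf, op)


shim = CudaShim(StreamBackend())
buf = shim.cuda_malloc(4)               # CUDA-style call, Stream-style execution
shim.cuda_launch(buf, lambda x: x + 1)
print(list(buf))                        # -> [1, 1, 1, 1]
```

In a real port, one CUDA-style call might expand into a whole series of backend calls, which is exactly the inefficiency described above.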


----------



## Mussels (Apr 12, 2010)

I do get sick of people quoting one third-party website which released nothing but a few possibly faked screenshots... I mean shite, the screenshot showing an ATI video card with boosted scores from PhysX was possible with an ATI and an Nvidia card in the same system - it could have been as lame a hoax as that.


----------



## newtekie1 (Apr 12, 2010)

bobzilla2009 said:


> when i actually see that proven i have to just dismiss it. All the evidence for the physX adaptation was one picture. One that could and certainly would have been photoshopped. If it was so effective it WOULD have been 'leaked' somehow, no one can throw around a lawsuit if they don't know who did it.
> 
> Also the excuse was that they couldn't get a hd4870, and thats why it fell apart. O RLY? if they wanted to, they could have just bought one. It reeks of general nonsense and veiled propaganda imo. Unless of course it's a new team doing it now? and not the one from 2-3 years ago?
> 
> I read a lot about that so called team trying it before and it all seemed like bull to me, especially the 'nvidia were very eager to help us port it to ati cards for free'...





Mussels said:


> there is simply no way CUDA ran natively on ATI.
> 
> Emulated, ported to STREAM, ran on some early beta of directcompute or openCL, whatever it was, it was NOT natively ran.






			
Eran Badit said:
			
		

> there are some issues that need to be addressed, since adding Radeon support in CUDA isn’t a big deal - but it’s not enough! We also need to add CUDA support on AMD’s driver level and its being addressed as we speak.



Note he says CUDA, not PhysX.

And it wasn't just an issue of getting an HD4000 series card - I'm pretty sure nVidia bought him one pretty quickly. The issue came down entirely to adding CUDA support to ATi's drivers.


----------



## bobzilla2009 (Apr 12, 2010)

Mussels said:


> i do get sick of people quoting one third party website which released nothing than a few possibly faked screenshots... i mean shite, the screenshots showing an ATI video card with boosted scores from physX was possible with an ATI and nvidia card in the same system, could have been as lame a hoax as that.



Yes, and especially since a guy working at ATi later in the year (I'll try and find the interview) said specifically that Nvidia told them to 'go whistle' (exact words) when they asked about PhysX on ATi cards. Nvidia are the gits, and the fact that ATi have outsold them for two generations is making them worse.



newtekie1 said:


> Note he says CUDA, not PhysX.



Why did he stop? Because they couldn't get an HD4870! A lame excuse so they could end the hoax. I'm sure Nvidia would have bought them one too, so why didn't they? Hmm?


----------



## human_error (Apr 12, 2010)

bobzilla2009 said:


> yes, and especially since a guy working at ATi later in year (i'll try and find the interview) said specifically that nvidia told them to 'go whistle' (exact words) when they asked about the physX on ATi cards. Nvidia are the gits, and the fact that ATi have outsold them for 2 generations is making them worse.
> 
> 
> why did he stop? because they couldn't get a hd4870! lame excuse so they could end the hoax. I'm sure nvidia would have bought them one too, so why didn't they? hmm?



I posted the interview link already :-



human_error said:


> Before people start saying "nvidia would let ati use physx" or "ati don't help developers at all" I suggest everyone reads http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1 where it is claimed nvidia refused to let ati run physx on ati hardware at all, and goes into detail on how ati helps developers to ensure compatability with ati hardware, but some developers do not talk to ati at all (TWIMTBP games developers usually) and so ati have to release driver patches/update to ensure compatability after the game has launched.



The guy stopped development of the app because he was hired by nvidia - the best way to stop development was to hire the guy as a bribe to stop him developing further or releasing any code.

As for the cuda vs physx emulation the guy claimed to have he was referring to cuda call support on ati cards which meant translating cuda calls to stream calls for the purpose of running physx. It could be expanded further (any cuda apps, not just physx) but would require more work as every possible cuda call would have to be translated into the correct stream or series of stream calls (of which there could be multiple solutions with different levels of accuracy or efficiency, which would need to be selected differently depending on the task - which is a lot of work).
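The call-translation problem human_error describes can be sketched in a few lines. This is purely illustrative - every API name below is invented, not a real CUDA or Stream call - but it shows why a translator is "a lot of work": each source call needs an expansion into one or more target calls, and some operations have several candidate lowerings with different accuracy/efficiency trade-offs that have to be chosen per task.

```python
# Hypothetical translation table: "CUDA-like" op -> "Stream-like" ops.
# None of these names are real API calls; they model the shape of the
# problem only.
TRANSLATION_TABLE = {
    "cuda_alloc":  ["stream_alloc"],                               # 1:1
    "cuda_memcpy": ["stream_map", "stream_copy", "stream_unmap"],  # 1:many
    # Two candidate lowerings for a fused multiply-add: the translator
    # must pick one depending on the accuracy the task needs.
    "cuda_fma": {
        "accurate": ["stream_mul_hi", "stream_mul_lo", "stream_add"],
        "fast":     ["stream_mul", "stream_add"],
    },
}

def translate(calls, policy="accurate"):
    """Expand a trace of source-API calls into target-API calls."""
    out = []
    for call in calls:
        expansion = TRANSLATION_TABLE[call]
        if isinstance(expansion, dict):  # multiple candidate lowerings
            expansion = expansion[policy]
        out.extend(expansion)
    return out

trace = ["cuda_alloc", "cuda_memcpy", "cuda_fma"]
print(translate(trace, policy="fast"))
# A short source trace already becomes a longer target trace; doing this
# correctly for every call in a full API is the work the post refers to.
```

The table here has three entries; a real translation layer would need one (or several) for every call in the source API, which is why the effort scales so badly.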


----------



## bobzilla2009 (Apr 12, 2010)

human_error said:


> I posted the interview link already :-



sorry  must have missed it

anyway, the post about CUDA on ATi drivers was one picture with lots of promises of videos and proof, and even a driver release about 3 days after! None of it happened at all. So it obviously never worked.


----------



## newtekie1 (Apr 12, 2010)

bobzilla2009 said:


> yes, and especially since a guy working at ATi later in year (i'll try and find the interview) said specifically that nvidia told them to 'go whistle' (exact words) when they asked about the physX on ATi cards. Nvidia are the gits, and the fact that ATi have outsold them for 2 generations is making them worse.
> 
> 
> 
> why did he stop? because they couldn't get a hd4870! lame excuse so they could end the hoax. I'm sure nvidia would have bought them one too, so why didn't they? hmm?



That was a very early reason, a hurdle which was quickly cleared.  Again, the reason for stopping development was ATi would not add, or allow him to add, support for CUDA to their drivers.



human_error said:


> I posted the interview link already :-
> 
> 
> 
> ...



He continued development under nVidia, even gave several interviews, and nVidia publicly said they supported his efforts.  He wasn't exactly hired by nVidia either - he wasn't directly working for them, he was added to their development team, kind of like a sub-contractor.

And the interview, and the comments in it, were from waaaay after all of this.  After ATi had already shut the door on CUDA running on ATi hardware natively, and nVidia in turn entirely cut PhysX support if ATi hardware was detected.  One dick move leads to another....and another...and another...  I hate politics.


----------



## Mussels (Apr 12, 2010)

what you're talking about here, is the same as running x86 software on itanium, or an xbox360 game on a mac.

there is simply NO WAY cuda ran natively on ATI hardware.

I'm seriously sorry if you fell for the crap, but it just screams hoax.


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> That was a very early reason, a hurdle which was quickly cleared.  Again, the reason for stopping development was ATi would not add, or allow him to add, support for CUDA to their drivers.
> 
> He continued development under nVidia, even gave several interviews, and nVidia publicly said they supported his efforts.  He wasn't exactly hired by nVidia either - he wasn't directly working for them, he was added to their development team, kind of like a sub-contractor.
> 
> And the interview, and the comments in it, were from waaaay after all of this.  After ATi had already shut the door on CUDA running on ATi hardware natively, and nVidia in turn entirely cut PhysX support if ATi hardware was detected.  One dick move leads to another....and another...and another...  I hate politics.



Maybe ATi would lose the ability to use Havok on their GPUs or something similar, but any real proof that it did work would have been nice. Anyway, the future of physics in games is either in open source or just putting it back onto the ol' cpu. It's not like they have to do much atm. The most demanding games like crysis rarely hit 40-50% cpu load on most decent processors, even with a standard (in the case of crysis, pretty decent) physics engine.

Also, why would ATi say that nvidia wouldn't help them port physX to ati cards AT ALL, but nvidia were actively supporting a random guy on the internet porting CUDA? Is it because CUDA would run horribly if it was ported, whereas physX would run very very well on stream? i think so, and it makes sense.


----------



## human_error (Apr 12, 2010)

newtekie1 said:


> After ATi had already shut the door on CUDA running on ATi hardware natively



Again i'll have to correct you on this - it is not native support. It is emulation. Emulation is almost always inefficient and buggy when trying to run hardware-supported calculations on hardware which does not have those exact calculations supported. You would not get a very good result running CUDA on STREAM hardware (on the 4k series) because CUDA had things such as double-precision floating point (i think that is correct, been a while since i looked into the fundamental differences), which Stream and the 4k series did not have, so the emulation would need to do a lot more calculations to perform the same task, and even then it isn't guaranteed that the emulation would work with 100% compatibility.

Anyway i feel this is getting too far off topic, if you want to discuss this further open up a new thread and we can look into it in detail there.
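On the overhead point: Mussels corrects the hardware detail just below (the 4000 series did have doubles), but the general argument is sound, and can be illustrated without assuming anything about either vendor's GPUs. The classic building block for emulating a precision the hardware lacks is Knuth's two-sum, which recovers the exact rounding error of a single-precision addition using only single-precision operations - one hardware add becomes six, before you even build full double-float arithmetic on top of it.

```python
import numpy as np

def two_sum(a, b):
    """Knuth's two-sum: return (s, err) with s = fl(a + b) and
    s + err equal to the exact sum, using only float32 arithmetic."""
    a, b = np.float32(a), np.float32(b)
    s = np.float32(a + b)                # 1 add (the only one hardware-only code would do)
    bb = np.float32(s - a)               # 5 more ops to recover the rounding error
    err = np.float32(np.float32(a - np.float32(s - bb)) +
                     np.float32(b - bb))
    return s, err

# 1e8 and 1.0 are both exactly representable in float32, but their sum
# is not: a plain float32 add silently drops the 1.0.
s, err = two_sum(1e8, 1.0)
print(s, err)  # the dropped 1.0 survives in the error term
```

That six-fold blow-up is per addition; multiplications are worse, which is why "just emulate it" was never going to perform well.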


----------



## Mussels (Apr 12, 2010)

human_error said:


> Again i'll have to correct you on this - it is not native support. It is emulation. Emulation is almost always inefficient and buggy when trying to run hardware-supported calculations on hardware which does not have those exact calculations supported. You would not get a very good result running CUDA on STREAM hardware (on the 4k series) because CUDA had things such as double-precision floating point (i think that is correct, been a while since i looked into the fundamental differences), which Stream and the 4k series did not have, so the emulation would need to do a lot more calculations to perform the same task, and even then it isn't guaranteed that the emulation would work with 100% compatibility.
> 
> Anyway i feel this is getting too far off topic, if you want to discuss this further open up a new thread and we can look into it in detail there.



4k does have double float etc, although oddly not all 5k do (the 5770 doesn't, for example)

but yeah - the hardware was totally different, theres no damn way.


----------



## cadaveca (Apr 12, 2010)

I didn't see anyone mention that it's not Phys-X that sucks really, but how developers implement it?

I mean, it's like DX11 is now...kinda useless, and just used for visual effects, rather than accelerating the existing app....

Any tech that serves as an "add-on" will be presented this way though. Open standards, or whatever, aren't going to make a difference until development houses actually work to have this stuff integrated into gameplay...and because the lowest common denominator is consoles, any sort of add-on to graphics is gonna be just that...an add-on.


----------



## bobzilla2009 (Apr 12, 2010)

cadaveca said:


> I didn't see anyone mention that it's not Phys-X that sucks really, but how developers implement it?
> 
> I mean, it's like DX11 is now...kinda useless, and just used for visual effects, rather than accelerating the existing app....
> 
> Any tech that serves as an "add-on" will be presented this way though. Open standards, or whatever, aren't going to make a difference until development houses actually work to have this stuff integrated into gameplay...and because the lowest common denominator is consoles, any sort of add-on to graphics is gonna be just that...an add-on.



Indeed, physX could be fantastic as an overlay for detail, but the fact that nvidia hold onto it like a safety blanket is going to cause its death. Tbh i expect the next console generation to be the end of consoles as a gaming medium for actual gamers and not casuals. I foresee a lot of wii clones (both the ps3 and the 360 are guilty of it already) and the PC will be left as the only real gaming platform. So that might lead to some decent games then ^^


----------



## newtekie1 (Apr 12, 2010)

human_error said:


> Again i'll have to correct you on this - it is not native support. It is emulation. Emulation is almost always inefficient and buggy when trying to run hardware-supported calculations on hardware which does not have those exact calculations supported. You would not get a very good result running CUDA on STREAM hardware (on the 4k series) because CUDA had things such as double-precision floating point (i think that is correct, been a while since i looked into the fundamental differences), which Stream and the 4k series did not have, so the emulation would need to do a lot more calculations to perform the same task, and even then it isn't guaranteed that the emulation would work with 100% compatibility.
> 
> Anyway i feel this is getting too far off topic, if you want to discuss this further open up a new thread and we can look into it in detail there.



No, it is native support, not emulation.  Emulation would be running CUDA inside of Stream, which wasn't what was happening.  CUDA was accessing the GPU directly to do the work.

Again, another beauty of CUDA, it can be adapted to run on pretty much anything.


----------



## Mussels (Apr 12, 2010)

cadaveca said:


> I didn't see anyone mention that it's not Phys-X that sucks really, but how developers implement it?
> 
> I mean, it's like DX11 is now...kinda useless, and just used for visual effects, rather than accelerating the existing app....
> 
> Any tech that serves as an "add-on" will be presented this way though. Open standards, or whatever, aren't going to make a difference until development houses actually work to have this stuff integrated into gameplay...and because the lowest common denominator is consoles, any sort of add-on to graphics is gonna be just that...an add-on.



PhysX itself can only do visual effects. it cant do 'interactive' physics - you know, you cant have the box you pick up with a gravity gun in half life 2 and squish people with it... you can only have 'visual effects' like more debris (that passes through everything) or flags/cloth that cant touch anything that isnt themselves.

combine that with the fact that they cant make it essential (if a game required hardware physX like say, warmonger... so few people can run it, that no one does)

the only way to make it do interactive physics is like what they did in the UT3 bonus maps - it REQUIRES hardware physX to run (thus making the amount of gamers that can use it, very damn little)


----------



## kid41212003 (Apr 12, 2010)

I have played a few games on PS3 that use PhysX.

PhysX is not the problem here - it's the GPU acceleration that's a PC-only thing.


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> No, it is native support, not emulation.  Emulation would be running CUDA inside of Stream, which wasn't what was happening.  CUDA was accessing the GPU directly to do the work.



then why haven't we seen anything other than a botched screenshot?


----------



## newtekie1 (Apr 12, 2010)

bobzilla2009 said:


> then why haven't we seen anything other than a botched screenshot?



Already explained, please try reading.


----------



## bobzilla2009 (Apr 12, 2010)

Mussels said:


> PhysX itself can only do visual effects. it cant do 'interactive' physics - you know, you cant have the box you pick up with a gravity gun in half life 2 and squish people with it... you can only have 'visual effects' like more debris (that passes through everything) or flags/cloth that cant touch anything that isnt themselves.
> 
> combine that with the fact that they cant make it essential (if a game required hardware physX like say, warmonger... so few people can run it, that no one does)



I didn't know that! PhysX really is crap then  it has no future. Interactive physics >>>>>>> pretties.


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> Already explained, please try reading.



so they couldn't make a video? or post some pictures that show the hd3870 is doing the work? nothing? at all? really? you sure? please try thinking instead of swallowing up what some website told you.

Also, judging by some of the comments by the guy making it, pretty much all companies were out to get him lol. ATi just didn't give them cards, it would seem. I can't seem to find the post saying AMD blocked driver development. But i've seen ones where nvidia were banning the site from posting nvidia drivers etc. Basically, no-one wanted physX on ATi cards.


----------



## cadaveca (Apr 12, 2010)

Mussels said:


> PhysX itself can only do visual effects. it cant do 'interactive' physics - you know, you cant have the box you pick up with a gravity gun in half life 2 and squish people with it... you can only have 'visual effects' like more debris (that passes through everything) or flags/cloth that cant touch anything that isnt themselves.
> 
> combine that with the fact that they cant make it essential (if a game required hardware physX like say, warmonger... so few people can run it, that no one does)



Yeah, I think this is its major downfall. Havok provides interactive physics, and so does DICE's BFBC engine, so there's really no appeal to Phys-X at all for me...I mean, I was one of those that bought the Ageia hardware when it first came out, and back then, those effects were something that seemed new, and innovative....

But now that it has been years, and nothing new has been offered...it's kinda stale.

If every system had the required hardware, things might change, but that will never happen, or it would have already, so Phys-X is a dead end, for me.

That said, I'm gonna buy a GTX480, and I'll be able to see the effects Phys-X brings...but that's NOT why I'm buying one.


----------



## [I.R.A]_FBi (Apr 12, 2010)

these bitchfests get annoying quite fast =\


----------



## kid41212003 (Apr 12, 2010)

Umm... err

You forgot the game called CellFactor that does that?

Even Bioshock, which uses CPU physics, doesn't need GPU acceleration to do that.


----------



## newtekie1 (Apr 12, 2010)

bobzilla2009 said:


> so they couldn't make a video? or post some pictures that show the hd3870 is doing the work? nothing? at all? really? you sure? please try thinking instead of swallowing up what some website told you.



Lets see, swallow what some website told me, or listen to some random forum user...

Yeah, I'll believe the website that provided proof and not you, sorry.



cadaveca said:


> Yeah, I think this is it's major downfall. Havoc provides interactive physics, and so does DICE's BFBC engine, so there's really no appeal to Phys-X at all for me...I mean, I was one of thsoe that bougth the Ageia hardware when it first came out, and back then, those effects were something that seemed new, and innovative....
> 
> But now that it has been years, and nothing new has been offered...it's kinda stale.
> 
> ...



PhysX can in fact do interactive physics.  However, most developers didn't implement it in this way because it would make the game drastically different on nVidia hardware than on ATi hardware.

Can you imagine a game that requires you to break down a wall or move a box or several boxes, and it works with nVidia but not ATi cards?  That game wouldn't work in the market.  So PhysX was limited to only eye candy effects.
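The split newtekie1 describes - cosmetic physics that must be safely removable on hardware without acceleration - is typically done with collision filtering. Here is a toy sketch (not the real PhysX API; all names are invented) of why effects-only objects can be dropped without changing gameplay: they live in a collision group that gameplay logic never tests against.

```python
# Toy illustration: decorative objects go in an EFFECTS collision group
# that gameplay objects never interact with, so removing them (e.g. when
# no accelerator is present) cannot change the outcome of the game.

GAMEPLAY, EFFECTS = "gameplay", "effects"

class Body:
    def __init__(self, name, group, x):
        self.name, self.group, self.x = name, group, x

def resolve_hits(bodies, effects_enabled=True):
    """Return the pairs of bodies that gameplay logic treats as colliding."""
    active = [b for b in bodies
              if effects_enabled or b.group == GAMEPLAY]
    hits = []
    for i, a in enumerate(active):
        for b in active[i + 1:]:
            # Gameplay outcomes only ever consider gameplay-group pairs;
            # debris and cloth are simulated but never consulted here.
            if a.group == GAMEPLAY and b.group == GAMEPLAY and a.x == b.x:
                hits.append((a.name, b.name))
    return hits

world = [Body("player", GAMEPLAY, 0), Body("crate", GAMEPLAY, 0),
         Body("debris", EFFECTS, 0), Body("cloth", EFFECTS, 0)]

# Same gameplay collisions whether or not the effects layer is simulated.
print(resolve_hits(world, effects_enabled=True))
print(resolve_hits(world, effects_enabled=False))
```

The moment debris could knock over a gameplay crate, the two code paths would diverge - which is exactly the market problem described above.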


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> Lets see, swallow what some website told me, or listen to some random forum user...
> 
> Yeah, I'll believe the website that provided proof and not you, sorry.



I'm sorry  i guess i don't take one picture as proof, especially when the excuses for not making videos were:

'our camcorders aren't very good'

http://www.ngohq.com/news/14219-physx-gpu-acceleration-on-radeon-hd-3850-a-2.html

halfway down the comments. But they never needed a camcorder - they could have recorded a desktop video and provided far more compelling proof.

I'm sorry for not being so gullible  but it took me about 10 minutes of happily reading the comments about a year ago to realize that it was just a hoax or a sham. It would have been nice if it weren't though. Still, it would never have been official, so it never would have achieved anything. I can live without a few little moving pieces in about 2 of the games i own without having to pay an extra £150 for the same performance.


----------



## Mussels (Apr 12, 2010)

bobzilla2009 said:


> I'm sorry  i guess i don't take one picture as proof, especially when the excuses for not making videos were:
> 
> 'our camcorders aren't very good'
> 
> ...



http://www.ngohq.com/images/physxonradeon.jpg

I tell ya what i see here.

I see a user who ran a 3850 at the same time as an nvidia card for physx, and then edited a single word in his screenshot from nvidia to radeon.


----------



## newtekie1 (Apr 12, 2010)

bobzilla2009 said:


> I'm sorry  i guess i don't take one picture as proof, especially when the excuses for not making videos were:
> 
> 'our camcorders aren't very good'
> 
> ...



Or you are just more willing to believe whatever makes ATi look better...

So I guess you are calling all the reporters from the multiple big name websites that interviewed him and nVidia's spokesmen gullible also.



Mussels said:


> http://www.ngohq.com/images/physxonradeon.jpg
> 
> I tell ya what i see here.
> 
> I see a user who ran a 3850 at the same time as an nvidia card for physx, and then edited a single word in his screenshot from nvidia to radeon.



I like how at first you are like, ATi was right for not allowing it. And once your arguments didn't really work out to prove that point, you switch to, well it probably wasn't real anyway...

Whatever...we're just going around in circles here anyway arguing about a technology that doesn't really matter anyway because it is barely used, and won't be used anymore.


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> Or you are just more willing to believe whatever makes ATi look better...
> 
> So I guess you are calling all the reporters from the multiple big name websites that interviewed him and nVidia's spokemen guilible also.



The reporters are just reporting what they were told to report, it's their job. Also, who trusts spokesmen for any company without any kind of real proof from other sources that aren't just reporting the garbage that was initially claimed? 

It's like when nvidia was initially showing the gtx480 to be super duper awesome, we all thought 'coolio!' but we all were skeptical (well, mostly) and rightly so. If AMD did that I'd be just as skeptical. Since it's been 2 years without any significant proof, I quite happily say it's all crap.

But whatever you think, it cannot be conclusively proved, nor disproved (you can't disprove something that never existed to someone who believes it did). We are in religion debate territory here. Therefore further debate is entirely pointless. You may as well be saying unicorns are real; the argument is equally valid because there are pictures too.






Therefore they exist, obviously.

Anyways, i think we should start hoping for better physics in games when we actually have crysis level graphics as standard


----------



## Mussels (Apr 12, 2010)

newtekie1 said:


> Or you are just more willing to believe whatever makes ATi look better...
> 
> So I guess you are calling all the reporters from the multiple big name websites that interviewed him and nVidia's spokemen guilible also.
> 
> ...



i never said ATI was right for not allowing it - i said the entire thing about nvidia offering it for free was made up bullshit. There was simply no way it was ever going to happen like the rumours with zero factual backing stated, just like the crap you're making up about CUDA running on ATI.


----------



## human_error (Apr 12, 2010)

2 minutes in Paint results in this 'proof' that i have physx running on my 5970:


http://img.techpowerup.org/100412/Capture.png


Reasons for the discrepancies: "Now of course gpu-z didn't pick me up running physx because it wouldn't look for it on ATi cards", "I can't take a video because my camera isn't very good", "I can't distribute it because ATi sent me a nasty letter threatening legal action if i release it" etc

Screenshots mean nothing - unless we have reputable people confirming it running plus videos or a version to try ourselves then it is all FUD.


----------



## newtekie1 (Apr 12, 2010)

Mussels said:


> i never said ATI was right for not allowing it - i said the entire thing about nvidia offering it for free was made up bullshit. There was simply no way it was ever going to happen like the rumours with zero factual backing stated, just like the crap you're making up about CUDA running on ATI.



Now _I'm_ making it up...ok...

I'm just posting what has been said by others, I'm not making a thing up.  But I guess trying to invalidate the poster personally is a rather common resort when your argument is failing...

I guess the true test would have been for ATi to allow it and see what happened, but alas we didn't see that, so we can't really know.

And as I said, it is kind of pointless to keep arguing about it, since as pointed out in the first post, PhysX is pretty much unused at this point, and won't likely be used for anything major in the future.

Have a nice day, y'all. I'm going to play some games. Fuck PhysX.


----------



## bobzilla2009 (Apr 12, 2010)

human_error said:


> 2 minutes in paint results in this proof i have physx running on my 5970:
> 
> http://img.techpowerup.org/100412/Capture.png
> 
> ...



omg! physX running on a hd5970!!!! when are you going to release the drivers!


----------



## Mussels (Apr 12, 2010)

newtekie1 said:


> Now _I'm_ making it up...ok...
> 
> I'm just posting what has been said by others, I'm not making a thing up.  But I guess trying to invalidate the poster personally is a rather common resort when you argument is failing...



Dude... you fell for a fake screenshot. If i go around telling people that the sky is falling, i quote the source - and when that source turns out to be the town drunk in the middle of a hailstorm, well gee, i dunno... maybe its fake?

please stop threadcrapping about something that everyone but you knows is not true.


----------



## newtekie1 (Apr 12, 2010)

Mussels said:


> Dude... you fell for a fake screenshot. If i go around telling people that the sky is falling, i quote the source - and when that source turns out to be the town drunk in the middle of a hailstorm, well gee, i dunno... maybe its fake?
> 
> please stop threadcrapping about something that everyone but you knows is not true.



The screenshot is one thing, which I agree could be faked easily.  The verification by nVidia, and the statements from both him and nVidia afterward...thats something else and more to my point.

Why does everyone assume everything I've said is based solely on one screenshot?  Did none of you read the article with statements from nVidia saying they supported development of CUDA for ATi hardware?


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> The screenshot is one thing, which I agree could be faked easily.  The verification by nVidia, and the statements from both him and nVidia afterward...thats something else and more to my point.



But the guys never showed anything for it at all, ever. So your evidence is a picture and some nvidia guys that were highly likely to be looking to tarnish AMD's reputation with the hd4 series being so well received by gamers. Hardly compelling proof.


----------



## Mussels (Apr 12, 2010)

newtekie1 said:


> The screenshot is one thing, which I agree could be faked easily.  The verification by nVidia, and the statements from both him and nVidia afterward...thats something else and more to my point.



No it isnt.

he made an nvidia card run as a physX only card with a primary ATI video card.

Nvidia 'hired' him.

Hmm, what event transpired recently? oh yeah, you can no longer run physx with any ATI cards detected in your system.

Hmmm, my version sounds far more realistic than yours.

If you remove that ONE word from the physX panel*, it looks just like when you pair an nvidia and ATI card together.




*which is so clearly god damned fake, i mean c'mon - did nvidia code it so that it changes to RADEON when ATI cards are detected? did they plan ahead for that? or did this guy reverse engineer the entire program/steal the source code?


----------



## newtekie1 (Apr 12, 2010)

bobzilla2009 said:


> but the guys never showed anything for it at all, ever. So your evidence is a picture and some nvidia guys that were highly likely to be looking to tarnish AMD's reputation with the hd4 series being so well received by gamers. Hardly compelling proof.



That makes a big assumption that nVidia would be willing to slander ATi, and then ATi would be willing to sit back and take it without a word...and this is after the AMD merger and we all know how AMD doesn't exactly like to let the competition get away with that shit...

Anyway, I don't know why I'm still fucking posting.  Again, fuck PhysX.  It is worthless to even argue about.  I'm done, let the thread die, because PhysX is pointless.


----------



## bobzilla2009 (Apr 12, 2010)

newtekie1 said:


> That makes a big assumption that nVidia would be willing to slander ATi, and then ATi would be willing to sit back and take it without a word...and this is after the AMD merger and we all know how AMD doesn't exactly like to let the competition get away with that shit...



Maybe they were too busy with intel at the time. Still, it never happened, it was all FUD. If it did happen, then the unicorn exists! The reason why you're posting is that you can't back down!


----------



## Mussels (Apr 12, 2010)

do we even know it was posted/stated by nvidia, and that wasnt faked? 



CEO Of Nvidia said:

> Fermi is made of rainbows, and physX runs on ATI



now if i put that on the news page, it doesnt suddenly make it real.


----------



## bobzilla2009 (Apr 12, 2010)

Mussels said:


> do we even know it was posted/stated by nvidia, and that wasnt faked?
> 
> 
> 
> now if i put that on the news page, it doesnt suddenly make it real.



But fermi IS made of rainbows. It's the reason why it takes so much power to run.


----------



## Wile E (Apr 12, 2010)

TheMailMan78 said:


> Well I didn't know that so few games supported hardware acceleration. I was under the impression all Physx supported games used hardware acceleration. Hell I was shopping for a 9800 to improve on these said games. Mussels just saved me some money.


Which is great, but it still doesn't change the fact that it's worded in a manner that suggests this is a thread to just bash a product.


Mr McC said:


> I strongly disagree. Physics acceleration should be an open standard. If we support the solution of a particular company, Nvidia in this case, or applaud advances made in this regard, we are helping to bring about a situation that will not benefit consumers, in fact, quite the contrary. I do not want to have to ask if a game is an ATI or an Nvidia game before I purchase. Take the Batman debacle: some people say that Nvidia paid for the development, whereby the ATI AA lock out is justified. I cannot comment on that, but what I can say is that Eidos should never have provided Nvidia with the opportunity of taking charge of this area of the game. Obviously this saved them development costs, but how many sales did it lose them because ATI owners eschewed what in their eyes was a game deliberately crippled for their system? If ATI takes this approach, and you seem to suggest that ATI should take a similar approach, we arrive at the aforementioned situation of having to ask whether a game is an "ATI game" or an "Nvidia game" prior to purchase. I think we have enough problems without that sort of bullshit.
> 
> *Moreover, Nvidia seems to be their own worst enemy when it comes to PhysX: if you want universal support, why disable or hinder a supplementary PhysX card when an ATI card is detected. I am not aware of what percentage of gamers use an ATI 48xx/58xx, but it appears that sales have been good. Nvidia have alienated a large portion of the gaming community with this idea and have probably lost a considerable number of sales to users who just had to have PhysX in those 16 essential titles.
> *
> Should ATI do more to further physics acceleration? Undoubtedly, but as an open standard. I am neither in favour of ATI or Nvidia, but I would be glad to see PhysX die off and be replaced by cpu physics acceleration. After all, all those wasted cores in our latest and greatest rigs might as well be put to some use when we play the most recent console port, don't you think?


Now that is where nVidia clearly went south with this. It was a stupid move on their part.

And Physx being proprietary isn't the problem here. Havok is proprietary as well. Hell, most physics engines are proprietary. The problem is that Physx only runs in CUDA. If nVidia would port it over to OpenCL or DirectCompute, everything would be fine. Again, their own stupid mistake.


Mussels said:


> however, what if they have things such as "you cant put the physX logo on your hardware boxes without paying us."
> 
> PhysX is free, the logos are not - people dont buy a video card for physX if its not on the box - what looks free and happy on the surface, is really just a marketing ploy.


Informed buyers know better. That's just marketing anyway, and there's nothing wrong with that tactic. When eVGA first started making X58 boards, they didn't openly advertise Crossfire, yet people still bought them for Crossfire setups.


Mussels said:


> they did that. multithreaded CPU's are faster than physX via cuda.


Not faster than a dedicated Physx card with a dedicated render card. Besides, the author of the bench even said it wasn't optimized.


Mussels said:


> we're not arguing that theres a good percentage of users that can use physX - i was just showing the numbers. ATi 4K cards are the most popular individual cards atm (and remember that the nvidia cards are lumped together too - 8800 series counts 8800GT, GTX, GS, GTS (G80 and G92 variants) whereas the ATI version is just four cards - 4830, 50, 70 and 90.
> 
> looking at it in its simplest terms: only 20% or so of those people have an nvidia card powerful enough to run PhysX (since it slows down 90% of the 15 games its in, when enabling it). thats still 80% of the market who cant USE this feature, and a large percentage of those are ATI users.


80% is stretching it a bit, especially considering that particular poll only covers a small group of gamers to begin with. But I get the overall point you are trying to make.


Mussels said:


> nvidia doomed physX when it locked out ATI users with nvidia/Ageia cards, becuase it did the one thing you should never do: cut your marketshare down.


That I can agree on. But then, it will take care of itself as devs use it less and less. We don't need such a negatively biased thread to tell us that much. The market will work itself out.


Mussels said:


> newtekie: ATI stream is ATI's version of cuda.
> 
> They cant run cuda, so anything running, would have to run on stream...
> 
> are you seriously this in-depth in the discussion, with no idea what ATI stream is, or how its different to CUDA?


It wouldn't have to run on Stream. What about the aforementioned DirectCompute or OpenCL? And why are you even OK with Stream? It's proprietary tech as well, the exact same type of thing you are blasting nVidia and PhysX/CUDA for.


Mussels said:


> i do get sick of people quoting one third party website which released nothing than a few possibly faked screenshots... i mean shite, the screenshots showing an ATI video card with boosted scores from physX was possible with an ATI and nvidia card in the same system, could have been as lame a hoax as that.


And I'm equally sick of people accusing nVidia of purposely crippling ATI in TWIMTBP games with absolutely no proof.


Mussels said:


> PhysX itself can only do visual effects.* it cant do 'interactive' physics* - you know, you cant have the box you pick up with a gravity gun in half life 2 and squish people with it... you can only have 'visual effects' like more debris (that passes through everything) or flags/cloth that cant touch anything that isnt themselves.
> 
> combine that with the fact that they cant make it essential (if a game required hardware physX like say, warmonger... so few people can run it, that no one does)
> 
> the only way to make it do interactive physics is like what they did in the UT3 bonus maps - it REQUIRES hardware physX to run (thus making the amount of gamers that can use it, very damn little)


Yes, it can do interactive physics. That debris interacts with the environment. It's up to the devs to get that detailed, but it's all there in the API.

Summary: PhysX and TWIMTBP are not the problem. PhysX is a very powerful API, but it is not used to its full potential, due to nVidia's choice to only allow accelerated PhysX to run on CUDA, when it could easily be ported over to OpenCL or DirectCompute. This is clearly nVidia's fault, but it's their choice to make, and they will suffer the consequences for it.

What we don't need is a thread that's clearly dedicated to bashing nVidia. Everything after the picture in the OP is clearly PhysX and nVidia bashing. It was a perfectly good, informative post up until then. Adding a PS claiming it's not a bashing thread doesn't magically make derogatory comments not derogatory. Derogatory comments = bashing, period.

My problem with this thread is twofold:

The first problem is not with trying to show how unimportant PhysX should be as a buying decision; my problem is with the way the OP chose to word his post.

The second problem is all the misinformation going around in this thread about Physx and TWIMTBP. I'm not defending hardware Physx, so don't take it that way. NV could clearly go about this in a better way. I am just correcting the large amount of misinformation.


----------



## cadaveca (Apr 12, 2010)

Wile E said:


> Yes, it can do interactive physics. That debris interacts with the environment. It's up to the devs to get that detailed, but it's all there in the api.



Of course... CellFactor, which shipped with the Ageia cards, shows this perfectly, as does the GRAW2 level made just for PhysX...


And what you just posted kinda mimics my original post... my point about it not being interactive was due to hardware limitations on most purchased platforms... and that the only way I see it taking off would be if there were dedicated hardware in one of the consoles... which is not likely to happen any time soon.


Sorry if my post kinda detracted from that.


I personally feel that until developers make PhysX-capable hardware necessary for their titles, it will never take off. Metro 2033 has very high recommended specs... that didn't stop them from releasing, so no other developer can cry "not enough user base"... that's just a cop-out, to me.

nVidia killed Ageia cards with their first released PhysX driver. Now titles that ship with PhysX also break this functionality... and the games WILL NOT WORK RIGHT with the older Ageia PhysX API. That needs to change. They also need to change the hardware restrictions when running ATI cards... many ATI users would buy mid-level nV cards if PhysX would actually work without hacking. Then the user base would grow...


----------



## crazyeyesreaper (Apr 12, 2010)

And yet, even then, if what the FluidMark creators state is true, a Q9400 can do the same PhysX calculation and be on par with a 275 + 250 for PhysX. So if PhysX ran on the CPU, and was multithread-aware and actually USED IT, we wouldn't need dedicated hardware at all. Face it, PhysX is a clusterfuck situation where everything that could go wrong in its adoption has, in terms of use and adoption rate. Locking it to 1-2 cores on a CPU is ignorant. Again, if a Q9400 can nearly rival a GTS 275 + GTS 250, how well will a Core i7 handle PhysX calculations? My guess is that with a decent CPU, and PhysX multi-core aware, it would give SLI or Crossfire users the same performance that takes 2 GPUs + 1 more for PhysX now.
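[Editor's note] The claim above — that a multithread-aware CPU path could keep up — rests on physics steps being embarrassingly parallel. A toy Python sketch of splitting one integration step across cores (illustrative only; a real engine would do this in native code, and Python's GIL means this demonstrates the work-splitting, not a measured speedup):

```python
# Sketch: each particle's update is independent, so a quad-core CPU
# can split the step four ways instead of being pinned to 1-2 threads.
# Particle = (position, velocity) tuple; names are made up.
from concurrent.futures import ThreadPoolExecutor

def integrate_chunk(chunk, dt):
    # One worker's share of the particles: x += v * dt
    return [(x + v * dt, v) for x, v in chunk]

def step_multicore(particles, dt, cores=4):
    # Split the particle list into roughly equal chunks, one per core.
    size = max(1, len(particles) // cores)
    chunks = [particles[i:i + size] for i in range(0, len(particles), size)]
    with ThreadPoolExecutor(max_workers=cores) as pool:
        results = pool.map(integrate_chunk, chunks, [dt] * len(chunks))
    # Reassemble the full particle list in order.
    return [p for chunk in results for p in chunk]
```

Because no particle's update depends on another's, the chunks can run in any order and still reassemble into the same result — which is exactly why "locked to 1-2 cores" is a software choice, not a requirement of the math.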


----------



## TheMailMan78 (Apr 12, 2010)

Maybe I am simplifying things a bit much but I have a question. I have asked it before but never got a straight answer.

Why would company (A), a competitor to company (B), go out and buy company (C) and then offer its technology to its competitor, company (B), for free?

In other words....

Why would Nvidia buy out Ageia and then try to give the technology to ATI for free?


----------



## cadaveca (Apr 12, 2010)

Because in order to get wide-spread adoption, they'd have to?

I know it seems foolish, but sometimes you must invest to see returns, and opening up PhysX to ATI would be seen as such... an investment.

But then, AMD would have to properly support it, requiring many software engineers and other extra expenses... with the driver issues they have, I cannot see that as a practical situation for AMD currently. AMD would only take PhysX without licensing, and without having to provide software support.


----------



## [I.R.A]_FBi (Apr 12, 2010)

This reminds me of banks in Jamaica. They offered ABM use for free, got everyone there, then piled on the charges...


----------



## RejZoR (Apr 13, 2010)

PhysX is used for visuals only because you can't base gameplay on a feature that is supported in only half of computers. If you would otherwise have sold, I don't know, 100,000 copies of the game, you'd only sell 50,000 because of this. Who on earth would do that? No one. But if PhysX could work on all graphics cards, devs would certainly think about making it a core element. That will not happen with PhysX, though, because NVIDIA is arrogant and stupid.

If the info is right, Havok is on the path to GPU acceleration (again), and AMD's Bullet is also in view. They will both work on all OpenCL/DirectCompute hardware. NVIDIA will support it through CUDA, as that's their OpenCL base platform; AMD will support it as a native API, or maybe through ATI Stream. So that's a win-win. I just want them to move their asses a bit faster. This is not going anywhere, and I want some real hardcore physics before I get old and die, for Christ's sake.


----------



## phanbuey (Apr 13, 2010)

PhysX is and has always been a huge gimmick IMO. The only reason Nvidia bought Ageia out is that they had been seriously looking at GPGPU since the early 7 series, and Ageia basically had a vector processing card that was easier to program than a GPU.

They bought them out, used their developers, and voila... CUDA. PhysX is nothing more than a side effect of CUDA at this point, which is meant for HPC and not gamers. It just happens that it's really easy to use the API that you already bought, and use a brand name that was already there, to try and sell more cards. I don't think nvidia would really give a damn if PhysX went under, but they would definitely care if Stream became more popular than CUDA.

AAAAnnnnd if PhysX can run on ATI cards... then so can CUDA! With a little tweaking, a 2 TFLOP card would decimate a 600 GFLOP card, and nvidia is f***ed. So yeah, I'm sure if they hired that guy it was more to stop the bleeding early than anything else.


----------



## newtekie1 (Apr 13, 2010)

phanbuey said:


> PhysX is and has always been a huge gimmick IMO.  The only reason Nvidia bought Aegia out, is that they have been seriously looking at GPGPU since the early 7 series, and Aegia basically had a vector processing card that was easier to program than a GPU.
> 
> They bought them out, used their developers, and voila... CUDA.  PhysX is nothing more than a side-effect of CUDA at this point, which is meant for HPC and not gamers.  It just happens that its really easy to use the API that you already bought, and use a brand name that was already there to try and sell more cards.  I dont think nvidia would really give a damn if PhysX went under, but they would definitely care if Stream became more popular than CUDA.
> 
> AAAAnnnnd if PhysX can run on ATI cards... then so can CUDA!  with a little tweaking, a 2tflop card would decimate a 600Gflop card and nvidia isf***ed.  So yeah, im sure if they hired that guy it was more to stop the bleeding early than anything else.



Considering CUDA was initially released in February '07 and nVidia didn't acquire Ageia until February '08, I think your facts are a little wrong... :shadedshu

They bought out Ageia entirely for PhysX, and nothing more. They already had a GPGPU platform, which is what allowed them to adapt PhysX in a few short months to run on their graphics cards.

And as for the flop counts of the cards, it has been shown over and over again that, despite ATi cards having a much higher flop count, the ATi card does not outperform the nVidia cards.


----------



## bobzilla2009 (Apr 13, 2010)

newtekie1 said:


> Considering CUDA was initially released in February '07 and nVidia didn't aquire Ageia until February '08, I think your facts are a little wrong...:shadedshu
> 
> They bought out Ageia entirely for PhysX, and nothing more, they already had a GPGPU platform, which is what allowed them to adapt PhysX in a few short months to run on their graphics cards.
> 
> And as for the flop counts of the cards, it has been shown over and over again, that despite ATi cards having a much higher flop count, the ATi card does not outperform the nVidia cards.



It depends on the situation. In most calculations that would be relevant to gaming or physics, the ATi cards always beat the nvidia cards. Raw computing power = raw computing power. Unsurprisingly. ATi just never really made a way to exploit that power like nvidia have been trying to do with CUDA and PhysX, but ATi are getting there. Overall, AMD has always had the better hardware for calculation (and in terms of transistors/performance almost certainly still does), but nvidia have always had the better software, like CUDA.

The GTX 480 probably does better than the HD 5870 now in such things from an overall standpoint, but I haven't bothered to check up on those technical reviews yet.
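[Editor's note] The flops debate running through the next several posts reduces to one line of arithmetic: peak throughput only counts in proportion to how well the workload fills the chip. A toy calculation with made-up numbers:

```python
# Toy arithmetic behind the flops debate: peak throughput only matters
# multiplied by how much of the chip the workload actually keeps busy.
# All numbers are illustrative, not measurements of any real card.

def effective_gflops(peak_gflops, utilization):
    return peak_gflops * utilization

# A "2 TFLOP" card that a workload drives at 25% utilization...
wide_card = effective_gflops(2000, 0.25)   # 500 effective GFLOPS
# ...can lose to a "600 GFLOP" card the same workload drives at 90%.
narrow_card = effective_gflops(600, 0.90)  # 540 effective GFLOPS
```

This is how both sides of the argument can be right: the raw numbers favor one design, the achieved numbers favor the other, and the gap is utilization (which is partly hardware layout, partly software like CUDA).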


----------



## phanbuey (Apr 13, 2010)

newtekie1 said:


> Considering CUDA was initially released in February '07 and nVidia didn't aquire Ageia until February '08, I think your facts are a little wrong...:shadedshu
> 
> They bought out Ageia entirely for PhysX, and nothing more, they already had a GPGPU platform, which is what allowed them to adapt PhysX in a few short months to run on their graphics cards.
> 
> And as for the flop counts of the cards, it has been shown over and over again, that despite ATi cards having a much higher flop count, the ATi card does not outperform the nVidia cards.



Ah, ok... so I was wrong about the gimmick; got my timing mixed up.

BUT... my point about ATi running PhysX and running CUDA at the same time is still valid. Since, like you said, they run PhysX through CUDA now.

Yeah, the flops aren't the only thing that determines _graphics_ performance; there are many other factors. But if you have a _computing_ platform, then the story may be different.

Sure, Nvidia cards make up for it in scheduling efficiency and more sophisticated shader design, but *and this is purely speculative* a CUDA compiler could, theoretically, be tweaked to compensate for the weaknesses of the ATi card and unleash some of the massive power that these cards hold. Which would be a VERY bad situation for nvidia.


----------



## poo417 (Apr 13, 2010)

I have used a GTX 260 for PhysX in the past, then taken it out and put it in a work computer that folds all day long. It was of no great use, and it was a pain to get a new hack every time new drivers were installed. Nvidia don't even have to give ATI PhysX; they just need to release drivers to run their own cards as a PhysX card. Ageia could run them with no problems. If PhysX were actually useful in a decent number of games and worked as an add-on card, a lot of ATI people would probably get one. Sadly, nvidia can't put their ego out of the way, and insist on their cards being used as the primary card.


----------



## Edito (Apr 13, 2010)

Summarizing: the devs these days only care about money. Can you please count the DX10 and DX11 games? You won't find many, and it will stay that way because they just want money. They won't use PhysX, and we won't see the greatness of DX11...

But please don't tell me that PhysX doesn't make a difference, because it does in a few games and benchmark programs...

Don't get me wrong because of this; it's just the way I see it...


----------



## HalfAHertz (Apr 13, 2010)

+1 the last few AAA games have been pretty lackluster

The latest PhysX title is Metro 2033. Now, I don't have any hands-on experience, but from the screenshots and gameplay videos I've watched, neither PhysX nor DX11 adds anything special to the experience except tanking the fps...


----------



## Wile E (Apr 14, 2010)

bobzilla2009 said:


> It depends on the situation, in most calculations that would be relevant to gaming or physics calculations, the ATi cards always beat the nvidia cards. Raw computer power = raw computing power. Unsurprisingly. ATi just never really made a way to exploit the power like nvidia have been trying to do with CUDA and physX, but ATi are getting there. Overall, AMD has always had the better hardware for calculation (and in terms of transistors/performance almost certainly still does) but nvidia have always had the better software like CUDA.
> 
> The gtx480 probably does better than the hd5870 now in such things on an overall standpoint, but i haven't bothered to check up on those technical reviews yet



ATI does not have better hardware for calculations. I think Folding@Home demonstrates that pretty clearly. ATI has actually been able to run F@H for much longer than NV, and has had way more time to optimize, as has Stanford, but the NV cards decimate the ATI cards, and have from day one. A lowly 8800GT out-folds my 4870X2, for instance. These are the same type of floating point calculations you would see in physics models as well.


----------



## cadaveca (Apr 14, 2010)

Sure, ATI F@H sucks... I get the same points on my 2900XT as I do on my 4870s and the 4890s... all get around 3600 points per day... and all are @ 850 MHz (nV's "CUDA cores" run at twice the engine clocks). There's a reason there's no F@H performance... ATI doesn't care about that right now. Like tessellation, ATI started it, and nV hopped on later, once ATI had built interest.

Mike Houston's response on poor ALU utilization:



> Smaller proteins don't generate enough parallelism to fill the chip.
> 
> --------------------------------------------------------------------------------
> 
> ...



http://forum.beyond3d.com/showthread.php?p=1419102


----------



## Wile E (Apr 14, 2010)

cadaveca said:


> Sure, ATI FAH sucks...I get the same points on my 2900XT as I do on my 4870's, and teh 4890's...all get around 3600 points per day...and all are @ 850 mhz(nV's "CUDA CORES" are twice the engine clocks). There's a reason there's no FAH performance...ATI doesn't care about that right now. Like tesselation, ATI started it, and nV hopped on later, once ATI had built interest.
> 
> Mike Houston's resopnse on poor ALU utilization:
> 
> ...



Yeah, but the point still remains, ATI's hardware layout holds them back in these calculations in F@H. The calculations needed for physics are very similar. I don't see ATI being any faster than nVidia in physics calcs any time soon. There are only very specific situations that ATI can actually hit their rated throughput.


----------



## cadaveca (Apr 14, 2010)

Actually, according to Mike, it's Brook+ that is holding them back... the OpenCL client isn't fully working yet. He said all this last week, BTW.

He also mentioned that on large WU's, the performance is nearly identical.



> As far as the older client, my post explaining some of the differences in implementations was already quoted in this thread (soon to be a break-off thread?), so I won't rehash too much. No, the client isn't artificially limited to a certain number of stream processors. However, smaller work units will not fully utilize newer chips. In the GPU2 era, AMD and Nvidia run very different algorithms achieving the same result. As I have stated before, and the data in in the publications, for really large proteins the performance distance closes much more, however, really large proteins are also really hard on the system (UI can get really laggy). Nvidia's implementation and hardware had an advantage on smaller proteins ("narrower" machine with higher ALU clocks) and we were more competitive with bigger stuff ("wider" machine with lower ALU clocks), despite an algorithmic disadvantage. We couldn't support some of the other algorithmic variants all that well on R6XX that may have performed better. *R7XX was more flexible, but still had restrictions that made it tricky and the Brook model was showing it's limits.*




All that said, physics is highly parallel, yet the nV cards excel in F@H despite NOT being as parallel-computation focused.
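[Editor's note] Houston's "smaller proteins don't generate enough parallelism to fill the chip" point can be sketched as a simple occupancy calculation (lane counts are illustrative, not real specs for either vendor):

```python
# Sketch of the "small proteins don't fill the chip" point: a wider
# chip needs more independent work items before it reaches full
# utilization, so on small workloads the narrow, higher-clocked chip
# can win despite a lower peak. Numbers are illustrative only.

def utilization(work_items, lanes):
    # Fraction of the chip's lanes that have work to do (capped at 1).
    return min(1.0, work_items / lanes)

narrow_fast = 128   # fewer ALUs at a higher clock (NV-style, roughly)
wide_slow = 800     # many ALUs at a lower clock (ATI-style, roughly)

small_protein = 200  # independent work items in a small work unit
# The narrow chip is saturated, while the wide one mostly idles:
# utilization(small_protein, narrow_fast) == 1.0
# utilization(small_protein, wide_slow) == 0.25
```

This also matches his claim that on really large proteins (tens of thousands of work items) the gap closes: once both chips are saturated, the wider one's higher peak starts to count again.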


----------



## Wile E (Apr 14, 2010)

cadaveca said:


> Actually, according to Mike, it's Brook+ that is holding them back...the OpenCl client isn't fully working yet. HE said all this last week, BTW.
> 
> *He also mentioned that on large WU's, the performance is nearly identical.*



He's also lying. The 8800GT still spanks my 4870X2, even on the large units.


----------



## cadaveca (Apr 14, 2010)

Wile E said:


> He's also lying. The 8800GT still spanks my 4870X2, even on the large units.



And your 4870x2 is getting the same points that my 2900XT does. Your point? It's mostly about engine speed, and the nV cards have much higher core speeds than ATI...


And THAT is why nV is better for phys-X type stuff...raw engine speed...


----------



## Wile E (Apr 14, 2010)

cadaveca said:


> And your 4870x2 is getting the same points that my 2900XT does. Your point? It's mostly about engine speed, and the nV cards have much higher core speeds than ATI...
> 
> 
> And THAT is why nV is better for phys-X type stuff...raw engine speed...



Wait, so we are both saying the same thing? I was only saying that NV is better at these calculations. Why does not matter.


----------



## cadaveca (Apr 14, 2010)

The REASON they are better is all-important. It's both hardware and software that are holding the ATI cards back in F@H... it's not the GPU shader design, in what math is available, but just raw speed.


----------



## Wile E (Apr 14, 2010)

cadaveca said:


> The REASON they are better is all-important.



It absolutely does not matter at all. All that matters is results, period.


----------



## cadaveca (Apr 14, 2010)

Maybe for you, but I like to know more than just the answer. Maybe the question, and the route to the answer, is what keeps me interested.

The OpenCL F@H client has been in the works for years. When it comes out, that performance difference won't be so noticeable... but it's taking the small team @ Stanford a lot of time to get it out there. Plain and simple, Mike admitted that AMD didn't have the resources to push F@H any further right now. It's all there in that thread I linked above...


Point of fact... although the algorithm is based on the same base material, the clients for either GPU work very differently, and as such are not even running the same code, so the results from F@H are not comparable. It's almost like comparing 3DMark06 to Vantage... Futuremark made them both too... so the numbers they generate are 100% comparable, right? Uh, no.

EDIT:

I mean, I get what you are on about here, and sure, nV is good at some things, but then again, they are not so good at others. This is going to constantly change as time goes on... it always does.

The point of the thread here, I think, is that like many others... he sees the potential of PhysX, but at the same time is frustrated by the roadblocks preventing it from going further... because really, they are all asinine.

nV being better at PhysX has nothing to do with it... nor being better @ F@H... that's not the problem with PhysX. They could be 5000x faster than ATI, and it wouldn't help.


----------



## Wile E (Apr 14, 2010)

cadaveca said:


> Maybe for you, but I like to know more than just the answer. Maybe the question, and the route to the answer, is what keeps me interested.
> 
> The OpenCL FAH client has been in the works for years. WHen it comes out, that performance difference will be not so noticible...but it's taking the small team @ stanford alot of time to get it out there...plain and simple Mike admitted that AMD didn't have the resources to push FAH any further right now. It's all there in that thread I linked above...
> 
> ...


No, it's not like comparing 06 and Vantage. 06 and Vantage aren't calculating the same things on both benches. Both ATI and NV calculate the same things in F@H. Different ways of calculating the same thing, but calculating the same things nonetheless. Completely broken analogy. It's more like comparing Vantage scores between ATI and nVidia (which we do, btw). Different means to the same ends.

As far as performance claims with the OpenCL client, I'll believe it when I see it. Both companies pull numbers out of their asses prior to products releasing. I think he is flat out lying and overstating things. ATI has let NV crush them for far too long in F@H for me to believe it's simply a software issue. He is making excuses.

nVidia is just better suited for these calculations.

Again, all that matters is the results, not how you get there. Customers don't care why their card gets 60 fps in their favorite games, only that it does.

And I didn't once use this as a basis as to why Physx is doing poorly. I already know why it's doing poorly, I was simply commenting on his claim that ATI hardware is better for these calculations.


----------



## cadaveca (Apr 14, 2010)

Yes, of course. And I know it was a bad analogy... but that's the best I got. I guess 05 and 06 would have been more prudent... but not really. heh.



> Again, all the matters is the results, not how you get there. Customers don't care why their card gets 60 fps in their favorite games, only that it does.



Sure. But those customers aren't visiting these forums... and many of us here are here because we were told one thing, spent money to get that thing, got another, and now we want to know why.

Personally, like I said, it's not just the answer. I mean sure, fastest card, single gpu..GTX480.

But...I can get a 5970 for about the same cost, and get the same performance. Same power usage, almost, too. Screw that..I can get better.

So what do you choose? You have to weigh the other features...and in this situation, Phys-X is one of those features.

A feature that, in the end, is hurting the consumer. I won't take that 60 FPS if it's gonna make future games suck.

While that may not matter to you, I bet it matters to others. In fact, I KNOW it does, or this whole issue wouldn't be one that drums up such attention each and every time.

You sound like Mr nV himself...



> ATI has let NV crush them for far too long in F@H for me to believe it's simply a software issue. He is making excuses.






> Again, all the matters is the results, not how you get there.



heh. No harm intended, dude, but that really does sound like something he'd say.




> I already know why it's doing poorly, I was simply commenting on his claim that ATI hardware is better for these calculations.



Well, really, you don't know for sure, do you? I mean, you must want to move to Africa so you can turn those American bucks into millions of rand. All that matters is the end numbers, right? Not how ya got there?

I don't work like that. I'm trying hard not to be a smart ass about it, but I'm finding it kinda hard, so I apologize right now. I'm so not a competitive person, so anything that kinda relates to that kinda makes no sense to me. I'm not a teenager any longer... maybe I need some testosterone pills...


----------



## Wile E (Apr 14, 2010)

cadaveca said:


> Yes, of course. And I know it was a bad analogy..but that's the best I got I guess 05 and 06 would have been more prudent...but not really. heh.
> 
> 
> 
> ...


Well, smart ass or not, the differences in the way they calculate things still don't change the results. We don't buy video cards because of the manner in which they calculate or render; we buy them to get the results of those calculations or renderings.


----------



## cadaveca (Apr 14, 2010)

Maybe, Maybe...it's not like I'm anything near the average person, so for me to understand...well...it just isn't gonna happen.


----------



## phanbuey (Apr 14, 2010)

I see what you're both trying to get at, but I'm with cad on this one. The REASONS behind the results are extremely important, because then the results are in context. That context matters because, as in this case, AMD cards have potential.

If you look at the results alone, it becomes misleading as to where the companies will stand in the foreseeable future.


----------



## cadaveca (Apr 14, 2010)

Well, the perfect example, and why I popped out that analogy, is CPU scores in Vantage when PhysX is enabled. Totally not comparable results... to such a degree that even FM pulled those results from their "world rankings" and made a stink about it. Back then, I was the only one who seemed to mention that the CPU test was modified based on threads, and that nV cards weren't accounted for in that way, so the score JUMPED hugely. Add a PhysX card, though, and those threads were accounted for, and this made a negligible difference in the end result.

To me, it's kinda the same situation... how those numbers were attained matters... maybe that boost in numbers is false.
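[Editor's note] The Vantage issue described above can be sketched structurally (this is not Futuremark's actual scoring formula — purely an illustration of how GPU-offloaded physics can inflate a "CPU" score):

```python
# Illustrative only: Vantage's real scoring formula is not reproduced
# here. The structural point: if a "CPU" test counts physics
# operations, and the physics can silently run on the GPU, the GPU's
# throughput gets credited to the CPU. All numbers are made up.

def cpu_score(cpu_ops_per_s, physics_ops_per_s):
    return (cpu_ops_per_s + physics_ops_per_s) / 1000  # made-up scaling

# Physics actually computed on the CPU:
honest = cpu_score(cpu_ops_per_s=400_000, physics_ops_per_s=100_000)
# Same CPU, but the physics offloaded to a GPU doing 10x the ops:
offloaded = cpu_score(cpu_ops_per_s=400_000, physics_ops_per_s=1_000_000)
# offloaded is 2.8x honest, with no change in actual CPU speed.
```

Which is why Futuremark ended up excluding GPU-accelerated PhysX results from comparative rankings: the two numbers measure different hardware under the same label.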


----------



## Wile E (Apr 14, 2010)

phanbuey said:


> i see what you're both trying to get at, but I'm with cad on this one.  the REASONS behind the results are extremely important, because then the results are in context.  that context matters, because, as it is in this case, AMD cards have potential.
> 
> if you look at the results alone, then it becomes misleading as to where the companies will stand in the foreseeable future.



Not if those results are consistent for a long stretch of time. Of course, there are exceptions, with poorly optimized software needing an update, for instance. That clearly isn't the case here, though.



cadaveca said:


> Well, the perfect example, and why I popped out that analogy, is cpu scores in Vantage, when Phys-X is enabled. Totally not comparable results...to such a degree that even FM pulled those results from thier "world rankings", and made a stink about it. Back then, I was the only one who seemed to mention that the cpu test was modfied based on threads, and that nV cards weren't accounted for in that way, so the score JUMPED hugely. Add a phys-X card though, and those threads were accounted for, and this made a negligible difference in the end result.
> 
> To me, it's kinda the same situation...how those numbers were attained  matter...maybe that boost in numbers is false.


Well, of course they aren't comparable, because you are trying to compare a CPU to a GPU. That has nothing to do with the computational power of an ATI card, so it has no bearing on this conversation. Find a way to enable PhysX on an ATI card, however, and the results would become directly comparable. Vantage is not a good example to use in this discussion, at all.


----------



## cadaveca (Apr 14, 2010)

Wile E said:


> Not if those results are consistant for a long stretch of time. Of course there are exceptions with poorly optimized software needing an update, for instance. That clearly isn't the case here tho.



The GPU client was last updated in Nov. 2008, and the actual core was released in April 2008. That's two years since the core was released (and 1.5 years since the client's last update, other than bug fixes), making it highly irrelevant in today's market. It doesn't even work right in Vista...

Your whole bit about FAH is just as unimportant as my comment about Vantage...


----------



## Wile E (Apr 14, 2010)

cadaveca said:


> GPu client was last updated Nov, 2008. The actual core was released in April, 2008. That's two years since the software was updated(for other than bug fixes, 1.5 years since most recent core release), making it highly irrelevant in today's market. It doesn't even work right in Vista....
> 
> Your whole bit about FAH is just as unimportant as my comment about Vantage...



How long prior to that did ATI have to help optimize? So from the first ATI-only GPU client to the modern client, ATI couldn't get it right with that large a head start? Sorry, I just don't buy it.

And if you read that thread you posted, you'd see that ATI is slower due to its cache structure, not because of the software. That's a hardware limitation.

And if we want to get really technical, this whole argument is unimportant (and a bit off topic). lol


----------



## Mussels (Apr 14, 2010)

Yeah, it is all off topic, really.


----------



## eidairaman1 (Apr 14, 2010)

I'd say put this topic under General Nonsense now, since it's going that direction anyway.


----------



## Mussels (Apr 14, 2010)

eidairaman1 said:


> Id say put this topic under general nonsense now since its going that direction anyway



No, I'd rather people still read the first post and learn that PhysX and hardware PhysX are very different things. After seeing posts all around the forum, I'd say the original post is doing its job.


Seems like everything has been hashed out, so for now I'll lock the thread... if PhysX changes at some point, I'll unlock the thread and clean it up.


----------

