Thursday, December 11th 2008

PhysX will Die, Says AMD

In an interview with Bit-Tech.net, Godfrey Cheng, Director of Technical Marketing in AMD's Graphics Products Group, said that standards such as PhysX would die due to their proprietary and closed nature. Says Mr. Cheng:
"There is no plan for closed and proprietary standards like PhysX," said Cheng. "As we have emphasised with our support for OpenCL and DX11, closed and proprietary standards will die."
Bit-Tech.net interviewed Cheng earlier this week to get the company's take on EA's and 2K's decision to adopt NVIDIA PhysX across all of their worldwide studios. Interestingly, when asked how major publishers such as EA adopting PhysX across all of their studios would affect the propagation of the API, Cheng responded that monetary incentives to publishing houses alone won't do much to propagate the API, that the product (PhysX) must be competitive, and that AMD views Havok and its physics simulation technologies as the leaders. "Games developers share this view. We will also invest in technologies and partnerships beyond Havok that enhance gameplay," he added. PhysX is a proprietary physics simulation API created by Ageia Technologies, which was acquired by NVIDIA and has been developed by them since. You can read the full Bit-Tech.net interview with Godfrey Cheng here.

67 Comments on PhysX will Die, Says AMD

#26
Tatty_Two
Gone Fishing
newtekie1: Proprietary is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

The performance hit is going to happen regardless of which API is used to create the physics. If both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on screen due to the physics.

Edit: Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competition's product will fail, that product usually becomes wildly popular.
Agreed, well, most of it. I think AMD may also be being clever in a kind of sadistic way. Their "current" architectural route (and possibly their future route), in its most basic terms, is to throw in loads of fixed-speed shaders (for fixed... read slow :)), whereas NVidia is on the fewer-but-faster road (I am deliberately not going into the more complex differences between the two!). Now, PhysX will use a proportion of the shaders to get the job done, so a GPU running shaders at twice the speed of a competitor's is, I would GUESS, gonna get much better-performing PhysX. AMD would not want that, perhaps?
#27
DarkMatter
brian.ca: PhysX is not going to kill anyone... it's DX10.1 all over again. No one is going to make/sell a game relying on this thing if it's gonna screw over a good portion of the market, or leave them out of any significant part of the game. Unless AMD/ATI support PhysX, it will never be anything more than optional eye candy, and that limited role will limit it as a factor for people who might buy into it (would you pay $500 for a new card instead of $300 so you can get more broken glass?).

Some of you are talking about studios and games adopting PhysX, but what have you seen so far? Mirror's Edge seems to do a bit of showcase work for PhysX, but how many would really buy that game? It being an EA game through and through, I personally have a hard time believing the game would offer anything more than what I could get from watching the trailer or some demos. Otherwise I haven't seen PhysX do anything that Havok wasn't already doing. There's no extra edge here. There are probably some incentives from Nvidia, but then that comes back to what this guy was saying in the first place:

"We cannot provide comments on our competitor's business model except that it is awfully hard to sustain support by monetary incentives. The product itself must be competitive. We view Havok technologies and products to be the leaders in physics simulation and this is why we are working with them. Games developers share this view."

This lends itself to why this stuff won't kill ATI... at the end of the list is Apple; didn't they put together the OpenCL standard? Which do you think they'll be pushing, CUDA or their own standard? Microsoft will be pushing its own thing with DX11 down the road. Adobe recently took advantage of Nvidia's CUDA and ATI's Stream, if I'm not mistaken... but do you think they'll want to keep on making 2 versions of their products when other people are pushing for a unified standard?

I guess this is all moot anyway, then... if AMD responding to an Nvidia announcement for a reporter will guarantee success for PhysX, then surely the grandstanding Nvidia took part in vs. Intel will have Larrabee burying all the competition.

At the end of the day, people may not like what this guy is saying, why, or how, but it's true. AMD is not going to support Nvidia's proprietary APIs (and why the hell would they?), and without that support other companies will have less incentive to get on board unless Nvidia provides it. That requires either a superior product or, probably, cash incentives. Now, realistically... when the immediate alternatives to Nvidia's systems seem to be OpenCL (Apple, but open), DirectX 11 (Microsoft), and Havok (Intel), do you think these other standards won't have the resources behind them to provide both of those things more so than Nvidia? If you were in AMD's shoes, who would you side with? They could do it all, but seriously... why? It'd just confuse other efforts and probably waste their own resources, and for what? To better prop up the competition that they can beat? So they can claim some moral high ground when they recall how Nv made DX10.1 completely moot?
All that falls apart from the simple fact that, had Ati adopted PhysX when Nvidia offered it for free, or at least when the guy at NGOHQ had already made PhysX on Ati cards* possible, we would already have games with 10 times better physics. The argument that "game developers are not supporting it because it doesn't run on all cards" loses its value when that is your own fault, when you have obstructed all efforts to get the thing running on your cards.

And the one "We are not going to follow a standard that only few developers have adopted" is hypocrit at best, when you are doing as much as you can for that to be true. This falaceous comments included.

It's a lame path that AMD has taken. As I said, I have lost all faith in them, and the worst thing is that they sided with the most dangerous and "evil" company, Intel; once Intel has their GPU, they are going to crush them so hard it's going to hurt us as well. As a popular quote says: "One must never rely on alliances with the powerful; they probably want something more from that allegiance than to help us."

*What can be read in that thread is not the end of the story. Nvidia ended up supporting that guy's efforts 100%, but obviously it's not in their hands to release such a thing. They could help make it feasible, but not release it. Nor can the guy release it himself; remember the guy who released Vista drivers for X-Fi cards. AMD, on the other hand, denied the guy any help, not even providing a single HD4000 card for testing. Furthermore, the thing suddenly went into the shadows, probably because AMD (or Intel, due to AMD's deal with Havok) threatened him with litigation. After that, and under a judge's mandate, he probably can't say anything about the issue...
I would like to update you on what's going on. First, we were very pleased to see so many users and readers apply to our Beta Test Program! To be specific: 191 users, 5 spies and 2 double agents have submitted applications during the last week. Those that are chosen will be informed early, before the beta is available – we still can't say "when" at this stage.

The bad news is we still don't have access to any HD 4800 hardware. It is very important for this project to receive AMD's support on both the developer and PR levels. It seems that AMD is still not being cooperative; we get the feeling that they want this project to fail. Perhaps their plan is to strangle PhysX, since AMD and Intel have Havok. We truly hope this is not the case, since "format wars" are really bad for consumers (for example: Blu-ray vs. HD-DVD).

Before we get to the good news, I'm going to ask you to hold on to something steady, as some of you are going to feel a bit dizzy after you hear this. The truth is... Nvidia is now helping us with the project, and it seems they are giving us their blessing. It's very impressive, inspiring and motivating to see Nvidia's view on this. Why are they helping us? My best guess would be: they probably want to take on Intel with CUDA and deal with the latest Havok threat from both AMD and Intel.

Some other good news: we are getting a lot of help from cool journalists like Theo Valich to address the HD 4800 access issue. I can confirm that our CUDA Radeon library is almost done and everything is going as planned on this side. There are still some issues that need to be addressed: adding Radeon support in CUDA isn't a big deal - but it's not enough! We also need to add CUDA support at AMD's driver level, and that is being addressed as we speak.
And since then, AFAIK, nothing.
#28
leonard_222003
Well, in AMD's defense, Nvidia didn't want to adopt DX10.1 out of spite and forced developers not to use that feature; it would've given AMD a performance boost. This time it's kind of the same thing: PhysX would give Nvidia some performance supremacy if AMD used it, and they would drop dead before doing so :).
You can understand how technology progresses when money is in the way :).
#29
btarunr
Editor & Senior Moderator
leonard_222003: Well, in AMD's defense, Nvidia didn't want to adopt DX10.1 out of spite and forced developers not to use that feature; it would've given AMD a performance boost. This time it's kind of the same thing: PhysX would give Nvidia some performance supremacy if AMD used it, and they would drop dead before doing so :).
You can understand how technology progresses when money is in the way :).
How do you come to the conclusion that DirectX 10.1 compliance would give ATI some sort of supremacy? On the other hand, I can straight away call PhysX on NVIDIA GPUs a supremacy over ATI GPUs, and I have the F@H performance numbers to back my statement.

If developers had used DirectX 10.1 and it had become a trend, NVIDIA would've been forced to gain compliance, or devs would've avoided DX10.1 altogether for fear of sales losses due to a major GPU player not supporting it. Chicken and egg, really :).
#30
DarkMatter
leonard_222003: Well, in AMD's defense, Nvidia didn't want to adopt DX10.1 out of spite and forced developers not to use that feature; it would've given AMD a performance boost. This time it's kind of the same thing: PhysX would give Nvidia some performance supremacy if AMD used it, and they would drop dead before doing so :).
You can understand how technology progresses when money is in the way :).
That's only half the story. Nvidia didn't force developers not to use it, and the thing that makes DX10.1 faster is also available in DX10, but only as a suggestion rather than a requirement: specifically, the memory buffer readback, a thing that Nvidia does support in hardware.

The idea (a meme at this point) you mentioned above is purely based on what happened with Assassin's Creed. Well, the developers themselves said it had nothing to do with Nvidia. That developer stance is not an isolated thing. Ati supporters have been making those kinds of claims about TWIMTBP since the beginning, while Ati never said a word about it (interesting), and many developers (iD, Epic Games, Crytek, Ubisoft) have specifically denied such a thing many times. If I have to believe anyone, I believe ALL the developers. Between what developers say and an Ati-fan-made conspiracy theory, in which even Ati never took part, I obviously take the developers' word.

But let's get back to what happened with Assassin's Creed. The same developer, Ubisoft Montreal, has just released Far Cry 2, and in this game DX10 antialiasing performance is higher than in DX9 and the same as under DX10.1*. Intriguing? Conspiracy again? Not for those of us who know (and always have known) the truth about DX10.1.

* Just in case people don't get the whole meaning of that sentence: Far Cry 2 does support DX10.1, but in the developers' words, it didn't make a difference in performance or features. Under DX10.1 things are simpler to program; no one will ever dispute that.
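
To put the "memory buffer readback" in concrete terms: the headline DX10.1 addition here is the ability to bind a multisampled depth buffer as a shader resource and read it back per sample, so an AA pass can reuse the depth data instead of re-rendering it. Below is a minimal sketch of that capability, assuming a feature-level-10.1 device and an MSAA depth texture already created with a typeless format; the function and variable names are illustrative, not code from Assassin's Creed or Far Cry 2.

```cpp
// Minimal sketch, not game code: create a shader resource view over a
// multisampled depth texture so a later pass can read individual samples.
#include <d3d10_1.h>

HRESULT BindMsaaDepthAsSRV(ID3D10Device1* device,          // 10.1 device (assumed)
                           ID3D10Texture2D* msaaDepthTex,  // typeless MSAA depth (assumed)
                           ID3D10ShaderResourceView1** outSRV)
{
    D3D10_SHADER_RESOURCE_VIEW_DESC1 desc = {};
    desc.Format        = DXGI_FORMAT_R24_UNORM_X8_TYPELESS; // read the depth bits
    desc.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURE2DMS; // per-sample access

    // On a plain DX10.0 device this view creation is not available for a
    // multisampled depth surface, so the renderer has to lay depth down a
    // second time into a readable target; that extra pass is the cost the
    // DX10.1 path avoided.
    return device->CreateShaderResourceView1(msaaDepthTex, &desc, outSRV);
}
```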
#31
leonard_222003
btarunr: How do you come to the conclusion that DirectX 10.1 compliance would give ATI some sort of supremacy? On the other hand, I can straight away call PhysX on NVIDIA GPUs a supremacy over ATI GPUs, and I have the F@H performance numbers to back my statement.

If developers had used DirectX 10.1 and it had become a trend, NVIDIA would've been forced to gain compliance, or devs would've avoided DX10.1 altogether for fear of sales losses due to a major GPU player not supporting it. Chicken and egg, really :).
Well, it wouldn't be much of a problem for them to implement DX10.1, but spending money on a new generation of graphics cards would've been a problem. If developers used DX10.1, graphics cards based on the old G80/G92 architecture would've been left a bit behind by ATI products in some games, and there is the problem of AA on foliage, which Nvidia can't do because it doesn't have DX10.1; Unigine proved that.
Assassin's Creed is more proof: Nvidia made some noise when a patch enabled DX10.1 for ATI, but in the next patch that feature became unavailable. People started pointing fingers at Nvidia, and AnandTech made an article about that and the performance numbers ATI gained with this feature.
So DX10.1 could've made, and could in the future make, things much better for ATI. Why would Nvidia not adopt this in the new GT200 generation? Because it would kill all their cards in the mainstream market, where ATI already has DX10.1, and could render a 3870 a worthy foe for an 8800 GT.
In one way or another, developers held back from using ATI features, considering Nvidia supports them in developing games while ATI does nothing. Why should they give a f.... about ATI? I can understand the developers, but for the consumer of ATI products this is not nice.
#32
VulkanBros
Just a little wood for the fire: who is putting what (my guess: a lot of money) into game developers' pockets with the slogan "The way it's meant to be played"?
#33
DarkMatter
leonard_222003: Well, it wouldn't be much of a problem for them to implement DX10.1, but spending money on a new generation of graphics cards would've been a problem. If developers used DX10.1, graphics cards based on the old G80/G92 architecture would've been left a bit behind by ATI products in some games, and there is the problem of AA on foliage, which Nvidia can't do because it doesn't have DX10.1; Unigine proved that.
Assassin's Creed is more proof: Nvidia made some noise when a patch enabled DX10.1 for ATI, but in the next patch that feature became unavailable. People started pointing fingers at Nvidia, and AnandTech made an article about that and the performance numbers ATI gained with this feature.
So DX10.1 could've made, and could in the future make, things much better for ATI. Why would Nvidia not adopt this in the new GT200 generation? Because it would kill all their cards in the mainstream market, where ATI already has DX10.1, and could render a 3870 a worthy foe for an 8800 GT.
In one way or another, developers held back from using ATI features, considering Nvidia supports them in developing games while ATI does nothing. Why should they give a f.... about ATI? I can understand the developers, but for the consumer of ATI products this is not nice.
First of all, the fact that Nvidia doesn't support DX10.1 doesn't mean they don't support the features under DX10.1. M$ decided that in order to be DX10 compliant, a card has to support EVERY feature specified in the API. That means that if you don't support 1 feature out of 1000, your card can't be labeled DX10 (or DX10.1, or DX11, etc.) compliant. Prior to DX10 that was not necessary; if M$ had applied the same system to previous DX versions, there WOULDN'T BE A SINGLE DX9, DX8... compliant card on the market. As I said, everything that was used in the DX10.1 version of AC is available in DX10 and on Nvidia cards; Far Cry 2 is the proof. Plus, where did you hear that G80 or G92 (had Nvidia supported DX10.1 100%) would have been behind? Never mind, I know where, which points to why I said what I said in my first post in this thread...
#34
Tatty_Two
Gone Fishing
VulkanBros: Just a little wood for the fire: who is putting what (my guess: a lot of money) into game developers' pockets with the slogan "The way it's meant to be played"?
Agreed; however, there are a number of "The way it's meant to be played" games that actually perform as well (and in some cases perhaps better) on ATI hardware. I know where you are coming from, but to be fair, from a non-ATI-or-NVidia standpoint, you could argue about the reasons why: it seems that one of the teams occasionally has to develop a driver patch for a specific game because their hardware runs the game so poorly, and sometimes that game does not even open with the "The way it's meant to be played" splash screen.

Some might say (again, from a middle-of-the-road standpoint) that "The way it's meant to be played" actually helps fund "better games", as that funding, support, blackmail, or whatever you would like to call it gives the developers more time to get it right in the first place, all for our gaming pleasure of course. I am always a little wary of games that are released and then, in their first month of retail life, have 2 or more patches put out to fix bugs that IMO should not have been there in the first place; many developers are just getting lazy IMO. 10 or 15 years ago, when the web was probably only accessed by 10% of us, developers had to try much harder to get the game right the first time, as these "patches/fixes" were of course much less readily available. Kind of a coincidence!
#35
DarkMatter
VulkanBros: Just a little wood for the fire: who is putting what (my guess: a lot of money) into game developers' pockets with the slogan "The way it's meant to be played"?
Not a single penny has been put in developers' hands under TWIMTBP, nor by Nvidia under other arrangements. And by no means do they ask developers to cripple Ati performance. They do help developers with support, access to hardware, and cross-marketing, something very valuable for them, but that's not the same. People don't know how valuable access to hardware and support in man-power form is for developers...

I've been seeing 10 Nvidia names and only 2 AMD/Ati ones in the thanks section of game credits for so long it's not even funny.

I'll tell you a little story. We have three friends, A, N and D (Ati, Nvidia, Developers :)). When D had problems with his wife, N brought beers and had a long conversation with him. When D's mother was sick, N brought medicines and later accompanied them to the doctor. When D lost his job, N advertised his abilities to everybody he knew, which was a lot of people.
When D was...

One day A's and N's children (all of them called C, for customers :)) said they wanted the latest console. A and N knew it had been released in low quantities and that no store in town had it yet. They did know of a store in the nearest town that was going to have it, but it would soon be out of stock. They had to get there as soon as possible. They had a problem, though: they had no means of transport. D has a motorcycle, so they both call him asking for help. D in the end makes a decision: he will take N to the store and then return and carry A. In the meantime, A can make the trip on his own, on foot or by any other means. A's children cry a lot: why would D take N first? It's not fair! :cry::cry:

But at the end of the day... IT IS.
#36
newtekie1
Semi-Retired Folder
Mussels: A lot of people don't seem to be getting it.

PhysX is an nvidia-only way of doing this.
DirectX 11 is doing an open (any video card) version of this.

ATI/AMD are saying that nvidia's closed one will die, and the open version will live on.
The problem is that PhysX has the opportunity to not be nVidia-only, but ATi decided to make it nVidia-only. Nvidia had no problem with getting PhysX up and running on ATi hardware, IMO because they knew that is what it would take for the technology to actually be adopted in the industry. Ageia originally failed because it couldn't get PhysX adopted in the industry, and people didn't want to buy a separate card that sat idle in their PC 90% of the time.

Now, nVidia had a different idea for PhysX: if you use a small fraction of the GPU's power to calculate PhysX, then people don't have to worry about buying another add-on card that is going to sit idle most of the time. If a game doesn't support PhysX, then that small fraction of GPU power is returned for use in rendering the game (a sketch of what this looks like from the API side follows at the end of this post).
eidairaman1: Sort of like Nvidia saying the 4830 is defective when it's not. Then before that, the head of NV said they underestimated the RV770.
When did nVidia say the HD4830 is defective?

And the head of nVidia saying they underestimated the RV770 is totally different from ATi saying PhysX will never catch on.
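
For anyone wondering what "a small fraction of the GPU's power" looks like from the programming side, here is a minimal sketch against the PhysX 2.x SDK; the hardware-then-software fallback flow and all names below are illustrative assumptions, not code from any shipped game. The point is that the same scene code runs whether the simulation lands on a GeForce, a PPU, or the CPU:

```cpp
// Minimal PhysX 2.x sketch (illustrative assumptions, not game code):
// request hardware simulation, fall back to CPU software if unavailable.
#include <NxPhysics.h>

int main()
{
    NxPhysicsSDK* sdk = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!sdk) return 1;

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.simType = NX_SIMULATION_HW;          // ask for GPU/PPU simulation

    NxScene* scene = sdk->createScene(sceneDesc);
    if (!scene)                                    // no hardware path present:
    {
        sceneDesc.simType = NX_SIMULATION_SW;      // same scene, CPU fallback
        scene = sdk->createScene(sceneDesc);
    }

    // Per frame: step the simulation, then block until results are ready.
    scene->simulate(1.0f / 60.0f);
    scene->flushStream();
    scene->fetchResults(NX_RIGID_BODY_FINISHED, true);

    sdk->releaseScene(*scene);
    NxReleasePhysicsSDK(sdk);
    return 0;
}
```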
#37
Kursah
Meh... PhysX will be around for a while... it won't just stop being used the day DX11 comes out. Just look at DX revisions: how many forgot how long it took DX10 to become more widely used? And it's still not as widely used as was once thought... it was going to be the "end of DX9", the "beginning of amazing graphics", blah, blah, blah. DX11 may have physics, but they have a lot of work to do... IMO PhysX will be a step or two ahead just because of the time and practice: utilising physics from GPU power is now hugely supported on the newer range of NV cards that so many are using out there.

You guys will see: DX11 will come out, there will be like 20-30 threads created about it... woohoo... how about we see some true results first? I'm waiting...

PhysX will still be around and used for quite a while IMO... especially if it's easier, or some game creators are more used to that code; they may opt to stick with PhysX instead... ya never know!

:toast:
#38
Hayder_Master
Only game companies have a say in this, and only they can tell you whether you need physics hardware or not, but when I look at all the new games, it looks like AMD is winning right now.
#39
DarkMatter
It's funny when people talk about how many games are being released with it. How long did it take for DX10 to be used in games? And that's only an update to an API they know like they know their own name... PhysX was released only 3 months ago, for God's sake...
#40
Octavean
Whether AMD / ATI opted out of PhysX support or not, it doesn't really matter. PhysX is not universally supported in any official capacity, and DX11 will be, with DX11 hardware. IMO, AMD's statement is likely accurate: PhysX will probably die and DX11 will almost certainly prevail. It doesn't matter one whit that ATI rejected support for PhysX and opted instead for Intel-owned Havok (they had to go to the competition whichever way they turned, because they had nothing of their own to compete with). If the PhysX / CUDA API went open source, I still don't think that would save it in this respect. DX11 is coming, and Microsoft has a habit of killing off competition in a way only a virtually unchecked, poorly regulated monopoly juggernaut can.

Still, not everyone will or can upgrade to DX11 hardware when it comes out, and there is a huge number of gamers who refuse to upgrade to Vista, let alone "Windows 7". Then there is the issue of DX11 titles in a world where DX10 hasn't really taken off. It will be some time before DX11 is viable in any real, applicable way beyond the odd tech demo here and there. AMD / ATI may well be first to market with DX11 hardware, but DX9 and DX10 will probably be much more relevant when that happens, with DX11 being an unusable checkmark feature. In the meantime, PhysX is helping to sell cards for nVidia in much the same way SLI did. I've even heard of ATI users buying nVidia cards just to run PhysX. I ask you: if you can get people who use a fast / powerful competing product to buy one of your cards anyway, haven't you already accomplished a lot? Haven't you already won a big victory?

I say PhysX will have served its purpose by the time it dies, and yes, it will likely die.
#41
3870x2
PhysX is an Nvidia platform, if I'm not mistaken. Would this not be equivalent to Ford supporting the Chevrolet motor division by using their engines? The point of selling in competition is saying that your equipment is the best, so AMD supporting PhysX would say that Nvidia was right in buying PhysX. AMD's only business choice in this matter was to forgo PhysX.
I don't think PhysX will die, but it will probably be behind a curtain as a part of Nvidia's system.
#42
imperialreign
btarunr: Bit-Tech.net interviewed Cheng earlier this week to get the company's take on EA's and 2K's decision to adopt NVIDIA PhysX across all of their worldwide studios. Interestingly, when asked how major publishers such as EA adopting PhysX across all of their studios would affect the propagation of the API, Cheng responded that monetary incentives to publishing houses alone won't do much to propagate the API, that the product (PhysX) must be competitive, and that AMD views Havok and its physics simulation technologies as the leaders. "Games developers share this view. We will also invest in technologies and partnerships beyond Havok that enhance gameplay," he added. PhysX is a proprietary physics simulation API created by Ageia Technologies, which was acquired by NVIDIA and has been developed by them since. You can read the full Bit-Tech.net interview with Godfrey Cheng here.
and thus it begins - ATI vs nVidia PPU shootout round 3!!

Let's get it on!
#43
DarkMatter
3870x2: PhysX is an Nvidia platform, if I'm not mistaken. Would this not be equivalent to Ford supporting the Chevrolet motor division by using their engines? The point of selling in competition is saying that your equipment is the best, so AMD supporting PhysX would say that Nvidia was right in buying PhysX. AMD's only business choice in this matter was to forgo PhysX.
I don't think PhysX will die, but it will probably be behind a curtain as a part of Nvidia's system.
You don't get it. AMD is supporting Havok, which is Intel's. NOW WHO is really the competition? At least in the GPU arena they had a chance against Nvidia.
#44
brian.ca
DarkMatter: All that falls apart from the simple fact that, had Ati adopted PhysX when Nvidia offered it for free, or at least when the guy at NGOHQ had already made PhysX on Ati cards* possible, we would already have games with 10 times better physics. The argument that "game developers are not supporting it because it doesn't run on all cards" loses its value when that is your own fault, when you have obstructed all efforts to get the thing running on your cards.
That would make it fall apart if I were arguing that AMD wanted to push PhysX if able, but I wasn't... it's like I said: PhysX won't take off unless ATI supports it, b/c of the whole chicken-and-egg thing, and ATI just doesn't want to support it. And I can't see why they would. PhysX started as a proprietary system to make Ageia money selling their hardware; now that Nvidia has bought them out, it would seem foolish to think that PhysX will not favor Nvidia cards and that they couldn't pull some strong-arm tactics down the road if PhysX were to become the industry standard.
And the one "We are not going to follow a standard that only few developers have adopted" is hypocrit at best, when you are doing as much as you can for that to be true. This falaceous comments included.
Saying that may be misleading, but at the end of the day developers didn't go to Havok b/c ATI wouldn't support PhysX. They stuck with Havok b/c it's what they used, what they know, and the incentives to switch were not great enough.
DarkMatter: It's a lame path that AMD has taken. As I said, I have lost all faith in them, and the worst thing is that they sided with the most dangerous and "evil" company, Intel; once Intel has their GPU, they are going to crush them so hard it's going to hurt us as well. As a popular quote says: "One must never rely on alliances with the powerful; they probably want something more from that allegiance than to help us."
Here's the thing with Intel, though... they're not going to be beaten. Nvidia can be beaten by, and has beaten, ATI. They might not be powerful in the big picture vs. a company like Intel, but vs. ATI they are... so that quote applies to them as well. Now I'd ask, which is the smarter side to get on: the company that could probably crush your competition but has reason enough to keep you around, and whom you have no chance of beating, or the company that you can beat, that doesn't really offer you much, and that also stands to gain if it gets you out of the way?

Lame or not, it's the smart path.
#45
FordGT90Concept
"I go fast!1!11!1!"
I'm waiting for Intel Larrabee. AMD/NVIDIA have "lost their way." They are both extremely proprietary, and it's time to move to a microarchitecture-based GPU and PPU system. Intel is poised to do that.
#46
brian.ca
newtekie1: The problem is that PhysX has the opportunity to not be nVidia-only, but ATi decided to make it nVidia-only. Nvidia had no problem with getting PhysX up and running on ATi hardware, IMO because they knew that is what it would take for the technology to actually be adopted in the industry. Ageia originally failed because it couldn't get PhysX adopted in the industry, and people didn't want to buy a separate card that sat idle in their PC 90% of the time.

When did nVidia say the HD4830 is defective?
It could have been not-Nvidia-only, but since Nvidia developed this, I'd be hard pressed to believe that it wasn't developed for their cards and that they wouldn't always retain some upper hand here (it would make sense for them to do this). When you consider that alternatives were coming from parties with less of a vested interest in favoring a specific card vendor, I'm not sure what ATI's motivation to support Nv's solution would be.

About the 4830 thing, I'd wager he was referring to Nv putting out those slides showing the performance of the cards that had the deactivated SPs (IIRC).
#47
3870x2
DarkMatter: You don't get it. AMD is supporting Havok, which is Intel's. NOW WHO is really the competition? At least in the GPU arena they had a chance against Nvidia.
This probably means that the PII kicks ass, and they know it already.
#48
brian.ca
DarkMatter: It's funny when people talk about how many games are being released with it. How long did it take for DX10 to be used in games? And that's only an update to an API they know like they know their own name... PhysX was released only 3 months ago, for God's sake...
I don't mean to seem like I'm picking on you or anything, but I wanted to say something about the DX10 comments... one thing people need to consider is that DX10 was tied to Vista, which a lot of people didn't like. So it shouldn't be any surprise that DX10 had a slow adoption rate if Vista had one as well. If Windows 7 has a better adoption rate and DX11 is available for Vista, then I don't think it's safe to try to judge how well received DX11 will be. If Win7 turns out to be good and they kill support for XP, forcing a lot of users to migrate to the new OS, the major bottleneck would become the hardware. And then, like you said, not being fully compliant with a new standard doesn't mean some of the features aren't already supported on older cards, so that may not even be much of a bottleneck.

When you say 3 months, are you referring to that driver update? PhysX has been around a lot longer than that. In the original article's comments section I think I saw what seemed like an Nv fanboy say Nv's been pushing PhysX for 10 months now.
#49
Tatty_Two
Gone Fishing
Octavean: Whether AMD / ATI opted out of PhysX support or not, it doesn't really matter. PhysX is not universally supported in any official capacity, and DX11 will be, with DX11 hardware. IMO, AMD's statement is likely accurate: PhysX will probably die and DX11 will almost certainly prevail. It doesn't matter one whit that ATI rejected support for PhysX and opted instead for Intel-owned Havok (they had to go to the competition whichever way they turned, because they had nothing of their own to compete with). If the PhysX / CUDA API went open source, I still don't think that would save it in this respect. DX11 is coming, and Microsoft has a habit of killing off competition in a way only a virtually unchecked, poorly regulated monopoly juggernaut can.

Still, not everyone will or can upgrade to DX11 hardware when it comes out, and there is a huge number of gamers who refuse to upgrade to Vista, let alone "Windows 7". Then there is the issue of DX11 titles in a world where DX10 hasn't really taken off. It will be some time before DX11 is viable in any real, applicable way beyond the odd tech demo here and there. AMD / ATI may well be first to market with DX11 hardware, but DX9 and DX10 will probably be much more relevant when that happens, with DX11 being an unusable checkmark feature. In the meantime, PhysX is helping to sell cards for nVidia in much the same way SLI did. I've even heard of ATI users buying nVidia cards just to run PhysX. I ask you: if you can get people who use a fast / powerful competing product to buy one of your cards anyway, haven't you already accomplished a lot? Haven't you already won a big victory?

I say PhysX will have served its purpose by the time it dies, and yes, it will likely die.
I agree with much of what you are saying there; however, just one point: DX11 hardware is irrelevant if games don't support it. Now let's briefly look back, say at NVidia and ATI. Their first DX10 hardware (the G80-based 8800 series, and the 2900 for ATi) was on the shelves for a year before there were 10... yes, just 10 games with DX10 enabled. Now look how many games today are DX10 enabled, divide them by the number of months DX10 hardware has been available... end result... a cr*p load of money spent by us with VERY little return in gaming terms. Damn, I have owned 7 different DX10 graphics cards, and I own 7 DX10-enabled games (OK, I am only a light-to-mid gamer)... DX11 cards will be released, we will wait 6 months for one DX11 game and a year for 5, in which time we will all have spent our hard-earned money only to find out that those early DX11 cards just don't cut it, so we buy better ones! Now with PhysX, at least it's here, and it's growing whether AMD likes it or not, but most importantly, it gives the consumer even more choice, and that's gotta be good in my book.
#50
leonard_222003
DarkMatter: First of all, the fact that Nvidia doesn't support DX10.1 doesn't mean they don't support the features under DX10.1. M$ decided that in order to be DX10 compliant, a card has to support EVERY feature specified in the API. That means that if you don't support 1 feature out of 1000, your card can't be labeled DX10 (or DX10.1, or DX11, etc.) compliant. Prior to DX10 that was not necessary; if M$ had applied the same system to previous DX versions, there WOULDN'T BE A SINGLE DX9, DX8... compliant card on the market. As I said, everything that was used in the DX10.1 version of AC is available in DX10 and on Nvidia cards; Far Cry 2 is the proof. Plus, where did you hear that G80 or G92 (had Nvidia supported DX10.1 100%) would have been behind? Never mind, I know where, which points to why I said what I said in my first post in this thread...
Darkmatter, it's known that those 300 or 800 shaders can be used for AA if DX10.1 features are used by the game. Take some games that are low-tech in terms of optimization and compare them to good engines like Far Cry 2: you will see the ATI card does very well in Far Cry 2, but this doesn't apply to games that don't support those features, where ATI is behind. What is the explanation for this? "The way it's meant to be played" is a big reason; they stopped evolution in that direction out of fear of those products that could become more competitive.
Bottom line: we can only embrace everything they throw at us that makes our graphics more enjoyable, and blame the ones who say otherwise, but we have to remember they both do this, Nvidia when it suits them and ATI too. They throw some dirt and we idiots fight for them here; let them fuck each other while we watch the fighting and always buy the better product.