Friday, March 20th 2009
AMD to Demonstrate GPU Havok Physics Acceleration at GDC
GPU-accelerated physics is turning out to be the one item on the specifications sheet AMD is yearning for. One of NVIDIA's most profitable recent acquisitions has been that of Ageia Technologies and its PhysX middleware API. NVIDIA went on to port the API to its proprietary CUDA GPGPU architecture, and now uses it as a significant PR tool as well as a feature that is genuinely grabbing game developers' attention. In response, AMD's move was to build a strategic technology alliance with PhysX's main competitor, Havok, despite the latter's acquisition by Intel.
At the upcoming Game Developers Conference (GDC), AMD may materialize its plans to bring out a GPU-accelerated version of Havok, which has until now run on the CPU. The API has featured in several popular game titles such as Half-Life 2, Max Payne 2, and other Valve Source-based titles. ATI's Terry Makedon has revealed in his Twitter feed that AMD will put forth its "ATI GPU Physics strategy." He added that the company will present a tech demonstration of Havok technology working in conjunction with ATI hardware. The physics API is expected to utilize OpenCL and AMD Stream.
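The appeal of moving physics to the GPU is that a physics step is largely data-parallel: the same small update runs over thousands of independent bodies at once. As a rough, hypothetical illustration (plain Python standing in for what would be an OpenCL kernel spread across the GPU's stream processors; this is not AMD's or Havok's actual code), the core of such a step might look like:

```python
# Hypothetical sketch of the data-parallel core of a particle physics
# step. On a GPU, each iteration of this loop would be one work-item
# in an OpenCL kernel; here it is plain Python for clarity.

def integrate_step(positions, velocities, dt, gravity=-9.81):
    """Advance every particle by one explicit-Euler time step."""
    new_positions = []
    new_velocities = []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy = vy + gravity * dt           # apply gravity to velocity
        x, y = x + vx * dt, y + vy * dt  # move the particle
        if y < 0.0:                      # crude ground-plane collision
            y, vy = 0.0, -vy * 0.5       # bounce with damping
        new_positions.append((x, y))
        new_velocities.append((vx, vy))
    return new_positions, new_velocities

# One particle launched sideways from a height of 10 units.
pos, vel = [(0.0, 10.0)], [(1.0, 0.0)]
for _ in range(10):
    pos, vel = integrate_step(pos, vel, dt=0.1)
```

On a GPU, the loop body becomes the kernel and each particle a work-item, which is why wide shader arrays suit this workload so well.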
Source:
bit-tech.net
226 Comments on AMD to Demonstrate GPU Havok Physics Acceleration at GDC
EDIT:
Just read the PhysX page, seems they do. Though I have never seen it implemented.
I still don't get the cloth PhysX though. Does anyone actually think it looks real :confused:
You guys are talking too much and you never saw a single PhysX demonstration!! PhysX has everything Havok has ever had, plus many other things like the ones people are mentioning here that they want, like real fluids and massive physics (I suggest you watch a couple of demos)... You could have had ALL those things implemented since 2006, if Intel had not tried so hard to ban it from games (yeah, even before the acquisition it was in their interest) OR if you hadn't asked so passionately to ban hardware physics from games. If what you truly wanted is all that, you could have asked AMD to support it instead of bashing a product you know NOTHING about. :shadedshu
EDIT: BTW, examples of PhysX running in software mode (on crappy console CPUs) are Gears of War and Mass Effect. I don't know about you, but I would say those two games have amazing physics.
Still, I have beaten Mass Effect probably 3-5 times already and not once have I thought to myself "this game looks pretty" or "those are nice physics." Actually, I scolded the physics a few times when a Krogan gets biotic-lifted on Feros and falls down under a scaffolding where he can't be killed. Not once did I praise the physics or graphics because frankly, I couldn't care less about them.
And I have Mass Effect; it's one of my favorite games.
Havok, though, has been the mainstream physics engine for absolute ages . . . and both ATI/AMD and Intel have supported it in the past (and still do). Some of the more popular titles using Havok:
FEAR
FEAR 2
Thief: Deadly Shadows
Timeshift
Assassin's Creed
Bioshock
Company of Heroes
Fallout 3
Half-Life 2
Halo 3
StarCraft II
Diablo 3
Ghost Recon: Advanced Warfighter 2
as well as:
Futuremark 3Dmark05
Futuremark 3Dmark06
Futuremark 3DmarkVantage
and countless other mainstream titles . . . PhysX itself, based on Ageia's engine, is good . . . but it's not as heavily supported as Havok.
Thing is, if AMD and Intel can come together and start agreeing on implementation of the Havok engine (which, IIRC, Intel bought back in '07), they could quickly and easily drive nVidia out of the physics market . . . Havok is used across both console and PC platforms, and has the bigger market dominance over PhysX. The only thing nVidia has going for it, in regards to its implementation, is its large GPU dominance . . . but Intel and AMD working together could quickly drive them out.
Even still - although nVidia might be the leader in the GPU market . . . if AMD and Intel ever collaborate and push Havok further, nVidia and their monolithic hardware wouldn't stand a chance in the physics market against the two.
But that all hinges on AMD and Intel ever deciding to work together on Havok implementation.
That's another reason why NVIDIA feels threatened and was starting to talk about making their own x86 CPU.
My 9600 GT has 64 SPUs, so if I enable PhysX on it, that cuts the SPUs available for rendering to as few as 48. That is not good!
ATI cards have up to 800 SPUs. How many of those would Havok-based physics require to run properly? 40, 80, 200? I'd guess 80, because that's what their lowest-end cards usually have. Any other ideas on the SPU count required for this?
I think it's very cool to see Havok getting support like this. What I'd really like to see is the two compared in the same game via a middleware patch or something: show the differences, show the effects of each engine, etc. I think Havok is great stuff since it's been used for so long, but I don't know enough about it to say how well it will work for more realistic games in the future... same with PhysX. While PhysX is neat, it's barely used, so I don't really care either way yet; quite a few games use CPU-driven proprietary physics engines built for that specific game, and those work fine. Though if we could see a blend of PhysX/Havok, that could be something truly worth having around.

As far as AMD and Intel making Havok a standard, it could happen... whether it will, we'll find out within the next couple of years, I believe. Nonetheless, it's not worth making a big deal out of until there's a big deal to be made from results, imo. I want to see AMD/ATI cards with physics support on the end-user side, like NV has had for PhysX for months, so I can make my own judgement. Will you notice a difference in HL2 or any other game that uses Havok, with a newer processor being offloaded and the GPU being loaded more? It could be more negative than good, depending on how it's executed and what's going on in the particular scene, I suppose. I'll wait and not really worry about it until there's something more solid out there for end-users.
:toast:
Anyway, did you praise the ones in Oblivion?? You can't blame an engine for how it has been used in a game... I have said it already: there's almost no game using it to its full extent because:
a) Intel and AMD have tried so hard to ban PhysX from games.
b) Because of the comments from so many people along the lines seen here. If developers see that people don't care about physics, they will not spend their time implementing anything.
My comment was not aimed at those who don't care about physics (good for them); it's for those who seem to want better physics and at the same time are bashing PhysX, which has been delivering exactly what they wanted since its creation, but could never be fully implemented because of the points above.
And my post was directed squarely at those spilling BS about how PhysX can't do this or that. It can do everything Havok can do on the CPU, and much, much more on the GPU (until now; we'll see). I'm in no way saying this Havok GPU implementation is worse than PhysX, but I can almost say it won't be better either. Thing is, we don't know.
DON'T expect this other implementation to be adopted any more widely than PhysX, as it will face the same problems, unless Intel really wants it implemented, which would be very suspicious. It's coming 1-2 years later, so it will take time nevertheless.
All in all, my post was about the BS surrounding PhysX (that it is flawed, has no collision, etc.), not a claim that it's better than other engines. GPU physics is much better than any CPU-based physics, and PhysX is simply a very good engine that has already proven itself. This Havok implementation, on the other hand, still needs to demonstrate it has what it takes. Yet all of you are already praising it as if it were the Godsend, while bashing PhysX with clueless allegations. I wonder if that has anything to do with who is releasing it?? :rolleyes:
I don't care whether it's PhysX or Havok or any other physics implementation that wins, but I want it NOW, and PhysX is the only one that can deliver right now. That's why I support it, and why I have always supported it - not because of who owns it. On the other hand, the bias most of you have is pretty clear. GPU physics was a waste of time until yesterday, but it takes just one news post to make it the best thing ever, and now everybody wants massive physics, fluids, and whatnot. That is, the same things Ageia was doing 4 years ago and Nvidia has been capable of since the acquisition, but this time in someone else's hands. You are not happy because this is an open standard (it's not), nor because it's free for developers (it's not), nor because it's a better implementation (you don't know). You are happy because it's AMD, period. And that's plain and simple bias.
Just to finish, tell me which PhysX demos you have seen, because it's pretty clear to me you haven't seen any. There are tons of videos on YouTube if you can't watch them directly on an Nvidia GPU.
Show me some games with full PhysX utilization and maybe I'll think it's OK, but for the time being, it's a dead engine.
Also Intel CANNOT eat AMD with some "fish and chips". If they could they would have already. Intel would love nothing more than to be the undisputed king of the hill. AMD taking PhysX on with Havok is just good old competition.
Now, where's my damn drivers?
The only game I'd say that had notably good physics is Freelancer (Havok engine). When you get hit by those disorientation mines, holy $h!t. I can't say any other game impressed me in regard to physics.
The only game that impressed me in regards to graphics was X3: Reunion. It was just awesome getting close to a capital ship and seeing all the details on its surface. They did a brilliant job there and yet, it still ran well on lowly hardware. I am more impressed by them taking the time to really get it right (the models/textures) more so than the "eye-candy."
On the contrary, it's in Intel's best interest to keep AMD alive, but with the smallest market share possible. Intel could have crushed AMD whenever it liked. Their CPUs are cheaper to make, so they could actually sell them cheaper, and everybody knows they are faster. That is especially true every time they release a new batch on a smaller fab process. When 32 nm parts are released, they could price the new processors at a level AMD would never survive, but as I said, they will never do it, because it's better to have a weak enemy you already know than to let a new player enter (and most probably that new player would acquire AMD just in time anyway).
The only reason there are no more relevant companies in the market is that there's only ever room for two: the leader (which usually offers the best, but at a price) and the alternative to the leader, which is the cheaper option. If a third tries to enter a market, it has to be significantly better than that alternative while remaining cheap, or it will never take off. Why? Because most people want products from the leader, and if they can't afford them, they will elect the cheap alternative they already know; very few will take the cheap, slow, and NEW alternative. It's hard to make a new product, so you will rarely make a better one than the incumbents, and because you are new, you will never get enough revenue to keep going against the other two.
AMD is the shield Intel has against other companies that might want to enter the market, even one like IBM. IBM doesn't need to enter the consumer market, and it's not in their best interest to fight Intel and AMD there. They would be third, even being IBM. But without AMD there would be a hole that IBM could very easily fill, and once they entered and obtained AMD's current market share, they could do a lot of things to compete, things AMD can't do because it is so small.
And apart from that, there is the fact that Intel could face monopoly issues if AMD no longer existed and no other player took over. They could be forced to open x86 to everyone, for example.
Crysis has good physics, and the games that use GPU-accelerated PhysX have very good physics too. If GPU Havok is well implemented it will also offer good physics with a good amount of integration, but I am still skeptical: why would Intel let AMD make Intel's CPUs look like crap at handling Intel's own physics engine? IMO there's something shady there, or this GPU Havok is nothing more than a PR stunt. I vote for the latter.
Two PhysX items can collide and have merry fun with each other - but non-PhysX entities can't collide with them. Mirror's Edge as a loose example: you can shoot cloth and have holes appear in it, but you can't go walking on said cloth, or drop a gun on it and expect it to stay there on the realistic-*appearing* cloth.
Path 1: Make the game use a generic physics engine for people without CUDA (old NV cards, ATI). PhysX does as little as possible in this case, so that the developers don't need to duplicate any coding (two physics engines for the same items) - that's when you get items that don't collide together.
Path 2: Code two engines for everything. When PhysX is enabled everything moves over, and everything can interact with everything else.
If you were strapped for cash and time as a game developer with an unknown, brand new concept for a game... which would you take?
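The two paths above boil down to a backend-selection decision made once, at load time. A minimal sketch of that structure (all class and function names are invented for illustration; no real engine API is implied):

```python
# Hypothetical sketch of the "two path" choice described above: pick a
# full GPU physics backend when the hardware supports it, otherwise
# fall back to a minimal generic engine. All names here are invented.

class GenericPhysics:
    """Path 1: minimal CPU engine; effects-only, no cross-interaction."""
    def simulate(self, objects):
        return [f"cpu-approx:{o}" for o in objects]

class GpuPhysics:
    """Path 2: full engine; everything interacts with everything."""
    def simulate(self, objects):
        return [f"gpu-full:{o}" for o in objects]

def choose_backend(has_gpu_physics: bool):
    # Path 2 is taken only when the player's hardware qualifies;
    # otherwise the game silently degrades to Path 1.
    return GpuPhysics() if has_gpu_physics else GenericPhysics()

engine = choose_backend(has_gpu_physics=False)
print(engine.simulate(["cloth", "crate"]))
```

Path 1 is the cheap one to ship, which is exactly the economics the question above points at: a strapped studio writes the fallback once and leaves the full engine as optional eye-candy.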
DirectX 10.1 and hardware tessellation are dead for the time being.
Might have to look that up when I get home.