# AMD to Demonstrate GPU Havok Physics Acceleration at GDC



## btarunr (Mar 20, 2009)

GPU-accelerated physics is turning out to be the one item on the specifications sheet AMD is yearning for. One of NVIDIA's most profitable acquisitions in recent times has been that of Ageia Technologies and its PhysX middleware API. NVIDIA went on to port the API to its proprietary CUDA GPGPU architecture, and is now using it as a significant PR tool, apart from a feature that is genuinely grabbing game developers' attention. AMD's initial reaction to this move was to build a strategic technology alliance with the main competitor of PhysX: Havok, despite its acquisition by Intel. 

At the upcoming Game Developers Conference (GDC), AMD may materialize its plans to bring out a GPU-accelerated version of Havok, which has until now been CPU-based. The API has featured in several popular game titles such as Half-Life 2, Max Payne 2, and other Valve Source-based titles. ATI's Terry Makedon has revealed on his Twitter feed that AMD will put forth its "ATI GPU Physics strategy." He added that the company will present a tech demonstration of Havok technology working in conjunction with ATI hardware. The physics API is expected to utilize OpenCL and AMD Stream.
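For context on why a physics engine maps well onto OpenCL and Stream: a physics step applies the same integration to thousands of bodies independently, which is exactly the data-parallel workload GPUs excel at. The sketch below is purely illustrative (it is not AMD's or Havok's actual code); NumPy's vectorized rows stand in for what an OpenCL kernel would do with one work-item per body.

```python
import numpy as np

# Hypothetical illustration of a data-parallel physics step. Each row
# (particle) is updated independently, so a GPU kernel could assign one
# work-item per particle.
def step(pos, vel, dt=1.0 / 60.0, g=np.array([0.0, 0.0, -9.81])):
    """Semi-implicit Euler: update velocity first, then position."""
    vel = vel + g * dt
    pos = pos + vel * dt
    # Trivial collision response against the ground plane z = 0:
    below = pos[:, 2] < 0.0
    pos[below, 2] = 0.0
    vel[below, 2] *= -0.5  # bounce with 50% restitution
    return pos, vel

if __name__ == "__main__":
    n = 10_000
    pos = np.random.rand(n, 3) * 10.0  # random start heights
    vel = np.zeros((n, 3))
    for _ in range(600):  # simulate 10 seconds at 60 steps/s
        pos, vel = step(pos, vel)
    print(pos[:, 2].min())  # no particle ends up below the floor
```

The point of GPU acceleration is that the per-body work above is identical and independent, so throughput scales with the number of stream processors rather than CPU cores.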





*View at TechPowerUp Main Site*


----------



## [I.R.A]_FBi (Mar 20, 2009)

wow .... so quick ...


----------



## ucanmandaa (Mar 20, 2009)

at last


----------



## alexp999 (Mar 20, 2009)

Thanks to InnocentCriminal for sending this in.


----------



## iStink (Mar 20, 2009)

uhohhhhhh lol


----------



## Skywalker12345 (Mar 20, 2009)

that's sweet, now we can hopefully run Vantage with physics like nvidia


----------



## FordGT90Concept (Mar 20, 2009)

This is getting stupid.  Someone (most likely Intel) needs to step forward and standardize instructions for physics processing.  Game developers won't take the time to develop for PhysX and Havok--you'll only get one at most.


----------



## csendesmark (Mar 20, 2009)

RIP PhysX...


----------



## alexp999 (Mar 20, 2009)

FordGT90Concept said:


> This is getting stupid.  Someone (most likely Intel) needs to step forward and standardize instructions for physics processing.  Game developers won't take the time to develop for PhysX and Havok--you'll only get one at most.



Intel owns Havok


----------



## iStink (Mar 20, 2009)

alexp999 said:


> Intel owns Havok



That's interesting.  Perhaps havok will end up being bigger than we thought.


----------



## FordGT90Concept (Mar 20, 2009)

alexp999 said:


> Intel owns Havok


How could I forget that? 

So yeah, it's just a matter of time before Intel standardizes something based on Havok.


----------



## LAN_deRf_HA (Mar 20, 2009)

I'm pretty sure both havok and physx are going to become pointless with dx11. There's supposed to be a unified physics processing related to compute shaders or w/e... so what developer would pick havok or physx when they could pick this new standard that will work on ati, nvidia, and future intel cards?


----------



## iStink (Mar 20, 2009)

LAN_deRf_HA said:


> I'm pretty sure both havok and physx are going to become pointless with dx11. There's supposed to be a unified physics processing related to compute shaders or w/e... so what developer would pick havok or physx when they could pick this new standard that will work on ati, nvidia, and future intel cards?



good point, but how far away is dx11?


----------



## alexp999 (Mar 20, 2009)

iStink said:


> good point, but how far away is dx11?



Windows 7 







But obviously the hardware and drivers aren't out yet. My next rig is gonna be a DX11 / i5 rig.
I'm betting it will coincide with the launch of Windows 7 later this year. Bring on GT300 and RV8xx


----------



## FryingWeesel (Mar 20, 2009)

FordGT90Concept said:


> This is getting stupid.  Someone (most likely Intel) needs to step forward and standardize instructions for physics processing.  Game developers won't take the time to develop for PhysX and Havok--you'll only get one at most.



PhysX is open, Havok isn't (anybody can support PhysX; Intel owns Havok). Intel isn't really interested in anything being GPU-accelerated; they feel everything will stay on the CPU, just as nvidia insists everything will move to the GPU....

MS is just using OpenCL to try and pull the two sides (Havok and PhysX, or Stream/CUDA) together to get things unified; D3D11 will have specs for that.

All this bickering needs to END. It's getting OLD and helps NOBODY. AMD could add PhysX support to their drivers, just as nvidia could add Havok support using CUDA. It's just a matter of them being willing to accept that their way isn't the only way, and doing what's best for the customer instead of what looks best for their image in their minds.....


blah!!!


----------



## ShRoOmAlIsTiC (Mar 20, 2009)

All I know is I'm jealous of nvidia users. I would love to turn my 4850 into a PPU and just run my 4850X2 for graphics. I can't even run a PPU if I wanted to right now (Vista issue). ATI really should do something about that. Right now, their not being able to do physics is just making nvidia more money. People are reverting back to XP or jumping to Windows 7 just so they can run PhysX and still have ATI as their main GPU.


----------



## Binge (Mar 20, 2009)

And ATi earlier said chasing PhysX was a terrible idea for graphics... now this, which was previously rumored, materializes? Maybe nV was onto something with buying Ageia


----------



## sfp1987 (Mar 20, 2009)

Hmm, so is this Havok FX at last, or something else?

ATI did demonstrate that on the GPU ages ago, if I remember correctly.


----------



## newtekie1 (Mar 20, 2009)

I really don't care which standard is used, as long as *one* is picked as the standard.  Right now, this back and forth, trying to split the industry between PhysX and Havok, is only leading to developers not wanting to use either.

The industry needs to pick a single physics standard that runs on all hardware, and move on with that.  That is the only way we will see developers start to truly pick up detailed physics in games.


----------



## jagass (Mar 20, 2009)

Wow... very good news...


----------



## h3llb3nd4 (Mar 20, 2009)

flip a coin ppl!


----------



## sfp1987 (Mar 20, 2009)

Indeed, newtekie1


----------



## OnBoard (Mar 20, 2009)

sfp1987 said:


> Hmm, so is this Havok FX at last, or something else?
> 
> ATI did demonstrate that on the GPU ages ago, if I remember correctly.



I remember the ATI demos too, here's something from youtube

http://www.youtube.com/watch?v=gLgb9AdnaBI&fmt=18

This is getting interesting in the "wonder what will happen" sense, and bad at the same time if it turns into another HD DVD vs. Blu-ray thing.

I don't want an end result where every other game is ATI physics and every other one is NVIDIA. Either some sort of emulation for both parties, or one physics to rule them all.


----------



## ShadowFold (Mar 20, 2009)

Awesomeness. Finally a good physics engine gets some GPU acceleration! I would love to see Valve flagship this with Ep3 or whatever they have coming out!


----------



## PCpraiser100 (Mar 20, 2009)

ShadowFold said:


> Awesomeness. Finally a good physics engine gets some GPU acceleration! I would love to see Valve flagship this with Ep3 or whatever they have coming out!



Yeah, Valve has associated with ATI a lot lately in the Half-Life 2 series, as you can tell from the ATI symbol in the settings when using the HD 2000 series. ATI needs this badly, if they can force it into titles like Crysis, as physics has mostly been an enemy of low-ROP GPUs, especially when used in conjunction with the rather exclusive PhysX engine. This can increase scores on ATI cards as well in F@H sessions, better email Stanford to get on that.


----------



## Imsochobo (Mar 20, 2009)

Well, nvidia's PhysX isn't transparent to me, while the whole Havok thing works flawlessly. I'm voting for Havok, plus a piece in my CPU that can push out 200 gigaflops (GPGPU), and enjoying the offload for the CPU.


But for the time being:

I enjoy playing my games, with the physics, at 1920x1200 all maxed out and with 8X MSAA.
I just don't see where PhysX would help me, and the comparison with and without a PPU isn't that big of a deal that I would throw money at getting that little extra.

Unless everyone can run it, no matter what, and I don't have to install a stupid application.


----------



## Ben_UK (Mar 20, 2009)

Valve is now in bed with nVidia, so that would never happen.

The ATi logos from Valve's games were removed about a year or more ago.


----------



## ShadowFold (Mar 20, 2009)

There's one on the back of my Orange Box. And Left 4 Dead isn't "The way it's (not) meant to be played" so I don't think they are nvidia. I'm pretty sure they just made themselves neutral.


----------



## leonard_222003 (Mar 20, 2009)

Nice to know AMD/ATI is finally doing something here. I was worried I would eventually have to buy Nvidia if most games added a special PhysX feature that ATI can't do well, or at all.
Now the question: is it going to be just as good as PhysX, or better? I know PhysX is widespread in a lot of games, so it should be the guru of physics, but I don't know; I hope it's good.
So far, physics, hardware or software, hasn't impressed me in any way. Explosions are exaggerated by PhysX and every other physics engine, and graphics effects haven't improved to the point of being much better; it's just a thing that is hardly noticeable after some playing time, or annoying, with over-the-top explosions and a stick breaking into 100 pieces.
This rush for physics is inexplicable to me. If hardware has become so performant that every pixel on screen needs some physics calculation, then go ahead and do it, but games still look like cartoons and trees are still made from a few bitmaps.
Well, it's the new gimmick Nvidia has over ATI, and they will try to make it as important as they can, but in reality it hardly matters for the final game. The simple physics we had until now have become greatly exaggerated and not at all lifelike; it still looks like computer-generated physics.


----------



## PCpraiser100 (Mar 20, 2009)

ShadowFold said:


> There's one on the back of my Orange Box. And Left 4 Dead isn't "The way it's (not) meant to be played" so I don't think they are nvidia. I'm pretty sure they just made themselves neutral.



Definitely! I can tell too, because in Half-Life benchmarks it's rather a fair game between the two video giants. Many games were crowned to Nvidia because the devs optimized them ONLY for Nvidia, which means that in HL2 benches you get news that feels weird to green fans, like the HD 3870X2 beating the 8800 Ultra in Half-Life 2: Episode Two. 

Anyway, before I get too far off-topic: they should get some drivers in with Havok to finally counteract CUDA in F@H or 3DMark. Should we create a thread for codenames if ATI is already underway?


----------



## newtekie1 (Mar 20, 2009)

PCpraiser100 said:


> This can increase scores on ATI cards as well in F@H sessions, better email Stanford to get on that.



This won't affect F@H scores at all.


----------



## TreadR (Mar 20, 2009)

I was just going through the list of games that use PhysX and/or Havok... and I don't see why people make such a fuss about it.

For me, the games I like to play and enjoy that would really benefit from this GPU-accelerated physics are... few. Those that use PhysX are GRAW 1 & 2, MoH and UT3, and those that use Havok are Assassin's Creed, HL:OB, MoH (also), Saints Row 2 (only X360?) and The Godfather.

And I bet a lot of other gamers also play only a few titles in this sense.


----------



## Imsochobo (Mar 20, 2009)

TreadR said:


> I was just going through the list of games that use PhysX and/or Havok... and I don't see why people make such a fuss about it.
> 
> For me, the games I like to play and enjoy that would really benefit from this GPU-accelerated physics are... few. Those that use PhysX are GRAW 1 & 2, MoH and UT3, and those that use Havok are Assassin's Creed, HL:OB, MoH (also), Saints Row 2 (only X360?) and The Godfather.
> 
> And I bet a lot of other gamers also play only a few titles in this sense.



StarCraft II, Diablo III... uhm, well, there are a lot of games using Havok.


But agreed: can anyone show the type of physics Ageia bragged about in their videos, in a real game, in a real situation?


----------



## CDdude55 (Mar 20, 2009)

Should be interesting, now Nvidia and Ati are getting into a GPU physics war..

Even though Havok is on pretty much everything (all the consoles have it).


----------



## ShadowFold (Mar 20, 2009)

Oblivion/Fallout 3, the Source engine (Half-Life, Left 4 Dead, Team Fortress 2, Portal..), Rainbow Six Vegas, Saints Row, Hellgate, Company of Heroes, FEAR, Halo, Shadowrun.. The list goes on. A lot of developers know and love Havok; this is a great move on AMD's part.


----------



## csendesmark (Mar 20, 2009)

CDdude55 said:


> Should be interesting, now Nvidia and Ati are getting into a GPU physics war..
> 
> Even though Havok is on pretty much everything (all the consoles have it).



PhysX support in games is very limited

Havok has the big market share


----------



## CDdude55 (Mar 20, 2009)

csendesmark said:


> PhysX support in games is very limited
> 
> Havok has the big market share



Because Havok is used in pretty much everything, as I said (the consoles have it too).

PhysX isn't that bad; once it gets included in more titles and the developers really try to bring it out in games, it will be great.


----------



## BrooksyX (Mar 20, 2009)

Interesting stuff. Can't wait to see some benches and results.


----------



## TreadR (Mar 20, 2009)

Imsochobo said:


> StarCraft II, Diablo III... uhm, well, there are a lot of games using Havok.



That's not what I meant... I'm referring to the list of games that each person actually has, games that he or she plays and enjoys.

True, there are a lot of games on each side, but I know that only a few people like to play EVERYTHING... and the rest play a few select games.

In this sense, how much does it really matter?


----------



## kid41212003 (Mar 20, 2009)

Intel already bought AMD secretly; they're manipulating the market.

AMD adopting Havok pretty much proves this.


----------



## KainXS (Mar 21, 2009)

I am hoping PhysX will die soon, because most physics-heavy games are Havok-based


----------



## Frizz (Mar 21, 2009)

Imsochobo said:


> StarCraft II, Diablo III... uhm, well, there are a lot of games using Havok.
> 
> 
> But agreed: can anyone show the type of physics Ageia bragged about in their videos, in a real game, in a real situation?



If Blizzard is using Havok for their upcoming games, then this won't be such a hard decision for AMD/ATI, because I can guess that only ten million copies will be sold on the first day of release, lol.


----------



## Steevo (Mar 21, 2009)

Don't forget ATI/AMD are hand-holders with MS, while Nvidia is always out in left field doing unmentionable things with a baseball bat, a glove, and a Victoria's Secret catalog.


----------



## VanguardGX (Mar 21, 2009)

Off topic: Just checked the back of my Half-Life 2 episode pack and it's got the nvidia logo on it! I thought this was an ATi game. 

On topic: I don't like this whole PhysX vs. Havok thing, the fact that there will be two different APIs competing for dominance. It means some of us are gonna miss out, as some games will support Havok and others will support PhysX, ATi or nvidia. Developers are not gonna spend the extra cash or dev time to incorporate both in the same game.


----------



## ShadowFold (Mar 21, 2009)

Meh, no one uses PhysX now, why would they start? Like I said before, a lot of devs already use Havok and are familiar with it, so I think they will use it.
And about the episode pack: Valve did sport the nvidia logo for a short time a while after Portal came out, they gave nvidia owners a demo of it, woo hoo


----------



## VanguardGX (Mar 21, 2009)

ShadowFold said:


> Meh, no one uses PhysX now, why would they start? Like I said before, a lot of devs already use Havok and are familiar with it, so I think they will use it.



I dunno man, nvidia can be a very influential company. They got most every developer to forget about DX10.1. 

I would love to see what they're gonna do when DX11 comes out.
OpenCL could render PhysX dead.


----------



## ShadowFold (Mar 21, 2009)

Well, they aren't very persuasive with PhysX, since almost no one has used it. Some use it as a primary engine, but none really use it heavily like in Warmonger or.. yea.


----------



## Frizz (Mar 21, 2009)

NVIDIA is an influential company, and so is ATI, since they're the only two players out for the crown at the moment. And none of us will be alive to see a winner in the end, because to me this is just an endless cycle of what's new and what's going to be phased out. But for the time being, ATI/AMD + Microsoft + Blizzard is going to be a beast of a combination.

PhysX and Havok: I doubt there is much to miss there but moving cloth, so be it a PhysX or Havok game, we'll all still be able to experience the best of all the good games to come.


----------



## ShadowFold (Mar 21, 2009)

randomflip said:


> PhysX and Havok: I doubt there is much to miss there but moving cloth, so be it a PhysX or Havok game, we'll all still be able to experience the best of all the good games to come.


True that! But from what I've seen of Half-Life 2, I think it will have actual mass physics. Play Garry's Mod and spawn a ton of boxes or something; it doesn't start lagging until you spawn about 50-80 of them (for me). The only thing I've seen from PhysX is fancy cloth and glass, who cares? I want mass physics and explosions!


----------



## Mussels (Mar 21, 2009)

Yay. another reason to justify me getting my 4870.


----------



## Frizz (Mar 21, 2009)

ShadowFold said:


> True that! But from what I've seen of Half-Life 2, I think it will have actual mass physics. Play Garry's Mod and spawn a ton of boxes or something; it doesn't start lagging until you spawn about 50-80 of them (for me). The only thing I've seen from PhysX is fancy cloth and glass, who cares? I want mass physics and explosions!



I need a question answered. The Froblins demo and ATI Stream processing: is Stream processing similar to Nvidia's PhysX? If so, is it an example of mass physics?


----------



## Mussels (Mar 21, 2009)

Stream processing is similar to Nvidia's CUDA.


----------



## ShadowFold (Mar 21, 2009)

randomflip said:


> I need a question answered. The Froblins demo and ATI Stream processing: is Stream processing similar to Nvidia's PhysX? If so, is it an example of mass physics?



No, Havok will be. Stream has nothing to do with gaming; it's more for business-type stuff.


----------



## DarkMatter (Mar 21, 2009)

ShadowFold said:


> True that! But from what I've seen of Half-Life 2, I think it will have actual mass physics. Play Garry's Mod and spawn a ton of boxes or something; it doesn't start lagging until you spawn about 50-80 of them (for me). The only thing I've seen from PhysX is fancy cloth and glass, who cares? I want mass physics and explosions!



With PhysX you can spawn 5000 boxes and still get no lag at all on an 8800 GT. You have that many boxes in the PhysX screensaver.

The only reason you are not seeing that kind of utilization is the pressure from Intel and AMD.


----------



## WarEagleAU (Mar 21, 2009)

I thought PhysX and Havok were pretty much using the same API and software and whatnot? I'm all for AMD getting behind the physics deal, but I hope it's not like Betamax vs. VHS :/


----------



## FordGT90Concept (Mar 21, 2009)

WarEagleAU said:


> I thought PhysX and Havok were pretty much using the same API and software and whatnot? I'm all for AMD getting behind the physics deal, but I hope it's not like Betamax vs. VHS :/


It is, and backed by popular demand, VHS/Havok wins.  Havok has time on its side, as it was used in many games long before Ageia showed up out of nowhere.


----------



## Mussels (Mar 21, 2009)

If you look at how PhysX is being used, it makes me wonder if it's flawed somehow.

Havok: moving boxes that the player can use/jump on. Hell, look what they can do in Gmod with the engine; I built a flying spaceship using in-game Havok physics, then crashed it into the moon. Epic.

Ageia/Nvidia PhysX: glass and cloth, hailstorms and a tornado map in UT3. I'm yet to see INTERACTIVE PhysX. Why? Is it because it'd be too similar to Havok, or is it because it just can't do it?


----------



## Error 404 (Mar 21, 2009)

ShadowFold said:


> True that! But from what I've seen on Half-Life 2 I think it will have actual mass physics. Play Garry's Mod and spawn a ton of boxes or something, it doesn't start lagging until you spawn about 50-80 of them(for me), the only thing I've seen from PhysX is fancy cloth and glass, who cares? I want mass physics and explosions!



I play Garry's Mod a lot, and the fact that the current Source engine only uses one core of my CPU to run everything, and still doesn't lag until I get some really physics-heavy stuff happening (like the 50 boxes example), is a testament to how good Havok is.
What I want to see is fluid dynamics! I want to be able to pour water into a box, and have boxes float on the water because of displacement, not a pre-set buoyancy level!
I've considered upgrading to an ATI card once I have some money, and this would be a great reason for me to get one.

Just checked, and my Orange Box pack has the ATI logo on it.


----------



## FordGT90Concept (Mar 21, 2009)

I think it's because Havok was designed by gamers, for game developers.  They knew that it is only useful to developers if it has collision detection as well.

NVIDIA, on the other hand, focuses on "eye candy."  They don't see the value or necessity in collision detection and thus assume it will be handled separately.  It takes twice as much work for the programmer, because they have to wrap collision detection code around PhysX.


At the same time, what you saw could just be a graphical demonstrator and not really be about gaming/development.  It's hard to say.  What is certain is Havok is well received in developer circles and NVIDIA is going to have a very, very hard time unseating it.


----------



## Mussels (Mar 21, 2009)

FordGT90Concept said:


> I think it's because Havok was designed by gamers, for game developers.  They knew that it is only useful to developers if it has collision detection as well.
> 
> NVIDIA, on the other hand, focuses on "eye candy."  They don't see the value or necessity in collision detection and thus assume it will be handled separately.  It takes twice as much work for the programmer, because they have to wrap collision detection code around PhysX.
> 
> ...



Collision detection is the term I needed when writing my post, thanks.


----------



## ShinyG (Mar 21, 2009)

The fact is that both Intel and nVidia are big players and they are trying to set standards. That's why both are trying to convince everybody they are right and the other guys are idiots. 

So, Intel and AMD are both involved in Havok. Havok is already implemented at a CPU level with palpable results in the gaming world. Convincing developers to use their already accumulated expertise to implement these at a GPU level using AMD's SDK is far easier than it is for nVidia to convince them they should ditch everything and come join the green side of physics. 
Fans will always say PhysX is better because it's "green" but I think we should let developers decide...


----------



## Error 404 (Mar 21, 2009)

Mussels said:


> Collision detection is the term I needed when writing my post, thanks.



Collision detection is going to be very important in future games, IMO; the scenario of spawning heaps of boxes in Gmod relies on collision detection, and if a GPU could handle that, it would raise the number of collisions possible by a huge factor.
In the future, I hope to see games that have collisions like ricocheting bullets and shrapnel, or even the ability for props to warp or shatter realistically under impact, instead of just disintegrating or an image of a bullet hole appearing.
That would be frickin awesome!
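The "huge factor" comes mostly from the broad phase: instead of testing every pair of objects (O(n²)), engines bucket objects into a spatial grid so only neighbors get compared, and each bucket can be processed independently, which is what makes it a good GPU fit. A toy 2D sketch of the idea (hypothetical, not Havok's or PhysX's actual scheme; `collision_pairs` and its parameters are made up for illustration):

```python
from collections import defaultdict
from itertools import combinations

def collision_pairs(centers, radius=0.5, cell=1.0):
    """Broad-phase collision detection via spatial hashing.

    Returns the pairs of equal-radius circles that actually overlap.
    Only objects sharing a grid cell (or one of its 8 neighbors) are
    compared, instead of all n*(n-1)/2 pairs.
    """
    grid = defaultdict(list)
    for i, (x, y) in enumerate(centers):
        grid[(int(x // cell), int(y // cell))].append(i)

    hits = set()
    for (cx, cy), members in grid.items():
        # Gather this cell plus its 8 neighbors.
        nearby = []
        for dx in (-1, 0, 1):
            for dy in (-1, 0, 1):
                nearby.extend(grid.get((cx + dx, cy + dy), []))
        # Narrow phase: exact overlap test on the candidates only.
        for i, j in combinations(sorted(set(nearby)), 2):
            (x1, y1), (x2, y2) = centers[i], centers[j]
            if (x1 - x2) ** 2 + (y1 - y2) ** 2 < (2 * radius) ** 2:
                hits.add((i, j))
    return hits
```

With the cell size at least the object diameter, any overlapping pair is guaranteed to land in the same or adjacent cells, and each cell's work is independent, which is the property a GPU exploits.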


----------



## FordGT90Concept (Mar 21, 2009)

Error 404 said:


> Collision detection is going to be very important in future games, IMO; the scenario of spawning heaps of boxes in Gmod relies on collision detection, and if a GPU could handle that, it would raise the number of collisions possible by a huge factor.
> In the future, I hope to see games that have collisions like ricocheting bullets and shrapnel, or even the ability for props to warp or shatter realistically under impact, instead of just disintegrating or an image of a bullet hole appearing.
> That would be frickin awesome!


There is no 3D game it isn't important in.  Without it, you, your units, your allies, your foes, the guns, the ammo, and everything else that can move would fall through the floor and never stop falling until it throws some mathematical exception.

The thing is, most (if not all) games can get by with very simplified physics, including collision detection.  It is literally as simple as this:

```
if (z < floor)
  // you fell off the map
else
  // you're still on, or above, the floor
```

Even simulating a bullet's motion isn't very hard.  What makes it hard is the details:
a) Is there wind blowing?
b) Is gravity involved?
c) Is there humidity?
d) What's the air pressure and density?
e) What is the muzzle velocity?
f) Do we make objects react to impact?  If so, how complex?
etc.

Add on top of that the volume of bullets created by a minigun.  Then add on top of that 20 enemies firing miniguns at you at the same time.  Do you make the enemies all for show and their hits just a probability, with no physics at all?  Do you just not allow that many miniguns in the scene at a time? etc.

Early games broke the physics of firing down to something as simple as this:

```
if (cursor.target == enemy.hitbox)
  // enemy hurt
else
  // not hit
```

As the average computer grows more powerful, developers are adding more and more physics calculations, more triangles on characters/walls/objects, and more objects that aren't attached to something.

But, as always, it is the graphics that are still killing performance more than anything else.  So no matter how many physics calculations you add, it is still the ability to render graphics that drags performance down.  Physics, except in games centered around it, like World of Goo and Portal, just isn't that high a priority.
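The bullet details above (a-f) can be made concrete with a toy trajectory integrator. Everything here is illustrative: `simulate_bullet`, the drag coefficient, and the wind model are invented for this sketch, not taken from any shipping engine or ballistic table.

```python
import math

def simulate_bullet(muzzle_velocity=300.0, angle_deg=0.0, height=1.5,
                    wind=0.0, drag_coeff=0.002, dt=0.001):
    """Integrate a 2D bullet path until it hits the ground (y = 0).

    Returns (range_m, flight_time_s). Drag is a simple quadratic model
    opposing velocity; `wind` is a constant horizontal acceleration.
    All constants are illustrative only.
    """
    angle = math.radians(angle_deg)
    x, y = 0.0, height
    vx = muzzle_velocity * math.cos(angle)
    vy = muzzle_velocity * math.sin(angle)
    t = 0.0
    while y > 0.0:
        speed = math.hypot(vx, vy)
        ax = wind - drag_coeff * speed * vx   # wind + drag (item a)
        ay = -9.81 - drag_coeff * speed * vy  # gravity + drag (item b)
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
        t += dt
    return x, t
```

Fired level from 1.5 m up, the bullet reaches the ground in well under a second regardless of muzzle velocity, and every extra detail (wind, drag, density) is just another term in the acceleration. The cost problem the post describes is that a minigun scene runs hundreds of these integrations per frame.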


----------



## shk021051 (Mar 21, 2009)

is this physics only for the 4890, or all 4800 series?


----------



## btarunr (Mar 21, 2009)

shk021051 said:


> is this physics only for the 4890, or all 4800 series?



Every ATI GPU since the Radeon HD 2000 series (all of which are capable of stream processing).


----------



## Wile E (Mar 21, 2009)

I remember they ran a GPU physics demo back in the X1950 days.


----------



## btarunr (Mar 21, 2009)

Both ATI and NVIDIA announced GPU physics support in reaction to Ageia back then. NVIDIA had something it called "Quantum Effects Technology", which was listed in the specs of the 8800 GTX / GTS. Neither really had an approach.


----------



## shk021051 (Mar 21, 2009)

when is the Havok physics release?


----------



## FordGT90Concept (Mar 21, 2009)

Havok was first released in 2000 and has appeared in 150 games since.


----------



## alexp999 (Mar 21, 2009)

Not taking a dig at PhysX, but isn't Havok a lot more efficient, using fewer resources on the CPU?

I have seen Havok in loads of games, especially on the 360, but hardly ever PhysX.


----------



## FordGT90Concept (Mar 21, 2009)

Havok is growing with CPU power (Havok FX was supposed to run on ATI and NVIDIA cards, but it might be canceled).  PhysX is really quite pointless when Havok does the job without requiring or recommending additional hardware.


----------



## Mussels (Mar 21, 2009)

alexp999 said:


> Not taking a dig at PhysX, but isn't Havok a lot more efficient, using fewer resources on the CPU?
> 
> I have seen Havok in loads of games, especially on the 360, but hardly ever PhysX.



As has been said, Havok is physics as we know it, while PhysX is just looks with no substance.

Look at Mirror's Edge: the PhysX does nothing but make glass and cloth look better when shot.


----------



## alexp999 (Mar 21, 2009)

That's what I have never got about PhysX: it never seems to do that much, but consumes tons of resources in the process.

The ragdoll physics of Havok, on the other hand, are quite spectacular.


----------



## btarunr (Mar 21, 2009)

The one department where PhysX looks more promising than Havok is destructible environments (DE). We got just a small glimpse of that in Warmonger. DEs need a lot of processing, which is why only GPUs and the PPU have been able to keep up. You're not throwing barrels at your enemy, you're breaking a wall that accurately crumbles into a large number of fragments.


----------



## alexp999 (Mar 21, 2009)

btarunr said:


> The one department where PhysX looks more promising than Havok is destructible environments (DE). We got just a small glimpse of that in Warmonger. DEs need a lot of processing, which is why only GPUs and the PPU have been able to keep up. You're not throwing barrels at your enemy, you're breaking a wall that accurately crumbles into a large number of fragments.



Which physics engine did The Force Unleashed use for its destructible environments?

EDIT:

Ahh wait here it is:

DMM Engine

That game used three physics engines! 

Havok - General Physics
Euphoria - Character physics
DMM - Environmental Physics


----------



## btarunr (Mar 21, 2009)

PhysX does the job of all three: general (fluids, projectiles, as in Warmonger and UT3), character, as in Mirror's Edge, and environmental, as in Warmonger and Cryostasis.


----------



## alexp999 (Mar 21, 2009)

I didn't think PhysX did character models? Just environments and fluids?

EDIT:

Just read the PhysX page; seems it does, though I have never seen it implemented.

I still don't get the cloth PhysX, though. Does anyone actually think it looks real?


----------



## DarkMatter (Mar 21, 2009)

Sorry, but some comments in this thread are just stupid, and TBH I call BS on them and trolling up to this point. Sorry, because I'm talking about people who have been on TPU for a long time, but I'm just amazed at how much people can talk about (and bash) a thing without even knowing what it does. *PhysX has no collision detection? My GOD! Character physics and ragdoll? Of course it does all those things, FFS!! Requiring additional hardware? NO!! (Only for massive physics; you don't need it for a very small number of particles, or 50-100 boxes)*

You guys are talking too much and you never saw one single PhysX demonstration!! PhysX has everything Havok has always had, plus many other things like the ones people are mentioning here that they want, like real fluids and massive physics (I suggest you see a pair of demos)... You could have had ALL those things implemented since 2006 if Intel had not tried so hard to ban it from games (yeah, even before the acquisition it was in their interest), OR if you hadn't asked so passionately to ban hardware physics from games. If what you truly wanted is all that, you could have asked AMD to support it instead of bashing a product you know NOTHING about. :shadedshu

EDIT: BTW, examples of PhysX running in software mode (and on crappy console CPUs) are Gears of War and Mass Effect. I don't know about you, but I would say those two games have amazing physics.


----------



## FordGT90Concept (Mar 21, 2009)

Mass Effect runs on Unreal Engine 3. Unreal Engine 3 has core physics coded by James Golding and also offers support for NVIDIA PhysX as middleware.  It is present, but that doesn't mean they have to use it.


Still, I have beaten Mass Effect probably 3-5 times already, and not once have I thought to myself "this game looks pretty" or "those are nice physics."  Actually, I scolded the physics a few times when a Krogan gets biotic-lifted on Feros and falls down under a scaffolding where he can't be killed.  Not once did I praise the physics or graphics because, frankly, I couldn't care less about them.


----------



## ShadowFold (Mar 21, 2009)

DarkMatter said:


> Sorry, but some comments in this thread are just stupid, and TBH I call BS and trolling on them up to this point. Sorry, because I'm talking about people who have been on TPU for a long time, but I'm just amazed at how much people can talk about (and bash) a thing without even knowing what it does. *PhysX has no collision detection? My GOD! Character physics and ragdoll? Of course it does all those things, FFS!! Requiring additional hardware? NO!! (Only for massive physics; you don't need it for a very small number of particles, or 50-100 boxes.)*
> 
> You guys are talking too much, and you never saw one single PhysX demonstration!! PhysX has everything Havok has ever had, plus many other things like the ones people are mentioning here that they want, like real fluids and massive physics (I suggest you see a pair of demos)... You could have had ALL those things implemented since 2006 if Intel had not tried so hard to ban it from games (yeah, even before the acquisition it was in their interest), OR if you hadn't asked so passionately to ban hardware physics from games. If what you truly wanted is all that, you could have asked AMD to support it instead of bashing a product you know NOTHING about. :shadedshu
> 
> EDIT: BTW, examples of PhysX running in software mode (and on crappy console CPUs) are Gears of War and Mass Effect. I don't know about you, but I would say those two games have amazing physics.



So where are these AWESOME games that have these awesome PhysX demonstrations? The only game that made me go "hm, nice physics" was Half-Life 2. When I had my 280, nothing with PhysX was any good.

And I have Mass Effect; it's one of my favorite games.


----------



## imperialreign (Mar 21, 2009)

ShadowFold said:


> So where are these AWESOME games that have these awesome PhysX demonstrations? The only game that made me go "hm, nice physics" was Half-Life 2. When I had my 280, nothing with PhysX was any good.
> 
> And I have Mass Effect; it's one of my favorite games.



The only game I can recall that has actively used the _new_ PhysX implementation was *Mirror's Edge*.

Havok, though, has been the mainstream physics engine for absolute ages . . . and both ATI/AMD and Intel have supported it in the past (and still do).  Some of the more popular titles using Havok:

FEAR
FEAR 2
Thief: Deadly Shadows
Timeshift
Assassin's Creed
Bioshock
Company of Heroes
Fallout 3
Half-Life 2
Halo 3
StarCraft II
Diablo 3
Ghost Recon: Advanced Warfighter 2

as well as:

Futuremark 3Dmark05
Futuremark 3Dmark06
Futuremark 3DmarkVantage

and countless other mainstream titles . . . PhysX itself, based on Ageia's engine, is good . . . but it's not as heavily supported as Havok.

Thing is, if AMD and Intel can come together and start agreeing on implementation of the Havok engine (which, IIRC, Intel bought back in '07), they could quickly and easily drive nVidia out of the physics market . . . Havok is used across both console and PC platforms, and has the bigger market dominance over PhysX.  The only thing that nVidia has going for them, in regards to their implementation, is their large GPU dominance . . . but Intel and AMD working together could quickly drive them out.


----------



## hat (Mar 21, 2009)

Great... Havok on AMD GPUs, PhysX on NVIDIA GPUs... now what am I supposed to do? I got all excited because I could set my IGP to run PhysX while my 9800 GT focuses its undivided attention on my games, but now this comes out. Sigh.


----------



## Lillebror (Mar 21, 2009)

imperialreign said:


> ...Futuremark 3DmarkVantage..



There is a reason why people with NVIDIA cards score so high: Vantage uses PhysX, which runs on the CPU - or on the GPU if it's enabled!


----------



## imperialreign (Mar 21, 2009)

Lillebror said:


> There is a reason why people with NVIDIA cards score so high: Vantage uses PhysX, which runs on the CPU - or on the GPU if it's enabled!



there has been a lot of debate over that in the past . . . namely, over whether or not the PhysX scores are legitimate . . .

even still - although nVidia might be the leader in the GPU market . . . if AMD and Intel ever collaborate and push Havok further, nVidia and their monolithic hardware wouldn't stand a chance in the physics market against the two.

But that all hinges on AMD and Intel ever deciding to work together on Havok implementation.


----------



## FordGT90Concept (Mar 21, 2009)

hat said:


> great... havok on amd gpus, physx on nvidia gpus... now what am I supposed to do? I got all excited cause I could set my IGP to run Physx while my 9800gt focuses its undivided attention on my games but now this comes out. sigh


I think Intel and AMD will be releasing processors with dedicated PPUs on-die (or GPUs that can act as physics processors).  When not used for physics, they could be used for something else.

That's another reason why NVIDIA feels threatened and was starting to talk about making its own x86 CPU.


----------



## Error 404 (Mar 21, 2009)

PhysX requires 256 MB of RAM and 16 SPUs on an 8x00-series card or higher, at minimum, to work, IIRC.
My 9600 GT has 64 SPUs, so if I enable PhysX on it, that reduces my SPU count to a minimum of 48. That is not good!
ATI cards have up to 800 SPUs. How many of those would Havok-based physics require to run properly? 40, 80, 200? I'd guess 80, because that's what their lowest-end cards usually have. Any other ideas on the SPU count required for this?


----------



## Kursah (Mar 21, 2009)

Error 404 said:


> PhysX requires 256 MB of RAM and 16 SPUs on an 8x00-series card or higher, at minimum, to work, IIRC.
> My 9600 GT has 64 SPUs, so if I enable PhysX on it, that reduces my SPU count to a minimum of 48. That is not good!
> ATI cards have up to 800 SPUs. How many of those would Havok-based physics require to run properly? 40, 80, 200? I'd guess 80, because that's what their lowest-end cards usually have. Any other ideas on the SPU count required for this?



The way SPUs are counted between ATI and NV is different; there's a breakdown of it on TPU and on the web. Not that big of a deal - both took slightly different routes on SPs and types of SPs, which is good IMO, and both have shown that each route is quite capable.

I think it's very cool to see Havok getting support like this. Really, what I would like to see is the two compared in the same game via a middleware patch or something: show the differences, show the effects of each engine, etc. I think Havok is great stuff since it's been used so long, but I don't know enough about it to know just how well it will work for more realistic games in the future... same with PhysX, though. While it is neat, it's not widely used, and I don't really care either way yet, because there are quite a few games that use CPU-driven proprietary physics engines built for that specific game which work fine.

Though if we could see a blend of PhysX/Havok, that could be something truly worth having around; that'd be the way to go. As far as AMD and Intel making Havok a standard, it could happen... whether it will, we'll find out within the next couple of years, I believe. Nonetheless, it's not worth making a big deal out of till there's a big deal to be made from results, IMO. I want to see AMD/ATI cards with physics support on the end-user side, like NV has had for PhysX for months, to make my own judgement. Will you notice a difference in HL2 or any other game that uses Havok, with a newer processor being offloaded and the GPU being loaded more? It could be more negative than good depending on how it's executed and just what's going on in the particular scene, I suppose. I'll wait and not really worry about it till there's something more solid out there for end-users.


----------



## DarkMatter (Mar 21, 2009)

FordGT90Concept said:


> Mass Effect is ran on Unreal Engine 3. Unreal Engine 3 has core physics coded by James Golding and also offers support for NVIDIA PhysX as middleware.  It is present but that doesn't mean they have to use it.
> 
> 
> Still, I have beat Mass Effect probably 3-5 times already and not one time have I thought to myself "this game looks pretty" or "those are nice physics."  Actually, I scolded the physics a few times when a Krogan gets bionic lifted on Feros and falls down under a scaffolding where he can't be killed.  Not once did I praise the physics or graphics because frankly, I couldn't care less about them.



In those games, PhysX is the physics engine in use.
Anyway, did you praise the ones in Oblivion?? You can't blame an engine for how it has been used in a game...



ShadowFold said:


> So where are these AWESOME games that have these awesome PhysX demonstrations? The only game that made me go "hm, nice physics" was Half-Life 2. When I had my 280, nothing with PhysX was any good.
> 
> And I have Mass Effect; it's one of my favorite games.



I have said it already. There's almost no game using it to its full extent because:

a) Intel and AMD have tried so hard to ban PhysX from games.

b) of the comments from so many people along the lines seen here. If developers see that people don't care about physics, they will not spend their time implementing anything.

My comment was not for those who don't care about physics (good for them); it's for those who seem to want better physics and at the same time are bashing PhysX, which has been delivering exactly what they wanted since its creation, but could never be fully implemented because of the points above.

And my post was directed at those spreading BS like "PhysX can't do this or that." It can do everything that Havok can do on the CPU, and much, much more when on the GPU (until now; we'll see). I'm in no way saying this Havok GPU implementation is worse than PhysX, but I can almost say it won't be better either. Thing is, we don't know.

DON'T expect this other implementation to be adopted more than PhysX, as it will face the same problems, unless Intel really wants it implemented, which would be very suspicious. It's coming 1-2 years later, so it will take time nevertheless.

All in all, my post was regarding the BS about PhysX (that it is flawed, has no collision, etc.), not saying it's any better than other engines. GPU physics is much better than any CPU-based physics, and PhysX is just a very good engine that has already proven itself. On the other hand, this Havok implementation still needs to demonstrate whether it has what it takes. Yet all of you are already praising it as if it were the Godsend, and at the same time bashing PhysX with clueless allegations. I wonder if it has anything to do with who is releasing it??

I don't care whether it's PhysX or Havok or any other physics implementation that wins, but I want it NOW already, and PhysX is the only one that can do it right now. That's why I support it, why I have always supported it - not because of whom it belongs to. On the other hand, the bias that most of you guys have is pretty clear. GPU physics was a waste of time until yesterday, but it just takes one news post to make it the best thing ever, and now everybody wants massive physics, fluids and whatnot. That is, the same things that Ageia was doing 4 years ago and NVIDIA has been capable of since the acquisition, but this time in the hands of someone else. Because you are not happy because this is an open standard (it's not), nor because it's free for developers (it's not), nor because it's a better implementation (you don't know). You are happy because it's AMD, period. And that's plainly and simply biased.

Just to finish, tell me which PhysX demos you have seen, because it's pretty clear to me you didn't see any. There are tons of videos on YouTube if you can't see them directly on an NVIDIA GPU.


----------



## TheMailMan78 (Mar 21, 2009)

Ok I don't care about this debate. When will I see some drivers? I have a 4850 just itching to do some physics processing.


----------



## DarkMatter (Mar 21, 2009)

imperialreign said:


> Some of the more popular titles using Havok:
> 
> ...
> Ghost Recon: Advanced Warfighter 2
> ...



Excuse me???



> Thing is, if AMD and Intel can come together and start agreeing on implementation of the Havok engine (which, IIRC, Intel bought back in '07), they could quickly and easily drive nVidia out of the physics market . . . Havok is used across both console and PC platforms, and has the bigger market dominance over PhysX.  The only thing that nVidia has going for them, in regards to their implementation, is their large GPU dominance . . . but Intel and AMD working together could quickly drive them out.



That is completely true, but there's nothing good about that. What do you think will happen when PhysX (or Nvidia) is out of the game? Intel will eat AMD with some fish and chips, altogether. AMD is giving Intel the keys to the gaming and GPU markets, and Intel will be second to none - at least if they give them such advantages, and AMD can't afford that luxury. It's funny, because people think it was smart for AMD not to adopt PhysX because it belonged to Nvidia, but now them supporting Intel's Havok is the best thing ever? And the thing is that the company against which AMD has filed lawsuits for unfair competition is Intel, not Nvidia. Also, while AMD has released many competent CPUs that were just as fast and sometimes faster in the business market, or in mainstream programs, it's been almost invariably lagging behind in games. I wonder why...


----------



## ShadowFold (Mar 21, 2009)

DarkMatter said:


> In those games PhysX is the physics engine in use.
> Anyway, did you praise the ones in Oblivion?? You can't blame an engine because of how it has been used in a game...
> 
> 
> ...



Tl;dr
Show me some games with full PhysX utilization and maybe I will think it's ok but for the time being, it's a dead engine.


----------



## TheMailMan78 (Mar 22, 2009)

DarkMatter said:


> What do you think will happen when PhysX (or Nvidia) is out of the game? Intel will eat AMD with some fish and chips, altogether. AMD is giving Intel the keys to the gaming and GPU markets, and Intel will be second to none - at least if they give them such advantages, and AMD can't afford that luxury.


The world losing PhysX will not mean "game over" for AMD or Nvidia. As far as Nvidia being shut down, I don't think we have anything to worry about. It's not like the only thing keeping them alive is PhysX.

Also, Intel CANNOT eat AMD with some "fish and chips". If they could, they would have already. Intel would love nothing more than to be the undisputed king of the hill. AMD taking PhysX on with Havok is just good old competition.

Now where's my damn drivers?


----------



## FordGT90Concept (Mar 22, 2009)

DarkMatter said:


> In those games PhysX is the physics engine in use.
> Anyway, did you praise the ones in Oblivion?? You can't blame an engine because of how it has been used in a game...


Oblivion's only strong suit is length of gameplay and the voice acting.  If you do everything there is to do in the game with official mods and the expansion pack, you can easily break 100 hours of gameplay.  The mechanics of the game weren't really notable (movement seemed a bit awkward, all maps were pretty dumbed down/repetitive, combat is pretty bland and repetitive, etc.).

The only game I'd say that had notably good physics is Freelancer (Havok engine).  When you get hit by those disorientation mines, holy $h!t.  I can't say any other game impressed me in regard to physics.

The only game that impressed me in regards to graphics was X3: Reunion.  It was just awesome getting close to a capital ship and seeing all the details on its surface.  They did a brilliant job there and yet, it still ran well on lowly hardware.  I am more impressed by them taking the time to really get it right (the models/textures) more so than the "eye-candy."


----------



## DarkMatter (Mar 22, 2009)

TheMailMan78 said:


> The world losing PhysX will not mean "game over" for AMD or Nvidia. As far as Nvidia being shut down, I don't think we have anything to worry about. It's not like the only thing keeping them alive is PhysX.
> 
> Also, Intel CANNOT eat AMD with some "fish and chips". If they could, they would have already. Intel would love nothing more than to be the undisputed king of the hill. AMD taking PhysX on with Havok is just good old competition.
> 
> Now where's my damn drivers?



By "eat alive" I meant that Intel would have >90% of the market share in both the CPU and GPU markets. They don't want AMD to disappear.

On the contrary, it's in Intel's best interest to keep AMD alive, but with the smallest market share possible. Intel could crush AMD whenever they liked to. Their CPUs are cheaper to make, so they can actually sell them cheaper, and everybody knows they are faster. That is especially true every time they release a new batch on a lower fab process. When 32 nm parts are released, they could put the new processors at a price that AMD would never survive, but as I said, they will never do it, because it's better to have a weak enemy that you already know than to let a new player enter (also, most probably that new player would acquire AMD just in time).

The only reason there are no more relevant companies in the market is that there's always only place for two: the leader (which usually offers the best, but at a price) and the alternative to the leader, which is the cheaper alternative. If a third tries to enter a market, it has to be significantly better than the mentioned alternative while being cheap, or it will never take off. Why? Because most people want products from the leader, and if they can't afford them, they will always pick the cheap alternative that they already know; very few will take the cheap, slow, and NEW alternative. It's hard to make a new product, so very few times will you make a better product than the others, and because you are new, you will never get enough revenue to keep up with the other two.

AMD is the shield that Intel has against other companies that might want to enter the market, even somebody like IBM. IBM doesn't need to enter the consumer market, and it's not in their best interest to fight against Intel and AMD there. They would be third, even though they are IBM; but without AMD there would be a hole that IBM could very easily fill, and once they entered and obtained AMD's current market share, they could do a lot of things to compete - things that AMD can't do because it is so small.

And apart from that, there is the fact that Intel could face some monopoly issues if AMD didn't exist anymore and no other company took over. They could be forced to make x86 free for all, for example.


----------



## DarkMatter (Mar 22, 2009)

FordGT90Concept said:


> Oblivion's only strong suit is length of gameplay and the voice acting.  If you do everything there is to do in the game with official mods and the expansion pack, you can easily break 100 hours of gameplay.  The mechanics of the game weren't really notable (movement seemed a bit awkward, all maps were pretty dumbed down/repetitive, combat is pretty bland and repetitive, etc.).
> 
> The only game I'd say that had notably good physics is Freelancer (Havok engine).  When you get hit by those disorientation mines, holy $h!t.  I can't say any other game impressed me in regard to physics.
> 
> The only game that impressed me in regards to graphics was X3: Reunion.  It was just awesome getting close to a capital ship and seeing all the details on its surface.  They did a brilliant job there and yet, it still ran well on lowly hardware.  I am more impressed by them taking the time to really get it right (the models/textures) more so than the "eye-candy."



Physics in Oblivion were crappy, and they used Havok; that was my only point with that. Other games have amazing physics, and they use Havok. It's irrelevant which engine you use as long as you use it well. There were tons of crappy games using Unreal Engine 2 that even ran slowly and had bad graphics (Postal 2, anyone?), but that doesn't make UE2 a bad engine. On the contrary, it was amazing. PhysX is the same. It lacks support, and it's because of that that you don't see games using it. It has nothing to do with how good the engine is.

Crysis has good physics, and the games that use GPU-accelerated PhysX have very good physics too. If GPU Havok is well implemented, it will also offer good physics with a good amount of integration, but I am still skeptical of why Intel would let AMD make their CPUs look like crap at handling their own physics engine. IMO there's something shady there, or this GPU Havok is nothing more than a PR stunt. I vote for this last thing.


----------



## Mussels (Mar 22, 2009)

PhysX demos are not PhysX games.

Two PhysX items can collide and have merry fun with each other - but non-PhysX entities can't collide. Mirror's Edge as a loose example: you can shoot cloth and have holes appear in it, but you can't go walking on said cloth, or drop a gun on it and expect it to stay there on the realistic-*appearing* cloth.
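For readers wondering what the cloth behaviour being argued about actually involves under the hood: game cloth of this era was typically simulated as particles advanced by Verlet integration with distance constraints between neighbours. Below is a minimal, invented 1-D "rope" sketch of that idea - not PhysX's actual code; all names and numbers are illustrative, and real engines run the same scheme on 2-D particle grids with collision shapes added.

```python
GRAVITY = -9.8
DT = 1.0 / 60.0

class Particle:
    def __init__(self, x, y, pinned=False):
        self.x, self.y = x, y       # current position
        self.px, self.py = x, y     # previous position (stores velocity implicitly)
        self.pinned = pinned        # pinned particles anchor the cloth

def verlet_step(particles):
    """Advance every free particle: new = 2*cur - prev + accel*dt^2."""
    for p in particles:
        if p.pinned:
            continue
        nx = 2 * p.x - p.px
        ny = 2 * p.y - p.py + GRAVITY * DT * DT
        p.px, p.py = p.x, p.y
        p.x, p.y = nx, ny

def satisfy_constraints(particles, rest_len, iterations=8):
    """Pull neighbouring particles back toward their rest distance."""
    for _ in range(iterations):
        for a, b in zip(particles, particles[1:]):
            dx, dy = b.x - a.x, b.y - a.y
            dist = (dx * dx + dy * dy) ** 0.5 or 1e-9
            corr = (dist - rest_len) / dist * 0.5
            if not a.pinned:
                a.x += dx * corr; a.y += dy * corr
            if not b.pinned:
                b.x -= dx * corr; b.y -= dy * corr

# A 5-particle rope pinned at the left end, released under gravity.
rope = [Particle(i * 0.1, 0.0, pinned=(i == 0)) for i in range(5)]
for _ in range(120):                 # simulate two seconds
    verlet_step(rope)
    satisfy_constraints(rope, rest_len=0.1)

print(round(rope[-1].y, 2))          # free end has swung below the pin
```

Collision response (a character standing on the cloth, a bullet hole) is just extra constraints pushed into the same solver loop, which is why running it for thousands of particles per frame was a GPU/PPU-class workload at the time.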


----------



## DarkMatter (Mar 22, 2009)

Mussels said:


> PhysX demos are not PhysX games.
> 
> Two PhysX items can collide and have merry fun with each other - but non-PhysX entities can't collide. Mirror's Edge as a loose example: you can shoot cloth and have holes appear in it, but you can't go walking on said cloth, or drop a gun on it and expect it to stay there on the realistic-*appearing* cloth.



What the hell are you saying? If you make the character walk on a cloth item, the cloth reacts to your body as it would in real life. I have not tried throwing the gun, but if it doesn't react, then that's because it hasn't been declared as a physics item in the game. Under PhysX, pretty much every object is a physics object, in the sense that it has all the properties a real object would have.


----------



## Mussels (Mar 22, 2009)

DarkMatter said:


> What the hell are you saying? If you make the character walk on a cloth item, the cloth reacts to your body as it would in real life. I have not tried throwing the gun, but if it doesn't react, then that's because it hasn't been declared as a physics item in the game. Under PhysX, pretty much every object is a physics object, in the sense that it has all the properties a real object would have.



No, it's not. PhysX is CAPABLE of it, but they simply don't do it. I'll try and explain it more simply.

Path 1: Make the game use a generic physics engine, for people without CUDA (old NV cards, ATI) - PhysX does as little as possible in this example, so that they don't need to duplicate any coding (two physics engines for the same items) - that's when you have items that don't collide together.

Path 2: Make two engines coded for everything. When PhysX is enabled, everything moves over, and everything can interact with everything else.


If you were strapped for cash and time as a game developer with an unknown, brand-new concept for a game... which would you take?
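The "Path 1" trade-off described above can be sketched as a capability gate: one shared simulation for gameplay-critical objects, with effects-only physics simply skipped when no accelerator is present. This is an invented illustration - `detect_accelerator`, `PhysicsWorld`, and the object names are not real API - but it shows why effects-only objects never need to collide with anything in the software build.

```python
def detect_accelerator():
    # Stand-in for a real CUDA/PPU capability query.
    return False

class PhysicsWorld:
    def __init__(self, hardware):
        self.hardware = hardware
        self.objects = []

    def add(self, obj, effects_only=False):
        # Effects-only objects (cloth, debris, fluids) are skipped entirely
        # in software mode, so the two builds share one engine for every
        # object that actually affects gameplay.
        if effects_only and not self.hardware:
            return
        self.objects.append(obj)

world = PhysicsWorld(detect_accelerator())
world.add("crate")                            # always simulated
world.add("window_glass", effects_only=True)  # dropped without hardware
print(world.objects)                          # → ['crate']
```

The cost of this design is exactly the complaint in the thread: the hardware-only objects live in a parallel world and never push back on gameplay objects, because the software build must behave identically with them absent.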


----------



## btarunr (Mar 22, 2009)

ShadowFold said:


> Tl;dr
> Show me some games with full PhysX utilization and maybe I will think it's ok but for the time being, it's a dead engine.



Cryostasis: Sleep of Reason. Period.

DirectX 10.1 and hardware tessellation are dead for the time being.


----------



## Haytch (Mar 22, 2009)

ShadowFold said:


> Tl;dr
> Show me some games with full PhysX utilization and maybe I will think it's ok but for the time being, it's a dead engine.



I remember playing CellFactor: Revolution back in the day. Correct me if I'm wrong, but wasn't that all PhysX? I remember running a few tests, with alternating hardware and whatnot. Looked to me like that game was all about the Asus Ageia PhysX P1 card.

Might have to look that up when I get home.


----------



## ShadowFold (Mar 22, 2009)

CellFactor doesn't work on NVIDIA PhysX. I tried.. And I forgot Cryostasis, but it's really the ONLY game that uses it well.. I remember trying the demo; it was OK, but it ran pretty badly on a GTX 280, and I didn't really see anything cool besides the mercury - I mean, water.


----------



## DarkMatter (Mar 22, 2009)

Mussels said:


> No, it's not. PhysX is CAPABLE of it, but they simply don't do it. I'll try and explain it more simply.
> 
> Path 1: Make the game use a generic physics engine, for people without CUDA (old NV cards, ATI) - PhysX does as little as possible in this example, so that they don't need to duplicate any coding (two physics engines for the same items) - that's when you have items that don't collide together.
> 
> ...



In Mirror's Edge, when you walk on a cloth item it interacts with your character. That is when extensive PhysX is enabled; when not, I suppose it doesn't interact, because current CPUs don't have enough power to handle that kind of extensive use. If it were using another engine, like Havok, it wouldn't interact either; you need stream power to do so, be it a GPU or a PPU. So I'm lost and don't know what you are trying to say.

Anyway, in path 1 you could take PhysX just as well as you could take any other, namely Havok, and PhysX is free. If you are not going to make anything special, the free version of PhysX - the one without access to the source code - is just enough.


----------



## DaveK (Mar 22, 2009)

Woot go Havok! They're Irish  Might stop by their office someday.


----------



## Mussels (Mar 22, 2009)

DarkMatter said:


> In Mirror's Edge, when you walk on a cloth item it interacts with your character. That is when extensive PhysX is enabled; when not, I suppose it doesn't interact, because current CPUs don't have enough power to handle that kind of extensive use. If it were using another engine, like Havok, it wouldn't interact either; you need stream power to do so, be it a GPU or a PPU. So I'm lost and don't know what you are trying to say.
> 
> Anyway, in path 1 you could take PhysX just as well as you could take any other, namely Havok, and PhysX is free. If you are not going to make anything special, the free version of PhysX - the one without access to the source code - is just enough.



Alright, my info on the cloth stuff could be wrong.

If you want to test it, test how it works once a few holes are in the cloth - I have a suspicion they're using another physics engine for the collision detection on top of PhysX (walking on the cloth won't change between PhysX on and off, for example).


----------



## DarkMatter (Mar 22, 2009)

ShadowFold said:


> CellFactor doesn't work on NVIDIA PhysX. I tried.. And I forgot Cryostasis, but it's really the ONLY game that uses it well.. I remember trying the demo; it was OK, but it ran pretty badly on a GTX 280, and I didn't really see anything cool besides the mercury - I mean, water.



It doesn't matter that CellFactor doesn't work on NVIDIA PhysX; it's just an example of what a game with PhysX enabled can do. If the developers wanted, they could port it to NVIDIA PhysX.

About Cryostasis, I have seen it running on a 9800 GTX+ and it ran well, so I dunno why it ran badly on a GTX 280.

Anyway, as I have already said, you won't see any extensive implementation of either of the GPU physics engines until both GPU companies support them. And that's lame, and I will never never never understand why AMD went the Intel route instead of the PhysX route. I could have understood if they made their own engine, but they took Havok just after discarding PhysX because it was proprietary and not an open standard - well, what's Havok??  Also, the Havok engine will never be optimized to run on GPUs, only on x86, because even Intel's GPU will be x86.


----------



## DarkMatter (Mar 22, 2009)

Mussels said:


> Alright, my info on the cloth stuff could be wrong.
> 
> If you want to test it, test how it works once a few holes are in the cloth - I have a suspicion they're using another physics engine for the collision detection on top of PhysX (walking on the cloth won't change between PhysX on and off, for example).



Man it's all physics. Period.


----------



## Mussels (Mar 22, 2009)

DarkMatter said:


> Man it's all physics. Period.



It's a very different story if they're using multiple engines to cover PhysX flaws.


----------



## DarkMatter (Mar 22, 2009)

Mussels said:


> its a very different story if they're using multiple engines, to cover physx flaws.



They are not using two engines. Where have you seen that PhysX has no collision? That's BS; it HAS. It's in the features list and everything.



> Complex rigid body object physics system
> 
> The rigid body dynamics component enables you to simulate objects with a high degree of realism. It makes use of physics concepts such as reference frames, position, velocity, acceleration, momentum, forces, rotational motion, energy, friction, impulse, collisions, constraints, and so on in order to give you a construction kit with which you can build many types of mechanical devices.
> 
> ...



http://developer.nvidia.com/object/physx_features.html
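For context on what that quoted feature list means in practice, the core of any rigid-body component is an integrate-then-resolve loop: accumulate forces into velocity, velocity into position, then fix collisions with impulses. A deliberately minimal sketch follows - translation only, no rotation or friction, with all constants invented for illustration; it is not PhysX code.

```python
DT = 1.0 / 60.0     # fixed timestep
GRAVITY = -9.8
RESTITUTION = 0.5   # fraction of speed kept after a bounce

class Body:
    def __init__(self, y, vy=0.0, mass=1.0):
        self.y, self.vy, self.mass = y, vy, mass

def step(body):
    # Integrate: forces -> velocity -> position (semi-implicit Euler).
    body.vy += GRAVITY * DT
    body.y += body.vy * DT
    # Resolve collision with the ground plane y = 0: apply an impulse
    # that reverses and damps the incoming velocity.
    if body.y < 0.0 and body.vy < 0.0:
        body.y = 0.0
        body.vy = -body.vy * RESTITUTION

ball = Body(y=1.0)          # drop a ball from one metre
for _ in range(600):        # ten simulated seconds
    step(ball)
print(round(ball.y, 3))     # the ball has settled near the ground
```

Everything else on the feature list (constraints, joints, stacks of boxes) is layered on this same loop; the GPU argument in the thread is simply about how many such bodies and contacts can be resolved per frame.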


----------



## Mussels (Mar 22, 2009)

DarkMatter said:


> They are not using two engines. Where have you seen that PhysX has no collision? That's BS, it HAS. It's in the features list and everything.
> 
> 
> 
> http://developer.nvidia.com/object/physx_features.html




You're ignoring what I'm saying.

I'm saying PhysX items can't collide with things outside the PhysX engine - which means they either code two engines (one for NV PhysX on and one for off), or they do a hybrid between the two (to lower requirements / make it workable in software).

They aren't porting 100% of the game world to PhysX, or the damn thing wouldn't run without hardware acceleration!


----------



## DarkMatter (Mar 22, 2009)

Mussels said:


> You're ignoring what I'm saying.
> 
> I'm saying PhysX items can't collide with things outside the PhysX engine - which means they either code two engines (one for NV PhysX on and one for off), or they do a hybrid between the two (to lower requirements / make it workable in software).
> 
> They aren't porting 100% of the game world to PhysX, or the damn thing wouldn't run without hardware acceleration!



And you are not paying attention to anything that I have said in the thread. PhysX DOESN'T REQUIRE hardware at all. It can run in software mode...


----------



## Mussels (Mar 22, 2009)

DarkMatter said:


> And you are not paying attention to anything that I have said in the thread. PhysX DOESN'T REQUIRE hardware at all. It can run in software mode...



I've done that. It runs like crap. Even an OC'd Q6600 can't handle it without a massive FPS decrease, so it's useless on most systems.


----------



## btarunr (Mar 22, 2009)

Mussels said:


> I've done that. It runs like crap. Even an OC'd Q6600 can't handle it without a massive FPS decrease, so it's useless on most systems.



My Phenom 9750 seemed pretty comfortable with the PhysX screensaver at 25% particle density.


----------



## DarkMatter (Mar 22, 2009)

Mussels said:


> I've done that. It runs like crap. Even an OC'd Q6600 can't handle it without a massive FPS decrease, so it's useless on most systems.



It doesn't run like crap. At all. If you enable the PhysX checkbox in Mirror's Edge and disable PhysX in the CP, of course it lags, but that content is meant for GPU PhysX. CPUs can't handle that kind of massive physics.


----------



## TheMailMan78 (Mar 22, 2009)

btarunr said:


> My Phenom 9750 seemed pretty comfortable with PhysX Screensaver with 25% particle density.



Link to this said screensaver?


----------



## Mussels (Mar 22, 2009)

DarkMatter said:


> It doesn't run like crap. At all. If you enable the PhysX checkbox in Mirror's Edge and disable PhysX in the CP, of course it lags, but that content is meant for GPU PhysX. CPUs can't handle that kind of massive physics.



So... you're saying it's OK because it runs on a CPU, and then your next post says it doesn't work because CPUs can't do it? That's... what I said.


How about we just agree to disagree? We're rehashing at this point.


----------



## DarkMatter (Mar 22, 2009)

Mussels said:


> So... you're saying it's OK because it runs on a CPU, and then your next post says it doesn't work because CPUs can't do it? That's... what I said.
> 
> 
> How about we just agree to disagree? We're rehashing at this point.



Let's see. You don't have to code or have two different engines. The CPU can't handle the massive physics that a GPU can, so when running in software mode those objects (i.e. cloth) are declared as non-PhysX-enabled objects and that's all. In Mirror's Edge, cloth items are replaced with animated flags, and some others just disappear from the game.

BTW, I have just tried disabling PhysX in the CP and running ME with PhysX enabled, and it runs perfectly on my quad, maybe slightly lower FPS but that's all.
EDIT: It does lag badly when glass is shot down.


----------



## btarunr (Mar 22, 2009)

TheMailMan78 said:


> Link to this said screensaver?



Part of the NVIDIA Powerpack. Find it on NVIDIA's website.


----------



## Lazzer408 (Mar 22, 2009)

Here we go again. They've been talking Havok since the 1xxx series. I bet Duke Nukem Forever supports it.


----------



## DarkMatter (Mar 22, 2009)

Lazzer408 said:


> Here we go again. They've been talking Havok since the 1xxx series. I bet Duke Nukem Forever supports it.



PhysX has more chances of being DNF's physics API. It started out using Meqon physics, which was acquired by Ageia, so...

@Mussels

Maybe you'll understand the thing better this way:

The only difference between hardware-enabled PhysX and software mode is in the number of enabled objects. It can be directly compared to how different graphics settings work.
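To illustrate the point being argued here (a conceptual sketch only, with hypothetical names - this is not the real PhysX SDK API): if software mode simply reduces which effects are physics-enabled, the fallback logic would look roughly like a graphics-settings toggle.

```python
# Conceptual sketch of effect-level physics fallback (hypothetical names, not
# the actual PhysX SDK): hardware mode enables expensive effects, software
# mode demotes them to cheap stand-ins, much like lowering a graphics setting.
from dataclasses import dataclass

@dataclass
class Effect:
    name: str
    cost: int        # relative simulation cost
    fallback: str    # what the effect becomes without GPU acceleration

EFFECTS = [
    Effect("rigid_bodies", cost=1, fallback="rigid_bodies"),  # always simulated
    Effect("cloth", cost=50, fallback="animated_flag"),       # canned animation
    Effect("glass_shards", cost=80, fallback="despawn"),      # simply removed
]

def enabled_effects(gpu_accelerated: bool, cpu_budget: int = 10):
    """Return (effect, representation) pairs for the current mode."""
    out = []
    for e in EFFECTS:
        if gpu_accelerated or e.cost <= cpu_budget:
            out.append((e.name, e.name))      # fully simulated
        else:
            out.append((e.name, e.fallback))  # demoted stand-in
    return out
```

In "hardware" mode everything simulates; in "software" mode cloth becomes an animated flag and shattering glass just despawns, which matches the Mirror's Edge behavior described in this thread.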


----------



## Lazzer408 (Mar 22, 2009)

It was intended as a joke.


----------



## DarkMatter (Mar 22, 2009)

Lazzer408 said:


> It was intended as a joke.



I know.  Sorry.


----------



## kumquatsrus (Mar 22, 2009)

ShadowFold said:


> Cell Factor doesn't work on nvidia PhysX. I tried.. And I forgot Cryostasis, but it's really the ONLY game that uses it well.. I remember trying the demo and it was ok, but ran pretty bad on a gtx 280 and didn't really see anything cool besides the mercury, I mean water.



Yes, CellFactor doesn't work on NVIDIA PhysX. However, CellFactor: Revolution does indeed work on NVIDIA PhysX. I still play it from time to time on my 8800GT SLI setup. I tried running the original CellFactor before, and an error message pops up stating that a PPU is not detected. So download the free game (CellFactor: Revolution, not CellFactor) and check it out for yourself. It should work.


----------



## ShadowFold (Mar 22, 2009)

Well I don't have my 280(and don't plan on going back..) anymore, but I'm pretty sure I had revolution.


----------



## Hayder_Master (Mar 22, 2009)

Yeeehaaa, at last physics from ATI, and my gift to NVIDIA is a song from Akon (Smack That).
I was thinking about swapping my 4870 for a 4870X2 and picking up an NVIDIA card for physics, but now it's time to CrossFire my 4870.


----------



## Hayder_Master (Mar 22, 2009)

Mussels said:


> Yay. another reason to justify me getting my 4870.



You are a damn lucky man, Mussels. Why didn't you pick the 4870 from the beginning?


----------



## Mussels (Mar 22, 2009)

hayder.master said:


> You are a damn lucky man, Mussels. Why didn't you pick the 4870 from the beginning?



I didn't pick the 4870 because it was too expensive. I upgraded once the prices dropped, and my only real concern was the lack of PhysX.

Now that ATI is accelerating Havok (which most of the games I play already use), I'm even more convinced I made the right choice.


----------



## TheMailMan78 (Mar 22, 2009)

ShadowFold said:


> Well I don't have my 280(and don't plan on going back..) anymore, but I'm pretty sure I had revolution.



Welcome to the darkside young one. My shares welcome your adoption of superior hardware.


----------



## DeathTyrant (Mar 22, 2009)

Mussels said:


> Now that ATI is accelerating Havok (which most of the games I play already use), I'm even more convinced I made the right choice.


I have to agree with you. While there are some very cool examples of PhysX in Cryostasis, Mirror's Edge, and some tech demos, I play more games with Havok. A lot more.


----------



## btarunr (Mar 22, 2009)

Sure, you may have more games that run Havok, but the variant you have is the one that makes do with the CPU. I, with my GTX 260, will be able to play those games, plus PhysX titles. The point I am trying to make is that Havok in its GPU-accelerated avatar is in essence a newbie; we are starting on a fresh slate. Havok in its present form is generations behind PhysX, and everyone, including GeForce users, has access to it. By the time games come out with an amount of Havok physics effects that could really do with GPU acceleration, you will have many more games based on PhysX. So nobody really made an investment by buying ATI hardware in this particular case.


----------



## DarkMatter (Mar 22, 2009)

btarunr said:


> Sure, you may have more games that run Havok, but the variant you have is the one that makes do with the CPU. I, with my GTX 260, will be able to play those games, plus PhysX titles. The point I am trying to make is that Havok in its GPU-accelerated avatar is in essence a newbie; we are starting on a fresh slate. Havok in its present form is generations behind PhysX, and everyone, including GeForce users, has access to it. By the time games come out with an amount of Havok physics effects that could really do with GPU acceleration, you will have many more games based on PhysX. So nobody really made an investment by buying ATI hardware in this particular case.



Exactly. Nvidia had a hard time trying to convince developers to use an engine that only runs on their hardware, and they've been outselling Ati 2 to 1 for almost 2 years, meaning there are probably twice as many Nvidia cards out there as Ati's. Good luck convincing developers to adopt a thing that only runs on 35% of the hardware out there, AMD.


----------



## FryingWeesel (Mar 23, 2009)

newtekie1 said:


> I really don't care what standard is used, as long as *one* is picked as the standard.  Right now, this going back and forth, trying to split the industry between PhysX and Havok is only leading to developers not wanting to use either.
> 
> The industry needs to pick a single physics standard that runs on all hardware, and move on with that.  That is the only way we will see developers start to truly pick up detailed physics in games.



You're right but wrong. What needs to happen is that all PhysX-style work gets done via OpenCL, removing the hardware lock-in to NVIDIA/ATI/etc. and leaving it possible for ANYBODY to support PhysX/Havok/etc. physics engines with hardware acceleration.

The problem we have today is that the two top GPU companies are acting childish and refusing to just PLAY NICE and support everything.

All they are doing is hurting the customer in this case. It seems like Intel/NVIDIA (both acting very childish) are constantly at war, as now are Intel/AMD (Intel being childish), and of course ATI/NVIDIA, both acting childish by not supporting each other's standards/capabilities.

Large/huge companies acting like children = we all lose...
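The vendor-neutral layering being asked for here can be sketched abstractly (all class and backend names below are hypothetical, for illustration only): the physics engine codes against one compute interface, and each vendor plugs its own backend in underneath.

```python
# Sketch of a vendor-neutral compute layer (hypothetical names): the physics
# engine talks to one interface; CUDA/Stream/CPU backends plug in underneath.

class ComputeBackend:
    name = "cpu"
    def run(self, kernel, data):
        # Reference implementation: run the kernel element-wise on the CPU.
        return [kernel(x) for x in data]

class CudaBackend(ComputeBackend):
    name = "cuda"      # stand-in for an NVIDIA-only accelerated path

class StreamBackend(ComputeBackend):
    name = "stream"    # stand-in for an ATI/AMD-only accelerated path

def pick_backend(available):
    """Prefer any GPU backend, fall back to CPU; the engine never cares which."""
    for b in available:
        if b.name != "cpu":
            return b
    return ComputeBackend()

# The physics engine only ever sees the interface:
def integrate(backend, positions, velocities, dt):
    return backend.run(lambda pv: pv[0] + pv[1] * dt,
                       list(zip(positions, velocities)))
```

The point of the design is that `integrate` never names a vendor; swap the backend and the same engine code runs anywhere, which is what an OpenCL-style neutral layer would buy.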


----------



## Mussels (Mar 23, 2009)

I wouldn't be surprised if Havok ties into DX11's GPGPU standards, and that's how ATI is supporting it.

It's possible because MS and Intel (owner of Havok) are BFFs, and read the first post clearly - it is "assumed" this will be done via ATI Stream... there's no evidence either way that this can't be tied into DX11 (we'll need to see what OS they run on when they do this demo)


----------



## FryingWeesel (Mar 23, 2009)

DarkMatter said:


> Exactly. Nvidia had a hard time trying to convince developers to use an engine that only runs on their hardware, and they've been outselling Ati 2 to 1 for almost 2 years, meaning there are probably twice as many Nvidia cards out there as Ati's. Good luck convincing developers to adopt a thing that only runs on 35% of the hardware out there, AMD.



Actually PhysX adoption has jumped drastically since NVIDIA bought it and made it open (anybody can use or support it).

EA, for example, has distributed it to most if not all of their dev houses (I hate EA, but they are without doubt one of the largest game publishers out there).

Partial list of PhysX games from Wikipedia:



> Games
> 
> The following games feature PhysX support (list may be incomplete):[14]
> 2 Days to Vegas
> ...



http://en.wikipedia.org/wiki/PhysX


In my experience PhysX and Havok both have their own advantages. Havok has better ragdoll effects, whereas PhysX has FAR FAR FAR better vehicles. I am not alone in this opinion; games like Mass Effect show how much better PhysX is with vehicles than Havok.

Havok vehicles are... well, they feel like toys is what I think when I play Havok games that have them.

Neither is better if you are talking about being well-rounded; they both have their own pluses and minuses.

I'm betting that NVIDIA ports their PhysX driver to support OpenCL in DX11 (and most likely there will be an OpenCL update/install for all Windows versions).

Stop this "Havok is better" and "PhysX is better" and such; they are both good engines. In fact, if it wasn't for Intel, hardware-accelerated physics (Havok/PhysX/etc.) would already have been here back in the X1900 days. Intel paid good money to ensure that game developers and even Havok themselves didn't push to get GPGPU support built in. Intel feels everything's going to be on the CPU and that GPUs are dying/a dead end; they have said this (mostly because they don't have their own GPUs, just GMA, which is still based off the i740, a chipset from when AGP first came out).

Blah, damn selfish companies :/


----------



## grunt_408 (Mar 23, 2009)

I guess when the green team and the red team were growing up, nobody taught them how to share...!!! Shame, shame. I am staying with ATI; just look at what they have done recently.


----------



## FordGT90Concept (Mar 23, 2009)

FryingWeesel said:


> Havok vehicles are... well, they feel like toys is what I think when I play Havok games that have them.


Freelancer probably had the best collision physics. Hitting asteroids, space stations, or getting blown up by a mine always acted quite naturally, assuming the impact doesn't kill you.

If you're talking about driving up hills and jumping off, Test Drive Off-Road 3 and Hard Truck: Apocalypse are IMO better. Especially HTA. Mass Effect really didn't have anything special/unique in terms of vehicles and/or physics from my perspective.


----------



## Hayder_Master (Mar 23, 2009)

Mussels said:


> I wouldn't be surprised if Havok ties into DX11's GPGPU standards, and that's how ATI is supporting it.
> 
> It's possible because MS and Intel (owner of Havok) are BFFs, and read the first post clearly - it is "assumed" this will be done via ATI Stream... there's no evidence either way that this can't be tied into DX11 (we'll need to see what OS they run on when they do this demo)



Sure it will be, because as you say, "MS and Intel (owner of Havok)" are close. So ATI has great support,
but there is something that confuses me: how will Intel and AMD work together?

----------



## btarunr (Mar 23, 2009)

Mussels said:


> I wouldn't be surprised if Havok ties into DX11's GPGPU standards, and that's how ATI is supporting it.
> 
> It's possible because MS and Intel (owner of Havok) are BFFs, and read the first post clearly - it is "assumed" this will be done via ATI Stream... there's no evidence either way that this can't be tied into DX11 (we'll need to see what OS they run on when they do this demo)



Well, if it's tied to DX11, Vista and XP users are immediately cut off from the technology. Vista gets DX11 only later. Stream, on the other hand, is available on XP, Vista, 7, and even POSIX-like OSes. That would be a foolish way to start a technology standard.


----------



## Mussels (Mar 23, 2009)

btarunr said:


> Well, if it's tied to DX11, Vista and XP users are immediately cut off from the technology. Vista gets DX11 only later. Stream, on the other hand, is available on XP, Vista, 7, and even POSIX-like OSes. That would be a foolish way to start a technology standard.



Where I'm going with this is related to that.

Under DX11, anyone can use it via GPGPU.

Under DX9/10, ATI translates it to work via Stream. This would give ATI a 1-2 year head start on NV, while choosing the standard more likely to stay in the market long term (since ATI won't do PhysX, which is certainly hampering its pickup).


----------



## btarunr (Mar 23, 2009)

You said "tied to", which made me think you meant "exclusive to". Anyway, that would mean increased amount of developer overhead, for AMD that is (to develop Stream and DX11 variants). They would rather minimize that by coding it for Stream. NVIDIA's CUDA model is almost abstract to the OS, it runs on three generations of Windows OS (XP/2003, Vista/2008, and Win 7). So it isn't really a head start of sorts. It's not like the GPGPU standard DX11 brings will force everyone to code for it. CUDA will stay, and so will PhysX.


----------



## Mussels (Mar 23, 2009)

btarunr said:


> You said "tied to", which made me think you meant "exclusive to". Anyway, that would mean increased amount of developer overhead, for AMD that is (to develop Stream and DX11 variants). They would rather minimize that by coding it for Stream. NVIDIA's CUDA model is almost abstract to the OS, it runs on three generations of Windows OS (XP/2003, Vista/2008, and Win 7). So it isn't really a head start of sorts. It's not like the GPGPU standard DX11 brings will force everyone to code for it. CUDA will stay, and so will PhysX.


It is only a theory. I make no claims to it being fact.

Why can't Intel code it to work on DX11 GPGPU generically, while ATI works on just Stream for now?


----------



## btarunr (Mar 23, 2009)

Mussels said:


> Why can't Intel code it to work on DX11 GPGPU generically, while ATI works on just Stream for now?



That is a possibility, though I personally don't think Intel will work on anything GPGPU till it has a GPU of its own. It is the only entity that is seriously threatened by GPGPU, and what implications it has on the future of expensive x86 chips. Sure, it's allowing AMD to do the work of accelerating Havok via GPU, but that's for propagating high-performance Havok itself. It's in essence a "controlled entropy".


----------



## FryingWeesel (Mar 23, 2009)

FordGT90Concept said:


> Freelancer probably had the best collision physics.  Hitting asteroids, space stations, or getting blown up by a mine always acted quite natural assuming the impact doesn't kill you.
> 
> If you're talking about driving up hills and jumping off, Test Drive Off-Road 3 and Hard Truck: Apocalypse are IMO better.  Especially HTA.  Mass Effect really didn't have anything special/unique in terms of vehicles and/or physics from my perspective.



Just as an example, ME just feels more "real" when driving around than any Havok game I have played. I love Havok ragdoll, especially when it's exaggerated like in Requiem (body parts flying through the air and shit).

And btarunr, MS has confirmed that DX11 isn't tied to Win7 the way DX10 was tied to Vista, so I don't see the validity of your point. MS will probably put out 11 for Vista at the same time, or VERY close to the same time, it becomes available for Win7.

And MS is using OpenCL (Google it) for their GPGPU work. Neither ATI nor NVIDIA is any farther along in GPGPU than the other ATM; the difference is that ATI got GPGPU out first, but it didn't go anywhere with the 1900 and 2900 cards other than Folding, mostly because Intel is threatened by GPUs doing work that was done by CPUs.

Now even the open-source community is getting involved with GPGPU: MediaCoder, VLC, and many other projects are working on GPU-accelerated work, and they would rather be able to pipe it in via OpenCL because it avoids using CUDA or Stream directly, and thus isn't locked into one company's cards or the other's.

Havok use has dropped off a lot since Intel bought Havok... a lot of companies have been using other solutions or creating their own.


----------



## btarunr (Mar 23, 2009)

FryingWeesel said:


> And btarunr, MS has confirmed that DX11 isn't tied to Win7 the way DX10 was tied to Vista, so I don't see the validity of your point. MS will probably put out 11 for Vista at the same time, or VERY close to the same time, it becomes available for Win7.



That's because you clearly didn't read things completely. I said:



btarunr said:


> Well, if it's tied to DX11, Vista and XP users are immediately cut off from the technology. *Vista gets DX11 only later*. Stream, on the other hand, is available on XP, Vista, 7, and even POSIX-like OSes. That would be a foolish way to start a technology standard.



I am aware of the fact that DX11 will be made available to Vista. I am also aware of the possibility that it won't be made available as soon as Windows 7 is released.


----------



## FordGT90Concept (Mar 23, 2009)

Ragdoll always annoyed me because it is only accurate if the individual is unconscious and has no broken bones. It looks silly/comical 99% of the time. Seriously, if you're flying through the air, do you think you're just going to let wind/gravity have their way with you? Of course not. You'll be twisting and turning, preparing for the landing. Instinctively, that means getting your feet closest to the ground. The only way someone wouldn't do that is if they were knocked out cold, in which case I wouldn't expect them to get up right after they land either.


As I stated before, physics in games are only as real as the game developers want them to be. In most cases, physics (beyond the basics) is one of the lowest-priority tasks (if not the lowest-priority task) of game development. They can cut all kinds of corners there that increase performance quite dramatically, and the player won't even notice unless they are looking for it. That's essentially why the PPU failed: it just doesn't fit the game development paradigm (think cost vs. benefit).


For example, Nightfire was far more entertaining to me than Quantum of Solace because Nightfire had very arcade-ish physics (very rapid-paced) and player damage schemes (takes 4 shots to the head to kill with the PP9). If you take that arcade feeling out of it, the game becomes boring.


----------



## DarkMatter (Mar 23, 2009)

Also, DX11 or OpenCL doesn't change anything. Nvidia can and will adapt PhysX to DX11/OpenCL when Win 7 launches too. Remember that it took them 3 months to adapt PhysX to CUDA. It won't take much more to adapt it to any other API.

And maybe I'm wrong, but when that happens, it's going to be very interesting, because Ati cards will be able to run PhysX, with AMD's permission or without it. The only requirement for PhysX will then be a card capable of running DX11 compute shaders, and AMD cards will qualify. It will matter very little whether AMD wants PhysX on their cards or not; unless they do something shady, they will not be able to prevent it.

The one that will win this game is the one that gets better support now, and IMHO that's PhysX at the moment.
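One reason a port between compute APIs could be quick: GPU physics workloads are mostly embarrassingly parallel per-particle updates, so the same kernel body maps onto CUDA, OpenCL, or a DX11 compute shader almost unchanged. A minimal illustration of such a kernel body, written here as plain Python (this is a generic explicit-Euler sketch, not code from any actual engine):

```python
# One explicit-Euler particle step: this per-particle body is what a CUDA,
# OpenCL, or DX11 compute-shader kernel would each run once per thread.
GRAVITY = -9.8  # m/s^2, acting on the y axis

def step_particle(pos, vel, dt):
    """Update a single particle; no particle depends on any other."""
    vx, vy = vel
    vy += GRAVITY * dt                 # apply gravity
    x, y = pos[0] + vx * dt, pos[1] + vy * dt
    if y < 0.0:                        # bounce off the ground plane
        y, vy = -y, -vy * 0.5          # with some energy loss
    return (x, y), (vx, vy)

def step_all(particles, dt):
    # On a GPU this loop disappears: each iteration becomes one thread.
    return [step_particle(p, v, dt) for p, v in particles]
```

Because each update is independent, the API used to launch the threads is almost an implementation detail, which is the point being made about re-targeting PhysX.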


----------



## jaydeejohn (Mar 23, 2009)

Skipped through a lot of this. We know that ATI's current drivers cover both W7 and Vista, and DX11 implementation on both OSes should go rather quickly. A lot of this actually depends on M$ for the future of physics on the GPU. I'll also point out that Larrabee won't work on Havok the same way NVIDIA and ATI will.
IMO, it all comes down to accessibility, ease of use, etc., and any proprietary app will fail long term.


----------



## wolf (Mar 23, 2009)

.................

*slowly backs away from thread*


----------



## TheMailMan78 (Mar 23, 2009)

Ok WHEN will I see this supported on my setup is all I want to know.


----------



## ShRoOmAlIsTiC (Mar 23, 2009)

Well, I read this entire thread hoping to get some insight on what was happening with physics and ATI. Now all I want to do is pull my hair out. Luckily I have a shaved head. We won't know anything till the show this week. We will probably see a video clip or something of the demonstration and get some kind of insight. Hopefully it's good news for everyone. I hope ATI falls back on what they were saying back when they had CrossFire + physics on the 790FX boards. I have my old 1950GT on hand hoping it will work.


----------



## ShadowFold (Mar 23, 2009)

It's gotta be Stream-compatible, I'm guessing, so no X1000s; HD 2000 and up. I hope the video explodes our minds!


----------



## DarkMatter (Mar 23, 2009)

ShadowFold said:


> I hope the video explodes our minds!



Oh, that's for sure. Ati is much better at that. Ati always has some good-looking demos, while Nvidia always shows minimalist tech demos where you can only see the tech they want to show. Like with the ray-tracing demos some time ago: who cares if the ugly and minimalist "Veyron demo" from Nvidia was showing superior technology, when you could see Ruby escaping from the mech while avoiding the cars, all in full fancy colors??


----------



## TheMailMan78 (Mar 23, 2009)

DarkMatter said:


> Oh, that's for sure. Ati is much better at that. Ati always has some good-looking demos, while Nvidia always shows minimalist tech demos where you can only see the tech they want to show. Like with the ray-tracing demos some time ago: who cares if the ugly and minimalist "Veyron demo" from Nvidia was showing superior technology, when you could see Ruby escaping from the mech while avoiding the cars, all in full fancy colors??



Stop H8tin son!


----------



## ShadowFold (Mar 23, 2009)

But seriously, I don't really care for tech demos. When it involves physics, though, I certainly want to see what's up. I love the idea of realistically rendered mass physics with small performance hits!

Also, it irks me that they spelled it hater*ade*, it's AIDE....


----------



## DarkMatter (Mar 23, 2009)

TheMailMan78 said:


> Stop H8tin son!



That is true to a point, but it does irritate me that they released a demo about ray-tracing where you couldn't actually see ray-tracing, even if it was there. But hey, it looked good.
Then we also have the Froblin demo... Actually most of their demos are pure marketing, but hey, you like them, and they serve their purpose, so I guess I should accept that. I know you don't like to hear this, but Ati is all about marketing lately and most of you buy everything that comes from them. I used to love them in the past, when Ati was about the products and the tech and Nvidia was about marketing (FX line, even GF 6 series), but lately...


----------



## TheMailMan78 (Mar 23, 2009)

DarkMatter said:


> That is true to a point, but it does irritate me that they released a demo about ray-tracing where you couldn't actually see ray-tracing, even if it was there. But hey, it looked good.
> Then we also have the Froblin demo... Actually most of their demos are pure marketing, but hey, you like them, and they serve their purpose, so I guess I should accept that. I know you don't like to hear this, but Ati is all about marketing lately and most of you buy everything that comes from them. I used to love them in the past, when Ati was about the products and the tech and Nvidia was about marketing (FX line, even GF 6 series), but lately...



Good marketing = money. Money for ATI = money for me. Anyway, ATI's products haven't been this strong in years. This time the marketing is truthful. I mean, even Wizz has been rating them high for price/performance, and right now people are "cost effective" minded.


----------



## ShadowFold (Mar 23, 2009)

Think about it this way. It's Intel; when have they failed in recent years? (Just forget Pentium 4 ever happened, lol.) They've always had the best-performing CPUs on the market and have had almost no competition in the high-end range until the Phenom IIs came out. I think with AMD AND Intel collaborating on this, they can't fail. Now, I don't think this will become a standard since, like you said, a lot of people have Nvidia cards. They wouldn't leave them behind; they will most likely get it to work on CUDA as well.


----------



## DarkMatter (Mar 23, 2009)

TheMailMan78 said:


> Good marketing = money. Money for ATI = money for me. Anyway ATIs products havent been this strong in years. This time the marketing is truthful. I mean even Wizz has been rating them high for the price/performance and right now people are "cost effective" minded.



I understand what marketing is, and I understand why they have to do it, but that doesn't change the fact that we don't have to accept it as dogma. Marketing is never truthful, and of course they have good products, but in no way are they better in every aspect, as Ati and fans want to make the world believe. I can understand they have to do something because of their actual position, but I will never accept false claims in order to help sell the underdog. This is capitalism and they are where they are for a reason; maybe they should have disappeared in the last couple of years and let another one enter the game, who knows (I don't want that, as you can see in previous posts).

And now this thread: even before we've seen anything, GPU Havok is so much better than PhysX, which suddenly is crap and flawed, and Havok has a much better future. Every Ati fan said how wonderful Ati was because they said they wouldn't support a proprietary physics SDK, and bashed Nvidia for using one and trying to bring it to games. Now Ati supports a proprietary physics API and everything is so bright and wonderful...

Am I really the only one who sees how they try to play with us?? First they say that a proprietary API will never catch on and that it will die, then they do everything in their hands to bash that API, and finally, *one year later*, they make their own. If 2008 was a bad year to adopt a proprietary engine because DX11 was coming out a year later, what happens in the same year of release of that said DX11? Some months before, actually?


----------



## DarkMatter (Mar 23, 2009)

ShadowFold said:


> Think about it this way. It's Intel, when have they failed in recent years?(Just forget Pentium 4 ever happened lol) They've always had the best performing CPU's on the market and have had almost no competition in the high end range until the Phenom II's came out. I think with AMD AND Intel collaborating on this, they can't fail. Now I don't think this will become like a standard since like you said, a lot of people have nvidia cards. They wouldn't leave them behind, they will most likely get it to work on CUDA as well.



That's indeed part of the problem I see. IMO Intel is playing with AMD. Havok will run much better on Larrabee than on anything else, Intel will ensure that, and with the huge architecture differences (SPs versus x86 cores) they have the perfect excuse. Even if AMD released something like Larrabee, Intel knows they are much faster with x86; it's graphics they lack.

In the near future only PhysX running on GPUs could be faster (IMO it will be), so they want to rule it out, just as they did with Ageia in the past. Intel wants to be the biggest high-end GPU provider too and knows it can't compete with Nvidia and Ati in graphics, so they are trying to do something. That something was ray-tracing first, but both Nvidia and Ati demonstrated they could do that just as well or better, and developers aren't really interested yet, so they have to find something else. IMO that's physics, but to an extent, controlling it. Accelerated physics is going to be the next revolution in games; pretty soon physics will affect performance as much as graphics, and Intel wants to control that part. They want to play on their own field, because PhysX is much better than anything they can do ATM and IMO it always will be. NOT because of the API, which is similar, but because of where it runs: on the ever-evolving GPUs. It's like a racer with the slower car but the better acceleration; his only chance is to overtake the other before the other gets traction. The *bad* part for us is that the slow car will always slow down the fast one thereafter.


----------



## Steevo (Mar 23, 2009)

The X1K Toy Shop demo: first use of current production game coding and engines, highly optimized. Why can't games run like this now? Water droplet physics and much else.


----------



## TheMailMan78 (Mar 23, 2009)

DarkMatter said:


> I understand what marketing is, and I understand why they have to do it, but that doesn't change the fact that we don't have to acept it as dogma. Marketing is never truthful, and of course they have good products, but in no way they are better in every aspect as Ati and fans want to make the world believe. I can understand they have to do something because of their actual position, but I will never accept false claims in order to help selling the underdog. This is capitalism and they are where they are for a reason, maybe they should have dissapeared in the last couple of years, and let another one enter the game, who knows (I don't want that as you can see in previous posts).
> 
> And now this thread, even before we've seen anything, GPU Havok is so much better than PhysX, that suddenly is crap and flawed, and has a much better future. Every Ati fan said how wonderful was Ati because they said they wouldn't support a propietary physics SDK and bashed Nvidia for using one and trying to bring it to games. Now Ati supports a propietary physics API and everyting is so bright and wonderful...
> 
> I am really the only one who sees how they try to play with us?? First they say that a propietary API will never catch on and that it will die, then they make everything in their hands to bash that API and finally *one year later* they make their own one. If 2008 was a bad year to adopt a propietary engine because DX11 was coming out a year later, what happens in the same year of release of that said DX11? Some moths before actually?



People are happy because Havok is more widely used in "A-list" games. They also use it in movies. I'm sorry, but Havok isn't going anywhere. Here's a list of Havok games. Say what you like about which is better, but you have to admit that if NVIDIA had had the choice, it would have been Havok over PhysX.

24: The Game  
PlayStation2

After Burner: Black Falcon  
PSP

Alone in the Dark  
Xbox 360, PLAYSTATION3, Wii, PlayStation2, PC

America's Army: True Soldiers  
Xbox 360

Ant Bully  
PlayStation2, Wii, GameCube, PC

Assassin's Creed  
Xbox 360, PLAYSTATION3, PC

Astro Boy  
PlayStation2

Auto Assault  
PC

Battlefield: Bad Company  
Xbox 360, PLAYSTATION3

BioShock  
Xbox 360, PC

Blacksite : Area 51  
Xbox 360, PLAYSTATION3, PC

Boom Blox  
Wii

Bottle Buster  
PC

Carnival Games  
Wii

Cars  
PSP

Company of Heroes  
PC

Company of Heroes: Opposing Fronts  
PC

Conan  
Xbox 360, PLAYSTATION3

Condemned 2: Bloodshot  
Xbox 360, PLAYSTATION3

Crackdown  
Xbox 360

Dark Messiah of Might and Magic: Elements  
Xbox 360, PC

Dawn of Mana  
Playstation 2

de Blob  
Wii

Dead Rising  
Xbox 360

Def Jam: Icon  
Xbox 360, PLAYSTATION3 

Desperados 2: Cooper's Revenge  
PC

Destroy All Humans! 2  
PlayStation2, Xbox, PC

Destroy All Humans: Big Willie Unleashed  
Wii

Dungeons & Dragons Online: Stormreach  
PC

F.E.A.R. Files  
Xbox360, PLAYSTATION3, PC

Fable 2  
Xbox 360

Fallout 3  
Xbox 360, PLAYSTATION3, PC

Folklore  
PLAYSTATION3

Full Spectrum Warrior  
PlayStation2, Xbox, PC

Full Spectrum Warrior: Ten Hammers  
PC

George of the Jungle: Search for the Secret  
Wii, PlayStation2

Go! Sports Ski  
PLAYSTATION3

Guitar Hero III  
X360, PLAYSTATION3, PC

Half-Life 2: The Orange Box  
Xbox 360, PLAYSTATION3, PC

Halo 2  
Xbox, PC

Halo 3  
Xbox 360

Happy Feet  
PlayStation2, Wii, GameCube, PC

Harry Potter and the Order of The Phoenix  
Xbox 360, PLAYSTATION3, PC, Wii

Heavenly Sword  
PLAYSTATION3

Hellgate: London  
PC

IHRA Drag Racing: Sportsman Edition  
PlayStation2, Xbox, PC

Iron Man  
Xbox 360, PLAYSTATION3, Playstation2, Wii, PC

Just Cause  
PlayStation2, Xbox 360, Xbox, PC

Kane & Lynch: Dead Men  
Xbox 360, PLAYSTATION3, PC

Killzone: Liberation  
PSP

Looney Tunes: Acme Arsenal  
Xbox 360

Lost Planet: Extreme Condition  
Xbox 360, PLAYSTATION3, PC

LOTR Online: Shadows of Angmar  
PC

Medal of Honor Heroes  
PSP

Medal Of Honor: Heroes 2  
PSP, Wii

Mercenaries 2  
Xbox 360, PLAYSTATION 3, PS2, PC

Micro Machines V4  
PlayStation2, PSP, PC

Monster House  
PlayStation2, GameCube

MotorStorm  
PLAYSTATION3

MotorStorm: Pacific Rift  
PlayStation 3

My Sims  
Wii

Neopets Petpet Adventure: The Wand of Wishing  
PSP

Novastrike  
PLAYSTATION3

Over the Hedge  
PlayStation2, Xbox, GameCube, PC

PAIN  
PlayStation Network

Painkiller  
PC

Painkiller: Overdose  
PC

Playground  
Wii

Requiem: Bloodymare  
PC

Saints Row  
Xbox 360

Saints Row 2  
Xbox 360

Scene It? Lights, Camera, Action  
Xbox 360

Shadowrun  
Xbox 360, PC

Shrek SuperSlam  
PlayStation2, Xbox, GameCube, PC

Shrek the Third  
Xbox 360, Playstation2, PSP, Wii

Soldier of Fortune: Payback  
Xbox 360, PLAYSTATION3, PC 

Sonic Riders 2: Zero Gravity  
PlayStation2

Sonic the Hedgehog  
Xbox 360, PLAYSTATION3

Soulcalibur IV  
PlayStation 3, Xbox 360

Spiderman: Friend or Foe  
PSP

Spore  
PC

Star Wars: The Force Unleashed  
Xbox 360, PLAYSTATION3

Stranglehold  
Xbox 360, PLAYSTATION3, PC

Stuntman: Ignition  
Xbox 360, PLAYSTATION3, PlayStation 2

Super Smash Bros. Brawl  
Wii

Superman Returns: The Videogame  
PlayStation2, Xbox 360, PC

Syphon Filter: Logan's Shadow  
PSP

Teen Titans  
PlayStation2, Xbox, GameCube

Test Drive Unlimited  
Xbox 360

The Elder Scrolls IV: Oblivion  
Xbox 360, PLAYSTATION3, PC   

The Godfather  
Xbox 360, PlayStation2, Xbox, PC

The Godfather: Blackhand Edition  
Wii

The Golden Compass  
Xbox 360, PLAYSTATION3, Wii, PlayStation2, PC

The Incredible Hulk  
Xbox 360, PLAYSTATION3, PlayStation2, PC, Wii

The Outfit  
Xbox 360

The Simpsons  
Xbox 360, PLAYSTATION3, Wii, PlayStation2, PSP

Time Crisis 4  
PLAYSTATION3

Timeshift  
Xbox 360, PLAYSTATION3, PC

TNA iMPACT!  
Xbox 360, PLAYSTATION3

Tom Clancy's Ghost Recon Advanced Warfighter 2  
Xbox 360, PLAYSTATION3

Alan Wake  

Diablo III  

Fracture  

Halo Wars  

Indiana Jones  

Ride to Hell  

Saboteur  

Splatterhouse  

Starcraft II  

This is Vegas  

Wheelman

Movies:
Troy
Kingdom of Heaven
Charlie and the Chocolate Factory
X-Men: The Last Stand
Poseidon
Harry Potter and the Order Of The Phoenix
10,000 BC
The Chronicles of Narnia: Prince Caspian


----------



## Wile E (Mar 23, 2009)

I'd choose Physx, only because it runs on the much more capable gpu, as opposed to a cpu. Havok physics, while good, is still limited by cpu power at this point. And when it does go GPGPU, the old titles won't magically get GPU computing support. They would have to be recoded. Meaning that Havok will be starting from square one anyway, giving Physx the advantage in the market due to being more mature.

I look forward to Havok on a gpu as well. But this anti-Physx sentiment needs to stop.


----------



## TheMailMan78 (Mar 24, 2009)

Wile E said:


> I'd choose Physx, only because it runs on the much more capable gpu, as opposed to a cpu. Havok physics, while good, is still limited by cpu power at this point. And when it does go GPGPU, the old titles won't magically get GPU computing support. They would have to be recoded. Meaning that Havok will be starting from square one anyway, giving Physx the advantage in the market due to being more mature.
> 
> I look forward to Havok on a gpu as well. But this anti-Physx sentiment needs to stop.



I was never against PhysX. I was jealous Nvidia had it, to be honest. I just think AMD adopting Havok is great news and shouldn't be played down.


----------



## CDdude55 (Mar 24, 2009)

I doubt we will see any big differences between Havok for ATI and PhysX for Nvidia. This should help games that really utilize physics processing (which isn't many), so whether you go with ATI or Nvidia shouldn't really be a big factor if you want a physics processor.


----------



## DarkMatter (Mar 24, 2009)

TheMailMan78 said:


> I was never against PhysX. I was jealous Nvidia had it, to be honest. I just think AMD adopting Havok is great news and shouldn't be played down.



I'm in no way downplaying Havok; no one here has. Re-read this thread: the only API that has been downplayed here is PhysX, while I've been claiming equality.

But AMD adopting Havok is indeed a bad move on their part, IMO. Not only will it be at Intel's mercy, but it starts later and is in no way assured better adoption from developers. As Wile said, it's the CPU version of Havok that has been in use, not the GPU one. So actually the score is 7-0 at the moment, off the top of my head: UT3, Mirror's Edge, Warmonger, Cryostasis, Crazy Machines, Metal Knight Zero and Nurien. And if you take Ageia PPU games into account, that's 10-0, adding CellFactor and GRAW 1 and 2.

Also, CPU Havok lacks a lot of features that PhysX has (like advanced cloth and fluids in real time*), and although I suppose GPU Havok has those, they are new to developers. And no matter how you look at it, the API will be very different from the CPU one. So familiarity won't count for much here.

Considering what AMD said about PhysX and its short lifespan, GPU Havok has an even shorter life in AMD's own mind, so it's a dead end even before it has been demonstrated. Nvidia at least believes in PhysX, but I highly doubt AMD sees this as much more than a short-term stopgap and a marketing tool to divert attention from PhysX, and hey, congrats AMD, because it served its purpose.

* There is an off-line or production version of Havok that is very different from the one in games, and that is the one used in movies. If those movies served as any proof of Havok's superiority, I might as well say Nvidia has much, much better image quality, because Mental Ray, its production renderer and ray-tracer, has been used to some extent in almost all major titles, or a good number of them, e.g. the Matrix trilogy. I say this just as an example that one good product doesn't automatically make all the similar ones from that company better; I'm just presenting a parallel example.


----------



## Steevo (Mar 24, 2009)

Everyone keeps forgetting that most of this gaming goes on in the Windoze realm, where a magical elf named Bill makes a "DX" and everyone makes games for it. The 360 is also made by Bill, so if the new DX implements physics as a standard API, it will probably be used for both. Developers are the key to this: do you code for two of the largest customers?


----------



## DarkMatter (Mar 24, 2009)

Steevo said:


> Everyone keeps forgetting that most of this gaming goes on in the Windoze realm, where a magical elf named Bill makes a "DX" and everyone makes games for it. The 360 is also made by Bill, so if the new DX implements physics as a standard API, it will probably be used for both. Developers are the key to this: do you code for two of the largest customers?



Electronic Arts is indeed distributing PhysX as the physics engine for their developers. And although I would agree with any criticism of the quality of their games, there's no one bigger than them.

And 2K Games. I forgot about 2K Games.


----------



## ShRoOmAlIsTiC (Mar 24, 2009)

Think of it as CrossFire and SLI: both will work, and we will be able to benefit from them no matter what setup we have.


----------



## DarkMatter (Mar 24, 2009)

ShRoOmAlIsTiC said:


> Think of it as CrossFire and SLI: both will work, and we will be able to benefit from them no matter what setup we have.



I highly doubt both will be used at the same time. Too much work and money. I could see Nvidia giving PhysX out for free, considering they already have to some extent, but Havok...


----------



## TheMailMan78 (Mar 24, 2009)

DarkMatter said:


> I'm in no way downplaying Havok; no one here has. Re-read this thread: the only API that has been downplayed here is PhysX, while I've been claiming equality.
> 
> But AMD adopting Havok is indeed a bad move on their part, IMO. Not only will it be at Intel's mercy, but it starts later and is in no way assured better adoption from developers. As Wile said, it's the CPU version of Havok that has been in use, not the GPU one. So actually the score is 7-0 at the moment, off the top of my head: UT3, Mirror's Edge, Warmonger, Cryostasis, Crazy Machines, Metal Knight Zero and Nurien. And if you take Ageia PPU games into account, that's 10-0, adding CellFactor and GRAW 1 and 2.
> 
> ...



Nvidia is distributing PhysX for free to saturate the market, in the hope of Havok not becoming the industry standard. The reason they are doing this is Intel. Intel owns Havok, as you know, and don't you think that alone gives it a HUGE advantage over PhysX? I mean, Nvidia has to develop and support everything for PhysX on their own, while Havok has a MUCH larger support base in Intel. I agree PhysX is ahead of Havok in being GPU accelerated. No doubt. But with AMD/ATI working with Intel, who do you see as the long-term "victor"? AMD and Nvidia are not the only two players in this particular game. Who do you think Microsoft is going to support if both x86 developers back one standard?

Also, to clear the record, I've always loved the idea of GPU acceleration.


----------



## DarkMatter (Mar 24, 2009)

TheMailMan78 said:


> Nvidia is distributing PhysX for free to saturate the market, in the hope of Havok not becoming the industry standard. The reason they are doing this is Intel. Intel owns Havok, as you know, and don't you think that alone gives it a HUGE advantage over PhysX? I mean, Nvidia has to develop and support everything for PhysX on their own, while Havok has a MUCH larger support base in Intel. I agree PhysX is ahead of Havok in being GPU accelerated. No doubt. But with AMD/ATI working with Intel, who do you see as the long-term "victor"? AMD and Nvidia are not the only two players in this particular game. Who do you think Microsoft is going to support if both x86 developers back one standard?
> 
> Also, to clear the record, I've always loved the idea of GPU acceleration.



Yes, but I don't see Intel pushing GPU Havok anytime soon. CPU Havok, yes, but not the GPU one, not until they have a GPU that can compete. Don't you see that GPU Havok would make Intel CPUs look like crap? It's as if it screamed "AMD is better for gaming" every day.

Lots of developers have stated they like the idea of GPU physics, so it's possible they adopt PhysX if there isn't any other option. 
I wasn't talking about that anyway. The idea is simple: Havok (Intel) will probably win, as you said, and I'm not saying otherwise, but that doesn't mean AMD will win with them. GPU Havok on AMD GPUs will probably never see the light of day (except for 1 or 2 titles), so we will lose both implementations.

Anyway, when it comes to supporting something, I couldn't care less which one is going to win. I will always support the one I think is better for consumers. I'm seeing how many in this forum, you included, seem to support Havok just because it has a better chance of winning. That's stupid. If you think it will win, buy accordingly when the time comes, if you wish, but don't let that affect your mind and your words. It would have been much easier for William Wallace to scream "mercy", but that wasn't what he had to do.


----------



## Mussels (Mar 24, 2009)

DarkMatter said:


> Yes, but I don't see Intel pushing GPU Havok anytime soon. CPU Havok, yes, but not the GPU one, not until they have a GPU that can compete. Don't you see that GPU Havok would make Intel CPUs look like crap? It's as if it screamed "AMD is better for gaming" every day.
> 
> Lots of developers have stated they like the idea of GPU physics, so it's possible they adopt PhysX if there isn't any other option.
> I wasn't talking about that anyway. The idea is simple: Havok (Intel) will probably win, as you said, and I'm not saying otherwise, but that doesn't mean AMD will win with them. GPU Havok on AMD GPUs will probably never see the light of day (except for 1 or 2 titles), so we will lose both implementations.
> ...



Why is it so unlikely that we'll see titles with GPU Havok? They're doing a press release on it, the support is there... what makes you think it won't take off? You seem completely sure that it's going to fail, without providing a reason.


----------



## TheMailMan78 (Mar 24, 2009)

DarkMatter said:


> Yes, but I don't see Intel pushing GPU Havok anytime soon. CPU Havok, yes, but not the GPU one, not until they have a GPU that can compete. Don't you see that GPU Havok would make Intel CPUs look like crap? It's as if it screamed "AMD is better for gaming" every day.
> 
> Lots of developers have stated they like the idea of GPU physics, so it's possible they adopt PhysX if there isn't any other option.
> I wasn't talking about that anyway. The idea is simple: Havok (Intel) will probably win, as you said, and I'm not saying otherwise, but that doesn't mean AMD will win with them. GPU Havok on AMD GPUs will probably never see the light of day (except for 1 or 2 titles), so we will lose both implementations.
> ...


You're fighting the good fight, are you?


----------



## DarkMatter (Mar 24, 2009)

Mussels said:


> Why is it so unlikely that we'll see titles with GPU Havok? They're doing a press release on it, the support is there... what makes you think it won't take off? You seem completely sure that it's going to fail, without providing a reason.



I already said: Intel.



TheMailMan78 said:


> You're fighting the good fight, are you?



Yes, I am.  Reason being: Intel.


----------



## ShadowFold (Mar 24, 2009)

Do you not have an Intel CPU? They must be doing something right


----------



## Mussels (Mar 24, 2009)

DarkMatter said:


> I already said: Intel.



I don't see any logic behind that.

If Intel makes it x86, then AMD can run it, VIA can run it, etc.
If they make it GPU-capable, then at least they're getting paid royalties for it.


----------



## DarkMatter (Mar 24, 2009)

ShadowFold said:


> Do you not have an Intel CPU? They must be doing something right



As I said, what hardware I have has nothing to do with which technologies I want for the future. I bought Intel because I always buy whichever is best in the price range I choose to pay (maybe you could all remember this Intel example the next time you're tempted to think I'm an Nvidia fanboy). At the same time, I know how shady and evil Intel is. I gave them my money because they had the better product, but I never give my heart to any company. I know they've been holding back physics in games, and I know they will continue to do so in the future. I will probably buy an Intel CPU again, but I don't want them doing the physics, because I know that means the stagnation, milking and sandbagging of something I've been wanting for almost a decade.


----------



## DarkMatter (Mar 24, 2009)

Mussels said:


> I don't see any logic behind that.
> 
> If Intel makes it x86, then AMD can run it, VIA can run it, etc.
> If they make it GPU-capable, then at least they're getting paid royalties for it.



It's not about who, but *how*. If Intel makes it x86, two things will happen:

1. It will never reach the heights that GPU physics would reach. GPUs are always going to be faster (at number crunching) than any x86-based CPU/GPU for the same die area and power consumption. In any case, you would need to buy an Intel GPU, or AMD would have to start making them, because CPUs will never pack enough power on their own. It's not cost-effective to add that many ALUs to a CPU when most of the time they would be idling. And high-end on-die GPUs, so that you don't need a discrete one (and hence the ALUs would be working most of the time), will never happen because of thermal constraints.

2. Because of the growing importance of physics in games, if Havok is used and it is x86-based, the better gaming solution would be an Intel GPU, because at graphics+physics it would be faster, even though it would lag far behind at pure graphics. It would also lag in physics if Havok weren't used and true GPUs from Nvidia and AMD could take advantage of PhysX instead of being forced down Intel's path.

Not to mention that I think it's time for everybody to open their eyes and see that AMD has nothing on Intel when it comes to x86. Intel is ages ahead, and I don't think AMD will ever catch up. They will continue to be competitive, because that's what most benefits Intel, but they will never be ahead again.


----------



## TheMailMan78 (Mar 24, 2009)

DarkMatter said:


> It's not about who, but *how*. If Intel makes it x86, two things will happen:
> 
> 1. It will never reach the heights that GPU physics would reach. GPUs are always going to be faster (at number crunching) than any x86-based CPU/GPU for the same die area and power consumption. In any case, you would need to buy an Intel GPU, or AMD would have to start making them, because CPUs will never pack enough power on their own. It's not cost-effective to add that many ALUs to a CPU when most of the time they would be idling. And high-end on-die GPUs, so that you don't need a discrete one (and hence the ALUs would be working most of the time), will never happen because of thermal constraints.
> 
> ...


That's a lot of assumptions and a pessimistic view on your part, Dark. Almost everything happening points to just the opposite of what you said. A year ago people were saying AMD was done for and that buying ATI was a huge mistake. Now look where we are.


----------



## DarkMatter (Mar 24, 2009)

TheMailMan78 said:


> That's a lot of assumptions and a pessimistic view on your part, Dark. Almost everything happening points to just the opposite of what you said. A year ago people were saying AMD was done for and that buying ATI was a huge mistake. Now look where we are.



And where are we?? Phenom II has a hard time even competing with Intel's two-year-old hardware, which was sandbagged to begin with. Now Intel's Nehalem is somewhere around 30% faster clock for clock, and even with the restrictions Intel has put in place to hold back overclocking, they still overclock like a charm. The die area of Nehalem is the same as that of Phenom II, so it's actually cheaper for Intel to produce because of their much bigger fabs. If Intel wanted or needed to, they could do anything from making much faster Nehalems to pricing lower than AMD, if they truly required that.


----------



## TheMailMan78 (Mar 24, 2009)

DarkMatter said:


> And where are we?? Phenom II has a hard time even competing with Intel's two-year-old hardware, which was sandbagged to begin with. Now Intel's Nehalem is somewhere around 30% faster clock for clock, and even with the restrictions Intel has put in place to hold back overclocking, they still overclock like a charm. The die area of Nehalem is the same as that of Phenom II, so it's actually cheaper for Intel to produce because of their much bigger fabs. If Intel wanted or needed to, they could do anything from making much faster Nehalems to pricing lower than AMD, if they truly required that.



Wow, the R&D behind Nehalem must have been free then. As for Phenom II having a hard time competing, I think you'd better research their intended performance bracket. The Phenom II is mopping the floor with Intel. It's an overachiever when it comes to games, too; in gaming it goes toe to toe with the i7. I won't even go into ATI's end of things. So yeah, we are FAR better off than we were a year ago. The future looks bright!


----------



## DarkMatter (Mar 24, 2009)

TheMailMan78 said:


> Wow, the R&D behind Nehalem must have been free then. As for Phenom II having a hard time competing, I think you'd better research their intended performance bracket. The Phenom II is mopping the floor with Intel. It's an overachiever when it comes to games, too; in gaming it goes toe to toe with the i7. I won't even go into ATI's end of things. So yeah, we are FAR better off than we were a year ago. The future looks bright!



There's a time when you have to come back to reality. AMD is better off now than a year ago, but I don't see the bright future you talk about. I want to see it, but I'm too much of a realist for that. 

I'm sorry for you, really. Just out of curiosity, how much did you pay for those stocks? I came close to buying 4000€ of AMD stock around 18 months ago, at $7-8 or so I think, and I'm so happy that I didn't... uff, I dodged that one by a hair.


----------



## TheMailMan78 (Mar 24, 2009)

DarkMatter said:


> There's a time when you have to come back to reality. AMD is better off now than a year ago, but I don't see the bright future you talk about. I want to see it, but I'm too much of a realist for that.
> 
> I'm sorry for you, really. Just out of curiosity, how much did you pay for those stocks? I came close to buying 4000€ of AMD stock around 18 months ago, at $7-8 or so I think, and I'm so happy that I didn't... uff, I dodged that one by a hair.



Any stock right now is shot. However, I've been buying into all kinds of things. Five years from now people are going to be kicking themselves. Patience is key. Stocks never reflect the merits of a good company, only the negative press. Right now AMD is generating some great press. If this continues, the stock will rise. How do you think Apple survived?

FYI, me owning a part of AMD has nothing to do with my views. I used to run an 8800 in my old gaming system.


----------



## Wile E (Mar 25, 2009)

TheMailMan78 said:


> Wow, the R&D behind Nehalem must have been free then. As for Phenom II having a hard time competing, I think you'd better research their intended performance bracket. The Phenom II is mopping the floor with Intel. It's an overachiever when it comes to games, too; in gaming it goes toe to toe with the i7. I won't even go into ATI's end of things. So yeah, we are FAR better off than we were a year ago. The future looks bright!



The Phenom is not mopping the floor with Intel. You need to double-check that. The only things it really keeps up with Intel in are encoding and gaming. And it only keeps up in gaming because current titles are limited by the GPU, not the CPU. AMD is still behind clock for clock; it's only equal in some tasks. To put that into perspective, though, they are only now starting to catch up to a two-year-old design. Compared to i7, Phenom II loses everywhere.

Yes, AMD made a step in the right direction, and they are now competitive in their bracket, but I doubt they'll pass Intel up in performance any time in the next couple of years. Intel has a much larger R&D budget.


----------



## kumquatsrus (Mar 26, 2009)

http://firingsquad.com/news/newsarticle.asp?searchid=21450

There you have it: Havok Cloth and Havok Destruction accelerated on the GPU over OpenCL.
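For context on why cloth in particular is a good fit for the GPU: cloth solvers update every particle independently each frame, which is exactly the data-parallel shape OpenCL targets. A minimal sketch in plain Python (the function below stands in for what an OpenCL kernel would run once per work-item; all names are made up for illustration, this is not Havok's actual API):

```python
# Verlet-integrated cloth particles, 1D heights only for brevity.
# Each particle's update depends only on its own state, so on a GPU
# every particle maps to one OpenCL work-item and a whole frame of
# cloth updates runs in parallel.

GRAVITY = -9.81      # m/s^2
DT = 1.0 / 60.0      # one 60 fps frame

def step_particle(pos, prev_pos, pinned):
    """One Verlet step for one particle (the per-work-item kernel body)."""
    if pinned:
        return pos, pos            # pinned particles never move
    # Verlet integration: new = pos + (pos - prev) + a * dt^2
    new_pos = pos + (pos - prev_pos) + GRAVITY * DT * DT
    return new_pos, pos

def step_cloth(heights, prev_heights, pinned_flags):
    """CPU stand-in for launching the kernel over all particles."""
    stepped = [step_particle(p, q, f)
               for p, q, f in zip(heights, prev_heights, pinned_flags)]
    return [n for n, _ in stepped], [p for _, p in stepped]

# A small strip of "cloth": first particle pinned, the rest free.
heights = [1.0, 1.0, 1.0, 1.0]
prev = list(heights)
pinned = [True, False, False, False]
heights, prev = step_cloth(heights, prev, pinned)
```

Real cloth adds distance constraints between neighboring particles (solved iteratively) plus collision, but the per-particle independence above is the property that makes it GPU-friendly.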


----------



## ShadowFold (Mar 26, 2009)

Well, that's kinda cool. I want to see some videos of this destruction, though.


----------



## Wile E (Mar 26, 2009)

I think nVidia should port Physx to use OpenCL.


----------



## DarkMatter (Mar 27, 2009)

I don't see anything at that link, just PR. I thought there would be a video there. I have seen the videos on Havok's page though, and they are not impressive at all. I'll stick with PhysX, thank you.


----------



## FryingWeesel (Apr 4, 2009)

DarkMatter said:


> Also, DX11 or OpenCL doesn't change anything. Nvidia can and will adapt PhysX to DX11/OpenCL when Win 7 launches too. Remember that it took them 3 months to adapt PhysX to CUDA. It won't take much more to adapt it to any other API.
> 
> And maybe I'm wrong, but when that happens it's going to be very interesting, because ATI cards will be able to play PhysX, with AMD's permission or without. The only requirement for PhysX will then be a card capable of running DX11 compute shaders, and AMD cards will meet it. It will matter very little whether AMD wants PhysX on their cards or not; unless they do something shady, they will not be able to prevent it.
> 
> The one that will win this game is the one that gets better support now, and IMHO that's PhysX at the moment.



Yeah, PhysX is getting better support and is being used far more than Havok, mostly because Intel doesn't really want GPU-enhanced physics; they want to keep everything on the CPU.

I don't see it taking them that long, honestly. I would bet Nvidia is already working on the project, maybe with only a couple of people, but the fact is that if they get started early, they can put it out either early or on the day DX11 is available.

Also, from what I have read, this is NOT TIED TO DX11 shaders, so you won't have to replace your 8800+ or HD2k+ card to use it (hell, even the X1900/1950 are actually capable of this kind of work).

The prospect is exciting to me, if only both companies would grow up and realize they need to team up against Intel (the main company driving this "keep everything on the CPU" BS).

If ATI/AMD and Nvidia could work together in at least a limited fashion, they could take some of the wind out of Intel's sails. I mean, it wouldn't be perfect, but at least AMD/ATI and NV onboard graphics could be used as a PPU. Think about it: again, not "optimal", but if all boards had onboard graphics that could support physics (780/790GX and the Nvidia equivalents)...

blah... just blah!!


----------



## soryuuha (Apr 6, 2009)

DarkMatter said:


> I don't see anything at that link, just PR. I thought there would be a video there. *I have seen the videos on Havok's page* though, and they are not impressive at all. I'll stick with PhysX, thank you.



Did you see the Havok videos or the HavokFX videos?


----------



## FryingWeesel (Apr 6, 2009)

Wile E said:


> I think nVidia should port Physx to use OpenCL.



I would bet you they are already working on it. If Havok can run on OpenCL, then Nvidia is gonna want a piece of that action and the ability to have PhysX do the same thing and work on the same hardware.

I would bet you start seeing other physics engines moving to support OpenCL as well; it just makes sense to support it if available.


----------



## Wile E (Apr 6, 2009)

FryingWeesel said:


> I would bet you they are already working on it. If Havok can run on OpenCL, then Nvidia is gonna want a piece of that action and the ability to have PhysX do the same thing and work on the same hardware.
> 
> I would bet you start seeing other physics engines moving to support OpenCL as well; it just makes sense to support it if available.



That's my take on it. But I get this feeling it won't happen. Nvidia really likes to lock their technologies to their hardware for some reason, presumably to help their hardware sales. I'm hoping PhysX doesn't drive enough hardware demand, leaving them no choice but to port it over to OpenCL.


----------



## FryingWeesel (Apr 6, 2009)

Well, Nvidia wants PhysX to be top dog; they were even trying to help those people make a port for ATI cards (the one that turned out to be vaporware).

For Nvidia, pushing PhysX would have no downside really, even if that means allowing support over OpenCL; their CUDA-based implementation would be more optimized anyway. After all, they own the engine; they could make sure it simply runs better on their cards than on OpenCL-based cards.


----------



## Wile E (Apr 6, 2009)

FryingWeesel said:


> Well, Nvidia wants PhysX to be top dog; they were even trying to help those people make a port for ATI cards (the one that turned out to be vaporware).
> 
> For Nvidia, pushing PhysX would have no downside really, even if that means allowing support over OpenCL; their CUDA-based implementation would be more optimized anyway. After all, they own the engine; they could make sure it simply runs better on their cards than on OpenCL-based cards.



Yeah, that's a way to look at it too, I suppose.

I will say, PhysX is actually one thing I miss about owning an NV card. But then again, I really like GRAW and GRAW2 a lot.


----------



## Mussels (Apr 6, 2009)

Game devs will have a simple choice:

NV PhysX, which only works on G92 and newer cards.

Havok, which works on ATI cards (2k series and up).

Now it's a tough choice - it comes down to whatever's easiest to code for, or cheapest to use. If one of them goes OpenCL and becomes viable on both brands, that engine takes the lead - no alienating your customer base.


----------



## FryingWeesel (Apr 6, 2009)

Mussels said:


> Game devs will have a simple choice:
> 
> NV PhysX, which only works on G92 and newer cards.
> 
> ...



Um, from what I remember, G80 cards can now run PhysX (any CUDA-compatible card can), so even older 8800s can support PhysX.

As to Havok vs PhysX, they both have their flaws and advantages, as I have said before. As I understand it, Nvidia is giving PhysX away 
http://developer.nvidia.com/object/physx.html
In fact they are!!!

PhysX has gone "license free", which makes it more attractive to developers than Havok, which is owned by Intel and thus requires them to get a license from Intel to use/support it.


----------



## Mussels (Apr 6, 2009)

G80 and up!

Screwup on my part; I was getting confused because I've been using CoreAVC, and that's G92 and up.


----------



## FryingWeesel (Apr 7, 2009)

A little info/demo of what hardware PhysX can do:
http://www.guru3d.com/article/geforce-gtx-275-review-test/7

Ambient Occlusion:
http://www.guru3d.com/article/geforce-gtx-275-review-test/6

The AO performance hit in the games I have tested has been pretty low on my 8800GTS 512MB, as long as I'm using the 185.66 drivers; on older drivers the hit was higher. I would guess the performance hit will drop even more as the drivers mature. 

WoW, for example, looks amazing with AO enabled - the shadows are actually NICE - and NwN2 looks better, as well as some other older games I have tested.


----------



## TheMailMan78 (Apr 10, 2010)

Whatever happened to this? I have yet to see any GPU-accelerated physics on ATI.


----------



## Mussels (Apr 10, 2010)

TheMailMan78 said:


> Whatever happened to this? I have yet to see any GPU-accelerated physics on ATI.



Nothing, really.

At a guess, game devs are holding off until ATI and NV have working DirectCompute/OpenCL drivers.


----------



## kid41212003 (Apr 10, 2010)

I believe GDC has already passed?


----------



## TheMailMan78 (Apr 10, 2010)

Mussels said:


> Nothing, really.
> 
> At a guess, game devs are holding off until ATI and NV have working DirectCompute/OpenCL drivers.



Lord knows when that will be.


----------



## FordGT90Concept (Apr 10, 2010)

Most likely, AMD discovered it ain't so easy to map Havok calls onto their GPUs, so they gave up on it. That, or Intel did something to shoo them away.

There was talk of Havok FX (Havok on the GPU) a long time ago, but it never happened. Havok just appears to be in limbo. Intel doesn't want to do anything with it because of the issues with Larrabee. They are just expanding the libraries to do more stuff, like AI pathfinding and cloth.


But think about it: what is Havok's specialty? Making physics code believable but not very intensive, so it can run on the CPU without causing problems. There really is no market for them to make a more complex physics engine (like PhysX) that increases the hardware burden substantially when the end user can't tell the difference. Havok is Havok--it works great and is hardware friendly.

Maybe Intel, Havok, and AMD discovered this, so Havok is now just minding its own business, improving an already successful product?


----------



## TheMailMan78 (Apr 10, 2010)

Meh. I'm going Nvidia in my next build. ATI has a bunch of features that no one uses.


----------



## erocker (Apr 10, 2010)

TheMailMan78 said:


> Meh. I'm going Nvidia in my next build. ATI has a bunch of features that no one uses.



What? So you can use PhysX for all of those PhysX games?  I hear Fermis love Florida this time of year.


----------



## TheMailMan78 (Apr 10, 2010)

erocker said:


> What? So you can use PhysX for all of those PhysX games?



Yeah, the few that offer it. Plus the folding aspect and the TWIMTBP program are both nice. We have Stream/OpenCL, which supports.........um?

Oh, and no Fermi. Too hot for my region. I have to wait until the next gen.


----------



## sneekypeet (Apr 10, 2010)

Don't let erocker pull your chain; he is currently raising funds for said fail PhysX... lol


----------



## erocker (Apr 10, 2010)

TheMailMan78 said:


> Yeah, the few that offer it. Plus the folding aspect and the TWIMTBP program are both nice. We have Stream/OpenCL, which supports.........um?



It doesn't matter to me, bud. Do we know folding performance yet? I've been playing plenty of "PhysX" games just fine with my card. None of these features are anything to get rid of my current card over. Though, I would gladly replace it with a GTX 480 on performance alone.




sneekypeet said:


> Don't let erocker pull your chain; he is currently raising funds for said fail PhysX... lol



OMG NO YOU!!!!


----------



## TheMailMan78 (Apr 10, 2010)

sneekypeet said:


> Don't let erocker pull your chain; he is currently raising funds for said fail PhysX... lol



All I'm saying is that I think I'll go NVIDIA next time.


----------



## sneekypeet (Apr 10, 2010)

erocker said:


> OMG NO YOU!!!!



Just waiting for Monday. I just hope they don't release while I'm at the dentist and I miss everything.


----------



## Deleted member 24505 (Apr 10, 2010)

I'm just waiting for the free NVIDIA card with TWIMTBP-branded games; otherwise I will just avoid them in future if I can. I am sick of NVIDIA giving game companies backhanders to make sure games run like crap on ATI hardware.


----------



## sneekypeet (Apr 10, 2010)

I'm in it for the free T-shirt Newegg is offering.


----------



## Fourstaff (Apr 10, 2010)

I am waiting for the day when Intel bans Havok on AMD chips. Hopefully it doesn't happen, but it's a possibility.


----------



## FordGT90Concept (Apr 10, 2010)

tigger said:


> I'm just waiting for the free NVIDIA card with TWIMTBP-branded games; otherwise I will just avoid them in future if I can. I am sick of NVIDIA giving game companies backhanders to make sure games run like crap on ATI hardware.


I play lots, and lots, and lots of games.  Overall, I think there are fewer problems with AMD cards than NVIDIA cards for one simple reason: AMD releases drivers monthly, while NVIDIA is doing well to release them bi-annually.  If there is a problem, AMD is likely to get it fixed long before NVIDIA does.

Let's also not forget that NVIDIA is the reason DX 10.1 exists (DX 10.1 features were intended to be part of DX 10, but NVIDIA couldn't make Microsoft's deadline, so Microsoft had to release DX 10 and later DX 10.1 with the features NVIDIA couldn't support but AMD could), and even then, it took them years to finally adopt it.  NVIDIA was also late in releasing DX11 parts, by about half a year.

Oh, and the obscenely high failure rates on GeForce 8 series cards.


All of the above are the reasons I went back to AMD.




Fourstaff said:


> I am waiting for the day when Intel bans Havok on AMD chips. Hopefully it doesn't happen, but it's a possibility.


If Intel did that, Havok would end up like PhysX, with rare implementations.  Intel won't do that, for the sake of keeping Havok a viable company.


----------



## Wile E (Apr 11, 2010)

tigger said:


> I'm just waiting for the free NVIDIA card with TWIMTBP-branded games; otherwise I will just avoid them in future if I can. I am sick of NVIDIA giving game companies backhanders to make sure games run like crap on ATI hardware.



This is total and complete BS. I get so sick of this claim. Not even ATI themselves make this claim.

TWIMTBP does not cripple ATI hardware, PERIOD. It just means nVidia took the time to help devs optimize for their hardware. Optimizing for nVidia is not the same as crippling ATI. ATI has the same opportunity to offer dev help, but usually chooses not to. How is this, in any way, "evil" nVidia crippling ATI?

PS: Sorry I sound snippy, tig. It's not meant to be personal; it's just a frustrating topic to see always popping up.



FordGT90Concept said:


> I play lots, and lots, and lots of games.  Overall, I think there are fewer problems with AMD cards than NVIDIA cards for one simple reason: AMD releases drivers monthly, while NVIDIA is doing well to release them bi-annually.  If there is a problem, AMD is likely to get it fixed long before NVIDIA does.
> 
> Let's also not forget that NVIDIA is the reason DX 10.1 exists (DX 10.1 features were intended to be part of DX 10, but NVIDIA couldn't make Microsoft's deadline, so Microsoft had to release DX 10 and later DX 10.1 with the features NVIDIA couldn't support but AMD could), and even then, it took them years to finally adopt it.  NVIDIA was also late in releasing DX11 parts, by about half a year.
> 
> ...


Except that for the past year and a half, it's been ATI with the more bug-laden drivers. nVidia's turn will come back around again, tho. Both companies go back and forth on driver quality, just like they go back and forth in performance.

And the failure rate on 8-series cards did not seem that high to me. Certainly not much different from ATI. Unless you meant the big defective batch of mGPUs?


----------



## a_ump (Apr 11, 2010)

I agree with Wile E. The way I can relate to it is a recent project I had in English 12: we had a group of four acting out a scene from Macbeth (I hate Eng 12). One of our group wasn't there for the script writing; we wrote ours, and then had to write his. He didn't really care for his, but we were fine with ours. Are we "evil" or wrong for writing his script because he wasn't there? No. Is NVIDIA wrong for helping devs on their hardware when ATI isn't helping devs on theirs? No.


----------



## Grings (Apr 11, 2010)

So, just because ATI don't shout about it at every opportunity like NVIDIA do, that means they don't work with game developers?

Nonsense.

http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1


----------



## Wile E (Apr 11, 2010)

Grings said:


> So, just because ATI don't shout about it at every opportunity like NVIDIA do, that means they don't work with game developers?
> 
> Nonsense.
> 
> http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1



That wasn't the point at all. The point was that TWIMTBP is not anti-ATI.

Besides, nVidia still does it more, and has stronger ties to more devs. ATI (and AMD in general, actually) does not push itself as hard in the market as it could, especially compared to its competitors.

PS: ATI is launching its answer to TWIMTBP this year, from what I understand. So we may be seeing more of them in the dev process of games. It's about damn time, too.


----------



## Grings (Apr 11, 2010)

From the interview I posted (discussing Batman: Arkham Asylum's NVIDIA-only anti-aliasing):



> The part that I totally hold in contempt is the appalling way they added MSAA support that uses standard DirectX calls - absolutely nothing which is proprietary in any useful sense. They just did ordinary stuff, a completely standard recommendation that they make and that we make to developers for how to do MSAA, and they put it in and locked it to their hardware knowing it would run just fine on our hardware. And indeed, if you simply spoof the vendor ID in the driver - which we and other people have documented - it runs absolutely fine on AMD hardware. There's nothing proprietary about it in that sense, nothing new. I think that's exceptionally poor.

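The lock-out the interview describes amounts to gating a standard feature on the GPU's reported PCI vendor ID. A rough sketch of that kind of check, purely as a hypothetical illustration (the function name is made up; this is not the game's actual code, only the vendor ID constants are real):

```python
# Hypothetical sketch of the vendor-ID gate described above: the MSAA
# path uses standard DirectX calls, but is only enabled when the GPU
# reports NVIDIA's PCI vendor ID.

NVIDIA_VENDOR_ID = 0x10DE  # PCI vendor ID assigned to NVIDIA
ATI_VENDOR_ID = 0x1002     # PCI vendor ID assigned to ATI/AMD

def msaa_unlocked(reported_vendor_id: int) -> bool:
    """Return True if the in-game MSAA option would be offered."""
    return reported_vendor_id == NVIDIA_VENDOR_ID

# On real ATI hardware the option is locked out:
print(msaa_unlocked(ATI_VENDOR_ID))     # False
# "Spoofing" the vendor ID in the driver, as the interview describes,
# makes the same code path light up:
print(msaa_unlocked(NVIDIA_VENDOR_ID))  # True
```

This is why spoofing the ID is enough to change behaviour: nothing in the feature itself is hardware-specific, only the check in front of it.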

----------



## a_ump (Apr 11, 2010)

Grings said:


> So, just because ATI don't shout about it at every opportunity like NVIDIA do, that means they don't work with game developers?
> 
> Nonsense.
> 
> http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1



Damn good read.


----------



## Wile E (Apr 11, 2010)

Grings said:


> From the interview I posted (discussing Batman: Arkham Asylum's NVIDIA-only anti-aliasing):



nVidia added it to the engine themselves, and did not test on ATI hardware (and why would they?). Of course they locked it out.

And his claim that it worked perfectly fine on ATI hardware is a bald-faced lie. It was shown that it doesn't work properly on ATI, even if you spoof it. There were screenshots all over the place proving it when the game released: AA was not applied to shadows on ATI, among other weird anomalies. They may have since fixed it in drivers, but at launch it was, in fact, broken. And it was not nVidia's job to get it working on ATI hardware either. If they would've left it unlocked, they would've caught hell for it being broken, and we still would've seen people claiming they did it on purpose. They were damned if they did, and damned if they didn't.


----------



## TheMailMan78 (Apr 11, 2010)

Wile E said:


> nVidia added it to the engine themselves, and did not test on ATI hardware (and why would they?). Of course they locked it out.
> 
> And his claim that it worked perfectly fine on ATI hardware is a bald-faced lie. It was shown that it doesn't work properly on ATI, even if you spoof it. There were screenshots all over the place proving it when the game released: AA was not applied to shadows on ATI, among other weird anomalies. They may have since fixed it in drivers, but at launch it was, in fact, broken. And it was not nVidia's job to get it working on ATI hardware either. If they would've left it unlocked, they would've caught hell for it being broken, and we still would've seen people claiming they did it on purpose. They were damned if they did, and damned if they didn't.



Just to be clear, Batman: AA now does in fact offer AA on ATI drivers. I proved it.....

http://forums.techpowerup.com/showthread.php?t=119242


----------



## Mussels (Apr 12, 2010)

TheMailMan78 said:


> Just to be clear, Batman: AA now does in fact offer AA on ATI drivers. I proved it.....
> 
> http://forums.techpowerup.com/showthread.php?t=119242



Via in-game settings, or CCC?

Edit: read the thread.

The fact that you need to FORCE AA means the game does NOT "offer" AA on ATI drivers; it just means there is a way to make it work, even if it is performance-heavy.


----------



## TheMailMan78 (Apr 12, 2010)

Mussels said:


> Via in-game settings, or CCC?
> 
> Edit: read the thread.
> 
> The fact that you need to FORCE AA means the game does NOT "offer" AA on ATI drivers; it just means there is a way to make it work, even if it is performance-heavy.



Well, before, you couldn't do it. Now you can, and to be honest it's the Unreal 3 engine, which you could force AA on in every other title but Batman..... until now.


----------



## Mussels (Apr 12, 2010)

TheMailMan78 said:


> Well, before, you couldn't do it. Now you can, and to be honest it's the Unreal 3 engine, which you could force AA on in every other title but Batman..... until now.



Yeah, but let credit go where credit is due: ATI allow you to do it via CCC; B:AA and its sponsors are still doing everything they can to block you.


----------



## Wile E (Apr 12, 2010)

Mussels said:


> Yeah, but let credit go where credit is due: ATI allow you to do it via CCC; B:AA and its sponsors are still doing everything they can to block you.



Not unlocking AA for ATI is not the same as blocking ATI. At this point, it's purely up to the dev to unlock it in a patch if ATI does have it working properly.


----------



## Mussels (Apr 12, 2010)

Wile E said:


> Not unlocking AA for ATI is not the same as blocking ATI. At this point, it's purely up to the dev to unlock it in a patch if ATI does have it working properly.



Yes, it is.

Nvidia get an AA mode that applies only to what's necessary; ATI are forced to use a generic AA profile that takes a large performance hit (it applies AA to unnecessary elements). I dunno about you, but it's very clear to me that ATI was blocked and had to resort to workarounds just to get this slower AA method working.


----------



## Wile E (Apr 12, 2010)

Mussels said:


> Yes, it is.
> 
> Nvidia get an AA mode that applies only to what's necessary; ATI are forced to use a generic AA profile that takes a large performance hit (it applies AA to unnecessary elements). I dunno about you, but it's very clear to me that ATI was blocked and had to resort to workarounds just to get this slower AA method working.



No, it isn't. 

The only thing that is clear to me is that ATI opted not to offer any dev help in making a custom AA engine for that game.


----------

