Tuesday, September 29th 2009

Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Anti-aliasing (AA) is one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as an industry standard, and game developers follow suit by integrating AA features into their engines. This allows a game to apply AA selectively to parts of the 3D scene, improving overall image quality while preserving performance, since not every object in the scene receives AA. It seems that one of the most heavily marketed games of the year, Batman: Arkham Asylum, doesn't like to work with ATI Radeon graphics cards when it comes to its in-game AA implementation.

Developed under NVIDIA's The Way It's Meant to Be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that they had confirmed this with an experiment in which they ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that this workaround is not possible with the retail game, as it is protected by SecuROM.
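As an illustration of the kind of check AMD describes, here is a hypothetical sketch (not the game's actual launcher code; only the PCI vendor IDs are real PCI-SIG assignments) of gating a feature on the reported vendor ID, and why spoofing the ID re-enables it:

```python
# Hypothetical sketch of vendor-ID feature gating, as AMD's experiment implies.
# The PCI vendor IDs below are real PCI-SIG assignments; the function and its
# use here are illustrative only, not the game's actual code.

VENDOR_NVIDIA = 0x10DE  # NVIDIA's PCI vendor ID
VENDOR_ATI = 0x1002     # ATI/AMD's PCI vendor ID

def in_game_aa_available(reported_vendor_id: int) -> bool:
    """Expose the in-game AA option only when an NVIDIA GPU is reported."""
    return reported_vendor_id == VENDOR_NVIDIA

# Radeon card detected normally: the in-game AA option is hidden.
print(in_game_aa_available(VENDOR_ATI))     # False
# Same Radeon reporting an NVIDIA ID (AMD's trick): the option reappears.
print(in_game_aa_available(VENDOR_NVIDIA))  # True
```

A check like this says nothing about what the hardware can actually do, which is why changing only the reported ID was enough to restore the option.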

With no in-game AA available to ATI Radeon users, although the feature does technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to apply AA to every 3D object in the scene, reducing performance compared to using the game's in-game AA implementation. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
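A back-of-the-envelope sketch of why driver-forced AA costs more than selective in-game AA (all numbers and names below are invented for illustration):

```python
# Toy cost model: relative shaded work per frame, with 4x AA multiplying the
# cost of whatever gets anti-aliased. Scene shares and costs are made up.

scene = {"characters": 0.2, "geometry": 0.5, "skybox": 0.3}  # share of frame
AA_COST = 4                                  # relative cost multiplier, 4x AA
engine_aa_set = {"characters", "geometry"}   # what the engine chooses to AA

def frame_cost(aa_objects):
    """Relative per-frame cost when only `aa_objects` receive AA."""
    return sum(share * (AA_COST if name in aa_objects else 1)
               for name, share in scene.items())

print(frame_cost(engine_aa_set))   # selective in-game AA: ~3.1
print(frame_cost(scene))           # driver-forced AA on everything: ~4.0
```

The absolute numbers are meaningless; the point is that forcing AA in the driver pays the multiplier on every object, including ones the engine would have skipped.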

353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

#326
ArmoredCavalry
newtekie1: The concept of AA is not proprietary, however the different implementations are.
Yeah but do a lot of games develop their own 'algorithms' for AA? Or is there an industry standard of sorts? I wouldn't think that companies pay to develop something over and over that is so widely used.

And if they used a commercial solution, wouldn't they have somewhere in the credits "developed using X brand AA" like physics/audio is basically... ?
#327
erocker
*
You can run AA on an ATi card without a problem using CCC as far as I know.
#328
ArmoredCavalry
erocker: You can run AA on an ATi card without a problem using CCC as far as I know.
from article: "the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to use AA on every 3D object in the scene, reducing performance, compared to if the game's in-game AA engine is used."

Yeah, I'd imagine most high-end cards will run it with CCC AA with no issues... Since Unreal Engine 3 is getting on in years.

Of course I don't have the game (doesn't interest me enough) so I couldn't tell ya for sure. :ohwell:
#329
Benetanegia
ArmoredCavalry: Yeah but do a lot of games develop their own 'algorithms' for AA? Or is there an industry standard of sorts? I wouldn't think that companies pay to develop something over and over that is so widely used.

And if they used a commercial solution, wouldn't they have somewhere in the credits "developed using X brand AA" like physics/audio is basically... ?
The tools required to do MSAA are inside DX, but they are just that: tools that developers can use in their engines. Usually developers implement it in their rendering pipeline, in whatever way best fits their engine, desired effect, or expected performance. AA isn't a checkbox inside DX when you are creating your game; developers have to implement it.
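The "developers have to implement it" point can be sketched in miniature. This is a hypothetical toy 1-D model, not real GPU code: actual MSAA runs in the graphics pipeline, but the core idea, shading once per pixel while testing geometry coverage at several sub-pixel positions, looks like this:

```python
# Toy 1-D MSAA sketch (hypothetical; real MSAA is a GPU pipeline feature).
# Shading runs ONCE per pixel; coverage is tested at 4 sub-pixel offsets,
# and the resolve weights the shaded color by the covered fraction.

def covered(x):
    """Toy geometry: a surface occupying everything left of x = 2.5."""
    return x < 2.5

def shade(x):
    """Stand-in for the expensive per-pixel shading, executed once."""
    return 1.0

SUB_OFFSETS = [-0.375, -0.125, 0.125, 0.375]  # 4 coverage samples per pixel

def msaa_pixel(px):
    hits = sum(covered(px + o) for o in SUB_OFFSETS)
    return shade(px) * hits / len(SUB_OFFSETS)  # color scaled by coverage

row = [msaa_pixel(x + 0.5) for x in range(5)]
print(row)  # [1.0, 1.0, 0.5, 0.0, 0.0] -> the edge pixel is smoothed
```

Choosing the sample positions, doing the coverage test, and resolving the result are all decisions the engine makes, which is why AA interacts with the rest of the rendering pipeline instead of being a free checkbox.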

The other option is supersampling, which doesn't require support in the engine. The same frame is effectively rendered four times over and blended into one (more or less). The quality is better than MSAA, but the performance hit is huge.
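The supersampling description above can be sketched with a hypothetical toy model (real SSAA renders the whole frame at higher resolution, or several jittered times, and downsamples). The key contrast with MSAA is that the full shading work runs at every sub-sample before blending, which is exactly where the huge performance hit comes from:

```python
# Toy 1-D supersampling sketch (hypothetical; real SSAA renders the frame at
# a higher resolution, or several jittered times, and downsamples).
# Unlike MSAA, the full shading work runs at EVERY sub-sample position.

def shade_scene(x):
    """Full (expensive) scene shading: bright left of x = 2.5, dark right."""
    return 1.0 if x < 2.5 else 0.0

SUB_OFFSETS = [-0.375, -0.125, 0.125, 0.375]  # 4 shaded samples per pixel

def ssaa_pixel(px):
    # 4x the shading cost of a plain pixel: shade every sample, then average.
    return sum(shade_scene(px + o) for o in SUB_OFFSETS) / len(SUB_OFFSETS)

row = [ssaa_pixel(x + 0.5) for x in range(5)]
print(row)  # [1.0, 1.0, 0.5, 0.0, 0.0] -> smoothed edge, at 4x shading cost
```

Because the whole shader runs per sample, supersampling also smooths shader and texture detail that an edge-only coverage test misses, which is why the quality is better while the cost is so much higher.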

Epic didn't implement AA in UE3 for some reason (the PS3 can't do HDR+AA; AA is difficult to implement in deferred engines; whatever the reason). The developer behind Batman was not going to implement it, but I suppose Nvidia convinced them. That's what TWIMTBP is for. The situation isn't usual; most engines have AA implemented.
#331
dr emulator (madmax)
troyrae360: You might not be able to run AA on Batman with an ATI card, but can you do this with an NV card? www.youtube.com/watch?v=ujf6P6iGcfc :nutkick:
Whoa, that freaked me out. The YouTube video stopped and then it said "erocker, moderator". For a sec I was like, wtf, I know he's a mod here, but at YouTube as well? :eek::laugh:
#332
DaedalusHelios
Now that I have played the game, I think it's a boring beat-'em-up like Devil May Cry. No reason to fuss, because it's not a great game IMO. ;)
#333
mR Yellow
TBH, I've done the PhysX hack and it didn't add much to the gameplay. All I saw was smoke, spiderwebs, and paper effects. Nothing to go wow about.

To date PhysX has just been nothing but a sales gimmick. Portal and HL2 were way better in terms of gameplay.
#334
newtekie1
Semi-Retired Folder
mR Yellow: TBH, I've done the PhysX hack and it didn't add much to the gameplay. All I saw was smoke, spiderwebs, and paper effects. Nothing to go wow about.

To date PhysX has just been nothing but a sales gimmick. Portal and HL2 were way better in terms of gameplay.
For the most part, you are correct. PhysX doesn't add much beyond a little eye candy to any game so far.

PhysX had/has a lot of potential. However, it hasn't come close to showing its true potential in games, simply because it is proprietary and not supported on all hardware. So developers have to create a normal game, then just add a few PhysX elements to it later. Nothing related to gameplay is PhysX-based, because that would ruin the game for people without PhysX.

Now if a developer based the game, and its gameplay elements, on PhysX right from the beginning of development, we would see some pretty amazing stuff. A lot more realistic environments, fully destructible environments. Imagine Counter-Strike, but instead of having to enter a building only through the door or a window, you could also just blow a hole in the wall and walk in, and not just at certain pre-defined spots, but anywhere in the wall you wanted.

Sadly, we will never see this, because it doesn't run natively on ATi hardware. It is clear that nVidia knew this was required for PhysX to really show its potential, and this is why they wanted to get it up and running on ATi hardware. I'm sure at the time ATi definitely didn't want this, since they were in bed with Intel and Havok.
#335
mR Yellow
newtekie1: For the most part, you are correct. PhysX doesn't add much beyond a little eye candy to any game so far.

PhysX had/has a lot of potential. However, it hasn't come close to showing its true potential in games, simply because it is proprietary and not supported on all hardware. So developers have to create a normal game, then just add a few PhysX elements to it later. Nothing related to gameplay is PhysX-based, because that would ruin the game for people without PhysX.

Now if a developer based the game, and its gameplay elements, on PhysX right from the beginning of development, we would see some pretty amazing stuff. A lot more realistic environments, fully destructible environments. Imagine Counter-Strike, but instead of having to enter a building only through the door or a window, you could also just blow a hole in the wall and walk in, and not just at certain pre-defined spots, but anywhere in the wall you wanted.

Sadly, we will never see this, because it doesn't run natively on ATi hardware. It is clear that nVidia knew this was required for PhysX to really show its potential, and this is why they wanted to get it up and running on ATi hardware. I'm sure at the time ATi definitely didn't want this, since they were in bed with Intel and Havok.
Good point. Maybe nVidia should release a title to demonstrate this. Wasn't there a game called CellFactor that was supposed to do this?
#336
TheMailMan78
Big Member
newtekie1: For the most part, you are correct. PhysX doesn't add much beyond a little eye candy to any game so far.

PhysX had/has a lot of potential. However, it hasn't come close to showing its true potential in games, simply because it is proprietary and not supported on all hardware. So developers have to create a normal game, then just add a few PhysX elements to it later. Nothing related to gameplay is PhysX-based, because that would ruin the game for people without PhysX.

Now if a developer based the game, and its gameplay elements, on PhysX right from the beginning of development, we would see some pretty amazing stuff. A lot more realistic environments, fully destructible environments. Imagine Counter-Strike, but instead of having to enter a building only through the door or a window, you could also just blow a hole in the wall and walk in, and not just at certain pre-defined spots, but anywhere in the wall you wanted.

Sadly, we will never see this, because it doesn't run natively on ATi hardware. It is clear that nVidia knew this was required for PhysX to really show its potential, and this is why they wanted to get it up and running on ATi hardware. I'm sure at the time ATi definitely didn't want this, since they were in bed with Intel and Havok.
Havok already does this without the overhead Physx brings. Look at the Frostbite engine.
#337
DaedalusHelios
TheMailMan78: Havok already does this without the overhead Physx brings. Look at the Frostbite engine.
PhysX is much more complicated and can be used for a better gaming experience where good physics can shine. I don't think anybody is dumb enough to think PhysX is not as good as Havok. The problem is that PhysX requires Nvidia hardware. That's not as accessible as Havok, which can run on just about anything. They need to develop it to run on ATi hardware and realize that widespread adoption is better than keeping it to themselves. And once it became the physics standard, they could charge low-cost licensing like game engines etc. Nvidia is playing a good hand but not using it right. Probably because of the arrogant CEO they have.
#338
BelligerentBill
mR Yellow: TBH, I've done the PhysX hack and it didn't add much to the gameplay. All I saw was smoke, spiderwebs, and paper effects. Nothing to go wow about.

To date PhysX has just been nothing but a sales gimmick. Portal and HL2 were way better in terms of gameplay.
Atmosphere of a game is a pretty big deal. Batman is a damn fine game, and the additional atmosphere really is a nice bonus IMO. Ever since I demoted my 8800 GTS 512 to a dedicated PhysX PPU, I simply see no reason to go without PhysX... in fact it's much like a drug... it's there and I must have it. No, the feature isn't critical to any game... but I would liken its entertainment value to watching a movie on Blu-ray as opposed to a standard DVD.
#339
Wile E
Power User
TheMailMan78: Havok already does this without the overhead Physx brings. Look at the Frostbite engine.
Havok's current capabilities are a fraction of what Physx is capable of. Now, come back and discuss this when Havok actually releases their GPU-accelerated physics implementation. Until then, Physx has the most potential. It's just that it's currently untapped by developers.
#340
TheMailMan78
Big Member
Wile E: Havok's current capabilities are a fraction of what Physx is capable of. Now, come back and discuss this when Havok actually releases their GPU-accelerated physics implementation. Until then, Physx has the most potential. It's just that it's currently untapped by developers.
I have yet to see Physx do ANYTHING Havok can't. Again, I say: research the Frostbite engine.
#341
Wile E
Power User
TheMailMan78: I have yet to see Physx do ANYTHING Havok can't. Again, I say: research the Frostbite engine.
I did. And I'm telling you, just because you haven't seen it doesn't mean it's not capable. Physx is capable of much, MUCH more than all other current physics implementations. OpenCL and GPU-accelerated Havok may change that, but as it stands, Physx has superior capabilities. No developers have chosen to tap into its full capabilities yet, as they don't want to alienate non-nVidia users. That doesn't make it any less capable.
#342
TheMailMan78
Big Member
Wile E: I did. And I'm telling you, just because you haven't seen it doesn't mean it's not capable. Physx is capable of much, MUCH more than all other current physics implementations. OpenCL and GPU-accelerated Havok may change that, but as it stands, Physx has superior capabilities. No developers have chosen to tap into its full capabilities yet, as they don't want to alienate non-nVidia users. That doesn't make it any less capable.
Proof, man, proof. Show me something Physx can do that Havok can't.
#343
Wile E
Power User
TheMailMan78: Proof, man, proof. Show me something Physx can do that Havok can't.
Why don't you use Google, buddy? Where is your proof that Havok is capable of everything that Physx is capable of?
#344
TheMailMan78
Big Member
Wile E: Why don't you use Google, buddy? Where is your proof that Havok is capable of everything that Physx is capable of?
You're the one making the accusation that Havok isn't on par with Physx. All I said is they were equal, and you said Physx was better. I gave you proof with Frostbite and you offer none.

Again, where is the beef, man?
#345
Wile E
Power User
TheMailMan78: You're the one making the accusation that Havok isn't on par with Physx. All I said is they were equal, and you said Physx was better. I gave you proof with Frostbite and you offer none.

Again, where is the beef, man?
No, you're the one making the accusation that Physx isn't more capable. This thread isn't about Havok. The burden of proof lies on you.

Besides, you have to be a developer to understand the raw data that's out there. I don't have the ability to translate. All the info you need is in the Physx and Havok SDKs. Download them and have a go at it.

Not to mention, we haven't even touched on how much faster GPUs are at crunching physics numbers vs. CPUs. It's just common sense that Physx is capable of more. Even if it can only do the same types of physics, it can still do more of them.
#346
TheMailMan78
Big Member
Wile E: No, you're the one making the accusation that Physx isn't more capable. This thread isn't about Havok. The burden of proof lies on you.

Besides, you have to be a developer to understand the raw data that's out there. I don't have the ability to translate. All the info you need is in the Physx and Havok SDKs. Download them and have a go at it.

Not to mention, we haven't even touched on how much faster GPUs are at crunching physics numbers vs. CPUs. It's just common sense that Physx is capable of more. Even if it can only do the same types of physics, it can still do more of them.
You're assuming it's more capable because it's dedicated. In theory you're right. However, NOTHING in the industry shows that it is. As a matter of fact, everything points to the opposite. If it were that much better, how come Intel went with Havok? Why are most engines using Havok? Just because an SDK is more crowded doesn't make it better.

You say Intel went with Havok because Nvidia owns Physx, but I say it's because Physx is inferior. I also believe it will soon be dead. Say what you will, but my proof is in practice. Yours is in theory.
#347
Wile E
Power User
TheMailMan78: You're assuming it's more capable because it's dedicated. In theory you're right. However, NOTHING in the industry shows that it is. As a matter of fact, everything points to the opposite. If it were that much better, how come Intel went with Havok? Why are most engines using Havok? Just because an SDK is more crowded doesn't make it better.

You say Intel went with Havok because Nvidia owns Physx, but I say it's because Physx is inferior. I also believe it will soon be dead. Say what you will, but my proof is in practice. Yours is in theory.
I didn't say anything about Intel and Havok. But anyway, Intel went with Havok because Physx was already bought out, and they needed something to push with Larrabee. It has nothing to do with technical capabilities.

And more engines use Physx than you think. Physx also has a CPU-based API, just like Havok.

Again, the adoption rate is low because devs don't like to alienate customers. This is nV's fault for sure, for not making GPU Physx run on an open standard, but adoption rates do not in any way prove capabilities. Not to mention, how much longer has Havok been around? That's a pretty piss-poor argument, tbh.

And Physx is not necessarily dead either. With the release of OpenCL, all nVidia has to do is port it from CUDA to OpenCL, and it will be alive and well. Whether they do that or not is a different story. They seem to have pride issues about opening up their APIs for maximum exposure.

At any rate, nothing you have mentioned points to Physx having inferior capabilities. You still haven't proven anything either.
#348
TheMailMan78
Big Member
Wile E: I didn't say anything about Intel and Havok. But anyway, Intel went with Havok because Physx was already bought out, and they needed something to push with Larrabee. It has nothing to do with technical capabilities.

And more engines use Physx than you think. Physx also has a CPU-based API, just like Havok.

Again, the adoption rate is low because devs don't like to alienate customers. This is nV's fault for sure, for not making GPU Physx run on an open standard, but adoption rates do not in any way prove capabilities. Not to mention, how much longer has Havok been around? That's a pretty piss-poor argument, tbh.

And Physx is not necessarily dead either. With the release of OpenCL, all nVidia has to do is port it from CUDA to OpenCL, and it will be alive and well. Whether they do that or not is a different story. They seem to have pride issues about opening up their APIs for maximum exposure.

At any rate, nothing you have mentioned points to Physx having inferior capabilities. You still haven't proven anything either.
You're correct. You didn't say anything about Intel. My mistake. I'm so used to that argument I got you confused. :laugh:

Anyway, I don't feel Physx is inferior in its capabilities. I feel it's inferior due to the way it's executed (Nvidia-only hardware). What I do believe is that it's no better than Havok, and even with GPU acceleration I have yet to see it do something that Havok cannot do and hasn't been proven to do. Does it have more potential in theory? Hell yeah, but I haven't seen a damn thing yet to justify a dedicated GPU other than some slick marketing by Nvidia.

As for adoption rates, just look at Havok vs. Physx since Physx was first released. I think you'll be surprised.
#349
Wile E
Power User
TheMailMan78: You're correct. You didn't say anything about Intel. My mistake. I'm so used to that argument I got you confused. :laugh:

Anyway, I don't feel Physx is inferior in its capabilities. I feel it's inferior due to the way it's executed (Nvidia-only hardware). What I do believe is that it's no better than Havok, and even with GPU acceleration I have yet to see it do something that Havok cannot do and hasn't been proven to do. Does it have more potential in theory? Hell yeah, but I haven't seen a damn thing yet to justify a dedicated GPU other than some slick marketing by Nvidia.

As for adoption rates, just look at Havok vs. Physx since Physx was first released. I think you'll be surprised.
I'm not surprised at all. I already admitted nV is holding GPU Physx back, and by extension CPU Physx. But directly comparing it to Havok is still pointless, because Havok has been around so much longer that it has had more time to penetrate the market and build up its brand recognition.

None of that changes the fact that it's capable of more than any CPU-based physics.
#350
mR Yellow
BelligerentBill: Atmosphere of a game is a pretty big deal. Batman is a damn fine game, and the additional atmosphere really is a nice bonus IMO. Ever since I demoted my 8800 GTS 512 to a dedicated PhysX PPU, I simply see no reason to go without PhysX... in fact it's much like a drug... it's there and I must have it. No, the feature isn't critical to any game... but I would liken its entertainment value to watching a movie on Blu-ray as opposed to a standard DVD.
Valid point, but the difference isn't as huge as between SD and HD.