Tuesday, September 29th 2009
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
Anti-aliasing has long been one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as an industry standard, and game developers follow suit by integrating anti-aliasing (AA) features into their engines. This allows a game to apply AA selectively to parts of the 3D scene, improving overall image quality while preserving performance, since not every object in the scene receives AA. Yet one of the most heavily marketed games of the year, Batman: Arkham Asylum, refuses to offer its in-game AA implementation on ATI Radeon graphics cards.
The game was developed under NVIDIA's The Way It's Meant to be Played program and features NVIDIA's PhysX technology, and its launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that AMD confirmed this with an experiment in which they ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the option cannot be unlocked the same way in the retail game because of its SecuROM copy protection.
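As a toy illustration (not the game's actual code), a launcher can gate a feature on the GPU's PCI vendor ID, which is exactly the kind of check that spoofing the ID defeats. The vendor IDs below are real PCI-SIG assignments; the function and its use are hypothetical:

```python
# Hypothetical sketch of vendor-ID-based feature gating, and why AMD's
# device-ID spoofing experiment defeated it. Only the vendor IDs are real.
VENDOR_NVIDIA = 0x10DE
VENDOR_AMD_ATI = 0x1002

def ingame_aa_available(vendor_id: int) -> bool:
    """Expose the in-game AA option only when an NVIDIA GPU is reported."""
    return vendor_id == VENDOR_NVIDIA

# A Radeon reporting its real vendor ID: the AA option is hidden.
assert not ingame_aa_available(VENDOR_AMD_ATI)

# The same card reporting a spoofed ID, as in AMD's demo experiment:
# the option appears, and the hardware renders with AA just fine.
assert ingame_aa_available(VENDOR_NVIDIA)
```

The point of the sketch is that the check keys off what the card *reports*, not what it can *do*, which is why changing the device ID was enough to enable the option.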
With no in-game AA option available to ATI Radeon users, even though the feature technically works on ATI Radeon hardware, the only way to get AA is to force it in Catalyst Control Center. This causes the driver to apply AA to every 3D object in the scene, reducing performance compared to the game's selective in-game AA. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
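A toy cost model makes the performance argument concrete: driver-forced AA multisamples everything, while an engine's selective AA can skip objects (particles, UI overlays) that don't benefit. All numbers and object names here are invented for illustration:

```python
# Toy model contrasting driver-forced AA (applied to every object) with
# selective in-game AA (applied only where the engine flags it).
# Pixel counts and the 4x cost factor for 4x MSAA are illustrative only.
scene = [
    {"name": "geometry",  "pixels": 1_500_000, "wants_aa": True},
    {"name": "particles", "pixels":   600_000, "wants_aa": False},
    {"name": "ui",        "pixels":   200_000, "wants_aa": False},
]

def aa_cost(objects, force_all):
    """Rough shading cost: 4x MSAA quadruples per-pixel work where applied."""
    return sum(o["pixels"] * (4 if (force_all or o["wants_aa"]) else 1)
               for o in objects)

forced = aa_cost(scene, force_all=True)      # Catalyst Control Center path
selective = aa_cost(scene, force_all=False)  # in-game AA path
assert forced > selective
```

Under these made-up numbers the forced path does roughly a third more shading work, which is the kind of gap that makes driver-forced AA an unfair benchmarking comparison.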
353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
Sounds like one of the reasons I said in the beginning of this thread... Hey, another reason I said in the beginning.
Seems like the simplest solution is most likely to be correct...
You know, a good reporter would put up a retraction correcting his misinformation... of course, real reporters do research to make sure their story is straight before reporting it and wrongfully bashing whoever they believe to be at fault... Definitely, but I'm sure they will; it just takes time. We are only a week out from the launch of the HD 5870, so I expect a price-cut announcement on at least the GTX 285 and GTX 295 very soon. (The others still fit well in the performance-per-dollar graph, thanks to the fact that they have already had competition from ATi.)
I will not do that again, I'm really sorry.
NB: if you come to Indonesia, just call me. I will be your guide and show you how beautiful Indonesia is. And by the way, I'm 20 now.
Those who feel offended by NV's and the developer's antics know what not to buy, and those who don't can keep supporting the division of gamers.
www.tomshardware.com/news/nvidia-physx-ati,5764.html
Nvidia even gave him a lot of support.
www.tomshardware.com/news/nvidia-ati-physx,5841.html
Quote: "In the end, if Badit could get PhysX to run on Radeon cards, the PhysX reach would be extended dramatically and Nvidia would not be exposed to any fishy business claims - since a third party developer is leading the effort."
In the end AMD didn't allow that to happen, and lied about the reasons behind that decision, because they had a deal with Intel's Havok, which only runs on the CPU. Since Intel didn't want GPU acceleration at all, PhysX could not happen; at least fully supported PhysX couldn't happen.
EDIT: And yeah, I know they are slowly porting Havok to run on GPUs too, but that came more than a year after all that happened, because PhysX got some support after all despite their efforts to block it, and because by the time they finish porting it Intel will have Larrabee out. The thing about GPU Havok is so fishy that the demo of Havok running on AMD's HD 5xxx used AMD's proprietary Stream API, but the final product is going to be OpenCL...
NOW, if AA was offered no matter what GPU you had but simply ran better on Nvidia, then I would accept fair play within the TWIMTBP program. However, Nvidia cheated ATI users out of something their cards are VERY capable of doing natively. After all, we are talking about AA, not PhysX.
Nvidia just had a Tonya Harding moment.
Seriously read the other posts and edit if necessary. ;)
- The claim was made that nVidia paid to have AA disabled for ATi hardware.
- The claim was made that AA works.
- The claim was made that there was no reason to disable the feature for ATi hardware, other than nVidia paying to have it disabled.
- Some arguing.
- The claim was made that AA was a feature that nVidia funded the addition of.
- The claim was also made that, perhaps the feature was disabled on ATi hardware due to it breaking the game.
- Some arguing.
- The claim was made that AA is a standard feature in the Unreal 3.5 Engine.
- The claim was made that ATi proved it doesn't break the game, because if it works in the demo, it will work in the entire game.
- Some arguing.
- It was revealed that AA is not a standard feature in the Unreal 3.5 Engine, and nVidia did in fact fund its addition to the game. (Source)
- It was revealed that changing the device ID to allow AA to be enabled in-game actually breaks the game on ATi hardware. (Source)
- It was revealed that, even with the setting enabled, ATi hardware didn't actually do AA, because the feature was not designed for ATi hardware. (Source)
I think that about covers it. The discussion should be pretty much over with that. There is no wrongdoing on nVidia's part. They paid for the development and inclusion of AA in Batman, so it is only fair that only their hardware gets the benefit. ATi was more than welcome to do the same, but they didn't; it is their loss, and more importantly the loss of their customers. And unlike the original reports by ATi, the feature doesn't actually work on ATi hardware. The setting can be enabled in the demo and full game, but it doesn't actually do anything, and it breaks the full version of the game.
Perhaps if the two of them would work a little bit more together, we could see extras like this added to all games that work on both. Though we don't want them working so closely together we get another price fixing situation...:laugh:
ATI hacked a demo.
The developer did not cripple ATI because Nvidia paid them to do it. Seriously people... this isn't the US Government. Somebody needs to get facts and settle this BS because I've seen nothing but hearsay from ATI.
At the end of the day, I have an Nvidia card :roll:
It will be fun to see a game with nvidia (tm) AA, nvidia (tm) physx, ati (tm) tessellation, s3 (tm) AF, ati (tm) hdr, etc...
Pathetic.
Anyway, BioShock and Mass Effect had AA through the control panel (on both manufacturers' hardware).
THIS is what you guys bring to the table as facts?! Nvidia is OK, but a quack from a forum?! Come on guys, I thought you had better rebuttals than that. :shadedshu
And you have to kind of read the whole thing from the "nut job". That "nut job" is the one that originally claimed AA was disabled for ATi, and originally claimed it worked in the demo. Posting screenshots to prove it.
The other forum members later went on to disprove the fact that AA was even working. And the "nut job" himself confirmed that it broke the game (even if he didn't want to admit it at first).
S.T.A.L.K.E.R. was originally TWIMTBP, but IIRC the game runs better on Ati hardware... maybe it ran better at launch with nV, but Ati drivers improved from there. (confirmation needed)
But afaik, now they display Ati Radeon logo at startup... in CS and COP.
Now we can typically pick up a $100-200 card that will grant us all the performance we need. I like it better now.
The argument was interesting to watch. As a former ATi fanboy I have to admit I jumped to conclusions, but getting older, I didn't want to post without evidence. I'm glad I didn't, and I'm glad the truth came to light.