Tuesday, September 29th 2009
![ATI Radeon Graphics](https://tpucdn.com/images/news/ati-v1739475473466.png)
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
Anti-aliasing has long been one of the most basic image-quality enhancements available in games. PC graphics hardware manufacturers regard it as something of an industry standard, and game developers echo them by integrating anti-aliasing (AA) into their games as part of the engine. This lets the game apply AA selectively to parts of the 3D scene, so overall image quality improves while performance is preserved, since not every object in the scene is given AA. It turns out that one of the most heavily marketed games of the year, Batman: Arkham Asylum, doesn't play nicely with ATI Radeon graphics cards when it comes to its in-game AA implementation.
Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that AMD confirmed this with an experiment in which they ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that this workaround is not available in the retail game because of its SecuROM copy protection.
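For illustration only, here is a minimal sketch of the kind of vendor-ID check being described: a launcher can query the primary graphics adapter through DXGI and gate a menu option on whether the reported PCI vendor ID belongs to NVIDIA or AMD/ATI. The function names and the gating logic below are hypothetical and are not taken from the game's actual code.

```cpp
// Hypothetical illustration of vendor-ID based feature gating (not the game's actual code).
// Build with: cl /EHsc detect.cpp dxgi.lib
#include <dxgi.h>
#include <cstdio>

// PCI vendor IDs reported in DXGI_ADAPTER_DESC::VendorId
constexpr UINT VENDOR_NVIDIA  = 0x10DE;
constexpr UINT VENDOR_AMD_ATI = 0x1002;

// Returns true if the primary adapter reports NVIDIA's vendor ID.
bool PrimaryAdapterIsNvidia()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return false;

    IDXGIAdapter* adapter = nullptr;
    bool isNvidia = false;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter)))
    {
        DXGI_ADAPTER_DESC desc = {};
        if (SUCCEEDED(adapter->GetDesc(&desc)))
            isNvidia = (desc.VendorId == VENDOR_NVIDIA);
        adapter->Release();
    }
    factory->Release();
    return isNvidia;
}

int main()
{
    // A launcher doing what AMD describes would simply hide or grey out the
    // in-game AA option when this check fails; spoofing the device/vendor ID
    // makes the same hardware pass the check, which matches AMD's experiment.
    bool showInGameAA = PrimaryAdapterIsNvidia();
    std::printf("In-game AA option %s\n", showInGameAA ? "shown" : "hidden");
    return 0;
}
```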
With no in-game AA available to ATI Radeon users, even though the feature technically works on ATI Radeon hardware, the only way to use AA is to force it in Catalyst Control Center. This makes the driver apply AA to every 3D object in the scene, reducing performance compared to using the game's own in-game AA path. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
if the setting was there *for all GPUs* it would be fair, like every other game title - it may run worse on AMD, and be useless on Intel IGPs, but at least then they could release new drivers to fix the performance/compatibility bugs. nvidia would have a head start on a fair playing field
how many people here would mind if Crysis 2 didn't allow AA on nvidia cards, and you had to force it via your nvidia control panel, taking a 20% FPS hit over ATI using the built-in options?
en.wikipedia.org/wiki/Deferred_shading
also, UE3 did not support AA in DX9 until now.
ATI could do it from the get-go (remember Oblivion, with HDR+AA?) and nvidia couldn't.
nvidia took the time to make it work (and it does), and blocked ATI from joining the party, when they could do it all along.
ATI followed the DX9/10 specs better than nvidia did, and NV forced games to disable these options so that ATI wouldn't have a feature or performance advantage.
it's an age-old game they play.
ATi had pixel shader 1.4 out when nvidia were stuck on 1.3 (so games stayed on 1.3 longer)
NV released the GF4 MX series without shaders at all (moving BACK two generations in hardware), which helped NV catch up at PS2.0, bypassing their lack of PS 1.4
FX series only did 16- and 32-bit precision for floating point in PS2.0 when the standard called for 24-bit (meaning they were slower, or lower quality)
6 series cards were fine, but had slower SM3.0 performance for the most part (it was fairly even there)
7 series: nothing in my memory, both sides had a fair go
8 series and up: NV didn't support many features of DX10 properly, so they got them removed from the specs and pushed into DX10.1
nvidia's hardware has often been behind ATI in the features game, and they use their influence (such as TWIMTBP) to get those features removed from games, lest ATI have an advantage. That's all this is: ATI can do AA on top of all these fancy features, and nvidia can't (which is part of the DX10.1 spec if you look).
nvidia got AA in B:AA to work on their cards through hard work, and blocked ATI from allowing the same (even though it works fine, as proven by forcing it in the CCC - try forcing AA onto an Unreal Engine game with nvidia and you'll find it doesn't work)
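As a rough sketch of the DX10.1 point above (an assumption for illustration, not code from the game or UE3): a renderer that wants 10.1-only capabilities, such as reading individual MSAA samples in the shader, typically tries to create a feature level 10.1 device first and falls back to 10.0 if the hardware or driver can't do it.

```cpp
// Sketch: probe for Direct3D 10.1 support and fall back to 10.0.
// Illustrative only; not taken from Batman: Arkham Asylum or Unreal Engine 3.
// Build with: cl /EHsc probe.cpp d3d10_1.lib
#include <d3d10_1.h>
#include <cstdio>

int main()
{
    ID3D10Device1* device = nullptr;

    // Try the 10.1 feature level first (per-sample MSAA access, etc.).
    HRESULT hr = D3D10CreateDevice1(
        nullptr,                        // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,
        nullptr,                        // no software rasterizer module
        0,                              // no creation flags
        D3D10_FEATURE_LEVEL_10_1,
        D3D10_1_SDK_VERSION,
        &device);

    if (FAILED(hr))
    {
        // Hardware or driver can't do 10.1: fall back to plain 10.0.
        hr = D3D10CreateDevice1(
            nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr, 0,
            D3D10_FEATURE_LEVEL_10_0, D3D10_1_SDK_VERSION, &device);
    }

    if (SUCCEEDED(hr) && device)
    {
        std::printf("Created device at feature level 0x%x\n", device->GetFeatureLevel());
        device->Release();
    }
    else
    {
        std::printf("No D3D10-class hardware device available\n");
    }
    return 0;
}
```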
The point is that not one of you, nor I, knows what happened. And while I find it fun to watch everyone so ravenously chant bloody murder on the basis of nothing more than text, it's getting old.
You're forgetting one thing. Why this one game? Why disable AA on JUST this game but none of the others? Were you guys just looking for a reason to burn people at the stake? I'm sure everyone remembers the Far Cry 2 scandal that... oh wait, it wasn't one. It was an error in ATI's driver code. But that was a witch hunt too, wasn't it?
Edit:
Official response from NVIDIA:
“A representative of AMD recently claimed that NVIDIA interfered with anti-aliasing (AA) support for Batman: Arkham Asylum on AMD cards. They also claimed that NVIDIA’s The Way It’s Meant to be Played Program prevents AMD from working with developers for those games.
Both of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.
Games in The Way It’s Meant to be Played are not exclusive to NVIDIA. AMD can also contact developers and work with them.
We are proud of the work we do in The Way It’s Meant to be Played. We work hard to deliver kickass, game-changing features in PC games like PhysX, AA, and 3D Vision for games like Batman. If AMD wants to deliver innovation for PC games then we encourage them to roll up their sleeves and do the same.”
www.hardwarecanucks.com/news/games-news/nvidia-bites-refutes-claims-meddling-batman-arkham-asylum-aa-features/
So they say that the game engine does not natively support it. I doubt they'd get something like that wrong in a press release. So the question is: who do I (/we) believe?
A lot of arguments, facts, and even proofs from supporters on both sides...
I like to read between the lines, because the truth is in there somewhere...
So if this game was made with nvidia money, they have the right to ask the developer to make subtle changes so that ATI cards can't use a specific feature; whoever has the money and pays makes the rules in any field.
We also shouldn't forget that nvidia bought Ageia and now sees it wasn't as good an investment as expected, so while they can they'll try to make money from it to compensate for that bad investment. Neither PhysX nor CUDA has the popularity they hoped for, whatever they do, and they'll lose a lot of money when other, similar technologies gain more market share. A good analogy is Blu-ray and HD DVD: the market's shift to the first was deadly for the other, with huge losses for the losing side.
It's a known fact that both GPU makers invest money in games for obvious reasons: they have to sell cards! Both of them have optimized games and favorite engines, and they can ask the developer to write whatever they consider necessary into the game's software to create an advantage over the competitor.
We don't need to be angry or upset over findings like this, because the community has always found solutions to fix these problems for the benefit of all users - ATI and nvidia alike.
From my point of view, I consider this thread closed...
I guess the witch hunt can end with that info.
Nonsense. If features were disabled for any reason other than technical, there would already be talks of ATI suing NV. It's either a technical limitation, or a bug.
EDIT: In fact, I would go one step further, and say it's ATI's fault for not getting involved with the game's development like NV does. They have every opportunity to go to the developer and help optimize for their platform.
All the TWIMTBP bashing is such BS anyway, always based on previous cases that weren't true to begin with. This much is true: just look at what happened with Assassin's Creed. Like I always said: if Nvidia didn't want DX10.1 in "their" game, why was it included in the first place? They work closely with the developer, they test a lot, they can test on Ati hardware... No Ati fanboi whiner has ever answered this simple question. The truth is that Nvidia never interfered, not in this game, not in AC, not ever, but the ball just continues rolling and rolling. The next time something happens, BM:AA will be mentioned, even if it's not true, just like what happened in the past. :shadedshu I share that opinion and I already said as much in more than one post, but it's lost in the labyrinth of posts this thread has become. :)
The one part that the people complaining are not understanding is this.
Quote from Nvidia's answer
"Both of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing."
I read like 5 pages then got bored of reading the same things over and over.
My suggestion: those that don't like what the game developers/nvidia have done - just don't buy the game and at the same time write an e-mail to both the game developers and nvidia telling them the reasons why. When the game devs get a hundred or a thousand e-mails, they'll quickly fix the problem.
Whining about it on here is going to get nothing done.
Cheers have a nice day.
If people want to rage about Nvidia, just complain about the pricing. It won't take lies or misconceptions to do so. It's just plain facts.
I got ripped off buying a 7950GX2 back in the day. It scaled like crap and drivers took forever to make it scale decently. The 9800GX2 and GTX 295 were another story though (still overpriced). :)
Strange Indonesian kids are raiding TPU! oh noes! :roll:
*EDIT* btarunr just cleaned it up with an edit. Thank you sir.