Tuesday, September 29th 2009
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
Anti-aliasing has long been one of the most basic image-quality enhancements available in games. PC graphics hardware manufacturers regard it as an industry standard, and game developers follow suit by integrating anti-aliasing (AA) into their engines. This lets a game apply AA selectively to parts of the 3D scene, improving overall image quality while preserving performance, since not every object in the scene receives AA. It now appears that one of the most heavily marketed games of the year, Batman: Arkham Asylum, refuses to offer its in-game AA implementation on ATI Radeon graphics cards.
Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that AMD confirmed this with an experiment in which ATI Radeon hardware was run under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He adds that the workaround is not available in the retail game because of its SecuROM copy protection.
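To illustrate the kind of check AMD says it spoofed, here is a minimal sketch (a hypothetical illustration, not the game's actual code) of how a Direct3D 9 application can gate a feature on the GPU vendor ID the driver reports:

    #include <d3d9.h>
    #include <cstdio>

    int main() {
        // Create the D3D9 object and ask who made the default adapter.
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DADAPTER_IDENTIFIER9 id = {};
        d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

        // PCI vendor IDs: 0x10DE is NVIDIA, 0x1002 is ATI/AMD. Spoofing the
        // reported ID, as AMD did with the demo, flips a gate like this one.
        bool offerInGameAA = (id.VendorId == 0x10DE);
        std::printf("VendorId 0x%04lX -> in-game AA option %s\n",
                    (unsigned long)id.VendorId,
                    offerInGameAA ? "shown" : "hidden");

        d3d->Release();
        return 0;
    }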
With no in-game AA available to ATI Radeon users, even though the feature does technically work on ATI Radeon hardware, the only way to get AA is to force it in Catalyst Control Center. This makes the driver apply AA to every 3D object in the scene, reducing performance compared to the game's own selective AA implementation. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
My next purchase will definitely be a 5870. You don't really need more power than that.
BTW AA isn't an added feature...it's a standard.
Fuck you Nvidia, and the horse you rode in on.
The blogger has proven that they're able to get the feature running on AMD hardware, albeit through a rather hackish workaround.
Eidos will probably stay silent on this matter. What's next? Disabling rendering altogether?
"run sh!ttier" implies lower frame rates, and that is only when you force it in CCC not having the selective AA feature which is the very definition of an optimization. Only difference is you see it in the form of a button now so its worse somehow?
I think its stupid they did it but its a difference in IQ and not framerate unless you change the ATi profile to make it swing the other way by forcing AA in CCC.
They can just have an aftermarket patch to swing it the other way. Its no big deal.
I completed AC on 3 computers with different setups, all DX10.1, and it NEVER crashed; then the patch came and it crashed. Lol.
Did you read it?
They changed an ID and it WORKED!
There is always something that seems to go bad with ATI cards whenever it's The Way It's Meant to be Played/Paid.
The Xbox runs AA, so why can't a similar architecture run it? The Xbox's ATI chip sits somewhere on the path between a 2900 XT and a 1950 XTX.
It doesn't work with my 2900 XT or my 1950 XTX... how about that?
(Yes, I tested :D)
And btw, I have a shitvidia card which I could have used, but nvidia doesn't let me.
Nvidia, the way it's meant to fail.
I'm talking about PhysX with an ATI card doing the rendering.
I totally liked nvidia products, until:
Rename.
Rename.
Meant to be Played issues here and there.
PhysX bullshit pushing.
Bashing AMD for no reason.
Bashing again.
Meant to be Played starting to become The Way It's Meant to Bug You.
Till:
The Way It's Meant to Piss You Off Big Time.
PhysX for me was money thrown away; I could sell the card, but it's not worth anything now anyway thanks to the HD 5xxx series.
"no Batman Arkham Asylum for GeForce for you, bta."
evil wallet.
And AA is a feature that has to be added to a game; it isn't just magically in there, at least not the type of optimized, selective AA present in Batman.
FSAA is just a driver switch that any developer can enable. However, it always comes with a drop in framerate. The AA used in Batman has been optimized not only to make the game look better, but to do it at no performance loss, by controlling which objects get AA applied and which don't. This is definitely not a standard feature in games.
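As a rough illustration of what that selectivity means (a hypothetical sketch, not Batman's actual engine code), picture the renderer tagging objects so that only the ones with visible hard edges pay the multisampling cost:

    #include <iostream>
    #include <string>
    #include <vector>

    // Hypothetical scene object; the wantsAA flag stands in for whatever
    // heuristic or artist tagging a real engine would use.
    struct SceneObject {
        std::string name;
        bool wantsAA; // true if multisampling visibly improves this object
    };

    int main() {
        std::vector<SceneObject> scene = {
            {"batman_model", true},  // hard, high-contrast edges
            {"steam_volume", false}, // soft alpha-blended effect
            {"railing",      true},
            {"skybox",       false},
        };

        // Only tagged objects would be drawn into the multisampled target;
        // everything else skips the AA cost entirely.
        for (const auto &obj : scene)
            std::cout << obj.name << " -> "
                      << (obj.wantsAA ? "MSAA pass" : "plain pass") << "\n";
        return 0;
    }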
If ATi users want AA, they can enable it in CCC; there are plenty of other games that don't have in-game AA and require this too, and you take an FPS hit in those games as well. It works in the demo, but how do we know it doesn't cause a problem further along in the game, as I've already pointed out? They haven't tested more than 15 minutes of gameplay, and we're all just assuming it works through the entire game.
How many times have we played a game that worked fine through 3-4 hours of gameplay, then suddenly crashed at the exact same spot no matter what we did? I know it's happened to me several times over the many years I've been playing, in games released as recently as a few months ago. It's actually pretty common in newly released games, before the drivers have been fixed. The solution is often to disable some visual feature (because the drivers don't like it), or to wait for better drivers.
We don't know that this isn't the case here. Instead, some are jumping to the conclusion that because the game has an nVidia stamp on it, nVidia must have disabled the feature for ATi. We don't know that. And frankly, for a news reporter to even suggest it without a shred of proof removes whatever credibility that reporter has.
Then there is the PhysX issue, where the developer did a rather shitty job too. Advanced PhysX effects only run on NVIDIA GPUs; there is no in-game option to run them on the CPU. And some of the effects, like cloth and dynamic fog, were simply removed from the non-PhysX version. Apparently it was too much work to replace them with at least semi-static equivalents...
Bottom line: this game, with help from NVIDIA, was intentionally neutered when run on non-NVIDIA hardware.
DirectX 10.1 support was removed because:
1. Nvidia cards don't support it.
2. HD 3000 series cards were 20% faster than their respective GeForce 9 series counterparts with DirectX 10.1 (and AA enabled). The quoted part makes Ubisoft's reasoning pointless: how can making a process take one less step to finish be "costly"? That ultimately feeds fuel to the fire that there really is a different reason.
techreport.com/discussions.x/14707
www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1
www.tgdaily.com/content/view/37326/98/
www.fudzilla.com/index.php?option=com_content&task=view&id=7355&Itemid=1
Also, Batman uses the Unreal Engine 3, which doesn't natively support AA (at least not MSAA), so some tweaking needs to be done to get it working properly. If NVIDIA paid for those optimizations and for getting AA working on this title, then they should benefit from it. (TWIMTBP isn't just a stamp; they actually send people out to sit with developers and optimize the game together. No money is paid to the developer to lower performance on competitor hardware!)
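For a sense of the "tweaking" involved (an assumed example, not UE3's actual code path), a Direct3D 9 engine at minimum has to verify that the device supports a given MSAA mode before creating multisampled surfaces:

    #include <d3d9.h>
    #include <cstdio>

    int main() {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        // Ask whether the default adapter supports 4x MSAA on a common
        // back-buffer format in fullscreen mode.
        DWORD quality = 0;
        HRESULT hr = d3d->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
            FALSE /* windowed */, D3DMULTISAMPLE_4_SAMPLES, &quality);

        std::printf("4x MSAA %s (quality levels: %lu)\n",
                    SUCCEEDED(hr) ? "supported" : "not supported",
                    (unsigned long)quality);

        d3d->Release();
        return 0;
    }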
ATI used to have a GITG (Get in the Game) campaign which vanished into thin air, despite the company having called NVIDIA's campaign nothing more than a marketing gimmick. :rolleyes:
I'm not sure of the AA implementation in UE3 games on the Xbox, but chances are it's SSAA, which is exactly what we can get on our ATI graphics cards. And whether it's SSAA or MSAA, on a console you can tweak performance right down to the per-cycle level. Don't compare a closed system with a PC.
So before we say there's a conspiracy, let's calm down and think about it a little.
EVIDENCE 2: That API, however, is only supported by ATi cards.
And the API was removed from the game. So the game developers don't want increased performance in their games? :confused:
And if the increased performance comes at the cost of stability, no, they probably don't. Especially when they have to handle all the calls from people complaining about the game crashing all the time. I bet if they gave all those people your phone number, you'd want DX10.1 removed too.
And since I forgot to add it in the previous post: DX10.1 is "costly" because it requires extra development time to implement in the game code. It is not costly to render, which is what the quote you posted talks about; but that is not the "costly" we mean when we say it is costly to implement. DX10 has to be implemented either way; DX10.1 only adds to development costs.
If Nvidia pays for development, they could make sure ATi cards can't play the game at all. The developer is a company; it's not required to make a game run any particular way. If it doesn't play to your liking, don't buy it.
There is no "international video game creation bill of rights". A company can make a game play however they want as long as it doesn't cause harm to the person playing it or his/her property. Thats reality. If a game developer doesn't support your hardware to your liking don't buy the game. :) If you are serious it makes this look no longer intentional by the developer. Got a link. trollin? :)
The removal of DX10.1 support was done THROUGH a patch. So how did it get into the initial version in the first place if it was so costly? Why include it at all?
Stability issues were almost always on Nvidia cards, though (pre-patch). And a poster here talking about AC on an ATi card said his game ran perfectly before the patch, but crashed after it. Selective stability, then?
Use the ATI Control Panel. And why doesn't in-game AA work, if ATI themselves made it work by changing the ID?
bildr.no/view/497400
The biggest problem is that ATI's drivers have always been poor; if they weren't, ATI would be on par with nvidia or even ahead by now.
ATI GPUs have tremendous computing power, but they are too lazy to develop drivers able to exploit it.
I'd say stop the childish fanboyism, the "i hate nvidia" acting, etc. ...
The stability issues with nVidia cards were mostly due to PhysX. I'm sure there were plenty of stability issues with ATi cards too, but they were drastically overshadowed by the PhysX issues. It might have come down to a decision of which features to fix and which to just give up on. Sometimes that's what has to be done in the business world.
The patch definitely made the game more stable on both sides, but no game is ever going to be perfect. There will always be crashes on certain configurations.
And no, it's not about game stability; it's just one fucking greedy developer. Look at the news: they just changed the device ID and voilà, AA worked flawlessly (and crushed nvidia's performance). It's just like:
if (deviceID == VENDOR_ATI) {
    AA = disabled;
}
They should mention on the box that it's "for Nvidia cards only," so ATI card owners won't get pissed, you know.