Tuesday, September 29th 2009

Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Anti-aliasing (AA) is one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as something of an industry standard, and game developers echo that by integrating AA features into their engines. This allows a game to apply AA selectively to parts of the 3D scene, so the overall image quality improves without a matching performance hit, since not every object in the scene receives AA. Yet one of the most heavily marketed games of the year, Batman: Arkham Asylum, refuses to play nice with ATI Radeon graphics cards when it comes to its in-game AA implementation.

Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton, in a recent blog post, said the company confirmed this with an experiment in which it ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the workaround is not available for the retail game, which is protected by SecuROM.

With no in-game AA available to ATI Radeon users, even though the feature technically works on ATI Radeon hardware, the only way to get AA is to force it in Catalyst Control Center. This causes the driver to apply AA to every 3D object in the scene, reducing performance compared to using the game's in-game AA implementation. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
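McNaughton's device-ID experiment implies the launcher simply keys the AA option off the reported GPU vendor. Here is a minimal Python sketch of that kind of check; it is hypothetical (the game's actual detection code is not public), though 0x10DE and 0x1002 are the real PCI vendor IDs for NVIDIA and ATI/AMD:

```python
# Hypothetical sketch of a vendor-ID gate; not the game's actual code.
NVIDIA_VENDOR_ID = 0x10DE  # real PCI vendor ID for NVIDIA
ATI_VENDOR_ID = 0x1002     # real PCI vendor ID for ATI/AMD

def ingame_aa_available(vendor_id: int) -> bool:
    """Expose the in-game AA option only on NVIDIA hardware."""
    return vendor_id == NVIDIA_VENDOR_ID

# A stock Radeon reports 0x1002, so the option is hidden...
print(ingame_aa_available(ATI_VENDOR_ID))     # False
# ...but spoof the reported ID, as AMD did with the demo, and it returns.
print(ingame_aa_available(NVIDIA_VENDOR_ID))  # True
```

A check like this tests only the reported ID, not actual capability, which is why the feature ran fine once the ID was changed.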

353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

#51
mR Yellow
entropy13Nvidia pressured Ubisoft to remove DirectX 10.1 support.
That's why I don't like nVidia and their anti-competitive behavior and shady morals.
My next purchase will definitely be a 5870. You don't really need more power than that.
Posted on Reply
#52
mR Yellow
newtekie1Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting them to nVidia hardware also.

For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?

It might not be a case of nVidia or the game developers removing a feature, but instead a case of nVidia paying to have the feature added in the first place.

These could have been performance optimizations that nVidia entirely paid for, the whole purpose of the TWIMTBP program, so why should they then enable them for ATi?

Or a completely different reasoning:

It could be that having it enabled with ATi cards causes problems in the retail game(remember they only tested this on the demo). For all we know, something with the way ATi cards handles AA causes the game to crash or be extremely buggy with the optimized AA enabled. Maybe a certain part of the game is completely unplayable on ATi cards with the feature enabled, so the developers(nothing to do with nVidia at all) just gave up trying to fix it, and simply disabled the feature on ATi cards as a quick fix to get the game shipped. They now have more time to work on a patch to make it work. It wouldn't be the first time we've seen games have problems with one manufacturer, but not the other, due to certain visual elements conflicting with the current drivers.

Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.



How do you know this?
This is common knowledge. Been discussed before.

BTW AA isn't an added feature...it's a standard.
Posted on Reply
#53
Disparia
newtekie, I always appreciate your sound, logical postings and thank them on occasion. But trying to bridge logic to consumers who will use emotions, POV ethics, etc., is a bit futile :D
Posted on Reply
#54
Steevo
I now won't buy the game, or another Nvidia card for any PC I build, and will show this for the shit that it is, just like the IQ tricks Nvidia tried forcing on users years ago to keep up.

Fuck you Nvidia, and the horse you ride on.
Posted on Reply
#55
tkpenalty
newtekie1Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting them to nVidia hardware also.

For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?

It might not be a case of nVidia or the game developers removing a feature, but instead a case of nVidia paying to have the feature added in the first place.

These could have been performance optimizations that nVidia entirely paid for, the whole purpose of the TWIMTBP program, so why should they then enable them for ATi?

Or a completely different reasoning:

It could be that having it enabled with ATi cards causes problems in the retail game(remember they only tested this on the demo). For all we know, something with the way ATi cards handles AA causes the game to crash or be extremely buggy with the optimized AA enabled. Maybe a certain part of the game is completely unplayable on ATi cards with the feature enabled, so the developers(nothing to do with nVidia at all) just gave up trying to fix it, and simply disabled the feature on ATi cards as a quick fix to get the game shipped. They now have more time to work on a patch to make it work. It wouldn't be the first time we've seen games have problems with one manufacturer, but not the other, due to certain visual elements conflicting with the current drivers.

Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.
You do realise that this game has been ported from the Xbox 360, which, mind you, runs an R500-based GPU: ATI/AMD's design. Now, in this case, what is preventing AA from working in the first place is SecuROM, a questionable use of something that is MEANT to be used for anti-piracy reasons, not for anti-competitive market practices. Yes, I love that phrase, anti-competitive market practices. And I love using it against you, since you always argue against such matters.

The blogger has proven that they're able to get the feature to run on AMD's hardware with a rather simple workaround.

Eidos will probably stay silent on this matter. What's next? Disabling rendering altogether?
Posted on Reply
#56
DaedalusHelios
mdm-adph"Removing a feature," especially when it comes to making the game look better through AA, is pretty much the same as "making it run shittier," since I care about IQ.

I don't believe for a second that there was something about this game that didn't allow it to run AA just FINE on ATI hardware, especially considering (like one poster pointed out) it's an Xbox port. :shadedshu
It's not an Xbox port; someone else has pointed this out already. Development was separate, to incorporate GPU PhysX.

"Run sh!ttier" implies lower frame rates, and that only happens when you force AA in CCC. Not having the selective AA feature, which is the very definition of an optimization, just means you don't see it in the form of a button now; so it's worse somehow?

I think it's stupid that they did it, but it's a difference in IQ and not framerate, unless you swing it the other way by forcing AA in CCC on the ATi profile.

They can just swing it the other way with an aftermarket patch. It's no big deal.
Posted on Reply
#57
Imsochobo
newtekie1Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting them to nVidia hardware also.

For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?

It might not be a case of nVidia or the game developers removing a feature, but instead a case of nVidia paying to have the feature added in the first place.

These could have been performance optimizations that nVidia entirely paid for, the whole purpose of the TWIMTBP program, so why should they then enable them for ATi?

Or a completely different reasoning:

It could be that having it enabled with ATi cards causes problems in the retail game(remember they only tested this on the demo). For all we know, something with the way ATi cards handles AA causes the game to crash or be extremely buggy with the optimized AA enabled. Maybe a certain part of the game is completely unplayable on ATi cards with the feature enabled, so the developers(nothing to do with nVidia at all) just gave up trying to fix it, and simply disabled the feature on ATi cards as a quick fix to get the game shipped. They now have more time to work on a patch to make it work. It wouldn't be the first time we've seen games have problems with one manufacturer, but not the other, due to certain visual elements conflicting with the current drivers.

Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.



How do you know this?
Nvidia fanboy at best?

I completed AC on 3 computers using different setups, all DX10.1, and it NEVER crashed; then the patch came, and it crashed. Lol.

Did you read it?
They changed an ID and it WORKED!
There is always something that seems to be bad with ATI cards, as long as it's The Way It's Meant to be Played/Paid.

The Xbox runs AA, so why can't a similar architecture run it? The Xbox's ATI chip sits somewhere between the 1950 XTX and the 2900 XT.
It doesn't work with my 2900 XT or a 1950 XTX... how about that?
(Yes, I tested :D)

And btw, I have a shitvidia card, which I could have used, but nvidia doesn't let me.
Nvidia, the way it's meant to fail.
I'm talking about PhysX with ATI doing the rendering.
I totally liked nvidia products, until:

Rename.
Rename.
Meant to be Played issues here and there.
PhysX bullshit pushing.
Bashing at AMD for no reason.
Bashing again.
The Way It's Meant to be Played starting to become The Way It's Meant to Bug You.
Till:
The Way It's Meant to Piss You Off Big Time.

PhysX for me was money well thrown away; I can sell the card, but it's not worth anything now anyway due to the HD 5xxx.
Posted on Reply
#58
AphexDreamer
Well, nothing more needs to be said by me to elaborate on this BS brought on by Nshitia again. I just want them to pull this shit with the PS3's RSX so I can benefit from it at least one time :laugh:.
Posted on Reply
#59
btarunr
Editor & Senior Moderator
ATI GPUs can handle that game's in-game AA, this comes from AMD. The game just disables the feature when it sees an AMD GPU. This is total blasphemy. I'm not going to / can't tell you what you should choose with your wallets, but I'll tell you what my wallet says.

"no Batman Arkham Asylum for GeForce for you, bta."

evil wallet.
Posted on Reply
#60
newtekie1
Semi-Retired Folder
mR YellowThis is common knowledge. Been discussed before.

BTW AA isn't an added feature...it's a standard.
It's been discussed, but no one has ever shown any proof that nVidia was really behind Ubisoft removing it. Not a single shred. Plenty of claims, but claims don't equate to proof.

And AA is a feature that has to be added to a game; it isn't just magically in there, at least not the type of optimized, adaptive AA that is present in Batman.

FSAA is just a driver switch that any developer can enable. However, it always comes with a drop in framerate. The AA used in Batman has been optimized to not only make the game look better, but to do it at no performance loss, by optimizing which objects get AA applied and which don't. This is definitely not a standard feature in games.

If ATi users want AA, enable it in CCC; there are plenty of other games that don't have in-game AA and require this as well. You will get an FPS hit, just like in those other games.
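The trade-off described above can be shown with a toy model: driver-forced AA pays the cost on every object, while an engine-side selective scheme pays only where it helps. All object names and numbers below are invented for illustration:

```python
# Toy model of full-scene vs. selective AA cost; numbers are made up.
scene = [
    {"name": "batman",     "aa_benefit": True,  "aa_cost": 3.0},
    {"name": "skybox",     "aa_benefit": False, "aa_cost": 2.0},
    {"name": "fog_volume", "aa_benefit": False, "aa_cost": 4.0},
    {"name": "railing",    "aa_benefit": True,  "aa_cost": 1.5},
]

def aa_cost(scene, selective):
    """Total extra frame cost: every object, or only those flagged to benefit."""
    return sum(obj["aa_cost"] for obj in scene
               if not selective or obj["aa_benefit"])

print(aa_cost(scene, selective=False))  # 10.5 (forced in the driver / CCC)
print(aa_cost(scene, selective=True))   # 4.5  (engine-side selective AA)
```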
btarunrATI GPUs can handle that game's in-game AA, this comes from AMD. The game just disables the feature when it sees an AMD GPU. This is total blasphemy. I'm not going to / can't tell you what you should choose with your wallets, but I'll tell you what my wallet says.

"no Batman Arkham Asylum for GeForce for you, bta."

evil wallet.
In the demo, yes, but how do we know it doesn't cause a problem further along in the game, as I've already pointed out? They haven't tested more than 15 minutes of gameplay, and yet we all assume it works through the entire game.

How many times have we played a game that worked fine through 3-4 hours of gameplay, then suddenly crashed at the exact same spot no matter what we did? I know it has happened to me several times in the many years I've been playing, in games released as recently as a few months ago. It is actually pretty common in newly released games, as the drivers haven't been fixed yet. The solution is often to disable some visual feature (because the drivers don't like it), or to wait for better drivers.

We don't know that this isn't the case here. Instead, some are jumping to the conclusion that because the game has an nVidia stamp on it, nVidia disabled the feature for ATi. We don't know that. And frankly, for a news reporter to even suggest it without a shred of proof completely removes all credibility that news reporter has.
Posted on Reply
#61
Bull Dog
DaedalusHeliosYou help to pay for a games creation and you want something in return? Thats crazy, Nvidia must have used magic. :laugh:

There are games optimized to do well with Nvidia drivers and vice versa. Its nothing new, its why they throw their hat in the ring to help out in development and funding. Its what they get in return. Not like its the best game of the year. It probably sucks. But its sales are supposed to be good so idk.

No reason to act "butthurt" you guys.
Ignoring the artificial limitation of the extra PhysX effects to the GPU for a second: there is a reason to be annoyed. NVIDIA and the game developer colluded to make the game run WORSE on ATI hardware. There is no hardware reason why Batman: AA can't do MSAA on ATI hardware.

Then there is the PhysX issue, where the developer did a rather shitty job too. Advanced PhysX effects only run on NVIDIA GPUs; there is no in-game option to run them on the CPU. And some of the effects, like cloth and dynamic fog, were simply removed in the non-PhysX version. Apparently it was too much work to replace the effects with at least semi-static ones...

Bottom line is that this game, with help from NVIDIA, was intentionally neutered when run on non-NVIDIA hardware.
Posted on Reply
#62
entropy13
newtekie1How do you know this?
There are many articles out there hinting at that. But in some articles Nvidia insists it had no hand in the removal of DirectX 10.1 support, which is naturally what they'll say (and Ubisoft likewise says that "implementation is costly"; see the TechReport link).

DirectX 10.1 support was removed because:

1. Nvidia cards don't support it.
2. HD 3000 series cards were 20% better than their respective 9000 GT series counterparts with DirectX 10.1 (and AA enabled).
DirectX 10.1 gives the shader units access to all anti-aliasing buffers in a single pass – something that developers have been unable to do with DirectX 10.0. "DX10.0 screwed AA [performance]. . . . 10.1 would solve that [issue]," said one developer reportedly close to Ubisoft.

"Of course it removes the render pass! That's what 10.1 does! Why is no one pointing this out, that's the correct way to implement it and is why we will implement 10.1. The same effects in 10.1 take 1 pass whereas in 10 it takes 2 passes," added another anonymous developer, said to be working on a title that implements DirectX 10.1 support – in addition to DirectX 10.
The quoted part makes Ubisoft's reasoning pointless. How can making a process take one less step to finish be "costly"? Which ultimately adds fuel to the fire that there really is a different reason.


techreport.com/discussions.x/14707
www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1
www.tgdaily.com/content/view/37326/98/
www.fudzilla.com/index.php?option=com_content&task=view&id=7355&Itemid=1
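The one-pass vs. two-pass point in the quoted developer comments can be sketched as a toy pass counter (illustrative only; real pass costs depend on the engine and scene):

```python
# Toy model: under DX10.0 the shader cannot read the AA buffers directly,
# so the engine needs an extra resolve pass; under DX10.1 it does not.
def aa_render_passes(dx_level: str) -> int:
    passes = 1  # main geometry/shading pass
    if dx_level == "10.0":
        passes += 1  # extra pass to resolve the multisample buffers
    return passes

print(aa_render_passes("10.0"))  # 2
print(aa_render_passes("10.1"))  # 1
```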
Posted on Reply
#63
ShockG
If we would all get over ourselves: changing device IDs to get software working is not new. Remember when FarCry detected that you were using NV3X hardware and dropped the precision to FP16 instead of FP32? You could disable this by changing the vendor and device ID. (I had the FarCry ATI demo, which apparently used TruForm and would not work with NVIDIA hardware, but changing the device and vendor ID allowed it to run on NV hardware as well.)
Also, Batman uses the Unreal Engine 3, which doesn't natively support AA (at least not MSAA), so some tweaking needs to be done to get it working properly. If NVIDIA paid for these optimizations and for getting AA working on this title, then they should benefit from it. (TWIMTBP isn't just a stamp; they actually send people out to sit with developers and optimize the game together. No money is paid to the developer to lower performance on competitor hardware!)

ATI used to have a GITG (Get In The Game) campaign which vanished into thin air, despite the company having said NVIDIA's campaign is nothing more than a marketing gimmick. :rolleyes:

I'm not sure of the AA implementation of UE3 games on the Xbox, but chances are it's SSAA, which is exactly what we can get on our ATI graphics cards. And be it SSAA or MSAA, on a console you can tune performance right down to the per-cycle level; don't compare a closed system with a PC.

So before we say there's a conspiracy, let's calm ourselves and think about it a little.
Posted on Reply
#64
[I.R.A]_FBi
newtekie1Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting them to nVidia hardware also.

For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?

It might not be a case of nVidia or the game developers removing a feature, but instead a case of nVidia paying to have the feature added in the first place.

These could have been performance optimizations that nVidia entirely paid for, the whole purpose of the TWIMTBP program, so why should they then enable them for ATi?

Or a completely different reasoning:

It could be that having it enabled with ATi cards causes problems in the retail game(remember they only tested this on the demo). For all we know, something with the way ATi cards handles AA causes the game to crash or be extremely buggy with the optimized AA enabled. Maybe a certain part of the game is completely unplayable on ATi cards with the feature enabled, so the developers(nothing to do with nVidia at all) just gave up trying to fix it, and simply disabled the feature on ATi cards as a quick fix to get the game shipped. They now have more time to work on a patch to make it work. It wouldn't be the first time we've seen games have problems with one manufacturer, but not the other, due to certain visual elements conflicting with the current drivers.

Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.



How do you know this?
anyone else hear this shit?
Posted on Reply
#65
newtekie1
Semi-Retired Folder
entropy13There are many articles out there, hinting towards that. But in some articles Nvidia insists they had no hand to play in the removal of DirectX10.1 support, which is naturally what they'll say (and Ubisoft likewise says that "implementation is costly" - see the TechReport link).

DirectX 10.1 support was removed because:

1. Nvidia cards don't support it.
2. HD 3000 series cards were 20% better than their respective 9000 GT series counterparts with DirectX 10.1 (and AA enabled).



The quoted part makes Ubisoft's reasoning pointless. How can making a process take one less step to finish as "costly"? Which ultimately "feeds fuel to the fire" that there really is a different reason.


techreport.com/discussions.x/14707
www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1
www.tgdaily.com/content/view/37326/98/
www.fudzilla.com/index.php?option=com_content&task=view&id=7355&Itemid=1
Ah, so a bunch of conspiracy theories with no evidence at all. That's what I thought.
Posted on Reply
#67
entropy13
newtekie1Ah, so a bunch of conspiracies with no evidence at all. Thats what I thought.
EVIDENCE 1: A specific API for the game can increase performance.
EVIDENCE 2: That API, however, is only supported by ATi cards.

The API was removed from the game. So the game developers don't want increased performance in their games? :confused:
Posted on Reply
#68
newtekie1
Semi-Retired Folder
entropy13EVIDENCE 1: A specific API for the game can increase performance
EVIDENCE 2: That API however is only supported by ATi cards

API is removed from the game. So the game developers doesn't want increased performance from their games? :confused:
That's grasping at straws, at best.

And if the increased performance comes at the cost of stability, no, they probably don't. Especially when they have to handle all the calls from people complaining about the game crashing all the time. I bet if they gave all those people your phone number, you'd want DX10.1 removed too.

And since I forgot to add it in the previous post: DX10.1 is "costly" because it requires extra development time to implement in the game code. It is not costly to render, which is what the quote you posted talks about; but that is not the "costly" we mean when we say it is costly to implement. DX10 has to be implemented either way; DX10.1 only adds to development costs.
Posted on Reply
#69
DaedalusHelios
Bull DogBottom line is that this game, with help from NVIDIA, was intentionally neutered for when it was run on non NVIDIA hardware.
Bottom line is, if they don't support ATi equally, it's an "evil" game and you feel wronged?

If Nvidia pays for development, they could make sure ATi cards can't play the game at all. The developer is a company; it's not required to make a game run in any particular way. If it doesn't play to your liking, don't buy it.

There is no "international video game creation bill of rights". A company can make a game play however it wants, as long as it doesn't cause harm to the person playing it or their property. That's reality. If a game developer doesn't support your hardware to your liking, don't buy the game. :)
ImsochoboGame just got patched. :P
If you are serious, that makes this look no longer intentional on the developer's part. Got a link, or are you trollin'? :)
Posted on Reply
#70
entropy13
newtekie1Thats grasping at straws, at best.

And if the increasd performance comes at the cost of stability, no they probably don't. Especially when they have to handle all the calls from people complaining about the game crashing all the time. I bet if they gave all those people your phone number, you'd probably want DX10.1 removed also.

And since I forgot to add it in the previous post. DX10.1 is "costly" because it requires extra developement time to implement into the game code. It is not costly to render, which is what the quote you posted talks about. However, that is not the "costly" we are talking about when we say it is costly to implement. DX10 has to be implemented either way, DX10.1 only adds to developement costs.
I wasn't grasping at straws, actually, since I asked a question rather than making a statement.

The removal of DX10.1 support came THROUGH a patch. So how did it get into the initial version in the first place if it was costly? Why did they include it at all?

Stability issues were almost always tied to an Nvidia card, though (pre-patch). And a post here about AC on an ATi card said the game ran perfectly before the patch, but crashes after it. Selective stability, then?
Posted on Reply
#71
DaedalusHelios
newtekie1Thats grasping at straws, at best.

And if the increasd performance comes at the cost of stability, no they probably don't. Especially when they have to handle all the calls from people complaining about the game crashing all the time. I bet if they gave all those people your phone number, you'd probably want DX10.1 removed also.

And since I forgot to add it in the previous post. DX10.1 is "costly" because it requires extra developement time to implement into the game code. It is not costly to render, which is what the quote you posted talks about. However, that is not the "costly" we are talking about when we say it is costly to implement. DX10 has to be implemented either way, DX10.1 only adds to developement costs.
That's true. But I did hear that DX11 (not DX10.1) lowers development costs, because with the creation of DX11 they somehow made the tools easier for developers to use.
Posted on Reply
#72
Imsochobo
The setting now says:
"Use ATI Control Panel." Why doesn't it work, if ATI themselves made it work by changing the ID?

bildr.no/view/497400
Posted on Reply
#73
Animalpak
Well, tell ATI to invest more in the development and refinement of their drivers.

The biggest problem is that ATI drivers have always been poor; if they weren't, ATI would be level with nvidia at this point, or even ahead.

ATI builds GPUs with tremendous computing power, but they are too lazy to develop drivers able to exploit it.

I would say stop the childish fanboyism, the "i hate nvidia" acting, etc. ...
Posted on Reply
#74
newtekie1
Semi-Retired Folder
entropy13I wasn't grasping at straws actually, since I made a question, not a statement.

The removal of the DX10.1 support was THROUGH a patch. So how did they get to the initial version in the first place if it was costly? Why did they include it in the first place then?

Stability issues were almost always because of an Nvidia card though (pre-patch). And a post here also talking about AC with an ATi card said his game crashes after the patch. Selective stability then?
I was explaining why implementing DX10.1 is costly in the first place, as you seem to believe it comes free. In the Ubisoft case, they didn't say they removed it because it was costly; their reason was that it made the game unstable. In that case, it had nothing to do with being costly to implement (though it might have been costly to fix the implementation...).

The stability issues with nVidia cards were mostly due to PhysX. I'm sure there were plenty of stability issues with ATi cards too, but they were drastically overshadowed by the PhysX issues. It might have come down to a decision of which features to fix and which to give up on. Sometimes that is what has to be done in the business world.

The patch definitely made the game more stable on both sides, but no game is ever going to be perfect. There will always be crashes on certain configurations.
Posted on Reply
#75
Unregistered
newtekie1Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting them to nVidia hardware also.

For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?

It might not be a case of nVidia or the game developers removing a feature, but instead a case of nVidia paying to have the feature added in the first place.

These could have been performance optimizations that nVidia entirely paid for, the whole purpose of the TWIMTBP program, so why should they then enable them for ATi?

Or a completely different reasoning:

It could be that having it enabled with ATi cards causes problems in the retail game(remember they only tested this on the demo). For all we know, something with the way ATi cards handles AA causes the game to crash or be extremely buggy with the optimized AA enabled. Maybe a certain part of the game is completely unplayable on ATi cards with the feature enabled, so the developers(nothing to do with nVidia at all) just gave up trying to fix it, and simply disabled the feature on ATi cards as a quick fix to get the game shipped. They now have more time to work on a patch to make it work. It wouldn't be the first time we've seen games have problems with one manufacturer, but not the other, due to certain visual elements conflicting with the current drivers.

Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.



How do you know this?
Wtf are you talking about? We are paying our hard-earned money for their game, you know, but what do they do? Instead of making it more playable on other hardware, they chose to cripple the game so they can take some money.

And no, it's not about game stability; it's just one fucking greedy developer. Look at the news: they just changed the device ID and voila, AA worked flawlessly (and crushed nvidia's performance). It's just like:

if (device_id == ATI) {
    AA = disabled;
}

They should mention on the box that it's "for Nvidia cards only", so ATI card owners won't get pissed, you know.
Posted on Edit | Reply