Tuesday, September 29th 2009

Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Anti-aliasing has long been one of the most basic image-quality enhancements available in games. PC graphics hardware manufacturers regard it as close to an industry standard, and game developers follow suit by integrating anti-aliasing (AA) features into their games as part of the engine. This lets a game apply AA selectively to parts of the 3D scene: the overall image quality of the scene improves, and so does performance, because not every object in the scene is given AA. It turns out that one of the most heavily marketed games of the year, Batman: Arkham Asylum, doesn't like to work with ATI Radeon graphics cards when it comes to its in-game AA implementation.
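In engine terms, selective AA comes down to partitioning draw calls: objects flagged for AA are rendered to a multisampled target, everything else to a plain one, and the two are composited afterwards. A minimal sketch of the idea in C++ (hypothetical types and flags for illustration, not the game's actual engine code):

    #include <iostream>
    #include <string>
    #include <vector>

    // Hypothetical draw record; real engines carry far more state.
    struct DrawCall {
        std::string mesh;
        bool wantsAA; // engine-side flag: does this object get anti-aliased?
    };

    // Stand-ins for binding render targets and submitting geometry.
    void drawToMSAATarget(const DrawCall& d)  { std::cout << d.mesh << " -> MSAA target\n"; }
    void drawToPlainTarget(const DrawCall& d) { std::cout << d.mesh << " -> plain target\n"; }

    // Selective AA: only the flagged objects pay the multisampling cost.
    void renderScene(const std::vector<DrawCall>& scene) {
        for (const DrawCall& d : scene)
            d.wantsAA ? drawToMSAATarget(d) : drawToPlainTarget(d);
        // resolve the MSAA target and composite it over the plain one here
    }

    int main() {
        renderScene({ {"character", true}, {"skybox", false}, {"railing", true} });
    }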

Developed under NVIDIA's The Way It's Meant to Be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton, in a recent blog post, said they had confirmed this with an experiment in which they ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the same trick is not possible with the retail game, as it is protected by SecuROM.
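Gating a feature on the detected GPU is trivial to implement, which is also why changing the device IDs defeats it. Below is a minimal sketch of how a launcher might perform such a check through Direct3D 9; this illustrates the technique only, and is not the shipping launcher's code:

    #include <d3d9.h>
    #include <cstdio>
    #pragma comment(lib, "d3d9.lib")

    int main() {
        // Error handling kept minimal for brevity.
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DADAPTER_IDENTIFIER9 id = {};
        d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id);

        // PCI vendor IDs: 0x10DE = NVIDIA, 0x1002 = ATI/AMD.
        const bool exposeInGameAA = (id.VendorId == 0x10DE);
        std::printf("%s (vendor 0x%04X): in-game AA option %s\n",
                    id.Description, (unsigned)id.VendorId,
                    exposeInGameAA ? "shown" : "hidden");

        d3d->Release();
        return 0;
    }

Make the hardware report a different VendorId, which is effectively what AMD's changed-device-ID experiment did, and the check passes on Radeon hardware too.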

With no in-game AA available to ATI Radeon users, even though the feature does technically work on ATI Radeon hardware, the only way to get AA is to force it in Catalyst Control Center. This causes the driver to apply AA to every 3D object in the scene, reducing performance compared to using the game's selective in-game AA. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.

353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

#251
newtekie1
Semi-Retired Folder
pr0n InspectorIt doesn't "work just fine" btw.
Interesting. So not only does it not actually work, but it also breaks something in the game.

Sounds like one of the reasons I gave at the beginning of this thread...
El FiendoBoth of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.
Hey, another reason I gave at the beginning.

Seems like the simplest solution is most likely to be correct...

You know, a good reporter would put up a retraction correcting his misinformation... of course, real reporters do research to make sure their story is straight before reporting it and wrongfully bashing whoever they believe to be at fault...
DaedalusHeliosBut Nvidia should still lower their prices since the 5870 had such a strong launch.

If people want to rage about Nvidia, just complain about the pricing. It won't take lies or misconceptions to do so. It's just plain facts.

I got ripped buying a 7950GX2 back in the day. It scaled like crap and drivers took forever to make it scale decently. The 9800GX2 and GTX 295 were another story though (still overpriced). :)
Definitely, but I'm sure they will, it just takes time. We are only a week out from the launch of the HD5870, so I expect a price cut announcement on at least the GTX285 and GTX295 very soon. (The others still fit well in the performance-per-dollar graph, thanks to the fact that they have had competition from ATi already.)
#252
Unregistered
Oops, sorry guys. I don't mean to bother you all, I'm just angry at the developer.

I will not do that again, I'm really sorry.


NB: if you come to Indonesia, just call me, I will be your guide and I will show you how beautiful Indonesia is. And btw, I'm 20 now
#253
Steevo
This thread needs to die.


Those that feel offended by NV and the developer's antics know what not to buy, and those who don't can keep supporting the division of gamers.
#254
Bjorn_Of_Iceland
newtekie1Bullshit, the tests with hacked drivers were showing PhysX running just fine on ATi hardware.

You are seriously overestimating the power required to run PhysX; any current ATi hardware would have been able to absolutely kill it in PhysX performance. Remember, the original hardware the PhysX API ran on was 128MB PCI cards...
It's not running on ATI hardware. It's using the "software mode", which utilizes the CPU for the physics processing, much like Ageia's software mode before. You still get the physics effects on screen, but with a performance hit, as opposed to having a dedicated card do the work.
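In rough terms the fallback works like the sketch below, purely illustrative C++ with hypothetical names rather than the actual PhysX SDK API: if no supported accelerator is found, the same simulation step runs on the host CPU, which is why the effects still appear but cost frame rate:

    #include <iostream>
    #include <vector>

    struct Particle { float x, y, z, vx, vy, vz; };

    // Hypothetical capability check; real middleware queries for a supported
    // accelerator (an Ageia PPU, or a CUDA-capable GeForce).
    bool hardwareAccelerationAvailable() { return false; } // e.g. on a Radeon

    // "Software mode": integrate on the host CPU. Same results, but the CPU
    // now competes with the rest of the game for time.
    void stepOnCPU(std::vector<Particle>& p, float dt) {
        for (Particle& q : p) { q.x += q.vx * dt; q.y += q.vy * dt; q.z += q.vz * dt; }
    }

    // Placeholder for the accelerated path (upload, kernel launch, readback).
    void stepOnGPU(std::vector<Particle>& p, float dt) {
        stepOnCPU(p, dt); // stubbed so the sketch stays runnable
    }

    void simulate(std::vector<Particle>& p, float dt) {
        if (hardwareAccelerationAvailable()) stepOnGPU(p, dt);
        else                                 stepOnCPU(p, dt);
    }

    int main() {
        std::vector<Particle> debris(10000, Particle{0, 0, 0, 0, -9.8f, 0});
        simulate(debris, 1.0f / 60.0f);
        std::cout << "stepped " << debris.size() << " particles on the CPU\n";
    }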
#255
Benetanegia
Bjorn_Of_IcelandIt's not running on ATI hardware. It's using the "software mode", which utilizes the CPU for the physics processing, much like Ageia's software mode before. You still get the physics effects on screen, but with a performance hit, as opposed to having a dedicated card do the work.
He is talking about the hack they were preparing at ngohq.com, which allowed PhysX to be accelerated on Ati hardware.

www.tomshardware.com/news/nvidia-physx-ati,5764.html

Nvidia even gave him a lot of support.

www.tomshardware.com/news/nvidia-ati-physx,5841.html

Quote: "In the end, if Badit could get PhysX to run on Radeon cards, the PhysX reach would be extended dramatically and Nvidia would not be exposed to any fishy business claims - since a third party developer is leading the effort."

In the end AMD didn't allow that to happen, and lied about the reasons behind that decision, because they had a deal with Intel's Havok, which only runs on the CPU. Since Intel didn't want GPU acceleration at all, PhysX could not happen; at least fully supported PhysX couldn't happen.

EDIT: And yeah, I know they are slowly porting Havok to run on GPUs too, but that came more than a year after all this happened, because PhysX got some support after all despite their efforts to block it, and because by the time they finish porting it Intel will have their Larrabee out. The thing about GPU Havok is so fishy that the demo of Havok running on AMD's HD5xxx was using AMD's proprietary Stream API, but the final product is going to be OpenCL...
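For contrast, the point of an OpenCL port is vendor neutrality: the same kernel source runs on any conforming device, NVIDIA, AMD or otherwise. A minimal sketch of the kind of cross-vendor compute step a GPU physics port would target (generic illustration, not Havok code; error checks omitted for brevity):

    #include <CL/cl.h>
    #include <cstdio>
    #include <vector>

    // Trivial "physics" kernel: advance positions by velocity * dt.
    static const char* kSrc =
        "__kernel void integrate(__global float* pos,"
        "                        __global const float* vel, float dt) {"
        "    size_t i = get_global_id(0);"
        "    pos[i] += vel[i] * dt;"
        "}";

    int main() {
        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, nullptr);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, nullptr);
        cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

        std::vector<float> pos(1024, 0.0f), vel(1024, 1.0f);
        cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                     pos.size() * sizeof(float), pos.data(), nullptr);
        cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                     vel.size() * sizeof(float), vel.data(), nullptr);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
        clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
        cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

        float dt = 1.0f / 60.0f;
        size_t n = pos.size();
        clSetKernelArg(k, 0, sizeof(cl_mem), &dPos);
        clSetKernelArg(k, 1, sizeof(cl_mem), &dVel);
        clSetKernelArg(k, 2, sizeof(float), &dt);
        clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
        clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, n * sizeof(float), pos.data(),
                            0, nullptr, nullptr);

        std::printf("pos[0] after one step: %f\n", pos[0]);
        return 0;
    }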
#256
eidairaman1
The Exiled Airman
Well, you see where PhysX is: just like it was when Ageia appeared on the scene in 2005.
#257
TheMailMan78
Big Member
Ok here is my take on this whole thing. ATI "fooled" the game into running AA natively by telling the game it was in fact an Nvidia card. Once they did this, it ran AA better than with a real Nvidia card. So basically the feature was not added to the GeForce game profile, but removed from the game's ATI profile. Yes, the Unreal 3 engine does not natively support AA, but ATI's Catalyst has supported AA in the Unreal engine since, I believe, 9.2. To me this is proof the TWIMTBP program is paying developers to hamstring ATI.

NOW if AA was offered no matter what GPU you had, but in fact ran better on Nvidia, then I would accept fair play within the TWIMTBP program. However, Nvidia cheated ATI users out of something their card is VERY capable of doing natively. After all, we are talking about AA. Not PhysX.

Nvidia just had a Tonya Harding moment.
#258
Benetanegia
TheMailMan78Ok here is my take on this whole thing. ATI "fooled" the game into running AA natively by telling the game it was in fact an Nvidia card. Once they did this, it ran AA better than with a real Nvidia card. So basically the feature was not added to the GeForce game profile, but removed from the game's ATI profile. Yes, the Unreal 3 engine does not natively support AA, but ATI's Catalyst has supported AA in the Unreal engine since, I believe, 9.2. To me this is proof the TWIMTBP program is paying developers to hamstring ATI.

NOW if AA was offered no matter what GPU you had, but in fact ran better on Nvidia, then I would accept fair play within the TWIMTBP program. However, Nvidia cheated ATI users out of something their card is VERY capable of doing natively. After all, we are talking about AA. Not PhysX.

Nvidia just had a Tonya Harding moment.
1.bp.blogspot.com/_Wn9gB8wTetM/Se4ohOUE_EI/AAAAAAAAAjY/N1RimJJ5xoY/s400/nancy_kerrigan_biography_2.jpg
Did you read the latest info that has been given in the last posts? Not only is the AA not better on Ati cards, they are not doing AA at all, and it breaks the game. :shadedshu
#259
DaedalusHelios
TheMailMan78 did not get the memo. :o

Seriously read the other posts and edit if necessary. ;)
#260
TheMailMan78
Big Member
BenetanegiaDid you read the latest info that has been given in the last posts? Not only is the AA not better on Ati cards, they are not doing AA at all, and it breaks the game. :shadedshu
DaedalusHeliosTheMailMan78 did not get the memo. :o

Seriously read the other posts and edit if necessary. ;)
There are 11 pages! Give me some links, damn it!
#261
El Fiendo
Start on page 7, around my first post. I'm still not sure which way this is going as both sides have good evidence against each other. I tend to lean towards NVIDIA though because breaking only one game doesn't make sense.
#262
newtekie1
Semi-Retired Folder
TheMailMan78There are 11 pages! Give me some links, damn it!
I'm too lazy to look it up, so here is a summary:
  • The claim was made that nVidia paid to have AA disabled for ATi hardware.
  • The claim was made that AA works.
  • The claim was made that there was no reason to disable the feature for ATi hardware, other than nVidia paying to have it disabled.
  • Some arguing.
  • The claim was made that AA was a feature that nVidia funded the addition of.
  • The claim was also made that, perhaps the feature was disabled on ATi hardware due to it breaking the game.
  • Some arguing.
  • The claim was made that AA is a standard feature in the Unreal 3.5 Engine.
  • The claim was made that ATi proved it doesn't break the game, because if it works in the demo, it will work in the entire game.
  • Some arguing.
  • It was revealed that AA is not a standard feature in the Unreal 3.5 Engine, and nVidia did in fact fund the addition of it to the game. (Source)
  • It was revealed that changing the device ID to allow AA to be enabled in-game actually breaks the game on ATi hardware. (Source)
  • It was revealed that, even with the setting enabled, ATi hardware didn't actually do AA, because the feature was not designed for ATi hardware. (Source)
I think that about covers it.

The discussion should be pretty much over with that. There is no wrongdoing on nVidia's part. They paid for the development and inclusion of AA in Batman; it is only fair that only their hardware gets the benefit. ATi was more than open to do the same, but they didn't; it is their loss, and more importantly the loss of their customers. And unlike the original reports by ATi, the feature doesn't actually work on ATi hardware. The setting can be enabled in the demo and full game, but it doesn't actually do anything, and it breaks the full version of the game.

Perhaps if the two of them would work a little bit more together, we could see extras like this added to all games that work on both. Though we don't want them working so closely together that we get another price-fixing situation...:laugh:
#263
Benetanegia
newtekie1Perhaps if the two of them would work a little bit more together, we could see extras like this added to all games that work on both. Though we don't want them working so closely together that we get another price-fixing situation...:laugh:
And the feature probably almost works; it probably requires just some light recoding. What it does need is a lot of testing and QA on Ati hardware, with someone with extensive knowledge of the Ati architecture (AKA an AMD engineer) helping a bit, and that's pretty much all. It's not a feature of the UE, and it's not a feature present in DX, not in this exact form at least. So it's not something you can take for granted will work properly under all conditions. A game developer can't release a game with a feature that has not been properly tested.
#264
BelligerentBill
I'm amazed this discussion is still going on. It only shows how ignorant fanboys can be when they care nothing about facts as long as they have found a reason to rant. Human weakness at its finest.

ATI hacked a demo.

The developer did not cripple ATI because Nvidia paid them to do it. Seriously people... this isn't the US Government. Somebody needs to get facts and settle this BS because I've seen nothing but hearsay from ATI.

At the end of the day, I have an Nvidia card :roll:
#265
Valdez
Nvidia wants to establish a new tradition: GPU makers have to pay for (basic or non-basic) features if they want them in-game.
It will be fun to see a game with nvidia (tm) AA, nvidia (tm) physx, ati (tm) tessellation, s3 (tm) AF, ati (tm) hdr, etc...

Pathetic.

Anyway, BioShock and Mass Effect had AA through the control panel (both manufacturers).
#266
TheMailMan78
Big Member
newtekie1• It was revealed that AA is not a standard feature in the Unreal 3.5 Engine, and nVidia did in fact fund the addition of it to the game. (Source)
  • It was revealed that changing the device ID to allow AA to be enabled in-game actually breaks the game on ATi hardware. (Source)
  • It was revealed that, even with the setting enabled, ATi hardware didn't actually do AA, because the feature was not designed for ATi hardware. (Source)
Ok the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia. :laugh:

THIS is what you guys bring to the table as facts?! Nvidia, ok, but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu
#267
DaedalusHelios
TheMailMan78Ok the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia. :laugh:

THIS is what you guys bring to the table as facts?! Nvidia, ok, but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu
It's because the argument is not even acknowledged by the tech media. A guy forced it to work and it just breaks the game. Try it yourself, you just change a device ID. Unless you think it's a conspiracy too. :laugh:
#268
TheMailMan78
Big Member
DaedalusHeliosIt's because the argument is not even acknowledged by the tech media. A guy forced it to work and it just breaks the game. Try it yourself, you just change a device ID. Unless you think it's a conspiracy too. :laugh:
I think all of you work for Nvidia and made my dog sterile.
#269
El Fiendo
No, I don't work for NVIDIA but I did make your dog sterile.
#270
TheMailMan78
Big Member
El FiendoNo, I don't work for NVIDIA but I did make your dog sterile.
Give him a reach around next time. He likes that.
DaedalusHeliosIt's because the argument is not even acknowledged by the tech media. A guy forced it to work and it just breaks the game. Try it yourself, you just change a device ID. Unless you think it's a conspiracy too. :laugh:
I'm not downloading the demo again. Making baseless claims against shit I have no idea about is way easier.
#271
newtekie1
Semi-Retired Folder
TheMailMan78Ok the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia. :laugh:

THIS is what you guys bring to the table as facts?! Nvidia, ok, but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu
Well, nVidia coming right out and saying what they did is kind of all the proof needed. They are the ones that did it; they know best. It puts all the other baseless accusations to rest.

And you kind of have to read the whole thing from the "nut job". That "nut job" is the one that originally claimed AA was disabled for ATi, and originally claimed it worked in the demo, posting screenshots to prove it.

The other forum members later went on to disprove that AA was even working, and the "nut job" himself confirmed that it broke the game (even if he didn't want to admit it at first).
#272
TheMailMan78
Big Member
newtekie1Well, nVidia coming right out and saying what they did is kind of all the proof needed. They are the ones that did it; they know best. It puts all the other baseless accusations to rest.

And you kind of have to read the whole thing from the "nut job". That "nut job" is the one that originally claimed AA was disabled for ATi, and originally claimed it worked in the demo, posting screenshots to prove it.

The other forum members later went on to disprove that AA was even working, and the "nut job" himself confirmed that it broke the game (even if he didn't want to admit it at first).
The game, yes. Not the demo. ATI never said anything about the game due to SecuROM. Anyway, the accusation was from an ATI blog, not that forum.
#273
Meizuman
Slightly offtopic:

S.T.A.L.K.E.R. was originally TWIMTBP, but IIRC the game runs better on Ati hardware... maybe it ran better at launch with nV, but Ati drivers improved from there. (confirmation needed)

But afaik they now display the Ati Radeon logo at startup... in CS and CoP.
#274
yogurt_21
newtekie1Though we don't want them working so closely together we get another price fixing situation...:laugh:
again, yeah I think not. lol, the $600 standard for high-end single-card, single-core and the $300 standard for decent midrange was quite annoying.

now we can typically pick up a $100-200 card that will grant us all the performance we need. I like it better now.

the argument was interesting to watch. as a former ati fanboy I have to admit I jumped to conclusions, but getting older I didn't want to post without evidence. I'm glad I didn't and I'm glad the truth came to light.
#275
newtekie1
Semi-Retired Folder
TheMailMan78The game, yes. Not the demo. ATI never said anything about the game due to SecuROM. Anyway, the accusation was from an ATI blog, not that forum.
True, but the claim and accusation made by people in this thread was that because it worked in the demo, it would work in the game. The original person that discovered all of this in the demo went on to test it in the game, and it didn't work. ATi jumped the gun and started the bashing too quickly; they should have let the original person finish testing before they started crying foul. They were basically crying because nVidia paid for the cookie and didn't share it.