Tuesday, September 29th 2009

Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Anti-aliasing (AA) has become one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as something of an industry standard, and game developers follow suit by integrating AA features into their engines. This allows a game to apply AA selectively to parts of the 3D scene, so the overall image quality improves while performance is preserved, since not every object in the scene receives AA. It now appears that one of the most heavily marketed games of the year, Batman: Arkham Asylum, refuses to work with ATI Radeon graphics cards when it comes to its in-game AA implementation.

Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that the company confirmed this by running ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He adds that the same workaround is not possible in the retail game, which is protected by SecuROM.
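What McNaughton describes amounts to a vendor-ID check in the launcher. A minimal sketch of that kind of gate, using the public PCI vendor IDs (the function name and structure here are hypothetical, for illustration only — the game's actual detection code is not public):

```python
# Hypothetical sketch of the vendor-ID gating AMD describes.
# The PCI vendor IDs are public: NVIDIA is 0x10DE, ATI/AMD is 0x1002.
NVIDIA_VENDOR_ID = 0x10DE
ATI_VENDOR_ID = 0x1002

def in_game_aa_available(vendor_id: int) -> bool:
    """Return True if the launcher would expose the in-game AA option."""
    # The option is exposed only when an NVIDIA GPU is reported; spoofing
    # the reported ID (as in AMD's experiment) flips the result.
    return vendor_id == NVIDIA_VENDOR_ID
```

Changing the ID a Radeon reports, as AMD did in its demo experiment, would make the card pass a check of this shape and re-enable the option.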

With no in-game AA available to ATI Radeon users, even though the feature technically works on ATI Radeon hardware, the only way to use AA is to force it in the Catalyst Control Center. This causes the driver to apply AA to every 3D object in the scene, reducing performance compared to using the game's in-game AA implementation. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.

353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

#302
TheMailMan78
Big Member
BenetanegiaNO. This is a special AA that was paid for by Nvidia for their cards and was later tested on their cards. If AMD had done the same, you would have that feature. What is never going to happen is Nvidia paying so that a feature is added and tested to run on AMD hardware. :shadedshu
So this is a magic AA that has no performance hit for Nvidia cards? It all makes sense now. You don't think that it's basically a shortcut to the Geforce drivers to force AA, do you? Naaaaa :laugh:
Posted on Reply
#303
Benetanegia
TheMailMan78So this is a magic AA that has no performance hit for Nvidia cards? It all makes sense now. You don't think that it's basically a shortcut to the Geforce drivers to force AA, do you? Naaaaa :laugh:
Indeed it's sort of a magic AA. It's adaptive AA. Some parts are anti-aliased (the ones that need to be) and some are not, saving a lot of resources and improving performance. It has a hit, but not as high as FSAA has. I'll repeat: Unreal Engine 3 does not support MSAA.
Posted on Reply
#304
El Fiendo
No, it's an AA that is applied through software, so it doesn't have the natural 20% hit of doing it in hardware. If NVIDIA hadn't had it put in, everyone (NVIDIA and ATI both) would be forced to do hardware-forced AA through the control panels of their respective driver sets, resulting in an overall performance loss. Because NVIDIA had it implemented in the software, something that wouldn't happen out of the box with UE3 or UE3.5 and was extra, they cut out the performance loss by removing the need for it. As such, they have the right to implement it on their hardware alone.
Posted on Reply
#305
TheMailMan78
Big Member
BenetanegiaIndeed it's sort of a magic AA. It's adaptive AA. Some parts are anti-aliased (the ones that need to be) and some are not, saving a lot of resources and improving performance. It has a hit, but not as high as FSAA has. I'll repeat: Unreal Engine 3 does not support MSAA.
Ok, they didn't add AA to this game. It's not like they were baking cookies and forgot to add sugar. Adding AA to an engine like this would take a lot more than the TWIMTBP program would be willing to finance. All they did was take a "shortcut" instead of you doing it manually via the Geforce drivers. True, the Unreal 3.0 engine doesn't support AA, but ATI and Nvidia have supported AA in the Unreal engine for almost a year now. All Nvidia did was get a small head start in the game profiles.
El FiendoNo, it's an AA that is applied through software, so it doesn't have the natural 20% hit of doing it in hardware. If NVIDIA hadn't had it put in, everyone (NVIDIA and ATI both) would be forced to do hardware-forced AA through the control panels of their respective driver sets, resulting in an overall performance loss. Because NVIDIA had it implemented in the software, something that wouldn't happen out of the box and was extra, they cut out the performance loss by removing the need for it.
How much do you want to bet that within two driver releases for ATI the AA performance will be the same as Nvidia's "magic" AA?
Posted on Reply
#306
Benetanegia
TheMailMan78Ok, they didn't add AA to this game. It's not like they were baking cookies and forgot to add sugar. Adding AA to an engine like this would take a lot more than the TWIMTBP program would be willing to finance. All they did was take a "shortcut" instead of you doing it manually via the Geforce drivers. True, the Unreal 3.0 engine doesn't support AA, but ATI and Nvidia have supported AA in the Unreal engine for almost a year now. All Nvidia did was get a small head start in the game profiles.
Eehhhh??? Neither Nvidia nor Ati has supported AA in UE3. They might have made optimizations so that FSAA works faster when enabled in the control panel. FSAA always works, because it works over the final frame; MSAA has to be implemented in the game, because it works at the fragment (pixel) level. The AA we are talking about goes a little bit farther by selecting which parts need AA and which don't.

The thing about the shortcut is not only BS, but impossible to make. No matter how FSAA is used, it always has a performance hit. UE3 can't do MSAA; it can do FSAA, though, at the cost of a lot of performance.
How much do you want to bet that within two driver releases for ATI the AA performance will be the same as Nvidia's "magic" AA?
Of course they would, but it's more probable that they'd work with the developer and get the feature working. Something they should have done from the beginning if they wanted in-game AA. But the truth is that if they truly wanted in-game AA, they would have done it before, in any of the various other UE3 games. They don't want to spend their money on such a feature; they want others to spend that money so they get it for free. That's what they are crying about. No more, no less.
Posted on Reply
#307
El Fiendo
TheMailMan78How much do you want to bet within two driver releases for ATI the AA performance will be the same as Nvidias "magic" AA?
I've maintained pretty much from the start that this is probably the case. I agree, the NVIDIA AA wasn't/isn't a miracle; it was just made to run correctly instead of breaking the game, as software AA does to ATI cards in this game. I figured that ATI simply didn't have things ready on launch day, one developer for ATI decided to bitch and moan, and then the ATI Army championed their plight across the interwebs. It's all the same really; it happened before with Far Cry 2 and the "OMG NVIDIA is totally screwing ATI over with image quality" outrage. Which then promptly turned out to be a Catalyst bug and was patched within, I believe, 24 hours.
Posted on Reply
#308
newtekie1
Semi-Retired Folder
TheMailMan78How much do you want to bet within two driver releases for ATI the AA performance will be the same as Nvidias "magic" AA?
Yes, but by then, most people will have beat the game, and put it on the shelf...

And I doubt ATi will get MSAA working with Batman: AA; they didn't care to do it before the game was launched, so why would they do it after? They have bigger things to worry about. So you will always be forced to use FSAA, which we have seen has a major performance hit compared to MSAA.

On top of that, even if they get MSAA going, it will still take more of a performance hit because the engine is not optimized like it is for nVidia's hardware.
Posted on Reply
#309
ArmoredCavalry
Benetanegiaimg.techpowerup.org/091002/batset.jpg



It is proprietary. It even has the Nvidia name all over it!
This has been said like 80,000 times already, but Unreal Engine 3 doesn't have AA; every UE3 game to date has not had AA. You had to enable it in the control panel. This AA implementation was put in BM:AA because Nvidia asked them to, and they paid for it and helped with quality assurance for the feature to ensure it didn't break the game. AMD didn't even contact the developer to say hello, and when used it breaks the game, plus it does not anti-alias the game.
Yeah... uh, if you read a few posts above yours, you will notice that Mirror's Edge used Unreal Engine 3 and DID have AA, and it did work on ATI cards.... Also, financing something doesn't make it proprietary...
Posted on Reply
#310
El Fiendo
Most likely because they took the time to code it in. If you read further up, you'll see UE3 or UE3.5 does not support it straight out of the box. It likely has provisions to support it, but you don't have access to it straight away without bugs and errors. They had to pay people to code it in and have it run without crashing the game. There's even proof that when you run software-forced AA with ATI on this game, the game is unstable.

'Financing something doesn't make it proprietary'.

Uh, yea it does. If I spend money to buy something, I own it, or the rights to it. If the bank gives me home financing, guess what. They own my house and I buy it back from them. What do you think it means?
Posted on Reply
#311
ArmoredCavalry
El FiendoMost likely because they took the time to code it in. If you read further up, you'll see UE3 or UE3.5 does not support it straight out of the box. It likely has provisions to support it, but you don't have access to it straight away without bugs and errors. They had to pay people to code it in and have it run without crashing the game. There's even proof that when you run software-forced AA with ATI on this game, the game is unstable.

'Financing something doesn't make it proprietary'.

Uh, yea it does. If I spend money to buy something, I own it, or the rights to it. If the bank gives me home financing, guess what. They own my house and I buy it back from them. What do you think it means?
I know that it didn't support it straight out of the box................. :rolleyes:

I was pointing out to the guy above that there has been AA on a UE3 game before this, and it has run on ATI GPUs:
Benetanegiaevery UE3 game to date has not had AA.
"proprietary - one that possesses, owns, or holds exclusive right to something" I am using proprietary with the meaning "holds exclusive rights". Nvidia doesn't have exclusive rights to AA on Batman... They paid the devs to add a non-proprietary technology (AA) into the game for use by their gpu's.

Now, does Nvidia have the right to include in the agreement that ATI cards should not be allowed to use the AA? Of course, they funded it. But please don't try to tell me that AA is a proprietary technology from Nvidia....
Posted on Reply
#312
Benetanegia
ArmoredCavalryYeah... uh if you read a few posts above yours, you will notice that Mirror's Edge used the unreal engine 3 and DID have AA, and did work on ATI Cards.... Also, financing something doesn't make it proprietary...
Hmm, I didn't think about Mirror's Edge. I thought that was a modified version of UE. For instance, the complete lighting system was changed. It doesn't change anything anyway. As El Fiendo said, they probably worked on that; that's how it should have been done now too.
Posted on Reply
#313
El Fiendo
No, I misunderstood you. I've been saying that the implementation of AA in this game is proprietary (as I see now we both agree). Also, yes, it has been in prior UE3.x games, but those were special implementations themselves. Sorry, I jumped the gun and appear to have misunderstood the intentions and meaning behind your posts.
Posted on Reply
#314
Benetanegia
ArmoredCavalryI know that it didn't support it straight out of the box................. :rolleyes:

I was pointing out to the guy above that there has been AA on a UE3 game before this, and it and it has run on ATI gpu's:



"proprietary - one that possesses, owns, or holds exclusive right to something" I am using proprietary with the meaning "holds exclusive rights". Nvidia doesn't have exclusive rights to AA on Batman... They paid the devs to add a non-proprietary technology (AA) into the game for use by their gpu's.

Now, does Nvidia have the right to include in the agreement that ATI cards should not be allowed to use the AA? Of course, they funded it. But please don't try to tell me that AA is a proprietary technology from Nvidia....
Anti-aliasing is a very wide range of techniques for obtaining the same result; it's not one single thing. For instance, FSAA and MSAA are very different things, but there are even more types of anti-aliasing. Probably MSAA and FSAA are patented and belong to someone, dating back to the 70s. As a comparison, Intel doesn't have the rights over microprocessors, but it does have the rights over x86. Similarly, Nvidia doesn't have the rights over AA, but it does have the rights over the implementation of AA present in Batman.
Posted on Reply
#315
ArmoredCavalry
El FiendoNo, I misunderstood you. I've been saying that the implementation of AA in this game is proprietary (as I see now we both agree). Also, yes, it has been in prior UE3.x games, but those were special implementations themselves. Sorry, I jumped the gun and appear to have misunderstood the intentions and meaning behind your posts.
yeappp
BenetanegiaSimilarly Nvidia doesn't have the rights over AA, but it does have the rights over the implementation of AA present in Batman.
which is what my whole post was trying to say.......... if by "the implementation" you mean only the one they paid for, not all implementations of AA in general
Posted on Reply
#316
Benetanegia
ArmoredCavalrywhich is what my whole post was trying to say.......... if by "the implementation" you mean only the one they paid for, not all implementations of AA in general
I don't know what you wanted to say, but what you said is:
ArmoredCavalry"proprietary - one that possesses, owns, or holds exclusive right to something" I am using proprietary with the meaning "holds exclusive rights". Nvidia doesn't have exclusive rights to AA on Batman... They paid the devs to add a non-proprietary technology (AA) into the game for use by their gpu's.

Now, does Nvidia have the right to include in the agreement that ATI cards should not be allowed to use the AA? Of course, they funded it. But please don't try to tell me that AA is a proprietary technology from Nvidia....
First one: Nvidia doesn't have exclusive rights over AA in Batman, but they have exclusive rights over THEIR AA.

Second sentence: they didn't pay developers to add non-proprietary technology, they paid to add proprietary technology.

AA as a whole, no, but the AA present in BM is proprietary.

What you fail to understand is that no one prohibited the developers from adding AA for Ati cards; it's the lack of interest from AMD that made it happen that way. AA is only in BM because Nvidia said to add it. If AMD had done the same and asked, helped and tested that or any other AA technique, there would be AA for Ati cards too. They didn't, end of story.

This is not AMD asking for the implementation of AA and having a NO as the answer. This is AMD not collaborating.
Posted on Reply
#317
ArmoredCavalry
BenetanegiaI don't know what you wanted to say, but what you said is:



First one: Nvidia doesn't have exclusive rights over AA in Batman, but they have exclusive rights over THEIR AA.

Second sentence: they didn't pay developers to add non-proprietary technology, they paid to add proprietary technology.

AA as a whole, no, but the AA present in BM is proprietary.

What you fail to understand is that no one prohibited the developers from adding AA for Ati cards; it's the lack of interest from AMD that made it happen that way. AA is only in BM because Nvidia said to add it. If AMD had done the same and asked, helped and tested that or any other AA technique, there would be AA for Ati cards too. They didn't, end of story.

This is not AMD asking for the implementation of AA and having a NO as the answer. This is AMD not collaborating.
Yeah you probably know better what I meant than I did.... (sarcasm) Ok, I'm just gonna stop replying now.... Obviously you will just keep arguing that Nvidia is teh_best_eva even if no one is actually arguing with you... yeah....... Have fun.
Posted on Reply
#318
TheMailMan78
Big Member
BenetanegiaI don't know what you wanted to say, but what you said is:



First one: Nvidia doesn't have exclusive rights over AA in Batman, but they have exclusive rights over THEIR AA.

Second sentence: they didn't pay developers to add non-proprietary technology, they paid to add proprietary technology.
AA is AA, it's not fucking proprietary! Do you even know what that word means?
Posted on Reply
#319
PEPE3D
I think everyone has to chill out. Nvidia and ATI have to start thinking about us. We are the consumers, plain and simple. I could care less about AA, FSA, CIA, FBI, jajajaja. All I want is for games to be playable on my PC regardless of what VGA I have. End of story.
Posted on Reply
#320
Benetanegia
TheMailMan78AA is AA, it's not fucking proprietary! Do you even know what that word means?
My GOD! There are many ways of doing AA, just like there are many ways of doing CPUs, just like there are dozens of ways of making cakes or omelettes. You can make your own processor, but God forbid you make an x86 CPU.

Proprietary AA modes exist. At the very least, CSAA (Coverage Sample Anti-Aliasing) from Nvidia and CFAA (Custom Filter AA) from Ati should be known to you.

Anti-aliasing in the end is nothing more than interpolating the color of more than one pixel to form a single pixel.

FSAA, or supersampling, at 4x renders the complete image 4 times (or at double the resolution per axis, if you prefer to look at it that way) and then interpolates.

MSAA: calculates 4 different color points based on patterns, so that there is an offset between them, then interpolates them and then calculates the rest: lighting, shadowing, etc. That's why its performance is much higher.

Edge-detect AA: it's like MSAA, but it detects edges before doing AA and only does it on the edges.

BM:AA: from what I saw described somewhere, it does edge detection, but some algorithms determine whether objects have to be anti-aliased or not. For instance, it's stupid to anti-alias an object in the distance if it's going to be blurred by depth of field. I guess it does that.

CFAA: I have no idea.

CSAA: No idea.

I think you get the idea. The important thing is that the math behind those AA modes is different. Although they share some algorithms, they are different. And most of them have patents behind them.
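The simplest of the modes in that list, 4x supersampling, can be sketched in a few lines: render at double the resolution per axis, then average each 2x2 block of high-resolution samples into one output pixel. The toy renderer and function names below are made up for illustration; real implementations operate on GPU framebuffers, not Python lists.

```python
def render(w, h):
    # Toy "renderer": a hard vertical edge, black (0.0) on the left,
    # white (1.0) on the right, placed at 3/8 of the frame width.
    edge = (w * 3) // 8
    return [[0.0 if x < edge else 1.0 for x in range(w)] for _ in range(h)]

def supersample_4x(width, height):
    """Toy 4x FSAA: render at 2x resolution per axis, average 2x2 blocks."""
    hi = render(width * 2, height * 2)
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            # Interpolate (average) four high-res samples into one pixel.
            row.append((hi[2 * y][2 * x] + hi[2 * y][2 * x + 1] +
                        hi[2 * y + 1][2 * x] + hi[2 * y + 1][2 * x + 1]) / 4.0)
        out.append(row)
    return out

# Where the hard edge crosses the middle of a 2x2 block, the output picks
# up an intermediate 0.5 gray: that blending is the anti-aliasing. The cost
# is also visible here: four times the pixels are rendered for one frame.
```

MSAA avoids most of that cost by shading once per pixel while still taking multiple coverage samples, which is why it needs engine support, exactly the point being argued above.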
Posted on Reply
#323
troyrae360
PEPE3DWho Cares!!!!!!!!!!!!!!
Everyone that has posted in this thread :nutkick:
Posted on Reply
#324
newtekie1
Semi-Retired Folder
TheMailMan78AA is AA its not fucking proprietary! Do you even know what that word means?
No, AA is not AA. The concept of AA is not proprietary, however the different implementations are.

ATi's driver-level AA is proprietary to ATi, and nVidia's driver-level AA is proprietary to nVidia. Both developed their own methods of doing AA. It isn't like ATi developed AA and then just gave it to nVidia; both had to figure out the methods on their own.

The same goes for in-game AA. Some game engines support AA natively; the method for that AA is proprietary. The game engines that don't have AA natively require AA to be added. The game developers add it in their own way, and that is proprietary.
Posted on Reply