Tuesday, September 29th 2009
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
Anti-aliasing has become one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as something of an industry standard, and game developers follow suit by integrating anti-aliasing (AA) features into their games as part of the engine. This lets a game apply AA selectively to parts of the 3D scene, so overall image quality improves while performance is preserved, since not every object in the scene is anti-aliased. It now seems that one of the most heavily marketed games of the year, Batman: Arkham Asylum, doesn't play nice with ATI Radeon graphics cards when it comes to its in-game AA implementation.
Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that the company confirmed this with an experiment in which ATI Radeon hardware was run under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that this workaround is not available for the retail game because of its SecuROM copy protection.
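For context, a common way for a PC title to gate a feature on the GPU vendor is to read the PCI vendor ID that Direct3D reports for the adapter (0x10DE for NVIDIA, 0x1002 for ATI/AMD); spoofing those IDs, as AMD describes, defeats exactly this kind of check. The game's actual detection code is not public, so the sketch below is purely illustrative and its launcher logic is hypothetical:

    // Illustrative sketch only -- the game's real detection code is not public.
    // Shows how a launcher could gate a feature on the GPU's PCI vendor ID as
    // reported by Direct3D 9 (0x10DE = NVIDIA, 0x1002 = ATI/AMD).
    // Build with: cl /EHsc detect.cpp d3d9.lib
    #include <d3d9.h>
    #include <cstdio>

    static bool isNvidiaAdapter()
    {
        IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d)
            return false;

        D3DADAPTER_IDENTIFIER9 id = {};
        bool nvidia = false;
        if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &id)))
            nvidia = (id.VendorId == 0x10DE);   // 0x1002 would indicate ATI/AMD

        d3d->Release();
        return nvidia;
    }

    int main()
    {
        // Hypothetical launcher logic: only expose the in-game AA option on NVIDIA.
        // Changing the reported device/vendor IDs, as AMD says it did with the
        // demo, would make this check pass on Radeon hardware too.
        bool showInGameAA = isNvidiaAdapter();
        std::printf("In-game AA option: %s\n", showInGameAA ? "shown" : "hidden");
        return 0;
    }

The point of the sketch is that a check like this hinges entirely on the reported vendor ID, not on whether the hardware can actually do the work, which is why the ID-swap experiment was enough to bring the option back.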
With no in-game AA available to ATI Radeon users (even though the feature does technically work on ATI Radeon hardware), the only way to get AA is to force it in Catalyst Control Center. This makes the driver apply AA to every 3D object in the scene, reducing performance compared to using the game's own in-game AA path. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
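To illustrate the performance point McNaughton is making, here is a purely conceptual sketch (all type and function names are hypothetical, not taken from the game or any SDK) of the difference between an engine that applies AA selectively and a driver override that multisamples every draw:

    // Conceptual sketch only -- all names are hypothetical, not from the game or any SDK.
    // Contrasts selective in-game AA with a driver-forced override that hits every draw.
    #include <cstdio>
    #include <vector>

    struct RenderPass { const char *name; bool benefitsFromAA; };

    // Stubs standing in for real draw calls.
    static void drawMultisampled(const RenderPass &p)  { std::printf("  MSAA : %s\n", p.name); }
    static void drawSingleSampled(const RenderPass &p) { std::printf("  none : %s\n", p.name); }

    // In-game AA: the engine multisamples only the passes where edge aliasing is
    // visible, so full-screen effects and UI skip the extra cost.
    static void renderFrameInGameAA(const std::vector<RenderPass> &passes)
    {
        for (const RenderPass &p : passes)
            p.benefitsFromAA ? drawMultisampled(p) : drawSingleSampled(p);
    }

    // Driver-forced AA (e.g. a Catalyst Control Center override): every draw is
    // multisampled, including passes that gain nothing, hence the larger hit.
    static void renderFrameDriverForcedAA(const std::vector<RenderPass> &passes)
    {
        for (const RenderPass &p : passes)
            drawMultisampled(p);
    }

    int main()
    {
        std::vector<RenderPass> frame = {
            { "scene geometry", true },
            { "post-process bloom", false },
            { "HUD", false },
        };
        std::puts("-- in-game AA --");
        renderFrameInGameAA(frame);
        std::puts("-- driver-forced AA --");
        renderFrameDriverForcedAA(frame);
        return 0;
    }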
353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
Apparently, PhysX was offered to AMD but they didn't want it, because it was "nvidia" stuff, but don't quote me on it.
However, at the time the hacked drivers surfaced, it either just worked or it didn't. The performance issues came down to rendering the actual graphics more than to PhysX. So we never really got to test whether the higher levels of PhysX in modern games like Batman would really be hindered on an ATi card.
Even if development had continued once the hacked drivers were released, I don't think we would ever have seen PhysX running better on ATi than on nVidia, simply because development for the ATi side was severely delayed. However, I do believe that similar cards from the two would have performed similarly. Not identically, but similarly.
Development on the ATi side? What development? Nvidia would develop PhysX for ATi hardware if ATi accepted it? LOL, you're very naive.
It's a given that Nvidia would ask for some major licensing money if AMD/ATI wanted PhysX, it's a given they would try to make it run like shit on AMD's hardware and try to ruin them with renewed licensing contracts for this technology. It was never an option for AMD/ATI to use a technology Nvidia bought from Ageia for who knows how many millions of dollars.
What you say is just plain stupid and you insult our intelligence. Also, AMD/ATI said they were never contacted by Nvidia about PhysX.
*I just realized, I'm in the wrong thread. :o
And if you're an nVIDIA user and you're tired of their "shenanigans", then jump ship so they can stop benefiting you. And I'm sure many others have many reasons not to go buy an ATI card. It's user preference.
Unreal Engine 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI and removing a feature; this is NVIDIA working with the game developers and adding a feature to the game that ATI never bothered with. Nothing stopped ATI from working with the game developers to enable AA in software. This isn't underhanded.
If nVidia comes out with a monster GPU, would you buy it even if they did many more (imagine) unethical things? Or buy ATI because they are good, not great, but honest?
I'm honestly on the fence at this point, leaning towards the honest one.
If an HD3870 could stomp through PhysX with no issue back then, I doubt something like an HD5870 would struggle today. Or even something from the HD4800 series. Do a little research. There would have been no licensing fee for ATi; the only thing ATi would have had to do was support the development. The PhysX API, engine, and SDK are provided to anyone who wants to use them, free of charge from nVidia. The hardware vendor just has to provide drivers that support it.
Again, nVidia was more than willing to help the developer get PhysX/CUDA running on ATi hardware, no licensing or fees involved at all. They were not going to do it themselves, but they were willing to help the developer who wanted to do it. The problem was that ATi refused to help in any way.
If I own a technology, then I own the tools to be the best under any circumstances.
I really can't explain myself better, my English isn't so good. But if you're right, and nVidia is as generous as you describe, then it's time for them to port PhysX from CUDA to OpenCL and make the PhysX source code free for everyone.
There are several things you have to consider. An outside developer was the one doing the developing; he was just being assisted by nVidia after his initial breakthrough. They were essentially providing him with any documentation and development tools he needed.
Also, CUDA is designed by its nature to be hardware independent. Once the hardware vendor writes the driver to support CUDA, it will work. There really isn't a whole lot nVidia can do to make it perform worse on one vendor's hardware than the other's, and if they did, it would immediately send up red flags because the difference would be drastic.
I've looked and I can't find anywhere that says anti-aliasing is natively supported in any of Unreal Engine's current iterations. In fact, all I find are threads lamenting how UE3.x doesn't support AA at all unless it's forced through hardware. That means NVIDIA paid extra money to get it put in, and it would be stupid of them to extend it to ATI users too. Why? Because ATI isn't paying for it, NVIDIA is. They didn't remove a feature. They added a feature for their own market. ATI didn't follow suit and add AA for their market, and now they claim they've been 'foul played'.
I find it odd that this Ian McNaughton guy is putting forward this half-truth, and if I'm correct, I've actually lost respect for ATI in this case because of it. Again, if anyone can prove otherwise (that UE3.5 supports AA and NVIDIA removed usage of AA for ATI instead of adding AA for their own buyers), then I'll retract my claims.
Until then, it looks like NVIDIA actually did the gaming market a favor by adding AA, and is owed an apology by roughly 85% of this thread. I wouldn't bother waiting for one if I were them, though.
One thing is for sure: nVidia is always involved in these shady tactics.
www.pcgameshardware.de/screenshots/original/2009/09/Batman_Arkham_Asylum_Benchmarks_4.PNG
What doesn't make sense to me is why everyone was so ready to jump down NVIDIA's throat. And seriously, hear me out on this one. There are shit tons of games that are 'TWIMTBP' and have in-game AA for ATI. Why would they cock block ATI on this game alone? This reeks more of ATI not supporting the game out of the gate, like most games that get emergency patches from them, than it does anything else.
Even the ATI fanboys should have taken this one with a grain of salt.