Tuesday, September 29th 2009
Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
Anti-aliasing has long been one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers treat it as an industry standard, and game developers follow suit by integrating anti-aliasing (AA) into their games as part of the engine. This lets a game apply AA selectively to parts of the 3D scene, improving overall image quality while preserving performance, since not every object in the scene gets anti-aliased. Yet one of the most heavily marketed games of the year, Batman: Arkham Asylum, doesn't like to work with ATI Radeon graphics cards when it comes to its in-game AA implementation.
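To make the distinction concrete, here is a minimal, hypothetical sketch (written in Direct3D 11 notation for brevity; the game itself is a DirectX 9 title, and none of these names come from its code) of how an engine with in-game AA decides per render target whether to multisample, whereas driver-forced AA multisamples everything:

// Hypothetical sketch of per-render-target AA selection (Direct3D 11 notation).
// An engine with in-game AA can multisample only the passes that benefit;
// driver-forced AA has no such choice and multisamples the whole scene.
#include <d3d11.h>

D3D11_TEXTURE2D_DESC MakeColorTargetDesc(UINT width, UINT height, bool wantMsaa)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = width;
    desc.Height           = height;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM;
    desc.SampleDesc.Count = wantMsaa ? 4 : 1; // 4x MSAA only where the engine asks for it
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET;
    return desc;
}

// e.g. the main geometry pass gets MSAA, while post-process and UI passes do not:
//   D3D11_TEXTURE2D_DESC scene = MakeColorTargetDesc(1920, 1200, true);
//   D3D11_TEXTURE2D_DESC ui    = MakeColorTargetDesc(1920, 1200, false);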
Developed under NVIDIA's The Way It's Meant to Be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that the company confirmed this with an experiment in which it ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the same workaround is not possible in the retail game because of its SecuROM copy protection.
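For context, a hedged sketch of how a launcher could gate a feature on the GPU vendor: the adapter's PCI vendor ID (0x10DE for NVIDIA, 0x1002 for ATI/AMD) can be read through DXGI, and spoofing the reported IDs, as AMD describes doing, is enough to flip such a check. Everything below is an illustrative assumption, not the game's actual code:

// Hypothetical vendor-ID gate on Windows via DXGI; not taken from the game.
#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

bool PrimaryAdapterIsNvidia()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return false;

    bool isNvidia = false;
    IDXGIAdapter* adapter = nullptr;
    if (factory->EnumAdapters(0, &adapter) != DXGI_ERROR_NOT_FOUND)
    {
        DXGI_ADAPTER_DESC desc;
        if (SUCCEEDED(adapter->GetDesc(&desc)))
            isNvidia = (desc.VendorId == 0x10DE); // 0x1002 would be ATI/AMD
        adapter->Release();
    }
    factory->Release();
    return isNvidia;
}

int main()
{
    // Report whether a check like this would show or hide the in-game AA option.
    std::printf("In-game AA option: %s\n", PrimaryAdapterIsNvidia() ? "shown" : "hidden");
    return 0;
}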
With no in-game AA available to ATI Radeon users, even though the feature does technically work on ATI Radeon hardware, the only way to get AA is to force it in the Catalyst Control Center. This makes the driver apply AA to every 3D object in the scene, reducing performance compared to using the game's own in-game AA path. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs
I really couldn't care less; the more I see "TWIMTBP" in games, the less likely I am to buy another nvidia card, simple as that. If it keeps up I won't buy the games either, and both nvidia and the game devs can go screw themselves. This is the view of many people, including nvidia users as well!
It's not good practice, and if they carry on with these tactics it will hurt them in the end.
Start naming the issues with 'TWIMTBP' that you've seen, and we'll research how many turned out to be NVIDIA straight-up tampering versus just a broken driver that was later fixed. I'm not being an ass; I'm curious to know the numbers on this myself.
Edit: If that's the case, Newtekie, then it looks like it truly is an NVIDIA-added feature. I almost expect to see NVIDIA demanding an apology from Ian McNaughton over this.
ATi had a similar program, but they dropped it in favor of doing their own optimizations in drivers. It's just two different approaches to optimization.
(Then high PhysX settings take another 20% off that already halved fps.) Nvidia wants to spread PhysX, right? Then why should ATI make a driver for CUDA, when it already has its own API (ATI Stream 1.x, 2.x) and there is a common API called OpenCL (currently at 1.0)? If Nvidia wants to spread PhysX, it should port it to OpenCL so it becomes available on every card.
Nvidia would do this, except... it would rather keep using PhysX as leverage.
Well, the game runs at 95% on both systems, but they could get it to 100% with another 20,000 man-hours. To them, 95% is good enough, and they don't need to pay a developer's wage for 20,000 hours. Why? Because at the end of the day it's still playable and both camps can run the game. Nobody is losing out.
But wait! Here's NVIDIA saying, 'Hey, we'll give you money, you put TWIMTBP at the front, and spend that time optimizing it for our hardware to 100%. Leave ATI at 95%, we don't care.' Right there you have the idea of TWIMTBP. It's not meant to take 100% and make it 112.5%. It's not meant to take 100% ATI down to 50%.
I think that aside from small optimization tweaks, you'll find people's complaints about TWIMTBP eventually turned out to have some other cause.
The graphical performance hit caused by PhysX is very large. Actually, ATi wasn't even tasked with making the driver; an outside developer was willing to do it, they just needed some support from ATi. And at the time ATi Stream was, and still is, pretty much unused; I don't believe it even existed when CUDA and PhysX were developed. And nVidia only just released an OpenCL-compliant driver, because OpenCL itself has only been around for a short while.
And at this point, it is kind of pointless to port PhysX over, as it is pretty much dead thanks to DX11's implementation of physics.
They don't owe you anything either, nor do they owe me anything. They make a product the best they can, or the best they want to, and if it's good enough for you, you buy it. If you don't like it, you don't buy it and they lose. That's how it works.
We are all adults here, not children in a schoolyard.
Anyway, both cards have their pluses and minuses; I will probably get both,
why? Because I can ;)
And because you are so blinded, you can't even believe it might be possible that nVidia is actually doing what they say they are doing and paying for the game to be optimized on their hardware. You find that way too far-fetched, and instead believe that they are simply paying to have performance held back on ATi hardware... that makes sense... I guess... :laugh:
You know the simplest solution is usually the correct one. Which seems simpler to you: that the program is being used like nVidia says, optimizing the game to run on their hardware, OR that there is a huge conspiracy where nVidia uses the program as a front to hinder performance on ATi hardware and screw ATi over?
But now that you have degraded to simply flaming instead of making intelligent points to back up your argument, I'll ignore you now, as you have lost. No, not really.
:shadedshu
Oh noes, I have lost on the interwebs :eek: Haha, get over it mate, after all it is a discussion, not a fight/game ;)
How is it a conspiracy? ATI is nvidia's only real competitor (I'm sorry, I don't see Intel/Matrox/SiS as competitors), so why is it so far-fetched that so many ATI users who actually own the hardware and have seen the numbers on various "optimised" games consistently report poor performance compared to similar or even lower-performing nv cards?
Optimised my arse. It's not Xbox 360 vs. PS3, it's PC, and it should play the same regardless on similar-performing cards whether they are ATI or nv.
It never used to be that way: a game was made and it played the same on both nv and ati as long as they were on par in overall performance, and there were slight differences in image quality between the two, but that was all.
"I typically save most gaming news for the semi-regular Binge, but I think that this story deserves its own slot. As part of a software update to its popular Assassin's Creed game, Ubisoft seems to have removed DirectX 10.1 functionality from the game entirely. This is interesting for a few reasons; first and foremost, it just doesn't make sense to remove "main attraction" features from a game - especially if the removal of these features results in reduced game performance on systems using a graphics card supporting DX 10.1. Secondly - and most importantly - because this title is part of Nvidia's "The Way it's Meant to be Played" program, the moves smells very much like collusion - seeing as no current Nvidia graphics cards support DX 10.1. This was a terrible decision, and one can only wonder if karma will rear its ugly head...as it should."
DX 10.1 offered a 20% increase in performance when AA was being used, but then they scrapped it. Go figure?
I'm assuming that this stunt is along the same lines.
Everyone has had many marketing gimmicks. And everyone still does.
Maybe it has something to do with their $500+ cards? :) ... edit: and huge market share (in b4 fanboyz)
Owning a technology means that yes, it's yours and you can charge for it. Are you saying that it shouldn't be allowed to generate a return on the investment just because ATi doesn't own it? It's the business world and that's how it operates, whether you like it or not.
PS. If they get Eyefinity going without the need for a DisplayPort primary or powered adapters, I could see grabbing a 5870 or two. Damn bezel sizes.