Tuesday, September 29th 2009

Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Anti-aliasing (AA) is one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as an industry standard, and game developers follow suit by integrating AA features into their engines. This lets a game apply AA selectively to parts of the 3D scene, improving overall image quality while preserving performance, since not every object in the scene receives AA. Yet one of the most heavily marketed games of the year, Batman: Arkham Asylum, refuses to work with ATI Radeon graphics cards when it comes to its in-game AA implementation.

Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton, in a recent blog post, said the company confirmed this with an experiment in which it ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the option is not available in the retail game because of its SecuROM copy protection.
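The detection McNaughton describes (reading the GPU's vendor ID and gating the in-game AA option on it) can be sketched roughly as follows. The PCI vendor IDs 0x10DE (NVIDIA) and 0x1002 (ATI/AMD) are real, but the function and its name are a hypothetical illustration, not the game's actual code:

```python
# Hypothetical sketch of a vendor-ID gate. A real launcher would query the ID
# from the driver (e.g. via Direct3D's adapter identifier); here it is passed in.

VENDOR_NVIDIA = 0x10DE  # real PCI vendor ID for NVIDIA
VENDOR_ATI = 0x1002     # real PCI vendor ID for ATI/AMD

def ingame_aa_enabled(vendor_id: int) -> bool:
    """Expose the in-game AA option only on NVIDIA hardware."""
    return vendor_id == VENDOR_NVIDIA

# A stock Radeon reports 0x1002, so the option is hidden...
print(ingame_aa_enabled(VENDOR_ATI))     # False
# ...but spoof the ID, as AMD did with the demo, and it reappears.
print(ingame_aa_enabled(VENDOR_NVIDIA))  # True
```

A gate like this also explains why forcing AA in Catalyst Control Center still works: the driver-level override never consults the game's vendor check.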

With no in-game AA available to ATI Radeon users, even though the feature technically works on ATI Radeon hardware, the only way to use AA is to force it in Catalyst Control Center. This makes the driver apply AA to every 3D object in the scene, reducing performance compared to using the game's selective in-game AA. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.

353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

#151
Valdez
newtekie1"Running just fine" means it has good performance. Simply running with shitty performance would be far from "fine" would you say?
So you can imagine a game with hw physx running better on ati hw than its competitor from nvidia?
Posted on Reply
#152
qubit
Overclocked quantum bit
+1
newtekie1Bullshit, the tests with hacked drivers were showing PhysX running just fine on ATi hardware.

You are seriously overestimating the power required to run PhysX; any current ATi hardware would have been able to completely kill in PhysX performance. Remember, the original hardware the PhysX API ran on was 128MB PCI cards...
Yeah, I also remember the reports of PhysX running quite well on AMD with a driver wrapper. This whole restriction is due to politics/business and nothing else.

Apparently, PhysX was offered to AMD but they didn't want it, because it was "nvidia" stuff, but don't quote me on it.
Posted on Reply
#153
newtekie1
Semi-Retired Folder
SteevoAt least you admit that the rest is true. Knowing you have a problem is the first step.
No, I've just already addressed the rest previously, and don't feel like repeating myself.
ValdezSo you can imagine a game with hw physx running better on ati hw than it's competitor from nvidia?
With PhysX there really isn't a "better"; it really just kind of works or it doesn't. Obviously, there are some slight variations, largely dependent on the amount of physics being calculated using PhysX, which is why nVidia has started to increase the requirements for the video cards that run PhysX. And why Batman actually has different PhysX levels that require different levels of performance from the PhysX card.

However, at the time when the hacked drivers surfaced, it either just worked or it didn't. The performance issues came down to rendering the actual graphics more than PhysX. So we never really got to test whether the higher levels of PhysX in modern games like Batman would really be hindered on an ATi card.

Even if development did continue once the hacked drivers were released, I don't think we would ever see PhysX running better on ATi than nVidia, simply because development for the ATi side was severely delayed. However, I do believe that similar cards from the two would have performed similarly. Not identical, but similarly.
Posted on Reply
#154
CrAsHnBuRnXp
I like how only the Ati ppl are bitching about this.
Posted on Reply
#155
Valdez
newtekie1With PhysX there really isn't a "better"; it really just kind of works or it doesn't. Obviously, there are some slight variations, largely dependent on the amount of physics being calculated using PhysX, which is why nVidia has started to increase the requirements for the video cards that run PhysX. And why Batman actually has different PhysX levels that require different levels of performance from the PhysX card.

However, at the time when the hacked drivers surfaced, it either just worked or it didn't. The performance issues came down to rendering the actual graphics more than PhysX. So we never really got to test whether the higher levels of PhysX in modern games like Batman would really be hindered on an ATi card.

Even if development did continue once the hacked drivers were released, I don't think we would ever see PhysX running better on ATi than nVidia, simply because development for the ATi side was severely delayed. However, I do believe that similar cards from the two would have performed similarly. Not identical, but similarly.
Ah yes, and a bit later when physx becomes an industry standard, because ati has accepted nvidia's "generous" offer, the user looks at the graphs and every game that uses hw physx would show more fps on nvidia hw. It would work on ati hw too, but slower.

Development on ati side? What development? Nvidia would develop physx for ati hw if ati would accept it? LOL You're very naive.
Posted on Reply
#156
leonard_222003
newtekie1nVidia disabling PhysX when an ATi card is present was a dick move. I'll be the first to say that. However, I can understand their frustration, and the reasons behind it. You are too quick to forget that nVidia actually wanted to get PhysX running natively on ATi hardware(no nVidia hardware required). And it was ATi that blocked the effort in any way possible. The fact is that nVidia was trying to be very helpful in getting PhysX/CUDA running on ATi hardware. Was the reasoning because it would benefit nVidia in the fight against Intel, and not just out of the goodness of their hearts? Probably, but who cares? The point was that nVidia was trying to get PhysX/CUDA working on ATi hardware. And after ATi blocked them at every turn, even going as far as not providing review samples to the review site that was responsible for the original hacked drivers...
HAHAHA, are you crazy man, do you take us for some very stupid people?
It's a given that Nvidia would ask for some major licensing money if AMD/ATI wanted PhysX, it's a given they would try to make it run like shit on AMD's hardware and try to ruin them with renewed licensing contracts for this technology. It was never an option for AMD/ATI to use a technology bought by Nvidia from Ageia for who knows how many millions of dollars.
What you say is just plain stupid and you insult our intelligence; also, AMD/ATI said they were never called by Nvidia about PhysX.
Posted on Reply
#157
AphexDreamer
CrAsHnBuRnXpI like how only the Ati ppl are bitching about this.
Thats a generalization, I personally could care less. This is nothing new.
Posted on Reply
#158
theubersmurf
CrAsHnBuRnXpI like how only the Ati ppl are bitching about this.
It makes sense that it would work that way. :P However, I've been an nVidia user for a while now, and I'm "bitching" about it too.:nutkick: I'm pretty sick of their shenanigans, and I'm jumping ship... Ironically, that makes me a potential ATI user, so based on that, you can lump me in. :D
Posted on Reply
#159
erocker
*
Yes, and the sole reason I stopped using Nvidia was due to their business practices. Stupid me, I went and bought a GTX 260 anyways. Great card, physX was neat. Got bored, bought another ATi card. Plus, when I buy ATi, the money stays closer to home.
CrAsHnBuRnXpI like how only the Ati ppl are bitching about this.
What are Ati people? Meh, I could just go buy a Nvidia card but have too many reasons to not want one.

*I just realized, I'm in the wrong thread. :o
Posted on Reply
#160
CrAsHnBuRnXp
AphexDreamerThats a generalization, I personally could care less. This is nothing new.
I could care less too.
theubersmurfIt makes sense that it would work that way. :P However, I've been an nVidia user for a while now, and I'm "bitching" about it too.:nutkick: I'm pretty sick of their shenanigans, and I'm jumping ship... Ironically, that makes me a potential ATI user, so based on that, you can lump me in. :D
I could care less what they do. I mainly use their cards because they are usually the best by the time I need a new card, and they use a black PCB. I hate red PCBs, thus refuse to buy ATI cards. That's really the only reason I never used ATI. Retarded, I know, but I like things to match. But now that XFX is making ATI cards, and making them with black PCBs, ATI is a possibility for me now. :)

And if you're an nVIDIA user and you're tired of their "shenanigans", then jump ship so they can stop benefiting you.
erockerYes, and the sole reason I stopped using Nvidia was due to their business practices. Stupid me, I went and bought a GTX 260 anyways. Great card, physX was neat. Got bored, bought another ATi card. Plus, when I buy ATi, the money stays closer to home.

What are Ati people? Meh, I could just go buy a Nvidia card but have too many reasons to not want one.
And I'm sure many others have many reasons not to go buy an ATI card. It's user preference.
Posted on Reply
#161
El Fiendo
I'm sorry, but I can't afford the time to read 160+ replies, so if it's been mentioned, I apologize.

Unreal 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI and removing a feature; this is NVIDIA working with the game developers to add a feature that ATI never bothered with. Nothing stopped ATI from working with the game developers to enable AA via software. This isn't underhanded.
Posted on Reply
#162
BigBruser13
What would you do?

If NVIDIA comes out with a monster GPU, would you buy it even if they did many more (imagined) unethical things? Or buy ATI because they are good, not great, but honest?

I honestly am on the fence at this point, leaning towards the honest one.
Posted on Reply
#163
CrAsHnBuRnXp
BigBruser13If NVIDIA comes out with a monster GPU...
Wouldn't care. :laugh:
Posted on Reply
#164
Valdez
El FiendoI'm sorry, but I can't afford the time to read 160+ replies, so if it's been mentioned, I apologize.

Unreal 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI and removing a feature; this is NVIDIA working with the game developers to add a feature that ATI never bothered with. Nothing stopped ATI from working with the game developers to enable AA via software. This isn't underhanded.
I don't think ati has any word in a game development which is under TWIMTBP program. But i could be wrong.
Posted on Reply
#165
Mistral
El FiendoI'm sorry, but I can't afford the time to read 160+ replies, so if it's been mentioned, I apologize.

Unreal 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI and removing a feature; this is NVIDIA working with the game developers to add a feature that ATI never bothered with. Nothing stopped ATI from working with the game developers to enable AA via software. This isn't underhanded.
Just for the record, and since you didn't read the 160+ replies (which is completely understandable) - Batman:AA runs on Unreal 3.5
Posted on Reply
#166
newtekie1
Semi-Retired Folder
ValdezAh yes, and a bit later when physx becomes an industry standard, because ati has accepted nvidia's "generous" offer, the user looks at the graphs and every game that uses hw physx would show more fps on nvidia hw. It would work on ati hw too, but slower.

Development on ati side? What development? Nvidia would develop physx for ati hw if ati would accept it? LOL You're very naive.
There is really nothing to show that PhysX would run any worse on ATi hardware. While it might run worse in the future, I doubt that the hardware of the future will actually struggle.

If an HD3870 could stomp through PhysX with no issue back then, I doubt something like an HD5870 would struggle today. Or even something from the HD4800 series.
leonard_222003HAHAHA, are you crazy man, do you take us for some very stupid people?
It's a given that Nvidia would ask for some major licensing money if AMD/ATI wanted PhysX, it's a given they would try to make it run like shit on AMD's hardware and try to ruin them with renewed licensing contracts for this technology. It was never an option for AMD/ATI to use a technology bought by Nvidia from Ageia for who knows how many millions of dollars.
What you say is just plain stupid and you insult our intelligence; also, AMD/ATI said they were never called by Nvidia about PhysX.
Do a little research. There would have been no licensing fee for ATi; the only thing ATi would have had to do was support the development. The PhysX API, engine, and SDK are provided by nVidia, free of charge, to anyone that wants to use them. The hardware developer just has to provide hardware drivers that support it.

Again, nVidia was more than willing to help the developer get PhysX/CUDA running on ATi hardware, no licensing or fees involved at all. They were not going to do it themselves, but they were willing to help the developer that wanted to do it. The problem was that ATi refused to help in any way.
Posted on Reply
#167
El Fiendo
MistralJust for the record, and since you didn't read the 160+ replies (which is completely understandable) - Batman:AA runs on Unreal 3.5
Right, and through all my searching I can't find where it says AA is natively supported in 3 or 3.5. If it's added in by the developers and it's co-developed by NVIDIA, then there is no problem. Does anyone have the spec sheet that says UE3.5 has native AA in the engine?
Posted on Reply
#168
Valdez
newtekie1There is really nothing to show that PhysX would run any worse on ATi hardware. While it might run worse in the future, I doubt that the hardware of the future will actually struggle.

If an HD3870 could stomp through PhysX with no issue back then, I doubt something like an HD5870 would struggle today. Or even something from the HD4800 series.
What I'm talking about is not a hardware issue. If a hw physx title runs slower on ati hw, it's not because it is actually weaker hw. It is because the source code is in nv's hands, and they wouldn't let ati win. It's fully logical.
If I own a technology, then I own the tools to be the best under any circumstances.

I really can't explain myself better, my English isn't so good. But if you're right, and nvidia is as generous as you describe them, then it is time for them to port physx from cuda to opencl, and make the physx source code free for everyone.
Posted on Reply
#169
Unregistered
CrAsHnBuRnXpI like how only the Ati ppl are bitching about this.
I like how all the nvidia users are defending nvidia to the death :rolleyes: :laugh:
Posted on Reply
#170
newtekie1
Semi-Retired Folder
ValdezWhat I'm talking about is not a hardware issue. If a hw physx title runs slower on ati hw, it's not because it is actually weaker hw. It is because the source code is in nv's hands, and they wouldn't let ati win. It's fully logical.
If I own a technology, then I own the tools to be the best under any circumstances.

I really can't explain myself better, my English isn't so good. But if you're right, and nvidia is as generous as you describe them, then it is time for them to port physx from cuda to opencl, and make the physx source code free for everyone.
I get what you are saying, but what I'm saying is that there is really no way for nVidia to do this. PhysX takes so little GPU power to run that it wouldn't be feasible.

There are several things you have to consider. The fact is that an outside developer was the one doing the developing; he was just being assisted by nVidia after his initial breakthrough. They essentially were providing him with any documentation and development tools he needed.

Also, CUDA is designed by its nature to be hardware independent. Once the hardware vendor writes the driver to support CUDA, it will work. There really isn't a whole lot nVidia can do to make it perform worse on one over the other, and if they did, it would immediately send up red flags because the difference would be drastic.
Posted on Reply
#171
El Fiendo
InTeL-iNsIdEI like how all the nvidia users are defending nvidia to the death :rolleyes: :laugh:
Just like ATI users defend ATI against baseless claims.

I've looked and I can't find once where it says anti-aliasing is natively supported in any of Unreal Engine's current iterations. In fact, all I find are threads lamenting how UE3.x doesn't support AA at all unless done through hardware. That means NVIDIA paid extra money to get it put in, and it would be stupid of them to give it to ATI users too. Why? Because ATI isn't paying for it, NVIDIA is. They didn't remove a feature. They added a feature for their own market. ATI didn't follow suit and add AA for their market, and now they've been 'foul played'.

I find it odd that this Ian McNaughton guy is putting forward this half truth, and if I'm correct I've actually lost respect for ATI in this case because of it. Again, if anyone can prove otherwise (that UE3.5 supports AA and NVIDIA removed usage of AA for ATI instead of adding AA for their own buyers) then I'll retract my claims.

Until then it looks like NVIDIA actually did the gaming market a favor by adding AA, and is owed an apology by roughly 85% of this thread. I wouldn't bother waiting for an apology if I were them though.
Posted on Reply
#172
mR Yellow
Just wasted 1 hour reading this thread. :rolleyes:
One thing is for sure: nVidia is always involved in these shady tactics.
Posted on Reply
#174
Benetanegia
El FiendoJust like ATI users defend ATI against baseless claims.

I've looked and I can't find once where it says anti-aliasing is natively supported in any of Unreal Engine's current iterations. In fact, all I find are threads lamenting how UE3.x doesn't support AA at all unless done through hardware. That means NVIDIA paid extra money to get it put in, and it would be stupid of them to give it to ATI users too. Why? Because ATI isn't paying for it, NVIDIA is. They didn't remove a feature. They added a feature for their own market. ATI didn't follow suit and add AA for their market, and now they've been 'foul played'.

I find it odd that this Ian McNaughton guy is putting forward this half truth, and if I'm correct I've actually lost respect for ATI in this case because of it. Again, if anyone can prove otherwise (that UE3.5 supports AA and NVIDIA removed usage of AA for ATI instead of adding AA for their own buyers) then I'll retract my claims.

Until then it looks like NVIDIA actually did the gaming market a favor by adding AA, and is owed an apology by roughly 85% of this thread. I wouldn't bother waiting for an apology if I were them though.
I agree, but it's even worse IMO. From what I read, they discovered all this after the game had launched!!! That means they had no contact with the developer at all! I mean, if you are a GPU maker, don't you contact developers and try to optimize before launch, or at least start working on the optimization of the full game before it launches? Don't you ask for a copy? IMO, if they cared so little about the game that they didn't even contact them, AMD deserves every bit of unoptimized code they get. Especially since it comes from a feature that was never there before and was developed for Nvidia at their request, paid for with their money. The fact that the optimization works on Ati cards as well changes nothing IMO. If I were the developer, I would have done the same.
Posted on Reply
#175
El Fiendo
Well, just looking at the history of ATI driver releases: almost every game that comes out gets a patch awhile after the fact, and continuously so. I'd say ATI has a more reactionary approach when it comes to supporting games, rather than a proactive one.

What doesn't make sense to me is why everyone was so ready to jump down NVIDIA's throat. And seriously, hear me out on this one. There are shit tons of games that are 'TWIMTBP' and have in game AA for ATI. Why would they cock block ATI on this game alone? This reeks more of ATI not supporting the game out of the gate, like most games that get emergency patches from them, than it does anything else.

Even the ATI fanboys should have looked at this one with a grain of salt.
Posted on Reply