Tuesday, September 29th 2009

Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

Anti-aliasing has long been one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as something of an industry standard, and game developers echo them by integrating anti-aliasing (AA) features into the game as part of its engine. This lets the game apply AA selectively to parts of the 3D scene, so the overall image quality improves while performance is preserved, since not every object in the scene is given AA. It turns out that one of the most heavily marketed games of the year, Batman: Arkham Asylum, doesn't like to work with ATI Radeon graphics cards when it comes to its in-game AA implementation.
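To make that performance point concrete, here is a rough Direct3D 9 sketch of how an engine can keep AA selective. It is purely illustrative (it is not the game's actual code, and the function and parameter names are invented for the example): only the main scene pass pays the multisample cost, while the HUD and post-processing passes skip it.

#include <d3d9.h>

// Illustrative sketch of engine-controlled ("in-game") AA under Direct3D 9:
// only the main scene pass renders into a 4x multisampled target; everything
// drawn afterwards (HUD, post effects) avoids the MSAA cost entirely.
// Assumes 'device', 'backBuffer', 'width' and 'height' come from the app's
// normal device setup; a matching multisampled depth-stencil surface and all
// error handling are omitted for brevity.
void RenderFrameWithSelectiveAA(IDirect3DDevice9* device,
                                IDirect3DSurface9* backBuffer,
                                UINT width, UINT height)
{
    IDirect3DSurface9* msaaTarget = nullptr;
    device->CreateRenderTarget(width, height, D3DFMT_A8R8G8B8,
                               D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                               &msaaTarget, nullptr);

    // 1) Scene geometry goes into the multisampled surface.
    device->SetRenderTarget(0, msaaTarget);
    device->Clear(0, nullptr, D3DCLEAR_TARGET, 0, 1.0f, 0);
    // ... draw world geometry here ...

    // 2) Resolve the MSAA surface into the plain back buffer.
    device->StretchRect(msaaTarget, nullptr, backBuffer, nullptr, D3DTEXF_NONE);

    // 3) HUD and post-processing render straight to the back buffer, with no AA cost.
    device->SetRenderTarget(0, backBuffer);
    // ... draw UI / post passes here ...

    msaaTarget->Release();
}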

Developed under NVIDIA's The Way It's Meant to Be Played program, and featuring NVIDIA's PhysX technology, the game disables in-game AA when its launcher detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton, in a recent blog post, said the company confirmed this with an experiment in which it ran ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He further adds that the same trick is not possible with the retail game because of its SecuROM copy protection.
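What McNaughton describes amounts to keying a menu option off the GPU's reported PCI vendor ID, which is why spoofing the ID in the demo was enough to bring the option back. A hypothetical Direct3D 9 sketch of such a check follows; it is illustrative only and not taken from the game, and the function name is invented for the example.

#include <d3d9.h>

// Hypothetical illustration of gating a menu option on the PCI vendor ID.
// 0x10DE is NVIDIA's vendor ID, 0x1002 is ATI/AMD's. If an application only
// checks the reported ID, changing that ID (as AMD did with the demo) defeats it.
bool OfferInGameAAOption()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return false;

    D3DADAPTER_IDENTIFIER9 ident = {};
    bool isNvidia = false;
    if (SUCCEEDED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
        isNvidia = (ident.VendorId == 0x10DE);

    d3d->Release();
    return isNvidia;   // on anything else, the AA option is simply never shown
}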

With no in-game AA available to ATI Radeon users, although the features do technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to apply AA to every 3D object in the scene, reducing performance compared to using the game's in-game AA engine. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.

353 Comments on Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

#101
Sihastru
The Unreal Engine 3.5 can only use AA modes if it's under Vista/DX10+. Because of the frickin' consoles, the game is mostly a DX9 game. In DX9 compatibility mode, UE cannot use AA because of the shadowing algorithm that is incompatible with current AA modes.

An important point for ATI DX10.1 cards was an interesting way of producing soft shadows (something nVidia doesn't have in its DX10 implementation). Could this be the problem?
Posted on Reply
#102
wiak
AnimalpakWell, tell ATI to invest more in the development and refinement of their drivers.

The biggest problem is the fact that ATI drivers have always been poor; if they weren't, at this point they would be on par with nvidia or even better.

ATI GPUs have tremendous computing power, but they are too lazy to develop drivers able to exploit it.

I would say stop the childish fanboyism, "i hate nvidia" etc. ...
might want to re-evaluate that one, dude
search google for "NVIDIA Vista Driver FAIL" and you might find a billion pages
www.google.com/search?hl=en&q=nvidia+vista++fail

given the fact that ATI's Catalyst drivers have been released every single month since January 2004, and they even release hotfixes and everything. btw, did you know that ATi has had both DX10.1 and DX11 hardware long before nvidia?

but everyone does have to agree that Intel's Crapstics drivers suck, don't they? compared to ATI and NVIDIA :p
get an ATI card and try again with this bad driver bitching
Posted on Reply
#103
newtekie1
Semi-Retired Folder
tkpenaltyVery conservative there only being so mindful for the corporations when the wealth will never get to you. In the end the consumer loses.
Not really. If it leads to a more playable game for the consumer, then I hardly call that a loss. If a large number of consumers were having stability issues making the game completely unplayable for them, and their issues were fixed with little effect on other consumers' ability to play the game, I don't consider that an overall loss for the consumer.
Imsochobo
AC didnt crash for me or any of my friends. DX10.1
There was no problem, review sites didnt have issues either.

The fact that you support paying of game devs for other cards to be bad is just unbearable.

Way its meant to be played is perfectly fine if the FACT that the game ran as it should, and not with intentional crippled performance like its proven...
ATi does support game devs, and give videocards to them so they can check if it works, and support them with documentation and alike, nvidia's strategy is bigger, but they also bribes as it seems like with the result in some TWIMTBP games.
It doesn't matter that your small group of friends didn't have an issue. I didn't have an issue with the game either, but we know for a fact that there were major issues with nVidia hardware at least, and 2 of my machines were running nVidia hardware at the time, one of which was my main machine that I played the majority of the game on. When you get a sample size in the millions with no issues, then your argument will be sound, but until then, you have no clue if there were widespread issues with DX10.1; the only people who know that are the ones working at Ubisoft.

I support nVidia putting money into better development of games for their cards. Which is exactly what is happening here. Again, there is just as much evidence that nVidia paid entirely to have the feature added to the game for their cards as there is to say that the feature was already there, and nVidia just paid to have it removed for ATi cards. Either scenario is just as plausible given what we know so far. The only difference is one makes nVidia out to be the bad guy, and one doesn't, so you really just have to pick if you want to give nVidia a bad name or not. Personally, I prefer to give everyone the benefit of the doubt and go with the scenario that makes them look best.
Posted on Reply
#104
Imsochobo
wiakmight want to re evaluate that one dude
search google for "NVIDIA Vista Driver FAIL" and you might find a billion pages
www.google.com/search?hl=en&q=nvidia+vista++fail

given the fact that ATI's Catalyst drivers has been released every singel month since January 2004, they even release hotfixes and everything, btw did you know that ATi has had both DX10.1 and DX11 hardware long before nvidia?
Totally back the drivers up! Not an issue except 9.2, which could not be upgraded.
Only issue since 2007 for me.
I agree about the X850XT PE, though. Niiiiiiiiiiiiightmare.
Posted on Reply
#105
Mistral
SihastruAgain, does it work on an Intel GPU? A test made in-house by...
This isn't even worth joking about at the moment...

In any case, I have both an ATI and an nVidia rig, and I'll be picking up the game once the price drops a bit and I actually have time to play it. Who knows, by then a patch might actually "fix" the AA issue.
Posted on Reply
#106
btarunr
Editor & Senior Moderator
Musselsdespite your trying to use rational logic, you umm, failed.
en.wikipedia.org/wiki/Batman_Arkham_Asylum
img.techpowerup.org/090929/.jpg

note the "unreal engine" ? you see, this game was made from an existing game engine, well known to work on ATI and NVIDIA hardware with antialiasing.
Yes, Unreal Engine 3's AA is proven to work stably on AMD GPUs. Thanks for cementing my argument.
Posted on Reply
#107
newtekie1
Semi-Retired Folder
wiakmight want to re evaluate that one dude
search google for "NVIDIA Vista Driver FAIL" and you might find a billion pages
www.google.com/search?hl=en&q=nvidia+vista++fail

given the fact that ATI's Catalyst drivers has been released every singel month since January 2004, they even release hotfixes and everything, btw did you know that ATi has had both DX10.1 and DX11 hardware long before nvidia?

but every one do have to agree that Intel's Crapstics drivers suck, dont they? compared to ATI and NVIDIA :p
get a ATI card and try again with this bad driver bitching
www.googlefight.com/index.php?lang=en_GB&word1=ATi+Crash&word2=nVidia+crash
www.googlefight.com/index.php?lang=en_GB&word1=ATi+driver+problem&word2=nVidia+driver+problem
www.googlefight.com/index.php?lang=en_GB&word1=ATi+driver+vista+fail&word2=nVidia+driver+vista+fail

It all depends on what you search for. Both sides have driver issues, neither is perfect. NVidia had an issue early on with Vista, which is likely why there are so many hits when you search for it.

However, currently, both sides put out very good drivers on a consistent basis. So really, the whole "X has better drivers than Y" argument should stop, because in the present it is hard to pick which is better, and if you look at the past, both have had some pretty rocky times.
Posted on Reply
#108
Imsochobo
newtekie1I support nVidia putting money into better developement of games for their cards..
Notice- Not full quote!

Well, I wonder what the future will look like if both companies throw mouthfuls of money at crippling the other's cards' performance:
Start up game:
Play, get tired of it.
Want to play a new one.
Shutdown, change videocard.
Power on, start game.
Play.
....
..

It should be about which card is best made, gives the best performance per buck, or is just the best card on the damn planet, like the MARS, cause someone just likes the big e-peeen!
Imagine being at a LAN with some friends: you wanna play a game, and you get a disadvantage cause you have nvidia, so you whine, whine, whine; then you guys start playing another game, and they get a disadvantage and you an advantage.
It should be supported on all cards; general support is the best way. I don't care if it's 10% faster on an nvidia card, I care if it's 20% slower and lacking features that really do work without any quirks on ati cards but are intentionally disabled cause someone paid for it to be like that.
Posted on Reply
#109
jaredpace
With batman, they left an option in the settings menu for "optimized nv AA". This is just MSAA (an efficient method of smoothing jaggies) that both ati and nvidia can do. The issue was that if the game detected an ati card, that option was not available. The result was that the older, standard method of smoothing jaggies (regular AA), which is much less efficient, became the method by which ati cards had to render AA in batman. That meant that Batman with AA enabled gave ati cards much worse framerates than Nvidia cards, because the nv cards were using fast MSAA and the ati cards were using old-ass slow regular AA. Same thing going on with NFS Shift.
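For what it's worth, the claim that MSAA is a baseline, vendor-neutral capability is easy to sanity-check against the API itself. A minimal Direct3D 9 sketch (illustrative only, not taken from the game) that asks the default adapter whether it supports 4x MSAA, which DX9-class cards from both ATI and NVIDIA report:

#include <cstdio>
#include <d3d9.h>

// Sketch: ask the default adapter whether it supports 4x MSAA for a typical
// back-buffer format. Multisampling is an API capability, not a vendor feature.
int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A8R8G8B8, FALSE,                 // full-screen back buffer
        D3DMULTISAMPLE_4_SAMPLES, &quality);

    if (SUCCEEDED(hr))
        std::printf("4x MSAA supported, %lu quality level(s)\n", quality);
    else
        std::printf("4x MSAA not supported on this adapter\n");

    d3d->Release();
    return 0;
}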

For physx, Eidos and rocksteady took certain special effects of the game and packaged them to be rendered via Cuda on the nv gpus. These effects only work on geforce 8800 and higher (along with the MSAA). However, these effects (fog, smoke, sparks, particle shimmer, cloth, mattresses, destructible tiles, flying papers, aluminum cans, garbage, spiderwebs, destructible elements, haze, etc.) can also all be rendered using an Ati card. They were just "removed" when cuda + nv is not detected, since they are part of the cuda package. If you check rage3d, beyond3d or youtube you can see people with ATI cards + Core i7's running batman using MSAA and all the "physx" effects (because they edited the physx code and tricked the game into thinking it had an nvidia card).

Nvidia would love to control the usage of these effects because it makes the game more immersive and appealing to users of their own hardware, while decreasing the "coolness" of the game on ati hardware. The sad part is that if you know what you're doing, a few lines of code will have your ati card running perfect MSAA and your Core i7 running all those fancy special effects in about 5 minutes, and probably at better framerates than nvidia (if you have a 5800 series). The really sad part is that the more they do this, and get away with it, the farther apart technological competition becomes. The ati cards are already very powerful on the hardware level compared to G80/GT200. With their talented engineers and hardware design team, it's a shame that ati isn't as effective with developer relations, driver programming, and aggressive business practices.
Posted on Reply
#110
Imsochobo
If ati isn't aggressive, I don't know who is.

Pushing out hardware at very low prices; atm demand is controlled by prices, and the cards are already in such demand that I can't find them in stock at all. They were there, and they were gone straight away.
Pushing all this out is an aggressive move against nvidia; they are aggressive, but in the way it should be.
Just like Intel and amd used to do.
Just like ati and nvidia used to do, before GF8xxx came and everything started to fall apart rapidly!

ATi is aggressive in pointing out flaws and prices, and in bringing out new products fast and with big improvements. This is also how nvidia did things in the past, and they rocked! Now they're pushing software like they're Microsoft.
Posted on Reply
#111
newtekie1
Semi-Retired Folder
btarunrSo much for misinformed arguments. AMD tested the in-game AA, and it worked. So regardless of this AA implementation being a standard between NVIDIA and AMD, it works, and was yet disabled for ATI hardware.
btarunrThat really isn't a problem. Whether the feature 'works' or not on the given hardware is all that matters, and it does. Stability issues cannot be used as an excuse to completely remove the feature. If stability issues did exist, they should have left the feature available to everyone and worked on them. Besides, the game does not advertise that its AA features don't work on ATI hardware (or that it requires NVIDIA hardware for AA, just like it properly advertises PhysX).
Musselsdespite your trying to use rational logic, you umm, failed.
en.wikipedia.org/wiki/Batman_Arkham_Asylum
img.techpowerup.org/090929/.jpg

note the "unreal engine" ? you see, this game was made from an existing game engine, well known to work on ATI and NVIDIA hardware with antialiasing.

you can also rename the games .exe to UE3.exe from what i hear, and then use catalyst control centers AA (even before the patch) and everything works well.

This is purely a dirty trick from nvidia, since NV only add AA to some things in game, while ATI now has to waste power antialiasing EVERYTHING (taking a performance hit) and inconveniencing end users.




Indeed. and a default feature of the engine used.



indeed. i went RE5 over this, due to this lame issue.



unfortunately, the lack of AA will never make it into enough news to hinder sales that much.
btarunrYes, Unreal Engine 3's AA is proven to work stable on AMD GPUs. Thanks for cementing my argument.
Why is it so difficult to understand that this isn't just normal AA? Of course AA works in the game; it can be forced via the control panel, and could always be forced via the control panel. In fact, isn't that what we all do when AA isn't an option in a game? And yes, there are still games released without AA as an option.

But why is it hard to understand that this isn't traditional AA? BTA, you out of everyone should make sure you understand the concept, you are reporting it. It lowers your credibility to report such crap, and make such statements.

This is optimised AA! Done in a way to limit performance loss to next to nothing. This is not a standard feature. This is not a feature that exists in the Unreal Engine by default.

Yes, AA works, but not the AA that is used in the game! That is the difference, BTA. The AA forced in CCC is obviously different, as it comes with a performance hit, unlike the in-game AA setting. So while the effect might be very similar, it is two different features.

And you cannot confirm that changing the device ID to trick it into working in game really does function properly, as it was not tested in the actual game. Again, is it not likely that a part of the game causes an issue with the feature on ATi cards, and the developers simply disabled the feature as a quick fix to get the game shipped? I mean the game was already delayed a month on the PC, so we know the developers were under a time crunch to get it shipped...so maybe in the end they did start to implement quick fixes. Is that so far-fetched?
wahdangunthen they don't deserve our money (using reasoning no.1)

do you have a solid proof that the game will crashes halfway through(using reasoning 2).


so i say ATI owner card must boycott this game and rate it so low in every on-line store. so they will suffer 40% loss from us ATI owner.
Why not? You got a game. The game isn't any less playable.

Does anyone have any solid proof that it won't crash halfway through? Even ATi said they only tested the demo. I do know that I've encountered games, even recently, that would suffer unexplained crashes or have unexpected and unwanted issues caused by visual features enabled in the game, or driver issues. How many times have we seen "update your drivers" as a response when someone is having a game-crashing issue? Just as an example: Prototype crashes on my GTX285 if I have AA enabled in the game menu, but works fine on my HD4890 or if I force AA using the nVidia control panel. And Prototype just came out a few months ago!
Posted on Reply
#112
Imsochobo
newtekie1Why is it so difficult to understand that this isn't just normal AA?
This is optimised AA! Done in a way to limit performance loss to next to nothing. This is not a standard feature. This is not a feature that exists in the Unreal Engine by default.
They proved it by changing the ID of the videocard, and that it bumped the performance.
It's proven that PhysX runs fine without a GPU.

Btw, Prototype is a quick port; it works about as well as GTA4, which is terrible. No matter the make, blame the devs there, but no features were blocked, though.
Posted on Reply
#113
Mussels
Freshwater Moderator
newtekie1Why is it so difficult to understand that this isn't just normal AA? Of course AA works in the game, it can be forced via the control panel, and could always be forced via the control panel. In fact, isn't that what we all do when AA isn't an option in a game? And yes, there are still games released without AA as an option.

But why it is hard to understand that this isn't traditional AA? BTA, you out of everyone should make sure you understand the concept, you are reporting it. It lowers your credibility to report such crap, and make such statements.

This is optimised AA! Done in a way to limit performance loss to next to nothing. This is not a standard feature. This is not a feature that exists in the Unreal Engine by default.

Yes, AA works, but not the AA that is used in the game! That is the difference BTA. The AA forced in CCC is obviously different, as it comes with a performance hit, unlike the in game AA setting. So while the effect might be very similar, it is two different features.

And you can not confirm that changing the device ID to trick it to work in game, really does function properly, as it was not tested in the actual game.
our point is simple: the game turns the AA settings off instead of taking the other options a normal developer would.

A: leave the option in game, and ATI has worse performance (but can be tweaked via drivers)
B: Work with ATI prior to game release, giving them the same advantages as nvidia
C: disable the setting, sweep it under the rug, make the userbase who want things to "just work" run it on nvidia cards

Most games go with A: the good ones go with B: - this game went with C:


where you're going wrong newtekie is that you're thinking the in-game AA is some special super duper thing they cooked up. it's not. games have used their own in-game AA for as long as in-game AA options have been, well, in-game options. they can say "oh, only AA stuff close to the camera" or "ignore things on this level, it hurt performance badly with all the action"

the other assumption you appear to be making is that "ATI get AA, nvidia get faster AA" - not the case. ATI didn't get shit, until this was made a big issue, and the game got patched. NO AA AT ALL.


You're taking the "nvidia can do what they want" approach, but i bet if games came out and said "no AA for nvidia, only ATI" you'd be telling a different story.

remember that to even force the AA in the game via CCC, it took hardware ID hacking, public shaming, .exe renaming, and finally a game patch - and that's with an unnecessary performance hit!
Posted on Reply
#114
newtekie1
Semi-Retired Folder
ImsochoboThey proved it by changing ID of the videocard, that it bumped the performance.
Its proven that Physx runs fine without a GPU.

Btw, prototype is a quick port, work just as good as GTA4. which is terrible.
They proved it worked in the demo. Which is all of 15 minutes of the actual game. There is a lot more to the game than just what was in the demo, and any part of the game could have been giving them problems.

And what do you think Batman is? What do you think the extra month was for? Porting it to the PC and adding PhysX...
Musselsour point is simple: the game turns the AA settings off instead of the other options a normal developer would do.

A: leave the option in game, and ATI has worse performance (but can be tweaked via drivers)
B: Work with ATI prior to game release, giving them the same advantages as nvidia
C: disable the setting, sweep it under the rug, make the userbase who want things to "just work" run it on nviida cards

Most games go with A: the good ones go with B: - this game went with C:


where you're going wrong newtekie is that you're thinking the in game AA Is some special super duper thing they cooked up. its not. games have used their own in game AA for as long as in game AA options have been, well, in game options. they can say "oh, only AA stuff close to the camera" or "ignore things on this level, it hurt performance badly with all the action"

the other assumption you appear to be making is that "ATI get AA, nvidia get faster AA" - not hte case. ATI didnt get shit, until this was made a big issue, and the game got patched. NO AA AT ALL.


You're taking the "nvidia can do what they want" approach, but i bet if games came out and said "no AA for nvidia, only ATI" you'd be saying a different story.

remember that to even force the AA in the game via CCC, it took hardware ID hacking, public shaming .exe renaming, and finally a game patch - and thats with an un-neccesary performance hit!
A: It would have to be a different setting.
B: When ATi starts paying for developer time, then this becomes viable; until then nVidia will get more dev time than ATi.
C: Seems like a good option for a time-crunched game.

And where you and everyone else seem to be going wrong is that you don't understand that the in-game AA used in Batman isn't normal AA. It is optimized to give next to no performance loss. When have you seen that? That is "super-duper" IMO.
Posted on Reply
#115
Imsochobo
newtekie1They proved it worked in the demo. Which is all of 15 Minutes of the actual game. There is a lot more to the game then just what was in the demo, and any part of the game could have been giving then problems.

And what do you think Batman is? What do you think the extra month was for? Porting it to the PC and adding PhysX...
YES!

The engine is already a PC engine; no port needed except reworking textures, models, and maps, and adding physx, for the most part.
Posted on Reply
#116
Mussels
Freshwater Moderator
newtekie1And what do you think Batman is? What do you think the extra month was for? Porting it to the PC and adding PhysX...
you've been asking for evidence of everyone else's unsubstantiated claims, where is yours for this? how do you know this month wasn't spent making the game "better" for their sponsor?

it's hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count) - yet you can make up claims like that.
ImsochoboYES!

The engine is already a PC engine, no port needed except smash textures models maps and add physx for the most part.
that's what i was aware of too. with compatible engines, they'd only have bug fixes to do (and finding ways to make physx look like it's doing something, since the game was designed without it)
Posted on Reply
#117
AphexDreamer
Come on guys, this shouldn't even be an argument...

There is no justifying what Nvidia did, and we ATI users should be used to this kind of treatment by now. If Nvidia wants to use cheap methods to trick consumers into thinking their card is better, then fine, I say let them. ATI is doing just fine regardless, and all the wiser people out there will always be a little more educated and know the truth.
Posted on Reply
#118
mdm-adph
ImsochoboYES!

The engine is already a PC engine, no port needed except smash textures models maps and add physx for the most part.
Musselsyou've been asking for evidence of everyone elses unsubstantiated claims, where is yours for this? how do you know this month wasnt spent making the game "better" for their sponsor?

its hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count) - yet you can make up claims like that.

thats what i was aware of too. with compatible engines, they'd only have bug fixes to do (and finding ways to make physx look like its doing something, since the game was designed without it)
AphexDreamerCome on guys, this shouldn't even be an argument...

There is no justifying what Nvidia did and we ATI users should be used to this kind of treatment by now. If Nvidia wants to use cheap methods to trick the consumers into thinking their card is better then fine, I say let them. ATI is doing just fine regardless and all the wiser people out there will always know and always be a little more educated to know the truth.
Nope... you're all crazy! Don't you see what Nvidia was doing here?!? They were being magnanimous -- uh... they were looking out for the poor ATI player!

By doing this, they were, uh, improving the game experience for ATI users! How nice of them!!

Wait, that... totally doesn't make any sense at all. :wtf:
Posted on Reply
#119
cauby
Boo-hoo to Batman, DC Comics and Nvidia...

I'm playing Spiderman, enough of these troubles!
Posted on Reply
#120
the_wolf88
Comic book games are so... c**p :D I dont miss Batman at all :)

"With no in-game AA available to ATI Radeon users, although the features do technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center."
Problem solved :toast:
No it's not !!

you should read the full line..

With no in-game AA available to ATI Radeon users, although the features do technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to use AA on every 3D object in the scene, reducing performance, compared to if the game's in-game AA engine is used.

Performance drops a lot !!!

Damn you Nvidia :mad:
Posted on Reply
#121
AphexDreamer
mdm-adphNope... you're all crazy! Don't you see what Nvidia was doing here?!? They were being magnanimous -- uh... they were looking out for the poor ATI player!

By doing this, they were, uh, improving the game experience for ATI users! How nice of them!!

Wait, that... totally doesn't make any sense at all. :wtf:
How am I crazy? I think we're on the same side here lol.
Posted on Reply
#122
newtekie1
Semi-Retired Folder
ImsochoboYES!

The engine is already a PC engine, no port needed except smash textures models maps and add physx for the most part.
I guess you are right here, the extra time was to add features.
Musselsyou've been asking for evidence of everyone elses unsubstantiated claims, where is yours for this? how do you know this month wasnt spent making the game "better" for their sponsor?
www.actiontrip.com/rei/comments_news.phtml?id=080609_9

There you go. Article explaining the game was delayed to add PhysX.
Musselsits hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count) - yet you can make up claims like that.




thats what i was aware of too. with compatible engines, they'd only have bug fixes to do (and finding ways to make physx look like its doing something, since the game was designed without it)
It's not really hypocritical, as when I'm asked for it, I provide it.

And yes, it is the same engine, I get that, we all do. However, that doesn't mean it will work in every single part of the game. That is my point. Testing it in the demo is one step, but testing it in the real game, playing completely through, is another.
Posted on Reply
#123
PEPE3D
Batman AA and ATI

I tried to play this game. It plays well with nice graphics, but it crashes a lot. I am angry that it may be due to the fact that I have two ATI cards in CF (4870x2 2GB GDDR5). Great cards. I can pretty much play any game maxed out at 1900x1200. But this game is different. I have to play it at a low resolution or the game won't play at all. I am very annoyed. If I had known this, I would not have spent the money on this game. Also, I think it is time for developers to start thinking about us. We are the ones that buy the games regardless of what brand of GPU we have in our pc. We should start to look at the reviews of a game before it comes out, and if by any chance the developer suggests that it will play better with NVIDIA or ATI, "we" the customers should boycott the game. We have the power; they need us, they need our money. Therefore, there should be no preference for what GPU you have. It should play great regardless. These are bad business practices and someone should do something about it. GAMES are not cheap!
Posted on Reply
#124
newtekie1
Semi-Retired Folder
If you plan to boycott every game that plays better on one over the other, don't expect to be playing any games. They all favor one over the other, that is just how it is. However, they all also run far beyond acceptably on both, regardless of who they favor.

You are never going to have a game that is playable on one, and completely unplayable on the other. You might have to lower a setting or two on one or the other, but really all games tend to be very similar on both when using comparable cards.
Posted on Reply
#125
Bjorn_Of_Iceland
Meh.. I feel the Batman: Arkham devs are just lazy/out of budget and feared that the engine would need massive code revamps / time consumed if they optimized the game for both ati and nvidia.

It's the same thing we feel sometimes when we make web apps at work.. as long as it runs on IE, it's good for deployment. "To hell with firefox, chrome, etc.. go-live is tomorrow. Just hide the link and let's do a workaround until we find a long-term solution/fix (that will never see the sunlight)"
Posted on Reply