# Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs



## btarunr (Sep 29, 2009)

Anti-aliasing (AA) has long been one of the most basic image-quality enhancements available in games. PC graphics hardware manufacturers treat it as an industry standard, and game developers follow suit by integrating AA into their engines. In-engine AA lets the game apply AA selectively to parts of the 3D scene, so overall image quality improves while performance is preserved, since not every object in the scene receives AA. Yet one of the most heavily marketed games of the year, Batman: Arkham Asylum, refuses to play nice with ATI Radeon graphics cards when it comes to its in-game AA implementation.

Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that AMD confirmed this with an experiment in which ATI Radeon hardware was run under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He adds that the same trick is not possible in the retail game because of its SecuROM copy protection.
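The gating AMD describes can be sketched in a few lines. This is hypothetical logic (the launcher's actual code is not public); only the PCI vendor IDs are real facts (0x10DE is NVIDIA, 0x1002 is ATI/AMD):

```python
# Hypothetical sketch of the vendor-ID gating AMD describes. The real
# launcher's logic is not public; only the PCI vendor IDs below are real
# (0x10DE = NVIDIA, 0x1002 = ATI/AMD).

NVIDIA_VENDOR_ID = 0x10DE
ATI_VENDOR_ID = 0x1002

def ingame_aa_available(vendor_id: int) -> bool:
    """Would the launcher expose the in-game AA option for this GPU?"""
    return vendor_id == NVIDIA_VENDOR_ID

# An ATI card reporting its true ID loses the option...
assert not ingame_aa_available(ATI_VENDOR_ID)
# ...while a spoofed ID gets it back, which is how AMD showed the
# hardware itself was never the limitation.
assert ingame_aa_available(NVIDIA_VENDOR_ID)
```

Spoofing the reported device ID, as AMD did with the demo, is exactly the second case: the hardware is unchanged, only the ID check sees something different.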



With no in-game AA available to ATI Radeon users, even though the feature does technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to apply AA to every 3D object in the scene, reducing performance compared to the game's in-game AA implementation. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
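The cost gap between the two approaches can be modeled with a toy calculation (all numbers invented, purely illustrative): the driver multisamples every object, while the engine can skip surfaces where aliasing is invisible.

```python
# Toy cost model (invented numbers) of why forcing AA in the driver
# costs more than selective in-engine AA: the driver multisamples every
# object, while the engine only multisamples objects it flags.

AA_OVERHEAD = 1.5  # hypothetical per-object multisampling cost factor

def frame_cost(objects, driver_forced):
    """objects: list of (base_render_cost, engine_wants_aa) tuples."""
    total = 0.0
    for base_cost, wants_aa in objects:
        aa_applied = driver_forced or wants_aa
        total += base_cost * (AA_OVERHEAD if aa_applied else 1.0)
    return total

# Scene where the engine flags only half the objects for AA:
scene = [(1.0, True), (1.0, False), (1.0, True), (1.0, False)]
assert frame_cost(scene, driver_forced=True) > frame_cost(scene, driver_forced=False)
```

The same scene always renders cheaper with selective AA, which is why driver-forced AA in CCC puts Radeon users at an artificial disadvantage in benchmarks.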

*View at TechPowerUp Main Site*


----------



## Atnevon (Sep 29, 2009)

See, this is why I hate Nvidia so much. First it was the bloated prices on their cards, and now it's only their in-deep relationships with developers keeping them afloat.

Yes, you have to trick the game if you're on ATi, but hey, 5870, you're sure looking sexy now compared to Nvidia's... oh, that's right, you're still behind.


----------



## theorw (Sep 29, 2009)

WOW low blow from nvidia...
They've fallen so low. Like the 58xx series would have a real problem handling the driver-level AA via CCC that's a bit more performance hungry. Nvidia should work on its 300-series chips instead of taking nasty shots at AMD :shadedshu :shadedshu


----------



## HalfAHertz (Sep 29, 2009)

Maybe it's just a bug that will be resolved in the next patch


----------



## entropy13 (Sep 29, 2009)

HalfAHertz said:

> Maybe it's just a bug that will be resolved in the next patch



A bug that only appears with ATi cards? Right...:shadedshu


----------



## INSTG8R (Sep 29, 2009)

Dirty trick... Makes me wonder if it's this same sort of tactic that's making ATI cards run like junk in NFS Shift...


----------



## Imsochobo (Sep 29, 2009)

Nvfail again.

Seems like they do that a lot lately.


----------



## REVHEAD (Sep 29, 2009)

This is very gay. Oh well, I wasn't going to get that POS port of a game anyway.


----------



## csendesmark (Sep 29, 2009)

Comic book games are so... c**p. I don't miss Batman at all.

"With no in-game AA available to ATI Radeon users, although the features do technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center."
Problem solved


----------



## DaedalusHelios (Sep 29, 2009)

You help pay for a game's creation and you want something in return? That's crazy, Nvidia must have used magic. 

There are games optimized to do well with Nvidia drivers and vice versa. It's nothing new; it's why they throw their hat in the ring to help out with development and funding. It's what they get in return. Not like it's the best game of the year. It probably sucks. But its sales are supposed to be good, so idk.

No reason to act "butthurt", you guys.


----------



## Paintface (Sep 29, 2009)

A reminder why I use ATI cards, and not just for the great price/performance ratio and drivers.

Mostly it's the business ethics: this and the Need for Speed: Shift / Assassin's Creed debacles show what kind of company Nvidia is. Instead of pouring all the profit they make into actually better cards and drivers, they literally bribe gaming companies.

I'm all for optimizations, but make it fair for all. Disabling AA on purpose? Removing perfectly working DX10.1? Ignoring ATI's requests for bug fixing?

Not only do I avoid buying any Nvidia hardware, I haven't bought and never will buy any of those games. ATI and the computer news media have to take an example from TPU and be more vocal about these things before they get out of hand.


----------



## WarEagleAU (Sep 29, 2009)

I am kind of disappointed, but really, can we blame Nvidia for this? Doesn't the responsibility ultimately lie with the game's creators, coders, developers, and programmers? I don't know, but this really makes me mad at Nvidia right now. Of all the TWIMTBP games out there, this has never happened before, which makes me suspicious.


----------



## kenkickr (Sep 29, 2009)

The game is so dark, and it seems most gamers are going to be using Detective Mode, so I don't see a lot of "oohs and aahs" during gameplay or much reason to even enable AA and AF. Besides, I think the game would perform better enabling AA in the game instead of in CCC. From the get-go Nvidia has been all over this game, and when they decided to not even update the Ageia driver and set up their drivers to disable PhysX when a "non-Nvidia" card is detected... I decided to say FUCK BATMAN and NVIDIA!!


----------



## DaedalusHelios (Sep 29, 2009)

ATi could just make a driver profile to do the same thing with better drivers, assuming they care and are capable. Or a patch for the game could fix it, etc. It wouldn't be hard, given that a guy figured out how to get around it when the game had been out less than a month. 

It's no big deal; just don't buy the game in protest if you own an ATi card.

PS: Wouldn't it be funny if the game were bundled with ATi cards and then they had to fix it?


----------



## HalfAHertz (Sep 29, 2009)

I played the demo; the fighting was amazingly addictive. Plus the game ran like a dream on my ATi-powered lappy, so as soon as it drops in price and I see it around, I'd totally pick it up.


----------



## 3870x2 (Sep 29, 2009)

REVHEAD said:


> This is very Gay, Oh well I wasnt going to get that pos Port of a game anyway.



Remember that this was not a port; the releases were simultaneous, and it was created for the PC just as much as it was created for the consoles. The game is top-notch on the PC, with very good reviews matching those of the console versions.


----------



## mikek75 (Sep 29, 2009)

I'm a long-time ATI user, but all this malarkey with TWIMTBP games finally got to me and I bought a GTX 260 for a great price. It arrived yesterday and lasted precisely two hours before the DVI output stopped displaying!

Funnily enough, it came with a bundled free download of Batman: AA, but the card didn't even last long enough for me to get round to downloading it, LOL. 

I will say, though, SHIFT ran very smoothly, albeit with occasional lurches; FPS was stable but the frame seemed to jump. Weird.


----------



## DaedalusHelios (Sep 29, 2009)

mikek75 said:


> I'm a long time ATI user, but all this malarky with TWIMTBP games finally got to me and I bought a GTX260 for a great price. It arrived yesterday and lasted precisely 2 hours before the DVI output stopped displaying!
> 
> Funnily enough it came with a bundled free download of Batman AA but the card didn't even last long enough to get round to DLing it, LOL.
> 
> I will say though, SHIFT ran very smoothly, albeit with occasional lurches, FPS was stable but the frame seemed to jump. Weird.



RMA it. Probably the fault of the board maker and not the GPU itself.


----------



## Bl4ck (Sep 29, 2009)

It's funny because the Xbox 360 has the Xenos chip inside (made by ATI) to start with :F


----------



## mdm-adph (Sep 29, 2009)

DaedalusHelios said:


> You help to pay for a games creation and you want something in return? Thats crazy, Nvidia must have used magic.
> 
> There are games optimized to do well with Nvidia drivers and vice versa. Its nothing new, its why they throw their hat in the ring to help out in development and funding. Its what they get in return. Not like its the best game of the year. It probably sucks. But its sales are supposed to be good so idk.
> 
> No reason to act "butthurt" you guys.



DH, there's optimizing games to work better, and then there's purposefully making a game work shittier on a competitor's product.

I was seriously considering buying an Nvidia card soon (just for the GPU processing!), but fuck that. I had heard that the whole "TWIMTBP" program was ending (heard that on here somewhere), and I thought Nvidia were mending their ways.

However, forget that -- this is _exactly_ the kind of shit that I've been talking about for years now.


----------



## mikek75 (Sep 29, 2009)

Oh yes, it's already been picked up. It wasn't THAT cheap, lol. I mentioned it on the NFS Shift forum and two other people said they'd had that happen with GTX 260s. Assuming they were telling the truth, it's a bit concerning.


----------



## G@dn!q (Sep 29, 2009)

What Nvidia is doing to ATI sounds like what Intel did to AMD, and Intel was fined more than a billion for that! I'm just curious what will happen if ATI finds proof of Nvidia's dirty games. Will Nvidia be fined the same way as Intel? I think that's the only way to get Nvidia to stop playing dirty and focus on other things!


----------



## Semi-Lobster (Sep 29, 2009)

I wonder how much they 'funded' the 'development' of Arkham Asylum to purposely cripple the game's performance on ATI cards. There's a point where you can say 'this game was optimized for XXXX's video card', and then there's a point where you willfully sabotage the performance of a game on a competitor's product, since the game runs fine on the Xbox 360 (which uses an ATI GPU) or by simply tricking the game into thinking you have the 'right' video card.


----------



## DaedalusHelios (Sep 29, 2009)

mikek75 said:


> Oh yes, its already been picked up. It wasn't THAT cheap, lol. I mentioned it on the NFS Shift forum and two other people said that they'd had that happen with GTX260's. Assuming they were telling the truth its a bit concerning.



They were most likely made by the same board partner.


----------



## soldier242 (Sep 29, 2009)

Damn, that's fucked ...


----------



## Velvet Wafer (Sep 29, 2009)

lol, another weak attempt from the green team to stop the red team on its rise to equality.
I want a 50/50 share, with massive battles


----------



## KainXS (Sep 29, 2009)

More than likely Nvidia paid the devs to do that.

That's fucked up, man. I have an Nvidia card right now, but that's just BS.

Bad for the devs too.

And changing a card's device ID via flashing is not easy for novices and always voids your warranty.

This is really anticompetitive stuff they are doing, but they have done it before, I think, so. . . . .


But I have an Nvidia card XD


----------



## DaedalusHelios (Sep 29, 2009)

mdm-adph said:


> DH, there's optimizing games to work better, and there's purposefully making a game work shittier on a competitor's product.




It is a selective IQ/performance-enhancing feature offered to one company's GPUs. *It doesn't detect an ATi GPU and say "reduce performance".* It detects an ATi GPU and _removes a feature_ that the game offers to improve the game's IQ while selectively minimizing the load on the GPU by AA'ing only the things it deems necessary.

It does not retard the ATi card. It just makes sure certain *optimizations* can only be used as developed for Nvidia's offerings.

If two computers, one ATi and one Nvidia, both don't have the box checked, there is no difference. That's why your statement "cannot hold water", as they say.


----------



## FordGT90Concept (Sep 29, 2009)

HalfAHertz said:


> Maybe it's just a bug that will be resolved in the next patch


I tend to agree.  NVIDIA is making an ass of themselves, but this sounds more like an Eidos mix-up than an NVIDIA backdoor deal.  If Eidos doesn't fix it, then yeah, foul play most likely was involved.


----------



## KainXS (Sep 29, 2009)

Hey, for all you know, HE'S coming, right . . . . .


----------



## DaedalusHelios (Sep 29, 2009)

KainXS said:


> Hey for you know HE'S coming right . . . . .



What Zalgo?


----------



## tkpenalty (Sep 29, 2009)

AA disabled because of SecuROM.

That is just absolute abuse of what's supposed to be keeping piracy at bay. Again Nvidia tries to grab more market share through fabricated crap. Nice damage control by Nvidia, but a nice way to get into court as well. Normally this sort of thing is due to hardware reasons, but that isn't the case here. 

I'll just pirate this game.


----------



## Disparia (Sep 29, 2009)

DaedalusHelios said:


> What Zalgo?



OH F**K!



Anyhoo.... no anger. Just not going to buy the game. Done.


----------



## phanbuey (Sep 29, 2009)

Desperate times call for desperate measures. My next cards will be ATI.


----------



## MilkyWay (Sep 29, 2009)

There is nothing we can do about it, but I think they shouldn't have done it; the developers are shafting ATi users. Ultimately the developers could have said no, but money talks, I guess.

Nvidia can do it, and we might not like it, but it's nothing illegal, just a bit sad.

It's like buying the last cake so your mate can't have it, even though you're full up.


----------



## HossHuge (Sep 29, 2009)

It seems to me that whichever company made the game wouldn't want to piss off 40% of the market.


----------



## MilkyWay (Sep 29, 2009)

HossHuge said:


> It seems to me that who ever the company was that made the game wouldn't want to piss off 40% of the market.



Well, they guessed that ATI users would still buy it and not notice; plus they got paid, and ATI can do nothing about it.

Nvidia are just being arseholes!

I am no fanboy for graphics cards; I use whatever is the best value at the time, and last time that happened to be the GTX 260.


----------



## DaedalusHelios (Sep 29, 2009)

tkpenalty said:


> *I'll just pirate this game*.




*I don't think we can talk about piracy on techpowerup.*

I get most games bundled free from videocard purchases so I really don't pay either.


----------



## Easo (Sep 29, 2009)

INSTG8R said:


> Dirty trick...Makes me wonder if its this same sorta tactic that is making ATI cards run like junk in NFS Shift..



I wondered about the same thing.


----------



## rpsgc (Sep 29, 2009)

"The Way It's Meant To Be Paid"


Just another day in the office.


----------



## [I.R.A]_FBi (Sep 29, 2009)

batman can kiss my ass anyway


----------



## mechtech (Sep 29, 2009)

This doesn't surprise me coming from Nvidia. I still have a bitter taste in my mouth from when they bought out ULi and discontinued driver support for their chipsets.

Ah well, I only have time for a bit of CS: Source once in a while anyway.  

Someone once said in a forum that Nvidia's CEO is kinda like Mao. I wonder... hmmmmmmmm.


----------



## tkpenalty (Sep 29, 2009)

DaedalusHelios said:


> *I don't think we can talk about piracy on techpowerup.*
> 
> I get most games bundled free from videocard purchases so I really don't pay either.



It's a protest =_=... Not a serious one.


----------



## HossHuge (Sep 29, 2009)

[I.R.A]_FBi said:


> batman can kiss my ass anyway



Mine too, God damn it!!....


----------



## tkpenalty (Sep 29, 2009)

Funny how Nvidia's drivers have been getting worse and worse for their lower-to-mid-range products with every release too.


----------



## mdm-adph (Sep 29, 2009)

DaedalusHelios said:


> It is selective IQ performance enhancing features offered to one companies GPU. *It doesn't detect an ATi GPU and say "reduce performance".* It detects an ATi GPU and _removes a feature_ that the game offers to improve the game's IQ while selectively minimizing the load on the GPU by AA'ing only things it deems necessary.
> 
> It does not retard the ATi card. It just makes sure certain *optimizations* only can be used as developed for Nvidia's offerings.
> 
> If two computers, one ATi and one Nvidia, both don't have the box checked there is no difference. Thats why your statement "cannot hold water" as they say.



"Removing a feature," especially when it comes to making the game look better through AA, is pretty much the same as "making it run shittier," since I care about IQ.

I don't believe for a second that there was something about this game that didn't allow it to run AA just FINE on ATI hardware, especially considering (like one poster pointed out) it's an Xbox port. :shadedshu


----------



## ShinyG (Sep 29, 2009)

DaedalusHelios said:


> It is selective IQ performance enhancing features offered to one companies GPU. *It doesn't detect an ATi GPU and say "reduce performance".* It detects an ATi GPU and _removes a feature_ that the game offers to improve the game's IQ while selectively minimizing the load on the GPU by AA'ing only things it deems necessary.
> 
> It does not retard the ATi card. It just makes sure certain *optimizations* only can be used as developed for Nvidia's offerings.
> 
> If two computers, one ATi and one Nvidia, both don't have the box checked there is no difference. Thats why your statement "cannot hold water" as they say.



"Removing features" and "retarding" are very similar, I would dare say identical, if you look at it from a neutral point of view. Calling it other names like "optimizations" doesn't make it any different; it's still a basic feature disabled on ATi cards by the developer. The reason for this might be performance related, but ATi's statement seems to indicate there is no problem running the game on ATi hardware with AA. It could also be a programming mistake, or it might be related to the TWIMTBP logo at the beginning of the game.


----------



## ZoneDymo (Sep 29, 2009)

Say, didn't they do something similar with Assassin's Creed?
AC had DX10.1 and ran better under it on ATI hardware; then in a later "patch", DirectX 10.1 support was removed, with the excuse that it was "unstable".


----------



## entropy13 (Sep 29, 2009)

ZoneDymo said:


> Say did they not did something similair with Assassins Creed?
> AC had DX 10.1 and under it, it ran better on ATI hardware, then in a later "patch", directX 10.1 support was removed under the reason on "being unstable".



Nvidia pressured Ubisoft to remove DirectX 10.1 support.


----------



## naoan (Sep 29, 2009)

Bah, I have an ATi card, and even though I don't use AA because of my weak GPU, I dropped my plan to buy this game because of this shit. :shadedshu


----------



## newtekie1 (Sep 29, 2009)

Well, if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limiting those optimizations to nVidia hardware.

For all we know, if nVidia hadn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from them?

It might not be a case of nVidia or the game developers removing a feature, but instead a case of nVidia paying to have the feature added in the first place.

These could have been performance optimizations that nVidia entirely paid for, which is the whole purpose of the TWIMTBP program, so why should they be enabled for ATi?

Or a completely different reasoning:

It could be that having it enabled with ATi cards causes problems in the retail game (remember, they only tested this on the demo).  For all we know, something about the way ATi cards handle AA causes the game to crash or be extremely buggy with the optimized AA enabled.  Maybe a certain part of the game is completely unplayable on ATi cards with the feature enabled, so the developers (nothing to do with nVidia at all) just gave up trying to fix it and simply disabled the feature on ATi cards as a quick fix to get the game shipped.  They would then have more time to work on a patch to make it work.  It wouldn't be the first time we've seen games have problems with one manufacturer but not the other, due to certain visual elements conflicting with the current drivers.

Either way, I highly doubt nVidia caused a feature that was already in the game to be disabled.



entropy13 said:


> Nvidia pressured Ubisoft to remove DirectX 10.1 support.



How do you know this?


----------



## mR Yellow (Sep 29, 2009)

entropy13 said:


> Nvidia pressured Ubisoft to remove DirectX 10.1 support.



That's why I don't like nVidia and their anti-competitive behavior and shady morals.
My next purchase will definitely be a 5870. You don't really need more power than that.


----------



## mR Yellow (Sep 29, 2009)

newtekie1 said:


> Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limitting them to nVidia hardware also.
> 
> For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?
> 
> ...



This is common knowledge; it's been discussed before.

BTW, AA isn't an added feature... it's a standard.


----------



## Disparia (Sep 29, 2009)

newtekie, I always appreciate your sound, logical postings and thank them on occasion. But trying to bridge logic to consumers who run on emotions, POV ethics, etc., is a bit futile.


----------



## Steevo (Sep 29, 2009)

I now won't buy the game or another Nvidia card for any PC I build, and I will point to this for the shit that it is, just like the IQ tricks Nvidia forced on users years ago to keep up.

Fuck you, Nvidia, and the horse you rode in on.


----------



## tkpenalty (Sep 29, 2009)

newtekie1 said:


> Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limitting them to nVidia hardware also.
> 
> For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?
> 
> ...



You do realise that this game has been ported from the Xbox 360, which, mind you, runs an R500-based GPU: ATi/AMD's stuff. Now, in this case, what is preventing AA from working in the first place is SecuROM, a questionable use of something that is MEANT to be used for anti-piracy reasons, not for anti-competitive market practices. Yes, I love that phrase, "anti-competitive market practices". And I love using it against you, since you always argue against such matters.

The blogger has proven that they're able to get AA running on AMD's hardware with a rather evasive measure. 

Eidos will probably stay silent on this matter. What's next? Disabling rendering altogether?


----------



## DaedalusHelios (Sep 29, 2009)

mdm-adph said:


> "Removing a feature," especially when it comes to making the game look better through AA, is pretty much the same as "making it run shittier," since I care about IQ.
> 
> I don't believe for a second that there was something about this game that didn't allow it to run AA just FINE on ATI hardware, especially considering (like one poster pointed out) it's an Xbox port. :shadedshu



*It's not an Xbox port; someone else has pointed this out already. Development was separate, to incorporate GPU PhysX.*

"Run sh!ttier" implies lower frame rates, and that only happens when you force AA in CCC. Not having the selective AA feature, which is the very definition of an optimization, is a difference in IQ. The only difference is that you see it in the form of a button now, so it's worse somehow?

I think it's stupid that they did it, but it's a difference in IQ and not framerate, unless you change the ATi profile to swing it the other way by forcing AA in CCC.

They can just release an aftermarket patch to swing it the other way. It's no big deal.


----------



## Imsochobo (Sep 29, 2009)

newtekie1 said:


> Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limitting them to nVidia hardware also.
> 
> For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?
> 
> ...



Nvidia fanboy at his best?

I completed AC on three computers using different setups, all DX10.1, and it NEVER crashed; the patch came and then it crashed. Lol.

Did you read it?
They changed an ID and it WORKED!
There is always something that seems to be bad with ATI cards as long as it's The Way It's Meant to be Played/Paid.

The Xbox runs AA, so why can't a similar architecture run it? The Xbox's ATI chip sits somewhere between a 2900 XT and a 1950 XTX.
It doesn't work with my 2900 XT or a 1950 XTX... how about that?
(Yes, I tested.)

And btw, I have a shitvidia card which I could have used, but Nvidia doesn't let me.
Nvidia, the way it's meant to fail.
I'm talking about PhysX with ATI doing the rendering.
I totally liked Nvidia products, until:

Rename.
Rename.
Meant-to-be-played issues here and there.
PhysX bullshit pushing.
Bashing AMD for no reason.
Bashing again.
"Meant to be played" starting to become "The way it's meant to bug you".
Till:
"The way it's meant to piss you off big time".

PhysX for me was money thrown away; I can sell the card, but it's not worth anything now anyway due to the HD 5xxx.


----------



## AphexDreamer (Sep 29, 2009)

Well, nothing more needs to be said by me to elaborate on this BS brought on by Nshitia again. I just want them to try to pull this shit with the PS3's RSX so I can benefit from it at least one time.


----------



## btarunr (Sep 29, 2009)

ATI GPUs can handle the game's in-game AA; this comes from AMD. The game just disables the feature when it sees an AMD GPU. This is total blasphemy. I'm not going to / can't tell you what you should choose with your wallets, but I'll tell you what my wallet says:

"No Batman: Arkham Asylum for GeForce for you, bta."

Evil wallet.


----------



## newtekie1 (Sep 29, 2009)

mR Yellow said:


> This is common knowledge. Been discussed before.
> 
> BTW AA isn't an added feature...it's a standard.



It's been discussed, but no one has ever shown any proof that nVidia was really behind Ubisoft removing it.  Not a single shred.  Plenty of claims, but claims don't equal proof.

And AA is a feature that has to be added to a game; it isn't just magically in there, at least not the type of optimized adaptive AA that is present in Batman.

FSAA is just a driver switch that any developer can enable.  However, it always comes with a drop in framerate.  The AA used in Batman has been optimized to not only make the game look better, but to do it at no performance loss, by optimizing which objects get AA applied and which don't.  This is definitely not a standard feature in games.

If ATi users want AA, they can enable it in CCC; there are plenty of other games that don't have in-game AA and require this too.  You will get an FPS hit, just like in those other games.



btarunr said:


> ATI GPUs can handle that game's in-game AA, this comes from AMD. The game just disables the feature when it sees an AMD GPU. This is total blasphemy. I'm not going to  / can't tell you what you should choose with your wallets, but I'll tell you what my wallet says.
> 
> "no Batman Arkham Asylum for GeForce for you, bta."
> 
> evil wallet.



In the demo, yes, but how do we know it doesn't cause a problem further along in the game, as I've already pointed out?  They haven't tested more than 15 minutes of gameplay, and we all assume it works through the entire game.

How many times have we played a game that worked fine through 3-4 hours of gameplay, then suddenly crashed at the exact same spot no matter what we did?  I know it has happened to me several times in the many years I've been playing, in games released as recently as a few months ago.  It is actually pretty common in newly released games, since the drivers haven't been fixed yet.  The solution is often to disable some visual feature (because the drivers don't like it), or to wait for better drivers.  

We don't know that this isn't the case here.  Instead, some are jumping to the conclusion that because it has an nVidia stamp on it, nVidia disabled the feature for ATi.  We don't know that.  And frankly, for a news reporter to even suggest it without any shred of proof completely removes all credibility that news reporter has.


----------



## Bull Dog (Sep 29, 2009)

DaedalusHelios said:


> You help to pay for a games creation and you want something in return? Thats crazy, Nvidia must have used magic.
> 
> There are games optimized to do well with Nvidia drivers and vice versa. Its nothing new, its why they throw their hat in the ring to help out in development and funding. Its what they get in return. Not like its the best game of the year. It probably sucks. But its sales are supposed to be good so idk.
> 
> No reason to act "butthurt" you guys.



Ignoring the artificial limitation of the extra PhysX effects to the GPU for a second: there is a reason to be annoyed.  NVIDIA and the game developer colluded to make the game run WORSE on ATI hardware.  There is no hardware reason why Batman: AA can't do MSAA on ATI hardware.

Then there is the PhysX issue, where the developer did a rather shitty job too.  Advanced PhysX effects only run on NVIDIA GPUs.  There is no in-game option to enable them on the CPU.  And some of the effects, like cloth and dynamic fog, were simply removed in the non-PhysX version.  Apparently it was too much work to replace the effects with at least semi-static ones...... 


Bottom line: this game, with help from NVIDIA, was intentionally neutered for when it is run on non-NVIDIA hardware.


----------



## entropy13 (Sep 29, 2009)

newtekie1 said:


> How do you know this?




There are many articles out there hinting at that. But in some articles Nvidia insists they had no hand in the removal of DirectX 10.1 support, which is naturally what they'll say (and Ubisoft likewise says that "implementation is costly" - see the TechReport link).

DirectX 10.1 support was removed because:

1. Nvidia cards don't support it.
2. HD 3000 series cards were 20% better than their respective 9000 GT series counterparts with DirectX 10.1 (and AA enabled).



> DirectX 10.1 gives the shader units access to all anti-aliasing buffers in a single pass – something that developers have been unable to do with DirectX 10.0. "DX10.0 screwed AA [performance]. . . . 10.1 would solve that [issue]," said one developer reportedly close to Ubisoft.
> 
> "Of course it removes the render pass! That's what 10.1 does! Why is no one pointing this out, that's the correct way to implement it and is why we will implement 10.1. The same effects in 10.1 take 1 pass whereas in 10 it takes 2 passes," added another anonymous developer, said to be working on a title that implements DirectX 10.1 support – in addition to DirectX 10.



The quoted part makes Ubisoft's reasoning pointless. How can making a process take one less step to finish be "costly"? Which ultimately adds fuel to the fire that there really is a different reason.


http://techreport.com/discussions.x/14707
http://www.bit-tech.net/news/hardware/2008/05/12/ubisoft-caught-in-assassin-s-creed-marketing-war/1
http://www.tgdaily.com/content/view/37326/98/
http://www.fudzilla.com/index.php?option=com_content&task=view&id=7355&Itemid=1
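The one-pass-versus-two-passes point in those developer quotes reduces to a trivial model. This is a simplification (real pass counts depend on the effect): DX10.1 lets shaders read the anti-aliasing buffers directly, so the extra DX10.0 pass disappears.

```python
# Simplified model of the render-pass difference quoted above: under
# DX10.0 an AA-dependent effect needs an extra pass because shaders
# cannot read the multisample buffers directly; DX10.1 exposes them,
# collapsing the work to a single pass. Purely illustrative.

def aa_effect_passes(dx10_minor_version: int) -> int:
    """Passes needed for the AA-dependent effect in this toy model."""
    return 1 if dx10_minor_version >= 1 else 2

assert aa_effect_passes(0) == 2  # DX10.0: render + extra pass
assert aa_effect_passes(1) == 1  # DX10.1: single pass
```

Halving the passes for that stage of the frame is consistent with the sizable gains ATI's DX10.1 parts showed with AA enabled, which is what made Ubisoft's "costly" justification ring hollow.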


----------



## ShockG (Sep 29, 2009)

If we would all get over ourselves: changing device IDs to get software working is not new. How about when Far Cry detected NV3X hardware and dropped shader precision to FP16 instead of FP32? You could disable that by changing the vendor and device ID. (I had the Far Cry ATI demo, which apparently used TruForm and would not work with NVIDIA hardware, but changing the device and vendor ID allowed it to work on NV hardware as well.)
Also, Batman uses the Unreal Engine 3, which doesn't actually support AA (at least MSAA) natively, so some tweaking needs to be done to get it working properly. If NVIDIA paid for those optimizations and for getting AA working in this title, then they should benefit from it. (TWIMTBP isn't just a stamp; they actually send people out to sit with developers and optimize the game together. No money is paid to the developer to lower performance on competitor hardware!)

ATI used to have a GITG (Get In The Game) campaign, which vanished into thin air, even after the company called NVIDIA's campaign nothing more than a marketing gimmick.

I'm not sure how UE3 games implement AA on the Xbox, but chances are it's SSAA, which is exactly what we can get on our ATI graphics cards. And be it SSAA or MSAA, on a console you can tune performance right down to the per-cycle level; don't compare a closed system with a PC.

So before we say there's a conspiracy, let's calm down and think about it a little.


----------



## [I.R.A]_FBi (Sep 29, 2009)

newtekie1 said:


> Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limitting them to nVidia hardware also.
> 
> For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?
> 
> ...



anyone else hear this shit?


----------



## newtekie1 (Sep 29, 2009)

entropy13 said:


> There are many articles out there, hinting towards that. But in some articles Nvidia insists they had no hand to play in the removal of DirectX10.1 support, which is naturally what they'll say (and Ubisoft likewise says that "implementation is costly" - see the TechReport link).
> 
> DirectX 10.1 support was removed because:
> 
> ...



Ah, so a bunch of conspiracy theories with no evidence at all.  That's what I thought.


----------



## Imsochobo (Sep 29, 2009)

Game just got patched.


----------



## entropy13 (Sep 29, 2009)

newtekie1 said:


> Ah, so a bunch of conspiracies with no evidence at all.  Thats what I thought.



EVIDENCE 1: A specific API can increase the game's performance.
EVIDENCE 2: That API, however, is only supported by ATi cards.

The API is removed from the game. So the developers don't want increased performance in their own game?


----------



## newtekie1 (Sep 29, 2009)

entropy13 said:


> EVIDENCE 1: A specific API for the game can increase performance
> EVIDENCE 2: That API however is only supported by ATi cards
> 
> API is removed from the game. So the game developers doesn't want increased performance from their games?



That's grasping at straws, at best.

And if the increased performance comes at the cost of stability, no, they probably don't.  Especially when they have to handle all the calls from people complaining about the game crashing all the time.  I bet if they gave all those people your phone number, you'd probably want DX10.1 removed too.

And since I forgot to add it in the previous post: DX10.1 is "costly" because it requires extra development time to implement in the game code.  It is not costly to render, which is what the quote you posted talks about; that is not the "costly" we mean when we say it is costly to implement.  DX10 has to be implemented either way, so DX10.1 only adds to development costs.


----------



## DaedalusHelios (Sep 29, 2009)

Bull Dog said:


> Bottom line is that this game, with help from NVIDIA, was intentionally neutered for when it was run on non NVIDIA hardware.



Bottom line: if a game doesn't support ATi equally, it's an "evil" game and you feel wronged?

If Nvidia pays for development, they could make sure ATi cards can't play it at all. The developer is a company; it's not required to make a game run in any particular way. If it doesn't play to your liking, don't buy it.

There is no "international video game creation bill of rights". A company can make a game play however it wants, as long as it doesn't cause harm to the person playing it or his/her property. That's reality. If a game developer doesn't support your hardware to your liking, don't buy the game.



Imsochobo said:


> Game just got patched.



If you're serious, that makes this look no longer intentional on the developer's part. Got a link, or are you trolling?


----------



## entropy13 (Sep 29, 2009)

newtekie1 said:


> Thats grasping at straws, at best.
> 
> And if the increasd performance comes at the cost of stability, no they probably don't.  Especially when they have to handle all the calls from people complaining about the game crashing all the time.  I bet if they gave all those people your phone number, you'd probably want DX10.1 removed also.
> 
> And since I forgot to add it in the previous post.  DX10.1 is "costly" because it requires extra developement time to implement into the game code.  It is not costly to render, which is what the quote you posted talks about.  However, that is not the "costly" we are talking about when we say it is costly to implement.  DX10 has to be implemented either way, DX10.1 only adds to developement costs.



I wasn't grasping at straws, actually, since I asked a question rather than making a statement.

The removal of DX10.1 support was done THROUGH a patch. So if it was costly, how did it get into the initial version? Why include it in the first place?

Stability issues were almost always down to an Nvidia card, though (pre-patch). And a poster here talking about AC on an ATi card said his game ran perfectly before the patch but crashes after it. Selective stability, then?


----------



## DaedalusHelios (Sep 29, 2009)

newtekie1 said:


> Thats grasping at straws, at best.
> 
> And if the increasd performance comes at the cost of stability, no they probably don't.  Especially when they have to handle all the calls from people complaining about the game crashing all the time.  I bet if they gave all those people your phone number, you'd probably want DX10.1 removed also.
> 
> And since I forgot to add it in the previous post.  DX10.1 is "costly" because it requires extra developement time to implement into the game code.  It is not costly to render, which is what the quote you posted talks about.  However, that is not the "costly" we are talking about when we say it is costly to implement.  DX10 has to be implemented either way, DX10.1 only adds to developement costs.




That's true. But I did hear that DX11 (_not DX10.1_) lowers that cost, because with DX11 they somehow made the tools easier for developers to use.


----------



## Imsochobo (Sep 29, 2009)

The setting now says:
"Use ATI Control Panel." Why doesn't it work, if ATI themselves made it work by changing the device ID?


http://bildr.no/view/497400


----------



## Animalpak (Sep 29, 2009)

Well, tell ATI to invest more in the development and refinement of its drivers.

The biggest problem is that ATI drivers have always been poor; if they weren't, they would be on par with nvidia's or even better.

ATI GPUs have tremendous computing power, but they are too lazy to develop drivers able to exploit it.

I would say stop the childish fanboyism, the "i hate nvidia" acting, etc. ...


----------



## newtekie1 (Sep 29, 2009)

entropy13 said:


> I wasn't grasping at straws actually, since I made a question, not a statement.
> 
> The removal of the DX10.1 support was THROUGH a patch. So how did they get to the initial version in the first place if it was costly? Why did they include it in the first place then?
> 
> Stability issues were almost always because of an Nvidia card though (pre-patch). And a post here also talking about AC with an ATi card said his game crashes after the patch. Selective stability then?



I was explaining why implementing DX10.1 is costly in the first place, since you seem to believe it comes free.  In the Ubisoft case, they didn't say they removed it because it was costly; their stated reason was that it made the game unstable.  In that case, it had nothing to do with being costly to implement (though it might have been costly to fix the implementation...).

The stability issues with nVidia cards were mostly due to PhysX.  I'm sure there were plenty of stability issues with ATi cards too, but they were drastically overshadowed by the PhysX issues.  It might have come down to a decision about which features to fix and which to give up on.  Sometimes that is what has to be done in the business world.

The patch definitely made the game more stable on both sides, but no game is ever going to be perfect.  There will always be crashes on certain configurations.


----------



## wahdangun (Sep 29, 2009)

newtekie1 said:


> Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limitting them to nVidia hardware also.
> 
> For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?
> 
> ...



WTF are you talking about? We are *paying* our hard-earned money for their game, you know, and what do they do? Instead of making it more playable on other hardware, they chose to cripple the game so they could take some money.

And no, it's not about game stability; it's just one greedy developer. Look at the news: they just changed the device ID and voila, AA worked flawlessly (and crushed nvidia's performance). It's just like:

```
if (device_id == ATI) {
    AA = disabled;
}
```


They should mention on the box that it's "for Nvidia cards only," so ATI card owners won't get pissed, you know.
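The kind of vendor gate being complained about here can be sketched in a few lines of Python. This is hypothetical code, not the game's actual launcher logic; the function name is invented, though the PCI vendor IDs themselves are the real ones for ATI/AMD and NVIDIA.

```python
# Hypothetical sketch of a launcher-style vendor gate, per AMD's account:
# the in-game AA option is hidden whenever the GPU reports ATI/AMD's
# PCI vendor ID.

AMD_VENDOR_ID = 0x1002     # real PCI vendor ID for ATI/AMD
NVIDIA_VENDOR_ID = 0x10DE  # real PCI vendor ID for NVIDIA

def aa_option_visible(vendor_id):
    """True if the launcher exposes the in-game AA toggle."""
    return vendor_id != AMD_VENDOR_ID

# A Radeon reporting its real ID loses the option...
print(aa_option_visible(AMD_VENDOR_ID))     # False
# ...while the same card spoofing another vendor ID gets it back, which
# is how AMD says it confirmed the hardware handles the AA just fine.
print(aa_option_visible(NVIDIA_VENDOR_ID))  # True
```

The point of the sketch is that the check keys purely off the reported ID, which is why changing the ID (and nothing else) was enough to re-enable the option in AMD's test.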


----------



## tkpenalty (Sep 29, 2009)

newtekie1 said:


> I was explaing why implementing DX10.1 is costly in the first place, as you seem to believe that it comes free.  In the Ubisoft case, they didn't say they removed it because it was costly, their reason for removing it was because it made the game unstable.  In that case, it had nothing to do with being costly to implement(though it might have been costly to fix the implementation...).
> 
> The stability issues with nVidia cards was due to PhysX mostly.  I'm sure there were plenty of stability issue with ATi cards, but they were drastically overshadowed by the PhysX issues.  It might have come down to a decision of what features to fix, and which features to just give up on.  Sometimes that is what has to be done in the business world.
> 
> The patch definitely made the game more stable on both side, but no game is ever going to be perfect.  There will always be crashes on certain configurations.



The API's functionality was removed altogether after one patch; DX10.1 simply stopped working. You CLEARLY don't understand what the recent APIs were written for if you say it's costly to implement. DX10.1 and DX11 both exist to make life easier for coders, so it's illogical that they'd fall back to DX10.

Get back on topic, will ya? Personal question: are you a conservative person?

Before any fanboy comment is fired off: I'm using an 8800GT. I'm not going for nvidia for my next GPU, however, due to their draconian market practices.



Animalpak said:


> Well tell to ATI  to invest more in the development and refinement of the drivers.
> 
> The biggest problem is the fact, ATI drivers has always been poor and bad at this point they would be like nvidia or even higher.
> 
> ...



Totally irrelevant. In this case the drivers are not the issue; the game simply prevents in-game AA from working whenever it detects an ATI card. As shown, this was easily circumvented by changing the device ID, which proves it has nothing to do with the drivers.
On that note, I've been having far more issues with Nvidia's drivers than with AMD's.

It's not fanboyism. It's a logical fallacy to assume that because AA doesn't work in this one case, AMD's drivers suck.


----------



## entropy13 (Sep 29, 2009)

newtekie1 said:


> I was explaing why implementing DX10.1 is costly in the first place, as you seem to believe that it comes free.  In the Ubisoft case, they didn't say they removed it because it was costly, their reason for removing it was because it made the game unstable.  In that case, it had nothing to do with being costly to implement(though it might have been costly to fix the implementation...).
> 
> The stability issues with nVidia cards was due to PhysX mostly.  I'm sure there were plenty of stability issue with ATi cards, but they were drastically overshadowed by the PhysX issues.  It might have come down to a decision of what features to fix, and which features to just give up on.  Sometimes that is what has to be done in the business world.
> 
> The patch definitely made the game more stable on both side, but no game is ever going to be perfect.  There will always be crashes on certain configurations.




BAAAAHHH, my head is hurting. I'll rearrange my thoughts first and formulate a proper argument. It should also be mentioned that ATi was pushing for DirectX 10.1 support to stay... but if the stability issues were as universal as you suggest, why would they do that? Hmmm...


----------



## Imsochobo (Sep 29, 2009)

Animalpak said:


> Well tell to ATI  to invest more in the development and refinement of the drivers.
> 
> The biggest problem is the fact, ATI drivers has always been poor and bad at this point they would be like nvidia or even higher.
> 
> ...



My job says different:
the IT people want to move away from nvidia because of driver issues.

Quadro is more of a standard, so the bosses like it; strange company, yes.
ATI is in all the laptops, though, and is not an issue there.

ATI drivers have always been poor?
I can't say that.

I also can't say ATI should pay for game development; then it becomes a money game rather than simply making a product that supports the API game devs use, and we're done.

Not paying so AA gets supported in games.
Not paying so you won't get crippled performance.
Not paying so the game works as well as it can, just because you didn't pay the dev.


----------



## newtekie1 (Sep 29, 2009)

wahdangun said:


> wtf, are you talking about, we are *paying* our hard earned money for their game you know, but what they do? instead working it more playable to other hardware, they chose to crippled the game so they can take some money.
> 
> and no it's not about game stability, it's just one fucking greedy developer. look at the news, they just change device ID and viola AA worked flawlessly(and crush nvdia performance) it's just like :
> 
> ...



Yes, and you got the game you paid for.  However, you didn't get the optimizations that *nVidia* paid for (using reasoning 1 here).

Or you got a game that crashes halfway through (using reasoning 2).  Would you prefer to pay for a game that won't let you finish it?

And if you read the article, it didn't crush nVidia's performance; ATi cards performed a lot worse with AA enabled, while nVidia cards see no performance loss.



tkpenalty said:


> The API's functionality was removed altogether after one patch, then DX10.1 stopped working after that.
> 
> Get back on topic will ya? Personal question, are you a conservative person?



Yes, and that might have been a decision made to help fix stability in a very unstable game.

Personal answer: generally I don't like to make baseless statements, and I tend to argue and play devil's advocate whenever someone else does.


----------



## cauby (Sep 29, 2009)

I really don't see any reason to be pissed off about this if you can just enable AA in CCC. Sure, there might be a performance hit, but if it works, that's fine by me...


----------



## tkpenalty (Sep 29, 2009)

newtekie1 said:


> Yes, and you got the game you paid for.  However, you didn't get the optimizations that *nVidia* paid for.(Using reasoning 1 here).
> 
> Or you got a game that crashes halfway through(using reasoning 2).  Would you prefer to pay for a game that won't let you finish it?
> 
> ...



Very conservative of you, being so mindful of the corporations when the wealth will never reach you. In the end the consumer loses.


----------



## Imsochobo (Sep 29, 2009)

newtekie1 said:


> Yes, and you got the game you paid for.  However, you didn't get the optimizations that *nVidia* paid for.(Using reasoning 1 here).
> 
> Or you got a game that crashes halfway through(using reasoning 2).  Would you prefer to pay for a game that won't let you finish it?
> 
> ...


AC didn't crash for me or any of my friends with DX10.1.
There was no problem, and review sites didn't have issues either.

The fact that you support paying game devs so that other cards run badly is just unbearable.

The Way It's Meant to be Played would be perfectly fine if the game ran as well as it could, rather than with intentionally crippled performance, as has been proven...
ATi does support game devs too: it gives them video cards so they can check that things work, and backs them with documentation and the like. nvidia's program is bigger, but it seems they also bribe, judging by the results in some TWIMTBP games.


----------



## entropy13 (Sep 29, 2009)

cauby said:


> I really don't see any reason why to be pissed off about this if you can just enable AA on CCC.Sure,there might have a hit in performance,but if it works then it's fine for me....



Well, the majority of people playing the game would be hit badly by forcing AA (me in CoH, for example: forced 4x AA is unplayable at less than 20 fps most of the time in the built-in benchmark, compared to around 40 fps with the in-game 4x AA).

And before I get branded an ATi fanboy: I had an Nvidia GeForce 6600 for 4 years.


----------



## tkpenalty (Sep 29, 2009)

Imsochobo said:


> ¨
> AC didnt crash for me or any of my friends. DX10.1
> There was no problem, review sites didnt have issues either.
> 
> ...


----------



## DaedalusHelios (Sep 29, 2009)

tkpenalty said:


> Get back on topic will ya? Personal question, are you a conservative person?




Leave personal questions to PMs. Also, the terms conservative and liberal mean different things for Aussies, because politics are different there.

Maybe you were getting different information because the sites were blocked by the ACMA.  j/k

http://www.time.com/time/business/article/0,8599,1888011,00.html 




Imsochobo said:


> ¨nvidia's strategy is bigger, but they also bribes as it seems like with the result in some TWIMTBP games.




 And I bribe online retailers to send me products in the checkout process. BRIBES! BRIBES! I TELL YOU!!!


----------



## btarunr (Sep 29, 2009)

wahdangun said:


> wtf, are you talking about, we are *paying* our hard earned money for their game you know, but what they do? instead working it more playable to other hardware, they chose to crippled the game so they can take some money.
> 
> and no it's not about game stability, it's just one fucking greedy developer. look at the news, they just change device ID and viola AA worked flawlessly(and crush nvdia performance) it's just like :
> 
> ...



That's a convincing argument. If you're an ATI owner, Batman: AA is worth less of its price to you, because not only can you not use the PhysX effects (ATI GPUs can't accelerate them), but the in-game AA won't work either (even though the feature works perfectly fine on ATI hardware). Since there's no mention on the box that you need an NVIDIA GPU to use the in-game AA, it's also misleading.


----------



## Imsochobo (Sep 29, 2009)

entropy13 said:


> And before I be branded an ATi fanboy, I had an Nvidia GeForce 6600 for 4 years.



I have/have had:
Riva TNT2.
Geforce 256, Geforce 2MX, Geforce 2 TI, Geforce 3 TI 560(gainward) ,Geforce4 TI 4600, Geforce 6800GT, Geforce 7800GT SLI, Geforce GTX260

Ati Radeon 9800 Pro, X850XTPE, 1950XTX, 2900XT, 3870X2, 4870CF

So the same applies to me; you can see where most of my money has gone... But now I'm not buying any more nvidia, unless they become an underdog without the money to run this meant-to-fail program.


----------



## AsRock (Sep 29, 2009)

So I may as well get it for the PS3 then? lol. I remember somewhat the same thing with Oblivion when it was released, where ATI cards could do HDR and AA together whereas NV cards could not at the time, so they stopped the ATI cards from being able to do it.

Well, that's how I remember reading it on their forums when it was released.

Sad, so sad. Maybe people should be suing nvidia more often, as this sounds totally unfair, and the makers of the game need a foot in their ass too.


----------



## Sihastru (Sep 29, 2009)

So much misplaced anger, rage, mouth-foaming rage... I don't see any of these fanboys attacking any other company in this way.

AA is not an industry standard; it just seems that way. The truth is that AA implementations differ greatly from company to company. If AA were a standard, then ALL cards (ATI, nVidia, Intel, VIA, Matrox) would see the same comparable drop in performance when AA was enabled.

The truth is, they use completely different ways to implement it, with so many optimizations that it's amazing it still produces a comparable result. The same can be said about AF and a bunch of other "quality enhancing" settings.

I want to know what happens when you play this game on an Intel GPU. Why didn't anyone raise that problem? This is not just a two-horse race, you know.

Is it really nVidia's fault that you bought an ATI card? Is it ATI's fault that you bought an nVidia card? I don't know, but I do know it's both companies' fault that you didn't buy a VIA or Matrox card. And if you don't own an ATI or nVidia card, you should take a number for the "screw Intel" queue.

When you bought a video card, you made a choice. Nobody forced you. Accept it and live with it. Don't be angry if some things don't work as fluently as you think they should.

You guys are so narrow-minded. It's like being angry at Ford because one of their engines doesn't accept pistons from a VW. I mean, it's a piston, it's a standard...

Another way you're narrow-minded is that you're ignoring the console market. These days consoles are the developers' priority; the PC version is just a port. And if a company pays for that port, it wants to see that money being poured into resolving compatibility issues with its own products, not the competition's.


----------



## laszlo (Sep 29, 2009)

HalfAHertz said:


> Maybe it's just a bug that will be resolved in the next patch



Good one.

It's a green bug, but the red insecticide will kill it...


----------



## btarunr (Sep 29, 2009)

Sihastru said:


> So much misplaced anger, rage, mouth foaming rage... I don't see any of these fanboys attacking any other company in this way.
> 
> AA is not an industry standard, it just seems that way. The truth is that AA implementations are very different from company to company. If AA were a standard then ALL cards (ATI, nVidia, Intel, VIA, Matrox) will see the same comparable drop in performance when AA was enabled.



So much for misinformed arguments. AMD tested the in-game AA, and it worked. So regardless of whether this AA implementation is a standard shared between NVIDIA and AMD, it works, and yet it was disabled for ATI hardware.


----------



## Ahhzz (Sep 29, 2009)

I just wonder how many of the people here, and in the industry, complaining about NVIDIA's 'business practices' are the same ones who think M$'s and Intel's operating practices are just fine...


----------



## Imsochobo (Sep 29, 2009)

Hehe, I don't like MS, so the only things I run MS on are gaming platforms; I've got no other choice there.
I buy AMD for the same reason as with Intel.
I use Linux for everything else; it suits me JUUUUUUUUUUST fine.
And ATI has open-source support there as well.


----------



## [I.R.A]_FBi (Sep 29, 2009)

Ahhzz said:


> I just wonder how many of the people here, as well as in the industry, who are complaining about the 'business practices' of NVIDIA, are the same ones who think M$'s (for example) operating practices and Intel's are just fine...



smoke and mirrors


----------



## btarunr (Sep 29, 2009)

newtekie1 said:


> In the demo, but how do we know it doesn't cause a problem further along in the game, as I've already pointed out?  They haven't tested more than 15 minutes of gameplay and we all assume it works through the entire game.
> 
> How many times have we played a game, that worked fine through 3-4 hours of gameplay, then suddenly crashes at the exact same spot no matter what we do?  I know I've had it happen several times in the many years I've been playing.  In games as recently released as a few months ago.  It is actually pretty common in newly released game, as the drivers haven't been fixed yet.  The solution is often to disable some visual feature(because the drivers don't like it), or to wait for better drivers.
> 
> We don't know that this isn't the case here. Instead, some are jumping to the conclusion that because it has an nVidia stamp on it, that nVidia disabled the feature for ATi.  We don't know that.  And frankly for a news reporter to even suggest it without any shred of proof completely removes all credibility that new reporter has.



That really isn't a problem. Whether the feature works on the given hardware is all that matters, and it does. Stability issues cannot be used as an excuse to completely remove the feature; if stability issues did exist, they should have left the feature available to everyone and worked on them. Besides, the game does not advertise that its AA feature doesn't work on ATI hardware (or that it requires NVIDIA hardware for AA, the way it properly advertises PhysX).


----------



## wahdangun (Sep 29, 2009)

newtekie1 said:


> Yes, and you got the game you paid for.  However, you didn't get the optimizations that *nVidia* paid for.(Using reasoning 1 here).
> 
> Or you got a game that crashes halfway through(using reasoning 2).  Would you prefer to pay for a game that won't let you finish it?
> 
> ...





Then they don't deserve our money (using reasoning no. 1).

Do you have *solid proof* that the game will crash halfway through (using reasoning no. 2)?


So I say ATI card owners must boycott this game and rate it low in every online store, so they'll suffer a 40% loss from us ATI owners.


----------



## Ahhzz (Sep 29, 2009)

[I.R.A]_FBi said:


> smoke and mirrors


----------



## Sihastru (Sep 29, 2009)

btarunr said:


> So much for misinformed arguments. AMD tested the in-game AA, and it worked. So regardless of this AA implementation being a standard between NVIDIA and AMD, it works, and was yet disabled for ATI hardware.



Again, does it work on an Intel GPU? A test conducted in-house by the developer's QA department is much more thorough than a quick test run. Testing one, two, or three cards does not qualify as a PASS. We can make many assumptions; some may be true, most will be false.

nVidia could disable the entire game on ATI cards. Why would that be a problem? It's practically their game. It's like Badaboom, which uses CUDA. A game is an application geared towards entertainment, and PhysX should be reason enough to explain an incompatibility. Apple does this all the time: takes some random small thing and makes a big deal out of it.

EDIT: btarunr, my post is not a comment on the OP. It's the result of me wasting my time reading all the angry posts after it.


----------



## Mussels (Sep 29, 2009)

newtekie1 said:


> Well if nVidia was the one that paid for the AA optimizations to be put in the game, I see no problem with them limitting them to nVidia hardware also.
> 
> For all we know, if nVidia didn't put the money in to implement the AA optimizations, we would never have seen them in the game, so why should ATi benefit from that?
> 
> ...



Despite your trying to use rational logic, you, umm, failed.
http://en.wikipedia.org/wiki/Batman_Arkham_Asylum






Note the "Unreal Engine"? You see, this game was built on an existing engine that is well known to work with anti-aliasing on both ATI and NVIDIA hardware.

From what I hear, you can also rename the game's .exe to UE3.exe and then use Catalyst Control Center's AA (even before the patch), and everything works well.

This is purely a dirty trick from nvidia, since NV only adds AA to some things in the game, while ATI cards now have to waste power anti-aliasing EVERYTHING (taking a performance hit), inconveniencing end users.




mR Yellow said:


> This is common knowledge. Been discussed before.
> 
> BTW AA isn't an added feature...it's a standard.



Indeed. and a default feature of the engine used.



btarunr said:


> ATI GPUs can handle that game's in-game AA, this comes from AMD. The game just disables the feature when it sees an AMD GPU. This is total blasphemy. I'm not going to  / can't tell you what you should choose with your wallets, but I'll tell you what my wallet says.
> 
> "no Batman Arkham Asylum for GeForce for you, bta."
> 
> evil wallet.



indeed. i went RE5 over this, due to this lame issue.



wahdangun said:


> so i say ATI owner card must boycott this game and rate it so low in every on-line store. so they will suffer 40% loss from us ATI owner.



Unfortunately, the lack of AA will never make enough news to hinder sales that much.


----------



## cauby (Sep 29, 2009)

Hmm... I guess Batman is not working properly even on Nvidia cards...

http://forums.techpowerup.com/showthread.php?t=104750


----------



## Sihastru (Sep 29, 2009)

Unreal Engine 3.5 can only use AA modes under Vista/DX10+. Because of the frickin' consoles, this is mostly a DX9 game, and in DX9 compatibility mode UE cannot use AA, because its shadowing algorithm is incompatible with current AA modes.

An important point for ATI's DX10.1 cards was an interesting way of producing soft shadows (something nVidia doesn't have in its DX10 implementation). Could this be the problem?


----------



## wiak (Sep 29, 2009)

Animalpak said:


> Well tell to ATI  to invest more in the development and refinement of the drivers.
> 
> The biggest problem is the fact, ATI drivers has always been poor and bad at this point they would be like nvidia or even higher.
> 
> ...


You might want to re-evaluate that one, dude.
Search Google for "NVIDIA Vista driver FAIL" and you might find a billion pages:
http://www.google.com/search?hl=en&q=nvidia+vista++fail

ATI's Catalyst drivers have been released every single month since January 2004, with hotfixes and everything. By the way, did you know that ATi had both DX10.1 and DX11 hardware long before nvidia?

But everyone does have to agree that Intel's graphics drivers suck compared to ATI's and NVIDIA's, don't they?
Get an ATI card, and then try that bad-driver bitching again.


----------



## newtekie1 (Sep 29, 2009)

tkpenalty said:


> Very conservative there only being so mindful for the corporations when the wealth will never get to you. In the end the consumer loses.



Not really.  If it leads to a more playable game for the consumer, then I hardly call that a loss.  If a large number of consumers were having stability issues making the game completely unplayable, and their issues were fixed with little effect on other consumers' ability to play, I don't consider that an overall loss for the consumer.



Imsochobo said:


> AC didn't crash for me or any of my friends. DX10.1.
> There was no problem; review sites didn't have issues either.
> 
> ...



It doesn't matter that your small group of friends didn't have an issue.  I didn't have an issue with the game either, but we know for a fact that there were major issues with nVidia hardware at least, and 2 of my machines were running nVidia hardware at the time, one of which was my main machine that I played the majority of the game on.  When you get a sample size in the millions with no issues, then your argument will be sound, but until then, you have no clue whether there were widespread issues with DX10.1; the only people who know that are the ones working at Ubisoft.

I support nVidia putting money into better development of games for their cards.  Which is exactly what is happening here.  Again, there is just as much evidence that nVidia paid entirely to have the feature added to the game for their cards as there is to say that the feature was already there and nVidia just paid to have it removed for ATi cards.  Either scenario is just as plausible given what we know so far.  The only difference is one makes nVidia out to be the bad guy, and one doesn't, so you really just have to pick whether you want to give nVidia a bad name or not.  Personally, I prefer to give everyone the benefit of the doubt and go with the scenario that makes them look best.


----------



## Imsochobo (Sep 29, 2009)

wiak said:


> Might want to re-evaluate that one, dude.
> Search Google for "NVIDIA Vista Driver FAIL" and you might find a billion pages:
> http://www.google.com/search?hl=en&q=nvidia+vista++fail
> 
> Given the fact that ATI's Catalyst drivers have been released every single month since January 2004, and they even release hotfixes and everything. BTW, did you know that ATI has had both DX10.1 and DX11 hardware long before nvidia?



Totally back the drivers up! Not an issue except 9.2, which could not be upgraded.
Only issue since 2007 for me.
I agree about the X850XT PE, though. Niiiiiiiiiiiiightmare.


----------



## Mistral (Sep 29, 2009)

Sihastru said:


> Again, does it work on an *Intel GPU*? A test made in-house by...



This isn't even worth joking about at the moment...

In any case, I have both an ATI and an nVidia rig, and I'll be picking up the game once the price drops a bit and I actually have time to play it. Who knows, by then a patch might actually "fix" the AA issue.


----------



## btarunr (Sep 29, 2009)

Mussels said:


> despite your trying to use rational logic, you umm, failed.
> http://en.wikipedia.org/wiki/Batman_Arkham_Asylum
> http://img.techpowerup.org/090929/.jpg
> 
> note the "unreal engine" ? you see, this game was made from an existing game engine, well known to work on ATI and NVIDIA hardware with antialiasing.



Yes, Unreal Engine 3's AA is proven to work stably on AMD GPUs. Thanks for cementing my argument.


----------



## newtekie1 (Sep 29, 2009)

wiak said:


> Might want to re-evaluate that one, dude.
> Search Google for "NVIDIA Vista Driver FAIL" and you might find a billion pages:
> http://www.google.com/search?hl=en&q=nvidia+vista++fail
> 
> ...



http://www.googlefight.com/index.php?lang=en_GB&word1=ATi+Crash&word2=nVidia+crash
http://www.googlefight.com/index.php?lang=en_GB&word1=ATi+driver+problem&word2=nVidia+driver+problem
http://www.googlefight.com/index.ph...ver+vista+fail&word2=nVidia+driver+vista+fail

It all depends on what you search for.  Both sides have driver issues; neither is perfect.  nVidia had an issue early on with Vista, which is likely why there are so many hits when you search for it.

However, currently, both sides put out very good drivers on a consistent basis.  So really, the whole "X has better drivers than Y" argument should really stop, because in the present it is hard to pick which is better, and if you look at the past both have had some pretty rocky times.


----------



## Imsochobo (Sep 29, 2009)

newtekie1 said:


> I support nVidia putting money into better development of games for their cards..


Notice- Not full quote!

Well, I wonder what the future will be like if both companies throw mouthfuls of money at crippling the other's cards' performance:
Start up game.
Play, get tired of it.
Want to play a new one.
Shut down, change video card.
Power on, start game.
Play.
....
..

It should be about which card is best made, gives the best performance per buck, or just is the best card on the damn planet, like the MARS, because someone just likes the big e-peeen!
Imagine being at a LAN with some friends: you want to play a game, and you get a disadvantage because you have nvidia, so you whine, whine, whine; then you guys start playing another game, and they get the disadvantage and you the advantage.
It should be supported on all cards; general support is the best way. I don't care if it's 10% faster on an nvidia card; I care if it's 20% slower and lacking features that really do work without any quirks on ati cards but are intentionally disabled because someone paid for it to be like that.


----------



## jaredpace (Sep 29, 2009)

With Batman, they left an option in the settings menu for "optimized NV AA".  This is just MSAA (an efficient method of smoothing jaggies) that both ATI and NVIDIA can do.  The issue was that if it detected an ATI card, that option was not available.  The result was that the older standard method of smoothing jaggies (regular AA), which is much less efficient, became the method by which ATI cards had to render AA in Batman.  That meant that Batman with AA enabled gave ATI cards much worse framerates than NVIDIA cards, because the NV cards were using fast MSAA and the ATI cards were using old, slow regular AA.  Same thing going on with NFS Shift.
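The efficiency gap described here can be sketched with a toy shading-cost model. Everything below (the formulas, the 10% edge fraction) is an illustrative assumption, not a measurement from the game:

```python
# Toy cost model (illustrative only): why driver-forced supersampling-style
# AA costs much more than in-game MSAA. Real GPUs are more complicated; the
# formulas below are back-of-the-envelope assumptions.

def ssaa_shading_cost(pixels: int, samples: int) -> int:
    # Supersampling shades every sample of every pixel.
    return pixels * samples

def msaa_shading_cost(pixels: int, samples: int, edge_fraction: float) -> float:
    # MSAA shades roughly once per pixel, paying the extra sample cost only
    # on the fraction of pixels that sit on triangle edges.
    return pixels + pixels * edge_fraction * (samples - 1)

pixels = 1920 * 1200                          # one frame's worth of pixels
print(ssaa_shading_cost(pixels, 4))           # 4x the work of a no-AA frame
print(msaa_shading_cost(pixels, 4, 0.10))     # ~1.3x, if 10% of pixels are edges
```

Under these assumptions, forcing 4x AA in the driver roughly quadruples the shading work, while edge-only MSAA adds around 30%, which is the performance gap the post is pointing at.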

For PhysX, Eidos and Rocksteady took certain special effects of the game and packaged them to be rendered via CUDA on the NV GPUs.  These effects only work on a GeForce 8800 and higher (along with the MSAA).  However, these effects (fog, smoke, sparks, particle shimmer, cloth, mattresses, destructible tiles, flying papers, aluminum cans, garbage, spiderwebs, destructible elements, haze, etc.) can also all be rendered using an ATI card.  They were just "removed" when CUDA + NV is not detected, since they are part of the CUDA package.  If you check Rage3D, Beyond3D or YouTube you can see people with ATI cards + Core i7s running Batman using MSAA and all the "PhysX" effects (because they edited the PhysX code and tricked the game into thinking it had an NVIDIA card).

Nvidia would love to control the usage of these effects because it makes the game more immersive and appealing to users of their own hardware, while decreasing the "coolness" of the game on ATI hardware.  The sad part is that if you know what you're doing, a few lines of code will have your ATI card running perfect MSAA and your Core i7 running all those fancy special effects in about 5 minutes, and probably at better framerates than NVIDIA (if you have a 5800 series).  The really sad part is that the more they do this and get away with it, the further apart technological competition becomes.  The ATI cards are already very powerful at the hardware level compared to G80/GT200.  With their talented engineers and hardware design team, it's bad that ATI isn't as effective with developer relations, driver programming, and aggressive business practices.
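The detection being described amounts to a vendor-ID whitelist. A minimal sketch of the idea follows; the function and gating logic are hypothetical, not the game's actual code, and only the PCI vendor IDs are real assignments:

```python
# Hypothetical sketch of a vendor-ID check like the one AMD describes: the
# launcher exposes the in-game AA option only when the reported PCI vendor
# ID is NVIDIA's. The IDs are real PCI-SIG assignments; the rest is an
# illustration, not the game's actual code.

VENDOR_NVIDIA = 0x10DE  # NVIDIA's PCI vendor ID
VENDOR_ATI = 0x1002     # ATI/AMD's PCI vendor ID

def in_game_aa_available(reported_vendor_id: int) -> bool:
    """Whitelist check: show the in-game AA option only on NVIDIA hardware."""
    return reported_vendor_id == VENDOR_NVIDIA

# Normal detection on a Radeon: option hidden, so AA must be forced in CCC.
print(in_game_aa_available(VENDOR_ATI))     # False

# AMD's experiment: change only the ID the application sees (hardware
# unchanged) and the in-game AA option appears and works.
print(in_game_aa_available(VENDOR_NVIDIA))  # True
```

This is also why spoofing the device ID was enough to re-enable the option: the check inspects what the application is told about the card, not what the silicon can actually do.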


----------



## Imsochobo (Sep 29, 2009)

If ATI isn't aggressive, I don't know what is.

They are pushing out hardware at very low prices; at the moment demand is governed by price, and the cards are already over-demanded. I can't find them in stock at all: they were there, and they were gone straight away.
The fact that ATI is pushing this out is an aggressive move against nvidia. They are aggressive, but in the way it should be.
Just like Intel and AMD used to do.
Just like ATI and nvidia used to do, before the GF8xxx came and everything started to fall apart rapidly!

ATI is aggressive in pointing out flaws and prices, and in bringing out new products fast and with big improvements. This is also how nvidia did things in the past, and they rocked! Now they're pushing software like they're Microsoft.


----------



## newtekie1 (Sep 29, 2009)

btarunr said:


> So much for misinformed arguments. AMD tested the in-game AA, and it worked. So regardless of this AA implementation being a standard between NVIDIA and AMD, it works, and was yet disabled for ATI hardware.





btarunr said:


> That really isn't a problem. Whether the feature 'works' or not on the given hardware is all that matters, and it does. Stability issues cannot be used as an excuse to completely remove the feature. If stability issues did exist, they should have left the feature available to everyone and worked on them. Besides, the game does not advertise that its AA features don't work on ATI hardware (or that it requires NVIDIA hardware for AA, just like it properly advertises PhysX).





Mussels said:


> despite your trying to use rational logic, you umm, failed.
> http://en.wikipedia.org/wiki/Batman_Arkham_Asylum
> http://img.techpowerup.org/090929/.jpg
> 
> ...





btarunr said:


> Yes, Unreal Engine 3's AA is proven to work stably on AMD GPUs. Thanks for cementing my argument.



Why is it so difficult to understand that this isn't just normal AA?  Of course AA works in the game; it can be forced via the control panel, and could always be forced via the control panel.  In fact, isn't that what we all do when AA isn't an option in a game?  And yes, there are still games released without AA as an option.

But why is it so hard to understand that this isn't traditional AA?  BTA, you of all people should make sure you understand the concept; you are reporting it.  It lowers your credibility to report such crap and make such statements.

This is optimised AA!  Done in a way to limit the performance loss to next to nothing.  This is not a standard feature.  This is not a feature that exists in the Unreal Engine by default.

Yes, AA works, but not the AA that is used in the game!  That is the difference, BTA.  The AA forced in CCC is obviously different, as it comes with a performance hit, unlike the in-game AA setting.  So while the effect might be very similar, they are two different features.

And you cannot confirm that changing the device ID to trick it into working in-game really does function properly, as it was not tested in the actual game.  Again, is it not likely that a part of the game causes an issue with the feature on ATi cards, and the developers simply disabled the feature as a quick fix to get the game shipped?  I mean, the game was already delayed a month on the PC, so we know the developers were under a time crunch to get it shipped... so maybe in the end they did start to implement quick fixes.  Is that so far-fetched?



wahdangun said:


> Then they don't deserve our money (using reasoning no. 1).
> 
> Do you have *solid proof* that the game will crash halfway through (using reasoning 2)?
> 
> ...



Why not? You got a game.  The game isn't any less playable.

Does anyone have any solid proof that it won't crash halfway through?  Even ATi said they only tested the demo.  I do know that I've encountered games, even recently, that would suffer unexplained crashes or have unexpected and unwanted issues caused by visual features enabled in the game, or driver issues.  How many times have we seen "update your drivers" as a response when someone is having a game-crashing issue?  Just as an example: Prototype crashes on my GTX285 if I have AA enabled in the game menu, but works fine on my HD4890 or if I force AA using the nVidia control panel.  And Prototype just came out a few months ago!


----------



## Imsochobo (Sep 29, 2009)

newtekie1 said:


> Why is it so difficult to understand that this isn't just normal AA?
> This is optimised AA!  Done in a way to limit the performance loss to next to nothing.  This is not a standard feature.  This is not a feature that exists in the Unreal Engine by default.



They proved it by changing the ID of the video card: it bumped the performance.
It's also proven that PhysX runs fine without a GPU.

Btw, Prototype is a quick port; it works about as well as GTA4, which is terrible. No matter the maker, blame the devs there; no features were blocked, though.


----------



## Mussels (Sep 29, 2009)

newtekie1 said:


> Why is it so difficult to understand that this isn't just normal AA?  Of course AA works in the game; it can be forced via the control panel, and could always be forced via the control panel.  In fact, isn't that what we all do when AA isn't an option in a game?  And yes, there are still games released without AA as an option.
> 
> But why is it so hard to understand that this isn't traditional AA?  BTA, you of all people should make sure you understand the concept; you are reporting it.  It lowers your credibility to report such crap and make such statements.
> 
> ...



Our point is simple: the game turns the AA settings off instead of taking the other options a normal developer would:

A: Leave the option in game, and ATI has worse performance (but it can be tweaked via drivers).
B: Work with ATI prior to the game's release, giving them the same advantages as nvidia.
C: Disable the setting, sweep it under the rug, and make the userbase who want things to "just work" run it on nvidia cards.

Most games go with A; the good ones go with B. This game went with C.


Where you're going wrong, newtekie, is that you're thinking the in-game AA is some special super-duper thing they cooked up. It's not. Games have used their own in-game AA for as long as in-game AA options have been, well, in-game options. They can say "oh, only AA stuff close to the camera" or "ignore things on this level, it hurt performance badly with all the action".
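The per-object selectivity described above can be sketched like this. The distance cutoff, object structure, and function are invented for the example; a real engine hooks this decision into the renderer, not a list filter:

```python
# Illustrative sketch of "only AA stuff close to the camera": the engine
# decides per object whether AA is worth paying for, while driver-forced AA
# (the CCC route) has no scene knowledge and pays the cost for everything.
# All names and numbers here are made up for the example.

from dataclasses import dataclass

@dataclass
class SceneObject:
    name: str
    distance_from_camera: float  # arbitrary world units

AA_DISTANCE_CUTOFF = 50.0  # hypothetical: skip AA on far-away geometry

def objects_to_antialias(scene):
    """Keep jaggies on distant objects, where they are least visible."""
    return [o.name for o in scene if o.distance_from_camera <= AA_DISTANCE_CUTOFF]

scene = [
    SceneObject("batman", 2.0),
    SceneObject("thug", 12.0),
    SceneObject("distant_tower", 400.0),
]

print(objects_to_antialias(scene))  # ['batman', 'thug']
```

With this kind of hook, the far tower keeps its jaggies cheaply; a driver-level override would anti-alias all three objects and eat the full cost.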

The other assumption you appear to be making is that "ATI gets AA, nvidia gets faster AA". Not the case. ATI didn't get shit until this was made a big issue and the game got patched. NO AA AT ALL.


You're taking the "nvidia can do what they want" approach, but I bet if games came out and said "no AA for nvidia, only ATI", you'd be telling a different story.

*Remember that to even force the AA in the game via CCC, it took hardware ID hacking, public shaming, .exe renaming, and finally a game patch - and that's with an unnecessary performance hit!*


----------



## newtekie1 (Sep 29, 2009)

Imsochobo said:


> They proved it by changing the ID of the video card: it bumped the performance.
> It's also proven that PhysX runs fine without a GPU.
> 
> Btw, Prototype is a quick port; it works about as well as GTA4, which is terrible.



They proved it worked in the *demo*.  Which is all of 15 minutes of the actual game.  There is a lot more to the game than just what was in the demo, and any part of the game could have been giving them problems.

And what do you think Batman is?  What do you think the extra month was for?  Porting it to the PC and adding PhysX...



Mussels said:


> Our point is simple: the game turns the AA settings off instead of taking the other options a normal developer would:
> 
> A: Leave the option in game, and ATI has worse performance (but it can be tweaked via drivers).
> B: Work with ATI prior to the game's release, giving them the same advantages as nvidia.
> ...



A: It would have to be a different setting.
B: When ATi starts paying for developer time, this becomes viable; until then nVidia will get more dev time than ATi.
C: Seems like a good option for a time-crunched game.

And where you and everyone else seem to be going wrong is that you don't understand that the in-game AA used in Batman isn't normal AA.  It is optimized to give next to no performance loss.  When have you seen that?  That is "super-duper" IMO.


----------



## Imsochobo (Sep 29, 2009)

newtekie1 said:


> They proved it worked in the *demo*.  Which is all of 15 minutes of the actual game.  There is a lot more to the game than just what was in the demo, and any part of the game could have been giving them problems.
> 
> And what do you think Batman is?  What do you think the extra month was for?  Porting it to the PC and adding PhysX...



YES!

The engine is already a PC engine; no port needed except higher-res textures, models and maps, and adding PhysX, for the most part.


----------



## Mussels (Sep 29, 2009)

newtekie1 said:


> And what do you think Batman is?  What do you think the extra month was for?  Porting it to the PC and adding PhysX...



You've been asking for evidence of everyone else's unsubstantiated claims; where is yours for this? How do you know this month wasn't spent making the game "better" for their sponsor?

It's hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count), yet you can make up claims like that.




Imsochobo said:


> YES!
> 
> The engine is already a PC engine; no port needed except higher-res textures, models and maps, and adding PhysX, for the most part.



That's what I was aware of too. With compatible engines, they'd only have bug fixes to do (and finding ways to make PhysX look like it's doing something, since the game was designed without it).


----------



## AphexDreamer (Sep 29, 2009)

Come on guys, this shouldn't even be an argument... 

There is no justifying what Nvidia did, and we ATI users should be used to this kind of treatment by now. If Nvidia wants to use cheap methods to trick consumers into thinking their card is better, then fine, I say let them. ATI is doing just fine regardless, and the wiser people out there will always be educated enough to know the truth.


----------



## mdm-adph (Sep 29, 2009)

Imsochobo said:


> YES!
> 
> The engine is already a PC engine; no port needed except higher-res textures, models and maps, and adding PhysX, for the most part.





Mussels said:


> You've been asking for evidence of everyone else's unsubstantiated claims; where is yours for this? How do you know this month wasn't spent making the game "better" for their sponsor?
> 
> It's hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count), yet you can make up claims like that.
> 
> That's what I was aware of too. With compatible engines, they'd only have bug fixes to do (and finding ways to make PhysX look like it's doing something, since the game was designed without it).





AphexDreamer said:


> Come on guys, this shouldn't even be an argument...
> 
> There is no justifying what Nvidia did and we ATI users should be used to this kind of treatment by now. If Nvidia wants to use cheap methods to trick the consumers into thinking their card is better then fine, I say let them. ATI is doing just fine regardless and all the wiser people out there will always know and always be a little more educated to know the truth.




Nope... you're all crazy!  Don't you see what Nvidia was doing here?!?  They were being magnanimous -- uh... they were looking out for the poor ATI player!  

By doing this, they were, uh, *improving* the game experience for ATI users!  How nice of them!!

Wait, that... totally doesn't make any sense at all.


----------



## cauby (Sep 29, 2009)

Boo-hoo to Batman, DC Comics and Nvidia...

I'm playing Spider-Man. Enough of these troubles!


----------



## the_wolf88 (Sep 29, 2009)

> Comic book games are so... c**p.  I don't miss Batman at all.
> 
> "With no in-game AA available to ATI Radeon users, although the features do technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center."
> Problem solved



No it's not!!

You should read the full line:

With no in-game AA available to ATI Radeon users, although the features do technically work on ATI Radeon hardware, the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to use AA on every 3D object in the scene, reducing performance, compared to if the game's in-game AA engine is used.

Performance drops a lot!!!

Damn you, Nvidia.


----------



## AphexDreamer (Sep 29, 2009)

mdm-adph said:


> Nope... you're all crazy!  Don't you see what Nvidia was doing here?!?  They were being magnanimous -- uh... they were looking out for the poor ATI player!
> 
> By doing this, they were, uh, *improving* the game experience for ATI users!  How nice of them!!
> 
> Wait, that... totally doesn't make any sense at all.



How am I crazy? I think we're on the same side here, lol.


----------



## newtekie1 (Sep 29, 2009)

Imsochobo said:


> YES!
> 
> The engine is already a PC engine, no port needed except smash textures models maps and add physx for the most part.



I guess you are right here, the extra time was to add features.



Mussels said:


> You've been asking for evidence of everyone else's unsubstantiated claims; where is yours for this? How do you know this month wasn't spent making the game "better" for their sponsor?



http://www.actiontrip.com/rei/comments_news.phtml?id=080609_9

There you go.  Article explaining the game was delayed to add PhysX.



Mussels said:


> It's hypocritical to say everyone else needs direct evidence (it needs to work in the full game - the demo, with the same engine, does not count), yet you can make up claims like that.
> 
> 
> 
> ...



It's not really hypocritical, as when I'm asked for it, I provide it.

And yes, it is the same engine, I get that; we all do.  However, that doesn't mean it will work in every single part of the game.  That is my point.  Testing it in the demo is one step, but testing it in the real game, playing completely through, is another.


----------



## PEPE3D (Sep 29, 2009)

*Batman AA and ATI*

I tried to play this game. It plays well, with nice graphics, but it crashes a lot. I am angry that this may be due to the fact that I have two ATI cards in CF (4870X2 2GB GDDR5). Great cards; I can pretty much play any game maxed out at 1920x1200, but this game is different. I have to play it at a low resolution or the game won't play at all. I am very annoyed. If I had known this, I would not have spent the money on this game.

Also, I think it's time for developers to start thinking about us. We are the ones that buy the games, regardless of what brand of GPU we have in our PCs. We should look at the reviews of a game before it comes out, and if by any chance the developer suggests that it will play better with NVIDIA or ATI, we the customers should boycott the game. We have the power; they need us, they need our money. Therefore, there should be no preference for what GPU you have. It should play great regardless. These are bad business practices and someone should do something about it. GAMES are not cheap!


----------



## newtekie1 (Sep 29, 2009)

If you plan to boycott every game that plays better on one over the other, don't expect to be playing any games.  They all favor one over the other, that is just how it is.  However, they all also run far beyond acceptably on both, regardless of who they favor.

You are never going to have a game that is playable on one, and completely unplayable on the other.  You might have to lower a setting or two on one or the other, but really all games tend to be very similar on both when using comparable cards.


----------



## Bjorn_Of_Iceland (Sep 29, 2009)

Meh.. I feel the Batman: Arkham Asylum devs are just lazy/out of budget and feared that the engine would need massive code revamps / time consumed if they optimized the game for both ATI and nvidia.

It's the same thing we feel sometimes when we make web apps at work: as long as it runs on IE, it's good for deployment. "To hell with Firefox, Chrome, etc., go-live is tomorrow. Just hide the link and let's do a workaround until we find a long-term solution/fix (that will never see the sunlight)."


----------



## Imsochobo (Sep 29, 2009)

newtekie1 said:


> If you plan to boycott every game that plays better on one over the other, don't expect to be playing any games.  They all favor one over the other, that is just how it is.  However, they all also run far beyond acceptably on both, regardless of who they favor.
> 
> You are never going to have a game that is playable on one, and completely unplayable on the other.  You might have to lower a setting or two on one or the other, but really all games tend to be very similar on both when using comparable cards.



I can play almost all games.
I haven't yet wanted to play a game that doesn't work on ATI, phew.
"Far beyond acceptably on both"? Without AA a game looks ugly, and that ruins my experience of it; that's the reason I don't like to play anything before a new video card comes. Waiting for a 5870 2GB.

Another issue here, re the previous post:
To hell with whether it works with Opera, Chrome, etc.
We have another problem: we only "support" IE due to a security chief from a stubborn country, and we break the law daily because IE sucks.


----------



## leonard_222003 (Sep 29, 2009)

I've always said Nvidia behaves like a child: they argue with Intel, AMD, integrators, even their own customers who say they have a hardware problem; they deny everything.
They act like they are the only ones who sell, let's say, weed, so in their minds they can pretty much do anything.
If they keep this up, eventually we will have 2 divided groups: people who own ATI cards will buy ATI games, and people who own Nvidia cards will buy Nvidia games.
ATI didn't make a move in this direction, but Nvidia thinks they are big enough to impose PhysX/CUDA like everybody on this planet owns only Nvidia video cards; if statistics prove that 90% of possible customers own Nvidia cards, then ATI owners are fuc...ked.
It's sad AMD/ATI aren't this aggressive. Whining about a game not supporting ATI properly is not good. What should I do? What do they expect me to do? You, the big shot with billions of dollars, do something; if you can't, I'll buy Nvidia even if I don't like what they do.
Nvidia owners are probably happy with this; they could even come here and defend the green bastards, but like someone here said, if it were the other way around they would be outraged.
Also, saying DX11 is not so important, but CUDA and PhysX are. Of course. Crap after crap. Come on with the next generation, Nvidia; how much longer are you going to milk the GT200 and G92s?


----------



## ZoneDymo (Sep 29, 2009)

newtekie1 said:


> If you plan to boycott every game that plays better on one over the other, don't expect to be playing any games.  They all favor one over the other, that is just how it is.  However, they all also run far beyond acceptably on both, regardless of who they favor.
> 
> You are never going to have a game that is playable on one, and completely unplayable on the other.  You might have to lower a setting or two on one or the other, but really all games tend to be very similar on both when using comparable cards.




1. Why are you trying so hard to defend Nvidia in this new turn of events?
2. (responding to your message) That is not the issue at all here, as I'm sure you understand; making sure something works well on your product OR paying devs to make the competitors' version worse are quite different things.


----------



## Imsochobo (Sep 29, 2009)

leonard_222003 said:


> I've always said Nvidia behaves like a child: they argue with Intel, AMD, integrators, even their own customers who say they have a hardware problem; they deny everything.
> They act like they are the only ones who sell, let's say, weed, so in their minds they can pretty much do anything.
> If they keep this up, eventually we will have 2 divided groups: people who own ATI cards will buy ATI games, and people who own Nvidia cards will buy Nvidia games.
> ATI didn't make a move in this direction, but Nvidia thinks they are big enough to impose PhysX/CUDA like everybody on this planet owns only Nvidia video cards; if statistics prove that 90% of possible customers own Nvidia cards, then ATI owners are fuc...ked.
> ...



Who knows? Maybe another 2 years for the GT200, renamed across 2 more series; with nvidia's late architecture development, it's 2 years.
ATI is pushing DX11, and I can defend that, because when nvidia finally gets its fist out of its ass (which seems to be stuck), they get the same features. ATI is pushing tech that matrox, ati, intel, nvidia and via can all use!
Not ATi only.


----------



## soryuuha (Sep 29, 2009)

Dear beloved company,

PC gaming already lacks hardcore games; please don't add more stupid politics to the PC gaming industry.

Thank you.


----------



## newtekie1 (Sep 29, 2009)

Imsochobo said:


> I can play almost all games.
> I haven't yet wanted to play a game that doesn't work on ATI, phew.
> "Far beyond acceptably on both"? Without AA a game looks ugly, and that ruins my experience of it; that's the reason I don't like to play anything before a new video card comes. Waiting for a 5870 2GB.
> 
> ...



If you need AA to play, I pity you, and you are not a gamer.  But beyond that, you should have learned how to enable it in CCC a long time ago, because there are a lot of games that don't even give the option of AA unless you force it.  So one would assume you did the same with Batman, and enjoyed it.



ZoneDymo said:


> 1. Why are you trying so hard to defend Nvidia in this new turn of events?
> 2. (responding to your message) That is not the issue at all here, as I'm sure you understand; making sure something works well on your product OR paying devs to make the competitors' version worse are quite different things.



1. Not really defending, just giving an equally plausible explanation of the situation.  I like to give the benefit of the doubt and go with the most likely reason, not conspiracy theories.  Never been one to jump on the conspiracy bandwagon.

2. As I've said, in either of the two scenarios I believe more likely to be true, nVidia did not pay to make the competition any worse than they already were.  ATi's position was not made worse in any way by nVidia; nVidia's position might have been improved, but ATi's position wasn't made worse.


----------



## SteelSix (Sep 29, 2009)

kenkickr said:


> The game is so dark, and it seems most gamers are going to be using Detective Mode, so I don't see a lot of "oohs and ahs" during gameplay or a positive reason to even enable AA and AF. Besides, I think the game would perform better enabling AA in the game instead of in CCC.  From the get-go Nvidia has been all over this game, and when they decided to not even update the Ageia driver and set up their drivers to disable PhysX when a "non-nvidia" card is detected... I decided to say FUCK BATMAN and NVIDIA!!



OMG.. sorry I drooled all over your avatar

Sincerely,
New Cubs fan


----------



## Imsochobo (Sep 29, 2009)

newtekie1 said:


> If you need AA to play, I pity you, and you are not a gamer.  But beyond that, you should have learned how to enable it in CCC a long time ago, because there are a lot of games that don't even give the option of AA unless you force it.  So one would assume you did the same with Batman, and enjoyed it.



I'm NOT A GAMER?
National team player in 3 games.
Europe Class A player.
Had games broadcast on LIVE TV in Germany, and I'm not a gamer?

Don't say things you don't know.
My pro-gaming history might be over, but I'm still a gamer.

You don't even know me, so how can you say that?

Get your facts right.
My statements are backed up by my own testing: PhysX with ATI and nvidia cards did work, but not anymore.
ATI had AA and more performance, no problem.
PhysX can run just fine on a CPU.

Three things I've seen you defend like they're your newborn child.


----------



## Valdez (Sep 29, 2009)

Firstly, it's not nvidia's fault. It's the devs' fault, because they are corrupt and can be bought.
Secondly, it's our fault, the PC gamers: we pirate a lot.
Thirdly, it's nvidia's fault, a company of the worst kind, whose main interest is profit at any cost.


----------



## theubersmurf (Sep 29, 2009)

*The proprietary crap has to end*

I have a GTX 260 currently, and this will be the last card I buy from nvidia, and this kind of thing is the reason. I don't want there to be only one VGA company, just as I don't want there to be only one major CPU maker. 

The proprietary crap has to end. Really, being forced to buy from only one company is just ridiculous. My thinking immediately goes to PhysX/Havok, both of which have become proprietary. Really, had things ended in a way that was good for the consumer, both of those companies would have licensed their APIs to both ATI and nvidia, and it would run on either brand of GPU. Now, Nvidia is blocking the functionality of PhysX if there is an ATI card doing the rendering. (link) The big-business attitude that both nvidia and intel have does a good bit to ruin gaming and computing in general. I'm actually pissed; computers and gaming are time out of mind for me... And this kind of crap is done by companies that charge ridiculous amounts of money given the opportunity. There needs to be some kind of redress... somehow.


----------



## AddSub (Sep 29, 2009)

Other than the obvious sub-segment of the user base, who really cares, or is really surprised by this? Corporations will continue to exert such influence as long as there are such things as corporations. 

The day AMD acquired ATI was, in retrospect, probably the darkest day in computing history, since the unholy amalgamation it created out of AMD and ATI fanboys will probably collapse into a black hole that swallows us all. And any threads on TPU with keywords such as "ATI", "AMD", or "Radeon" will continue to bloat in response rate regardless of the validity of such threads. This thread was such an obvious fanboy catastrophe in the making. Is Charlie Demerjian submitting news on TPU now? I mean, 136 responses in less than five hours, all because of some shi**y comic-book-hero game and some ultimately minor injustice?

Maybe I should be thankful; at least ASUS-related articles have dropped to one per week, from about twenty per day a few years ago.


----------



## Imsochobo (Sep 29, 2009)

What shocks me is that it's been nvidia-fail-related issues non-stop since the HD5xxx was released; before, it was bearable, now it's not.

Call me an AMD hater if you like, I don't mind, but ATI is still the same company, and AMD has done nothing but good with ATI: open source, drivers, market strategies that really make sense! (That might not all be down to the AMD buyout, but none of it is negative at all!)


----------



## SNiiPE_DoGG (Sep 29, 2009)

*Posts Christian Bale Tirade* 



Nvidia is despicable, but hey I've known that since the time I had to RMA my 6th EVGA 680i motherboard because of a fried memory controller!


----------



## zCexVe (Sep 29, 2009)

This is total BS. Either NV agreed to pay all the money the game coders would have made from selling it to ATi VGA owners, or they funded the game from the beginning.
nv: You must have an nvidia GPU to play the games. That's how it's meant to be played.


----------



## temp02 (Sep 29, 2009)

Crysis déjà vu...

NVidia hardware isn't bad; the same can't be said about their so-called _marketing strategies_.


----------



## newtekie1 (Sep 29, 2009)

Imsochobo said:


> I'm NOT A GAMER ?
> National team player in 3 games.
> Europe Class A player
> Had games broadcasted on LIVE TV in germany, and im not a gamer ?.
> ...



If you require AA, you are not a gamer.  Gamers play games because they are fun; they play them for the games themselves, not the visuals.  Period.  No true gamer would make the statement that AA is required to enjoy a game.



Imsochobo said:


> Get youre facts right.
> My statements is backed up by own testings, physx with ati and nvidia cards did work, not anymore.
> Ati had AA and more performance no problem.
> Physx can run just fine on cpu.
> ...



nVidia disabling PhysX when an ATi card is present was a dick move.  I'll be the first to say that.  However, I can understand their frustration and the reasons behind it.  You are too quick to forget that nVidia actually wanted to get PhysX running natively on ATi hardware (no nVidia hardware required), and it was ATi that blocked the effort in every way possible.  The fact is that nVidia was trying to be very helpful in getting PhysX/CUDA running on ATi hardware.  Was the reasoning that it would benefit nVidia in the fight against Intel, and not just the goodness of their hearts?  Probably, but who cares?  The point is that nVidia was trying to get PhysX/CUDA working on ATi hardware, and ATi blocked them at every turn, even going as far as not providing review samples to the review site that was responsible for the original hacked drivers...

ATi had AA, but in the case of Batman it cost performance, unlike the in-game solution, which doesn't.  ATi of course still has better performance, thanks to the HD5870 having a very healthy horsepower lead over nVidia's offerings.

And of course PhysX/CUDA can run just fine on the CPU.  Anything that can run the CUDA environment can run PhysX, and nVidia has from the beginning made the CUDA environment free to use and develop for on any hardware, including ATi's.  However, you have to realize that PhysX itself was ported from running on dedicated hardware to running in a CUDA environment, and issues arise from this, such as not using more than one CPU core.


----------



## Steevo (Sep 29, 2009)

NV is pushing users away. They have finally realized they don't want to price-gouge consumers for hardware anymore, claiming "open source" hardware and software that works only on their overpriced hardware. They just don't want you to buy it. They are tired of your money and your wants and needs. Listen to the great green god: they will tell you how to live your life and what games to play, and in return you will become subservient to them, assimilating into the masses, defending their every flaw with such fervor that others who are normal will wonder about you. 


This is how it is; now all submit to the big green phallic probe.


That, or they feel that us normal gamers are beneath them and will never be smart enough to find out about the shit they pull.


----------



## SNiiPE_DoGG (Sep 29, 2009)

Stop telling people they aren't gamers because they want AA. That doesn't make any flipping sense, as this isn't a dick-waving contest about who is the most HARDCORE gamer, FFS.


----------



## newtekie1 (Sep 29, 2009)

SNiiPE_DoGG said:


> stop telling people they arent gamers because they want AA, the doesnt make any flipping sense as this isnt a dick waving contest about who is the most HARDCORE gamer FFS



No, stop trying to tell people what to say.

No true gamer would require AA.  And no true gamer would say a game has to have AA to be enjoyable.  It is that simple.



Steevo said:


> NV is pushing users away, finally they have realized they don't want to price gouge consumers for hardware, and claim open source hardware and software, to work on only *their overpriced hardware*. They just don't want you to  buy it. They are tired of your money and wants and needs, listen to the great green god, they will tell you how to live your life, what games to play, in return you will become subserviant to them, assimilating more into the masses, you will defend their every flaw with such fervor that others who are normal will wonder about you.
> 
> 
> This is how it is, now all submit to the big green phallic probe.
> ...



Oh please, now you are just making shit up.  Get your facts straight: nVidia has equaled or bettered ATi in price-to-performance for at least the last two generations, in every performance segment the two competed in.


----------



## qubit (Sep 29, 2009)

This sounds like the sort of dirty tricks that nvidia would pull. We need Charlie Demerjian's take on this!


----------



## Valdez (Sep 29, 2009)

newtekie1 said:


> nVidia disabling PhysX when an ATi card is present was a dick move.  I'll be the first to say that.  However, I can understand their frustration, and the reasons behind it.  You are too quick to forget that nVidia actually wanted to get PhysX running natively on ATi hardware(no nVidia hardware required).  And it was ATi that blocked the effort in any way possible.  The fact is that nVidia was trying to be very helpful in getting PhysX/CUDA running on ATi hardware.  Was the reasoning because it would benefit nVidia in the fight against Intel, and not just out of the goodness of their hearts?  Probably, but who cares?  The point was that nVidia was trying to get PhysX/CUDA working on ATi hardware.  And after ATi blocked them at every turn, even going as far as not providing review samples to the review site that was responsible for the original hacked drivers...



Bullshit, and you know it, because you've been told several times.
Nvidia has the source code, and physx is not an open standard; a game with hw physx would run a lot worse on ati hw than on nvidia hw, because it is optimized for nvidia hw.
Nvidia wouldn't let a game with nv's own tech perform better on ati hw.


----------



## newtekie1 (Sep 29, 2009)

Valdez said:


> Bullshit, you know that, because it was told you several times.
> Nvidia has the source code, the physx is not an open standard, a game with hw physx would run a lot worse on ati hw than on nvidia hw, because it is optimized to nvidia hw.



Bullshit, the tests with hacked drivers showed PhysX running just fine on ATi hardware.

You are seriously overestimating the power required to run PhysX; any current ATi hardware would completely kill in PhysX performance.  Remember, the original hardware the PhysX API ran on was 128MB PCI cards...


----------



## Valdez (Sep 29, 2009)

newtekie1 said:


> Bullshit, the tests with hacked drivers were showing PhysX running just fine on ATi hardware.



Where did i write it doesn't run?


----------



## newtekie1 (Sep 29, 2009)

Valdez said:


> Where did i write it doesn't run?



"Running just fine" means it has good performance.  Simply running, with shitty performance, would be far from "fine", wouldn't you say?  Or do you not understand what the word fine means?


----------



## Steevo (Sep 29, 2009)

newtekie1 said:


> Oh please, now you are just making shit up.  Get your facts straight, nVidia has equaled or bettered ATi in price to performance for at least the last two generations in every performance segment the two competed in.



At least you admit that the rest is true. Knowing you have a problem is the first step.

BTW, you hadn't even posted when I was writing that post, and you almost jumped right into it, as if some of us could tell what you were going to say and do..........


----------



## Valdez (Sep 29, 2009)

newtekie1 said:


> "Running just fine" means it has good performance.  Simply running with shitty performance would be far from "fine" would you say?



So you can imagine a game with hw physx running better on ati hw than on its competitor from nvidia?


----------



## qubit (Sep 29, 2009)

*+1*



newtekie1 said:


> Bullshit, the tests with hacked drivers were showing PhysX running just fine on ATi hardware.
> 
> You are seriously over estimating the power required to run PhysX, any current ATi hardware would have been able to completely kill in PhysX performance.  Remember, the original hardware the PhysX API ran on was 128MB PCI cards...




Yeah, I also remember the reports of PhysX running quite well on AMD with a driver wrapper. This whole restriction is down to politics/business and nothing else.

Apparently, PhysX was offered to AMD but they didn't want it because it was "nvidia" stuff, but don't quote me on that.


----------



## newtekie1 (Sep 29, 2009)

Steevo said:


> At least you admit that the rest is true. Knowing you have a problem is the first step.



No, I've already addressed the rest previously, and I don't feel like repeating myself.



Valdez said:


> So you can imagine a game with hw physx running better on ati hw than it's competitor from nvidia?



With PhysX there really isn't a "better"; it really just works or it doesn't.  Obviously there are some slight variations, largely dependent on the amount of physics being calculated using PhysX, which is why nVidia has started to increase the requirements for the video cards that run PhysX, and why Batman actually has different PhysX levels that demand different levels of performance from the PhysX card.

However, at the time the hacked drivers surfaced, it either just worked or it didn't.  The performance issues came down to rendering the actual graphics more than to PhysX.  So we never really got to test whether the higher levels of PhysX in modern games like Batman would really be hindered on an ATi card.

Even if development had continued once the hacked drivers were released, I don't think we would ever have seen PhysX running better on ATi than nVidia, simply because development for the ATi side was severely delayed.  However, I do believe that similar cards from the two would have performed similarly.  Not identically, but similarly.


----------



## CrAsHnBuRnXp (Sep 29, 2009)

I like how only the Ati ppl are bitching about this.


----------



## Valdez (Sep 29, 2009)

newtekie1 said:


> With PhysX there really isn't a "better", it really just kind of works or it doesn't.  Obviously, there are some slight variations, largely dependent on the amount of physic being calculated using PhysX, which is why nVidia has started to increase the requirements for the video cards that run PhysX.  And why Batman actually has different PhysX levels that require different levels of performance from the PhysX card.
> 
> However, at the time when the hacked drivers surface, it either just worked or it didn't.  The performance issues came down to rendering the acutal graphics more than PhysX.  So we never really got to test if the higher levels of PhysX in modern games like Batman would really be hindered on an ATi card.
> 
> Even if development did continue once the hacked drivers were released, I don't think we would ever see PhysX running better on ATi than nVidia, simply because developement for the ATi side was severally delayed.  However, I do believe that similar cards from the two would have performed similarly.  Not identical but similarly.




Ah yes, and a bit later, when physx becomes an industry standard because ati has accepted nvidia's "generous" offer, the user looks at the graphs and every game that uses hw physx shows more fps on nvidia hw. It would work on ati hw too, but slower.

Development on the ati side? What development? Nvidia would develop physx for ati hw if ati accepted it? LOL, you're very naive.


----------



## leonard_222003 (Sep 29, 2009)

newtekie1 said:


> nVidia disabling PhysX when an ATi card is present was a dick move.  I'll be the first to say that.  However, I can understand their frustration, and the reasons behind it.  You are too quick to forget that nVidia actually wanted to get PhysX running natively on ATi hardware(no nVidia hardware required).  And it was ATi that blocked the effort in any way possible.  The fact is that nVidia was trying to be very helpful in getting PhysX/CUDA running on ATi hardware.  Was the reasoning because it would benefit nVidia in the fight against Intel, and not just out of the goodness of their hearts?  Probably, but who cares?  The point was that nVidia was trying to get PhysX/CUDA working on ATi hardware.  And after ATi blocked them at every turn, even going as far as not providing review samples to the review site that was responsible for the original hacked drivers...


HAHAHA, are you crazy, man? Do you take us for some very stupid people?
It's a given that Nvidia would ask for some major licensing money if AMD/ATI wanted physx; it's a given they would try to make it run like shit on AMD's hardware and try to ruin them with renewed licensing contracts for this technology. It was never an option for AMD/ATI to use a technology bought by Nvidia from Ageia for who knows how many millions of dollars.
What you say is just plain stupid and insults our intelligence; also, AMD/ATI said they were never called by Nvidia about physx.


----------



## AphexDreamer (Sep 29, 2009)

CrAsHnBuRnXp said:


> I like how only the Ati ppl are bitching about this.



That's a generalization; I personally couldn't care less. This is nothing new.


----------



## theubersmurf (Sep 29, 2009)

CrAsHnBuRnXp said:


> I like how only the Ati ppl are bitching about this.


It makes sense that it would work that way.  However, I've been an nvidia user for a while now, and I'm "bitching" about it too. I'm pretty sick of their shenanigans, and I'm jumping ship... Ironically that makes me a potential ATI user, so on that basis you can lump me in.


----------



## erocker (Sep 29, 2009)

Yes, and the sole reason I stopped using Nvidia was due to their business practices. Stupid me, I went and bought a GTX 260 anyways. Great card, physX was neat. Got bored, bought another ATi card. Plus, when I buy ATi, the money stays closer to home.



CrAsHnBuRnXp said:


> I like how only the Ati ppl are bitching about this.



What are Ati people? Meh, I could just go buy a Nvidia card but have too many reasons to not want one.

*I just realized, I'm in the wrong thread.


----------



## CrAsHnBuRnXp (Sep 29, 2009)

AphexDreamer said:


> Thats a generalization, I personally could care less. This is nothing new.


I couldn't care less either. 


theubersmurf said:


> It makes sense that it would work that way.  However, I've been an invida user for a while now, and I'm "bitching" about it too. I'm pretty sick of their shenanigans, and I'm jumping ship...Ironically that makes me a potential ATI user, so based on that, you can lump me in.


I couldn't care less what they do. I mainly use their cards because they are usually the best by the time I need a new card, and they use a black PCB. I hate red PCBs and thus refused to buy ATI cards. That's really the only reason I never used ATI. Retarded, I know, but I like things to match. But now that XFX is making ATI cards, and making them with black PCBs, ATI is a possibility for me now. 

And if you're an nVIDIA user and you're tired of their "shenanigans", then jump ship so they can stop benefiting you. 


erocker said:


> Yes, and the sole reason I stopped using Nvidia was due to their business practices. Stupid me, I went and bought a GTX 260 anyways. Great card, physX was neat. Got bored, bought another ATi card. Plus, when I buy ATi, the money stays closer to home.
> 
> What are Ati people? Meh, I could just go buy a Nvidia card but have too many reasons to not want one.


And I'm sure many others have many reasons not to go buy an ATI card. It's user preference.


----------



## El Fiendo (Sep 29, 2009)

I'm sorry, but I can't afford the time to read 160+ replies, so if it's been mentioned already, I apologize.

Unreal 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI by removing a feature; this is NVIDIA working with the game developers to add a feature to the game that ATI never bothered to. Nothing stopped ATI from working with the game developers to enable AA in software. This isn't underhanded.


----------



## BigBruser13 (Sep 29, 2009)

*What would you do?*

If nvidia comes out with a monster GPU, would you buy it even if they did many more (imagined) unethical things? Or buy ati because they are good, not great, but honest?

I honestly am on the fence at this point, leaning towards the honest side.


----------



## CrAsHnBuRnXp (Sep 29, 2009)

BigBruser13 said:


> If nividia comes out with a monster gpu 1 p=-



Wouldn't care.


----------



## Valdez (Sep 29, 2009)

El Fiendo said:


> I'm sorry but I can't afford the time to read 160+ replies, so if its been mentioned I'm sorry.
> 
> Unreal 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI and removing a feature, this is NVIDIA worked with the game developers and added a feature to the game that ATI never bothered to do. Nothing stopped ATI from working with the game developers to enable AA via software. This isn't underhanded.



I don't think ati has any say in the development of a game that is under the TWIMTBP program. But I could be wrong.


----------



## Mistral (Sep 29, 2009)

El Fiendo said:


> I'm sorry but I can't afford the time to read 160+ replies, so if its been mentioned I'm sorry.
> 
> Unreal 3 doesn't come with native AA support, so this isn't NVIDIA shafting ATI and removing a feature, this is NVIDIA worked with the game developers and added a feature to the game that ATI never bothered to do. Nothing stopped ATI from working with the game developers to enable AA via software. This isn't underhanded.



Just for the record, and since you didn't read the 160+ replies (which is completely understandable) - Batman:AA runs on Unreal 3.5


----------



## newtekie1 (Sep 29, 2009)

Valdez said:


> Ah yes, and a bit later when physx becomes an industry standard, because ati has accepted nvidia's "generous" offer, the user looks at the graps and every game that uses hw physx would show more fps on nvidia hw. It would work on ati hw too, but slower.
> 
> Development on ati side? What development? Nvidia would develop physx for ati hw if ati would accept it? LOL You're very naive.



There is really nothing to show that PhysX would run any worse on ATi hardware.  While it might run worse on future titles, I doubt the hardware will actually struggle.

If an HD3870 could stomp through PhysX with no issue back then, I doubt something like an HD5870 would struggle today.  Or even something from the HD4800 series.



leonard_222003 said:


> HAHAHA , are you crazy man , do you take us for some very stupid people ?
> It's a given that Nvidia would ask some major licesing money if AMD/ATI wants physx , it's a given they would try to make it run like shit on AMD's hardware and try to ruin them with renewed contracts about licensing this tehnology , it was never an option for AMD/ATI to use a tehnology bought by Nvidia from Ageia with who knows how many millions of dollars.
> What you say is just plain stupid and you insult our intelligence , also AMD/ATI said they were never called by Nvidia about physx.



Do a little research.  There would have been no licensing fee for ATi; the only thing ATi would have had to do was support the development.  The PhysX API, engine, and SDK are provided free of charge by nVidia to anyone who wants to use them.  The hardware developer just has to provide hardware drivers that support it.

Again, nVidia was more than willing to help the developer get PhysX/CUDA running on ATi hardware, with no licensing or fees involved at all.  They were not going to do it themselves, but they were willing to help the developer who wanted to do it.  The problem was that ATi refused to help in any way.


----------



## El Fiendo (Sep 29, 2009)

Mistral said:


> Just for the record, and since you didn't read the 160+ replies (which is completely understandable) - Batman:AA runs on Unreal 3.5



Right, and through all my searching I can't find anywhere that says AA is natively supported in 3 or 3.5. If it was added by the developers and co-developed by NVIDIA, then there is no problem. Does anyone have the spec sheet that says UE3.5 has native AA in the engine?


----------



## Valdez (Sep 29, 2009)

newtekie1 said:


> There is really nothing to show that PhysX would run any worse on ATi hardware, really.  While it might run worse in the future, I doubt that the hardware in the future will actually struggle.
> 
> If an HD3870 could stomp through PhysX with no issue back then, I doubt something like an HD5870 would stuggle today.  Or even something from the HD4800 series.



What I'm talking about is not a hardware issue. If a hw physx title ran slower on ati hw, it wouldn't be because the hw is actually weaker; it would be because the source code is in nv's hands, and they wouldn't let ati win. It's fully logical:
if I own a technology, then I own the tools to be the best under any circumstances.

I really can't explain myself better; my English isn't so good. But if you're right, and nvidia is as generous as you describe them, then it's time for them to port physx from cuda to opencl, and the physx source code will be free for everyone.


----------



## InTeL-iNsIdE (Sep 29, 2009)

CrAsHnBuRnXp said:


> I like how only the Ati ppl are bitching about this.



I like how all the nvidia users are defending nvidia to the death


----------



## newtekie1 (Sep 29, 2009)

Valdez said:


> What i'm talking about is not a hardware issue. If a hw physx title would run slower on ati hw is not because it is actually weaker hw. It is because the source code is in nv hand, and wouldn't let ati win. It's fully logical.
> If i own a technology, then i own the tools, to be the best at any circumstances.
> 
> I really can't explain myself better, my English isn't so good. But if you're right, and nvidia is so generous as you describe them, then it is time them to port physx from cuda to opencl, and physx source code will be free for everyone.



I get what you are saying, but what I'm saying is that there is really no way for nVidia to do this.  PhysX takes so little GPU power to run that it wouldn't be feasible.

There are several things you have to consider, such as the fact that an outside developer was the one doing the developing; he was just being assisted by nVidia after his initial breakthrough.  They essentially provided him with any documentation and development tools he needed.

Also, CUDA is by design hardware-independent.  Once a hardware vendor writes a driver to support CUDA, it will work.  There really isn't a whole lot nVidia can do to make it perform worse on one vendor than the other, and if they did, it would immediately send up red flags, because the difference would be drastic.


----------



## El Fiendo (Sep 29, 2009)

InTeL-iNsIdE said:


> I like how all the nvidia users are defending nvidia to the death




Just like ATI users defend ATI against baseless claims.

I've looked, and I can't find anywhere that says Anti-Aliasing is natively supported in any of Unreal Engine's current iterations. In fact, all I find are threads lamenting how UE3.x doesn't support AA at all unless forced through hardware. That means NVIDIA paid extra money to get it put in, and it would be stupid of them to give it to ATI users too. Why? Because ATI isn't paying for it; NVIDIA is. They didn't remove a feature. They added a feature for their own market. ATI didn't follow suit and add AA for their market, and now they've been 'foul played'.

I find it odd that this Ian McNaughton guy is putting forward this half-truth, and if I'm correct I've actually lost respect for ATI in this case because of it. Again, if anyone can prove otherwise (that UE3.5 supports AA and NVIDIA removed usage of AA for ATI _*instead*_ of adding AA for their own buyers), then I'll retract my claims. 

Until then, it looks like NVIDIA actually did the gaming market a favor by adding AA, and is owed an apology by roughly 85% of this thread. I wouldn't bother waiting for an apology if I were them, though.
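Whichever way the blame falls, the vendor-gating mechanism itself also explains the device-ID trick from the original article: an option keyed off the reported PCI vendor ID reappears the moment the ID is spoofed. A minimal, purely hypothetical sketch; the vendor IDs are the real well-known PCI constants, but `aa_option_visible` and everything else here is invented for illustration, not actual launcher code:

```python
# Hypothetical illustration of vendor-ID feature gating.
# The PCI vendor IDs are real, well-known constants; the logic is invented.
VENDOR_NVIDIA = 0x10DE
VENDOR_AMD_ATI = 0x1002

def aa_option_visible(vendor_id: int) -> bool:
    """Expose the in-game AA option only when an NVIDIA GPU is reported."""
    return vendor_id == VENDOR_NVIDIA

# A Radeon reporting its real ID loses the option...
print(aa_option_visible(VENDOR_AMD_ATI))   # False
# ...while the same card reporting a spoofed NVIDIA ID gets it back,
# which is what AMD's changed-device-ID experiment demonstrated.
print(aa_option_visible(VENDOR_NVIDIA))    # True
```

The point of the sketch is only that such a check gates on identity, not capability, which is why the feature "technically works" on Radeon hardware once the check is fooled.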


----------



## mR Yellow (Sep 29, 2009)

Just wasted an hour reading this thread. 
One thing is for sure: nVidia is always involved in these shady tactics.


----------



## Valdez (Sep 29, 2009)

newtekie1 said:


> I get what you are saying, but what I'm saying, is there is really no way for nVidia to do this.  PhysX takes so little GPU power to run, that it wouldn't be feasable.



Actually physx needs a lot of gpu power to run.

http://www.pcgameshardware.de/screenshots/original/2009/09/Batman_Arkham_Asylum_Benchmarks_4.PNG


----------



## Benetanegia (Sep 29, 2009)

El Fiendo said:


> Just like ATI users defend ATI against baseless claims.
> 
> I've looked and I can't find once where it says Anti - Aliasing is natively supported in any of Unreal Engine's current iterations. In fact all I find are threads lamenting how UE3xx doesn't support AA at all unless done through hardware. That means NVIDIA paid extra money to get it put in, and it would be stupid of them to allow it to ATI users too. Why? Because ATI isn't paying for it, NVIDIA is. They didn't remove a feature. They added a feature for their own market. ATI didn't follow suite and add AA for their market, *not* they've been 'foul played'.
> 
> ...



I agree, but it's even worse, IMO. From what I read, they discovered all this after the game had launched!!! That means they had no contact with the developer at all! I mean, if you are a GPU maker, don't you contact developers and try to optimize before launch, or at least start working on optimizing the full game before it launches? Don't you ask for a copy? IMO, if they cared so little about the game that they didn't even contact the developer, AMD deserves every bit of unoptimized code they get. Especially when it comes down to a feature that was never there and was developed for Nvidia, at their request, paid for with their money. The fact that the optimization works on Ati cards as well changes nothing, IMO. If I were the developer I would have done the same.


----------



## El Fiendo (Sep 29, 2009)

Well, just look at the history of ATI driver releases: almost every game that comes out gets a patch a while _after_ the fact, and continuously so. I'd say ATI has a reactionary approach to supporting games rather than a proactive one. 

What doesn't make sense to me is why everyone was so ready to jump down NVIDIA's throat. And seriously, hear me out on this one: there are shit-tons of games that are 'TWIMTBP' and have in-game AA for ATI. Why would they cock-block ATI on this game alone? This reeks more of ATI not supporting the game out of the gate, like most games that get emergency patches from them, than it does anything else. 

Even the ATI fanboys should have taken this one with a grain of salt.


----------



## xstayxtruex (Sep 29, 2009)

Benetanegia said:


> I agree, but it's even worse IMO. From what I read they have discovered all this after the game has launched!!! That means they had no contact with the developer at all! I mean if you are a GPU maker, don't you contact developers and try to optimize before launch or at least start working on the optimization of the full game before it launches? Don't you ask for a copy? IMO if they cared so little about that game that they didn't even contact them, AMD deserves every bit of unoptimized code they get. Especially if it comes from a feature that has never been there and was developed for Nvidia at their request, paid by their money. The fact that the optimization works on Ati cards as well, changes nothing IMO. If I was the developer I would have done the same.



IMO people should stop caring IMO move on to the next game IMO game is simplistic and easy to beat IMO :spoiler: batman dies :spoiler:


----------



## newtekie1 (Sep 29, 2009)

Valdez said:


> Actually physx need a lot of gpu power to run.
> 
> http://www.pcgameshardware.de/screenshots/original/2009/09/Batman_Arkham_Asylum_Benchmarks_4.PNG



No, it requires a lot of GPU power to render all the extra objects created by PhysX; it requires next to no GPU power to actually run PhysX.  Rendering the extra objects all depends on the card's ability to render graphics.



El Fiendo said:


> Well, and just looking at the history of ATI driver releases. Almost every game that comes out gets a patch awhile _after_ the fact, and continuously so. I'd say ATI has a more reactionary approach when it comes to supporting games, rather than a proactive one.
> 
> What doesn't make sense to me is why everyone was so ready to jump down NVIDIA's throat. And seriously, hear me out on this one. There are shit tons of games that are 'TWIMTBP' and have in game AA for ATI. Why would they cock block ATI on this game alone? This reeks more of ATI not supporting the game out of the gate, like most games that get emergency patches from them, than it does anything else.
> 
> Even the ATI fanboys should have looked at this one with a grain of salt.



What is even more interesting, and something I just noticed, is that in the demo's graphics launcher the option is even referred to as "nVidia Multi Sample Anti-Aliasing".  Kind of makes sense to call it that if nVidia was the one that paid to have it added to the game...


----------



## InTeL-iNsIdE (Sep 29, 2009)

El Fiendo said:


> Well, and just looking at the history of ATI driver releases. Almost every game that comes out gets a patch a while after the fact, and continuously so. I'd say ATI has a more reactionary approach when it comes to supporting games, rather than a proactive one.
> 
> What doesn't make sense to me is why everyone was so ready to jump down NVIDIA's throat. And seriously, hear me out on this one. There are shit tons of games that are TWIMTBP and have in-game AA for ATI. Why would they cock block ATI on this game alone? This reeks more of ATI not supporting the game out of the gate, like most games that get emergency patches from them, than it does anything else.
> 
> Even the ATI fanboys should have looked at this one with a grain of salt.



Surely you don't think this is the only game where nvidia has been playing dirty? Since the whole TWIMTBP thing started there have been numerous claims, across different games, of poor performance on ATI cards that doesn't make sense compared with similarly performing nvidia cards, and to be honest this thread is just repetitive now on both sides.

I really couldn't care less; the more I see "TWIMTBP" in games, the more I will not buy another nvidia card, simple as that. And if it comes to it I won't buy the games if it continues, and both nvidia and the game devs can go screw themselves. This is the view of many people, including nvidia users as well!

It's not good practice, and if they carry on with these tactics it will hurt them in the end.


----------



## El Fiendo (Sep 29, 2009)

I'm pretty sure most of the instances of people complaining about 'TWIMTBP' were fixed later by an ATI driver fix or a game fix that wasn't related to NVIDIA tampering.

Start naming the issues with 'TWIMTBP' that you've seen and we'll research how many turned out to be NVIDIA straight up tampering versus just a broken driver that was later fixed. I'm not being an ass; I'm curious to know the numbers on this myself.


Edit: If that's the case, Newtekie, then it looks like it truly is an NVIDIA _added_ feature. I almost expect to see NVIDIA demanding an apology from Ian McNaughton over this.


----------



## newtekie1 (Sep 29, 2009)

InTeL-iNsIdE said:


> Surely you dont think this is the only game where nvidia have been playing dirty ? since the whole twimtbp there have been numerous claims regarding different games of poor performance on ATI cards that doesnt make sense looking at similar performing nvidia cards, and to be honest this thread is just repetitive now on both sides.
> 
> I really could care less, the more I see "TWIMTBP" in games the more I will not buy another nvidia card simple as that, and if it comes to it I wont buy the games if it continues and both nvidia and the game devs can go screw themselves, this is the view of many people including nvidia users aswell!
> 
> Its not good practice and if they carry on with these tactics it will hurt them in the end.



How is that playing dirty, exactly?  TWIMTBP's entire purpose is to optimize games to run better on nVidia hardware.  Why are you surprised when it actually works?  And furthermore, how is it playing dirty?  People like to claim that games running worse on ATi hardware than on comparable nVidia hardware is proof that nVidia is somehow hindering ATi's performance, but it is really just proof that the TWIMTBP program is doing its job: optimizing performance on nVidia hardware.

ATi had a similar program, but they dropped it in favor of doing their own optimizations in drivers.  It is just two different approaches to optimization.


----------



## Valdez (Sep 29, 2009)

newtekie1 said:


> No, it requires a lot of GPU power to render all the extra objects created by PhysX, it requires next to no GPU power to actually run PhysX.  The rendering of the extra objects would all depend on the cards ability to render graphics.



Well, there must be a lot of stuff that comes with PhysX enabled, because it halved the fps!
(then high PhysX took another 20% off that already halved fps)



> Also, CUDA is designed by its nature to be hardware independent. Once the hardware vendor writes the driver to support CUDA, it will work. There really isn't a whole lot nVidia can do to make it perform worse on one over the other, and if they did, it would immediately send up red flags because the difference would be drastic.



Nvidia wants to spread PhysX, right? Then why should ati make a driver for cuda, if it already has its own api (ati stream 1.x, 2.x), and there is a common api called opencl (currently at 1.0)? If nvidia wants to spread PhysX, then it should port it to opencl, so it becomes available to every card. 
Nvidia would do this, except... it wants to keep manipulating things with this PhysX stuff.


----------



## Benetanegia (Sep 29, 2009)

newtekie1 said:


> How is that playing dirty exactly?  TWIMTBP's entire purpose it to optimize games to run better on nVidia hardware.  Why are you surprised when it actually works?  And furthermore, how it is playing dirty?
> 
> ATi had a similar program, but they dropped it in favor of doing their own optimizations in drivers.  It is just two different aproaches to optimization.



Angry fanboys are not able to differentiate between improving performance for one card and making the other run slower. The same people don't know what optimization is, so you know why that happens. Then again, all of them expect a 20% improvement from specific driver releases, and see it as normal when it happens, but think that Nvidia achieved that much optimization and more through months of work? No, that's impossible.


----------



## Valdez (Sep 29, 2009)

newtekie1 said:


> but it is really just proof that TWIMTBP program is doing its job, optimizing performance on nVidia hardware.



...meanwhile excluding ati completely from development.


----------



## troyrae360 (Sep 29, 2009)

I think it does happen. What about Assassin's Creed? It came out with DX 10.1 and it worked perfectly, BUT then all of a sudden it disappeared. I was under the understanding that that had something to do with nv not being able to support it.


----------



## El Fiendo (Sep 29, 2009)

InTeL-iNsIdE said:


> You know what, you are the biggest fanboy on this thread which is quite obvious from your constant ass licking and defending of nvidia.
> 
> It's bullshit because there is no reason for it other than nvidia paying the devs, not just to optimise it for nvidia, but to give worse performance on ATI cards.
> 
> But hey, you defend them all you like, I could care less and like I said the more it goes on then the more I will not buy an nvidia product, and a lot of other people feel the exact same way.



Uhh, there are a lot of reasons for devs not to optimize to 100% for either. One key one being that time costs money: the more you screw with code optimizing it, the more chance you have to break it and create more issues for yourself, and the more time spent on something, the more money it costs. What's the best way to cut costs? Here's a scenario.

Say the game runs at 95% on both systems, but they can get it to 100% with another 20,000 extra man hours. To them, 95% is good enough and they don't need to pay developer wages x 20,000 hours. Why? Because at the end of the day it's still playable and both camps can run the game. Nobody is losing out.

But wait! Here's NVIDIA saying 'Hey, we'll give you money, you put TWIMTBP at the front, and spend that time optimizing it for our hardware to 100%. Leave ATI at 95%, we don't care.' Right there, you have the idea of TWIMTBP. It's not meant to take 100% and make it 112.5%. It's not meant to take ATI's 100% down to 50%. 

I think that aside from small optimization tweaks, you'll find people's complaints about TWIMTBP eventually turned out to be something else as the cause.


----------



## newtekie1 (Sep 29, 2009)

Valdez said:


> Well, there should be a lot of shit, that come with physx enabled, because it halved the fps!
> (then high physx required again a 20% over that halved fps)



That has been the case since PhysX was first released on the market, long before nVidia even entered the picture, when PhysX was still owned by Ageia.

The graphical performance hit caused by PhysX is very large.





Valdez said:


> Nvidia wants to spread physx, right? Then why should ati make a driver for cuda, if it already has its own api (ati stream 1.x, 2.x), and there is a common api called opencl (currently at 1.0)? If nvidia wants to spread physx, then he should port it to opencl, to become available to every card.
> Nvidia would do this, except... if he wants to manipulate with this physx stuff.



Actually, ATi wasn't even tasked with making the driver; an outside developer was willing to do it, they just needed some support from ATi.  And at the time ATi Stream was, and still is, pretty much unused; I don't believe it even existed when CUDA and PhysX were developed.  And nVidia only just released an OpenCL compliant driver, because OpenCL has only been around for a short while too.

And at this point, it is kind of pointless to port PhysX over, as it is pretty much dead thanks to DX11's implementation of physics.


----------



## Benetanegia (Sep 29, 2009)

Valdez said:


> ...meantime excluding ati completely from developing.



Game developers owe nothing to Ati. If Ati doesn't help them, why should they help Ati at all? And it goes far beyond that, because not only do they not help them optimize, they don't even care enough to be around when the launch is close and show some interest. So, as I said, they owe them nothing.

They don't owe you anything either, nor do they owe me anything. They make a product as well as they can, or as well as they want, and if it's good enough for you, you buy it. If you don't like it, you don't buy it and they lose. That's how it works.


----------



## dr emulator (madmax) (Sep 29, 2009)

i think what is needed here is a little calm, people. there really is no need for anyone here to get heated; it's just a computer game. yes nvidia has done something to it, so what? are you going to die from it? is it going to make your house burn down and eat your children? mussels said this thread was getting over the top, so please be careful or this thread will face being closed.
we are all adults here, not children in a school yard.
anyways, both cards have their pluses and minuses. i will probably get both,
why? because i can


----------



## newtekie1 (Sep 29, 2009)

InTeL-iNsIdE said:


> You know what, you are the biggest fanboy on this thread which is quite obvious from your constant ass licking and defending of nvidia.
> 
> Its bullshit because there is no reason other than nvidia paying the devs not to optimise it for nvidia, but to give worse performance on ATI cards.
> 
> But hey, you defend them all you like, I could care less and like I said the more it goes on then the more I will not buy an nvidia product, and a lot of other people feel the exact same way.



Says the guy with the big ATi symbol under his name....

And because you are so blinded, you can't even believe that it might be entirely possible that nVidia is actually doing what they say they are doing and paying for the game to be optimized on their hardware.  You find that way too far fetched, and instead believe that they are simply paying to have performance retarded on ATi hardware...that makes sense...I guess...

You know the simplest solution is usually the correct one.  Which seems simpler to you?  The program is being used like nVidia says, optimizing the game to run on their hardware, OR there is a huge conspiracy where nVidia uses the program as a front to hinder performance on ATi hardware and screw ATi over?

But now that you have degraded to simply flaming instead of making intelligent points to back up your argument, I'll ignore you now, as you have lost.


Valdez said:


> ...meantime excluding ati completely from developing.



No, not really.


----------



## pr0n Inspector (Sep 29, 2009)

Google: deferred shading
:shadedshu


----------



## InTeL-iNsIdE (Sep 29, 2009)

newtekie1 said:


> Says the guy with the big ATi symble under his name....
> 
> And because you are so blinded, you can't even believe that it might entirely be possible that nVidia is actually doing what they say they are doing and paying for the game to be optimized on their hardware.  You find that way too far fetched, and instead believe that they are simply paying to have performance retarded on ATi hardware...that makes sense...I guess...
> 
> ...



Says the guy who has been fighting this like his life depended on it? 

oh noes, I have lost on the interwebs  haha. Get over it mate, after all it is a discussion, not a fight/game. 

How is it a conspiracy? ATI is nvidia's only real competitor (I'm sorry, I don't see intel/matrox/SIS as competitors), so why is it so far fetched that so many ATI users who actually own the hardware and have seen the numbers on various "optimised" games are consistently reporting poor performance versus similar or even lower performing nv cards? 

Optimised my arse. It's not xbox 360 vs PS3, it's PC, and it should play the same on similarly performing cards regardless of whether they are ATI or nv. 

It never used to be that way; a game was made and it played the same on both nv and ati as long as they were on par in terms of overall performance, and there were slight differences in image quality between the 2, but that is all.


----------



## eidairaman1 (Sep 29, 2009)

What NV is doing here is reducing the overall sales of the game by pulling this stunt, which in turn will affect their cash flow, since the game won't be bought by users of ATI hardware.  Remember, when anything carrying a logo or slogan such as "TWIMTBP" gets bought, some of that money goes to them as well, just for the logo.


----------



## troyrae360 (Sep 29, 2009)

Once again NV has pulled out the dirty tactics:

"I typically save most gaming news for the semi-regular Binge, but I think that this story deserves its own slot. As part of a software update to its popular Assassin's Creed game, Ubisoft seems to have removed DirectX 10.1 functionality from the game entirely. This is interesting for a few reasons; first and foremost, it just doesn't make sense to remove "main attraction" features from a game - especially if the removal of these features results in reduced game performance on systems using a graphics card supporting DX 10.1. Secondly - and most importantly - because this title is part of Nvidia's "The Way it's Meant to be Played" program, the moves smells very much like collusion - seeing as no current Nvidia graphics cards support DX 10.1.  This was a terrible decision, and one can only wonder if karma will rear its ugly head...as it should."

DX 10.1 offered a 20% increase in performance when AA was being used, but then they scrapped it. Go figure?
I'm assuming that this stunt is along the same lines.


----------



## El Fiendo (Sep 29, 2009)

InTeL-iNsIdE said:


> It never used to be that way, a game was made and it played the same on both nv and ati as long as they were on par in terms of performance overall and there were slight differences in image quality between the 2 but that is all.



You seem unwilling to acknowledge my posts directly refuting yours. No matter; this statement above is entirely wrong. Before TWIMTBP there was Digital Vibrance. ATI itself had a TWIMTBP variant, though I can't remember what it was called. And what of Sideport and its expected massive boost over NVIDIA? Eyefinity? Cuda and PhysX...? Havok?

Everyone has had many marketing gimmicks. And everyone still does.


----------



## DRDNA (Sep 29, 2009)

G@dn!q said:


> what nvidia is doin' to ati sounds like what intel did to amd, and they were fined more than 1 billion for that! i'm just curious what will happen if ati finds proof of the dirty games of nvidia! will they be fined the same way as intel? i think that's the only way for nvidia to stop playin' dirty and focus on other things!



I believe there will be a lawsuit if this is not made right.


----------



## El Fiendo (Sep 29, 2009)

DRDNA said:


> I believe there will be a lawsuit if this is not made right.



They'd lose.


----------



## ArmoredCavalry (Sep 29, 2009)

Wow, I think this thread has more replies than the 5870 release... Honestly, this doesn't surprise me... Although you really have to wonder how Nvidia is making so much money that they can not only afford to develop competing cards, but also pay (or bribe, as someone said earlier) developers to optimize games solely for their gpus.

Maybe it has something to do with their $500+ cards?  ... edit: and huge marketshare (in b4 fanboyz)


----------



## btarunr (Sep 29, 2009)

Alright, the off-topic discussions stop here.


----------



## WarEagleAU (Sep 29, 2009)

Phew, thanks BTA I mean I love reading drama as much as anyone but this is getting ridiculous.


----------



## DaedalusHelios (Sep 29, 2009)

leonard_222003 said:


> My god, you either are naive or you insult me even further. If, let's say, PhysX was free, for how much time? 1 month? 1 year? Or until it became a standard and Nvidia would force them to pay if they want to use it? Something like that?
> I'm starting to think I'm wasting my time here. I have better things to do. Good luck with all this and keep it nice .



Same goes for anything proprietary. You think all companies throw in funds evenly to develop all technology?

Owning a technology means that yes, it's yours and you can charge for it. Are you saying that it shouldn't be allowed to generate a return on investment just because ATi doesn't own it? It's the business world and that's how it operates, whether you like it or not.

PS. If they get the Eyefinity going without the need for display port primary or powered adapters I could see grabbing a 5870 or two. Damn bezel sizes.


----------



## BelligerentBill (Sep 29, 2009)

Take this competitive marketing Batman AA rumor with a grain of salt.  Any jab by the competitor who's stuck in a corner should be looked at with extreme skepticism.  This may simply be another marketing ploy (reverse psychology).  As an Nvidia user I know for a fact there are many games where Nvidia's own default forced AA modes are not compatible and the only option Nvidia provides is to 'enhance application settings' which even those do not always work properly.  Depending on how a developer implements AA it may or may not work properly on specific hardware and specific drivers.  When a demo and a finished product are released nearly simultaneously it's pretty safe to assume that a code branch took place before the product was finalized.  This could easily lead to discrepancies such as is taking place here.  Optimizations may have occurred after the demo split that changed how certain features might be implemented, financially motivated or otherwise.

That being said, I'd wait to hear more information before naysaying Nvidia.  Let it be known that I was an ATI owner for YEARS before becoming fed up with certain quality issues never being resolved, and I finally switched to Nvidia about two years ago.  As it stands now, Nvidia has far more to offer the consumer in terms of overall features and upgrade paths than ATI does.  Who stands to benefit the most from a sucker punch?

Just my opinion but I trust no company and am always skeptical when I see something like this...


----------



## troyrae360 (Sep 29, 2009)

Needs AA enabled


----------



## DaedalusHelios (Sep 29, 2009)

troyrae360 said:


> http://img.techpowerup.org/090929/BatmanAndRobin.jpg



You must have something to say other than a batman picture from before you were born.


----------



## Marineborn (Sep 29, 2009)

i don't feel like reading all 8 pages, and i can be accused of being an ati fanboy, but honestly, when's the last time you've seen a game with an opening screen saying ATI!!!!! THE WAY ITS MEANT TO BE PLAYED!! OR!! PHYSX GET IN THE GAME......*blinks*......and if this is true and not a bug, i'm gonna be very sad for nvidia, seeing as they're already living on old technology. the 300 series...really....

we'll see what comes of this. all i know is when i tried to play batman on my system (has an ati card) it would blue screen it....NO game blue screens my computer...i don't even understand it myself, i've reinstalled and everything. but whatever.


----------



## troyrae360 (Sep 29, 2009)

DaedalusHelios said:


> You must have something to say other than a batman picture from before you were born.



I've added something


----------



## El Fiendo (Sep 29, 2009)

Marineborn said:


> i dont feel like reading all 8 pages, and i can be accused of being a ati fanboy, but honestly whens the last time you seen a game with a opening screne Saying ATI!!!!! THE WAY ITS MEANT TO BE PLAYED!! OR!! PHYSX GET IN THE GAME......*blinks*......and if this is true and not a bug, im gonna be very sad for nvidia. seeing there already living on old technology, the 300 series...really....
> 
> well see what comes about this, all i know is when i tried to play batman on my system..has a ati card..it would blue screen it....NO game blue screens my computer...i dont even understand it myself, ive reinstalled and everything. but whatever.








http://en.wikipedia.org/wiki/Get_in_the_game


----------



## DaedalusHelios (Sep 29, 2009)

Marineborn said:


> i dont feel like reading all 8 pages, and i can be accused of being a ati fanboy, but honestly whens the last time you seen a game with a opening screne Saying ATI!!!!! THE WAY ITS MEANT TO BE PLAYED!! OR!! PHYSX GET IN THE GAME......*blinks*......and if this is true and not a bug, im gonna be very sad for nvidia. seeing there already living on old technology, the 300 series...really....
> 
> well see what comes about this, all i know is when i tried to play batman on my system..has a ati card..it would blue screen it....NO game blue screens my computer...i dont even understand it myself, ive reinstalled and everything. but whatever.



If all else fails try a legit copy.


----------



## pr0n Inspector (Sep 29, 2009)

Sad fact of life: Intelligent and un-sensational posts are bad for attracting 13-year-olds. They should be ignored and buried.


----------



## erocker (Sep 29, 2009)

> ...the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to use AA on every 3D object in the scene, reducing performance, compared to if the game's in-game AA engine is used.



Wait.. So I get to use superior AA in the game? Sounds good to me.


----------



## DaedalusHelios (Sep 30, 2009)

troyrae360 said:


> Ive added somthing



Until Christian Bale signed on to the batman franchise the live action series/movies were so fruity/kiddie.

I mean no offense to any Batman enthusiasts here BTW.


----------



## cavemanthreeonesix (Sep 30, 2009)

I'm so glad I didn't give in and buy this game now


----------



## troyrae360 (Sep 30, 2009)

I might hire the game tonight, load it up and see what happens.
Anyone know what might happen if I try to run this program?


----------



## cauby (Sep 30, 2009)

Wow, almost 200 posts! (Ok, I contributed a little...)

So, the conclusion I can draw after reading those 9 pages is that either Nvidia is a big bad company who doesn't like to play fair, or ATI is full of incompetent people. There's another possibility which I think is more reasonable: if Nvidia put their money into helping develop better features for the game (in-game AA and PhysX), then why would they share them with ATI? Is there a rule anywhere saying that they should?


----------



## Marineborn (Sep 30, 2009)

DaedalusHelios said:


> If all else fails try a legit copy.



I was talking about the demo. Yes, it's unfinished, but it still shouldn't blue screen a system, so I just bought it for 360 and it's been playing just fine....


----------



## Kitkat (Sep 30, 2009)

this really blows; I hate this stuff. they're just saying what has been going on for years lol


----------



## kid41212003 (Sep 30, 2009)

Bad move; they're ruining their own reputation. There was no need to go this far :/.


----------



## EastCoasthandle (Sep 30, 2009)

This is a great opportunity to put up a poll to see if people will continue to buy their products after reading about news like this.


----------



## kid41212003 (Sep 30, 2009)

I wouldn't stop buying their products because of this; I buy whoever offers better performance for my games.


----------



## pr0n Inspector (Sep 30, 2009)

This is hilarious. Almost everyone is ignoring the fact that (DX9) UE3 did not have proper AA until now. And this game happens to be a TWIMTBP one. Isn't it obvious that nVidia paid or worked with Rocksteady to make this happen?


----------



## WarEagleAU (Sep 30, 2009)

And we began to rock....Steady!....Steady rockin all night long ::Sings:: Sorry, that brought a flash back!


----------



## Robert-The-Rambler (Sep 30, 2009)

*To Put An End To This Argument*

Just get a 30" monitor with 2560 * 1600 resolution and sit back several feet, and FSAA just won't matter much at all.

Anyhow, I think doing just about anything proprietary on an open platform is a bad way of doing business. Whatever happened to Glide and 3Dfx? Let Nvidia and their Cuda simply fall off a cliff, since the future will bring a better way of doing things anyhow. I've supported ATI for a long time now and have not bought an Nvidia video card since my 8800 GTX. I remember the first AMD Athlon and what it stood for. It meant that some other people existed out there who could do the same thing as the current king of the hill, do it at a better value, and do it with creative thinking and a better product. I think ATI/AMD is doing that again right now, especially on the graphics front, and the way games like Crysis and the like seem to want to sleep only with Nvidia hardware makes me angry. I'll probably still buy Arkham Asylum, but there is no way in hell I'm gonna buy an Nvidia GPU to see FSAA done the "right" way.


----------



## pr0n Inspector (Sep 30, 2009)

Robert-The-Rambler said:


> Just get a 30" monitor with 2560 * 1600 resolution and sit back several feet and FSAA just won't matter much at all.
> 
> Anyhow I think doing just about anything proprietary on an open platform is a bad way of doing business. What ever happened to Glide and 3Dfx? Let Nvidia and their Cuda simply fall off a cliff since the future will bring a better way of doing things anyhow. I've supported ATI for a long time now and have not bought a Nvidia video card since my 8800 GTX. I remember the first AMD Athlon and what it stood for. It meant that some other people existed out there that could do the same thing as the current king of the hill and do it at a better value and do it with creative thinking and a better product. I think ATI/AMD is doing that again right now especially on the graphics front and the way games like Crysis and the likes seem to want to sleep only with Nvidia hardware makes me angry. I'll probably still buy Arkham Asylum but there is no way in hell I'm gonna buy an Nvidia GPU to see FSAA done the "right" way.



And nVidia still churns out the best drivers for *nix systems. How evil.


----------



## tkpenalty (Sep 30, 2009)

Robert-The-Rambler said:


> Just get a 30" monitor with 2560 * 1600 resolution and sit back several feet and FSAA just won't matter much at all.
> 
> Anyhow I think doing just about anything proprietary on an open platform is a bad way of doing business. What ever happened to Glide and 3Dfx? Let Nvidia and their Cuda simply fall off a cliff since the future will bring a better way of doing things anyhow. I've supported ATI for a long time now and have not bought a Nvidia video card since my 8800 GTX. I remember the first AMD Athlon and what it stood for. It meant that some other people existed out there that could do the same thing as the current king of the hill and do it at a better value and do it with creative thinking and a better product. I think ATI/AMD is doing that again right now especially on the graphics front and the way games like Crysis and the likes seem to want to sleep only with Nvidia hardware makes me angry. I'll probably still buy Arkham Asylum but there is no way in hell I'm gonna buy an Nvidia GPU to see FSAA done the "right" way.



Actually, that's not the right way to do AA, just the corner-cutting way.


----------



## Mussels (Sep 30, 2009)

newtekie1 said:


> If you need AA to play, I pity you, and you are not a gamer.  But beyond that, you should have learned how to enable it in CCC a long time ago, because there are a lot of games that don't even give the option of AA unless you force it.  So one would assume you did the same with Batman, and enjoyed it.



unlike nvidia, ATI does not have profiles. if you enable AA in there, you're taking a large performance hit (due to using un-optimised AA, at least until driver updates emerge) AND you have to go into the CCC and turn it on and off every time you change between batman and another game



newtekie1 said:


> No, stop trying to tell people what to say.
> 
> No true gamer would require AA.  And no true gamer would say a game has to have AA to be enjoyable.  It is that simple.



stop trying to tell gamers how to play their games?

i hate gaming without AA, it's the whole reason i have overkill graphics cards.




El Fiendo said:


> Right, and through all my searching I can't find where it says AA is natively supported in 3 or 3.5. If its added in by the developers and its co-developed by NVIDIA then there is no problem. Does anyone have the spec sheet that says UE3.5 has native AA in the engine?



the engine supports it; it's just that on nvidia cards DX10 rendering is normally required to use it.
ATI could always run it, with a few odd glitches when HDR was used at the same time.

http://www.pcbuyersguide.co.za/showthread.php?t=6757






all that's happened here is that nvidia took tweaking to get it to work and ATI didn't, but instead of just letting ATI run it, they "hid" the option to make their sponsors look better.
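A minimal sketch of the kind of vendor gate being described, assuming the launcher keys off the PCI vendor ID reported by the driver. The function name and logic here are hypothetical; the game's actual code is not public. It only illustrates why AMD's device-ID spoofing experiment would flip the result.

```python
# Illustrative sketch only: a hypothetical launcher check that gates a menu
# option on the reported PCI vendor ID. The real launcher's logic is unknown.

VENDOR_NVIDIA = 0x10DE  # PCI vendor ID assigned to NVIDIA
VENDOR_ATI = 0x1002     # PCI vendor ID assigned to ATI/AMD

def ingame_aa_available(vendor_id: int) -> bool:
    """Return True if the in-game AA option should be shown in the menu."""
    return vendor_id == VENDOR_NVIDIA

# Spoofing the reported ID (as AMD described doing with the demo) flips it:
assert not ingame_aa_available(VENDOR_ATI)   # option hidden on Radeon
assert ingame_aa_available(VENDOR_NVIDIA)    # option shown on GeForce
```

This is consistent with McNaughton's account: changing the IDs of ATI cards in the demo made the in-game AA option appear, which a hardware-capability check would not explain.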


----------



## BlackOmega (Sep 30, 2009)

cauby said:


> Wow, almost 200 posts! (Ok, I contributed a little...)
> 
> So, the conclusion I can draw after reading those 9 pages is that either Nvidia is a big bad company who doesn't like to play fair, or ATI is full of incompetent people. There's another possibility which I think is more reasonable: if Nvidia put their money into helping develop better features for the game (in-game AA and PhysX), then why would they share them with ATI? Is there a rule anywhere saying that they should?



I think the more accurate assumption would be that the game had AA already implemented and Nvidia paid the game devs to disable it on ATi hardware. 

I definitely won't be buying this game due to this bullshit. I was planning on upgrading my folding rigs (2 of them) to 260/216s, but that's not going to happen either. I hope Stanford gets the ATi clients right and us ATi guys will finally be able to see some decent PPD from our hardware.


----------



## Mussels (Sep 30, 2009)

the problem isn't that nvidia had more work put in and won't share; it's that they did a dirty trick and hid the setting.

if the setting was there *for all GPUs* it would be fair, like every other game title. it may run worse on AMD, and be useless on intel IGPs, but at least then they could release new drivers to fix the performance/compatibility bugs. nvidia would have a headstart on a fair playing field.



how many people here would mind if crysis 2 didn't allow AA on nvidia cards? you had to run it via your nvidia control panel, and you took a 20% FPS hit over ATI using the inbuilt options?


----------



## pr0n Inspector (Sep 30, 2009)

Mussels said:


> the engine supports it, it's just that on nvidia cards DX10 rendering is normally required to use it.
> ATI could always run it, with a few odd glitches when HDR was used at the same time.
> 
> http://www.pcbuyersguide.co.za/showthread.php?t=6757
> ...



Not HDR. I'm tired of mentioning this again and again.
http://en.wikipedia.org/wiki/Deferred_shading

also, UE3 *did not* support AA in DX9 until now.
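For anyone wondering *why* deferred shading makes MSAA awkward, here's a toy numeric sketch (made-up sample values and a trivial Lambert term, nothing engine-specific): in a deferred renderer, geometry attributes land in a G-buffer and get lit later, so averaging the subsamples *before* lighting gives a different answer at a silhouette edge than lighting each subsample and then averaging, which is what forward MSAA effectively does.

```python
# Toy illustration (made-up numbers) of the deferred shading vs. MSAA clash.
def light(n_dot_l):
    # trivial Lambert term: surfaces facing away from the light get nothing
    return max(n_dot_l, 0.0)

# two subsamples covering an edge pixel: one faces the light, one faces away
samples = [0.8, -0.6]

resolve_then_light = light(sum(samples) / len(samples))             # naive deferred path
light_then_resolve = sum(light(s) for s in samples) / len(samples)  # forward MSAA path

print(round(resolve_then_light, 3))  # 0.1
print(round(light_then_resolve, 3))  # 0.4
```

Same pixel, noticeably different result, which is why UE3's deferred DX9 path couldn't just flip MSAA on and why extra work (DX10 features or a custom resolve) was needed.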


----------



## Mussels (Sep 30, 2009)

pr0n Inspector said:


> Not HDR. I'm tired of mentioning this again and again.
> http://en.wikipedia.org/wiki/Deferred_shading



sorry, pr0n man. HDR was mentioned in the link I found.


----------



## entropy13 (Sep 30, 2009)

Sorry too, pr0n man, we're talking about UE *3.5* here and not UE 3.


----------



## pr0n Inspector (Sep 30, 2009)

entropy13 said:


> Sorry too pron man, we're talking about UE *3.5* here and not UE 3.



And guess what? When Epic announced UE 3.5 there wasn't a single word about AA under DX9. I wonder what that means?


----------



## Mussels (Sep 30, 2009)

pr0n Inspector said:


> And guess what? When Epic announced UE 3.5 there wasn't a single word about AA under DX9. I wonder what that means?



it means what I said earlier.

ATI could do it from the get-go (remember Oblivion, with HDR+AA?) and Nvidia couldn't.

nvidia took the time to make it work (and it does), and cock-blocked ATI from joining the party, when they could do it all along.


----------



## Hayder_Master (Sep 30, 2009)

this year comes with many cool games and I don't have time to play them all, so I was thinking about which game I'd skip this year. Thanks, nvidia, for making the choice easy for me


----------



## wiak (Sep 30, 2009)

i miss the time when they (ATI/NVIDIA) were cheating in 3DMark. At least that wasn't a game, so it didn't really impact us that much


----------



## pr0n Inspector (Sep 30, 2009)

Mussels said:


> it means what I said earlier.
> 
> ATI could do it from the get-go (remember Oblivion, with HDR+AA?) and Nvidia couldn't.
> 
> nvidia took the time to make it work (and it does), and cock-blocked ATI from joining the party, when they could do it all along.



So you are saying this is the same old HDR+AA issue?


----------



## Mussels (Sep 30, 2009)

pr0n Inspector said:


> So you are saying this is the same old HDR+AA issue?



it pretty much is, yes.

ATI followed the DX9/10 specs better than nvidia did, and NV forced games to disable these options so that ATI wouldn't have a feature or performance advantage.

it's an age-old game they play.

ATi had pixel shader 1.4 out when nvidia was stuck on 1.3 (so games stayed on 1.3 longer)
NV released the GF4 MX series without shaders at all (moving BACK two generations in hardware) (which helped NV catch up at PS 2.0, bypassing their lack of PS 1.4)
the FX series only did 16- and 32-bit floating-point precision in PS 2.0 when the standard called for 24-bit (meaning they were slower, or lower quality)
6 series cards were fine, but had slower SM3.0 performance for the most part (though it was fairly even there)
7 series: nothing in my memory, both sides had a fair go
8 series and up: NV didn't support many features of DX10 properly, so they got them removed from the spec and pushed into DX10.1


nvidia's hardware has often been behind ATI in the features game, and they use their influence (such as TWIMTBP) to get those features removed from games, lest ATI have an advantage. That's all this is: ATI can do AA on top of all these fancy features, and nvidia can't (it's part of the DX10.1 spec if you look).

nvidia got B:AA to work on their cards through hard work, and blocked ATI from doing the same (even though it works fine, as proven by forcing it in the CCC - try forcing AA onto an Unreal Engine game with nvidia, and you'll find it doesn't work)


----------



## El Fiendo (Sep 30, 2009)

Hold on a second there Mussels, there's nothing to prove that NVIDIA intentionally hid it. In fact, what seems more likely is that the game developer hid it of their own accord. As you mentioned, it runs, but with bugs. It could just as likely have caused system instability without the tweak, so the game disabled it.

The point is that none of us, you or I, knows what happened. And while I find it fun to watch everyone so ravenously chant bloody murder on the basis of nothing more than text, it's getting old. 

You're forgetting one thing. Why this one game? Why disable AA on JUST this game but none of the others? Were you guys just looking for a reason to burn people at the stake? I'm sure everyone remembers the Far Cry 2 scandal that... oh wait, it wasn't one. It was an error in ATI's driver code. But that was a witch hunt too, wasn't it?

Edit:
Official response from NVIDIA:


“A representative of AMD recently claimed that NVIDIA interfered with anti-aliasing (AA) support for Batman: Arkham Asylum on AMD cards. They also claimed that NVIDIA’s The Way It’s Meant to be Played Program prevents AMD from working with developers for those games.

Both of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.

Games in The Way It’s Meant to be Played are not exclusive to NVIDIA. AMD can also contact developers and work with them.

We are proud of the work we do in The Way It’s Meant to be Played. We work hard to deliver kickass, game-changing features in PC games like PhysX, AA, and 3D Vision for games like Batman. If AMD wants to deliver innovation for PC games then we encourage them to roll up their sleeves and do the same.”

http://www.hardwarecanucks.com/news...ms-meddling-batman-arkham-asylum-aa-features/

So they say the game engine does not natively support it. I doubt they'd get something like that wrong in a press release. So the question is: who do I (/we) believe?


----------



## pr0n Inspector (Sep 30, 2009)

It doesn't "work just fine" btw.


----------



## laszlo (Sep 30, 2009)

i read from page 12, where the thread was yesterday....

a lot of arguments, facts & even proofs... from both sets of supporters..

i like to read between the lines because the truth is there somewhere...

so if this game was made with nvidia money, they have the right to ask the developer to make subtle changes so ati cards can't use a specific feature; whoever has the money and pays makes the rules, in any field

we shouldn't forget that nvidia bought ageia and now they see it wasn't as good an investment as expected, so while they can they'll try to make money from it to compensate for this bad investment; neither physx nor cuda has the expected popularity whatever they do, and they'll lose a lot of $ when similar alternatives gain a higher usage/market share; a perfect analogy is blu-ray and HD DVD; the shift of the market to the 1st was deadly for the other, with huge losses for the 2nd party involved

it's a known fact that both vga producers invest money in games, for obvious reasons: they have to sell cards! both of them have optimized games & favorite engines, and they can ask the developer to write whatever they consider necessary into the game software in order to create an advantage over the competitor

we don't need to be angry or upset over findings like this, because the community has always found solutions to fix these problems for the benefit of all users - no matter if ati or nvidia

from my point of view i consider this thread closed...


----------



## DaedalusHelios (Sep 30, 2009)

laszlo said:


> i read from page 12 where the thread was yesterday....
> 
> a lot of arguments,facts&even proofs ...from both supporters..
> 
> ...




The people moderating this thread (btarunr, Mussels, and Erocker) made it clear they do not see it your way, though.




pr0n Inspector said:


> It doesn't "work just fine" btw.



It's all just hysteria. Too bad people jump the gun without proof. Good read, BTW.


I guess the witch hunt can end with that info.


----------



## Wile E (Sep 30, 2009)

So what did I miss, more people claiming NV used TWIMTBP to do evil things against ATI?

Nonsense. If features were disabled for any reason other than technical, there would already be talks of ATI suing NV. It's either a technical limitation, or a bug.

EDIT: In fact, I would go one step further, and say it's ATI's fault for not getting involved with the game's development like NV does. They have every opportunity to go to the developer and help optimize for their platform.


----------



## Benetanegia (Sep 30, 2009)

pr0n Inspector said:


> It doesn't "work just fine" btw.



That pretty much says it all. You can enable it using tricks, but it doesn't work. It's not anti-aliasing the game at all. And that's because it was a specific AA implementation *ADDED* for Nvidia cards.

All the TWIMTBP bashing is BS anyway, always based on previous cases that weren't true to begin with. This one must be true though, because look what happened with Assassin's Creed, right? Like I always said: if Nvidia didn't want DX10.1 in "their" game, why was it included in the first place? They work closely with the developer, they test a lot, they can test on Ati hardware... No Ati fanboy whiner has ever answered this simple question. The truth is that Nvidia never interfered, not in this game, not in AC, not ever, but the ball just keeps rolling and rolling. The next time something happens, BM:AA will be mentioned even if it's not true, just like what happened in the past. :shadedshu



Wile E said:


> EDIT: In fact, I would go one step further, and say it's ATI's fault for not getting involved with the game's development like NV does. They have every opportunity to go to the developer and help optimize for their platform.



I share that opinion and I already said that in more than one post, but it's lost in the labyrinth of posts this thread has become. 

The one part that the people complaining are not understanding is this.

Quote from Nvidia's answer

"Both of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to *add* AA and *QA* the feature on GeForce. Nothing prevented AMD from doing the same thing."


----------



## zOaib (Sep 30, 2009)

HalfAHertz said:


> I played the demo, the fighting was amazingly addictive. Plus the game ran like a dream on my ATi powered lappy, so as soon as it drops in price and I see it around, I'd totally pick it up.



i got the game free with the GTX 275 i bought a month ago, which is being replaced with an hd 5870 today... and i will play the game with my ATI card at max res for my LCD... the game is quite addictive and the fighting style is very intriguing. since it was free with the gtx, i'll be playing it on an ATI =)


----------



## gumpty (Sep 30, 2009)

Meh,

I read like 5 pages then got bored of reading the same things over and over.

My suggestion: those that don't like what the game developers/nvidia have done - just don't buy the game and at the same time write an e-mail to both the game developers and nvidia telling them the reasons why. When the game devs get a hundred or a thousand e-mails, they'll quickly fix the problem.

*Whining about it on here is going to get nothing done.*

Cheers have a nice day.


----------



## $ReaPeR$ (Sep 30, 2009)

this is NOT healthy competition, this is disgusting. can't they just focus on being more productive? :shadedshu


----------



## DaedalusHelios (Sep 30, 2009)

$ReaPeR$ said:


> this is NOT healthy competition, this is disgusting. cant they just focus on being more productive?:shadedshu




Read the posts above you to understand why blaming Nvidia was BS.


----------



## $ReaPeR$ (Sep 30, 2009)

i just did  thanks!


----------



## DaedalusHelios (Sep 30, 2009)

$ReaPeR$ said:


> i just did  thanks!



*But Nvidia should still lower their prices since the 5870 had such a strong launch.*

If people want to rage about Nvidia, just complain about the pricing. It won't take lies or misconceptions to do so. It's just plain facts.

I got ripped off buying a 7950GX2 back in the day. It scaled like crap and it took the drivers forever to make it scale decently. The 9800GX2 and GTX 295 were another story though (still overpriced).


----------



## wahdangun (Sep 30, 2009)

yep, the conclusion is: *if you are an ati owner then never ever buy this game, they don't deserve our money*


----------



## DaedalusHelios (Sep 30, 2009)

Cruising on caps lock is cool. 

Strange Indonesian kids are raiding TPU! oh noes! 

*EDIT* btarunr just cleaned it up with an edit. Thank you sir.


----------



## gumpty (Sep 30, 2009)

Wow, those are some serious statements, made all the more serious by the judicious use of large lettering.


----------



## newtekie1 (Sep 30, 2009)

pr0n Inspector said:


> It doesn't "work just fine" btw.



Interesting.  So not only does it not actually work, but it also breaks something in the game.

Sounds like one of the reasons I said in the beginning of this thread...



El Fiendo said:


> Both of these claims are NOT true. Batman is based on the Unreal Engine 3, which does not natively support anti-aliasing. We worked closely with Eidos to add AA and QA the feature on GeForce. Nothing prevented AMD from doing the same thing.



Hey, another reason I said in the beginning.

Seems like the simplest solution is most likely to be correct...

You know, a good reporter would put up a retraction correcting his misinformation... of course, real reporters do research to make sure their story is straight before reporting it, rather than wrongfully bashing whoever they believe to be at fault...



DaedalusHelios said:


> *But Nvidia should still lower their prices since the 5870 had such a strong launch.*
> 
> If people want to rage about Nvidia just complain about the pricing. It won't take lies or misconceptions to do so. Its just plain facts.
> 
> I got ripped buying a 7950GX2 back in the day. It scaled like crap and drivers had taken forever to make it scale decently. 9800GX2 and GTX 295's were another story though(still overpriced).



Definitely, and I'm sure they will, it just takes time.  We are only a week out from the launch of the HD5870, so I expect a price cut announcement on at least the GTX285 and GTX295 very soon. (The others still fit well on the performance-per-dollar graph, thanks to the fact that they have already had competition from ATi.)


----------



## wahdangun (Sep 30, 2009)

oops, sorry guys. i didn't mean to bother you all, i'm just angry at the developer.

i will not do that again, i'm really sorry


NB: if you come to indonesia just call me, i will be your guide and i will show you how beautiful indonesia is. and btw i'm 20 now


----------



## Steevo (Sep 30, 2009)

This thread needs to die.


Those who feel offended by NV's and the developer's antics know what not to buy, and those who don't can go on supporting the division of gamers.


----------



## Bjorn_Of_Iceland (Sep 30, 2009)

newtekie1 said:


> Bullshit, the tests with hacked drivers were showing PhysX running just fine on ATi hardware.
> 
> You are seriously over estimating the power required to run PhysX, any current ATi hardware would have been able to completely kill in PhysX performance.  Remember, the original hardware the PhysX API ran on was 128MB PCI cards...



It's not running on ati hardware. It's using the "software mode", which utilizes the CPU for the physics processing, much like Ageia before: you may still be able to get physics effects on screen, with a performance hit, as opposed to having the card itself.


----------



## Benetanegia (Sep 30, 2009)

Bjorn_Of_Iceland said:


> Its not running on ati hardware. Its using the "software mode" which utilizes the CPU for the physics processing. Much like ageia before. In which you may be able to realize physics effects on screen with a performance hit as opposed to having the card itself.



He is talking about the hack they were preparing at ngohq.com, which allowed PhysX to be accelerated on Ati hardware.

http://www.tomshardware.com/news/nvidia-physx-ati,5764.html

Nvidia even gave him a lot of support.

http://www.tomshardware.com/news/nvidia-ati-physx,5841.html

Quote: "In the end, if Badit could get PhysX to run on Radeon cards, the PhysX reach would be extended dramatically and *Nvidia would not be exposed to any fishy business claims - since a third party developer is leading the effort.*"

In the end AMD didn't allow that to happen, and lied about the reasons behind that decision, because they had a deal with Intel's Havok, which only runs on the CPU. Since Intel didn't want GPU acceleration at all, PhysX could not happen; at least fully supported PhysX couldn't happen.

EDIT: And yeah, I know they are slowly porting Havok to run on GPUs too, but that's more than a year after this happened, partly because PhysX has some support after all despite their efforts to block it, and partly because by the time they finish porting it Intel will have Larrabee out. The thing about GPU Havok is so fishy that the demos of Havok running on AMD's HD5xxx used AMD's proprietary Stream API, but the final product is going to be OpenCL...


----------



## eidairaman1 (Sep 30, 2009)

Well you see where Physx is, just like it was when Ageia appeared on the scene in 2005.


----------



## TheMailMan78 (Sep 30, 2009)

Ok here is my take on this whole thing. ATI "fooled" the game into running AA natively by telling the game it was in fact an Nvidia card. Once they did this it ran AA better than a real Nvidia card. So basically the feature was not added to the GeForce game profile but removed from the game's ATI profile. Yes, the Unreal 3 engine does not in fact support AA, but ATI's Catalyst has supported AA in the Unreal engine since, I believe, 9.2. To me this is proof the TWIMTBP program is paying developers to hamstring ATI.

NOW, if AA was offered no matter what GPU you had but in fact ran better on Nvidia, then I would accept the TWIMTBP program as fair play. However, Nvidia cheated ATI users out of something their cards are VERY capable of doing natively. After all, we are talking about AA. Not PhysX.

Nvidia just had a Tonya Harding moment.


----------



## Benetanegia (Sep 30, 2009)

TheMailMan78 said:


> Ok here is my take on this whole thing. ATI "fooled" the game into running AA natively by telling the game it was in fact an Nvidia card. Once they did this it ran AA better than a real Nvidia card. So basically the feature was not added to the GeForce game profile but removed from the game's ATI profile. Yes, the Unreal 3 engine does not in fact support AA, but ATI's Catalyst has supported AA in the Unreal engine since, I believe, 9.2. To me this is proof the TWIMTBP program is paying developers to hamstring ATI.
> 
> NOW, if AA was offered no matter what GPU you had but in fact ran better on Nvidia, then I would accept the TWIMTBP program as fair play. However, Nvidia cheated ATI users out of something their cards are VERY capable of doing natively. After all, we are talking about AA. Not PhysX.
> 
> ...



Did you read the latest info given in the last few posts? Not only is the AA not better on Ati cards, *they are not doing AA at all*, and it breaks the game. :shadedshu


----------



## DaedalusHelios (Sep 30, 2009)

TheMailMan78 did not get the memo. 

Seriously read the other posts and edit if necessary.


----------



## TheMailMan78 (Sep 30, 2009)

Benetanegia said:


> Did you read the latest info that has been given in the last posts? Not only the AA is not better in Ati cards, but *they are not doing AA at all*, and they break the game. :shadedshu





DaedalusHelios said:


> TheMailMan78 did not get the memo.
> 
> Seriously read the other posts and edit if necessary.



There is 11 pages! Give me some links damn it!


----------



## El Fiendo (Sep 30, 2009)

Start on page 7, around my first post. I'm still not sure which way this is going as both sides have good evidence against each other. I tend to lean towards NVIDIA though because breaking only one game doesn't make sense.


----------



## newtekie1 (Sep 30, 2009)

TheMailMan78 said:


> There is 11 pages! Give me some links damn it!



I'm too lazy to look it up, so here is a summary:


- The claim was made that nVidia paid to have AA disabled for ATi hardware.
- The claim was made that AA works.
- The claim was made that there was no reason to disable the feature for ATi hardware, other than nVidia paying to have it disabled.
- Some arguing.
- The claim was made that AA was a feature that nVidia funded the addition of.
- The claim was also made that perhaps the feature was disabled on ATi hardware due to it breaking the game.
- Some arguing.
- The claim was made that AA is a standard feature in the Unreal 3.5 Engine.
- The claim was made that ATi proved it doesn't break the game, because if it works in the demo, it will work in the entire game.
- Some arguing.
- It was revealed that AA is not a standard feature in the Unreal 3.5 Engine, and nVidia did in fact fund the addition of it to the game. (Source)
- It was revealed that changing the device ID to allow AA to be enabled in-game actually breaks the game on ATi hardware. (Source)
- It was revealed that, even with the setting enabled, ATi hardware didn't actually do AA, because the feature was not designed for ATi hardware. (Source)

I think that about covers it.  

The discussion should be pretty much over with that.  There is no wrongdoing on nVidia's part.  They paid for the development and inclusion of AA in Batman; it is only fair that only their hardware gets the benefit.  ATi was more than welcome to do the same, but they didn't; it is their loss, and more importantly the loss of their customers.  And unlike the original reports by ATi claimed, the feature doesn't actually work on ATi hardware.  The setting can be enabled in the demo and full game, but it doesn't actually do anything, and it breaks the full version of the game.

Perhaps if the two of them would work together a little more, we could see extras like this added to all games, working on both.  Though we don't want them working so closely together that we get another price-fixing situation...


----------



## Benetanegia (Sep 30, 2009)

newtekie1 said:


> Perhaps if the two of them would work a little bit more together, we could see extras like this added to all games that work on both.  Though we don't want them working so closely together we get another price fixing situation...



And the feature probably almost works, I mean it probably requires just some light recoding. What it does need is a lot of testing and QA on Ati hardware, with someone with extensive knowledge of the Ati architecture (AKA an AMD engineer) helping a bit, and that's pretty much all. It's not a feature of the UE, and it's not a feature present in DX, not in this exact form at least. So it's not something you can take for granted will work properly under all conditions. A game developer can't release a game with a feature that hasn't been properly tested.


----------



## BelligerentBill (Sep 30, 2009)

I'm amazed this discussion is still going on.  It only shows how ignorant fanboys can be when they care nothing about facts as long as they have found a reason to rant.  Human weakness at its finest.

ATI hacked a *demo*.

The developer did not cripple ATI because Nvidia paid them to do it.  Seriously people... this isn't the US Government.  Somebody needs to get the facts and settle this BS, because I've seen nothing but hearsay from ATI.

At the end of the day, I have an Nvidia card.


----------



## Valdez (Sep 30, 2009)

Nvidia wants to establish a new tradition: gpu makers have to pay for (basic or non-basic) features if they want them in-game.
It will be fun to see a game with nvidia (tm) AA, nvidia (tm) physx, ati (tm) tessellation, s3 (tm) AF, ati (tm) hdr, etc...

Pathetic.

Anyway, bioshock and mass effect had AA through the control panel (on both manufacturers' cards).


----------



## TheMailMan78 (Sep 30, 2009)

newtekie1 said:


> [*]It was revealed that AA is not a standard feature in the Unreal 3.5 Engine, and nVidia did infact fund the addition of it to the game.Source
> [*]It was revealed that changing the device ID to allow AA to be enabled in-game, actually breaks the game on ATi hardware.Source
> [*]It was revealed that, even with the setting enabled, ATi hardware didn't actually do AA because the feature was not designed for ATi hardware.Source



Ok, the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia. 

THIS is what you guys bring to the table as facts?! Nvidia, ok, but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu


----------



## DaedalusHelios (Sep 30, 2009)

TheMailMan78 said:


> Ok, the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia.
> 
> THIS is what you guys bring to the table as facts?! Nvidia, ok, but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu



It's because the argument is not even acknowledged by the tech media. A guy forced it to work and it breaks the game. Try it yourself; you just change a device ID. Unless you think that's a conspiracy too.


----------



## TheMailMan78 (Sep 30, 2009)

DaedalusHelios said:


> Its because the argument is not even acknowledged by the tech media. A guy forced it to work and it makes it broken in-game. Try it yourself, you just change a device ID. Unless you think its a conspiracy too.



I think all of you work for Nvidia and made my dog sterile.


----------



## El Fiendo (Sep 30, 2009)

No, I don't work for NVIDIA but I did make your dog sterile.


----------



## TheMailMan78 (Sep 30, 2009)

El Fiendo said:


> No, I don't work for NVIDIA but I did make your dog sterile.



Give him a reach around next time. He likes that.



DaedalusHelios said:


> Its because the argument is not even acknowledged by the tech media. A guy forced it to work and it makes it broken in-game. Try it yourself, you just change a device ID. Unless you think its a conspiracy too.



I'm not downloading the demo again. Making baseless claims against shit I have no idea about is way easier.


----------



## newtekie1 (Sep 30, 2009)

TheMailMan78 said:


> Ok the first link is Nvidia and the rest are some nut job on a forum that has nothing to do with ATI or Nvidia.
> 
> THIS is what you guys bring to the table as facts?! Nivdia ok but a quack from a forum?! Come on guys. I thought you had better rebuttals than that. :shadedshu



Well, nVidia coming right out and saying what they did is kind of all the proof needed.  They are the ones that did it; they know best.  It puts all the other baseless accusations to rest.

And you kind of have to read the whole thing from the "nut job".  That "nut job" is the one that originally claimed AA was disabled for ATi, and originally claimed it worked in the demo, posting screenshots to prove it.

The other forum members later went on to show that AA wasn't even working.  And the "nut job" himself confirmed that it broke the game (even if he didn't want to admit it at first).


----------



## TheMailMan78 (Sep 30, 2009)

newtekie1 said:


> Well, nVidia coming right out and saying what they did is kind of all the proof needed.  They are the ones that did it; they know best.  It puts all the other baseless accusations to rest.
> 
> And you kind of have to read the whole thing from the "nut job".  That "nut job" is the one that originally claimed AA was disabled for ATi, and originally claimed it worked in the demo, posting screenshots to prove it.
> 
> The other forum members later went on to show that AA wasn't even working.  And the "nut job" himself confirmed that it broke the game (even if he didn't want to admit it at first).



The game, yes. Not the demo. ATI never said anything about the game due to SecuROM. Anyway, the accusation was from an ATI blog, not that forum.


----------



## Meizuman (Oct 1, 2009)

Slight offtopic:

S.T.A.L.K.E.R. was originally TWIMTBP, but IIRC the game runs better on Ati hardware... maybe it ran better with nV at launch, but Ati drivers improved from there. (confirmation needed)

But afaik they now display the Ati Radeon logo at startup... in CS and COP.


----------



## yogurt_21 (Oct 1, 2009)

newtekie1 said:


> Though we don't want them working so closely together we get another price fixing situation...



again, yeah, I think not. lol, the $600 standard for a high-end single-card single-core and the $300 standard for a decent midrange was quite annoying.

now we can typically pick up a $100-200 card that will give us all the performance we need. I like it better now. 

the argument was interesting to watch. as a former ati fanboy I have to admit I jumped to conclusions, but getting older, I didn't want to post without evidence. I'm glad I didn't and I'm glad the truth came to light.


----------



## newtekie1 (Oct 1, 2009)

TheMailMan78 said:


> The game, yes. Not the demo. ATI never said anything about the game due to SecuROM. Anyway, the accusation was from an ATI blog, not that forum.



True, but the claim and accusation were made by people in this thread that because it worked in the demo it would work in the game.  The original person who discovered all of this in the demo went on to test it in the game, and it didn't work.  ATi jumped the gun and started the bashing too quickly; they should have let the original person finish testing before they started crying foul.  They were basically crying because nVidia paid for the cookie and didn't share it.


----------



## wolf (Oct 1, 2009)

I see both sides here, I think.

it really sucks that ATi doesn't get good native support, but Nvidia made the whole thing happen with their own pocket money, and since I generally go with Nvidia hardware, it suits me.

I bet this plays riiiiight into ATi fans' belief in Nvidia's supposedly 'evil' practices; man, I get a belly laugh out of that every time.


----------



## Benetanegia (Oct 1, 2009)

wolf said:


> I bet this plays riiiiight into ATi fans believing in Nvidias apparently 'evil' practices, man I get a belly laugh out of that every time.



What really bugs me is that many of those same people think Intel did nothing wrong to AMD. Makes absolutely no sense. _j/k it does and we know why._


----------



## the_wolf88 (Oct 1, 2009)

Everybody knows that if ATI hadn't returned with the 4xxx series, nobody would have been able to buy a high-end card from Nvidia!!

280GTX >> $650 to $450
260GTX >> $450 to $300

Nvidia sucks!


----------



## Benetanegia (Oct 1, 2009)

the_wolf88 said:


> Everybody knows that if ATI did not return with 4xxx series no body were able to buy a high end card from Nvidia !!
> 
> 280GTX>> 650$ to 450$
> 260GTX>> 450$ to 300$
> ...



If you got the 9700 Pro at launch, that's what you paid for it in comparison to Nvidia prices at the time. If you ever got a X1900XTX new, that's what it cost you. If you ever bought a X850 XT PE, that's what you paid. When one of them has the top card, that's what happens and always has, except when AMD decided to lower prices dramatically out of desperation*. I'm not saying that's not good, I'm saying that if AMD was on top they would do the same. They've done it in the past.

* Fact is that revenue in the discrete graphics market is half of what it used to be since AMD's pricing strategy changed. They are not making the same money (neither is Nvidia), but that's what it takes if they want to gain market share. Market share keeps investors happy. In the link, look at market value.


----------



## wolf (Oct 2, 2009)

the_wolf88 said:


> Everybody knows that if ATI had not returned with the 4xxx series, nobody would have been able to buy a high-end card from Nvidia!!
> 
> 280GTX>> 650$ to 450$
> 260GTX>> 450$ to 300$
> ...



Everyone knows that if the GTX series had not been released, ATi would have charged more for their cards, especially the X2.

I mean, it's something both companies do.

Oh no, that's right, Nvidia is a big greedy evil corporation and ATi is nothing of the sort, my bad!


----------



## ArmoredCavalry (Oct 2, 2009)

Now the question is, did the developer leave out AA on purpose, so that Nvidia would pay them to have it developed....

hmmmmmm 

Seriously though, it's 2009, what is up with these games trying to skimp on AA all the time...


----------



## newtekie1 (Oct 2, 2009)

I like how you can tell who the fanboys are, because they thank the blatant trolling posts as long as they bash nVidia...

You kind of lose all credibility when you do that.



ArmoredCavalry said:


> Now the question is, did the developer leave out AA on purpose, so that Nvidia would pay them to have it developed....
> 
> hmmmmmm
> 
> Seriously though, its 2009, what is up with these games trying to skimp on AA all the time...



No, the game is based on the aging Unreal Engine 3.  This engine does not have native AA; it is likely that AA would never have been implemented in-game if nVidia had not funded the development.

As long as games continue to be based on older engines, they will lack features we are all used to, such as AA.  The older engines are usually better supported, more mature, and easier to develop for.


----------



## ArmoredCavalry (Oct 2, 2009)

newtekie1 said:


> I like how you can tell who the fanboys are, because they thank the blatant trolling posts...



If you are referring to me, you will notice I began my second sentence with "Seriously though," implying that my first sentence was not meant to be serious (it was actually making fun of the fanboyism in the thread).

Obviously the humor went over your head though.  Edit: nvm............. I used my eyes to read 



newtekie1 said:


> No, the game is based on the aging Unreal Engine 3.  This engine does not have native AA; it is likely that AA would never have been implemented in-game if nVidia had not funded the development.
> 
> As long as games continue to be based on older engines, they will lack features we are all used to, such as AA.  The older engines are usually better supported, more mature, and easier to develop for.



Ah, that would make more sense. However, I could have sworn there was a game that used the Unreal Engine that also had AA. Mirror's Edge, I believe... Hey, didn't that have PhysX too?  (disclaimer: definitely not implying a conspiracy)


----------



## newtekie1 (Oct 2, 2009)

ArmoredCavalry said:


> Ah, that would make more sense, however I could have sworn that there was a game that used the unreal engine, that also had AA. Mirror's Edge I believe... Hey didn't that have Physx too?  (disclaimer: definitely not inferring a conspiracy)



You are correct, Mirror's Edge used the Unreal Engine, and I do believe it had AA in-game.  However, the developers added the AA themselves.  The developers of Batman would have had to do the same; since they were different developers, they wouldn't share.

What *is* interesting is that Mirror's Edge was another TWIMTBP title.  I wonder if nVidia just gave the developers of Batman: Arkham Asylum the AA code, since they more than likely at least aided in the development of Mirror's Edge, and probably had access to the code required to add it to the Unreal Engine...

I'm really tempted to re-install Mirror's Edge on my HD4890 machine and see if it allows AA with ATi hardware.


----------



## ArmoredCavalry (Oct 2, 2009)

newtekie1 said:


> I'm really tempted to re-install mirror's edge on my HD4890 machine and see if it allows AA with ATi hardware.



It does, since day 1 (with my 4870).

http://armoredcavalry.deviantart.com/art/Edge-of-a-Mirror-112597392

I still have the last level to complete, I'm really bad about completing games... and tv shows... and sometimes homework... on occasion sentences...


----------



## TheMailMan78 (Oct 2, 2009)

newtekie1 said:


> I like how you can tell who the fanboys are because they thank the blatent trolling posts as long as they bash nVidia...
> 
> Kind of lose all credibility when you do that.
> 
> ...



Honestly, I hope you don't feel I'm a fanboy. For some reason I've been called that recently. :shadedshu


----------



## Sihastru (Oct 2, 2009)

wolf said:


> Everyone knows that if the GTX series had not been released, ATi would have charged more for their cards, especially the X2.
> 
> I mean, it's something both companies do.
> 
> Oh no, that's right, Nvidia is a big greedy evil corporation and ATi is nothing of the sort, my bad!



It was actually the GTX295 that brought the 4870X2's price down, and not the other way around. The 4870X2 had a ridiculous price. The GTX295's price came down so much that these days it costs as much as a 5870, while outperforming it and costing a lot more to manufacture.

If you really need proof, watch the 5870X2 and see what price it launches at.


----------



## DaedalusHelios (Oct 2, 2009)

I am glad to see this thread turn friendly. We have enough hate floating around as it is.


----------



## dvdlim (Oct 2, 2009)

rpsgc said:


> "The Way It's Meant To Be Paid"
> 
> 
> Just another day in the office.



well done!


----------



## newtekie1 (Oct 2, 2009)

TheMailMan78 said:


> Honestly I hope you don't feel I'm a fanboy. For some reason Ive been called that recently. :shadedshu



No, not at all.


----------



## yogurt_21 (Oct 2, 2009)

High-end cards have always been expensive; it's in the very nature of the word, lol.

The midrange is where the value truly lies, and that's where you run into problems if one side is dominating the other. The 9600XT at launch was 300$, a pure example of ATI taking advantage of their position in the market at the time. Take a look at the 4770; it was 129$ at launch. Obviously we benefit from competition between the two.

Thus, as long as both have good midrange competitors, I'll be happy. If Nvidia trumps ATI at the midrange or vice versa, I won't be.

edit: You know, this is funny: given that Nvidia paid for the development, the headline might as well read "Batman: Arkham Asylum Enables PhysX Only on NVIDIA Hardware on PCs"  lol


----------



## DaedalusHelios (Oct 2, 2009)

yogurt_21 said:


> High-end cards have always been expensive; it's in the very nature of the word, lol.
> 
> The midrange is where the value truly lies, and that's where you run into problems if one side is dominating the other. The 9600XT at launch was 300$, a pure example of ATI taking advantage of their position in the market at the time. Take a look at the 4770; it was 129$ at launch. Obviously we benefit from competition between the two.
> 
> ...



Good correlation indeed. Now ATi is off promoting another physics venture with another company. I wonder if ATi will offer it to Nvidia for free, the way Nvidia offered PhysX to run on ATi hardware for free and got turned down.


----------



## ArmoredCavalry (Oct 2, 2009)

DaedalusHelios said:


> Good correlation indeed. Now ATi is off promoting another physics venture with another company. I wonder if ATi will offer it for free to Nvidia the way Nvidia offered Physx to run on ATi hardware for free and got turned down.



I don't mean to go in a negative direction here... However, I feel the need to point out that AA is not proprietary tech, while PhysX is (although I'm sure this has already been stated).


----------



## El Fiendo (Oct 2, 2009)

ArmoredCavalry said:


> Don't mean to go in a negative direction here.... However, I feel the need to point out that AA is not proprietary tech, PhysX is (although this has probably already been stated I'm sure).



Right, but in this case it is proprietary, because NVIDIA paid extra to get it put into a game engine that doesn't natively support it. In this case it works on NVIDIA GPUs only because the game code has been optimized to allow software AA to work on NVIDIA GPUs.


----------



## Benetanegia (Oct 2, 2009)

ArmoredCavalry said:


> Don't mean to go in a negative direction here.... However, I feel the need to point out that AA is not proprietary tech, PhysX is (although this has probably already been stated I'm sure).











It *is* proprietary. It even has Nvidia's name all over it!
This has been said like 80,000 times already, but Unreal Engine 3 doesn't have AA; every UE3 game to date has not had AA. You had to enable it in the control panel. This AA implementation was put in BM:AA because Nvidia asked them to; they paid for it and helped with quality assurance to ensure the feature didn't break the game. AMD didn't even contact the developer to say hello, and when forced AA is used it breaks the game, plus it does not anti-alias the game properly.


----------



## TheMailMan78 (Oct 2, 2009)

Benetanegia said:


> http://img.techpowerup.org/091002/batset.jpg
> 
> 
> 
> ...



What the hell are you going on about, man? AA isn't Nvidia property.


----------



## El Fiendo (Oct 2, 2009)

El Fiendo said:


> Right but in this case it is proprietary because extra was paid by NVIDIA to get it put into a game engine that doesn't natively support it. In this case it works on NVIDIA GPUs only because the game code has been optimized and allow the workings of software AA on NVIDIA GPUs.



^^^^


----------



## Benetanegia (Oct 2, 2009)

TheMailMan78 said:


> What the hell are you going on about, man? AA isn't Nvidia property.



*THAT* AA is. That is not normal AA; it's not normal MSAA. It's an adaptive AA mode designed specifically for that game and tested on Nvidia hardware, again because Nvidia asked them to. On Ati cards you can still use the normal AA, the kind that has the performance hit it *ALWAYS* has had.


----------



## TheMailMan78 (Oct 2, 2009)

El Fiendo said:


> ^^^^





Benetanegia said:


> *THAT* AA is. That is not normal AA; it's not normal MSAA. It's an adaptive AA mode designed specifically for that game and tested on Nvidia hardware, again because Nvidia asked them to.  You can still use the normal AA, the kind that has the performance hit it *ALWAYS* has had on Ati cards.



So this is a super special AA that only Nvidia can do. I see


----------



## Benetanegia (Oct 2, 2009)

TheMailMan78 said:


> So this is a super special AA that only Nvidia can do. I see



NO. This is a special AA implementation that Nvidia paid for, for their own cards, and that was later tested on their cards. If AMD had done the same, you would have that feature. What is never going to happen is Nvidia paying so that a feature *that is not part of the engine or the game* is added and tested to run on AMD hardware. :shadedshu


----------



## Deleted member 24505 (Oct 2, 2009)

Simple: just don't buy it if you have an ATI card.


----------



## TheMailMan78 (Oct 2, 2009)

Benetanegia said:


> NO. This is a special AA implementation that Nvidia paid for, for their own cards, and that was later tested on their cards. If AMD had done the same, you would have that feature. What is never going to happen is Nvidia paying so that a feature is added and tested to run on AMD hardware. :shadedshu



So this is a magic AA that has no performance hit for Nvidia cards? It all makes sense now. You don't think that it's basically a shortcut to the GeForce drivers to force AA, do you? Naaaaa


----------



## Benetanegia (Oct 2, 2009)

TheMailMan78 said:


> So this is a magic AA that has no performance hit for Nvidia cards? It all makes sense now. You don't think that it's basically a shortcut to the GeForce drivers to force AA, do you? Naaaaa



Indeed, it's sort of a magic AA. It's adaptive AA: some parts are anti-aliased (the ones that need to be) and some are not, saving a lot of resources and improving performance. It has a hit, but not as high as FSAA's. I'll repeat: Unreal Engine 3 does not support MSAA.


----------



## El Fiendo (Oct 2, 2009)

No, it's an AA that is applied through software, so it doesn't have the natural ~20% hit of doing it in hardware. If NVIDIA hadn't had it put in, everyone (NVIDIA and ATI both) would be forced to use hardware-forced AA through the control panels of their respective driver sets, resulting in overall performance loss. Because NVIDIA had it implemented in the software, something that wouldn't happen out of the box with UE3 or UE3.5 and was extra, they cut out the performance loss by removing the need for it. As such, they have the right to implement it on their hardware alone.


----------



## TheMailMan78 (Oct 2, 2009)

Benetanegia said:


> Indeed it's sort of a magic AA. It's adaptative AA. Some parts are anti-aliased (the ones that need to be) and some are not, saving a lot of resources and improving performance. It has a hit but not as high as FSAA has. I'll repeat, Unreal Engine 3 does not support MSAA.


Ok, they didn't add AA to this game. It's not like they were baking cookies and forgot to add sugar. Adding AA to an engine like this would take a lot more than the TWIMTBP program would be willing to finance. All they did was take a "shortcut" instead of you doing it manually via the GeForce drivers. True, the Unreal 3.0 engine doesn't support AA, but ATI and Nvidia have supported AA in the Unreal engine for almost a year now. All Nvidia did was get a small head start in the game profiles.



El Fiendo said:


> No, its an AA that is applied through software so it doesn't have the natural 20% hit of doing it by hardware. If NVIDIA hadn't have had it put in, everyone (NVIDIA and ATI both) would be forced to do hardware forced AA through the control panels of their respective driver set, resulting in overall performance loss. Because NVIDIA had it implemented into the software, something that wouldn't happen out of the box and was extra, they cut out the performance loss by removing the need for it.


How much do you want to bet that within two driver releases ATI's AA performance will be the same as Nvidia's "magic" AA?


----------



## Benetanegia (Oct 2, 2009)

TheMailMan78 said:


> Ok, they didn't add AA to this game. It's not like they were baking cookies and forgot to add sugar. Adding AA to an engine like this would take a lot more than the TWIMTBP program would be willing to finance. All they did was take a "shortcut" instead of you doing it manually via the GeForce drivers. True, the Unreal 3.0 engine doesn't support AA, but ATI and Nvidia have supported AA in the Unreal engine for almost a year now. All Nvidia did was get a small head start in the game profiles.



Eehhhh??? Neither Nvidia nor Ati has supported AA in UE3. They might have made optimizations so that FSAA works faster when enabled in the control panel. FSAA always works because it operates on the final frame; MSAA has to be implemented in the game because it works at the fragment (pixel) level. The AA we are talking about goes a bit further by selecting which parts need AA and which don't.

The thing about the shortcut is not only BS, but impossible to make. No matter how FSAA is used, it always has a performance hit. UE3 can't do MSAA; it can do FSAA, though at the cost of a lot of performance.



> How much do you want to bet within two driver releases for ATI the AA performance will be the same as Nvidias "magic" AA?



Of course they would, but it's more probable that they work with the developer and get the feature working. Something they should have done from the beginning if they wanted in-game AA. But the truth is that if they truly wanted in-game AA, they would have done it before, in any of the various other UE3 games. They don't want to spend their money on such a feature; they want others to spend that money so they get it for free. That's what they are crying about. No more, no less.


----------



## El Fiendo (Oct 2, 2009)

TheMailMan78 said:


> How much do you want to bet within two driver releases for ATI the AA performance will be the same as Nvidias "magic" AA?



I've maintained pretty much from the start that this is probably the case. I agree, the NVIDIA AA wasn't/isn't a miracle; it was just made to run correctly instead of breaking the game, as software AA does to ATI cards in this game. I figured that ATI simply didn't have things ready on launch day, one developer for ATI decided to bitch and moan, and then the ATI Army championed their plight across the interwebs. It's all the same really; it happened before with Far Cry 2 and the "OMG NVIDIA is totally screwing ATI over with image quality" fuss, which then promptly turned out to be a Catalyst bug and was patched within, I believe, 24 hours.


----------



## newtekie1 (Oct 2, 2009)

TheMailMan78 said:


> How much do you want to bet within two driver releases for ATI the AA performance will be the same as Nvidias "magic" AA?



Yes, but by then most people will have beaten the game and put it on the shelf...

And I doubt ATi will get MSAA working with Batman: AA; they didn't care to do it before the game was launched, so why would they do it after?  They have bigger things to worry about.  So you will always be forced to use FSAA, which we have seen has a major performance hit compared to MSAA.

On top of that, even if they get MSAA going, it will still take more of a performance hit, because the engine is not optimized for it like it is for nVidia's hardware.


----------



## ArmoredCavalry (Oct 2, 2009)

Benetanegia said:


> http://img.techpowerup.org/091002/batset.jpg
> 
> 
> 
> ...



Yeah... uh, if you read a few posts above yours, you will notice that Mirror's Edge used Unreal Engine 3 and *DID* have AA, and it did work on ATI cards... Also, financing something doesn't make it proprietary...


----------



## El Fiendo (Oct 2, 2009)

Most likely because they took the time to code it in. If you read further up, you'll see UE3 or UE3.5 does not support it straight out of the box. It likely has provisions to support it, but you don't have access to it straight away without bugs and errors. They had to pay people to code it in and have it run without crashing the game. There's even proof that when you force software AA with ATI in this game, the game is unstable.

"Financing something doesn't make it proprietary."

Uh, yeah it does. If I spend money to buy something, I own it, or the rights to it. If the bank gives me home financing, guess what: they own my house and I buy it back from them. What do you think it means?


----------



## ArmoredCavalry (Oct 2, 2009)

El Fiendo said:


> Most likely because they took the time to code it in. If you read further up, you'll see UE3 or UE3.5 does not support it straight out of the box. It likely has provision to support it, but you don't have access to it straight away without bugs and errors. They had to pay people to code it in and have it run without crashing the game. There's even proof that when you have software run AA with ATI on this game, the game is unstable.
> 
> 'Financing something doesn't make it proprietary'.
> 
> Uh, yea it does. If I spend money to buy something, I own it, or the rights to it. If the bank gives me home financing, guess what. They own my house and I buy it back from them. What do you think it means?



I know that it didn't support it straight out of the box...

I was pointing out to the guy above that there has been AA on a UE3 game before this, and it *has* run on ATI GPUs:



Benetanegia said:


> every UE3 game to date has not had AA.



"Proprietary - one that possesses, owns, or holds exclusive right to something." I am using proprietary to mean "holds exclusive rights." Nvidia doesn't have exclusive rights to AA on Batman... They paid the devs to add a non-proprietary technology (AA) into the game for use by their GPUs.

Now, does Nvidia have the right to include in the agreement that ATI cards should not be allowed to use the AA? *Of course,* they funded it. But please don't try to tell me that AA is a proprietary technology from Nvidia...


----------



## Benetanegia (Oct 2, 2009)

ArmoredCavalry said:


> Yeah... uh if you read a few posts above yours, you will notice that Mirror's Edge used the unreal engine 3 and *DID* have AA, and did work on ATI Cards.... Also, financing something doesn't make it proprietary...



Hmm, I didn't think about Mirror's Edge. I thought that was a modified version of UE; for instance, the complete lighting system was changed. It doesn't change anything anyway. As El Fiendo said, they probably worked on that; that's how it should have been done now too.


----------



## El Fiendo (Oct 2, 2009)

No, I misunderstood you. I've been saying that the implementation of AA in this game is proprietary (and I see now we both agree). Also, yes, it has been in prior UE3.x games, but those were special implementations themselves. Sorry, I jumped the gun and appear to have misunderstood the intentions and meaning behind your posts.


----------



## Benetanegia (Oct 2, 2009)

ArmoredCavalry said:


> I know that it didn't support it straight out of the box.................
> 
> I was pointing out to the guy above that there has been AA on a UE3 game before this, and it and it *has* run on ATI gpu's:
> 
> ...



Anti-aliasing is a very wide range of techniques for obtaining the same result; it's not one single thing. For instance, FSAA and MSAA are very different things, and there are even more types of anti-aliasing. MSAA and FSAA are probably patented and belong to someone, dating back to the '70s. As a comparison, Intel doesn't have the rights over microprocessors, but it does have the rights over x86. Similarly, Nvidia doesn't have the rights over AA, but it does have the rights over the implementation of AA present in Batman.


----------



## ArmoredCavalry (Oct 2, 2009)

El Fiendo said:


> No, I misunderstood you. I've been saying that the implementation of AA in this game is proprietary (as I see now we both agree). Also, yes it has been in prior UE3x games, but those were special implementations themselves. Sorry, I jumped the guns and appear to have misunderstood the intentions and meaning behind your posts.



yeappp



Benetanegia said:


> Similarly Nvidia doesn't have the rights over AA, but it does have the rights over the implementation of AA present in Batman.



which is what my whole post was trying to say.......... if by "the implementation" you mean only the one they paid for, not all implementations of AA in general


----------



## Benetanegia (Oct 2, 2009)

ArmoredCavalry said:


> which is what my whole post was trying to say.......... if by "the implementation" you mean only the one they paid for, not all implementations of AA in general



I don't know what you wanted to say, but this is what you said:



ArmoredCavalry said:


> "proprietary - one that possesses, owns, or holds exclusive right to something" I am using proprietary with the meaning "holds exclusive rights". *Nvidia doesn't have exclusive rights to AA on Batman... They paid the devs to add a non-proprietary technology (AA) into the game for use by their gpu's.*
> 
> Now, does Nvidia have the right to include in the agreement that ATI cards should not be allowed to use the AA? *Of course,* they funded it. *But please don't try to tell me that AA is a proprietary technology from Nvidia*....



First sentence: Nvidia doesn't have exclusive rights over AA in Batman, but they do have exclusive rights over THEIR AA.

Second sentence: they didn't pay the developers to add non-proprietary technology; they paid them to add proprietary technology.

AA as a whole, no, but the AA present in BM is proprietary.

What you fail to understand is that no one prohibited the developers from adding AA for Ati cards; it's the lack of interest from AMD that made it happen that way. AA is only in BM because Nvidia asked for it. If AMD had done the same and asked, helped, and tested that or any other AA technique, there would be AA for Ati cards too. They didn't; end of story.

This is not AMD asking for the implementation of AA and getting a NO as the answer. This is AMD not collaborating.


----------



## ArmoredCavalry (Oct 2, 2009)

Benetanegia said:


> I don't know what you wanted to say, but this is what you said:
> 
> 
> 
> ...



Yeah, you probably know better what I meant than I did... (sarcasm) OK, I'm just going to stop replying now... Obviously you will just keep arguing that Nvidia is teh_best_eva even if no one is actually arguing with you... yeah... Have fun.


----------



## TheMailMan78 (Oct 2, 2009)

Benetanegia said:


> I don't know what you wanted to say, but this is what you said:
> 
> 
> 
> ...



AA is AA, it's not fucking proprietary! Do you even know what that word means?


----------



## PEPE3D (Oct 2, 2009)

I think everyone has to chill out. Nvidia and ATI have to start thinking about us. We are the consumers, plain and simple. I couldn't care less about AA, FSAA, CIA, FBI, jajajaja. All I want is for games to be playable on my PC regardless of what VGA I have. End of story.


----------



## Benetanegia (Oct 2, 2009)

TheMailMan78 said:


> AA is AA, it's not fucking proprietary! Do you even know what that word means?



My GOD! There are many ways of doing AA, just like there are many ways of making CPUs, just like there are dozens of ways of making cakes or omelettes. You can make your own processor, but God forbid you make an x86 CPU.

Proprietary AA modes exist. At the very least, CSAA (Coverage Sampling Anti-Aliasing) from Nvidia and CFAA (Custom Filter AA) from Ati should be known to you.

Anti-aliasing, in the end, is nothing more than interpolating the colors of multiple samples to form a single pixel.

FSAA, or 4x supersampling, renders the complete image 4 times over (or at double the resolution per axis, if you prefer to look at it that way) and then interpolates.

MSAA: calculates 4 different color points based on sample patterns, so that there is an offset between them, then interpolates them, and then calculates the rest (lighting, shadowing, etc.). That's why its performance is much higher.

Edge-detect AA: like MSAA, but it detects edges before doing AA and only applies it on the edges.

BM:AA: from what I saw described somewhere, it does edge detection, but some algorithms determine whether objects have to be anti-aliased or not. For instance, it's stupid to anti-alias an object in the distance if it's going to be blurred by depth of field. I guess it does that.

CFAA: I have no idea.

CSAA: no idea.

I think you get the idea. The important thing is that the math behind those AA modes is different. Although they share some algorithms, they are different, and most of them have patents behind them.
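To make the difference between brute-force and selective AA concrete, here is a toy sketch in Python. This is purely illustrative and is not any vendor's or game's actual algorithm; the sample counts, threshold, and filter are all made-up assumptions:

```python
import numpy as np

def supersample_aa(render, factor=2):
    """Full-scene supersampling: average each factor x factor block of a
    high-resolution render down to one output pixel (every pixel pays)."""
    h, w = render.shape
    return render.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def edge_detect_aa(image, threshold=0.5):
    """Edge-detect AA: find high-contrast pixels via a gradient test and
    blend only those with their neighbours, leaving flat regions untouched."""
    img = image.astype(float)
    gy, gx = np.gradient(img)                      # simple finite differences
    edges = np.hypot(gx, gy) > threshold           # mask of "jaggy" pixels
    # 3x3 box blur computed everywhere, but applied only at edge pixels
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
                  for dy in range(3) for dx in range(3)) / 9.0
    out = img.copy()
    out[edges] = blurred[edges]
    return out
```

With the thresholded version, flat areas skip the blend entirely, which is the intuition behind "only anti-alias the parts that need it," while supersampling pays the full cost on every pixel.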


----------



## PEPE3D (Oct 2, 2009)

Who Cares!!!!!!!!!!!!!!


----------



## Benetanegia (Oct 2, 2009)

PEPE3D said:


> Who Cares!!!!!!!!!!!!!!



*Tech*PowerUp!


----------



## troyrae360 (Oct 3, 2009)

PEPE3D said:


> Who Cares!!!!!!!!!!!!!!



Everyone who has posted in this thread.


----------



## newtekie1 (Oct 3, 2009)

TheMailMan78 said:


> AA is AA, it's not fucking proprietary! Do you even know what that word means?



No, AA is not AA.  The concept of AA is not proprietary; the different implementations are.

ATi's driver-level AA is proprietary to ATi; nVidia's driver-level AA is proprietary to nVidia.  Each developed its own method of doing AA.  It isn't as if ATi developed AA and then just gave it to nVidia; both had to figure out their methods on their own.

The same goes for in-game AA.  Some game engines support AA natively, and the method for that AA is proprietary.  The game engines that don't have AA natively require AA to be added; the game developers add it in their own way, and that is proprietary.


----------



## troyrae360 (Oct 3, 2009)

You might not be able to run AA on Batman with an ATI card, but can you do this with an NV card?  http://www.youtube.com/watch?v=ujf6P6iGcfc


----------



## ArmoredCavalry (Oct 3, 2009)

newtekie1 said:


> The concept of AA is not proprietary, however the different implementations are.



Yeah, but do a lot of games develop their own 'algorithms' for AA? Or is there an industry standard of sorts? I wouldn't think companies would pay to develop, over and over, something that is so widely used.

And if they used a commercial solution, wouldn't they have somewhere in the credits "developed using X brand AA," like physics/audio middleware basically does...?


----------



## erocker (Oct 3, 2009)

You can run AA on an ATi card without a problem using CCC as far as I know.


----------



## ArmoredCavalry (Oct 3, 2009)

erocker said:


> You can run AA on an ATi card without a problem using CCC as far as I know.



from article: "the only way AA can be used is by forcing it in Catalyst Control Center. This causes the driver to use AA on every 3D object in the scene, reducing performance, compared to if the game's in-game AA engine is used."

Yeah, I'd imagine most high-end cards will run it with CCC AA with no issues, since Unreal Engine 3 is getting on in years.

Of course I don't have the game (doesn't interest me enough) so I couldn't tell ya for sure.


----------



## Benetanegia (Oct 3, 2009)

ArmoredCavalry said:


> Yeah but do a lot of games develop their own 'algorithms' for AA? Or is there an industry standard of sorts? I wouldn't think that companies pay to develop something over and over that is so widely used.
> 
> And if they used a commercial solution, wouldn't they have somewhere in the credits "developed using X brand AA" like physics/audio is basically... ?



The tools required to do MSAA are inside DX, but they are just that: tools that developers can use in their engines. Usually developers implement it in their rendering pipeline, in whatever way best fits their engine, desired effect, or expected performance. But when you are creating your game, AA is not a checkbox inside DX; developers have to implement it.

The other option is supersampling, which doesn't require engine support. The same frame is rendered 4 times over and blended into one (more or less). The quality is better than MSAA's, but the performance hit is huge.

Epic didn't implement AA in UE3 for some reason (the PS3 can't do HDR+AA, AA is difficult to implement in deferred engines, or whatever the reason). The developer behind Batman was not going to implement it, but I suppose Nvidia convinced them; that's what TWIMTBP is for. The situation is unusual; most engines have AA implemented.
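For what it's worth, the per-pixel sample-pattern idea behind MSAA can be sketched like this. This is a toy rasterizer for illustration only; the sample positions and the `inside` half-plane test are made-up assumptions, not any API's real pattern:

```python
def msaa_coverage(shape, inside, samples=((0.375, 0.125), (0.875, 0.375),
                                          (0.125, 0.625), (0.625, 0.875))):
    """Compute per-pixel coverage from a fixed sub-pixel sample pattern.

    The geometry test (`inside`) runs once per sample, but in real MSAA the
    expensive shading runs only once per pixel -- that is where the speed-up
    over full supersampling comes from."""
    h, w = shape
    img = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Count how many of the sub-pixel samples fall inside the geometry
            hits = sum(inside(x + sx, y + sy) for sx, sy in samples)
            img[y][x] = hits / len(samples)
    return img

# A vertical edge at x = 2.5: the pixel straddling it gets partial coverage
row = msaa_coverage((1, 5), lambda px, py: px >= 2.5)[0]
# row == [0.0, 0.0, 0.5, 1.0, 1.0]
```

This also shows why the technique has to live inside the engine: the coverage test needs the geometry itself, which a driver forcing AA over the finished frame no longer has.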


----------



## pr0n Inspector (Oct 3, 2009)

Once again, games that use a technique called deferred shading need workarounds to have AA in DX9 mode, or at least that's what some dude said.

Also, I already posted a link to THE thread where its OP admitted that forcing AA in CCC does not work correctly.


----------



## dr emulator (madmax) (Oct 4, 2009)

troyrae360 said:


> you might not be able to run AA on batman with ATI card but can you do this with a NV card?  http://www.youtube.com/watch?v=ujf6P6iGcfc



Whoa, that freaked me out. The YouTube video stopped and then it said "erocker moderator"; for a sec I was like, wtf, I know he's a mod here, but at YouTube as well?


----------



## DaedalusHelios (Oct 5, 2009)

Now that I have played the game, I think it's a boring beat-'em-up like Devil May Cry. No reason to fuss, because it's not a great game IMO.


----------



## mR Yellow (Oct 5, 2009)

TBH, I've done the PhysX hack and it didn't add much to the gameplay. All I saw was smoke, spiderwebs, and paper effects. Nothing to go wow about.

To date, PhysX has just been nothing but a sales gimmick. Portal and HL2 were way better in terms of gameplay.


----------



## newtekie1 (Oct 5, 2009)

mR Yellow said:


> TBH, I've done the PhysX hack and it didn't add much to the gameplay. All I saw was smoke, spiderwebs, and paper effects. Nothing to go wow about.
> 
> To date, PhysX has just been nothing but a sales gimmick. Portal and HL2 were way better in terms of gameplay.



For the most part, you are correct. PhysX doesn't add much beyond a little eye candy to any game so far.

PhysX had, and has, a lot of potential. However, it hasn't come close to showing its true potential in games, simply because it is proprietary and not supported on all hardware. So developers have to create a normal game, then just add a few PhysX elements to it later. Nothing related to gameplay is PhysX-based, because that would ruin the game for people without PhysX.

Now, if a developer based the game and its gameplay elements on PhysX right from the beginning of development, we would see some pretty amazing stuff: far more realistic environments, fully destructible environments. Imagine CounterStrike, but instead of having to enter a building only through the door or a window, you could also just blow a hole in the wall and walk in, and not just at certain pre-defined spots, but anywhere in the wall you wanted.

Sadly, we will never see this, because it doesn't run natively on ATi hardware. It is clear that nVidia knew this was required for PhysX to really show its potential, and that is why they wanted to get it up and running on ATi hardware. I'm sure at the time ATi definitely didn't want this, since they were in bed with Intel and Havok.


----------



## mR Yellow (Oct 5, 2009)

newtekie1 said:


> For the most part, you are correct. PhysX doesn't add much beyond a little eye candy to any game so far.
> 
> PhysX had, and has, a lot of potential. However, it hasn't come close to showing its true potential in games, simply because it is proprietary and not supported on all hardware. So developers have to create a normal game, then just add a few PhysX elements to it later. Nothing related to gameplay is PhysX-based, because that would ruin the game for people without PhysX.
> 
> ...



Good point. Maybe nVidia should release a title to demonstrate this. Wasn't there a game called Cell Factor that was supposed to do this?


----------



## TheMailMan78 (Oct 5, 2009)

newtekie1 said:


> For the most part, you are correct. PhysX doesn't add much beyond a little eye candy to any game so far.
> 
> PhysX had, and has, a lot of potential. However, it hasn't come close to showing its true potential in games, simply because it is proprietary and not supported on all hardware. So developers have to create a normal game, then just add a few PhysX elements to it later. Nothing related to gameplay is PhysX-based, because that would ruin the game for people without PhysX.
> 
> ...



Havok already does this without the overhead PhysX brings. Look at the "Frostbite" engine.


----------



## DaedalusHelios (Oct 5, 2009)

TheMailMan78 said:


> Havok already does this without the overhead PhysX brings. Look at the "Frostbite" engine.



PhysX is much more sophisticated and can be used for a better gaming experience where good physics can shine. I don't think anybody seriously believes PhysX is not as good as Havok. The problem is that PhysX requires Nvidia hardware. That's not accessible the way Havok is, which can run on just about anything. They need to get it running on ATi hardware and realize that widespread adoption is better than keeping it to themselves. Once it became the physics standard, they could charge low-cost licensing like game engines do. Nvidia is holding a good hand but not playing it right, probably because of the arrogant CEO they have.


----------



## BelligerentBill (Oct 5, 2009)

mR Yellow said:


> TBH, I've done the PhysX hack and it didn't add much to the gameplay. All I saw was smoke, spiderwebs and paper effects. Nothing to go wow about.
> 
> To date PhysX has just been nothing but a sales gimmick. Portal and HL2 were way better in terms of gameplay.



Atmosphere is a pretty big deal in a game. Batman is a damn fine game, and the additional atmosphere really is a nice bonus IMO. Ever since I demoted my 8800 GTS 512 to a dedicated PhysX PPU, I simply see no reason to go without PhysX... in fact it's much like a drug: it's there and I must have it. No, the feature isn't critical to any game, but I would liken its entertainment value to watching a movie on Blu-ray as opposed to a standard DVD.


----------



## Wile E (Oct 5, 2009)

TheMailMan78 said:


> Havok already does this without the overhead PhysX brings. Look at the "Frostbite" engine.



Havok's current capabilities are a fraction of what PhysX is capable of. Come back and discuss this when Havok actually releases their GPU-accelerated physics implementation. Until then, PhysX has the most potential; it's just that it's currently untapped by developers.


----------



## TheMailMan78 (Oct 5, 2009)

Wile E said:


> Havok's current capabilities are a fraction of what PhysX is capable of. Come back and discuss this when Havok actually releases their GPU-accelerated physics implementation. Until then, PhysX has the most potential; it's just that it's currently untapped by developers.



I have yet to see PhysX do ANYTHING Havok can't. Again I say: research the Frostbite engine.


----------



## Wile E (Oct 5, 2009)

TheMailMan78 said:


> I have yet to see PhysX do ANYTHING Havok can't. Again I say: research the Frostbite engine.



I did. And I'm telling you, just because you haven't seen it doesn't mean it's not capable. PhysX is capable of much, MUCH more than all other current physics implementations. OpenCL and GPU-accelerated Havok may change that, but as it stands, PhysX has superior capabilities. No developers have chosen to tap into its full capabilities yet, as they don't want to alienate non-nVidia users. That doesn't make it any less capable.


----------



## TheMailMan78 (Oct 5, 2009)

Wile E said:


> I did. And I'm telling you, just because you haven't seen it doesn't mean it's not capable. PhysX is capable of much, MUCH more than all other current physics implementations. OpenCL and GPU-accelerated Havok may change that, but as it stands, PhysX has superior capabilities. No developers have chosen to tap into its full capabilities yet, as they don't want to alienate non-nVidia users. That doesn't make it any less capable.



Proof, man, proof. Show me something PhysX can do that Havok can't.


----------



## Wile E (Oct 5, 2009)

TheMailMan78 said:


> Proof, man, proof. Show me something PhysX can do that Havok can't.



Why don't you use Google, buddy? Where is your proof that Havok is capable of everything that PhysX is capable of?


----------



## TheMailMan78 (Oct 5, 2009)

Wile E said:


> Why don't you use Google, buddy? Where is your proof that Havok is capable of everything that PhysX is capable of?



You're the one claiming Havok isn't on par with PhysX. All I said is that they are equal, and you said PhysX was better. I gave you proof with Frostbite and you offer none.

Again, where is the beef, man?


----------



## Wile E (Oct 5, 2009)

TheMailMan78 said:


> You're the one claiming Havok isn't on par with PhysX. All I said is that they are equal, and you said PhysX was better. I gave you proof with Frostbite and you offer none.
> 
> Again, where is the beef, man?



No, you're the one making accusations that Physx isn't more capable. This thread isn't about Havok. The burden of proof lies on you.

Besides, you have to be a developer to understand the raw data that's out there. I don't have the ability to translate. All the info you need is in the Physx and Havok SDK's. Download them, and have a go at it.

Not to mention, we haven't even touched on how much faster GPUs are at crunching physics numbers than CPUs. It's just common sense that PhysX is capable of more. Even if it can only do the same types of physics, it can still do more of them.


----------



## TheMailMan78 (Oct 5, 2009)

Wile E said:


> No, you're the one making accusations that Physx isn't more capable. This thread isn't about Havok. The burden of proof lies on you.
> 
> Besides, you have to be a developer to understand the raw data that's out there. I don't have the ability to translate. All the info you need is in the Physx and Havok SDK's. Download them, and have a go at it.
> 
> Not to mention, we haven't even touched on how much faster GPUs are at crunching physics numbers than CPUs. It's just common sense that PhysX is capable of more. Even if it can only do the same types of physics, it can still do more of them.



You're assuming it's more capable because it's dedicated. In theory you're right. However, NOTHING in the industry shows that it is; as a matter of fact, everything points to the opposite. If it were that much better, how come Intel went with Havok? Why do most engines use Havok? Just because an SDK is more crowded doesn't make it better.

You say Intel went with Havok because Nvidia owns PhysX, but I say it's because PhysX is inferior. I also believe it will soon be dead. Say what you will, but my proof is in practice; yours is in theory.


----------



## Wile E (Oct 5, 2009)

TheMailMan78 said:


> You're assuming it's more capable because it's dedicated. In theory you're right. However, NOTHING in the industry shows that it is; as a matter of fact, everything points to the opposite. If it were that much better, how come Intel went with Havok? Why do most engines use Havok? Just because an SDK is more crowded doesn't make it better.
> 
> You say Intel went with Havok because Nvidia owns PhysX, but I say it's because PhysX is inferior. I also believe it will soon be dead. Say what you will, but my proof is in practice; yours is in theory.



I didn't say anything about Intel and Havok. But anyway, Intel went with Havok because PhysX was already bought out, and they needed something to push with Larrabee. It has nothing to do with technical capabilities.

And more engines use PhysX than you think. PhysX also has a CPU-based API, just like Havok.

Again, the adoption rate is low because devs don't like to alienate customers. That is nV's fault for sure, for not building GPU PhysX on an open standard, but adoption rates do not in any way prove capabilities. Not to mention, how much longer has Havok been around? That's a pretty piss-poor argument, tbh.

And PhysX is not necessarily dead either. With the release of OpenCL, all nVidia has to do is port it from CUDA to OpenCL, and it will be alive and well. Whether they do that or not is a different story. They seem to have pride issues about opening up their APIs for maximum exposure.

At any rate, nothing you have mentioned points to PhysX having inferior capabilities. You still haven't proven anything either.


----------



## TheMailMan78 (Oct 5, 2009)

Wile E said:


> I didn't say anything about Intel and Havok. But anyway, Intel went with Havok because PhysX was already bought out, and they needed something to push with Larrabee. It has nothing to do with technical capabilities.
> 
> And more engines use PhysX than you think. PhysX also has a CPU-based API, just like Havok.
> 
> ...



You're correct. You didn't say anything about Intel. My mistake. I'm so used to that argument I got ya confused.

Anyway, I don't feel PhysX is inferior in its capabilities. I feel it's inferior because of the way it's executed (Nvidia-only hardware). What I do believe is that it's no better than Havok: even when it's GPU-accelerated, I have yet to see it do something Havok cannot do and hasn't been proven to do. Does it have more potential in theory? Hell yeah, but I haven't seen a damn thing yet that justifies a dedicated GPU, other than some slick marketing by Nvidia.

As for adoption rates, just look at Havok vs PhysX SINCE PhysX was first released. I think you'll be surprised.


----------



## Wile E (Oct 5, 2009)

TheMailMan78 said:


> You're correct. You didn't say anything about Intel. My mistake. I'm so used to that argument I got ya confused.
> 
> Anyway, I don't feel PhysX is inferior in its capabilities. I feel it's inferior because of the way it's executed (Nvidia-only hardware). What I do believe is that it's no better than Havok: even when it's GPU-accelerated, I have yet to see it do something Havok cannot do and hasn't been proven to do. Does it have more potential in theory? Hell yeah, but I haven't seen a damn thing yet that justifies a dedicated GPU, other than some slick marketing by Nvidia.
> 
> As for adoption rates, just look at Havok vs PhysX SINCE PhysX was first released. I think you'll be surprised.



I'm not surprised at all. I already admitted nV is holding GPU PhysX back, and by extension CPU PhysX. But directly comparing it to Havok is still pointless, because Havok has been around so much longer that it has had more time to penetrate the market and build up its brand recognition.

None of that changes the fact that it's capable of more than any CPU-based physics.


----------



## mR Yellow (Oct 6, 2009)

BelligerentBill said:


> Atmosphere is a pretty big deal in a game. Batman is a damn fine game, and the additional atmosphere really is a nice bonus IMO. Ever since I demoted my 8800 GTS 512 to a dedicated PhysX PPU, I simply see no reason to go without PhysX... in fact it's much like a drug: it's there and I must have it. No, the feature isn't critical to any game, but I would liken its entertainment value to watching a movie on Blu-ray as opposed to a standard DVD.



Valid point, but the difference isn't as huge as between SD and HD.


----------



## Chad Boga (Oct 11, 2009)

As Jumping Jack pointed out to me on another forum, how ironic that despite making their card look like the Batmobile, ATI gets no AA loving from Batman: Arkham Asylum.


----------



## Benetanegia (Oct 12, 2009)

TheMailMan78 said:


> You're correct. You didn't say anything about Intel. My mistake. I'm so used to that argument I got ya confused.
> 
> Anyway, I don't feel PhysX is inferior in its capabilities. I feel it's inferior because of the way it's executed (Nvidia-only hardware). What I do believe is that it's no better than Havok: even when it's GPU-accelerated, *I have yet to see it do something Havok cannot do and hasn't been proven to do*. Does it have more potential in theory? Hell yeah, but I haven't seen a damn thing yet that justifies a dedicated GPU, other than some slick marketing by Nvidia.
> 
> As for adoption rates, just look at Havok vs PhysX SINCE PhysX was first released. I think you'll be surprised.



That is not an argument at all. I have yet to see anything in games that DX10, or even DX10.1 and DX11, does that DX9 doesn't. Just like you, I'm talking about games, because if we were talking about the tech, there are plenty of demos that demonstrate the superiority of PhysX and DX10/11.

Are you really going to tell me that this:

PhysX demo on UT3 - http://www.youtube.com/watch?v=nF7Iq9pzKRk&feature=PlayList&p=55C1A52A917B2DDF&index=7

and this:

RedFaction: Guerrilla - http://www.youtube.com/watch?v=U93lGcMC4mc

are the same thing? And remember, we are talking about technology.

It's been said already: the most prominent advantage of PhysX is in the power of the GPU and how it works. There have been plenty of games with destructible environments, but they are completely unrealistic. PhysX has the potential to make them realistic. That UT3 demo is still far from maxing out a single SP cluster in the current generation of cards, which have around ten of them. As you can see here and here, PhysX can simulate as many as 0.5 million particles and tens of thousands of rigid objects. More than enough to make an entire building out of actual bricks and columns.
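To give a feel for the per-frame work behind those particle counts, here is a toy Euler integration step in Python. The names and numbers are illustrative, not from any physics SDK; the point is that a GPU physics engine runs the equivalent of this loop body for every particle in parallel, which is where the scale advantage comes from.

```python
# Toy sketch of the per-frame work a particle simulator does: one Euler
# integration step under gravity for N independent particles, with a
# crude floor collision. Illustrative only, not engine or SDK code.

GRAVITY = -9.81  # m/s^2, acting on the y axis
DT = 1.0 / 60.0  # one 60 fps frame

def step(particles):
    """Advance each (y, vy) particle one frame; clamp at the floor y=0."""
    out = []
    for y, vy in particles:
        vy += GRAVITY * DT     # accelerate
        y += vy * DT           # move
        if y < 0.0:            # crude floor collision: stop dead
            y, vy = 0.0, 0.0
        out.append((y, vy))
    return out

# Drop one particle from 1 m and simulate one second of game time.
p = [(1.0, 0.0)]
for _ in range(60):
    p = step(p)
```

Each particle's update is independent of the others, which is exactly the kind of work a GPU's many shader processors can batch, while a CPU has to grind through the list with a handful of cores.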

Now wait, are Havok, Bullet and many others going to be able to do the same when running on the GPU? Most probably, but they are not here, and they will not be here until late 2010 or even 2011, while PhysX has been offering that potential since 2005. So IMO ANY ENTHUSIAST should be thankful that PhysX exists and should push for it instead of bashing it, because without PhysX we would never get advanced physics. Back in the 90's the GPU offered *nothing* on top of what a CPU could do for rendering games. Only one company (3DFx*) really pushed for accelerated graphics, and they did so with proprietary tech *in a single game: Quake*. Accelerated Quake didn't offer too much compared to the CPU version, but it showed the potential. History tells us they were damn right about their vision. And yes, their proprietary tech had to die and it died, but you can't kill the tech before it takes off. Proprietary tech MUST ALWAYS exist when we're talking about revolutionary new tech, because an open tech with no interested company behind it will never take off; companies are reluctant to push for something that nets them no benefit, and game developers are even worse, because they are very time- and cost-constrained.

Finally, PhysX adoption right now is much greater than Havok's ever was. You should take a look at how many games use PhysX nowadays and how many are under development; you'll be really surprised. Remember: PhysX runs on the CPU too.

*I'm talking about pushing for the tech, not just having hardware. There were other companies with accelerated cards, but none of them really helped move the tech forward.


----------



## RevengE (Nov 10, 2009)

I just got done playing the demo. It looks good without AA, seeing as I'm an ATI user. Nvidia can eat my ***


----------

