# Criticism of Nvidia's TWIMTBP Program - HardOCP's Just Cause 2 Review



## Mr McC (May 5, 2010)

I am not a fan of Nvidia’s marketing practices, as I firmly believe they damage the PC gaming industry, creating division through the use of proprietary technology where open standards are available that can produce exactly the same effects. Many people criticise ATI for the scant attention it pays to developer relations, citing the tangible advances that Nvidia offers gamers through its TWIMTBP releases and the use of CUDA and PhysX. However, I do not want to see ATI respond in kind, as Nvidia seems intent on creating a situation in which the consumer will eventually be forced to ask whether a given game is an “ATI title” or an “Nvidia title”, with performance essentially crippled on the competitor’s cards.

I did not buy, nor will I buy, Batman: Arkham Asylum, as Nvidia paid the developer to turn off in-game AA on ATI cards. I find that reprehensible: it is one thing to optimise a title for your hardware; it is another thing to pay the developer to ensure that the performance of a given title is reduced when you use the competitor’s hardware. Again, whilst many people ask why ATI didn’t pay the developer to ensure that this function was enabled for its hardware, I firmly believe that development should be left to the developers and that certain aspects of a game, such as AA, should be available irrespective of the brand of graphics card that the consumer decides to purchase. By all accounts Arkham Asylum is an excellent game; however, my principles will not allow me to support such practices with my money – to each, his own.

I just finished reading the review of Just Cause 2 over on HardOCP. It was very refreshing to find a reviewer who, despite the obvious pressure placed on tech sites, was willing to openly criticise Nvidia’s TWIMTBP program and marketing practices:

http://www.hardocp.com/article/2010/05/04/just_cause_2_gameplay_performance_image_quality

_The Way It’s Meant to be Played?
We have no doubt that the Bokeh filter and the GPU Water Simulation options could have been executed successfully on AMD’s Radeon HD 5000 series of GPUs. That the developers chose NVIDIA’s CUDA technology over Microsoft DirectCompute or even OpenCL is probably due to the fact that NVIDIA’s developer relations team worked with Avalanche Studios developers, and of course they like to promote their own products. (We would surely love to see the contract between the two, but that will never happen.) It is certainly their right to do so, just as it is Avalanche’s right to choose whatever API they want to use. We would certainly not presume to tell any independent game developer how to design their own game, but we would suggest that a more open alternative (such as OpenCL or DirectCompute) would have been preferred by us for those gamers without CUDA compatible hardware.
This is an old argument, and is basically analogous to the adoption of PhysX as opposed to a more broadly compatible physics library. NVIDIA wants to increase its side of the GPU business by giving its customers a "tangible" advantage in as many games as possible, while gamers without NVIDIA hardware would prefer that game developers had not forgotten about them.
As it stands for Just Cause 2, gamers without NVIDIA hardware are missing a couple of really nice graphics features, but those features are not critical to the enjoyment of the game. Just Cause 2 still looks just fine and is just as fun without them. But if you want the very best eye candy experience possible, NVIDIA's video cards, especially the GeForce GTX 480 and GTX 470, will give it to you.
When NVIDIA tells us that it will "Do no harm!" when it comes to gaming, that is really a bald-faced lie, and we knew it when it was told to us. It will do no harm to PC gaming when it fits its agenda. NVIDIA is going to continue to glom onto its proprietary technologies so that it gains a marketing edge, which it very much does through its TWIMTBP program. And we have to assume that marketing edge is worth all the bad press it generates. To say NVIDIA does no harm to PC gaming is delusional at best. You AMD users just got shafted on these cool effects that could have been easily developed for all PC gamers instead of just those that purchase from one company._

This is another title that I refuse to buy. I doubt that this will cause much concern to Avalanche Studios, but if sufficient numbers avoid a developer’s titles because it allows Nvidia access to areas of development that should employ open standards, the developers may begin to take notice. Hopefully, TWIMTBP will become a thing of the past as it creates division and artificially introduced differences that are neither necessary nor desirable from the point of view of the consumer.


----------



## Binge (May 5, 2010)

ShiBDiB said:


> reported for flame/troll-tastic remarks



He's stated his opinion and cited an article that supports a similar view.  How is that trolling?  Let the man speak his mind, but if you disagree simply explain yourself.  The issue here is that NV puts money and man-power into finding ways for games to run _better_ on their GPUs.  If it's bad marketing let it be known, but if it's just your average trick of the trade then this thread will die in a matter of hours.


----------



## ShiBDiB (May 5, 2010)

it's an nvidia bashing thread... how is that not trolling and why is it in the games section


----------



## sneekypeet (May 5, 2010)

I guess when the OS was loaded you were OK with Bill Gates and his "the way a PC is to be run"?

Same practices, and let's not forget Intel's latest fiasco with OEM builders and what they did to AMD.

You are simply forgetting the golden rule: the one with the gold makes the rules!

I understand the frustration, but Nvidia is just following in the footsteps of much larger entities that set the precedent before it got here.

@shib, the OP is directed at both TWIMTBP and Just Cause 2. Also, he IS allowed to bash and take sides on an article; it's usually what follows those posts where the trouble arises.


----------



## newtekie1 (May 5, 2010)

Mr McC said:


> I did not buy, nor will I buy Batman Arkham Asylum as Nvidia paid the developer to turn off in-game AA on ATI cards. I find that reprehensible: it is one thing to optimise a title for your hardware; it is another thing to pay the developer to ensure that the performance of a given title is reduced when you use the competitor’s hardware.



I believe you have a very incorrect view of this subject.  Your claim is entirely untrue, and has been discussed many times on this forum and others.  Batman: Arkham Asylum is based on the Unreal Engine.  By default, the Unreal Engine does not have AA capabilities.  This means the developer has to add AA if it wants it built into the game.  This is a feature that would likely not have existed in the game at all if nVidia had not had it added.  In this case, nVidia paid to have AA added into the game for their hardware; they also paid for the testing to ensure it worked on their hardware.  ATi, on the other hand, did not, so I fail to see why you think they should benefit from it.  NVidia most certainly did _*NOT*_ pay to have AA disabled on ATi hardware; they paid to have it enabled on their hardware.  And in fact, when it is force-enabled on ATi hardware, the game doesn't work.  Why?  Because it wasn't tested or optimized for ATi hardware.  And why wasn't it tested or optimized for ATi hardware?  Because nVidia was paying to have it developed, tested, and optimized on their hardware, not their competitor's.

Your assumption that AA is something that exists in all graphics engines and game titles by default is wrong.




Mr McC said:


> This is another title that I refuse to buy. I doubt that this will cause much concern to Avalanche Studios, but if sufficient numbers avoid a developer’s titles because it allows Nvidia access to areas of development that should employ open standards, the developers may begin to take notice. Hopefully, TWIMTBP will become a thing of the past as it creates division and artificially introduced differences that are neither necessary nor desirable from the point of view of the consumer.



Again, if nVidia pays for the development of something, it has every right to not allow it to be used on the competitor's hardware, and that is likely the case here.  I see nothing wrong with that, especially if it is something that likely wouldn't have been included in the game at all if not for nVidia paying to have it added (such is the case with AA in Batman).  I'm not sure if this is the case with the effects in Just Cause 2 that they are talking about; I don't really see how either is necessary in the game, so I am actually very doubtful that either would be in the game at all if nVidia had not paid for the development.  I have no problem with Just Cause 2 on my HD4890.  It looks great and runs perfectly.

And oddly enough, you make a big stink about how Just Cause 2 should have used more open standards, and yet they used Havok for the physics engine instead of PhysX.  I would think if nVidia was really in that deep and paying them so well, they would have used nVidia's PhysX instead of Havok.


----------



## HeroPrinny (May 5, 2010)

newtekie1 said:


> Again, if nVidia pays for the development of something, it has every right to not allow it to be used on the competitor's hardware, and that is likely the case here.  I see nothing wrong with that.  Especially if it is something that likely wouldn't have been included in the game at all if not for nVidia paying to have it added (such is the case with AA in Batman).  I'm not sure if this is the case with the effects in Just Cause 2 that they are talking about, I don't really see how either is necessary in the game, so I am actually very doubtful that either would be in the game at all if nVidia had not paid for the development.   I have no problem with Just Cause 2 on my HD4890.  It looks great and runs perfectly.


The only issue with doing things like that is that it doesn't bolster fair play. Why should those with an ATI card be hindered while those who have an nVidia card aren't? PC gaming isn't like console gaming. The Batman thing I have no issue with, due to Unreal having pretty much no AA in the first place and nVidia adding it; now, if ATi ever decides to add AA, it would be really nice.


----------



## sneekypeet (May 5, 2010)

Ever think that when you buy that cheaper ATI card (price-wise), development doesn't come into play? It isn't like Nvidia users aren't getting stabbed in the backside on pricing to offset said R&D.


----------



## HeroPrinny (May 5, 2010)

sneekypeet said:


> Ever think that when you buy that cheaper ATI card (price-wise), development doesn't come into play? It isn't like Nvidia users aren't getting stabbed in the backside on pricing to offset said R&D.


That's all pricing-wise, though, and you have a choice to get an ATI or an Nvidia video card; you don't have a choice to have Just Cause 2 look the same as it does on Nvidia if you have an ATi card. Then there's the whole Metro 2033 issue, and it seems like it always has PhysX on, as adding an 8600GT and doing the PhysX mod boosted my FPS a lot, to the point that it was playable fully cranked on my 5870.


----------



## Wile E (May 5, 2010)

I have to agree with Newtekie. Optimizing a title for nVidia is not the same as crippling ATI. The "crippling ATI" argument has no grounds to stand on. Don't you think ATI would have sued nV for anti-trust by now if that was actually what was happening?

All in all, it's up to the devs to use CUDA over OpenCL and DirectCompute. And I'll give you a huge reason why they choose to do so: The developer tools are much easier to use. Not to mention the hands on support nV gives.

Nothing is stopping ATI from offering the same level of dev help.


----------



## FordGT90Concept (May 5, 2010)

ShiBDiB said:


> it's an nvidia bashing thread... how is that not trolling and why is it in the games section


NVIDIA doesn't deserve it?   They've been pretty shady since, oh, about 2004 or 2005: making deals with Microsoft (which prevented X### cards from having Pixel Shader 3.0 support and broke DX10 into two parts, 10.0 and 10.1, because NVIDIA couldn't meet the full requirements), monopolistic behavior after acquiring Ageia PhysX, and numerous repackagings of damn similar products to milk the dying cash cow (which could be filed under false advertising).

This is just another example of NVIDIA pushing their weight around.




Wile E said:


> Don't you think ATI would have sued nV for anti-trust by now if that was actually what was happening?


There's only so much AMD can do though because they aren't exactly in the best shape to finance a massive lawsuit.

Edit: I'm sure there was a recent lawsuit but I can't find any information on it.  The FTC suit against ATI/NVIDIA and FTC suit against Intel are taking all the Google hits.


----------



## newtekie1 (May 5, 2010)

HeroPrinny said:


> The only issue with doing things like that is that it doesn't bolster fair play. Why should those with an ATI card be hindered while those who have an nVidia card aren't? PC gaming isn't like console gaming. The Batman thing I have no issue with, due to Unreal having pretty much no AA in the first place and nVidia adding it; now, if ATi ever decides to add AA, it would be really nice.



How are those with ATi cards hindered?  They aren't.

If the features were something that would have never existed if nVidia didn't pay to have them implemented, then those with ATi cards are not missing out on anything they would have had if nVidia never stepped in.

I'll give an easy example:

Let's say the developers had planned to include features A, B, C, and D.  Then nVidia comes along and says "we'll pay for you to develop and include feature E, but only for our hardware".  ATi cards are not being hindered because they can't use feature E; they still get the original A, B, C, and D they would have gotten.

Now, if those features were going to be part of the game for everyone, and nVidia paid to have them disabled on ATi hardware, then yes I agree with you.  However, I have yet to see a single piece of evidence pointing to that being the case.  Everything shows that nVidia paid to have the features added, and TWIMTBP is a program designed to get developers to optimize their games for nVidia hardware.  This means making them run better on nVidia hardware than they would have with no optimization, which is how they run on ATi hardware.



FordGT90Concept said:


> prevented X### cards from having Pixel Shader 3.0 support



Wait.  How exactly is nVidia to blame for ATi not including PS 3.0 support in their hardware?


----------



## Benetanegia (May 5, 2010)

HeroPrinny said:


> The only issue with doing things like that is that it doesn't bolster fair play. Why should those with an ATI card be hindered while those who have an nVidia card aren't? PC gaming isn't like console gaming.



Because of what most people forget: if Nvidia didn't include those features, they would never have been implemented. *Nobody* would enjoy those extra features if Nvidia didn't push developers. What's best?

a- No one gets any features.
b- Some get the features, on the cards they have been tested and developed for.

Because there really isn't any other option. 

It's an extension of what most people know is happening in the gaming world, but choose to forget in these Nvidia-bashing threads: *developers don't really want to develop for PC*, and adding PC-centric features is the last thing they would do without external help. Without any help, all we would get is a straight console port. Yeah, I know what you are thinking: you think we already do, but you are wrong. We call them ports, but they could be much, much worse, and it's only thanks to Nvidia and AMD that ports are something close to a decent PC game.


----------



## Bundy (May 5, 2010)

FordGT90Concept said:


> This is just another example of NVIDIA pushing their weight around.



Or pragmatism, depends on how you want to consider the situation.


----------



## cadaveca (May 5, 2010)

newtekie1 said:


> Your assumption that AA is something that exists in all graphics engines and game titles by default is wrong.



Batman doesn't matter. It was a crap game of endless repeating tasks in a varied environment.

They matched the Batman flavour and ambience, but in actual gameplay, I could not have been more bored.

That said, AA in Unreal 3 engines works in many other titles. The simple fact that it doesn't work at all on ATi cards says something far greater than nVidia paying for it NOT to work on ATI cards...

Whether you want to argue that they paid for the development, so it's their right to restrict its use, or that the GPU architectures are different, so they require different code, doesn't matter. *The damning fact is that had nVidia NOT helped develop the AA code, the developers would have been forced to either write code for both, or none at all.*

I mean, I more than welcome nVidia to get developers to work with PhysX. But AA is a different story... they simply shouldn't have messed in that part of the game. The act of helping develop the code created a situation where the developer would naturally expect ATi to provide the same help, rather than relying on their own in-house resources. The complaint has nothing to do with the code itself; it is the actual effect on the company's mentality when dealing with ATi that is the issue.

nVidia just shouldn't have developed code for something as basic as AA in the Unreal engine. It's really not that hard to do, or this would be the only Unreal game with AA... and it certainly is not.

Really, nVidia's owner claims they are not a hardware company, but a software company that sells hardware. There are mixed interests here... nVidia is using their clout in one market to affect another, and that, my friend, is illegal. M$ paid lots of money for the same basic business practices.

DX10 was delayed due to nV not supporting the API properly. DX10.1 was barely a whisper, again, thanks to a lack of nV support.

Heck, tessellation, nearly a decade old, has finally made it into DirectX... only because of nV's support.

nVidia truly is ruining the industry, and in more ways than one. They are helping the software development side stagnate... developers getting help for things like AA really should be more than capable of doing such things themselves, but due to a lack of basic skills, they had to go elsewhere.

It's really sad that a developer can't even implement AA because they are incapable of hiring the proper staff. It's a sad excuse for nVidia-only code... the entire house's management should be fired, as far as I am concerned, for delivery of a pathetic product.

I mean, really... 99.9% of the AI does exactly the same thing. Sure, it's pretty... but my god, I cannot believe the hype this game got. But then again, I know how all that works, and why it was so critically acclaimed. And I am NOT fooled.



newtekie1 said:


> Wait.  How exactly is nVidia to blame for ATi not including PS 3.0 support in their hardware?



LoL. Ati doesn't support PS3.0? You sure about that?


----------



## DannibusX (May 5, 2010)

nVidia would probably make a boatload of money if they released Ageia-style PPU cards with driver support for ATI GPUs.

I'd like to have official support to use an nVidia card as a PPU, but that'll never happen.


----------



## HeroPrinny (May 5, 2010)

I'll point out Metro again; it's a perfect example. My testing really took the cake too: an 8600GT running just PhysX boosted my FPS from unplayable to playable. That shouldn't happen. Why does Metro force PhysX?


----------



## FordGT90Concept (May 5, 2010)

newtekie1 said:


> Wait.  How exactly is nVidia to blame for ATi not including PS 3.0 support in their hardware?


I wish I could remember where I read that...


----------



## the54thvoid (May 5, 2010)

I can see this thread getting locked down!!

Any marketing strategy that aims to proactively diminish another company's standing is to be expected.  There is (unfortunately) nothing extraordinary about that.  
However, when a percentage of service users who fall into the opposing camp have their experience diminished by said practice, that is very wrong.  To create deals with software developers that aim to 'handicap' the opposition and coerce the consumer into purchasing a certain brand is, under different conditions, illegal practice.  At best, it is unfair.
In a capitalist economy, it is fair to undercut, aggressively market, and generally hype your product and detract from the opposition.  This is all fair.
On the other hand, to use financial incentives to persuade developers to create an uneven playing field is underhanded (regardless of who does it).  

To defend what NV does and justify it by the preceding arguments is to wholly miss the OP's point.  Just because 'A' did it and then 'B' did it does not make 'B' right.  If ATI started the same practice, it would be just as bad.

The same could be said of coding (Metro 2033), but this can be overcome by intelligent and pragmatic software engineering (driver support).  However, the act of diminishing standards for the opposition through hardware/architectural manipulation of closed standards is just plain wrong.

Remember, just because it's not technically illegal doesn't make it right.  Hamstringing the opposition to coerce consumers into buying another product is never a very savoury thing to do.

Defending this practice is a very bad road to take.


----------



## Mr McC (May 5, 2010)

ShiBDiB said:


> it's an nvidia bashing thread... how is that not trolling and why is it in the games section



I am most certainly bashing Nvidia; however, I have attempted to justify and explain my stance. I did not simply state what I felt in order to annoy those who hold an opposing point of view. If you disagree with me, please tell me why; in other words, attack the substance of my argument, rather than issuing accusations.


----------



## Benetanegia (May 5, 2010)

cadaveca said:


> That said, AA in Unreal3 engines works in many other titles. The simple fact it doesn't work at all on ATi cards says something far greater than nVidia paying for it to NOT work on ATI cards...
> 
> The damning fact is that had nVidia NOT helped develop the AA code, the developers would have been forced to either write code for both, or none at all.



No game using UE3 has built-in AA, except Batman and a couple more, I think, which chose to implement their own AA. Batman implemented it because Nvidia did the work, plain and simple. The fact of the matter is that 90% of games using UE3 have no AA, and Batman would have been just another one. I fail to see how implementing something is bad, and I find it funny that people would rather have it not implemented at all. Pretending that AA for no one is better than AA for half the people is stupid.


----------



## cadaveca (May 5, 2010)

The choice of how it was implemented is the issue. Every other UE3 game that supports AA supports it on both sides of the graphics pile.

I'm not saying that using it was bad... but they have openly stated that they restricted it to nV cards due to custom code developed by nV.

That's fine... but...

In other words, they couldn't do it themselves. So they suck as programmers. THAT is how it's bad.

I only mention the fact that it wouldn't have been there at all to highlight how incapable they are. Just a simple misunderstanding between us.

I'm not knocking nVidia for helping with that either... other than that they shouldn't be messing with stuff outside their closed APIs. PhysX, CUDA, whatever... sure, go right ahead. AA code... nope.


----------



## KainXS (May 5, 2010)

I say it's kinda like getting mad at someone who studied for a test and passed while your friend, who didn't study, failed. Nvidia studied, AMD didn't, and people with AMD cards are mad, but a lot of the blame lies with AMD.


----------



## Mr McC (May 5, 2010)

sneekypeet said:


> I guess when the OS was loaded you were OK with Bill Gates and his "the way a PC is to be run"?
> 
> Same practices, and let's not forget Intel's latest fiasco with OEM builders and what they did to AMD.
> 
> ...



My only argument is that two wrongs don't make a right; nothing you have stated is in dispute. How long would Microsoft last if Linux supported DirectX?

My intention is not to start a flame war, so if you feel that this post will bring that about, go ahead and lock it up with my blessing.

Cheers


----------



## sneekypeet (May 5, 2010)

My point was that while I understand the frustration, its nothing new. As long as things continue to stay civil, I have no intentions to lock it up.


----------



## BababooeyHTJ (May 5, 2010)

sneekypeet said:


> Ever think that when you buy that cheaper ATI card (price-wise), development doesn't come into play? It isn't like Nvidia users aren't getting stabbed in the backside on pricing to offset said R&D.



 What was that about never selling a graphics card for more than $599?


----------



## sneekypeet (May 5, 2010)

A nearsighted guesstimate?

I'm not here to defend any stance. I am here to say that if you pay out the butt up front, you can have all the goodies. If you choose not to buy it, I don't see why you'd complain at all. In the end, you made that choice.


----------



## Mr McC (May 5, 2010)

newtekie1 said:


> Your assumption that AA is something that exists in all graphics engines and game titles by default is wrong.



You have misunderstood me. I did not state that AA is something that exists in all graphics engines, but rather something that should exist in all graphics engines. Where we differ is that I place onus on the developer to ensure that this feature is included, irrespective of the engine they decide to use.


----------



## ucanmandaa (May 5, 2010)

I think Gears of War (a UE3 game) has built-in AA, but only in DX10 mode.


----------



## cadaveca (May 5, 2010)

Mr McC said:


> You have misunderstood me. I did not state that AA is something that exists in all graphics engines, but rather something that should exist in all graphics engines. Where we differ is that I place onus on the developer to ensure that this feature is included, irrespective of the engine they decide to use.



Yes, this is my thing as well. If they cannot do it, they shouldn't be developing games in the first place. AA is basic stuff. Next, they won't be able to get AF to work...

I mean, really... you can expect very little from those developers, for sure. If it were any character other than Batman, the game would have been passed over by everyone that hyped it as being so good.


----------



## Benetanegia (May 5, 2010)

cadaveca said:


> The choice of how it was implemented is the issue. Every other UE3 game that supports AA supports it on both sides of the graphics pile.
> 
> I'm not saying that using it was bad...but they have openly stated that they restricted it to nV cards due to custom code developed by nV.
> 
> ...



It wasn't making it that prevented them from implementing it on AMD hardware as well; it was *testing*. The developers did ask AMD to help them implement and test it, and AMD refused. In fact, according to the developers, AMD didn't help with the development at all. After the launch AMD did say they would help add it, but AFAIK it has not been implemented yet, and I doubt it ever will be.

Testing and QA cost a lot, and independent studios just can't pay for testing. Even the big studios leave features out because testing them costs a lot.


----------



## Mr McC (May 5, 2010)

Wile E said:


> I have to agree with Newtekie. Optimizing a title for nVidia is not the same as crippling ATI. The "crippling ATI" argument has no grounds to stand on. Don't you think ATI would have sued nV for anti-trust by now if that was actually what was happening?



Possibly, but I would need to see the contract between Nvidia and the developer in question and it is certainly walking a very thin line. Moreover, I don't want features such as AA to be dependent upon a bidding competition between the two companies: there are certain things that I expect as given.


----------



## Wile E (May 5, 2010)

cadaveca said:


> The choice of how it was implemented is the issue. Every other UE3 game that supports AA supports it on both sides of the graphics pile.
> 
> I'm not saying that using it was bad...but they have openly stated that they restricted it to nV cards due to custom code developed by nV.
> 
> ...


That's not the fault of nV or TWIMTBP, though. That's purely the fault of the dev house.


----------



## newtekie1 (May 5, 2010)

cadaveca said:


> That said, AA in Unreal3 engines works in many other titles. The simple fact it doesn't work at all on ATi cards says something far greater than nVidia paying for it to NOT work on ATI cards...



There are also many Unreal Engine 3 games that have no AA at all.  Adding AA to the engine is probably pretty easy, but it is not as easy as flipping a switch in a control panel and expecting it to work with all hardware.  The issue with it not working on ATi likely comes down to a simple driver issue.  There are plenty of things that work perfectly fine on some hardware and not on others.  We see games that function properly with one driver, and crash constantly with another.

To expect something like AA to just work on all hardware without it being tested or even developed for that hardware and its drivers in any way is a little inane.



cadaveca said:


> Which, whether you want to argue that they paid for the development, so it's their right to restrict its use, or that the GPU architectures are different, so require different code, doesn't matter. *The damning fact is that had nVidia NOT helped develop the AA code, the developers would have been forced to either write code for both, or none at all.*



You are right, and since this was a console port with no AA in it to begin with, it likely would have been the second option: none at all.  And in that situation I'm with Benetanegia.  I'd rather see it available for some than not at all.



cadaveca said:


> I mean, I more than welcome nVidia to get developers to work with PhysX. But AA is a different story...they just simply shouldn't have messed in that part of the game. The act of helping develop the code forced a situation where the developer would naturally expect ATi to provide the same help, rather than relying on their own in-house resources. The complaint has nothing to do with the code itself, but the actual effect on the companies' mentality when dealing with ATi that is the issue.



I don't think that is reasonable.  If nVidia wants to add features to the game, they should.  I don't believe developers or the industry expect it.  It isn't like the developers are sitting there demanding money from either company to include basic features of the game.  NVidia saw an opportunity to improve gameplay for their customers, and they did it.



cadaveca said:


> nVidia just shouldn't have developed code for something as basic as AA in the Unreal engine. It's really not that hard to do, or they'd be the only Unreal game with AA...and they certainly are not.



Developing AA is one thing.  Developing AA that has next to no performance hit is another, and that is what was done with Batman.  Most AA implementations in Unreal Engine games take a rather large toll on performance, mainly because they aren't actually using the engine itself to enable AA, but instead simply enabling AA through the graphics driver.



cadaveca said:


> DX10 was delayed due to nV not supporting the API properly. DX10.1 was barely a whisper, again, thanks to a lack of nV support.



DX10.1 was mainly performance enhancements made to DX10 after it was released.  You make it sound like DX10 was cut back because of nVidia; that simply isn't true.  DX10 was what DX10 was supposed to be, but Microsoft saw a way to improve on it, similar to DX9 and DX9b/c.

In terms of DX10.1, nVidia didn't need it.  Their hardware was already faster than ATi hardware, even with ATi hardware supporting DX10.1.



cadaveca said:


> nVidia truly is ruining the industry, and in more ways than one. They are helping the software development side stagnate...these developers getting help for things like AA really should be more than capable of doing such things themselves, but due to a lack of basic skills, they had to go elsewhere.
> 
> It's really sad that a developer can't even implement AA because they are incapable of hiring the proper staff. It's a sad excuse for nVidia-only code...the entire house's management should be fired, as far as I am concerned, for delivery of a pathetic product.



It isn't just a matter of hiring the proper people and having the ability to do it.  There are also the issues of time and money, two things that are ever shrinking in the gaming industry, especially during the economic times we've seen recently.  Publishers are pushing developers to make games quicker, and developers are facing smaller budgets.  That is where nVidia comes in: to develop for them, and to provide money for developers.



cadaveca said:


> I mean, really...99.9% of the AI does exactly the same thing. Sure, it's pretty...but my god, I cannot believe the hype this game got. But then again, I know how all that works, and why it was so critically acclaimed. And I am NOT fooled.



Your opinions on the gameplay really have no business in this thread.



cadaveca said:


> LoL. Ati doesn't support PS3.0? You sure about that?



Look at what I responded to. Yes, I'm sure the X### series did not support PS3.0.

Sometimes I think infractions should be handed out for inability to read and use basic comprehension skills in a discussion.


----------



## BababooeyHTJ (May 5, 2010)

sneekypeet said:


> a nearsighted guestimate?
> 
> I'm not here to defend any stance. I am here to say that if you pay out the butt up front, you can has all the goodies. If you choose not to buy it, I don't see why there's any complaint at all. In the end you made that choice.



My point is that ATI is no better than Nvidia when it comes to ethics. The price of Cypress didn't exactly go down when they didn't have any competition. Hemlock is another story altogether.

Also, at least Nvidia supports developers. I hear that working with ATI isn't exactly the easiest thing in the world to do. Do you ever wonder why the OCCT GPU memtest only works with Nvidia? Nvidia, unlike ATI, gave the author the hardware for testing. I've heard other stories like this about ATI.

I'm not a big fan of PhysX and other closed standards, since to play games like Mirror's Edge the way they are intended I need to use Nvidia hardware, but the point that you made about hardware could also be applied to the software in question.


----------



## cadaveca (May 5, 2010)

Benetanegia said:


> Testing and QA costs a lot and independent studios just can't pay for testing. Even the big studios leave features out because testing them costs a lot.



See here:



Wile E said:


> That's not the fault of nV or TWIMTBP tho. That's purely the fault of the dev house.



Yes, exactly. Like I said, I'm not knocking nV on this. I'm trying to place blame where it belongs.

You guys are just too used to me bashing nV here. LoL.


----------



## Benetanegia (May 5, 2010)

Mr McC said:


> Possibly, but I would need to see the contract between Nvidia and the developer in question and it is certainly walking a very thin line. Moreover, I don't want features such as AA to be dependent upon a bidding competition between the two companies: there are certain things that I expect as given.



If the laziness of developers in implementing AA is your real issue, then this is not the thread you should have created. It should have been created long, long ago, and there should be no mention of Nvidia at all, because Nvidia has no blame in the issue. There are dozens if not hundreds of games that lack AA, most of the ones that use UE3 among them, so you should blame the state of PC gaming instead of blaming one of the very few companies that are really pushing for better PC gaming.


----------



## cadaveca (May 5, 2010)

newtekie1 said:


> Look at what I responded to. Yes, I'm sure the X### series did not support PS3.0.
> 
> Sometimes I think infractions should be handed out for inability to read and use basic comprehension skills in a discussion.



Uh, you owe me an apology.



> Get your box geared up for the future of games with blazing fast shader performance and watch your characters sweat with Shader Model 3.0. Shaders create the surface properties of 3D objects. *The Radeon X1900 has 48 shader processors that work in parallel with Shader Model 3.0 to render complex object surfaces—such as glistening sweat on a character’s skin—with 128-bit floating point precision.*



http://ati.amd.com/products/radeonx1900/index.html



> This is it. The new Radeon™ X1800 Series hands you the visual and performance possibilities you only dreamed of from a PC graphics processor. It has been designed with a radically new ultra-threaded 3D architecture and Shader Model 3.0,



http://ati.amd.com/products/radeonx1800/index.html

Every card since the X800 has supported PS3.0. Lack of support on legacy products is unimportant.


----------



## sneekypeet (May 5, 2010)

Dave, he has only 3 #s (X###), meaning X850/X800 or similar cards. At that time the 6800/7 series already had it.

None of my X850 XTPEs has SM3....didn't run the purty dragon test in 3DM06 till I got a 7600.


----------



## cadaveca (May 5, 2010)

Sure, but I could do without the attitude, SP. You know what I'm getting at.


----------



## Benetanegia (May 5, 2010)

cadaveca said:


> Uh, you owe me an apology.
> 
> 
> 
> http://ati.amd.com/products/radeonx1900/index.html



He said X### NOT X####. That is X800, X600... Nvidia had PS3.0 on their hardware with the GF6800; ATi was like 6 months late with PS3.0 hardware. Games like Oblivion lacked proper PS3.0 for that reason, for example.


----------



## Mr McC (May 5, 2010)

Benetanegia said:


> If the laziness of developers in implementing AA is your real issue, then this is not the thread you should have created. It should have been created long, long ago, and there should be no mention of Nvidia at all, because Nvidia has no blame in the issue. There are dozens if not hundreds of games that lack AA, most of the ones that use UE3 among them, so you should blame the state of PC gaming instead of blaming one of the very few companies that are really pushing for better PC gaming.



The developer is not blameless in this or similar cases, and if they simply need more money from Nvidia to include AA or beautifully textured water, then they need to improve their act; it is Nvidia's willingness to occupy the position of sponsor in these particular contexts that I also find censurable.


----------



## sneekypeet (May 6, 2010)

cadaveca said:


> Sure, but I could do without the attitude, SP. You know what I'm getting at.



Sorry! I wasn't giving attitude, I was correcting your post, that is all, just trying to clear up the miscommunication.


----------



## cadaveca (May 6, 2010)

Not your attitude, SP, LOL...his.


----------



## Wile E (May 6, 2010)

Mr McC said:


> The developer is not blameless in this or similar cases, and if they simply need more money from Nvidia to include AA or beautifully textured water, then they need to improve their act; *it is Nvidia's willingness to occupy the position of sponsor in these particular contexts that I also find censurable.*



That, in no way, is a negative trait. ATI should step up and offer testing and tweaking help on their hardware as well.

As far as I am concerned, ATI's lack of dev help is the problem here. The dev teams have less time and money than ever (relatively speaking) to push titles out the door. They need all the help they can get.


----------



## sneekypeet (May 6, 2010)

Oh, well hell, then I'm not sorry


----------



## Mr McC (May 6, 2010)

Wile E said:


> That, in no way, is a negative trait. ATI should step up and offer testing and tweaking help on their hardware as well.
> 
> As far as I am concerned, ATI's lack of dev help is the problem here. The dev teams have less time and money than ever (relatively speaking) to push titles out the door. They need all the help they can get.



We agree to an extent, but disagree in those areas where a company should be expected to pay to have features that I view as standard enabled for their hardware. That in no way allows ATI to evade its responsibility to cooperate.


----------



## cadaveca (May 6, 2010)

Benetanegia said:


> He said X### NOT X####. That is X800, X600... Nvidia had PS3.0 on their hardware with the GF6800; ATi was like 6 months late with PS3.0 hardware. Games like Oblivion lacked proper PS3.0 for that reason, for example.



Yes, the issue with his statement is in how he chose to knock GT90's statement.

GT90 is in essence right. PS3.0 should have been reserved for DX10, but nV pushed it forward, and then didn't need to support DX10, as everything was already in DX9.0c. They screwed DX10 by doing that, and THAT is what GT90 remembers reading, and how nV split DX10 into two separate parts.

The X1900 bridged that gap in the APIs (and why I brought it up), but we could have had a full-fledged R600 instead, had nV waited for DX10 for PS3.0. R600 was actually a very old design, tweaked. nV delayed its release by getting PS3.0 into DX9.0c; ATi had PS3.0 hardware in development already...nV was just first to market with it.


----------



## DaedalusHelios (May 6, 2010)

Mr McC said:


> We agree to an extent, but disagree in those areas where a company should be expected to pay to have features that I view as standard enabled for their hardware. That in no way allows ATI to evade its responsibility to cooperate.



Building a game is a business, not a moral institution. A developer's budget and overall payout are the only things that matter to them. PC game development doesn't meet budget as easily as console development because, aside from MMOs, PC gaming isn't as profitable as console gaming. So when a developer needs extra money to meet their budget, will they ask ATi, who offers them nothing? Nope, they ask Nvidia, who ponies up the cash and is rewarded in return with better support. It isn't that difficult to understand. ATi is just cheap with the devs, and therefore their customers will have to deal with decreased support.


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> GT90 is in essence right. PS3.0 should have been reserved for DX10, but nV pushed it forward, and then didn't need to support DX10, as everything was already in DX9.0c. They screwed DX10 by doing that, and THAT is what GT90 remembers reading, and how nV split DX10 into two separate parts.



Excuse me?  DX10 was not even a project when DX9.0c was released...

You claim to be a non-biased person, but you always come up with this pure inventions where Nvidia is the omnipresent bad guy.


----------



## cadaveca (May 6, 2010)

Benetanegia said:


> Excuse me?  DX10 was not even a project when DX9.0c was released...
> 
> You claim to be a non-biased person, but you always come up with this pure inventions where Nvidia is the omnipresent bad guy.




Uh, actually, it was. Back then it was called WGF, and was part of "Longhorn". Circa 2001. DX 9.0c released in 2004, 3 years after work on DX10 began.

http://en.wikipedia.org/wiki/Development_of_Windows_Vista

http://en.wikipedia.org/wiki/DirectX

And I make no claims to not being biased...many times I have claimed to be ATI's #1 fanboy. Uh, are you OK, man? I mean, look at my sig...why would I have that there, if bias wasn't part of everything I write (and everything everyone else writes, too, for that matter)?


----------



## newtekie1 (May 6, 2010)

Mr McC said:


> You have misunderstood me. I did not state that AA is something that exists in all graphics engines, but rather something that should exist in all graphics engines. Where we differ is that I place onus on the developer to ensure that this feature is included, irrespective of the engine they decide to use.





cadaveca said:


> Yes, this is my thing as well. If they cannot do it, they shouldn't be developing games in the first place. Basic stuff, AA. Next, they won't be able to get AF to work...
> 
> I mean really...you can expect very little from those developers, for sure. If it was any other character than Batman, the game would have been passed over by everyone that hyped it as being so good.



Game developers use pre-made engines for a reason: it is cheaper and less time consuming than creating their own. That certainly does not make them unsuited to develop games.  If we said that every game developer that uses a pre-made engine shouldn't be making games, we wouldn't see 90% of the games we have, including some extremely good ones.

It takes huge resources to develop an engine, and it takes huge resources to implement such things as in-game AA, which is exactly why most game developers use another, larger studio's pre-built engine.  We are talking about some pretty big developers too:  EA, Midway, Sony, Square Enix.  Some pretty good games also: X-Men Origins, Unreal Tournament 3, Rainbow Six Vegas and Vegas 2, Stranglehold, Mirror's Edge, and the list goes on.

Hell, even Unreal Tournament 3, the premier game on the Unreal Engine 3 lacks in game AA.



cadaveca said:


> Uh, you owe me an apology.
> 
> 
> 
> ...



No, just because you have an inability to read and comprehend, that does not mean I owe you an apology.



sneekypeet said:


> Dave, he has only 3 #s (X###), meaning X850/X800 or similar cards. At that time the 6800/7 series already had it.
> 
> None of my X850 XTPEs has SM3....didn't run the purty dragon test in 3DM06 till I got a 7600.





Benetanegia said:


> He said X### NOT X####. That is X800, X600... Nvidia had PS3.0 on their hardware with the GF6800; ATi was like 6 months late with PS3.0 hardware. Games like Oblivion lacked proper PS3.0 for that reason, for example.



See, they don't have a problem with reading and comprehension.


----------



## HeroPrinny (May 6, 2010)

Chill, people, chill. This could be a very interesting topic, but this back-and-forth slagging ruins it.


----------



## cadaveca (May 6, 2010)

newtekie1 said:


> Hell, even Unreal Tournament 3, the premier game on the Unreal Engine 3 lacks in game AA.



Actually, only in DX9 does it not support AA, due to its use of deferred rendering. It's the use of deferred rendering that makes it THE choice for games with PhysX. There's no reason for AA to be left out of DX10 UE3 games.



> Tim Sweeney: Yes, we'll ship Unreal Tournament 3 with full DirectX 10 support. Support for multisampling is the most visible benefit.
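Why deferred rendering and plain hardware MSAA clash can be shown with a toy calculation (the shading function and normals below are invented for illustration, not taken from UE3): a deferred renderer stores surface attributes in a G-buffer and lights them later, and because lighting is nonlinear in those attributes, averaging the samples *before* lighting gives a different answer than lighting each sample and averaging afterwards.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def shade(normal, light=(0.0, 0.0, 1.0), shininess=32):
    # toy specular term: deliberately nonlinear in the G-buffer normal
    d = max(sum(a * b for a, b in zip(normalize(normal), light)), 0.0)
    return d ** shininess

# two samples inside one pixel, straddling a geometry edge
n1 = (0.0, 0.0, 1.0)   # surface facing the light
n2 = (1.0, 0.0, 0.0)   # surface facing sideways

# correct: shade each sample, then average the lit colours
per_sample = (shade(n1) + shade(n2)) / 2          # 0.5

# wrong: let the hardware average the G-buffer normals, then shade once
averaged = tuple((a + b) / 2 for a, b in zip(n1, n2))
post_resolve = shade(averaged)                    # ~1.5e-5: the highlight vanishes
```

So a naive resolve of the G-buffer produces visibly wrong edges, which is why deferred engines need per-sample access to the multisample buffer (the DX10.1-era feature discussed later in the thread) rather than a fixed-function resolve.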


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> Uh, actually, it was. Back then it was called WGF, and was part of "Longhorn". Circa 2001. DX 9.0c released in 2004, 3 years after work on DX10 began.
> 
> http://en.wikipedia.org/wiki/Development_of_Windows_Vista
> 
> http://en.wikipedia.org/wiki/DirectX



Vista is not *that* Longhorn; in fact, Vista was more of a rushed Win7 than anything else. The original Longhorn was delayed and changed greatly over time, eventually adopting many things from the next OS M$ was working on, now called Win7. Things are not black and white: M$ released 9.0c because Longhorn was no longer what it was planned to be, an OS for a 2003 release. Did they include things that were supposed to be part of Longhorn? Obviously they could not wait until Vista (aka the rushed Win7) was released, so they implemented things that were much needed. Did they cripple DX10 because of that? Nope, and certainly Nvidia had nothing to do with it, which is what my post was about anyway.


----------



## EastCoasthandle (May 6, 2010)

You know what's interesting about that review is that even with those IQ features disabled the 480 still loses.  I read some of the posts over there and one person said that the benchmark test clearly put the 480 ahead.  I wonder if the drivers were optimized for the benchmark?


----------



## cadaveca (May 6, 2010)

Benetanegia said:


> Vista is not *that* Longhorn; in fact, Vista was more of a rushed Win7 than anything else. The original Longhorn was delayed and changed greatly over time, eventually adopting many things from the next OS M$ was working on, now called Win7. Things are not black and white: M$ released 9.0c because Longhorn was no longer what it was planned to be, an OS for a 2003 release. Did they include things that were supposed to be part of Longhorn? Obviously they could not wait until Vista (aka the rushed Win7) was released, so they implemented things that were much needed. Did they cripple DX10 because of that? Nope, and certainly Nvidia had nothing to do with it, which is what my post was about anyway.



We can start another thread on this all on its own; there's so much to this subject. So I'll leave the O/T chat for now. Your impression of what Longhorn, WGF, and DX10 are is a bit skewed, but if you can put up with me, I'd more than love to talk about it in another thread.


----------



## bpgt64 (May 6, 2010)

The developer is only responsible for delivering a product that the vast majority of their target audience will be willing to purchase, and that will thus sustain the company until the next release.  If you take issue with the product, vote with your wallet.  Just like the graphics card company has to weigh features against cost, in the end the goal is simply to make a profit.

We are the minority that spends heavily (in comparison to the average); we demand more and drive the industry.  If Nvidia wants to ensure their product has a greater feature set, and yes, paying a developer to test code is a valid way to do so, then it is in no way Nvidia's fault for doing so.  Where things get dicey is when a company can pay to put a competitor at a disadvantage intentionally.  That, if provable (which it conclusively has not been and never will be in a forum such as this), goes against anti-trust (mainly anti-competition) laws, like the ones Apple is about to face, despite being a minority provider.

Now, you're thinking that paying a developer to test their code on your hardware by definition puts an opposing GPU manufacturer at a disadvantage.  No, it doesn't: ATi had a choice, and it made its decision.   Again, vote with your wallet if you're unhappy with your horse.


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> There's no reason for AA to be left out of DX10 UE3 games.



I think we can all agree with that, although I'm not sure they did implement it in the end. And there's no reason to leave it out of DX9 either. It can be done after all, and Batman is the proof. Fact of the matter is that each and every UE3 game released to date is a DX9 game, as is 99% of the games released (copy/pasting the code and creating a DX10 executable doesn't make it a DX10 game). It's the choice that developers are making, and only they are to blame. Nvidia's intervention (or AMD's in the case of the few games where they helped) is always a good thing for us, consumers, gamers.


----------



## cadaveca (May 6, 2010)

Sure, but my point is that in DX10 UE3, AA is natively supported (on DX10 hardware).

Q: So why did nV need to code it?

A: For DX9 cards.

Q: So why does that affect ATi cards in DX10?

A: Because the developer purposely changed the AA implementation throughout the engine, breaking AA on ATi's cards, using nV's code.

So nV broke AA in Batman, and seemingly purposefully. This is the real basis for the whole argument in the first place.


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> Q: So why does that affect ATi cards in DX10?
> 
> A: Because the developer purposely changed the AA implementation throughout the engine, breaking AA on ATi's cards, using nV's code.



Once again, no. DX10 supports AA with deferred shading, but UE3 does not, plain and simple. They said they would implement it on DX10, but I highly doubt they did in the end. And all that is irrelevant, since every game is using the DX9 code anyway (is there even a DX10 code path for UE3?). My point is it doesn't matter whether Epic implemented AA on DX10 UE3 or not (if it exists at all), because developers are buying and using the DX9 one, for multi-platform purposes.


----------



## cadaveca (May 6, 2010)

Yes, there is a DX10 code path, and I know for a fact that UE3 supports AA.

Tim Sweeney, whom I quoted above, is the engine's author, by the way. I know for a fact that AA has worked in DX10 since early 2008...you can even enable it in Vista with the UT3 demo.

And they do not "just buy the DX9 version"...one of the major selling points of UE3 is that you need not worry, as the engine will scale features automatically to the platform it's on. nV's code changes this detection method to use nV's code for AA at all times, rather than defaulting to the normal path for AA.

I mean, sure, nV's code is brilliant. It comes at very little performance penalty compared to running the native implementation, but that doesn't change the fact that the code customized the engine, and broke AA.

I can even explain why the native method comes at such a performance hit, but that's irrelevant.


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> Yes, there is a DX10 code path, and I know for a fact that UE3 supports AA.
> 
> Tim Sweeney, whom I quoted above, is the engine's author, by the way. I know for a fact that AA has worked in DX10 since early 2008...you can even enable it in Vista with the UT3 demo.
> 
> ...



I would like you to show proof of that, because I was never able to use AA in UT3 (nor Mass Effect, nor Mirror's Edge, Borderlands, Brothers in Arms....), except via the Control Panel, something that you can do with Batman too. I'd really appreciate it, because I see many people claim all these things everywhere and I have yet to see a single proof of anything.


----------



## FordGT90Concept (May 6, 2010)

cadaveca said:


> http://ati.amd.com/products/radeonx1800/index.html
> 
> Every card since the X800 has supported PS3.0. Lack of support on legacy products is unimportant.


I started the X### thing here.

9### = 9700 and 9800 family of cards (DirectX 9.0, Pixel Shader 2.0)
X### = X800 and X850 family of cards (DirectX 9.0b, Pixel Shader 2.0b)
X1### = X1800, X1900, and X1950 family of cards (DirectX 9.0c, Pixel Shader 3.0)
HD 2### = HD 2900 family of cards (DirectX 10.0, Pixel Shader 4.0)
HD 3### = HD 3800 family of cards (DirectX 10.1, Pixel Shader 4.1)
HD 4### = HD 4800 family of cards (DirectX 10.1, Pixel Shader 4.1)
HD 5### = HD 5800 family of cards (DirectX 11.0, Pixel Shader 5.0)

The X### Pixel Shader 2.0b cards debuted after the GeForce 6 series, which supported Pixel Shader 3.0.  X1### didn't debut until late 2005, after the introduction of the GeForce 7 series.  ATI was an entire generation behind NVIDIA at that time.


----------



## cadaveca (May 6, 2010)

Benetanegia said:


> I would like you to show proofs of that, because I was never able to use AA on UT3 (nor Mass Effect, nor Mirror's Edge, Borderlands, Brothers in Arms....), except via Control Panel, something that you can do with Batman too. I'd really apreciate it, because I see many people claim all those things everywhere and I have yet to see a single proof of anything.



Well, the DX10 AA uses supersampling AA. As we all know, not all drivers support this properly, so that may be a major reason people don't get it working right. It's also why it comes at such a huge performance penalty.

This is why you can't get it working in those games...they've simply turned off the switches in the GUI to enable it, but believe me, it's natively supported by the engine. Current-gen cards definitely have the power to use SSAA, but mid-range and lower cards might not be so capable.

What it boils down to is that deferred rendering is hard to balance properly for most programmers. So they have to make choices as to where they spend their dollars, as many have mentioned, and AA is low on this list, due to the aforementioned performance problems.

However, deferred rendering is the way of the future for all engines, so programmers had better get used to it, and get good at it too...and when things happen like nV stepping in, this prevents the need for them to do so. THAT's why I have issue with nV helping with AA...it's cool that nV users get a benefit, but it should not break the experience for others.
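A rough sketch of why brute-force supersampling carries such a penalty (the "scene" below is a toy half-plane, nothing from any real engine): SSAA shades every subsample and then box-filters down, so the shading cost grows with the square of the per-axis sample count.

```python
def coverage(x, y):
    # toy "scene": 1.0 above the diagonal, 0.0 below (a hard geometry edge)
    return 1.0 if y > x else 0.0

def render(width, height, samples_per_axis=1):
    """Shade samples_per_axis^2 subsamples per pixel, then box-filter down (SSAA)."""
    s = samples_per_axis
    image, shader_runs = [], 0
    for py in range(height):
        row = []
        for px in range(width):
            total = 0.0
            for sy in range(s):
                for sx in range(s):
                    # evaluate the "shader" at each subsample centre
                    total += coverage(px + (sx + 0.5) / s, py + (sy + 0.5) / s)
                    shader_runs += 1
            row.append(total / (s * s))
        image.append(row)
    return image, shader_runs

_, cost_1x = render(64, 64, 1)   # 4,096 shader runs, aliased edge
_, cost_4x = render(64, 64, 2)   # 16,384 shader runs: 4x the work for 2x2 SSAA
```

The edge pixels come out with fractional values (smoothed), but every pixel, edge or not, paid the full multiplied shading cost.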


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> Well, the DX10 AA uses supersampling AA. As we all know, not all drivers support this properly, so that may be a major reason people don't get it working right. It's also why it comes at such a huge performance penalty.
> 
> This is why you can't get it working in those games...they've simply turned off the switches in the GUI to enable it, but believe me, it's natively supported by the engine. Current-gen cards definitely have the power to use SSAA, but mid-range and lower cards might not be so capable.
> ...



Yeah, SSAA has always been an option for UE3 (always through the CP) and it's also available in BM:AA. It's MSAA that it lacks, and that's what all this is about. While not exactly MSAA, Nvidia's implementation is similar, and that's why it doesn't have as large a penalty as SSAA. And that was always the thing: you had to enable SSAA through the CP for ATi cards, but Nvidia had this MSAA + edge detect thing. SSAA has always been available, and yes, it's something any DX10 card can do because it's not really part of the fragment stage of the renderer, but that's not what has ever been discussed. I think you are a little bit lost here. So much written, and you didn't even understand the beginning of the story. SSAA is there for all cards; "MSAA" is not.


----------



## cadaveca (May 6, 2010)

Yes, but the engine has the ability to enable SSAA from the console...there's no need to use the CP to enable it, other than the developer's choice (works with the UDK).

I get what you are saying, but that's not the point. SSAA is broken too, in Batman. It should not be.


----------



## ShiBDiB (May 6, 2010)

Mods shoulda listened to me in the beginning... it didn't turn into an nvidia-bashing thread, but an e-peen war over who knows more about AA.


----------



## sneekypeet (May 6, 2010)

So? It's still civil; quit poking or you will earn yourself points.

Either post on the topic or move along, thank you.


----------



## BababooeyHTJ (May 6, 2010)

cadaveca said:


> Well, the DX10 AA uses supersampling AA. As we all know, not all drivers support this properly, so that may be a major reason people don't get it working right. It's also why it comes at such a huge performance penalty.



The quote that you posted by Tim Sweeney mentions multisampling, btw.


----------



## cadaveca (May 6, 2010)

BababooeyHTJ said:


> The quote that you posted by Tim Sweeney mentions multisampling, btw.



Yes. And what is the difference between MSAA and SSAA?


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> Yes, but the engine has the ability to enable SSAA from the console...there's no need to use the CP to enable it, other than the developer's choice (works with the UDK).
> 
> I get what you are saying, but that's not the point. SSAA is broken too, in Batman. It should not be.



It's the first time I've heard that SSAA is broken in BM:AA. It certainly isn't a widespread issue (0 Google results on this end).

Maybe:



> As we all know, not all drivers support this properly, so that may be a major reason people don't get it working right. It's also why it comes at such a huge performance penalty.



Many games have issues with SSAA, so I'm not doubting that BM:AA has some. I've had some issues with many games myself.

In any case, the point comes home that Nvidia has nothing to do with the lack of proper AA in BM:AA (in fact, if SSAA is broken, the implementation of their AA is a blessing). And neither do they with any of the many other things they've been accused of. But I guess it's easier to blame someone, even if it's not the correct one...



cadaveca said:


> Yes. And what is the difference between MSAA and SSAA?



Are you serious?


----------



## ctrain (May 6, 2010)

newtekie1 said:


> Your assumption that AA is something that exists in all graphics engines and game titles by default is wrong.



Of course it exists by default in all graphics engines; the question/problem at hand is rather whether any of the rendering techniques they use break hardware AA...

...like is the case with UE3; it's a DX9 limitation.


----------



## cadaveca (May 6, 2010)

Benetanegia said:


> Are you serious?




No, but if you understand the differences, you know that Tim was referring to SSAA only. DX10.1 is required for MSAA in deferred rendering.

Well, at least it was, and that's what makes nV's code so great.

Why that screws with SSAA, I don't know the precise details of, but I do have a couple of good possibilities. I need an nV card to find out the exact cause.

If it is using DX10.1-type commands...wow...there really is something to this...because according to nV, their cards don't support 10.1. So how the heck did they get that working?


----------



## enaher (May 6, 2010)

Well, the use of proprietary technology sucks big time, but Nvidia's TWIMTBP is not to blame. Nvidia offers extras and that's fine; I can't see anything wrong with it. Actually, I'd like ATi to work with devs, improve Stream and OpenCL applications, and promote open standards; I'm sure ATi hardware has some untapped potential.


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> DX10.1 is required for MSAA in deferred rendering.



It was always possible via a separate Z buffer (even on DX9), although it was more tricky and maybe not as desirable. *DX10.1 only made it faster and introduced a clearer interface.*

edit: Yeah, it wouldn't be exactly MSAA, but rather an edge detect algorithm + SSAA on the edges. So... it wouldn't be Coke, it would be Pepsi, but essentially the same. It's probably exactly what is being done in BM:AA.
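The edge detect + SSAA-on-the-edges idea can be sketched in a few lines (the toy scene and the neighbour-contrast test are invented for illustration; this is not anyone's actual shipping code): run an aliased base pass, flag pixels whose neighbours differ, and re-shade only those at a higher sampling rate.

```python
def scene(x, y):
    # toy scene: 1.0 above the diagonal, 0.0 below
    return 1.0 if y > x else 0.0

def render_1x(width, height):
    # aliased base pass: one shade at each pixel centre
    return [[scene(x + 0.5, y + 0.5) for x in range(width)] for y in range(height)]

def edge_aa(image, samples_per_axis=4):
    """Edge detect, then supersample only the flagged pixels."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    s = samples_per_axis
    resampled = 0
    for y in range(h):
        for x in range(w):
            neighbours = [image[ny][nx]
                          for ny in (y - 1, y, y + 1)
                          for nx in (x - 1, x, x + 1)
                          if 0 <= ny < h and 0 <= nx < w]
            if max(neighbours) != min(neighbours):   # contrast => edge pixel
                total = sum(scene(x + (sx + 0.5) / s, y + (sy + 0.5) / s)
                            for sy in range(s) for sx in range(s))
                out[y][x] = total / (s * s)
                resampled += 1
    return out, resampled

base = render_1x(32, 32)
smooth, n = edge_aa(base)   # only a thin band of pixels along the diagonal is re-shaded
```

Only the handful of edge pixels pay the supersampling cost, which is why this kind of scheme lands far closer to MSAA than to full-screen SSAA in performance.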


----------



## Nick89 (May 6, 2010)

Mr McC said:


> I am not a fan of Nvidia’s marketing practices as I firmly believe they damage the PC gaming industry, creating division through the use of proprietary technology where open standards are available that can produce exactly the same effects.
> 
> ...



 I completely agree.


----------



## newtekie1 (May 6, 2010)

cadaveca said:


> Q: So why does that affect ATi cards in DX10?
> 
> A: Because the developer purposely changed the AA implementation throughout the engine, breaking AA on ATi's cards, using nV's code.
> 
> So, nV broke AA in Batman, and seemingly, purposefully. This is the real basis for the whole argument in the first place.



A: Because the game doesn't run in DX10, it doesn't affect ATi cards in DX10; that situation doesn't exist.

So, no, nVidia did not break AA in Batman.


----------



## ctrain (May 6, 2010)

Benetanegia said:


> It was always possible via a separate Z buffer (even in DX9), although it was trickier and maybe not as desirable. *DX10.1 only made it faster and introduced a clearer interface.*
> 
> edit: Yeah, it wouldn't be exactly MSAA, but rather an edge-detect algorithm + SSAA on the edges. So... it wouldn't be Coke, it would be Pepsi, but essentially the same. It's probably exactly what's being done in BM:AA.



You cannot resolve the render target yourself in DX9; the data required for MSAA is lost.


----------



## Benetanegia (May 6, 2010)

ctrain said:


> You cannot resolve the render target yourself in DX9; the data required for MSAA is lost.



How the hell is something that you specifically stored going to be lost??


----------



## cadaveca (May 6, 2010)

You can only have so many interrupts in the pipeline. You just can't access the data at that point. That's what DX10.1 brings: that needed interrupt. It's like HDR and AA at the same time used to be...you choose one or the other.



newtekie1 said:


> A: Because the game doesn't run in DX10, it doesn't effect ATi cards in DX10, because that situation doesn't exist.
> 
> So, no nVidia did not break AA in Batman.












And why doesn't it run in DX10, then? Because you know the default behavior of the engine is to recognize the hardware and decide for itself unless told otherwise, right? I mean, you select DX10 in the options...


----------



## Benetanegia (May 6, 2010)

What the... are you both saying? You can store and load anything as many times as you want. Like I said, you store Z separately, in a different render target, so you can access it as many times as you want. You are going to access Z data plenty of times in a deferred renderer anyway. Like I said, it's not MSAA, not the hardware-accelerated MSAA, but rather an edge-detect algorithm that does supersampling on the detected edges. That's what MSAA does anyway, so it's MSAA without being MSAA, if you know what I mean. And yeah, it's slower, but not nearly as much as some make it out to be. Maybe 5-10% slower.
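To make the idea concrete: the edge-detect-plus-selective-supersampling scheme described here can be sketched outside any graphics API. The following is a hypothetical NumPy illustration (not Batman's actual code, and the `edge_mask` name and `threshold` value are invented for the example): flag pixels where the stored Z buffer has a sharp discontinuity, since those are the geometry edges that would get the extra shading samples.

```python
import numpy as np

def edge_mask(depth, threshold=0.1):
    """Flag pixels where the stored Z buffer has a sharp discontinuity.

    Geometry edges show up as large depth deltas between neighbouring
    pixels; only these flagged pixels would receive the extra
    (supersampled) shading work, which is why the cost stays low.
    """
    # Horizontal and vertical depth deltas (prepend keeps the shape).
    dz_x = np.abs(np.diff(depth, axis=1, prepend=depth[:, :1]))
    dz_y = np.abs(np.diff(depth, axis=0, prepend=depth[:1, :]))
    return (dz_x > threshold) | (dz_y > threshold)

# Toy 4x4 depth buffer: left half near (0.2), right half far (0.9).
depth = np.full((4, 4), 0.2)
depth[:, 2:] = 0.9

mask = edge_mask(depth)
# Only the column where near meets far gets flagged; flat interior
# pixels are untouched, so most of the frame is shaded once.
```

This is only the detection half; the renderer would then shade the flagged pixels at several sub-pixel offsets and average, which is the "SSAA on the edges" part.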


----------



## EastCoasthandle (May 6, 2010)

But who's still buying the game?  I recall many with 260s complaining about the high frame rate hit that those 2 IQ features put on their PCs.  So I get the impression that this game isn't fully playable with these features unless you have a 470/480 anyway.  

What I'm getting at is are those features a game breaker for you?


----------



## entropy13 (May 6, 2010)

EastCoasthandle said:


> But who's still buying the game?  I recall many with 260s complaining about the high frame rate hit that those 2 IQ features put on their PCs.  So I get the impression that this game isn't fully playable with these features unless you have a 470/480 anyway.
> 
> What I'm getting at is are those features a game breaker for you?



Well the review does say that even with the lack of those Water Simulation and Bokeh Filter options for the 5870, the 5870 would still be the best card for that game. Unless of course if you're an Nvidia fanboi then those two options would be the Second Coming...


----------



## cadaveca (May 6, 2010)

I dunno, the water difference is pretty drastic. I don't understand the choices, but I do know what's more pleasing to the eye, and the 5870 ain't gonna get it.

I'll still buy the game, but I won't play it on ATI hardware. Really, in the end, it's about how much fun it is, and damn it...I like it.

No deal breaker. I bought (and finished) Batman too. My kids ask me to play it (Batman) at least once a week.


----------



## entropy13 (May 6, 2010)

Well, go on ahead if you're going to buy the more expensive card because it makes the water look better. It is your money after all, which is the primary target of the Water Simulation and Bokeh Filter anyway, being "NVIDIA EXCLUSIVE - TWIMTBP" features.


----------



## cadaveca (May 6, 2010)

I'm the guy that has all the consoles too, just for the platform exclusives.

It's guys like me that they are really catering to...the ones that have the cash to spend to get what they want. Screw voting with my wallet...at least somebody is gonna get fed from the job I help support. They can deal with the bad press just fine.

I mean...it sucks, it really does...but it employs people and feeds their kids. If they want to run around and be incompetent, that's just fine by me. A free society is all about that.


----------



## ctrain (May 6, 2010)

Benetanegia said:


> How the hell is going to be lost something that you specifically stored??



This isn't how it works; there is no official way to do this stuff in DX9. Reading values from the depth buffer in DX9 is vendor-specific and you might have to use a special format for it. It's all a hack under the hood. You cannot do a custom resolve on a multisampled depth buffer in DX9.




Benetanegia said:


> What the... are you both saying? You can store and load anything as many times as you want. Like I said, you store Z separately, in a different render target, so you can access it as many times as you want. You are going to access Z data plenty of times in a deferred renderer anyway. Like I said, it's not MSAA, not the hardware-accelerated MSAA, but rather an edge-detect algorithm that does supersampling on the detected edges. That's what MSAA does anyway, so it's MSAA without being MSAA, if you know what I mean. And yeah, it's slower, but not nearly as much as some make it out to be. Maybe 5-10% slower.



I don't think you understand what SSAA is, because the method you describe doesn't make much sense.


----------



## Dazzeerr (May 6, 2010)

Benetanegia said:


> He said X### NOT X####. That is X800, X600... Nvidia had PS3.0 on their hardware GF6800, ATi was like 6 months late with PS3.0 hardware. Games like Oblivion lacked proper PS3.0 for that reason, for example.



Does this mean ATI paid Microsoft to release their DX11 cards first?

Give us a break.


----------



## Mr McC (May 6, 2010)

EastCoasthandle said:


> But who's still buying the game?  I recall many with 260s complaining about the high frame rate hit that those 2 IQ features put on their PCs.  So I get the impression that this game isn't fully playable with these features unless you have a 470/480 anyway.
> 
> What I'm getting at is are those features a game breaker for you?



I believe that there are sufficient other options available to allow me to continue to play games without supporting such actions. Too often, when consumers complain about shady marketing practices within the PC gaming industry (abusive DRM, TWIMTBP, etc.), their peers in forums urge them to purchase the title anyway, because it's a great game or because we have already consented to being "shafted", as the reviewer in HardOCP puts it, by our decision to use the Windows platform. Certainly, as individuals, our ability to shape or control the market is limited; however, collectively, if the developers see that their practices are costing them sales and their "special relationship" with Nvidia is actually detrimental to their interests, they may be less willing to employ proprietary technology where open standards are available. Developers may be able to measure the money they save by allowing Nvidia to develop certain aspects of their games, but they cannot measure the lost sales resulting from consumers' perception that a title has been crippled for their hardware; moreover, if we remain silent, they are unlikely to even consider this possibility. I can accept that the refusal to buy is a futile gesture, when considered at the level of the individual, and I have no doubt that I will miss many great titles as a result of my principles, but the choice of whether or not to purchase is one of the few options for input that remains to us, perhaps the most important. However, it is equally important to let a company know why I refuse to purchase, in order to enable it to either address my concerns or simply dismiss me whilst admitting awareness of my issues.

By refusing to buy, I punish the developer for assigning the development of features, which I believe are developer's responsibility, to Nvidia; I punish Nvidia by showing them that they have wasted money and resources that will reap no end benefits whilst associating their name with questionable marketing practices. To what extent I am punishing myself remains to be seen, but I can live quite happily without ever playing Just Cause 2.


----------



## erixx (May 6, 2010)

I buy games that 'I have to play', no others. As there really are only a few 'I have to play' games, I couldn't care less about the backgrounds, sorry. 

I played the Just Cause 2 demo; its 'all ages', mindless shooting-range style is not for me, but just as a theoretical exercise, let's suppose you like it. Are you not going to play it because of some corporate junk or whatever? Well, ok..... but that's like not visiting a state museum just because 'your' party was not elected to government.


----------



## Mr McC (May 6, 2010)

erixx said:


> I buy games that 'I have to play', no others. As there really are only a few 'I have to play' games, I couldn't care less about the backgrounds, sorry.
> 
> I played the Just Cause 2 demo, it's 'all ages' and mindless shooting range style is not for me, but just as a theoretical simulation, let's suppose you like it. Are you not going to play it because some corporate junk whatever? Well, ok.....  but that's like not visiting a state museum just because 'your' party was not elected for government



As the years go by, there are fewer and fewer games that I "have to play". There is no need to apologise for your stance, the fact that you do not feel that the issues I have highlighted are worthy of consideration does not make your point of view any less valid and, needless to say, I understand your perspective.

Where and when we draw the line is a personal decision and the "corporate junk" that is but a minor annoyance for one user may prove intolerable for another. I am not going to play it for the reasons outlined above and because I feel that there are enough viable alternatives to ensure that I don't have time to dwell on what I might be missing.


----------



## Benetanegia (May 6, 2010)

ctrain said:


> this isn't how it works, there is no official way to do this stuff in dx9. reading values from the depth buffer in dx9 is vendor specific and you might have to use a special format for it. it's all a hack under the hood. you cannot do a custom resolve on a multisampled depth buffer in dx9.
> 
> 
> 
> ...



You can read the depth buffer and store it like a texture. In a deferred engine you are going to be reading that buffer many times: for lighting, shadowing, effects... Look, I don't know exactly how it works, because I am not a coder, but I do read a lot on the Gamedev.net and Beyond3D forums and there are dozens of threads about this, and they say it can be done in this way. Batman is the clear example that it can be effectively done, so fight as much as you want, *it can be done and probably in this way*.


----------



## Mr McC (May 6, 2010)

Benetanegia said:


> Batman is the clear example that it can be effectively done, so fight as much as you want, *it can be done*.



This is precisely the point, and I think the fact that AA is enabled for Nvidia hardware in the Batman title invalidates any argument that the absence of in-game AA for ATI hardware is solely and exclusively attributable to the developer's choice of engine. Are we seriously saying that if Nvidia had not come to the rescue, the game would have been released without in-game AA on the PC? If that is the case, I find Nvidia's willingness to allow the developer to evade its responsibility to include this feature to be open to criticism; indeed, it is not a role that I wish to see either company perform or actively pursue.

If and when developers begin to place more emphasis on tessellation, I would never dream of complaining about the better performance of Nvidia's products: the 480 and 470 offer superior tessellation, an open standard that forms part of DirectX 11, and I made the decision to buy an ATI product that is inferior in this regard. However, the issue in this case is AA, and I feel that, at the very least, we should expect developers to be able to produce games that comply with minimum standards without the need for additional economic or technical support from ATI or Nvidia. Other companies can clearly meet these expectations, and I feel that artificially introduced differences at this level can only hurt the consumer and ultimately the industry as a whole.


----------



## entropy13 (May 6, 2010)

Mr McC said:


> This is precisely the point and I think that the fact that AA is enabled for Nvidia hardware on the Batman title invalidates any argument that the absence of in-game AA for ATI hardware is solely and exclusively attributable to the developer's choice of engine. Are we seriously saying that if Nvidia had not come to the rescue the game would have been released without in-game AA on the pc? If that is the case I find Nvidia's willingness to allow the developer to evade its responsibility to include this feature to be open to criticism, indeed, it is not a role that I wish to see either company perform or actively pursue.
> 
> *If and when the developers begin to place more emphasis on tessellation, I would never dream of complaining about the better performance of Nvidia's products: the 480 and 470 offer superior tessellation,* an open standard that forms a part of DirectX 11, and I made the decision to buy an ATI product that is inferior in this regard. However, the issue in this case is AA and I feel that, at the very least, we should expect developers to be able to produce games that comply to minimum standards without the need for additional economic or technical support from ATI or Nvidia. Other companies can clearly meet these expectations and I feel that artificially introduced differences at this level can only hurt the consumer and ultimately the industry as a whole.



The bolded part is really quite important too. Remember that during the period when only ATi cards were DirectX 11 capable, Nvidia consistently insisted that buying a DX11 card for tessellation would be a waste of money, since it isn't really that much of a "new feature" compared to DirectX 10. But when they finally released their own cards, which were waaaaaaaaaaaaaaaaaaaaaaay overdue, to make consumers and potential customers forget about price/perf, power/perf, temps, and frying eggs and burger patties, tessellation was immediately a gift from the Gods and Nvidia cards were suddenly godsend cards quite capable of tessellation, which is the bestest ever feature everest forever best.


----------



## Fourstaff (May 6, 2010)

My take on the AA issue:

Developer develops a game on an engine without AA. Nvidia comes along and says: "We will pay you to develop AA functionality for our products as a way to improve your product for our customers." I see nothing wrong here; ATi was never part of the equation, so they never received anything. 

Gamer with ATi hardware sees the game and comments: "Hey look, you can only activate AA with Nvidia hardware! The devs must have received Nvidia's coin and disabled AA with ATi hardware!" *boycotts game*

I come along and ask the question: if Nvidia hadn't provided funding and assistance to the game dev, would the dev have provided AA functionality anyway? If the answer is yes, then Nvidia is being the bad guy here, manipulating things to its own benefit. If the answer is no, Nvidia is a good manufacturer that cares about maximising its customers' satisfaction.


----------



## claylomax (May 6, 2010)

Bundy said:


> Or pragmatism, depends on how you want to consider the situation.



Top drawer!


----------



## Mr McC (May 6, 2010)

Fourstaff said:


> My take on the AA issue:
> 
> Developer develops a game on an engine without AA. Nvidia comes along and says: "We will pay you to develop AA functionality for our products as a way to improve your product for my customers" I see nothing wrong here, ATi was never part of the equation, so they never received anything.
> 
> ...



Could you imagine the backlash if a game were released without any AA on the PC in this day and age? A more interesting question, one that leads to various suppositions, is why the developer specifically chose that engine, knowing that it would hinder their ability to enable AA. Were they aware, prior to making the engine choice, that Nvidia would be on hand with cash and expertise?


----------



## entropy13 (May 6, 2010)

Fourstaff said:


> My take on the AA issue:
> 
> Developer develops a game on an engine without AA. Nvidia comes along and says: "We will pay you to develop AA functionality for our products as a way to improve your product for my customers" I see nothing wrong here, ATi was never part of the equation, so they never received anything.
> 
> ...



http://www.hexus.net/content/item.php?item=20991



> AMD received an email dated Sept 29th at 5:22pm from Mr. Lee Singleton General Manager at Eidos Game Studios who stated that Eidos’ legal department is *preventing Eidos from allowing ATI cards to run in-game antialiasing in Batman Arkham Asylum due to NVIDIA IP ownership issues over the antialiasing code, and that they are not permitted to remove the vendor ID filter*.



So the AA should work for both...except for the vendor ID filter.
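Mechanically, the vendor ID filter the Hexus email describes is a trivial gate: the game reads the GPU's PCI vendor ID and only exposes the AA path when it sees Nvidia's. A hypothetical sketch of that logic (the PCI vendor IDs below are the real, publicly registered values; the function name is invented for illustration, not taken from the game's code):

```python
# Publicly registered PCI vendor IDs.
VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002

def aa_option_visible(vendor_id: int) -> bool:
    """Hypothetical vendor-ID filter: the in-game AA toggle is only
    exposed when an Nvidia GPU is detected, even though the underlying
    code path could in principle run on other hardware."""
    return vendor_id == VENDOR_NVIDIA

# With the filter in place, actual hardware capability never enters
# into it -- the decision is made purely on the reported vendor ID.
assert aa_option_visible(VENDOR_NVIDIA) is True
assert aa_option_visible(VENDOR_ATI) is False
```

The point of the quoted email is that a check like this, rather than any hardware limitation, is what blocks the in-game AA option on ATI cards.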


----------



## Fourstaff (May 6, 2010)

Mr McC said:


> Could you imagine the backlash if a game was released without any AA on the pc in this day and age? A more interesting question, that leads to various suppositions, is why the developer specifically chose that engine, knowing that it would hinder their ability to enable AA. Were they aware, prior to making the engine choice, that Nvidia would be on hand with cash and expertise?



Yes, and I can't see why they wouldn't produce a game without AA. At least Farmville doesn't have it. And yes, you grasped my thoughts on the matter, on how Nvidia's TWIMTBP has affected the gaming industry. As I see it, Unreal Engine is a popular engine, so the devs had no devious reason to choose that particular one. I am sure they were aware that Nvidia could help them, but I don't think they were going to bet on that fact.


----------



## Benetanegia (May 6, 2010)

Mr McC said:


> This is precisely the point and I think that the fact that AA is enabled for Nvidia hardware on the Batman title invalidates any argument that the absence of in-game AA for ATI hardware is solely and exclusively attributable to the developer's choice of engine. Are we seriously saying that if Nvidia had not come to the rescue the game would have been released without in-game AA on the pc? If that is the case I find Nvidia's willingness to allow the developer to evade its responsibility to include this feature to be open to criticism, indeed, it is not a role that I wish to see either company perform or actively pursue.





Fourstaff said:


> I come along and asks the question: If Nvidia didn't provide funding and assistance to game dev, will they provide AA functionality anyway? If answer is yes, then Nvidia is being a bad person here, manipulating things to its own benefit. If answer is no, Nvidia is a good manufacturer, he cares about maximising his customer's satisfaction.



That is the case: *if Nvidia hadn't told them to use AA, they would NOT have included AA*, just as 90%+ of games using UE3 did not include AA. According to you, Mr McC, that is open to criticism. Well, I don't see you saying anything about all the other UE3-based games that don't have AA either. UE3 + no AA is the norm, not the exception, and although it's something I don't like personally, that's not what is being discussed here. What is being discussed here is what Nvidia did, and what they did is *fix the situation by getting them to implement AA, not make it worse.* 

And if what ctrain said here is true, "reading values from the depth buffer in dx9 is vendor specific and you might have to use a special format for it", right there is the answer as to why AA was only activated when an Nvidia card was detected, why changing the exe name would let you enable the feature on Ati cards but it didn't actually work anyway, and why the developer asked Ati to help them implement and test the feature for Ati cards.

It's the same in Just Cause 2: if Nvidia hadn't told them to include those features, they would never have included them to begin with, resulting in an inferior product for everybody. On top of that, they probably used CUDA because at the time they were creating the tech for the game, only CUDA was available. It's only been a few months since there has been full OpenCL and DirectCompute support from both camps (Ati being the last one, btw), so it was just impossible to make the features using them.

If you are upset because you feel that developers evade their responsibility by not including AA or any other feature that you (*and only you*) feel is a requirement, make a thread about that, but don't create an Nvidia-bashing thread with no reason behind it.



Mr McC said:


> Could you imagine the backlash if a game was released without any AA on the pc in this day and age? A more interesting question, that leads to various suppositions, is why the developer specifically chose that engine, knowing that it would hinder their ability to enable AA. Were they aware, prior to making the engine choice, that Nvidia would be on hand with cash and expertise?



Dozens of games are released without AA every year.

And as to why they chose UE3... you are showing your ignorance here...


----------



## newtekie1 (May 6, 2010)

cadaveca said:


> And why doesn't it run in DX10, then? Cuase you know the default behavior of the engine is to recognize the hardware and decide itself unless told differently, right? I mean, you select DX10 in the options...



Just because the game engine supports DX10 doesn't mean the game defaults to using it if available.

The game also has to be designed and tested to use that rendering path. And since Batman was a console port, with no DX10 intention from the beginning, the rendering path was not included.

And there is no place in Batman's options to select DX10.



Mr McC said:


> Could you imagine the backlash if a game was released without any AA on the pc in this day and age? A more interesting question, that leads to various suppositions, is why the developer specifically chose that engine, knowing that it would hinder their ability to enable AA. Were they aware, prior to making the engine choice, that Nvidia would be on hand with cash and expertise?



Let me see, games released without in-game AA in this day and age, off the top of my head:

Mass Effect 2 - Highly anticipated game, receiving very good reviews
X-Men Origins - Another highly anticipated game receiving good reviews

Yep, no backlash from either for not having AA.

They more than likely chose the engine because it is extremely versatile.  One of the few I've seen that works well as an FPS engine and is easily adapted to a 3rd person action game.




entropy13 said:


> http://www.hexus.net/content/item.php?item=20991
> 
> 
> 
> So the AA should work for both...except for the vendor ID filter.



Should, but doesn't (that has been shown), and again, that makes perfect sense if nVidia was the one that paid for the development of AA (or did it themselves); it is their IP to do with what they please.


----------



## HalfAHertz (May 6, 2010)

A lot has been written and said on the subject already. But the point I'd like to stress is that Nvidia has both the financial power and the legal right to do as they please in the aforementioned cases. However, I think they're missing the bigger picture here. They are dividing and distancing the few PC-only game developers that are left out there. The coders have to follow and comply with dozens of different standards, test on numerous hardware combinations, etc. They're simply spreading themselves thin, and that's why more and more are giving up and going over to other platforms like consoles, portables and even smartphones of all things...
   Instead of bullying and forcing themselves on the fragile industry, both graphics vendors should stop bickering and fighting and do everything in their power to bring PC gaming back to its rightful place - one level above all other platforms. I don't care if that means they have to combine efforts; all I care about as a consumer is that I get a better product in the end, one that deserves me spending money on new hardware. In the end they are hardware manufacturers and should stick to what they do best.


----------



## exow2 (May 6, 2010)

cadaveca said:


> DX10 was delayed due to nV not supporting the API properly. DX10.1 was barely a whisper, again, thanks to a lack of nV support.



I see in your post you basically bash Nvidia's "support" when you can't see that Nvidia is the one at least trying to let someone enjoy the game to the max. ATI didn't implement AA in the game, and whose fault is that? You blame Nvidia for that? LOL. ATI should be on top of this regardless and should be able to get AA in that game as well. What's stopping them? Nothing, actually. You call Nvidia's practices "illegal", but that's a bald-faced lie. Show me the law or act that prohibits a company from ensuring their hardware will be the most compatible with a video game.


----------



## cadaveca (May 6, 2010)

exow2 said:


> I see in your post you basically bash Nvidia's "support" when you really can't see that Nvidia is the one trying to at least have someone enjoy the game to the max. ATI didn't implement AA in the game and who's fault is that? You blame Nvidia for that? LOL. ATI should be on top of this regardless and should be able to get AA in that game as well. What's stopping them? Nothing actually, you call Nvidia's practices "illegal" but that's a bald face lie. Show me the law or act that prohibits a company to ensure their hardware will be the most compatable on a video game.



Good morning. Actually, I'm bashing the devs for using nV's support, and nV for covering their (the devs') asses. I even said nV is more than welcome to add things from their closed APIs that only users with their hardware can use. Manipulating an engine so that users without their hardware lose features is NOT ok.

Also, please read my sig...this is my opinion...I am entitled to it. There's no need for me to explain anything.

I'm not really into arguing over something like this. You'll have to ask someone else. Like I said, these things aren't going to prevent me from buying these titles...I make no bones about dropping $500 on a GTX480 so I can enjoy games like this to their fullest.


----------



## Mr McC (May 6, 2010)

exow2 said:


> I see in your post you basically bash Nvidia's "support" when you really can't see that Nvidia is the one trying to at least have someone enjoy the game to the max. ATI didn't implement AA in the game and who's fault is that? You blame Nvidia for that? LOL. ATI should be on top of this regardless and should be able to get AA in that game as well. What's stopping them? Nothing actually, you call Nvidia's practices "illegal" but that's a bald face lie. Show me the law or act that prohibits a company to ensure their hardware will be the most compatable on a video game.



I blame Nvidia for taking on a role that should have been fulfilled by the developer. Moreover, I question the motives behind the choice of the Unreal engine or the use of CUDA in a given title when there are very clearly other alternatives available that would enable the same features to be developed for the products of both companies. I do not see Nvidia as a passive agent improving an otherwise inferior product, but rather as actively promoting and cultivating such practices to the detriment of the consumer. Nvidia's actions are not illegal and, as many contributors have pointed out, they are operating according to free-market principles; however, free-market principles also entail the consumer's freedom to buy or look elsewhere.

If ATI were to take a similar approach we would soon be reduced to discussing our ATI or Nvidia build in much the same manner as users discuss their Xbox or Playstation, each with its own unique library of games. As HalfAHertz pointed out, the PC gaming industry is already on the decline and these artificially introduced differentiating features will do little to improve the situation.


----------



## Mr McC (May 6, 2010)

Benetanegia said:


> According to you, Mr McC, that is open to criticism. Well, I don't see you saying anything about all the other UE3-based games that don't have AA either.



I don't have time to address each and every aspect of the PC gaming industry that is open to criticism; moreover, this thread is specifically concerned with TWIMTBP games and directly related to the review recently published on HardOCP. Diverting attention to other areas that need to be improved, or to other questionable choices in game design, does not in any way excuse the practices under consideration in this thread.



Benetanegia said:


> If you are upset because you feel that developers evade their responsibility by not including AA or any other feature that you (*and only you*) feel is a requirement, make a thread about that, but don't create an Nvidia-bashing thread with no reason behind it.



and the reviewer on HardOCP and several contributors to this thread and....


----------



## JATownes (May 6, 2010)

Ok...I am an ATI Fanboy, as most of you already know, but I am also very objective.  Here is how I see it:

1.) Nvidia has a VERY aggressive marketing campaign that ensures their name and some proprietary technology goes into some games.  This is a business decision that the company made, because at the end of the day, the board members are there to keep the shareholders happy.
2.) ATI/AMD has a very limited marketing campaign, if any at all.  This also is a business decision, made to keep shareholders happy.  

So which is better/worse?  They are both about the same.  If anything NV needs to trim their marketing back some and ATI needs to boost theirs.  

These arguments are getting very old.  Let's not forget that we "enthusiasts" probably amount to 1-5% of both of these companies' total revenue, so they couldn't care less what we think.  Their main concern is the mainstream market that makes up the bulk of their revenue.

Bottom Line: Buy the hardware you want to play the games you want, and quit whining when one company does something you don't like.  If you don't like it, switch companies!


----------



## Benetanegia (May 6, 2010)

Mr McC said:


> I don't have time to address each and every aspect of the PC gaming industry that is open to criticism, moreover, the post is specifically concerned with TWIMTBP games and directly related to the review recently published on HardOCP. Diverting attention to other areas that need to be improved or other questionable choices in game design does not in any way excuse the practices under consideration in this thread.
> 
> 
> 
> and the reviewer on HardOCP and several contributors to this thread and....



Again, Nvidia has nothing to do with the lack of those features, so you are completely wrong. I'm not diverting attention to anything else, since the problem is the one I'm describing and not the one you pretend exists. TWIMTBP is there to add things, not to prevent anyone from implementing them, and they do add a lot of things. The same goes for AMD's own program. Look at the rest of the games: which ones have special features? Only some of those under TWIMTBP or AMD's Game. The rest are just pure console ports.

Developers are not leaving the PC market because of programs like TWIMTBP that help them develop and add features; they are abandoning it for exactly the opposite reason. On consoles they get all the help they want, even monetary help, and that's the reason they are moving to consoles. What we need is *more* intervention from Nvidia and AMD, not less. They sell graphics hardware, that's their business, but that hardware can't exist without games; it would be pointless. In the consumer space a $5 IGP can do everything except games; games are the only thing a graphics card is really needed for. Helping PC game developers is an integral part of being a PC graphics vendor. Games are made for their hardware and they have to help develop the interesting things. Plain and simple.


----------



## Mr McC (May 6, 2010)

JATownes said:


> So which is better/worse?  They are both about the same.  If anything NV needs to trim their marketing back some and ATI needs to boost theirs.



Agreed.



JATownes said:


> Bottom Line: Buy the hardware you want to play the games you want, and quit whining when one company does something you don't like.  If you don't like it, switch companies!



What you define as "whining" I define as drawing an issue to the community's attention. In any event, surely I can "whine" and also follow the rest of your advice?


----------



## JATownes (May 6, 2010)

Mr McC said:


> What you define as "whining" I define as drawing an issue to the community's attention. In any event, surely I can "whine" and also follow the rest of your advice?


Maybe whining was an inappropriate term to use.  What I am referring to is the MASSIVE amount of threads dedicated to this topic.  It seems that threads such as these are posted at LEAST once a week.  *It is repetitive and, if anything, these comments should be added to other threads that have already been created.*  Also, these ATI vs. NV threads seem to lead to a lot of trolling and flame wars, which in turn leads to infractions.  And I don't ever like seeing someone get smacked with the banstick.  (except maybe Mailman, but that is because he likes it.)


----------



## Mr McC (May 6, 2010)

Benetanegia said:


> Again, Nvidia has nothing to do with the lack of those features, so you are completely wrong. I'm not diverting attention to anything else, since the problem is the one I'm describing and not the one you pretend exists. TWIMTBP is there to add things, not to prevent anyone from implementing them, and they do add a lot of things. The same goes for AMD's own program. Look at the rest of the games: which ones have special features? Only some of those under TWIMTBP or AMD's Game. The rest are just pure console ports.
> 
> Developers are not leaving the PC market because of programs like TWIMTBP that help them develop and add features; they are abandoning it for exactly the opposite reason. On consoles they get all the help they want, even monetary help, and that's the reason they are moving to consoles. What we need is *more* intervention from Nvidia and AMD, not less. They sell graphics hardware, that's their business, but that hardware can't exist without games; it would be pointless. In the consumer space a $5 IGP can do everything except games; games are the only thing a graphics card is really needed for. Helping PC game developers is an integral part of being a PC graphics vendor. Games are made for their hardware and they have to help develop the interesting things. Plain and simple.



We must agree to disagree: what you see as helpful assistance, I see as unwanted intervention that, in the best case scenario, will not provide any incentive to developers to address and include what should be "standard features". 

Could the developer have included the water effects in Just Cause 2 via other means? Would the inclusion of such features have raised development costs beyond the developer's limit without Nvidia's "assistance"? How much money will the developer lose, given the rate at which ATI 5xxx series cards are being adopted in this niche segment of the industry, as a result of the perception that their software is not properly optimised for ATI hardware? Would any costs entailed by implementing these "special features" be offset by incorporating rather than alienating ATI users who refuse to buy "crippled software"?


----------



## brandonwh64 (May 6, 2010)

Loud Noises!


*I'm on pain meds because of a pulled tooth and this thread is funny; sorry, but I got a kick out of it!*


----------



## Benetanegia (May 6, 2010)

Both Nvidia and AMD should collaborate more; that is something we all agree on, but the fact is that no one collaborates. This is not a Nvidia-only issue. Take Eyefinity, for example; remember the story at Anandtech, how ATI kept Eyefinity a secret until the very end? They asked display makers to build bezel-less displays and also asked them to keep it secret from Nvidia. How is that any good for the consumer? It isn't. I'm not saying they should have given Eyefinity away, but preventing everyone from even knowing about it? They clearly wanted it just for themselves, and you know what? Very well done. It's business. 

There's also PhysX, and how ATI not only didn't support it, but chose to actively undermine it. That is not well done at all. Collaboration? Yes, they should, but what would ATI have done if Nvidia had gone to AMD and said they were adding those features and asked whether AMD wanted to cooperate in their creation? As they've done plenty of times in the past, they would have said no, which is what they did with Batman anyway, except that it was the developer who asked and the response was: "No, you are a TWIMTBP game, so just no."

Collaboration, yes. But collaboration to make something, not collaboration to make nothing at all. And as I see it, Nvidia wants those things implemented (for people like me) and just can't wait for AMD. Take hardware physics, for example. AMD dismissed PhysX and proposed Havok and later Bullet; OK, fair enough, but where are those? It's been more than two years and there aren't even any future games planned around them. It took Nvidia two months to make PhysX GeForce-capable, so it clearly isn't a technical issue. No, I'd rather be confined to one brand by *my choice* of card than not have any new features. It's your own choice. AMD is giving no choice because they don't implement anything, and your view of not implementing anything unless both agree gives you no choice either: it won't be implemented. And if your choice is not having them implemented, congrats, you already have that option now; just don't buy an Nvidia card.


----------



## Mr McC (May 6, 2010)

JATownes said:


> Maybe whining was an inappropriate term to use.  What I am referring to is the MASSIVE amount of threads dedicated to this topic.  It seems that threads such as these are posted at LEAST once a week.  *It is repetitive and, if anything, these comments should be added to other threads that have already been created.*  Also, these ATI vs. NV threads seem to lead to a lot of trolling and flame wars, which in turn leads to infractions.  And I don't ever like seeing someone get smacked with the banstick.  (except maybe Mailman, but that is because he likes it.)



It is not my intention to provoke a flame war. I thought that the HardOCP review was worthy of the forum's attention. Up to this point, I feel that discussion has been quite civil, but if it degrades into a slanging match or a fanboy festival I assure you that I will be the first person to exit the thread and request a lock. I admit that the thread is provocative, but then again, so are the opinions expressed in the HardOCP review. I promise you that I will do my utmost to ensure that nobody is reprimanded by the moderators on my account. The thread may be repetitive and possibly does have a place as an addendum to existing threads, but in my defence, I thought that the opinions expressed by HardOCP deserved their own thread, not only because of what was said but also because of where it was said. In any event, the thread title clearly identifies the subject-matter and forum users are free to simply ignore it if they feel that the issue has become tedious.

I will bear Mailman's masochistic tendencies in mind.


----------



## Mr McC (May 6, 2010)

brandonwh64 said:


> Loud Noises!
> 
> 
> *I'm on pain meds because of a pulled tooth and this thread is funny; sorry, but I got a kick out of it!*



No problem Gummy, we are here to entertain.


----------



## cadaveca (May 6, 2010)

Benetanegia said:


> AMD is giving no choice because they don't implement anything, and your view of not implementing anything unless both agree gives you no choice either: it won't be implemented. And if your choice is not having them implemented, congrats, you already have that option now; just don't buy an Nvidia card.



AMD IS giving a choice though...the difference is that they expect OPEN STANDARDS, such as DX, to be the middleman that ensures things work on all cards.

nVidia uses CLOSED STANDARDS, and releases the same tech, but keeps it exclusive. They turn these features into cash, while AMD says "hey. let's share!".

It's not like AMD's hardware is incapable of running this stuff...


But in the end, you are right, to a degree.


----------



## Mr McC (May 6, 2010)

Benetanegia said:


> Both Nvidia and AMD should collaborate more; that is something we all agree on, but the fact is that no one collaborates. This is not a Nvidia-only issue. Take Eyefinity, for example; remember the story at Anandtech, how ATI kept Eyefinity a secret until the very end? They asked display makers to build bezel-less displays and also asked them to keep it secret from Nvidia. How is that any good for the consumer? It isn't. I'm not saying they should have given Eyefinity away, but preventing everyone from even knowing about it? They clearly wanted it just for themselves, and you know what? Very well done. It's business.
> 
> There's also PhysX, and how ATI not only didn't support it, but chose to actively undermine it. That is not well done at all. Collaboration? Yes, they should, but what would ATI have done if Nvidia had gone to AMD and said they were adding those features and asked whether AMD wanted to cooperate in their creation? As they've done plenty of times in the past, they would have said no, which is what they did with Batman anyway, except that it was the developer who asked and the response was: "No, you are a TWIMTBP game, so just no."
> 
> Collaboration, yes. But collaboration to make something, not collaboration to make nothing at all. And as I see it, Nvidia wants those things implemented (for people like me) and just can't wait for AMD. Take hardware physics, for example. AMD dismissed PhysX and proposed Havok and later Bullet; OK, fair enough, but where are those? It's been more than two years and there aren't even any future games planned around them. It took Nvidia two months to make PhysX GeForce-capable, so it clearly isn't a technical issue. No, I'd rather be confined to one brand by *my choice* of card than not have any new features. It's your own choice. AMD is giving no choice because they don't implement anything, and your view of not implementing anything unless both agree gives you no choice either: it won't be implemented. And if your choice is not having them implemented, congrats, you already have that option now; just don't buy an Nvidia card.



As stated above, I accept and agree that ATI has to do more; however, I do not want to see them respond in kind: further division aids nobody.


----------



## exow2 (May 6, 2010)

cadaveca said:


> Good morning. Actually, I'm bashing the devs for using nV's support, and nV covering their (the devs') asses. I even said nV is more than welcome to add things from their closed APIs that only users with their hardware can use. Manipulating an engine so that other users, without their hardware, lose features is NOT ok.
> 
> Also, please read my sig...this is my opinion...I am entitled to it. There's no need for me to explain anything.
> 
> I'm not really into arguing over something like this. You'll have to ask someone else. Like I said, these things aren't going to prevent me from buying these titles...I have no qualms about dropping $500 on a GTX480, so I can enjoy games like this to their fullest.



Oh, it was never my intention to spark an argument over this; I was merely stating my opinion, as were you, and I wouldn't consider myself either an ATI or an Nvidia fanboy. Sure, I have an Nvidia card, but they both have their ups and downs, and I wouldn't think twice about buying an ATI card if it was worth it.


----------



## yogurt_21 (May 6, 2010)

Benetanegia said:


> Both Nvidia and AMD should collaborate more; that is something we all agree on, but the fact is that no one collaborates. This is not a Nvidia-only issue. Take Eyefinity, for example; remember the story at Anandtech, how ATI kept Eyefinity a secret until the very end? They asked display makers to build bezel-less displays and also asked them to keep it secret from Nvidia. How is that any good for the consumer? It isn't. I'm not saying they should have given Eyefinity away, but preventing everyone from even knowing about it? They clearly wanted it just for themselves, and you know what? Very well done. It's business.
> 
> There's also PhysX, and how ATI not only didn't support it, but chose to actively undermine it. That is not well done at all. Collaboration? Yes, they should, but what would ATI have done if Nvidia had gone to AMD and said they were adding those features and asked whether AMD wanted to cooperate in their creation? As they've done plenty of times in the past, they would have said no, which is what they did with Batman anyway, except that it was the developer who asked and the response was: "No, you are a TWIMTBP game, so just no."
> 
> Collaboration, yes. But collaboration to make something, not collaboration to make nothing at all. And as I see it, Nvidia wants those things implemented (for people like me) and just can't wait for AMD. Take hardware physics, for example. AMD dismissed PhysX and proposed Havok and later Bullet; OK, fair enough, but where are those? It's been more than two years and there aren't even any future games planned around them. It took Nvidia two months to make PhysX GeForce-capable, so it clearly isn't a technical issue. No, I'd rather be confined to one brand by *my choice* of card than not have any new features. It's your own choice. AMD is giving no choice because they don't implement anything, and your view of not implementing anything unless both agree gives you no choice either: it won't be implemented. And if your choice is not having them implemented, congrats, you already have that option now; just don't buy an Nvidia card.



I think collaboration is a nice idea, but it won't happen. I think the whole problem is not ATI or Nvidia, but Microsoft: if Microsoft offered proper support for DirectX to aid developers making games, Nvidia's TWIMTBP wouldn't stand a chance. As it is, Nvidia saw a hole in the development process and offered help. Can it be shady sometimes? Sure, but in the end Nvidia is offering the help that Microsoft should have.


----------



## newtekie1 (May 6, 2010)

cadaveca said:


> AMD IS giving a choice though...the difference is that they expect OPEN STANDARDS, such as DX, to be the middleman that ensures things work on all cards.
> 
> nVidia uses CLOSED STANDARDS, and releases the same tech, but keeps it exclusive. They turn these features into cash, while AMD says "hey. let's share!".
> 
> ...



Hardly.

ATi has Stream, their answer to CUDA; the only problem is that Stream came out too late for the devs to care about it.  They had been using CUDA for half a year; they didn't want to switch.

So now AMD has to rely on OpenCL and DirectCompute, because no one will use Stream.  Trust me, they don't want to; they would be perfectly happy if everyone used Stream, but that isn't going to happen.


----------



## overclocking101 (May 6, 2010)

Fact is, Nvidia beat ATI to the punch and now Nvidia is reaping the benefits. Hell, for all we know Nvidia PAYS each developer to use CUDA, but what do we know???? Nothing, except that games keep getting released with CUDA, and those games run at 200fps on Nvidia cards while only running at 150 on ATI. WHOOPDY DO! As long as the ATI cards will still RUN the game, which they sure will, who friggin cares! This The Way It's Meant to be Played CRAP has been going on for years now! It's getting old. If you don't like how the game plays on ATI, buy NVIDIA for Christ's sake! If you can't afford Nvidia, then DEAL WITH IT!


----------



## newtekie1 (May 6, 2010)

overclocking101 said:


> Fact is, Nvidia beat ATI to the punch and now Nvidia is reaping the benefits. Hell, for all we know Nvidia PAYS each developer to use CUDA, but what do we know???? Nothing, except that games keep getting released with CUDA, and those games run at 200fps on Nvidia cards while only running at 150 on ATI. WHOOPDY DO! As long as the ATI cards will still RUN the game, which they sure will, who friggin cares! This The Way It's Meant to be Played CRAP has been going on for years now! It's getting old. If you don't like how the game plays on ATI, buy NVIDIA for Christ's sake! If you can't afford Nvidia, then DEAL WITH IT!



Well, sort of, but in reality Just Cause 2 actually gets better FPS on my HD4890 than on my GTX260 and even my GTX285, because those two extra eye candy effects that nVidia added to the game use GPU power to calculate and extra rendering power to render.

Can I live without those two eye candy features? Absolutely.  Do I care that I'm missing them by playing the majority of the game with an HD4890?  Hell no.


----------



## newconroer (May 6, 2010)

the54thvoid said:


> I can see this thread getting locked down!!
> 
> Any marketing strategy that aims to proactively diminish another company's standing is to be expected.  There (unfortunately) is nothing extraordinary about that.
> However, when a percentage of service users in the opposing camp have their experience diminished by said practice, that is very wrong.  To create deals with software developers that aim to 'handicap' the opposition and coerce the consumer into purchasing a certain brand is, under different conditions, illegal practice.  At best, it is unfair.
> ...



Two rebuttals:

A) The American economy hasn't been a free market for decades. What you're saying is good on paper, not in practice (although it should be), which means you're talking a lot of principle and not a lot of reality, unfortunately.

B) Nvidia's influence over the market hasn't reached a point where people are boycotting software en masse.
We should take a poll: how many people would really not buy a game because it didn't have AA support, or physics, etc.? Because there have been plenty of titles just like that, and I don't believe it has stopped people in great numbers.

And as a side note, if you were really THAT desperate to experience these miscellaneous and sometimes obscure features in a software program, then you could always go buy a good price/performance product from that company; in this case, say, a GT200 series Nvidia card. Should you HAVE to? In a perfect world, no. Though in our world, where principle and practice rarely meet? Yes.


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> AMD IS giving a choice though...the difference is that they expect OPEN STANDARDS, such as DX, to be the middleman that ensures things work on all cards.
> 
> nVidia uses CLOSED STANDARDS, and releases the same tech, but keeps it exclusive. They turn these features into cash, while AMD says "hey. let's share!".
> 
> ...



AMD plays the waiting game: let others do the work and take on the hassle of making it a standard, and when everything is done, jump on the bandwagon. They do nothing to really promote or push those things. As with hardware physics, they are doing nothing to move it forward. Two and a half years, and where is that shiny GPU Havok that was so much better and so much more open than PhysX? 

And OpenCL, this is the real state of OpenCL:



> If you are wondering what is the real deal with GPGPU API's, there is a telling tale of why Adobe opted to base its Mercury Engine on nVidia's CUDA language. While AMD will tell you that they're all for open standards and push OpenCL, the sad truth is that the company representatives will remain shut when you ask them about the real status of their OpenCL API - especially if you quote them a lead developer from a AAA software company with 10x more employees than AMD themselves that goes something like this: "I struggled to even get ATI's beta drivers installed and working, it was just problem after problem. Maybe once ATI gets their drivers out of beta and actually allow you to install them then I will have some performance numbers. I mean at this point AMD is so far behind in development tools they are not even worth pursuing right now."



So yeah, it's easy to "support" open standards (I support open standards. See? It took me two seconds to support open standards), but when it comes to actually supporting them, they do next to nothing. It's not a matter of being able to do it, because they are more than competent; it's a matter of putting money into something that could potentially siphon money away from their biggest business, CPUs, and maybe more specifically server CPUs. And that's all. 

I'm all for open standards, but I don't like being sold vaporware. OpenCL is not vaporware, but it is taking far too long, and the programs and features that were promised definitely are vaporware. *Two and a half years ago!* (I can't stress this in writing as much as I would like) I was told, like everyone else on the planet, that I should pass on CUDA and PhysX, because AMD was pushing for open standards and we would have better features based on them; CUDA was dead. Well, where are those features and programs they were talking about? V A P O R W A R E to divert attention from the real deal that CUDA/PhysX was at the time. Two and a half years, three generations of cards...

As I said, I'd rather have the features. Open standards don't help me get better features. Not when no one is doing anything to promote them. Not when the one that claims to be promoting and supporting them is more than likely doing as much as they can in the shadows to keep them from actually being released.


----------



## cadaveca (May 6, 2010)

I'm referring to the open standards available in Windows: DX, DXCompute, etc. Everything within DX, ATI supports. We have been stuck with nVidia barely supporting DX since DX9, but why would they, when they have their own proprietary solutions?


It's up to the developer to use the already-provided DX solutions rather than something else. Why they make those choices (CUDA, Stream, PhysX, it doesn't matter) is beyond me, if they want to maximize their audience.

That's why AMD isn't promoting anything...that's up to Microsoft. AMD's hardware is fully DX compliant...


I agree, maybe they should do more...but...nothing other than what DX provides.

Havok works just fine...it's developers that need to use it...and some do. None of Havok really requires a GPU. I think you misunderstand why you'd want to use GPU physics, and let me tell you, a lack of CPU power has nothing to do with it. Maybe a few years ago, when dual cores were the norm...but not now.


----------



## Benetanegia (May 6, 2010)

I don't agree. DX is the first thing that should die and open the door to OpenGL again. DX is not open, it's not a standard, and more often than not it is not what game developers want. That is why they use things like CUDA.

Not to mention that with CUDA or OpenCL you can do far more than you can with DirectCompute.

And if DX is what AMD chooses, of course they should promote it, because it's their job. M$ and DX are just the bridge between the real players in the game-making business. M$ should not be the one making decisions about which features are in and forcing their implementation; that's the job of game developers and hardware providers. M$ should just act as the bridge that it really is between them and offer the features that devs and GPU makers demand.

Then there's the Xbox. As long as M$ has a console, they should have nothing to do with PC gaming. Period.

Regarding GPU physics: I know very well why I want GPU physics, and yes, it is performance related. More than raw performance, it's the amount of detail. I laugh every time I see someone say how great the physics in Bad Company 2 are. They are garbage compared to what could be done (and has been done in PhysX demos) with just an 8800GT. Just because the GPU power is not being used in games doesn't mean it couldn't be used for *real*, fully destructible environments. In the sledge demo Nvidia showed a single Fermi card smoothly simulating a million interactive particles at a time. Bad Company 2's physics are a joke technologically.
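To make the "million particles" point concrete, here is a purely illustrative CPU sketch (it has nothing to do with the actual PhysX code, and every number in it is made up): each particle's update reads and writes only that particle's own state, which is exactly the data-parallel shape a GPU can spread across thousands of cores at once, while a CPU has to crawl through the loop one particle at a time.

```python
# Illustrative only: a naive Euler integrator for independent particles.
# Because no particle's update depends on another particle, the loop body
# could run for every particle simultaneously, which is the data-parallel
# shape that GPU physics APIs exploit.

def step_particles(positions, velocities, dt, gravity=-9.8):
    """Advance every (x, y) particle one Euler step; returns new lists."""
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += gravity * dt               # gravity acts on vertical velocity
        x, y = x + vx * dt, y + vy * dt  # move by the updated velocity
        if y < 0.0:                      # bounce off the ground plane,
            y, vy = -y, -vy * 0.5        # losing half the vertical speed
        new_pos.append((x, y))
        new_vel.append((vx, vy))
    return new_pos, new_vel

pos = [(0.0, 10.0), (1.0, 5.0)]
vel = [(1.0, 0.0), (0.0, 0.0)]
pos, vel = step_particles(pos, vel, dt=0.1)
```

Scale that loop to a million particles sixty times a second and the arithmetic per particle is still trivial; it's the sheer count of independent updates that makes it a GPU problem rather than a CPU one.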


----------



## cadaveca (May 6, 2010)

Benetanegia said:


> Not to mention that with CUDA or OpenCL you can do far more than you can with DirectCompute.




Actually, that's not true. DXCompute can be used for basically anything...it's just that the solutions you mention (mainly CUDA) have ready-baked code. That's lazy...understandable, but still lazy.

This is what "support" for these api brings...ready baked code. That is it.

Why the hell should AMD, a hardware company, be publishing software?

nVidia is NOT a hardware company...they are a software company that also sells hardware. I can provide video of Jen-Hsun saying this exact thing.

If that's what you expect of AMD...well...nV doesn't make CPUs. They wish they could, but will NEVER get an x86 license. That's why they are doing what they do...they are creating a software platform.

AMD is NOT in the business of selling software, or a software platform...they provide HARDWARE platforms.


----------



## Benetanegia (May 6, 2010)

cadaveca said:


> Actually, that's not true. DXCompute can be used for basically anything...it's just that the solutions you mention (mainly CUDA) have ready-baked code. That's lazy...understandable, but still lazy.
> 
> This is what "support" for these api brings...ready baked code. That is it.
> 
> ...



Hardware is nothing without software. Like I said, AMD expects others to do all the work for them. If you want to sell your hardware, you need to give consumers a reason to pay for it. And I'm not talking about publishing; I'm talking about sending engineers and helping develop and include features in games.


----------



## cadaveca (May 6, 2010)

I understand what you are saying.

What I am saying is that, based on those points I just mentioned, it's unrealistic to expect AMD to do that.

It's RIGHT to expect it...heck, Intel does the same (although not to the extent that nV does), but that doesn't mean it will happen.

The market is SCREWED UP, period, and nV started it all, pushing forward PS3.0 in DX9. DX is broken only thanks to nV.

Batman: AA is the perfect example...running MSAA using DX10.1 code in DX9. Why the heck didn't they just use DX10.1? Business.

And nV's business...isn't selling hardware. It's selling software.

You are asking AMD to become a software company...and THAT would only make this situation worse!


----------



## MadClown (May 7, 2010)

Wait, aren't game developers supposed to make games?  And aren't GPU companies supposed to make GPUs, not co-develop games?  If both types of companies stuck to their own thing, we as humans could have made greater progress.


----------



## newtekie1 (May 7, 2010)

MadClown said:


> Wait, aren't game developers supposed to make games?  And aren't GPU companies supposed to make GPUs, not co-develop games?  If both types of companies stuck to their own thing, we as humans could have made greater progress.



I don't believe we would be anywhere near where we are right now if everyone just stuck to what they were supposed to do and never ventured into new territory.


----------



## entropy13 (May 7, 2010)

newtekie1 said:


> I don't believe we would be anywhere near where we are right now if everyone just stuck to what they were supposed to do and never ventured into new territory.



Specialization contributed greatly to the modernization of the world; I don't know where you're getting that conclusion. It's just that recently, because of those very advances, specialization was forgotten and multi-disciplinary "specialization" became the vogue. Hence a hardware-making software company is possible.


----------



## newtekie1 (May 7, 2010)

entropy13 said:


> Specialization contributed greatly to the modernization of the world. I don't know where you're getting that conclusion. It's just that recently, because of those advances themselves specialization was forgotten and multi-disciplinary "specialization" became the vogue. Hence a hardware-making software company is possible.



So you really think we would be as advanced as we are if no one tried anything new?

I mean, let's just use an example of your logic.

There is a company that started as a logic chip maker, then moved into the RAM industry, then moved into the processor business, and today is the primary competitor to Intel...yes, I'm talking about AMD.  And because they changed their business focus and expanded what their business does, we have advances in CPU technology that we would never have seen without them making that move.  Do you really think Intel would have advanced the industry as much if it weren't for the need to stay ahead of AMD?  We would all still be stuck on single-core processors as powerful as PIIIs.  Hell, we wouldn't even have x64. :shadedshu


----------



## entropy13 (May 7, 2010)

newtekie1 said:


> So you really think we would be as advanced as we are if no one tried anything new?
> 
> I mean, let's just use an example of your logic.
> 
> There is a company that started as a logic chip maker, then moved into the RAM industry, then moved into the processor business, and today is the primary competitor to Intel...yes, I'm talking about AMD.  And because they changed their business focus and expanded what their business does, we have advances in CPU technology that we would never have seen without them making that move.  Do you really think Intel would have advanced the industry as much if it weren't for the need to stay ahead of AMD?  We would all still be stuck on single-core processors as powerful as PIIIs.  Hell, we wouldn't even have x64. :shadedshu



So AMD is currently in the RAM industry as well? 

I thought you were using my logic? Since when did I even imply that once you pick a specialization you can't change it? What I was saying is that instead of focusing on one thing, people (and companies) keep on "focusing" on many things, albeit all related, ALL AT THE SAME TIME, which is for all intents and purposes hardly something you can call "specialization". But I'm not saying that doing so is necessarily a bad thing; it's just that sometimes it really doesn't work out.

Even in AMD itself there's still a "specialization" of sorts. 



> During this time, AMD attempted to embrace the perceived shift towards RISC with their own AMD 29K processor, and they attempted to diversify into graphics and audio devices as well as EPROM memory. It had some success in the mid-1980s with the AMD7910 and AMD7911 "World Chip" FSK modem, one of the first multistandard devices that covered both Bell and CCITT tones at up to 1200 baud half duplex or 300/300 full duplex. The AMD 29K survived as an embedded processor and AMD spinoff Spansion continues to make industry leading flash memory. *AMD decided to switch gears and concentrate solely on Intel-compatible microprocessors and flash memory, placing them in direct competition with Intel for x86 compatible processors and their flash memory secondary markets.*



A more common example of my reasoning in this:
http://blogs.static.mentalfloss.com/blogs/archives/35163.html

In Avon's case, he started selling books, then added some perfumes. The perfumes were more popular, so he SPECIALIZED in that (perfumes and cosmetics). Avon's not selling BOTH cosmetics AND books anymore. In Nokia's case, they started as a paper mill, but they aren't BOTH a paper mill AND a phone maker.

The ones that do well in diversified industries are quite big, hardly the size of an Nvidia.


----------



## ctrain (May 7, 2010)

Benetanegia said:


> And if DX is what AMD chooses, of course they should promote it because it's their job. M$ and DX is just the bridge between the real players in the game-making business. M$ should not be the one making decisions about which features are in and forcing their implementation. That's the job of game developers and hardware providers. M$ should just act as the bridge that it really is between them and offer those features that devs and GPU makers demand.



DX features are primarily implemented based on developer request; MS doesn't force anything. Actually, I think geometry shaders might have been their idea, but in reality basically everything is requested.


----------



## Benetanegia (May 7, 2010)

ctrain said:


> DX features are primarily implemented based on developer request; MS doesn't force anything. Actually, I think geometry shaders might have been their idea, but in reality basically everything is requested.



Accepting requests and implementing some of them the way M$ wants is not exactly listening to developers. The fact of the matter is that OpenGL is the one API that is created and updated by a consortium formed by hundreds of companies, game developers and academic figures. And another fact is that OpenGL and DX are very different, not in the features they have, but in how they are implemented, which is what I was talking about. It's been a long time since M$ cared more about making PC gaming the best thing possible for PC gamers than about their console and making multi-platform migration easier. And again, I'm not talking about the features, but how and when they are implemented and how and when and to what extent M$ decides to promote them. When an OS launch is near they will promote it to death, but after that they go mute, as if sweeping the dust under the carpet. The fact that they pay for Xbox exclusives that not only prevent the game from going to the PS3, but also delay the PC release by several months, doesn't help create the sense that M$ is promoting PC gaming either.


----------



## Mr McC (May 7, 2010)

Benetanegia said:


> Accepting requests and implementing some of them the way M$ wants is not exactly listening to developers. The fact of the matter is that OpenGL is the one API that is created and updated by a consortium formed by hundreds of companies, game developers and academic figures. And another fact is that OpenGL and DX are very different, not in the features they have, but in how they are implemented, which is what I was talking about. It's been a long time since M$ cared more about making PC gaming the best thing possible for PC gamers than about their console and making multi-platform migration easier. And again, I'm not talking about the features, but how and when they are implemented and how and when and to what extent M$ decides to promote them. When an OS launch is near they will promote it to death, but after that they go mute, as if sweeping the dust under the carpet. The fact that they pay for Xbox exclusives that not only prevent the game from going to the PS3, but also delay the PC release by several months, doesn't help create the sense that M$ is promoting PC gaming either.



There is little in your post that I cannot agree with; however, given that the Xbox is nearing the end of its life, we may have a small window of opportunity for advances in DirectX 11 gaming on the PC between the demise of the current console and its next-generation replacement.

A number of contributors have pointed out that ATI, and more importantly Microsoft, whilst promoting open standards, have failed to deliver tangible results. I believe that there is a great deal of truth in this accusation, but I don't see Nvidia's push towards proprietary technology as the solution.


----------



## cadaveca (May 7, 2010)

If nVidia did nothing but provide the HAL for Windows, then they'd be a much stronger force. Everything they do is complicated by the fact that they sell hardware as well. This is why I think AMD and nV should merge...ATI hardware engineers and nV software engineers...would be unstoppable.


But really, they are selling hardware so that they can pay all the employees. Note that through this "global recession", nVidia didn't lay anyone off.


As a business, few can compare to nV. But the process of making money is interfering with the industry moving forward as a conglomerated unit.


----------



## newtekie1 (May 7, 2010)

entropy13 said:


> So AMD is currently in the RAM industry as well?
> 
> I thought you were using my logic? Since when did I even imply that once you get a specialization you can't change it? What I was saying is that instead of being focused on one thing, people (and companies) keep on "focusing" on many things, albeit all related, ALL AT THE SAME TIME. Which is, for all intents and purposes, hardly something you can call "specialization". But I'm not saying that doing that is necessarily a bad thing either. It's just that sometimes it really doesn't work out.
> 
> ...



Yes, but the original argument, the one you are defending, is that nVidia is a hardware company, and they should stay a hardware company and not attempt to move into new areas.

AMD _was_ in the RAM industry; they aren't currently, because they have moved into something new and more successful.  However, at one point they were doing both RAM and processors.  They also don't just sell processors currently; they do chipsets and graphics cards now (through the ATi merger).  That is hardly specialization either.

Avon, your own example, proves my point perfectly.  By the argument that you are trying to defend, he should have stayed selling books and never moved to perfumes and cosmetics.  And at one time, Avon WAS selling books and perfumes at the same time.  He didn't just decide one day that he was going to instantly stop selling books and move to cosmetics/perfumes.

Focusing on what is profitable and what you are good at is one thing, and smart business sense, but at the same time trying new areas is important for progress.

Just look at Henry Ford as an example.  By all means, if we follow the logic you are trying to defend, he should have stayed a farmer.  Never trying anything new, never venturing into new areas.  He was a farmer; he should have farmed.  But instead he did move into new areas, first moving into steam engines, and then becoming a founder of the automobile industry as we know it, including inventing the modern assembly-line method, something that allowed humanity to progress to the point we are at today.  And none of that would have happened if we simply said a farmer is a farmer and shouldn't do anything else.  So, do you still want to defend the logic that a GPU company is a GPU company, and they shouldn't do anything else?


----------



## SNiiPE_DoGG (May 7, 2010)

Where to start.... for one thing, you are completely off base with this "trying something new" thing. Let's stick to the companies we're actually talking about, eh? And not abstract analogies to different markets and companies in different eras.

Nvidia goes to game developers and gives them money. The developer reciprocates by optimizing for nvidia hardware. That's not "trying something new"; that's paying to stop the developer from optimizing for both brands.

IF nvidia were to give the game devs no money, is it in the developers' best interest to optimize for only one hardware brand? Hell no.  They want to optimize for all hardware so people buy their game and don't complain about poor performance on either side.

Instead they are getting nvidia's money so they have incentive to only optimize for one hardware brand; this hurts progress.


----------



## Mr McC (May 7, 2010)

SNiiPE_DoGG said:


> Where to start.... for one thing, you are completely off base with this "trying something new" thing. Let's stick to the companies we're actually talking about, eh? And not abstract analogies to different markets and companies in different eras.
> 
> Nvidia goes to game developers and gives them money. The developer reciprocates by optimizing for nvidia hardware. That's not "trying something new"; that's paying to stop the developer from optimizing for both brands.
> 
> ...



That is more or less my interpretation of the situation, and my only real means of protest is to refuse to buy where I feel that a given game has overstepped the mark by optimising features for Nvidia that should be universally available.


----------



## newtekie1 (May 7, 2010)

SNiiPE_DoGG said:


> Where to start.... for one thing, you are completely off base with this "trying something new" thing. Let's stick to the companies we're actually talking about, eh? And not abstract analogies to different markets and companies in different eras.
> 
> Nvidia goes to game developers and gives them money. The developer reciprocates by optimizing for nvidia hardware. That's not "trying something new"; that's paying to stop the developer from optimizing for both brands.
> 
> ...



However, what we are talking about is not just the optimization work included in TWIMTBP, but also the additional software elements that nVidia is developing to be added to games, and at this point in the discussion, CUDA-based software.

_Now_, as for nVidia giving money to the developers to optimize for their hardware, I think you're wrong there.  I believe that if nVidia didn't give the money for optimization on their hardware, we would see games that had little or no extra optimization for either.  The games would still run more than reasonably on both ATi and nVidia hardware; it certainly isn't the case that TWIMTBP games run like shit on ATi hardware, despite you believing they have no optimization for ATi hardware.  If nVidia didn't pay for the optimization for their hardware, the developers would certainly not be forced to optimize more for both, that is a false belief; they instead would not optimize more for either.

Think of it like this:

The developer designs the game and optimizes it to the point where it runs at 30FPS with Max Settings on equal hardware from each company (say an ATi HD4890 and an nVidia GTX275).  They are done with the amount of optimization they are going to do.  Then nVidia comes in and says they will pay for the development costs to optimize the game so it runs at 45FPS on the GTX275.  If nVidia hadn't used the TWIMTBP program to improve the optimization for their hardware, would the ATi side get better optimization?  No.  If nVidia didn't come along, would the developers be forced to do more optimization for both sides?  No, they already met their optimization goals.


----------



## Wrigleyvillain (May 7, 2010)

newtekie1 said:


> If nVidia didn't pay for the optimization for their hardware, the developers would certainly not be forced to optimize for both, that is a false belief; they instead would not optimize for either.



But they need to sell their game. With NV cash not in the equation is it not in their inherent best interest to optimize for both? That's not to say all would in every case but as long as we are making generalizations...


----------



## newtekie1 (May 7, 2010)

Wrigleyvillain said:


> But they need to sell their game. With NV cash not in the equation is it not in their inherent best interest to optimize for both? That's not to say all would in every case but as long as we are making generalizations...



Obviously a certain amount of optimization will be done regardless, and is still done for ATi hardware, or the games wouldn't play at all on ATi hardware.  I did word that poorly.  However, the assumption that nVidia is paying for all the optimization to go toward their hardware is false, I believe.  Instead, they are paying for optimization beyond what the developer would normally do.  This is evident from the fact that TWIMTBP games still run perfectly fine on ATi hardware.

And if you want some examples of what I mean, look at a game like Modern Warfare.  It is part of the TWIMTBP program, where really the only thing done seems to be optimization for nVidia hardware, and we see this optimization with a GTX275 getting about 20FPS more than an ATi HD4890.  However, the HD4890 is still getting 80FPS@1920x1200.  So without nVidia stepping in, the GTX275 might be right at the same 80FPS, but if nVidia didn't step in, I doubt the developers would have spent any more time and money getting better framerates for both.  I'm guessing 80FPS@1920x1200 would have been good enough for them.


----------



## Benetanegia (May 7, 2010)

Mr McC said:


> but I don't see Nvidia's push towards proprietary technology as the solution.



For me it's the lesser evil. In fact, it's the natural way for new things to gain traction. First a feature has to exist, has to be created, has to be established as a worthwhile feature, and before that happens there is no interest from anyone in creating a standard for something that doesn't exist yet. I come back to the fact that AMD says they support open standards, but they are not creating those standards themselves, nor helping too much with their creation either.

Everything we have today has been a proprietary standard at some point. Without going any further, graphics APIs started as proprietary tech. IMO that's the only way that things can evolve in this capitalist world, because every company has its interests and none of them is to create something for free for everyone. In the early days of computing/gaming, joining forces was a rewarding venture, because they only fought against anonymity. There was a world full of potential customers that didn't have what these companies sold, nor even something similar. The challenge was making those people buy that new thing they had created, and if joining forces was the best thing to do, they would simply do it. There was enough fish in the river.

Nowadays that doesn't happen anymore; anything sold is a replacement for another thing, a replacement for a competing product, so collaboration is not as rewarding as it was, and that's why the only way for something to happen is making it yourself. And that's what Nvidia is doing, because AMD will not create their own things and will not support Nvidia in the things they are making, as they already demonstrated with PhysX. It's just a matter of finding a market for them, and although they might not be for everyone, they surely are amazing features for some. Take Badaboom for example. A lot of people say it's useless, but for me, who records TV programs to watch on my mp4 player during my relatively long commutes on the underground+train, it was a blessing: I'd have my mp4 movies encoded in minutes instead of hours. Without this CUDA app I wouldn't be able to do that, and I've been enjoying that app for almost 3 years if I'm not mistaken. Is it better to wait 3+ years to have it made with an open standard? Maybe for some, but not for me; 3 years is a long time. And the same goes for PhysX. Forget the fact that it has not been extensively used (we know that's just because of lack of support from AMD); even the very little that it does in Mirror's Edge was worth it for me. It was worth the effort of trying to have better physics in games, and the only thing I had to pay was $0; the only requirement was going with a Nvidia card.

Things would be different if there were other alternatives, but there are none. There are promises, but again, those promises were made 2-3 years ago and we have nothing yet. I can only think they will follow the path of Duke Nukem Forever.

And that's a similar point I've been making regarding Batman's antialiasing and the Just Cause 2 features. If there were other similar features being made on OpenCL or DX compute *now or soon*, or if most UE3 games were coming with built-in AA, I could see the situation as you do. But there are none. Those are the only examples of something like that, which is proof enough that developers don't want to make those things on their own. And I'd rather have those features than wait forever. If more people had supported PhysX instead of bashing it, AMD would have been forced to make something, propose a standard, make their own thing, support PhysX... After that, we would already have an established feature, one worth making a standard of. Anything would have been better than what we have today, which is *nothing*.


----------



## erocker (May 7, 2010)

ATi needs to keep working on Stream, OpenGL/CL and all that. They also need to pick up CUDA. The fact of the matter is that CUDA is there and it works. I like my cake and I like eating it too. I couldn't care less about what happens behind closed doors at these companies or why it isn't happening. Don't let pride get in the way; do it for your customers, make it happen. We all know CUDA can run on ATi cards (see NGOHQ from last year).


----------



## HalfAHertz (May 7, 2010)

Benetanegia raised some valid points. Funk it, would Linux on the desktop be where it is today if it wasn't for Windows? Would the internet be as interactive as it is today if it wasn't for Flash? In both cases it all began with a closed standard that gained wide adoption. Once you standardize a platform or a solution and set the bar (give an example, a starting point for others to follow), it is much easier to develop an open, free alternative. So kudos to Nvidia for at least trying.

Windows>Linux
Flash>HTML5
Photoshop>Gimp
etc.


----------



## cadaveca (May 7, 2010)

erocker said:


> ATi needs to keep working on Stream, OpenGL/CL and all that. They also need to pick up CUDA. The fact of the matter is that CUDA is there and it works. I like my cake and I like eating it too. I couldn't care less about what happens behind closed doors at these companies or why it isn't happening. Don't let pride get in the way; do it for your customers, make it happen. We all know CUDA can run on ATi cards (see NGOHQ from last year).



ATi's problem isn't a lack of skill or any of that...they've laid off too many people and, plain and simple, lack the necessary manpower. They've literally stripped away that side of the company during the merger and become, 100%, a hardware company.


I think nVidia would do well from the publicity of actually giving CUDA to ATi 100% royalty free, if ATi could hire the software engineers to help support it.

and +1 to not caring about the reasoning behind it all...but understanding that side of it has me thinking we are SOL on that one.


----------



## newtekie1 (May 7, 2010)

cadaveca said:


> ATi's problem isn't a lack of skill or any of that...they've laid off too many people and, plain and simple, lack the necessary manpower. They've literally stripped away that side of the company during the merger and become, 100%, a hardware company.
> 
> 
> I think nVidia would do well from the publicity of actually giving CUDA to ATi 100% royalty free, if ATi could hire the software engineers to help support it.
> ...



I have to agree with nVidia giving ATi CUDA, and I think nVidia knows CUDA running on ATi hardware would give it a huge boost.  Of course, ATi is more likely to stop CUDA running on their hardware than nVidia is, since it would completely kill Stream.


----------



## Wile E (May 8, 2010)

SNiiPE_DoGG said:


> where to start.... for one thing you are completely off base with this "trying something new" thing. Lets stick to the companies were actually talking about eh? and not abstract analogies to different markets and companies in different eras.
> 
> Nvidia goes to game developers and gives them money. The developer reciprocates by optimizing for nvidia hardware. Thats not "trying something new" that's paying to stop the devloper from optimizing for both brands.
> 
> ...


No, they wouldn't optimize for either at all, and would just let it happen at the driver level. Plus we wouldn't get any new features at all.

Sorry, I just don't agree that nV is making the gaming industry worse. If anything, their current practices should be kicking ATI's ass into gear, offering useful development tools and proper support for the newer open standards, and beating nV at its own game.

I blame the current state of gaming affairs on ATI, for not adapting to the times, and offering better developer help and dev kits. If they would help devs more, we'd have none of this mess at all.

I love these GPGPU apps, and I really enjoy the little eye-candy boost PhysX gives in games that support it. Where is ATI's support for, or version of, these? With the way things are going, I think my next card will be an nVidia card, because at least I get the new features, and not nothing at all. I hate some of their business practices, like disabling PhysX in systems with an ATI card, but at least they are bringing new things to the table.


----------



## newtekie1 (May 8, 2010)

I agree with Wile E.

I also will add that I believe AMD buying ATi was a bad thing for the industry.  Obviously, now that they are a CPU company that also makes graphics cards, it isn't in their best interest to move computing away from the CPU.  This is why, I believe, they have been so slow to adopt it.  When nVidia releases something like CUDA or PhysX, they make a big stink about how they are going to support something different that they say is better, but it has been well over a year with PhysX and we haven't seen a thing.  They say they are going to support Havok to get it to the level of capability where PhysX is, and Havok still sucks.  I play the latest Havok games and they are nowhere near where some of the PhysX games are.  There's no volumetric smoke, no things breaking into tiny realistic pieces when hit, none of that.

Though the physics argument might be a dying one now that it seems DX11 includes a physics engine.  Hopefully, now that we have something that will run on everything and is actually good and capable, we will start to see some good things coming down the line in games.


----------



## cadaveca (May 8, 2010)

newtekie1 said:


> I agree with Wile E.
> 
> I also will add that I believe AMD buying ATi was a bad thing for the industry.  Obviously, now that they are a CPU company that also makes graphics cards, it isn't in their best interest to move computing away from the CPU.  This is why, I believe, they have been so slow to adopt it.  When nVidia releases something like CUDA or PhysX, they make a big stink about how they are going to support something different that they say is better, but it has been well over a year with PhysX and we haven't seen a thing.  They say they are going to support Havok to get it to the level of capability where PhysX is, and Havok still sucks.  I play the latest Havok games and they are nowhere near where some of the PhysX games are.  There's no volumetric smoke, no things breaking into tiny realistic pieces when hit, none of that.
> 
> Though the physics argument might be a dying one now that it seems DX11 includes a physics engine.  Hopefully, now that we have something that will run on everything and is actually good and capable, we will start to see some good things coming down the line in games.



I think maybe things might change a bit when Llano becomes more prevalent. With a powerful GPU in the same socket as a CPU, working the GPU's grunt into CPU calculations will be much easier.

It takes deferred rendering to get all of this working properly in a 3D environment. There are a couple of caveats in the PC architecture that make this more difficult than it has to be, and 890FX's IOMMU will make this MUCH easier.

I really don't think hardware is ready for the fundamental change just yet, but it's getting there. Programmers are crying about the PS3, which only really works well with deferred rendering, and if you take a look at recent titles, it does a pretty good job of giving decent graphics. Add in a more recent GPU, and it doesn't become easier...it simply has more capability.


----------



## newtekie1 (May 8, 2010)

I'm kind of hoping that once both AMD and Intel have integrated a GPU onto their processor packages, programmers might start to use that GPU to do the calculations in games that CUDA and the like are doing on the main GPU right now, leaving the main GPU entirely for rendering the graphics.  Kind of like using the built-in GPU as a co-processor...but maybe I'm just being too hopeful.


----------



## Wile E (May 9, 2010)

newtekie1 said:


> I'm kind of hoping that once both AMD and Intel have integrated a GPU onto their processor packages, programmers might start to use that GPU to do the calculations in games that CUDA and the like are doing on the main GPU right now, leaving the main GPU entirely for rendering the graphics.  *Kind of like using the built-in GPU as a co-processor*...but maybe I'm just being too hopeful.



Straying a little off topic here, I really wish nV would sell some cards with no video outs and drive them as CUDA co-processors instead of as display devices, and let them work on any platform. As a co-processor, they would even work in Vista with an ATI card. Cake and eat it too, anyone?


----------



## TheMailMan78 (May 9, 2010)

Wile E said:


> No, they wouldn't optimize for either at all, and would just let it happen at the driver level. Plus we wouldn't get any new features at all.
> 
> Sorry, I just don't agree that nV is making the gaming industry worse. If anything, their current practices should be kicking ATI's ass into gear, offering useful development tools and proper support for the newer open standards, and beating nV at its own game.
> 
> ...



Your blame is wrongly placed. If anything you should point your hate at consoles.


----------



## Wile E (May 9, 2010)

TheMailMan78 said:


> Your blame is wrongly placed. If anything you should point your hate at consoles.



While consoles do share in the blame, ATI still takes the majority of the blame in my book, as they just refuse to move ahead on the dev side.


----------



## TheMailMan78 (May 9, 2010)

Wile E said:


> While consoles do share in the blame, ATI still takes the majority of the blame in my book, as they just refuse to move ahead on the dev side.



Why should they? There is no need to push the envelope when most games are ports based on 5-year-old hardware. This has been my argument for a few years now on TPU.

TWIMTBP and ATI's lack of funding have, to me, nothing to do with the downfall of gaming. I mean, when PC gaming was in its prime, Nvidia and ATI were using the same business model. The only thing that HAS changed is the console market. In doing so, developers are now making games for the lowest common denominator because it's the biggest cash cow. A good example is RAGE by id. They are making their new flagship engine on a damn 360 and then porting it over to PC. This is id Software! The makers of Quake and DOOM doing ports!

What they don't understand is that in another 5 years, when people are sick of the same old graphics and turn to PCs for the "next big thing", Nvidia and ATI will say "Hey, we have been following y'all's lead." Then and only then will people realize the damage consoles are doing to gaming.

We NEED TWIMTBP. We NEED ATI to get on the ball with physics and such. If anything these programs will save gaming. Not condemn it.


----------



## Wile E (May 9, 2010)

TheMailMan78 said:


> Why should they? There is no need to push the envelope when most games are ports based on 5-year-old hardware. This has been my argument for a few years now on TPU.
> 
> TWIMTBP and ATI's lack of funding have, to me, nothing to do with the downfall of gaming. I mean, when PC gaming was in its prime, Nvidia and ATI were using the same business model. The only thing that HAS changed is the console market. In doing so, developers are now making games for the lowest common denominator because it's the biggest cash cow. A good example is RAGE by id. They are making their new flagship engine on a damn 360 and then porting it over to PC. This is id Software! The makers of Quake and DOOM doing ports!
> 
> ...


You just agreed with my stance. ATI is not doing enough to get the industry to move forward. Only nV is pushing things further, but it's not enough without ATI getting off their asses.


----------



## TheMailMan78 (May 9, 2010)

Wile E said:


> You just agreed with my stance. ATI is not doing enough to get the industry to move forward. Only nV is pushing things further, but it's not enough without ATI getting off their asses.



Nvidia isn't doing all that much either. With the exception of PhysX, Nvidia is just as guilty as ATI. Why? Because the TWIMTBP system now just makes ports run better. No envelope there to push.


----------



## Wile E (May 9, 2010)

TheMailMan78 said:


> Nvidia isn't doing all that much either. With the exception of PhysX, Nvidia is just as guilty as ATI. Why? Because the TWIMTBP system now just makes ports run better. No envelope there to push.



That's where we disagree. At least they are pushing something PC-only: PhysX, better AA algorithms like in Batman, better, more realistic water effects in Just Cause 2, etc. These are things that PCs are capable of but consoles are not, so at least it's a small step forward. If ATI would join the fray, it would all gain momentum, and we would see more significant advancements.


----------



## DaedalusHelios (May 10, 2010)

Wile E said:


> That's where we disagree. At least they are pushing something PC-only: PhysX, better AA algorithms like in Batman, better, more realistic water effects in Just Cause 2, etc. These are things that PCs are capable of but consoles are not, so at least it's a small step forward. If ATI would join the fray, it would all gain momentum, and we would see more significant advancements.



I agree that Nvidia is obviously pushing forward faster than ATi on all software dev fronts. Eyefinity is nothing new; the performance of their GPUs being able to actually handle it is what is new. My only gripe with Nvidia is that their pricing is still too high. ATi's 5970 is also a little too pricey. That being said, I had no intention of buying either a GTX 480 or a 5970 regardless of price unless they were a particularly good deal, but I believe pricing should be a little lower for the most part. Lower the pricing and give us a reason to buy more often (advancements in tech).


----------

