
Criticism of Nvidia's TWIMTBP Program - HardOCP's Just Cause 2 Review

Joined
Jun 1, 2006
Messages
1,745 (0.26/day)
Location
The Nevada Wasteland
System Name 9th Level
Processor AMD Ryzen 5 5600X
Motherboard MSI X570 Carbon wifi
Cooling EK Basic 360, x2 250mm, x1 140mm, x1 120mm fans.
Memory 32GB Corsair Vengeance 3200mhz.
Video Card(s) EVGA RTX 3080 12GB FTW3
Storage 500gb ssd, 2tb ssd, 6tb HD.
Display(s) MSI 27" Curved 1440p@165hz
Case HAF 932
Power Supply Corsair HX850W
Software Windows 10 64bit
I am not a fan of Nvidia’s marketing practices as I firmly believe they damage the PC gaming industry, creating division through the use of proprietary technology where open standards are available that can produce exactly the same effects. Many people criticise ATI for the scant attention it pays to developer relations, citing the tangible advances that Nvidia offers gamers through its TWIMTBP releases and the use of CUDA and PhysX. However, I do not want to see ATI respond in kind as Nvidia seems intent on creating a situation wherein the consumer will eventually be forced to ask whether a given game is an “ATI title” or an “Nvidia title” wherein performance is essentially crippled on the competitor’s cards.

I did not buy, nor will I buy Batman Arkham Asylum as Nvidia paid the developer to turn off in-game AA on ATI cards. I find that reprehensible: it is one thing to optimise a title for your hardware; it is another thing to pay the developer to ensure that the performance of a given title is reduced when you use the competitor’s hardware. Again, whilst many people ask why ATI didn’t pay the developer to ensure that this function was enabled for its hardware, I firmly believe that development should be left to the developers and that certain aspects of a game, such as AA, should be available irrespective of the brand of graphics card that the consumer decides to purchase.

By all accounts Arkham Asylum is an excellent game; however, my principles will not allow me to support such practices with my money – to each, his own.

I just finished reading the review of Just Cause 2 over on HardOCP. It was very refreshing to find a reviewer who, despite the obvious pressure placed on tech sites, was willing to openly criticise Nvidia’s TWIMTBP program and marketing practices:

http://www.hardocp.com/article/2010/05/04/just_cause_2_gameplay_performance_image_quality

The Way It’s Meant to be Played?
We have no doubt that the Bokeh filter and the GPU Water Simulation options could have been executed successfully on AMD’s Radeon HD 5000 series of GPUs. That the developers chose NVIDIA’s CUDA technology over Microsoft DirectCompute or even OpenCL is probably due to the fact that NVIDIA’s developer relations team worked with Avalanche Studios developers, and of course they like to promote their own products. (We would surely love to see the contract between the two, but that will never happen.) It is certainly their right to do so, just as it is Avalanche’s right to choose whatever API they want to use. We would certainly not presume to tell any independent game developer how to design their own game, but we would suggest that a more open alternative (such as OpenCL or DirectCompute) would have been preferred by us for those gamers without CUDA compatible hardware.
This is an old argument, and is basically analogous to the adoption of PhysX as opposed to a more broadly compatible physics library. NVIDIA wants to increase its side of the GPU business by giving its customers a "tangible" advantage in as many games as possible, while gamers without NVIDIA hardware would prefer that game developers had not forgotten about them.
As it stands for Just Cause 2, gamers without NVIDIA hardware are missing a couple of really nice graphics features, but those features are not critical to the enjoyment of the game. Just Cause 2 still looks just fine and is just as fun without them. But if you want the very best eye candy experience possible, NVIDIA's video cards, especially the GeForce GTX 480 and GTX 470, will give it to you.
When NVIDIA tells us that it will "Do no harm!" when it comes to gaming, that is really a bald-faced lie, and we knew it when it was told to us. It will do no harm to PC gaming when it fits its agenda. NVIDIA is going to continue to glom onto its proprietary technologies so that it gains a marketing edge, which it very much does through its TWIMTBP program. And we have to assume that marketing edge is worth all the bad press it does generate. To say NVIDIA does no harm to PC gaming is delusional at best. You AMD users just got shafted on these cool effects that could have been easily developed for all PC gamers instead of just those that purchase from one company.
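For context on HardOCP's claim that the Bokeh filter and GPU Water Simulation "could have been executed" on DirectCompute or OpenCL: at the algorithmic level a bokeh depth-of-field filter is just a per-pixel gather blur whose radius comes from a circle-of-confusion value, and that math is API-agnostic. The sketch below is a minimal CPU reference with invented names and a simplified sampling pattern, not Avalanche's actual implementation; the same inner loop ports directly to a CUDA, OpenCL or DirectCompute kernel, so any vendor lock-in comes from the host API chosen, not from the effect itself.

// Minimal CPU reference for a bokeh-style depth-of-field gather blur.
// Illustrative only: the box sampling pattern, CoC model and all names are
// assumptions, not Avalanche Studios' implementation. The same per-pixel
// gather maps directly onto a CUDA, OpenCL or DirectCompute kernel.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Pixel { float r, g, b; };

// Blur each pixel over a neighbourhood whose radius is that pixel's circle
// of confusion (in pixels). A real shader would use a disc pattern; a box
// gather keeps the sketch short.
std::vector<Pixel> bokehBlur(const std::vector<Pixel>& src,
                             const std::vector<float>& coc,
                             int w, int h)
{
    std::vector<Pixel> dst(src.size());
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int radius = std::max(0, (int)std::lround(coc[y * w + x]));
            float r = 0.0f, g = 0.0f, b = 0.0f;
            int count = 0;
            for (int dy = -radius; dy <= radius; ++dy) {
                for (int dx = -radius; dx <= radius; ++dx) {
                    int sx = std::clamp(x + dx, 0, w - 1);
                    int sy = std::clamp(y + dy, 0, h - 1);
                    const Pixel& p = src[sy * w + sx];
                    r += p.r; g += p.g; b += p.b;
                    ++count;
                }
            }
            dst[y * w + x] = { r / count, g / count, b / count };
        }
    }
    return dst;
}

int main() {
    const int w = 4, h = 4;
    std::vector<Pixel> img(w * h, Pixel{0.5f, 0.5f, 0.5f});
    img[5] = Pixel{1.0f, 0.0f, 0.0f};        // one bright "highlight"
    std::vector<float> coc(w * h, 1.0f);      // uniform 1-pixel blur radius
    std::vector<Pixel> out = bokehBlur(img, coc, w, h);
    std::printf("blurred highlight: %.3f %.3f %.3f\n", out[5].r, out[5].g, out[5].b);
}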


This is another title that I refuse to buy. I doubt that this will cause much concern to Avalanche Studios, but if sufficient numbers of people avoid a developer's titles because it grants Nvidia access to areas of development that should employ open standards, developers may begin to take notice. Hopefully, TWIMTBP will become a thing of the past, as it creates division and artificially introduced differences that are neither necessary nor desirable from the consumer's point of view.

I completely agree.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.05/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Q: So why does that affect ATi cards in DX10?

A: Because the developer purposely changed the AA implementation throughout the engine, breaking AA on ATi's cards, using nV's code.

So, nV broke AA in Batman, and seemingly purposefully. This is the real basis for the whole argument in the first place.

A: Because the game doesn't run in DX10, it doesn't affect ATi cards in DX10, because that situation doesn't exist.

So, no, nVidia did not break AA in Batman.
 

ctrain

New Member
Joined
Jan 12, 2010
Messages
393 (0.07/day)
:confused: It was always possible via a separate Z buffer (even on DX9), although it was more tricky and maybe not as desirable. DX10.1 only made it faster and introduced a clearer interface.

edit: Yeah, it wouldn't be exactly MSAA, but rather an edge detect algorithm + SSAA on the edges. So... it wouldn't be a Coke, it would be Pepsi, but essentially the same. It's probably exactly what's being done in BM:AA.

you cannot resolve the render target yourself in dx9, data required for msaa is lost.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
you cannot resolve the render target yourself in dx9, data required for msaa is lost.

How the hell is something that you specifically stored going to be lost??
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.50/day)
You can only have so many interrupts in the pipeline. You just can't access the data at that point. That's what DX10.1 brings: that needed interrupt. It's like HDR and AA at the same time used to be... you chose one or the other.

A: Because the game doesn't run in DX10, it doesn't affect ATi cards in DX10, because that situation doesn't exist.

So, no, nVidia did not break AA in Batman.


:laugh:


:toast:




And why doesn't it run in DX10, then? Because, you know, the default behavior of the engine is to recognize the hardware and decide for itself unless told differently, right? I mean, you select DX10 in the options... ;)
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
What the... what are you both saying? You can store and load anything as many times as you want. Like I said, you store Z separately, in a different render target, so you can access it as many times as you want. You are going to access Z data plenty of times in a deferred renderer anyway. Like I said, it's not MSAA, not the hardware-accelerated MSAA present in the hardware, but rather an edge detect algorithm that will do supersampling on the detected edges. That's what MSAA does anyway, so it's MSAA while not being MSAA, if you know what I mean. And yeah it's slower, but not nearly as much as some make it out to be. Maybe 5-10% slower.
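Purely as an illustration of the approach described above, and not a claim about what the Batman: AA or UE3 code actually does, here is a minimal CPU sketch of "detect geometry edges from a separately stored depth buffer, then spend extra shading samples only on those pixels". The thresholds, sample counts and names are invented.

// Sketch of the approach described above: detect geometry edges from a depth
// buffer kept in its own render target, then supersample only those pixels.
// All names, thresholds and sample counts are illustrative assumptions, not
// the actual Batman: AA implementation.
#include <cmath>
#include <cstdio>
#include <functional>
#include <vector>

// Shading is modelled as a callback taking sub-pixel coordinates, standing in
// for "render the scene at this sample position".
using ShadeFn = std::function<float(float, float)>;

std::vector<float> edgeDetectAA(const std::vector<float>& depth,
                                int w, int h, ShadeFn shade,
                                float edgeThreshold = 0.01f)
{
    std::vector<float> out(w * h);
    // Four sub-pixel offsets, used only where a depth edge is detected.
    const float offs[4][2] = { {0.25f, 0.25f}, {0.75f, 0.25f},
                               {0.25f, 0.75f}, {0.75f, 0.75f} };
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float d  = depth[y * w + x];
            float dx = (x + 1 < w) ? depth[y * w + x + 1] - d : 0.0f;
            float dy = (y + 1 < h) ? depth[(y + 1) * w + x] - d : 0.0f;
            bool edge = std::fabs(dx) > edgeThreshold || std::fabs(dy) > edgeThreshold;
            if (!edge) {
                out[y * w + x] = shade(x + 0.5f, y + 0.5f);           // one sample
            } else {
                float sum = 0.0f;                                      // supersample
                for (const auto& o : offs) sum += shade(x + o[0], y + o[1]);
                out[y * w + x] = sum * 0.25f;
            }
        }
    }
    return out;
}

int main() {
    const int w = 8, h = 1;
    std::vector<float> depth(w * h, 1.0f);
    for (int x = 4; x < w; ++x) depth[x] = 2.0f;                       // a depth discontinuity
    ShadeFn shade = [](float x, float) { return x < 3.5f ? 0.0f : 1.0f; };
    std::vector<float> img = edgeDetectAA(depth, w, h, shade);
    for (float v : img) std::printf("%.2f ", v);                       // the edge pixel comes out blended
    std::printf("\n");
}

The extra cost scales with how many pixels the edge detect flags, which is consistent with the small slowdown estimated above for typical scenes.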
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (0.95/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-fi Plantinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
But who's still buying the game? I recall many with 260s complaining about the high frame rate hit that those 2 IQ features put on their PCs. So I get the impression that this game isn't fully playable with these features unless you have a 470/480 anyway.

What I'm getting at is are those features a game breaker for you?
 
Joined
Mar 2, 2009
Messages
5,061 (0.87/day)
Processor AMD Ryzen 5 7600
Motherboard Gigabyte B650 Aorus Elite AX
Cooling Thermalright Peerless Assassin 120 SE
Memory Kingston Fury Beast DDR5-5600 16GBx2
Video Card(s) Gigabyte Gaming OC AMD Radeon RX 7800 XT 16GB
Storage TEAMGROUP T-Force Z440 2TB, SPower A60 2TB, SPower A55 2TB, Seagate 4TBx2
Display(s) AOC 24G2 + Xitrix WFP-2415
Case Montech Air X
Audio Device(s) Realtek onboard
Power Supply Be Quiet! Pure Power 11 FM 750W 80+ Gold
Mouse Logitech G Pro X Superlight Wireless
Keyboard Royal Kludge RK-S98 Tri-Mode RGB Mechanical Keyboard
Software Windows 10
But who's still buying the game? I recall many with 260s complaining about the high frame rate hit that those 2 IQ features put on their PCs. So I get the impression that this game isn't fully playable with these features unless you have a 470/480 anyway.

What I'm getting at is are those features a game breaker for you?

Well, the review does say that even with the lack of those Water Simulation and Bokeh Filter options for the 5870, the 5870 would still be the best card for that game. Unless of course you're an Nvidia fanboi, in which case those two options would be the Second Coming...
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.50/day)
I dunno, the water difference is pretty drastic. I don't understand the choices, but I do know what's more pleasing to the eye, and the 5870 ain't gonna get it.

I'll still buy the game, but I won't play it on ATI hardware. Really, in the end, it's about how much fun it is, and damn it... I like it.

No deal breaker. I bought (and finished) Batman too. My kids ask me to play it (Batman) at least once a week.
 
Joined
Mar 2, 2009
Messages
5,061 (0.87/day)
Processor AMD Ryzen 5 7600
Motherboard Gigabyte B650 Aorus Elite AX
Cooling Thermalright Peerless Assassin 120 SE
Memory Kingston Fury Beast DDR5-5600 16GBx2
Video Card(s) Gigabyte Gaming OC AMD Radeon RX 7800 XT 16GB
Storage TEAMGROUP T-Force Z440 2TB, SPower A60 2TB, SPower A55 2TB, Seagate 4TBx2
Display(s) AOC 24G2 + Xitrix WFP-2415
Case Montech Air X
Audio Device(s) Realtek onboard
Power Supply Be Quiet! Pure Power 11 FM 750W 80+ Gold
Mouse Logitech G Pro X Superlight Wireless
Keyboard Royal Kludge RK-S98 Tri-Mode RGB Mechanical Keyboard
Software Windows 10
Well, go on ahead if you're going to buy the more expensive card because it makes the water look better. It is your money after all, and your money is the primary purpose of the Water Simulation and Bokeh Filter anyway, being an "NVIDIA EXCLUSIVE - TWIMTBP".
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.50/day)
I'm the guy that has all the consoles too, just for the platform exclusives.

It's guys like me that they are really catering to... the ones that have the cash to spend to get what they want. Screw voting with my wallet... at least somebody is gonna get fed from the jobs I help support. They can deal with the bad press just fine.

I mean... it sucks, it really does... but it employs people and feeds their kids. If they want to run around and be incompetent, that's just fine by me. A free society is all about that.
 

ctrain

New Member
Joined
Jan 12, 2010
Messages
393 (0.07/day)
How the hell is something that you specifically stored going to be lost??

this isn't how it works, there is no official way to do this stuff in dx9. reading values from the depth buffer in dx9 is vendor specific and you might have to use a special format for it. it's all a hack under the hood. you cannot do a custom resolve on a multisampled depth buffer in dx9.


What the... what are you both saying? You can store and load anything as many times as you want. Like I said, you store Z separately, in a different render target, so you can access it as many times as you want. You are going to access Z data plenty of times in a deferred renderer anyway. Like I said, it's not MSAA, not the hardware-accelerated MSAA present in the hardware, but rather an edge detect algorithm that will do supersampling on the detected edges. That's what MSAA does anyway, so it's MSAA while not being MSAA, if you know what I mean. And yeah it's slower, but not nearly as much as some make it out to be. Maybe 5-10% slower.

i don't think you understand what SSAA is, because the method you describe doesn't make much sense.
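A rough way to picture the "custom resolve" limitation ctrain describes, as a CPU analogy with invented names rather than actual D3D code: under DX9 a multisampled colour surface can only be resolved by the fixed-function filter (and a multisampled depth buffer cannot be read back at all without vendor-specific hacks), so the individual samples are gone by the time your shaders see the result; DX10/10.1 exposes the samples to the shader (Texture2DMS.Load in HLSL terms), so the resolve policy can be your own.

// CPU analogy for the DX9 vs DX10/10.1 MSAA resolve difference discussed
// above. Not D3D code; names and the 4x sample layout are assumptions.
#include <cstdio>
#include <vector>

constexpr int kSamples = 4;                    // 4x MSAA per pixel

struct MsaaPixel { float sample[kSamples]; };  // per-sample values for one pixel

// DX9-style resolve: the runtime filters the samples down to one value per
// pixel for you, and that is the only thing your shaders ever get to read.
std::vector<float> fixedResolve(const std::vector<MsaaPixel>& surf) {
    std::vector<float> out(surf.size());
    for (size_t i = 0; i < surf.size(); ++i) {
        float sum = 0.0f;
        for (float s : surf[i].sample) sum += s;
        out[i] = sum / kSamples;               // per-sample data is gone after this
    }
    return out;
}

// DX10/10.1-style custom resolve: the shader can read each sample itself
// (Texture2DMS.Load in HLSL terms) and pick its own policy, e.g. keep the
// nearest depth instead of a meaningless averaged depth.
std::vector<float> customResolve(const std::vector<MsaaPixel>& surf) {
    std::vector<float> out(surf.size());
    for (size_t i = 0; i < surf.size(); ++i) {
        float nearest = surf[i].sample[0];
        for (float s : surf[i].sample)
            if (s < nearest) nearest = s;
        out[i] = nearest;
    }
    return out;
}

int main() {
    // One pixel straddling a depth edge: two near samples, two far samples.
    std::vector<MsaaPixel> surf = { { {0.2f, 0.2f, 0.9f, 0.9f} } };
    std::printf("fixed resolve:  %.2f\n", fixedResolve(surf)[0]);   // 0.55, edge info lost
    std::printf("custom resolve: %.2f\n", customResolve(surf)[0]);  // 0.20
}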
 

Dazzeerr

New Member
Joined
Dec 29, 2008
Messages
215 (0.04/day)
Location
Oxford
System Name The D-Machine
Processor Intel Core 2 Quad Q6600 @ 2.9GHz
Motherboard Asus P5Q-SE2 P45
Cooling AC Freezer Pro 7 / MX-2 - 2x 120mm fan - 1x 80mm fan
Memory 4GB Crucial Ballistix 800MHz
Video Card(s) nVIDIA GeForce 9800GTX+ 512mb
Storage 1TB Samsung F3 (Win7), 160GB & 250GB Maxtors.
Display(s) Acer 17" AL1716 LCD
Case Coolermaster 330 Elite
Audio Device(s) Creative SB! X-Fi Xtreme Audio 7.1
Power Supply Corsair TX 650W Single 12V+ Rail {52A}
Software Windows 7 Home Premium 64-Bit
Benchmark Scores Super PI : 17.016s 3dMark06 : 14355
He said X### NOT X####. That is X800, X600... Nvidia had PS3.0 on their GF6800 hardware; ATi was like 6 months late with PS3.0 hardware. Games like Oblivion lacked proper PS3.0 for that reason, for example.

Does this mean ATI paid Microsoft to release their DX11 cards first?

Give us a break. :rolleyes:
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
But who's still buying the game? I recall many with 260s complaining about the high frame rate hit that those 2 IQ features put on their PCs. So I get the impression that this game isn't fully playable with these features unless you have a 470/480 anyway.

What I'm getting at is are those features a game breaker for you?

I believe that there are sufficient other options available to allow me to continue to play games without supporting such actions. Too often, when consumers complain about shady marketing practices within the PC gaming industry (abusive DRM, TWIMTBP, etc), their peers in forums urge them to purchase the title anyway, because it's a great game or because we have already consented to being "shafted", as the reviewer at HardOCP puts it, by our decision to use the Windows platform.

Certainly, as individuals, our ability to shape or control the market is limited; however, collectively, if the developers see that their practices are costing them sales and that their "special relationship" with Nvidia is actually detrimental to their interests, they may be less willing to employ proprietary technology where open standards are available. Developers may be able to measure the money they save by allowing Nvidia to develop certain aspects of their games, but they cannot measure the lost sales resulting from consumers' perception that a title has been crippled for their hardware; moreover, if we remain silent, they are unlikely even to consider this possibility.

I can accept that the refusal to buy is a futile gesture when considered at the level of the individual, and I have no doubt that I will miss many great titles as a result of my principles, but the choice of whether or not to purchase is one of the few avenues of input that remain to us, perhaps the most important. However, it is equally important to let a company know why I refuse to purchase, in order to enable it either to address my concerns or simply to dismiss me whilst acknowledging awareness of my issues.

By refusing to buy, I punish the developer for assigning the development of features, which I believe are the developer's responsibility, to Nvidia; I punish Nvidia by showing them that they have wasted money and resources that will ultimately reap no benefit, whilst associating their name with questionable marketing practices. To what extent I am punishing myself remains to be seen, but I can live quite happily without ever playing Just Cause 2.
 
Joined
Mar 24, 2010
Messages
5,047 (0.93/day)
Location
Iberian Peninsula
I buy games that 'I have to play', no others. As there really are only a few 'I have to play' games, I couldn't care less about the backgrounds, sorry.

I played the Just Cause 2 demo; its 'all ages', mindless shooting-range style is not for me, but, just hypothetically, let's suppose you like it. Are you not going to play it because of some corporate junk or whatever? Well, ok..... but that's like not visiting a state museum just because 'your' party was not elected to government ;)
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
I buy games that 'I have to play', no others. As there really are only a few 'I have to play' games, I couldn't care less about the backgrounds, sorry.

I played the Just Cause 2 demo; its 'all ages', mindless shooting-range style is not for me, but, just hypothetically, let's suppose you like it. Are you not going to play it because of some corporate junk or whatever? Well, ok..... but that's like not visiting a state museum just because 'your' party was not elected to government ;)

As the years go by, there are fewer and fewer games that I "have to play". There is no need to apologise for your stance: the fact that you do not feel that the issues I have highlighted are worthy of consideration does not make your point of view any less valid and, needless to say, I understand your perspective.

Where and when we draw the line is a personal decision and the "corporate junk" that is but a minor annoyance for one user may prove intolerable for another. I am not going to play it for the reasons outlined above and because I feel that there are enough viable alternatives to ensure that I don't have time to dwell on what I might be missing.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
this isn't how it works, there is no official way to do this stuff in dx9. reading values from the depth buffer in dx9 is vendor specific and you might have to use a special format for it. it's all a hack under the hood. you cannot do a custom resolve on a multisampled depth buffer in dx9.




i don't think you understand what SSAA is, because the method you describe doesn't make much sense.

You can read the depth buffer and store it like a texture. In a deferred engine you are going to be reading that buffer many times, for lighting, shadowing, effects... Look, I don't know exactly how it works, because I am not a coder, but I do read a lot on Gamedev.net and Beyond3D forums and there are dozens of threads about this, and they say it can be done this way. Batman is the clear example that it can be effectively done, so fight as much as you want: it can be done, and probably in this way.
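To make the point about re-reading stored depth concrete, here is a generic deferred-shading sketch with invented names, not taken from any specific engine: depth written to its own render target during the geometry pass is not "lost"; the lighting, shadow and fog passes simply sample it again.

// Generic sketch of a deferred pipeline re-reading a stored depth target.
// Invented names, not taken from any specific engine. The point: depth
// written to its own render target in the geometry pass is not "lost";
// the lighting, shadow and fog passes all read it again.
#include <cstdio>
#include <vector>

struct GBuffer {
    int w, h;
    std::vector<float> depth;   // depth kept as an ordinary texture/render target
    std::vector<float> albedo;  // other channels omitted for brevity
};

// Geometry pass: write depth (and albedo) once.
GBuffer geometryPass(int w, int h) {
    GBuffer g{ w, h, std::vector<float>(w * h, 1.0f), std::vector<float>(w * h, 0.5f) };
    g.depth[0] = 0.25f;         // pretend one pixel is covered by near geometry
    return g;
}

// Each later pass samples the same stored depth again.
float lightingPass(const GBuffer& g, int i) { return g.albedo[i] / (g.depth[i] + 0.1f); }
bool  shadowPass(const GBuffer& g, int i)   { return g.depth[i] < 0.5f; }
float fogPass(const GBuffer& g, int i)      { return g.depth[i]; }   // fog amount from depth

int main() {
    GBuffer g = geometryPass(4, 4);
    std::printf("lighting: %.2f  shadowed: %d  fog: %.2f\n",
                lightingPass(g, 0), (int)shadowPass(g, 0), fogPass(g, 0));
}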
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
Batman is the clear example that it can be effectively done, so fight as much as you want, it can be done.

This is precisely the point, and I think that the fact that AA is enabled for Nvidia hardware on the Batman title invalidates any argument that the absence of in-game AA for ATI hardware is solely and exclusively attributable to the developer's choice of engine. Are we seriously saying that if Nvidia had not come to the rescue the game would have been released without in-game AA on the PC? If that is the case, I find Nvidia's willingness to allow the developer to evade its responsibility to include this feature to be open to criticism; indeed, it is not a role that I wish to see either company perform or actively pursue.

If and when the developers begin to place more emphasis on tessellation, I would never dream of complaining about the better performance of Nvidia's products: the 480 and 470 offer superior tessellation, an open standard that forms a part of DirectX 11, and I made the decision to buy an ATI product that is inferior in this regard. However, the issue in this case is AA and I feel that, at the very least, we should expect developers to be able to produce games that comply with minimum standards without the need for additional economic or technical support from ATI or Nvidia. Other companies can clearly meet these expectations and I feel that artificially introduced differences at this level can only hurt the consumer and ultimately the industry as a whole.
 
Joined
Mar 2, 2009
Messages
5,061 (0.87/day)
Processor AMD Ryzen 5 7600
Motherboard Gigabyte B650 Aorus Elite AX
Cooling Thermalright Peerless Assassin 120 SE
Memory Kingston Fury Beast DDR5-5600 16GBx2
Video Card(s) Gigabyte Gaming OC AMD Radeon RX 7800 XT 16GB
Storage TEAMGROUP T-Force Z440 2TB, SPower A60 2TB, SPower A55 2TB, Seagate 4TBx2
Display(s) AOC 24G2 + Xitrix WFP-2415
Case Montech Air X
Audio Device(s) Realtek onboard
Power Supply Be Quiet! Pure Power 11 FM 750W 80+ Gold
Mouse Logitech G Pro X Superlight Wireless
Keyboard Royal Kludge RK-S98 Tri-Mode RGB Mechanical Keyboard
Software Windows 10
This is precisely the point, and I think that the fact that AA is enabled for Nvidia hardware on the Batman title invalidates any argument that the absence of in-game AA for ATI hardware is solely and exclusively attributable to the developer's choice of engine. Are we seriously saying that if Nvidia had not come to the rescue the game would have been released without in-game AA on the PC? If that is the case, I find Nvidia's willingness to allow the developer to evade its responsibility to include this feature to be open to criticism; indeed, it is not a role that I wish to see either company perform or actively pursue.

If and when the developers begin to place more emphasis on tessellation, I would never dream of complaining about the better performance of Nvidia's products: the 480 and 470 offer superior tessellation, an open standard that forms a part of DirectX 11, and I made the decision to buy an ATI product that is inferior in this regard. However, the issue in this case is AA and I feel that, at the very least, we should expect developers to be able to produce games that comply with minimum standards without the need for additional economic or technical support from ATI or Nvidia. Other companies can clearly meet these expectations and I feel that artificially introduced differences at this level can only hurt the consumer and ultimately the industry as a whole.

The bold part is really quite important too. Remember that during the period when only ATi cards were DirectX 11 capable, Nvidia consistently insisted that buying the existing DX11 cards for tessellation would be a waste of money, since it isn't really that much of a "new feature" compared to DirectX 10. But when they finally released their own cards, which were waaaaaay overdue, then, to make consumers and potential customers forget about price/perf, power/perf, temps, and frying eggs and burger patties, tessellation was suddenly a gift from the Gods and Nvidia cards were the ones truly capable of it, the bestest feature ever.
 

Fourstaff

Moderator
Staff member
Joined
Nov 29, 2009
Messages
10,084 (1.82/day)
Location
Home
System Name Orange! // ItchyHands
Processor 3570K // 10400F
Motherboard ASRock z77 Extreme4 // TUF Gaming B460M-Plus
Cooling Stock // Stock
Memory 2x4Gb 1600Mhz CL9 Corsair XMS3 // 2x8Gb 3200 Mhz XPG D41
Video Card(s) Sapphire Nitro+ RX 570 // Asus TUF RTX 2070
Storage Samsung 840 250Gb // SX8200 480GB
Display(s) LG 22EA53VQ // Philips 275M QHD
Case NZXT Phantom 410 Black/Orange // Tecware Forge M
Power Supply Corsair CXM500w // CM MWE 600w
My take on the AA issue:

Developer develops a game on an engine without AA. Nvidia comes along and says: "We will pay you to develop AA functionality for our products as a way to improve your product for our customers." I see nothing wrong here: ATi was never part of the equation, so they never received anything.

Gamer with ATi hardware sees the game and comments: "Hey look, you can only activate AA with Nvidia hardware! The devs must have received Nvidia's coin and disabled AA with ATi hardware!" *boycotts game*

I come along and ask the question: if Nvidia hadn't provided funding and assistance to the game dev, would they have provided AA functionality anyway? If the answer is yes, then Nvidia is being a bad actor here, manipulating things to its own benefit. If the answer is no, Nvidia is a good manufacturer that cares about maximising its customers' satisfaction.
 
Joined
Apr 10, 2010
Messages
1,867 (0.34/day)
Location
London
System Name Jaspe
Processor Ryzen 1500X
Motherboard Asus ROG Strix X370-F Gaming
Cooling Stock
Memory 16Gb Corsair 3000mhz
Video Card(s) EVGA GTS 450
Storage Crucial M500
Display(s) Philips 1080 24'
Case NZXT
Audio Device(s) Onboard
Power Supply Enermax 425W
Software Windows 10 Pro
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
My take on the AA issue:

Developer develops a game on an engine without AA. Nvidia comes along and says: "We will pay you to develop AA functionality for our products as a way to improve your product for our customers." I see nothing wrong here: ATi was never part of the equation, so they never received anything.

Gamer with ATi hardware sees the game and comments: "Hey look, you can only activate AA with Nvidia hardware! The devs must have received Nvidia's coin and disabled AA with ATi hardware!" *boycotts game*

I come along and ask the question: if Nvidia hadn't provided funding and assistance to the game dev, would they have provided AA functionality anyway? If the answer is yes, then Nvidia is being a bad actor here, manipulating things to its own benefit. If the answer is no, Nvidia is a good manufacturer that cares about maximising its customers' satisfaction.

Could you imagine the backlash if a game was released without any AA on the PC in this day and age? A more interesting question, one that leads to various suppositions, is why the developer specifically chose that engine, knowing that it would hinder their ability to enable AA. Were they aware, prior to making the engine choice, that Nvidia would be on hand with cash and expertise?
 
Joined
Mar 2, 2009
Messages
5,061 (0.87/day)
Processor AMD Ryzen 5 7600
Motherboard Gigabyte B650 Aorus Elite AX
Cooling Thermalright Peerless Assassin 120 SE
Memory Kingston Fury Beast DDR5-5600 16GBx2
Video Card(s) Gigabyte Gaming OC AMD Radeon RX 7800 XT 16GB
Storage TEAMGROUP T-Force Z440 2TB, SPower A60 2TB, SPower A55 2TB, Seagate 4TBx2
Display(s) AOC 24G2 + Xitrix WFP-2415
Case Montech Air X
Audio Device(s) Realtek onboard
Power Supply Be Quiet! Pure Power 11 FM 750W 80+ Gold
Mouse Logitech G Pro X Superlight Wireless
Keyboard Royal Kludge RK-S98 Tri-Mode RGB Mechanical Keyboard
Software Windows 10
My take on the AA issue:

Developer develops a game on an engine without AA. Nvidia comes along and says: "We will pay you to develop AA functionality for our products as a way to improve your product for our customers." I see nothing wrong here: ATi was never part of the equation, so they never received anything.

Gamer with ATi hardware sees the game and comments: "Hey look, you can only activate AA with Nvidia hardware! The devs must have received Nvidia's coin and disabled AA with ATi hardware!" *boycotts game*

I come along and ask the question: if Nvidia hadn't provided funding and assistance to the game dev, would they have provided AA functionality anyway? If the answer is yes, then Nvidia is being a bad actor here, manipulating things to its own benefit. If the answer is no, Nvidia is a good manufacturer that cares about maximising its customers' satisfaction.

http://www.hexus.net/content/item.php?item=20991

AMD received an email dated Sept 29th at 5:22pm from Mr. Lee Singleton General Manager at Eidos Game Studios who stated that Eidos’ legal department is preventing Eidos from allowing ATI cards to run in-game antialiasing in Batman Arkham Asylum due to NVIDIA IP ownership issues over the antialiasing code, and that they are not permitted to remove the vendor ID filter.

So the AA should work for both...except for the vendor ID filter.
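For anyone wondering what a "vendor ID filter" amounts to in practice: every graphics adapter reports a PCI vendor ID (0x10DE for NVIDIA, 0x1002 for ATI/AMD), and gating a menu option on it takes only a couple of lines. The sketch below is a generic illustration with made-up names, not code from Batman: Arkham Asylum; a real engine would read the ID from something like D3DADAPTER_IDENTIFIER9::VendorId.

// Generic illustration of a vendor ID gate like the one described in the
// Hexus report above. Not code from Batman: Arkham Asylum; the struct and
// function names are made up. A real engine would read the value from
// D3DADAPTER_IDENTIFIER9::VendorId (or DXGI_ADAPTER_DESC::VendorId).
#include <cstdio>

constexpr unsigned kVendorNvidia = 0x10DE;   // PCI vendor ID: NVIDIA
constexpr unsigned kVendorAti    = 0x1002;   // PCI vendor ID: ATI/AMD

struct AdapterInfo { unsigned vendorId; };

// The filter in question: the in-game MSAA option is only exposed when the
// reported vendor ID is NVIDIA's, even if the code path would run elsewhere.
bool allowInGameAA(const AdapterInfo& adapter) {
    return adapter.vendorId == kVendorNvidia;
}

int main() {
    AdapterInfo nv  = { kVendorNvidia };
    AdapterInfo ati = { kVendorAti };
    std::printf("NVIDIA adapter: AA option %s\n", allowInGameAA(nv)  ? "shown" : "hidden");
    std::printf("ATI adapter:    AA option %s\n", allowInGameAA(ati) ? "shown" : "hidden");
}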
 

Fourstaff

Moderator
Staff member
Joined
Nov 29, 2009
Messages
10,084 (1.82/day)
Location
Home
System Name Orange! // ItchyHands
Processor 3570K // 10400F
Motherboard ASRock z77 Extreme4 // TUF Gaming B460M-Plus
Cooling Stock // Stock
Memory 2x4Gb 1600Mhz CL9 Corsair XMS3 // 2x8Gb 3200 Mhz XPG D41
Video Card(s) Sapphire Nitro+ RX 570 // Asus TUF RTX 2070
Storage Samsung 840 250Gb // SX8200 480GB
Display(s) LG 22EA53VQ // Philips 275M QHD
Case NZXT Phantom 410 Black/Orange // Tecware Forge M
Power Supply Corsair CXM500w // CM MWE 600w
Could you imagine the backlash if a game was released without any AA on the PC in this day and age? A more interesting question, one that leads to various suppositions, is why the developer specifically chose that engine, knowing that it would hinder their ability to enable AA. Were they aware, prior to making the engine choice, that Nvidia would be on hand with cash and expertise?

Yes, I can't see why they wouldn't produce a game without AA. At least Farmville doesn't have it. And yes, you grasped my thought on the matter, on how Nvidia's TWIMTBP program has affected the gaming industry. As I see it, the Unreal engine is a popular engine, so the devs had no devious reason to choose that particular engine. I am sure they were aware that Nvidia could help them, but I don't think they were going to bet on that fact.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
This is precisely the point, and I think that the fact that AA is enabled for Nvidia hardware on the Batman title invalidates any argument that the absence of in-game AA for ATI hardware is solely and exclusively attributable to the developer's choice of engine. Are we seriously saying that if Nvidia had not come to the rescue the game would have been released without in-game AA on the PC? If that is the case, I find Nvidia's willingness to allow the developer to evade its responsibility to include this feature to be open to criticism; indeed, it is not a role that I wish to see either company perform or actively pursue.

I come along and ask the question: if Nvidia hadn't provided funding and assistance to the game dev, would they have provided AA functionality anyway? If the answer is yes, then Nvidia is being a bad actor here, manipulating things to its own benefit. If the answer is no, Nvidia is a good manufacturer that cares about maximising its customers' satisfaction.

That is the case: if Nvidia hadn't told them to use AA, they would NOT have included AA, just as 90%+ of games using UE3 did not include AA. According to you, mr mcc, that is open to criticism. Well, I don't see you saying anything about all the other UE3-based games that don't have AA either. UE3 + no AA is the norm, not the exception, and although it's something that I don't like personally, that's not what is being discussed here. What is being discussed here is what Nvidia did, and what they did was fix the situation by making them implement AA, not make it worse.

And if what ctrain said here is true, "reading values from the depth buffer in dx9 is vendor specific and you might have to use a special format for it", then right there is the answer as to why AA was only activated when an Nvidia card was detected. And as to why changing the exe name would let you enable the feature on Ati cards but it didn't actually work anyway. And why the developer asked Ati to help them implement and test the feature for Ati cards.

It's the same in Just Cause 2: if Nvidia hadn't told them to include those features, they would never have included them to begin with, resulting in an inferior product for everybody. On top of that, they probably used CUDA because, at the time they were creating the tech for the game, only CUDA was available. It's only been a few months since there has been full OpenCL and DirectCompute support from both camps (Ati being the last one, btw), so it was simply impossible to make those features using them.

If you are upset because you feel that developers evade their responsibility by not including AA or any other feature that you (and only you) feel is a requirement, make a thread about that, but don't create an Nvidia-bashing thread for no reason.

Could you imagine the backlash if a game was released without any AA on the PC in this day and age? A more interesting question, one that leads to various suppositions, is why the developer specifically chose that engine, knowing that it would hinder their ability to enable AA. Were they aware, prior to making the engine choice, that Nvidia would be on hand with cash and expertise?

Dozens of games are released without AA every year.

And as to why they chose UE3... you are showing your ignorance here...
 