
Criticism of Nvidia's TWIMTBP Program - HardOCP's Just Cause 2 Review

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.05/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
And why doesn't it run in DX10, then? Because you know the default behavior of the engine is to recognize the hardware and decide itself unless told differently, right? I mean, you select DX10 in the options...;)

Just because the game engine supports DX10, that doesn't mean it defaults to using it if available.

The game has to be designed and tested to use that rendering path also. And since Batman was a console port, with no DX10 intention from the beginning, the rendering path was not included.

And there is no place in the Batman options to select DX10.
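The distinction can be sketched in code. This is a purely hypothetical illustration (none of these names come from the actual engine): even when both the engine and the hardware support DX10, the game only exposes the render paths it actually shipped and tested.

```python
# Hypothetical sketch: a game exposes only the render paths it shipped
# and tested, even if the underlying engine (and the GPU) could do more.

ENGINE_SUPPORTED = {"dx9", "dx10"}   # what the engine is capable of
GAME_SHIPPED = {"dx9"}               # what this title was built and tested with

def pick_render_path(hardware_supports: set) -> str:
    # Intersect engine capability, shipped paths, and hardware support;
    # prefer the newest API that survives all three filters.
    candidates = ENGINE_SUPPORTED & GAME_SHIPPED & hardware_supports
    for path in ("dx10", "dx9"):
        if path in candidates:
            return path
    raise RuntimeError("no usable render path")
```

Under this sketch a DX10-class GPU still gets the DX9 path, because the game never shipped a DX10 one — which is the point being made about Batman above.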

Could you imagine the backlash if a game was released without any AA on the PC in this day and age? A more interesting question, one that leads to various suppositions, is why the developer specifically chose that engine, knowing that it would hinder their ability to enable AA. Were they aware, prior to making the engine choice, that Nvidia would be on hand with cash and expertise?

Let me see, games released without in-game AA in this day and age, off the top of my head:

Mass Effect 2 - Highly anticipated game, receiving very good reviews
X-Men Origins - Another highly anticipated game receiving good reviews

Yep, no backlash from either not having AA.

They more than likely chose the engine because it is extremely versatile. One of the few I've seen that works well as an FPS engine and is easily adapted to a third-person action game.


http://www.hexus.net/content/item.php?item=20991



So the AA should work for both...except for the vendor ID filter.

Should, but doesn't (that has been shown), and again that makes perfect sense if nVidia was the one that paid for the development of AA (or did it themselves); it is their IP to do with as they please.
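For readers unfamiliar with how a "vendor ID filter" could even work: a minimal, hypothetical sketch is below. The PCI vendor IDs are real (0x10DE is Nvidia, 0x1002 is ATI/AMD), but the gating function and its name are my own illustration, not the game's actual code.

```python
# Hypothetical illustration of a vendor-ID filter: the in-game AA option
# is exposed only when the detected GPU's PCI vendor ID matches a
# hard-coded value, even though both vendors' hardware could run the path.

NVIDIA_VENDOR_ID = 0x10DE  # real PCI vendor ID for Nvidia
ATI_VENDOR_ID = 0x1002     # real PCI vendor ID for ATI/AMD

def aa_option_visible(vendor_id: int) -> bool:
    # The check hides the feature from every card except the sponsor's.
    return vendor_id == NVIDIA_VENDOR_ID
```

With a filter like this, the same AA code path simply never appears in the menu on ATI hardware — which matches what the reviews observed.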
 
Last edited:
Joined
May 4, 2009
Messages
1,972 (0.34/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
A lot has been written and said on the subject already. But the point I'd like to stress is that Nvidia has both the financial power and the legal right to do as they please in the aforementioned cases. However, I think they're missing the bigger picture here. They are dividing and distancing the few PC-only game developers that are left out there. The coders have to follow and comply with a dozen different standards, test on numerous hardware combinations, etc. They're simply spreading themselves thin, and that's why more and more are giving up and going over to other platforms like consoles, portables and even smartphones of all things...
Instead of bullying and forcing themselves on the fragile industry, both graphics vendors should stop bickering and fighting and do everything in their power to bring PC gaming back to its rightful place - one level above all other platforms. I don't care if that means they have to combine efforts; all I care about as a consumer is that I get a better product in the end, one that deserves my spending money on new hardware. In the end they are hardware manufacturers and should stick to what they do best.
 

exow2

New Member
Joined
Apr 2, 2010
Messages
100 (0.02/day)
Processor AMD Athlon II X4 630 2.8ghz
Motherboard Asus M3A78-EM
Memory 4GB Kingston DDR2 800mhz
Video Card(s) Nvidia Geforce 8800GT
Storage 640Gb 7200RPM 32MB Cache Western Digital
Power Supply 400W NZXT
Software Windows 7 Ultimate
DX10 was delayed due to nV not supporting the API properly. DX10.1 was barely a whisper, again, thanks to a lack of nV support.

I see in your post that you basically bash Nvidia's "support" when you really can't see that Nvidia is the one trying to at least let someone enjoy the game to the max. ATI didn't implement AA in the game, and whose fault is that? You blame Nvidia for that? LOL. ATI should be on top of this regardless and should be able to get AA in that game as well. What's stopping them? Nothing, actually. You call Nvidia's practices "illegal", but that's a bald-faced lie. Show me the law or act that prohibits a company from ensuring their hardware will be the most compatible with a video game.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.50/day)
I see in your post that you basically bash Nvidia's "support" when you really can't see that Nvidia is the one trying to at least let someone enjoy the game to the max. ATI didn't implement AA in the game, and whose fault is that? You blame Nvidia for that? LOL. ATI should be on top of this regardless and should be able to get AA in that game as well. What's stopping them? Nothing, actually. You call Nvidia's practices "illegal", but that's a bald-faced lie. Show me the law or act that prohibits a company from ensuring their hardware will be the most compatible with a video game.

Good morning. Actually, I'm bashing the devs for using nV's support, and nV covering their (the devs') asses. I even said nV is more than welcome to add things from their closed APIs that only users with their hardware can use. Manipulating an engine so that other users, without their hardware, lose features is NOT ok.

Also, please read my sig...this is my opinion...I am entitled to it. There's no need for me to explain anything.

I'm not really into arguing over something like this. You'll have to ask someone else. Like I said, these things aren't going to prevent me from buying these titles...I have no qualms about dropping $500 on a GTX 480, so I can enjoy games like this to their fullest.
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
I see in your post that you basically bash Nvidia's "support" when you really can't see that Nvidia is the one trying to at least let someone enjoy the game to the max. ATI didn't implement AA in the game, and whose fault is that? You blame Nvidia for that? LOL. ATI should be on top of this regardless and should be able to get AA in that game as well. What's stopping them? Nothing, actually. You call Nvidia's practices "illegal", but that's a bald-faced lie. Show me the law or act that prohibits a company from ensuring their hardware will be the most compatible with a video game.

I blame Nvidia for taking on a role that should have been fulfilled by the developer. Moreover, I question the motives behind the choice of the Unreal engine, or the use of CUDA in a given title, when there are very clearly other alternatives available that would enable the same features to be developed for the products of both companies. I do not see Nvidia as a passive agent improving an otherwise inferior product, but rather as actively promoting and cultivating such practices to the detriment of the consumer. Nvidia's actions are not illegal and, as many contributors have pointed out, they are operating according to free-market principles; however, free-market principles also entail the consumer's decision to buy or look elsewhere.

If ATI were to take a similar approach, we would soon be reduced to discussing our ATI or Nvidia build in much the same manner as users discuss their Xbox or PlayStation, each with its own unique library of games. As HalfAHertz pointed out, the PC gaming industry is already on the decline, and these artificially introduced differentiating features will do little to improve the situation.
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
That is the case. According to you, mr mcc, it is open to criticism. Well, I don't see you saying anything about all the other UE3-based games that don't have AA either.

I don't have time to address each and every aspect of the PC gaming industry that is open to criticism; moreover, the post is specifically concerned with TWIMTBP games and directly related to the review recently published on HardOCP. Diverting attention to other areas that need to be improved, or to other questionable choices in game design, does not in any way excuse the practices under consideration in this thread.

If you are upset because you feel that developers evade their responsibility by not including AA or any other feature that you (and only you) feel is a requirement, make a thread about that, but don't create an Nvidia-bashing thread with no reason behind it.

and the reviewer on HardOCP and several contributors to this thread and....
 
Joined
Nov 9, 2008
Messages
2,318 (0.39/day)
Location
Texas
System Name Mr. Reliable
Processor Ryzen R7 7800X3D
Motherboard MSI X670E Carbon Wifi
Cooling D5 Pump, Singularity Top/Res, 2x360mm EK P rads, EK Magnitude/Bitspower Blocks
Memory 32Gb (2x16Gb) GSkill Trident Z5 DDR5 6000 Cl30
Video Card(s) Asus Tuf 4080 Super
Storage 4 x Crucial P5 1TB; 2 x Samsung 870 2TB
Display(s) Acer 32" Z321QU 2560x1440; LG 34GP83A-B 34" 3440x1440
Case Lian Li PC-011 Dynamic XL; Synology DS218j w/ 2 x 2TB WD Red
Audio Device(s) SteelSeries Arctis Pro+
Power Supply EVGA SuperNova 850G3
Mouse Razer Basilisk V2
Keyboard Das Keyboard 6; Razer Orbweaver Chroma
Software Windows 11 Pro
Ok...I am an ATI Fanboy, as most of you already know, but I am also very objective. Here is how I see it:

1.) Nvidia has a VERY aggressive marketing campaign that ensures their name and some proprietary technology goes into some games. This is a business decision that the company made, because at the end of the day, the board members are there to keep the shareholders happy.
2.) ATI/AMD has a very limited marketing campaign, if any at all. This also is a business decision, made to keep shareholders happy.

So which is better/worse? They are both about the same. If anything NV needs to trim their marketing back some and ATI needs to boost theirs.

These arguments are getting very old. Let's not forget that we "enthusiasts" probably amount to 1-5% of both of these companies' total revenue, so they couldn't care less what we think. Their main concern is the mainstream market that makes up the bulk of their revenue.

Bottom Line: Buy the hardware you want to play the games you want, and quit whining when one company does something you don't like. If you don't like it, switch companies!
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
I don't have time to address each and every aspect of the PC gaming industry that is open to criticism; moreover, the post is specifically concerned with TWIMTBP games and directly related to the review recently published on HardOCP. Diverting attention to other areas that need to be improved, or to other questionable choices in game design, does not in any way excuse the practices under consideration in this thread.



and the reviewer on HardOCP and several contributors to this thread and....

Again, Nvidia has nothing to do with the lack of those features, so you are completely wrong. I'm not diverting attention to anything different, since the problem is the one I'm describing and not the one you pretend exists. TWIMTBP is there to add things, not to prevent anyone from implementing them, and they do add a lot of things. And the same goes for AMD's own program. Look at the rest of the games. Which ones have special features? Only some which are under TWIMTBP or AMD's program. The rest are just pure console ports.

Developers are not leaving the PC market because of programs like TWIMTBP that help them develop and add features; they are abandoning it for exactly the opposite reason. On consoles they get all the help they want, even monetary, and that's the reason they are moving to consoles. What we need is more intervention from Nvidia and AMD, not less. They sell graphics hardware, that's their business, but that hardware can't exist without games; it would be pointless. In the consumer space a $5 IGP can do everything except games; games are the only thing for which a graphics card is really needed. Helping PC game developers is an integral part of being a PC graphics vendor. Games are made for their hardware and they have to help develop the interesting things. Plain and simple.
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
So which is better/worse? They are both about the same. If anything NV needs to trim their marketing back some and ATI needs to boost theirs.

Agreed.

Bottom Line: Buy the hardware you want to play the games you want, and quit whining when one company does something you don't like. If you don't like it, switch companies!

What you define as "whining" I define as drawing an issue to the community's attention. In any event, surely I can "whine" and also follow the rest of your advice?
 
Joined
Nov 9, 2008
Messages
2,318 (0.39/day)
Location
Texas
System Name Mr. Reliable
Processor Ryzen R7 7800X3D
Motherboard MSI X670E Carbon Wifi
Cooling D5 Pump, Singularity Top/Res, 2x360mm EK P rads, EK Magnitude/Bitspower Blocks
Memory 32Gb (2x16Gb) GSkill Trident Z5 DDR5 6000 Cl30
Video Card(s) Asus Tuf 4080 Super
Storage 4 x Crucial P5 1TB; 2 x Samsung 870 2TB
Display(s) Acer 32" Z321QU 2560x1440; LG 34GP83A-B 34" 3440x1440
Case Lian Li PC-011 Dynamic XL; Synology DS218j w/ 2 x 2TB WD Red
Audio Device(s) SteelSeries Arctis Pro+
Power Supply EVGA SuperNova 850G3
Mouse Razer Basilisk V2
Keyboard Das Keyboard 6; Razer Orbweaver Chroma
Software Windows 11 Pro
What you define as "whining" I define as drawing an issue to the community's attention. In any event, surely I can "whine" and also follow the rest of your advice?
Maybe whining was an inappropriate term to use :eek:. What I am referring to is the MASSIVE amount of threads dedicated to this topic. It seems that threads such as these are posted at LEAST once a week. It is repetitive and, if anything, these comments should be added to other threads that have already been created. Also, these ATI vs. NV threads seem to lead to a lot of trolling and flame wars, which in turn leads to infractions. And I don't ever like seeing someone get smacked with the banstick. (except maybe Mailman, but that is because he likes it. :D)
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
Again, Nvidia has nothing to do with the lack of those features, so you are completely wrong. I'm not diverting attention to anything different, since the problem is the one I'm describing and not the one you pretend exists. TWIMTBP is there to add things, not to prevent anyone from implementing them, and they do add a lot of things. And the same goes for AMD's own program. Look at the rest of the games. Which ones have special features? Only some which are under TWIMTBP or AMD's program. The rest are just pure console ports.

Developers are not leaving the PC market because of programs like TWIMTBP that help them develop and add features; they are abandoning it for exactly the opposite reason. On consoles they get all the help they want, even monetary, and that's the reason they are moving to consoles. What we need is more intervention from Nvidia and AMD, not less. They sell graphics hardware, that's their business, but that hardware can't exist without games; it would be pointless. In the consumer space a $5 IGP can do everything except games; games are the only thing for which a graphics card is really needed. Helping PC game developers is an integral part of being a PC graphics vendor. Games are made for their hardware and they have to help develop the interesting things. Plain and simple.

We must agree to disagree: what you see as helpful assistance, I see as unwanted intervention that, in the best-case scenario, will not provide any incentive for developers to address and include what should be "standard features".

Could the developer have included the water effects in Just Cause via other means? Would the inclusion of such features have raised development costs beyond the developer's limit without Nvidia's "assistance"? How much money will the developer lose, given the rate at which ATI 5xxx series cards are being adopted in this niche segment of the industry, as a result of the perception that their software is not properly optimised for ATI hardware? Would any costs entailed by implementing these "special features" be offset by incorporating, rather than alienating, ATI users who refuse to buy "crippled software"?
 
Last edited:

brandonwh64

Addicted to Bacon and StarCrunches!!!
Joined
Sep 6, 2009
Messages
19,542 (3.47/day)
Loud Noises!


*I'm on pain meds because of a pulled tooth and this thread is funny. Sorry, but I got a kick out of it!*
 
Last edited:

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
Both Nvidia and AMD should collaborate more; that is something we all agree on, but the fact is that no one collaborates. This is not a Nvidia-only issue. Take Eyefinity, for example. Remember this story at Anandtech? How ATI kept Eyefinity a secret until the very end? They asked display manufacturers to make bezel-less displays and also asked them to keep it secret from Nvidia. How is that any good for the consumer? It isn't. I'm not saying they should have given Eyefinity away, but preventing everyone from even knowing about it? They clearly wanted it just for themselves, and you know what? Well done. It's business.

There's also PhysX, and how ATI not only didn't support it but chose to actively undermine it. That is not well done at all. Collaboration? Yes, they should, but what would ATI have done if Nvidia had gone to AMD, told them they were adding those features, and asked if AMD wanted to cooperate in their creation? As they've done plenty of times in the past, they would have said no, which is what they did with Batman anyway, except that it was the developer who asked, and the response was: "No, you are a TWIMTBP game, so just no."

Collaboration, yes. But into making something, not collaboration to make nothing at all. And as I see it, Nvidia wants those things implemented (for people like me) and just can't wait for AMD. Take HW physics, for example. AMD dismissed PhysX and proposed Havok and later Bullet. OK, fair enough, but where are those? It's been more than 2 years and there are not even future games planned based on them. It took Nvidia 2 months to make PhysX GeForce-capable, so it clearly isn't a technical issue. No, I'd rather be confined to one brand in my choice of card than not have any new features. It's your own choice. AMD is giving no choice because they don't implement anything, and your view of not implementing it unless both agree gives you no choice either: it won't be implemented. And if your choice is not having them implemented, congrats, you already have that option now: just don't buy an Nvidia card.
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
Maybe whining was an inappropriate term to use :eek:. What I am referring to is the MASSIVE amount of threads dedicated to this topic. It seems that threads such as these are posted at LEAST once a week. It is repetitive and, if anything, these comments should be added to other threads that have already been created. Also, these ATI vs. NV threads seem to lead to a lot of trolling and flame wars, which in turn leads to infractions. And I don't ever like seeing someone get smacked with the banstick. (except maybe Mailman, but that is because he likes it. :D)

It is not my intention to provoke a flame war. I thought that the HardOCP review was worthy of the forum's attention. Up to this point, I feel that discussion has been quite civil, but if it degrades into a slanging match or a fanboy festival I assure you that I will be the first person to exit the thread and request a lock. I admit that the thread is provocative, but then again, so are the opinions expressed in the HardOCP review. I promise you that I will do my utmost to ensure that nobody is reprimanded by the moderators on my account. The thread may be repetitive and possibly does have a place as an addendum to existing threads, but in my defence, I thought that the opinions expressed by HardOCP deserved their own thread, not only because of what was said but also because of where it was said. In any event, the thread title clearly identifies the subject-matter and forum users are free to simply ignore it if they feel that the issue has become tedious.

I will bear Mailman's masochistic tendencies in mind.
 
Last edited:
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
Loud Noises!


*I'm on pain meds because of a pulled tooth and this thread is funny. Sorry, but I got a kick out of it!*

No problem Gummy, we are here to entertain.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.50/day)
AMD is giving no choice because they don't implement anything, and your view of not implementing it unless both agree gives you no choice either: it won't be implemented. And if your choice is not having them implemented, congrats, you already have that option now: just don't buy an Nvidia card.

AMD IS giving a choice, though...the difference is that they expect OPEN STANDARDS, such as DX, to be the middleman that ensures things work on all cards.

nVidia uses CLOSED STANDARDS and releases the same tech, but keeps it exclusive. They turn these features into cash, while AMD says "hey, let's share!".

It's not like AMD's hardware is incapable of running this stuff...


But in the end, you are right, to a degree.
 
Joined
Apr 12, 2010
Messages
1,359 (0.25/day)
Processor Core i7 920
Motherboard Asus P6T v2
Cooling Noctua D-14
Memory OCZ Gold 1600
Video Card(s) Powercolor PCS+ 5870
Storage Samsung SpinPoint F3 1 TB
Display(s) Samsung LE-B530 37" TV
Case Lian Li PC-B25F
Audio Device(s) N/A
Power Supply Thermaltake Toughpower 700w
Software Windows 7 64-bit
Both Nvidia and AMD should collaborate more; that is something we all agree on, but the fact is that no one collaborates. This is not a Nvidia-only issue. Take Eyefinity, for example. Remember this story at Anandtech? How ATI kept Eyefinity a secret until the very end? They asked display manufacturers to make bezel-less displays and also asked them to keep it secret from Nvidia. How is that any good for the consumer? It isn't. I'm not saying they should have given Eyefinity away, but preventing everyone from even knowing about it? They clearly wanted it just for themselves, and you know what? Well done. It's business.

There's also PhysX, and how ATI not only didn't support it but chose to actively undermine it. That is not well done at all. Collaboration? Yes, they should, but what would ATI have done if Nvidia had gone to AMD, told them they were adding those features, and asked if AMD wanted to cooperate in their creation? As they've done plenty of times in the past, they would have said no, which is what they did with Batman anyway, except that it was the developer who asked, and the response was: "No, you are a TWIMTBP game, so just no."

Collaboration, yes. But into making something, not collaboration to make nothing at all. And as I see it, Nvidia wants those things implemented (for people like me) and just can't wait for AMD. Take HW physics, for example. AMD dismissed PhysX and proposed Havok and later Bullet. OK, fair enough, but where are those? It's been more than 2 years and there are not even future games planned based on them. It took Nvidia 2 months to make PhysX GeForce-capable, so it clearly isn't a technical issue. No, I'd rather be confined to one brand in my choice of card than not have any new features. It's your own choice. AMD is giving no choice because they don't implement anything, and your view of not implementing it unless both agree gives you no choice either: it won't be implemented. And if your choice is not having them implemented, congrats, you already have that option now: just don't buy an Nvidia card.

As stated above, I accept and agree that ATI has to do more; however, I do not want to see them respond in kind - further division aids nobody.
 

exow2

New Member
Joined
Apr 2, 2010
Messages
100 (0.02/day)
Processor AMD Athlon II X4 630 2.8ghz
Motherboard Asus M3A78-EM
Memory 4GB Kingston DDR2 800mhz
Video Card(s) Nvidia Geforce 8800GT
Storage 640Gb 7200RPM 32MB Cache Western Digital
Power Supply 400W NZXT
Software Windows 7 Ultimate
Good morning. Actually, I'm bashing the devs for using nV's support, and nV covering their (the devs') asses. I even said nV is more than welcome to add things from their closed APIs that only users with their hardware can use. Manipulating an engine so that other users, without their hardware, lose features is NOT ok.

Also, please read my sig...this is my opinion...I am entitled to it. There's no need for me to explain anything.

I'm not really into arguing over something like this. You'll have to ask someone else. Like I said, these things aren't going to prevent me from buying these titles...I have no qualms about dropping $500 on a GTX 480, so I can enjoy games like this to their fullest.

Oh, it was never my intention to spark an argument over this :), I was merely stating my opinion, as were you, and I wouldn't consider myself either an ATI or an Nvidia fanboy. Sure, I have an Nvidia card, but they both have their ups and downs, and I wouldn't think twice about buying an ATI card if it was worth it. :rockout:
 
Joined
Feb 18, 2006
Messages
5,147 (0.74/day)
Location
AZ
System Name Thought I'd be done with this by now
Processor i7 11700k 8/16
Motherboard MSI Z590 Pro Wifi
Cooling Be Quiet Dark Rock Pro 4, 9x aigo AR12
Memory 32GB GSkill TridentZ Neo DDR4-4000 CL18-22-22-42
Video Card(s) MSI Ventus 2x Geforce RTX 3070
Storage 1TB MX300 M.2 OS + Games, + cloud mostly
Display(s) Samsung 40" 4k (TV)
Case Lian Li PC-011 Dynamic EVO Black
Audio Device(s) onboard HD -> Yamaha 5.1
Power Supply EVGA 850 GQ
Mouse Logitech wireless
Keyboard same
VR HMD nah
Software Windows 10
Benchmark Scores no one cares anymore lols
Both Nvidia and AMD should collaborate more, that is something all of us agree, but the fact is that no one collaborates. This is not a Nvidia only issue. Take Eyefinity, for example, remember this story at Anandtech? How Ati kept Eyefinity a secret until the very end? They asked display developers to make bezel-less displays and also asked to keep it secret from Nvidia. How is that any good for the consumer? It isn't. I'm not saying they should have given Eyefinity out, but prevent everyone from knowing about it? They clearly wanted it just for themselves and you know what? Very well done. It's bussiness.

There's also PhysX and how not only Ati didn't support it, but they chose to actively undermine it. That is not well done at all. Collaboration? Yes they should, but what would have Ati done if Nvdia went and told AMD they were adding those features and if AMD wanted to cooperate in their creation? As they've done plenty of times in the past, they would have said no, which is what they did with Batman anyway, except that it was the developer who asked and the response was: "No, you are a TWIMTBP game, so just no."

Collaboration yes. But into making something. Not collaboration to not make anything at all. And as I see it Nvidia wants those things implemented (for people like me) and just can't wait for AMD. Take HW physics for example. AMD dismissed PhysX and proposed Havok and later Bullet, ok, fair enough, but where are those? It's been more than 2 years and there's not even future planned games based on them. It took Nvidia 2 months to make physx geforce capable, so it clearly isn't a technical issue. No, I'd rather be confined to one brand in my election of card, than not having any new features. It's your own choice. AMD is giving no choice because they don't implement anything and your view of not implement it unless both agree, gives you no choice either: it won't be implemented. And if your choice is not having them implemented, congrats you already have that option now, just don't buy a Nvidia card.

I think that collaboration is a nice idea but won't happen. I think the whole problem is not ATI or Nvidia, but Microsoft; if Microsoft offered proper support for DirectX to aid developers making games, Nvidia's TWIMTBP wouldn't stand a chance. As it is, Nvidia saw a hole in the development process and offered help. Can it be shady sometimes? Sure, but in the end Nvidia is offering the help that Microsoft should have.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.05/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
AMD IS giving a choice though...the difference is that they expect OPEN STANDARDS, such as DX, to be the middleman that ensures things work on all cards.

nVidia uses CLOSED STANDARDS, and releases the same tech, but keeps it exclusive. They turn these features into cash, while AMD says "hey. let's share!".

It's not like AMD's hardware is incapable of running this stuff...


But in the end, you are right, to a degree.

Hardly.

ATI has Stream, their answer to CUDA; the only problem is that Stream came out too late for the devs to care about it. They had been using CUDA for half a year; they didn't want to switch.

So now AMD has to rely on OpenCL and DirectCompute, because no one will use Stream. Trust me, they don't want to; they would be perfectly happy if everyone used Stream, but that isn't going to happen.
 
Joined
Apr 8, 2009
Messages
3,016 (0.52/day)
Location
vermont
System Name The wifes worst enemy
Processor i5-9600k
Motherboard Asrock z390 phantom gaming 4
Cooling water
Memory 16gb G.skill ripjaw DDR4 2400 4X4GB 15-15-15-35-2T
Video Card(s) Asrock 5600xt phantom gaming 6gb 14gb/s
Storage crucial M500 120GB SSD, Pny 256GB SSD, seagate 750GB, seagate 2TB HDD, WD blue 1TB 2.5" HDD
Display(s) 27 inch samsung @ 1080p but capable of much more ;)
Case Corsair AIR 540 Cube Mid tower
Audio Device(s) onboard
Power Supply EVGA GQ1000W MODULAR
Mouse generic for now
Keyboard generic for now
Software gotta love steam, origin etc etc
Benchmark Scores http://hwbot.org/user/philbrown_23/
Fact is, Nvidia beat ATI to the punch, and now Nvidia is reaping the benefits. Hell, for all we know Nvidia PAYS each developer to use CUDA, but what do we know? Nothing. Except games keep getting released with CUDA, and those games run at 200 fps on Nvidia cards while only running at 150 on ATI. Whoop-de-do! As long as the ATI cards will still RUN the game, which they sure will, who friggin' cares? This "the way it's meant to be played" CRAP has been going on for years now, and it's getting old. If you don't like how the game plays on ATI, buy NVIDIA, for Christ's sake! If you can't afford Nvidia, then DEAL WITH IT!
 

newtekie1

Semi-Retired Folder
Fact is, Nvidia beat ATI to the punch, and now Nvidia is reaping the benefits. Hell, for all we know Nvidia PAYS each developer to use CUDA, but what do we know? Nothing. Except games keep getting released with CUDA, and those games run at 200 fps on Nvidia cards while only running at 150 on ATI. Whoop-de-do! As long as the ATI cards will still RUN the game, which they sure will, who friggin' cares? This "the way it's meant to be played" CRAP has been going on for years now, and it's getting old. If you don't like how the game plays on ATI, buy NVIDIA, for Christ's sake! If you can't afford Nvidia, then DEAL WITH IT!

Well, sort of, but in reality Just Cause 2 actually gets better FPS on my HD4890 than on my GTX260, and even my GTX285, because the two extra eye-candy effects nVidia added to the game take GPU power to calculate and extra rendering power to draw.

Can I live without those two eye candy features? Absolutely. Do I care that I'm missing them by playing the majority of the game with an HD4890? Hell no.
 
Joined
Jun 20, 2007
Messages
3,942 (0.61/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
I can see this thread getting locked down!!

Any marketing strategy that aims to proactively diminish another company's standing is to be expected. There is (unfortunately) nothing extraordinary about that.
However, when a percentage of users who fall into the opposing camp have their experience diminished by said practice, that is very wrong. Striking deals with software developers that aim to 'handicap' the opposition and coerce the consumer into purchasing a certain brand is, under different conditions, an illegal practice. At best, it is unfair.
In a capitalist economy, it is fair to undercut, market aggressively, and generally hype your product while detracting from the opposition. This is all fair.
On the other hand, using financial incentives to persuade developers to create an uneven playing field is underhanded (regardless of who does it).

Two rebuttals:

A) The American economy hasn't been a free market for decades. What you're saying is good on paper, not in practice (although it should be). Which means you're talking a lot of principle and not a lot of reality, unfortunately.

B) Nvidia's influence over the market hasn't reached the point where people are boycotting software en masse.
We should take a poll: how many people would really refuse to buy a game because it didn't have AA support, or physics, etc.? There have been plenty of titles just like that, and I don't believe it's stopped people in great numbers.

And as a side note, if you were really THAT desperate to experience these miscellaneous and sometimes obscure features in a piece of software, you could always go buy a good price/performance product from that company. In this case, say, a GT200-series Nvidia card. Should you HAVE to? In a perfect world, no. But in our world, where principle and practice rarely meet? Yes.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.48/day)
Location
Reaching your left retina.
AMD IS giving a choice though...the difference is that they expect OPEN STANDARDS, such as DX, to be the middleman that ensures things work on all cards.

nVidia uses CLOSED STANDARDS, and releases the same tech, but keeps it exclusive. They turn these features into cash, while AMD says "hey. let's share!".

It's not like AMD's hardware is incapable of running this stuff...


But in the end, you are right, to a degree.

AMD plays the waiting game. Let others do the work and take on the hassle of making something a standard, and when everything is done, jump on the bandwagon. They do nothing to really promote or push these things. Like with hardware physics: they are doing nothing to move it forward. Two and a half years, and where is that shiny GPU Havok that was supposedly so much better and so much more open than PhysX?

And OpenCL, this is the real state of OpenCL:

If you are wondering what the real deal with GPGPU APIs is, there is a telling tale of why Adobe opted to base its Mercury Engine on nVidia's CUDA language. While AMD will tell you that they're all for open standards and push OpenCL, the sad truth is that the company representatives will stay silent when you ask them about the real status of their OpenCL API - especially if you quote them a lead developer from an AAA software company with 10x more employees than AMD themselves, which goes something like this: "I struggled to even get ATI's beta drivers installed and working, it was just problem after problem. Maybe once ATI gets their drivers out of beta and actually allows you to install them then I will have some performance numbers. I mean at this point AMD is so far behind in development tools they are not even worth pursuing right now."

So yeah, it's easy to "support" open standards (I support open standards; see? It took me two seconds to support open standards), but when it comes to actually supporting them, they do next to nothing. It's not a matter of being able to do it, because they are more than competent; it's a matter of putting money into something that could steal money from their biggest business, CPUs, and maybe more specifically server CPUs. And that's all.

I'm all for open standards, but I don't like being sold vaporware. OpenCL itself is not vaporware, but it is taking far too long, and the programs and features that were promised definitely are. Two and a half years ago (I can't stress this enough in writing), I was told, like everyone else on the planet, to pass on CUDA and PhysX, because AMD was pushing open standards and we would get better features based on them; CUDA was dead. Well, where are those features and programs they were talking about? V A P O R W A R E, meant to divert attention from the real deal that CUDA/PhysX was at the time. Two and a half years, three generations of cards...

As I said, I'd rather have the features. Open standards don't get me better features. Not when no one is doing anything to promote them. Not when the one claiming to promote and support them is more than likely doing everything it can in the shadows to keep them from actually shipping.
 

cadaveca

My name is Dave
Joined
Apr 10, 2006
Messages
17,232 (2.50/day)
I'm referring to the open standards available in Windows: DX, DirectCompute, etc. ATI supports everything within DX. We have been stuck with nVidia barely supporting DX since DX9, but why would they, when they have their own proprietary solutions?


It's up to the developer to use the already-provided DX solutions rather than something else. Why they make those choices (CUDA, Stream, PhysX, it doesn't matter) is beyond me, if they want to maximize their audience.

That's why AMD isn't promoting anything...that's up to Microsoft. AMD's hardware is fully DX compliant...


I agree, maybe they should do more...but...nothing beyond what DX provides.

Havok works just fine...it's developers that need to use it...and some do. None of Havok really requires a GPU. I think you misunderstand why you'd want to use GPU physics, and let me tell you, a lack of CPU power has nothing to do with it. Maybe a few years ago, when dual cores were the norm...but not now.
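The claim that CPU power is no longer the bottleneck for gameplay physics can be sanity-checked with a toy integrator. Even naive, single-threaded Python steps ten thousand particles in a fraction of a 60 fps frame budget, and a compiled engine like Havok (with SIMD and threading) does orders of magnitude more, with collisions and constraints on top. A rough sketch, not real engine code:

```python
# Toy semi-implicit Euler step for N point masses under gravity.
# Illustrative only: real engines add collision detection, constraint
# solving, and SIMD, but the core integration is cheap on any modern CPU.

GRAVITY = -9.81   # m/s^2
DT = 1.0 / 60.0   # one 60 fps frame

def step(positions, velocities):
    for i in range(len(positions)):
        velocities[i] += GRAVITY * DT       # update velocity first...
        positions[i] += velocities[i] * DT  # ...then position (semi-implicit)
    return positions, velocities

# drop 10,000 particles from 100 m and advance one frame
pos = [100.0] * 10_000
vel = [0.0] * 10_000
pos, vel = step(pos, vel)
```

After one frame every particle has picked up about 0.16 m/s of downward velocity and fallen a few millimetres, exactly as the kinematics predict; the whole update is a handful of arithmetic operations per body, which is why dedicating a GPU to it is about extra eye candy, not necessity.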
 