
OFFICIAL The Witcher 3: Wild Hunt (Discussion)

They're making money and implementing new visual effects to make more in the long run; they can afford it.
It's anti-trust material: creating an exclusive market.

...never seen any game that forces you to use them.
The Arkham games, The Witcher 3, most Unreal Engine 4 games, and the Farming Simulator games: they all use PhysX for physics calculations. There are some physics-based games out there that can't run on anything except NVIDIA hardware because PhysX is forced.

If you don't have an NVIDIA card, it disables the visual effects and does the rest on the CPU. NVIDIA made no attempt to make PhysX run on GCN, nor did they give AMD the option to optimize their own hardware for PhysX. There are other physics libraries out there that work as well as or better than PhysX via OpenCL (which runs on all GPUs). If NVIDIA wasn't paying developers to exclusively use their technology, PhysX would have died a decade ago.
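To make the fallback concrete, here's a rough sketch of the pattern being described: probe for vendor GPU acceleration, and if it isn't there, keep gameplay physics on the CPU and switch the extra eye candy off. The type and function names here are hypothetical, invented purely for illustration; this is not the actual PhysX/GameWorks API, whose GPU path is CUDA-based and NVIDIA-only.

```cpp
#include <iostream>
#include <memory>

// Hypothetical types for illustration only -- NOT the real PhysX/GameWorks API.
struct PhysicsBackend { virtual ~PhysicsBackend() = default; };
struct GpuBackend : PhysicsBackend {};  // vendor-accelerated path (CUDA-only in PhysX's case)
struct CpuBackend : PhysicsBackend {};  // portable software fallback

// Stand-in for a driver/SDK query: "is a compatible (i.e. NVIDIA) GPU present?"
bool HasVendorGpuAcceleration() { return false; /* pretend we're on an AMD card */ }

struct EffectSettings {
    bool gpuCloth  = false;  // cloth/HairWorks-style extras
    bool gpuDebris = false;  // particle debris, turbulence, etc.
};

int main() {
    std::unique_ptr<PhysicsBackend> physics;
    EffectSettings effects;

    if (HasVendorGpuAcceleration()) {
        // Full effect set: the heavy simulation runs on the GPU.
        physics = std::make_unique<GpuBackend>();
        effects.gpuCloth = effects.gpuDebris = true;
    } else {
        // Fallback: gameplay-relevant physics stays on the CPU,
        // and the GPU-only eye candy is simply switched off.
        physics = std::make_unique<CpuBackend>();
    }

    std::cout << (dynamic_cast<GpuBackend*>(physics.get()) ? "GPU" : "CPU")
              << " physics; GPU cloth: " << effects.gpuCloth
              << ", GPU debris: " << effects.gpuDebris << '\n';
    return 0;
}
```

The point of contention in the thread is simply that the GPU branch only ever evaluates to true on NVIDIA hardware.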
 
Why can I not get into this game? I love Skyrim but this just bores me...
 
It's anti-trust material: creating an exclusive market.


The Arkham games, The Witcher 3, most Unreal Engine 4 games, and the Farming Simulator games: they all use PhysX for physics calculations. There are some physics-based games out there that can't run on anything except NVIDIA hardware because PhysX is forced.

If you don't have an NVIDIA card, it disables the visual effects and does the rest on the CPU. NVIDIA made no attempt to make PhysX run on GCN, nor did they give AMD the option to optimize their own hardware for PhysX. There are other physics libraries out there that work as well as or better than PhysX via OpenCL (which runs on all GPUs). If NVIDIA wasn't paying developers to exclusively use their technology, PhysX would have died a decade ago.
So on an AMD card you don't get the cloth physics I linked in the Witcher 3 video?

So if you don't have an NVIDIA card, the effect is disabled and the physics is handled by the CPU; that is NOT forcing people to use NVIDIA hardware. You can run the game without the NVIDIA physics and without said visual effect, just like disabling god rays in the game options.
 
Why can I not get into this game? I love Skyrim but this just bores me...
You have to get through the first town and then figure out the fighting... I had the same issue at first, and then once I got into it, it was amazing.
 
I'll have to try it again
 
Why can I not get into this game? I love Skyrim but this just bores me...
I recommend playing it with a controller (the crossbow is pretty damn useless and, when needed, auto-targeting works well enough). The Witcher games have always been more about defense than offense in their gameplay. Not really sure what's boring you about it. Other than the huge difference in combat, the games are very similar.
You can run the game without the NVIDIA physics and without said visual effect.
You're talking about an effect. The cause is that NVIDIA doesn't accelerate PhysX on any hardware except their own. The effect is that developers can't rely on it, so its only practical use is in visual effects. CD Projekt RED released an update to turn down/disable the NVIDIA effects on non-NVIDIA cards. I'd argue they shouldn't bother implementing these technologies at all if they're not platform agnostic. They only exist to sell NVIDIA cards, and game makers shouldn't get into the business of promoting hardware.
 
I don't think they're similar, just because The Witcher is more story-based (but the best of the story-heavy open worlds). Skyrim has the emergent gameplay factor. Lots of randomness. Plus you can mod your eyes out.
 
I recommend playing it with a controller (the crossbow is pretty damn useless and, when needed, auto-targeting works well enough). The Witcher games have always been more about defense than offense in their gameplay. Not really sure what's boring you about it. Other than the huge difference in combat, the games are very similar.

You're talking about an effect. The cause is that NVIDIA doesn't accelerate PhysX on any hardware except their own. The effect is that developers can't rely on it, so its only practical use is in visual effects. CD Projekt RED released an update to turn down/disable the NVIDIA effects on non-NVIDIA cards. I'd argue they shouldn't bother implementing these technologies at all if they're not platform agnostic. They only exist to sell NVIDIA cards, and game makers shouldn't get into the business of promoting hardware.
I hear your point, but it's still progress. It's just that AMD has nothing going on and NVIDIA is being kind of a dick for keeping it ALL proprietary. Still, from my viewpoint, I don't care who gets the revenue; I want some of those options in my game to make it look better.

You're being too moralistic about video games. I've played lots of "Gaming Evolved" titles; ain't that "promoting hardware" too? This discussion always goes nowhere and makes me sick; AMD would do the same if they could.
 
AMD has FreeSync monitors all over the place. That's at least one good platform-specific "side" feature they've done well.
 
Still, FreeSync only works within a very limited refresh-rate range on some of those monitors. But that's totally OT.
 
I have said this in another thread and I'll say it again: GameWorks should have no place in any decent game engine. Any developer who has good intentions and is serious about their games will develop and use their own in-house technologies that are hardware agnostic and simply work and look better.
 
Still, FreeSync only works within a very limited refresh-rate range on some of those monitors. But that's totally OT.

I'm not trying to be off topic. I'm just saying they managed one thing well. I agree with you in general. This is just sour grapes.

I have said this in another thread and I'll say it again: GameWorks should have no place in any decent game engine. Any developer who has good intentions and is serious about their games will develop and use their own in-house technologies that are hardware agnostic and simply work and look better.

Reinventing the wheel is a lot of work and money.
 
Reinventing the wheel is a lot of work and money.

Kind of a common misconception. It doesn't take that much time and effort to come up with a solution for a particular technical feature. Pretty much all breakthroughs came from independent game studios and not from the almighty multi-billion-dollar Nvidia corporation. And calling them "breakthroughs" is sort of an overstatement; most of the time it's just one or two programmers implementing a technique described in an academic paper from years ago. No one is really reinventing the wheel. And of course I am talking about big studios.
 
Kind of a common misconception. It doesn't take that much time and effort to come up with a solution for a particular technical feature. Pretty much all breakthroughs came from independent game studios and not from the almighty multi-billion-dollar Nvidia corporation. And calling them "breakthroughs" is sort of an overstatement; most of the time it's just one or two programmers implementing a technique described in an academic paper from years ago.

Either that, or they're already common in pro rendering applications. But bringing things to the average consumer is a feat in itself. And even if a single game developer came up with a framework, it doesn't mean others can use it. It might be in a publisher's hands, which could create an even worse situation than Nvidia or AMD.
 
And even if a single game developer came up with a framework, it doesn't mean others can use it. It might be in a publisher's hands, which could create an even worse situation than Nvidia or AMD.

But it simply doesn't act as an impediment to anyone. Like I said, 99% of the time it's just implementations of things that have already been described in great detail; there is no need to license this from others. Hell, some studios actually publish some of their work in detail. However, when someone like Nvidia turns this inherently transparent and open practice into something proprietary, I say that's a big no.
 
But it simply doesn't act as an impediment to anyone. Like I said, 99% of the time it's just implementations of things that have already been described in great detail; there is no need to license this from others. Hell, some studios actually publish some of their work in detail. However, when someone like Nvidia turns this inherently transparent and open practice into something proprietary, I say that's a big no.

I'd prefer some open standard too, but I can't fault them for being opportunists. This is capitalism. Games aren't subject to government-like regulation. They're not even a service that fits well under that kind of regulation (unlike, say, power and broadcasting).
 
I've played lots of "Gaming Evolved" titles; ain't that "promoting hardware" too?
That's branding, which is fine. What isn't fine is encouraging developers to use tech that damages the user experience on competitors' hardware. AMD would likely get a settlement out of NVIDIA if they pursued legal action for antitrust behavior: NVIDIA is undeniably creating an ecosystem that represents a barrier to entry for competitors. Thing is, AMD cards still sell very well despite NVIDIA's behavior, so AMD would struggle to make the argument that they were in fact damaged.
Games aren't subject to government-like regulation.
I'd argue this is exactly why AMD open-sources everything. If AMD did what NVIDIA has gotten away with, regulators would be all over them. NVIDIA gets a pass because it's not linked to Intel. Intel is a name everyone knows, and most people know of AMD by extension. NVIDIA is largely unknown outside of discrete graphics cards.
 
Let AMD win like a man. They may very well just do that. For heaven's sake, they practically dominate everything but PCs, and their fans still feel like they're "fighting the system" or something.

Let the gaming industry go completely over the heads of governments. Don't even hope for notice or scrutiny. This will be good in the long run, because it wouldn't end at just "regulating" graphics APIs (just the thought of this is goofy in itself, but it would get worse).
 
Oh, so you think it's perfectly fine that you basically have to run Windows to game on PC? Now imagine if you could only play games on AMD or NVIDIA graphics cards. This is pretty much how it was back in the 90s, before OpenGL and DirectX standardized graphics acceleration. It was especially bad with sound cards because there were a lot of vendors out there and no uniformity in how to talk to them. NVIDIA tried (and still is trying) to push the industry back in that direction. It's good for no one. Luckily, most developers got the hint and stopped blindly supporting NVIDIA middleware. There are still a lot of PhysX implementations out there, but the vast majority of developers use it strictly on the CPU, so there's no favoritism (and no subsequent bugs).
 
Oh, so you think it's perfectly fine that you basically have to run Windows to game on PC? Now imagine if you could only play games on AMD or NVIDIA graphics cards. This is pretty much how it was back in the 90s, before OpenGL and DirectX standardized graphics acceleration. It was especially bad with sound cards because there were a lot of vendors out there and no uniformity in how to talk to them. NVIDIA tried (and still is trying) to push the industry back in that direction. It's good for no one. Luckily, most developers got the hint and stopped blindly supporting NVIDIA middleware. There are still a lot of PhysX implementations out there, but the vast majority of developers use it strictly on the CPU, so there's no favoritism (and no subsequent bugs).

Regulation doesn't help things. Competition does. I'm only in support of things that do the latter. You could argue that Nvidia has created a non-competitive environment... but that's not what you're doing. It sounded more like legal action and some top-down approach instead... by a body of people who will more than likely not be in the mud and dirt, truly working in the industry or interacting with consumers, but playing it safe. And it'd be even worse if the government was an actual part of it. They don't know shit. It'd be even worse than the DRAM speed standards that only advocate 2133 MHz, and that's merely an informal standards body. Governments enforce things instead.

When states get involved, it's no better than the Soviets regulating the precise number of bristles on a toothbrush (yeah, they did that). They find an "ideal" and rarely deviate... even when better ideas come along. All the while, a competitive market across the globe created electric brushes, myriad variations in softness/hardness, patterns, etc.

edit: BTW, I use Windows because it's better. And I know Linux well enough... I've been tinkering with it since soon after Linus built the kernel (circa '94/'95), when complete distros were starting to come out. It's still a piece of shit for desktop computing. And the polished ones are nothing but a pretty house of cards.

It's why I went to OS X for a while. They made a better UNIX system (and so did NeXT before that), so much so that people don't even know it's UNIX. It had potential for gaming, but Apple is idiotic and slow about hardware (not to mention they killed the clones before OS X came out). It's their own fault Windows still won after they dropped the old Mac OS.
 
Competition only works when the market is healthy. Regulation works to make markets healthy. App stores (Steam, Google Play, Windows Store, Xbox Store, the Apple App Store, etc.) and APIs like DirectX, even though they're quite lacking in competition themselves, do create a market under them, which is why the regulation hammer doesn't drop on them. NVIDIA is attempting to create a middleware/developer/hardware ecosystem that shuts out competitors, and they did succeed to some extent. This is so off topic that I'll stop there.
 
Competition only works when the market is healthy. Regulation works to make markets healthy. This is so off topic that I'll stop there.

Fair enough. Sorry for the rant btw. The subject just unearthed a lot of random thoughts on this.

The problem is that regulation doesn't always make markets healthy. I'm all for creating the right environments. But sometimes, it's just siding with losers for the hell of it, without paying attention to what actually made them losers. Like they're only there because some "big bad meanie" made them that way. They're being rewarded for their own shitty decisions, while the giant got punished simply for playing the game better.
 
The only real gripe I have with GameWorks is that in games like The Witcher 3, PhysX cannot be offloaded to a separate card. I take that as maybe a CDPR decision, though.

Sometimes I see GW as a way of Nvidia saying "this is what we are doing with the extra oomph available." Not letting AMD optimize is a business decision. If they are intentionally hindering GCN, then that is indeed shady. But saying AMD should be allowed to optimize for GW would be like handing your battle opponent your war plans. It's a socialist idea, really.

No problem with GW as long as it is not intentionally hindering GCN. But running poorly because AMD has no access to the code is just good ol' capitalism.
 
AMD could run GameWorks tasks asynchronously, which GCN is fantastic at. Yes, it would make Maxwell cards and older look bad.
 
So guys, any chance we can return the Witcher 3 thread to Witcher 3? :rolleyes::p
 