
Is DirectX 12 Worth the Trouble?

If you calm down and stop ranting, I think the main focus should be on building game engines natively for D3D12 instead of D3D9 or 10/11. Unless you have, for example, Unreal Engine 5 or Frostbite 3 built with that in mind, all the games are going to suck on D3D12.

That's not how things work. For an engine, there is no "native" thing. You're confusing the game engine with the renderer (render path). They are not the same thing, but they do work together to output the image you see in the end. There may be things that explicitly depend on engine support and are hard to retrofit later, but the majority aren't.
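The engine/render-path split described above can be sketched as a simple abstraction layer. This is a toy illustration only; all class and method names are hypothetical and don't correspond to any real engine's API:

```python
# Toy sketch: a game engine delegating to a swappable render path.
# The engine codes against an interface; D3D11/D3D12 are just backends.

class RenderPath:
    """Interface the engine talks to; graphics APIs plug in behind it."""
    def draw(self, scene):
        raise NotImplementedError

class D3D11Path(RenderPath):
    def draw(self, scene):
        return f"D3D11 frame: {len(scene)} objects"

class D3D12Path(RenderPath):
    def draw(self, scene):
        return f"D3D12 frame: {len(scene)} objects"

class Engine:
    """Gameplay, physics, and assets live here, independent of the API."""
    def __init__(self, render_path: RenderPath):
        self.render_path = render_path

    def render_frame(self, scene):
        return self.render_path.draw(scene)

# Swapping the render path does not change the engine itself:
engine = Engine(D3D12Path())
print(engine.render_frame(["player", "terrain"]))  # D3D12 frame: 2 objects
```

In this shape, "adding a DX12 render path" means writing a new backend, not rebuilding the engine, which is the point being made above.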
 
And if you really want to say Nvidia can't do DX12, what are the fastest 2-3 single gfx cards right now?
Being the fastest or most powerful does not equate to good DX12 performance.
It's like building a bigger V8 to compete with high-tech turbo fours: they may win a race, but they brute-force it, with no refinement.
 
Being the fastest or most powerful does not equate to good DX12 performance.
It's like building a bigger V8 to compete with high-tech turbo fours: they may win a race, but they brute-force it, with no refinement.
Well, the bigger V8 will most likely outlast that high tech turbo, so there's that :D
 
DX12 for Deus Ex: Mankind Divided made a pretty huge difference in performance and frame stuttering. I could actually crank the graphics options even higher than on the DX11 render path and not lose any frame rate at all. The framerate itself was also smoother.

I've been impressed with my limited experience with DX12 games so far.
 
DX12 for Deus Ex: Mankind Divided made a pretty huge difference in performance and frame stuttering. I could actually crank the graphics options even higher than on the DX11 render path and not lose any frame rate at all. The framerate itself was also smoother.

I've been impressed with my limited experience with DX12 games so far.

Same here, huge benefits from DX12 on my old entertainment system; it was like making a 1055T suddenly live a year longer.

The Xeon system doesn't have a decent graphics card in it, and it really, really hates having one put in it too (blue screens), so it's a CPU brute with 10 cores / 20 threads.

Hence my 1055T gaming system, which runs everything I want up to about this point... somehow... it's really crap, but it still games...
 
Fury X actually beats the 980 Ti AND the 1070 in the linked benchmark (scroll down to DirectX 12).

Read my post. I clearly say a stock 980 Ti is a limp fish, but most overclock to ~20% gains. The Gaming X 980 Ti is faster than Fury X. I'm going by my resolution of 1440p.
 
For the amount of money developers are charging for games, they have a lot of balls to complain that it's "too much work". I am not content with complacency.

Seems more like the industry is trying to hamstring advancement. In the name of profits. Again. IMHO
 
I don't know, shouldn't we wait at least until Shader Model 6.0 is released (isn't it supposed to be part of DirectX 12?) before we or the devs bury DX12?
 
Nixxes is a tool.
 
For the amount of money developers are charging for games, they have a lot of balls to complain that it's "too much work". I am not content with complacency.

Seems more like the industry is trying to hamstring advancement. In the name of profits. Again. IMHO
Try to minimize the effort and maximize profits, isn't this what any industry is about?

What's missing from the picture is how much extra effort is needed to get those improvements from DX12. Because if you're spending 10% extra time to get 10% better performance, that's one thing; if you're spending 100% extra time to get 10% better performance, that's another. I'm a developer and I have an idea, but I won't speculate.
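The effort-versus-gain trade-off above can be captured in one line of arithmetic. The numbers and the break-even threshold here are made up purely to illustrate the point:

```python
# Toy cost/benefit check for adopting a new render path (made-up numbers).
def worth_it(extra_dev_time_pct, perf_gain_pct, threshold=1.0):
    """Rough rule of thumb: performance gained per unit of extra effort."""
    ratio = perf_gain_pct / extra_dev_time_pct
    return ratio, ratio >= threshold

print(worth_it(10, 10))   # (1.0, True)  -- 10% extra time for 10% more speed
print(worth_it(100, 10))  # (0.1, False) -- 100% extra time for 10% more speed
```

The whole debate in this thread is essentially an argument about which of these two ratios DX12 actually delivers.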
 
Try to minimize the effort and maximize profits, isn't this what any industry is about?

What's missing from the picture is how much extra effort is needed to get those improvements from DX12. Because if you're spending 10% extra time to get 10% better performance, that's one thing; if you're spending 100% extra time to get 10% better performance, that's another. I'm a developer and I have an idea, but I won't speculate.

If that's what satisfies the market, yes. Consumers have the power. NOT the corporations. If a pile of poop is what it takes to make you happy, that is all you're going to get.

TBH I think the fact that the smaller player is pushing for these advancements while the larger player is ignoring/dismissing them is quite telling. Let the hardware guys focus/work on hardware and let the software guys make the absolute best use of it. We all win. :)
 
What's missing from the picture is how much extra effort is needed to get those improvements from DX12. Because if you're spending 10% extra time to get 10% better performance, that's one thing; if you're spending 100% extra time to get 10% better performance, that's another.
It's more like you spend 50% additional time (over the time already spent on the DX11 engine you have) just to not be worse, i.e. to reach the same performance.

Ubisoft had a (highly technical) presentation on porting the Anvil engine to DX12. In their first internal versions, where they did a simple port from DX11 to DX12 without touching any features, they got 200% worse frametimes.
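One common explanation for results like that: under DX11 the driver tracks resource hazards for you, while under DX12 the application must place barriers itself, and a 1:1 port tends to over-synchronize. The numbers and the stall model below are invented for illustration, not taken from the Ubisoft presentation:

```python
# Toy model of why a naive DX11->DX12 port can run slower (illustrative only).
# Each unnecessary synchronization barrier serializes GPU work and adds a stall.

def frame_time(base_ms, barriers, stall_ms_per_barrier=0.5):
    """Frame time grows with every barrier that forces the GPU to idle."""
    return base_ms + barriers * stall_ms_per_barrier

dx11 = frame_time(16.0, barriers=0)          # driver-managed, well-tuned path
naive_dx12 = frame_time(16.0, barriers=64)   # a barrier after every draw
tuned_dx12 = frame_time(14.0, barriers=8)    # batched barriers, less CPU overhead

print(dx11, naive_dx12, tuned_dx12)  # 16.0 48.0 18.0
```

In this toy model the naive port is 3x slower than DX11, roughly the "200% worse frametimes" figure, while the tuned version only pulls ahead after real engine work.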
 
Rise of the Tomb Raider is probably the best-optimized DX12 title out there, so when the developer says DX12 is not worth it, it's probably true. I would rather play a game when it launches than three months later for 10% better performance; maybe for gamers on a budget who wait for a Steam sale, DX12 is rather intriguing.
 
It's more like you spend 50% additional time (over the time already spent on the DX11 engine you have) just to not be worse, i.e. to reach the same performance.

Ubisoft had a (highly technical) presentation on porting the Anvil engine to DX12. In their first internal versions, where they did a simple port from DX11 to DX12 without touching any features, they got 200% worse frametimes.
:eek: Wow, that's not encouraging.
 
Developers sure like to kick the sheets around when it comes to this stuff. I don't know why they're still pouting about not having super-fine control over the hardware when they know they're going to dump a lot on NVIDIA and AMD to fix in the drivers anyway.

Consumers complain about ports.

Now porting studios complain about porting from DX11 to DX12.

Stop porting and make DX12 the standard.
Add in Vulkan support alongside DX12 and everyone wins.
 
Well, the bigger V8 will most likely outlast that high tech turbo, so there's that :D

No, they usually don't. The high-tech turbo will outlast big V8s at the track; besides being lighter, there are fewer moving parts.

It's why heavy industry doesn't use V8 engines and usually prefers inline-six engines with one or two turbos.

Developers are whiny about DX12 because it IS more complicated to code for, and it's a much tougher barrier to entry vs. system-managed but less robust code.

The nice thing is, good and great developers will use it to its fullest.

Lastly, DX11 was so "unused" that GTA5 never became the performance optimization wonder that it is... it's just the difference between a developer wanting to do a great job vs. slop.
 
But that's exactly the thing developers are complaining about: a lot of work with results that only show if you have a rather new AMD card and in some cases also an older CPU.

Do I really need to point you to steam statistics? 46% are still running dual cores. I imagine MANY of the quads are older, too.

Devs are lazy assholes. They don't care about the majority (and likely the vast majority) of users. They want to hide their shitty work behind new hardware. We see it with nearly every game release now.
 
Do I really need to point you to steam statistics? 46% are still running dual cores. I imagine MANY of the quads are older, too.

Devs are lazy assholes. They don't care about the majority (and likely the vast majority) of users. They want to hide their shitty work behind new hardware. We see it with nearly every game release now.
Actually, the vast majority of users is the only thing devs care about, because that's where sales come from.

And yes, 46% of users may be running dual cores, but just below half of them have both a DX12 GPU and run Win10. And in Steam's view, a 750 Ti or HD 7900 qualifies as a DX12 card. I'm not disputing that people use older systems, but those are not the people playing AAA games, which is where you expect DX12 to make its entrance. Whatever is out there isn't enough to be worth the effort; that's what the article says. And while I know devs are lazy, in this case I don't think they're wrong.
 
Actually, the vast majority of users is the only thing devs care about, because that's where sales come from.

And yes, 46% of users may be running dual cores, but just below half of them have both a DX12 GPU and run Win10. And in Steam's view, a 750 Ti or HD 7900 qualifies as a DX12 card. I'm not disputing that people use older systems, but those are not the people playing AAA games, which is where you expect DX12 to make its entrance. Whatever is out there isn't enough to be worth the effort; that's what the article says. And while I know devs are lazy, in this case I don't think they're wrong.

How can they play AAA games when a lot are mediocre-looking but require high-end hardware to run? It's a self-fulfilling prophecy.

And it's not like only AAA titles matter, b/c they're total shit on the whole. I don't play them, either.
 
I can see where Vulkan and GPUOpen can really shine in this situation, with pre-canned scripts that do what you need: just modify the code to your needs. If developers used something like that, I feel they could waste less time and make their timelines. The only problem is that right now a lot of developers will have to build code from scratch until the repositories start to fill.
 
How can they play AAA games when a lot are mediocre-looking but require high-end hardware to run? It's a self-fulfilling prophecy.

And it's not like only AAA titles matter, b/c they're total shit on the whole. I don't play them, either.

Doesn't that just prove his point? There is no market for shit games, and if there is one, it evaporates shortly after release (No Man's Sky, anyone?). And you can polish a turd, but it's still a turd.

What Nixxes is saying (not a person, it's a company, but OK) is that the adoption rate of DX12, alongside its limited advantage on the CURRENTLY dominant hardware among users, is not worth the effort. And that is just about the same as what most of the sane people on this forum said when AMD's Mantle vanished and all eyes turned to DX12 as the new miracle API.

It's the same as with all economics: if there is no good business case, nobody does the business. And for DX12 this is very true for the vast majority of games, because most games, especially triple-A console ports, are hardly CPU-intensive (otherwise the consoles couldn't run them). You don't need the majority of the theoretical performance advantage you get with DX12, because that advantage mostly reflects on CPU load, not GPU.
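The CPU-versus-GPU bottleneck point above can be modeled in a few lines. A frame is gated by the slower of the two processors, so cutting CPU overhead (DX12's main win) only helps when the CPU is actually the bottleneck. All numbers here are invented for illustration:

```python
# Toy model: frame time is gated by the slower of CPU and GPU work.
def frame_ms(cpu_ms, gpu_ms):
    """Simplification: CPU and GPU fully overlap; the slower one wins."""
    return max(cpu_ms, gpu_ms)

# GPU-bound console port: halving CPU cost changes nothing.
print(frame_ms(cpu_ms=8, gpu_ms=16))   # 16
print(frame_ms(cpu_ms=4, gpu_ms=16))   # 16

# CPU-bound scene (lots of draw calls): the same cut helps a lot.
print(frame_ms(cpu_ms=20, gpu_ms=12))  # 20
print(frame_ms(cpu_ms=10, gpu_ms=12))  # 12
```

This is why DX12 gains show up mainly in draw-call-heavy, CPU-limited scenarios and barely at all in GPU-limited console ports.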
 