
The secret of Doom (2016) performance on AMD GPUs

That's as stupid as saying S3 Metal was just Glide. The core idea was the same; the actual thing wasn't. It's the same here. Mantle is NOT Vulkan.

Even Khronos' own PR pamphlet says that:



Foundation means the core idea: how it operated, aka close to the metal. It doesn't mean this IS Mantle, because it isn't. If it was, it wouldn't work on NVIDIA. Obviously.

I mean, don't you find it peculiar how Vulkan and DX12 both appeared all of a sudden basically out of nowhere when AMD showcased practical use of Mantle in real games? It's because they took Mantle's core idea and added it to OpenGL and D3D11.

EDIT:
Corrected D3D12 to D3D11.

Here's how it has to be whether you want to piss in the wind or not.

Factoid One: Vulkan is NOT Mantle - we agree
Factoid Two: Vulkan exists because of Mantle. Read the quotes.

So, as far as they are not the same thing - which I don't think @arbiter was trying to say - Vulkan was created from scratch using Mantle's best components.

So you are both right and wrong. Vulkan uses the best parts of Mantle but it is not Mantle with a different badge. Agreed?

Which isn't really a home console, but a handheld, and will likely get next to no 3rd-party support (again, lol). Nintendo knows how to play that though; in handhelds they've reigned supreme for, whahaha, nearly 30 years. (RIP Vita... :()

I'm not a portable console person myself. I read that the handheld part will also have a home dock? I think it's possibly also going to be a Pascal Tegra chip.
 
Factoid One: Vulkan is NOT Mantle - we agree
Factoid Two: Vulkan exists because of Mantle. Read the quotes.

Factoid Three: So does D3D12, and it has nothing to do with Mantle apart from the idea itself.
 
AMD's drivers are still miles behind; they seem to have way more CPU overhead when you are CPU-limited.
Why is everyone so sure about it?
Couldn't it simply be, well, caused by hardware architecture differences?
 
Factoid Three: So does D3D12, and it has nothing to do with Mantle apart from the idea itself.

Dude - you're impossible. I give up, discussing things with you is a total waste of time.

Why is everyone so sure about it?
Couldn't it simply be, well, caused by hardware architecture differences?

It's a bit of both. It takes a lot longer for AMD to optimise the drivers for DX11 to suit the game code, due to their hardware. This is why AMD gets better over time - because they learn better optimisations for titles. DX12 and Vulkan are so low level that far less coding is required on the driver side, as the GCN extensions can be used in dev code and it just runs so much better than in DX11. So it's not so much that the DX11 drivers are bad, just not initially optimised to work the code around GCN's hardware.
Which is why it will now flip for Pascal, so that Nvidia will have to work harder on their drivers to better use their extensions for Pascal in DX12 and Vulkan. Although if the developers code appropriately, Nvidia will run near optimal anyway and won't see much more improvement.
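
For anyone curious what "GCN extensions used in dev code" can look like on the Vulkan side, here's a minimal sketch: the app checks whether an AMD-specific device extension is exposed and only enables it then, at device creation. VK_AMD_gcn_shader is used purely as an example extension name, and the helper/parameter names are made up for illustration; this is not a claim about how id Software actually did it.

```cpp
#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Illustrative only: check for an AMD/GCN-specific extension on this GPU.
bool HasDeviceExtension(VkPhysicalDevice gpu, const char* name)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> props(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, props.data());
    for (const auto& p : props)
        if (std::strcmp(p.extensionName, name) == 0)
            return true;
    return false;
}

// Enable the vendor-specific extension only when it's actually there.
VkResult CreateDeviceWithGcnPath(VkPhysicalDevice gpu,
                                 const VkDeviceQueueCreateInfo* queueInfo,
                                 VkDevice* outDevice)
{
    std::vector<const char*> exts = { VK_KHR_SWAPCHAIN_EXTENSION_NAME };
    if (HasDeviceExtension(gpu, "VK_AMD_gcn_shader"))   // vendor-specific fast path
        exts.push_back("VK_AMD_gcn_shader");

    VkDeviceCreateInfo info = {};
    info.sType = VK_STRUCTURE_TYPE_DEVICE_CREATE_INFO;
    info.queueCreateInfoCount = 1;
    info.pQueueCreateInfos = queueInfo;
    info.enabledExtensionCount = static_cast<uint32_t>(exts.size());
    info.ppEnabledExtensionNames = exts.data();
    return vkCreateDevice(gpu, &info, nullptr, outDevice);
}
```

The same pattern works for NV-prefixed extensions on the green side; the difference is simply which vendor shipped the extension.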
 
Calling me "impossible" is the strongest argument you've got? XD Ok then.

What they've done is take Mantle's ideas and use them on OpenGL, while also cleaning it up to create Vulkan. That's what it is. That's how it has always been. Because if you look closely, Vulkan still has the same modular structure as OpenGL. It's essentially OpenGL on steroids, named Vulkan.
 
Calling me "impossible" is the strongest argument you've got? XD Ok then

It's not an argument - it's a statement. And there is no discussion with you.
 
Proprietary, but not closed. Unlike, cough...
Well, if it wasn't closed, when was it ever open-sourced before being turned over to the Khronos Group? The answer is never. They said they would release it. That day came and went, a month went by with no source release, 6 months went by, nothing, a year went by and they announced they were canning it and handing it over to Khronos. It was closed, as they never released it open source like they said they would.

Why is everyone so sure about it?
Couldn't it simply be, well, caused by hardware architecture differences?

Don't remember where it was posted, but there were stories about it some years ago, about machines with AMD GPUs having higher loads.
 
Well, if it wasn't closed, when was it ever open-sourced before being turned over to the Khronos Group? The answer is never. They said they would release it. That day came and went, a month went by with no source release, 6 months went by, nothing, a year went by and they announced they were canning it and handing it over to Khronos. It was closed, as they never released it open source like they said they would.
You mean the 435-page programming guide isn't enough? :p
 
That is just a guide, which is far from the source SDK they promised and never delivered.
You're really bad at identifying sarcasm, aren't you?
 
Lol @ all the "fan boy" statements around here... I mean come on people.. It's a freaking graphics card.. Get over it..
 
Dude - you're impossible. I give up, discussing things with you is a total waste of time.



It's a bit of both. It takes a lot longer for AMD to optimise the drivers for DX11 to suit the game code, due to their hardware. This is why AMD gets better over time - because they learn better optimisations for titles. DX12 and Vulkan are so low level that far less coding is required on the driver side, as the GCN extensions can be used in dev code and it just runs so much better than in DX11. So it's not so much that the DX11 drivers are bad, just not initially optimised to work the code around GCN's hardware.
Which is why it will now flip for Pascal, so that Nvidia will have to work harder on their drivers to better use their extensions for Pascal in DX12 and Vulkan. Although if the developers code appropriately, Nvidia will run near optimal anyway and won't see much more improvement.
From
http://www.bit-tech.net/news/hardware/2008/10/22/nvidia-gpus-support-dx10-1-features-in-far-cry-2/1

"The Ubisoft team wanted to enhance the anti-aliasing through the reading of the multisampled depth Z-buffers,explained Vincent Greco, Worldwide Production Technical Coordinator at Ubisoft. "This feature was enabled by either using DX10.1 or using a DX10.0 extension supported by Nvidia DirectX 10 GPUs."



The above example was from 2008. It shows NVIDIA's (or its agents') ability to embed a specific NVIDIA hardware code path, while AMD had to address it via drivers, i.e. a complex JIT re-compiler and post-release game profiling.


It was only recently (i.e. around this game console generation) that AMD woke up to the fact that NVIDIA wasn't playing by the same rules. ATI/AMD was too dumb to realize it earlier.


NVAPI's details were under NDA, i.e. they weren't transparent to the general public. The NVAPI details were hidden under NVIDIA's "The Way It's Meant To Be Played" program.

It's a no-brainer why DX12 has little benefit for NVIDIA GPUs.
 
That's as stupid as saying S3 Metal was just Glide. The core idea was the same; the actual thing wasn't. It's the same here. Mantle is NOT Vulkan.

Even Khronos' own PR pamphlet says that:



Foundation means the core idea: how it operated, aka close to the metal. It doesn't mean this IS Mantle, because it isn't. If it was, it wouldn't work on NVIDIA. Obviously.

I mean, don't you find it peculiar how Vulkan and DX12 both appeared all of a sudden basically out of nowhere when AMD showcased practical use of Mantle in real games? It's because they took Mantle's core idea and added it to OpenGL and D3D11.

EDIT:
Corrected D3D12 to D3D11.



[attached image: side-by-side list of Mantle ("gr") and Vulkan ("vk") function names]
 
Holy shit, you've just listed all the instructions used by ANY API, probably going as far back as 3dfx Glide. That's like having a glass of water and a bottled water and then creating a 6-hour-long presentation on how both waters are wet...

EDIT:
Btw, can you please also list OpenGL instructions side by side? I bet they'll all be exactly the same, with a "gl" prefix instead of "vk" and "gr". That would be embarrassing...
 
@rvalencia My god, any instruction starting with "vk" is universal (Vulkan, like the "gl", "arb" or "ext" ones in OpenGL, maybe?); even a Haswell IGP can run them. An "nv" or "amd" one would be vendor-specific, and it is the developer's decision to implement it.

You seem to have a 290X; download GPU Caps and look at the OpenGL and Vulkan extensions. See all those AMD/ATI ones? Those are what you call a "specific AMD hardware codepath"; Nvidia doesn't have them, and they are not just one or two.
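
Roughly what a tool like GPU Caps is doing for the Vulkan side, sketched out: enumerate the device extensions and flag the vendor-specific ones by their VK_AMD_ / VK_NV_ prefix. This assumes a VkPhysicalDevice has already been selected; error handling and the experimental NVX/AMDX prefixes are skipped.

```cpp
#include <vulkan/vulkan.h>
#include <cstdio>
#include <cstring>
#include <vector>

// List a device's Vulkan extensions and mark the vendor-specific ones by prefix.
void ListVendorExtensions(VkPhysicalDevice gpu)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    for (const auto& e : exts) {
        const bool amd = std::strncmp(e.extensionName, "VK_AMD_", 7) == 0;
        const bool nv  = std::strncmp(e.extensionName, "VK_NV_", 6) == 0;
        std::printf("%-50s %s\n", e.extensionName,
                    amd ? "[AMD-specific]" : nv ? "[NV-specific]" : "[cross-vendor]");
    }
}
```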
 
Take 2.

NV has secretly been having its hardware supported directly by (some) games instead of through drivers.

AMD notices this and launches a long-term program, which they somewhat fail at, and decide to offload it, with the added bonus that it will be implemented in more games automatically instead of bribing, sorry, convincing, game devs to use that tech (NV).

summary:
NV > "secret tech" > game devs
AMD > fails to launch "secret tech" > offloads to 3rd party (no longer closed?) > game devs

Title of this topic should be: "Dirty secret of NV and how AMD failed to respond thus giving theirs for free out of spite"
 
How do you consider Mantle a "fail" when it was the catalyst for DX12 and Vulkan (upgraded OpenGL, basically)? Mantle was just picking up in high-profile games when MS announced they'd done the same in DX12, making Mantle redundant. Why would AMD keep insisting on forcing their API and wasting resources there when it's not needed anymore? And because they've been designing GPUs around that for ages, of course theirs benefit the most from it. They decided to give it away to the open-source community, and Vulkan was born. Which is essentially still OpenGL, just better and with a different name.
 
Take 2.

NV has secretly been having its hardware supported directly by (some) games instead of through drivers.

AMD notices this and launches a long-term program, which they somewhat fail at, and decide to offload it, with the added bonus that it will be implemented in more games automatically instead of bribing, sorry, convincing, game devs to use that tech (NV).

summary:
NV > "secret tech" > game devs
AMD > fails to launch "secret tech" > offloads to 3rd party (no longer closed?) > game devs

Title of this topic should be: "Dirty secret of NV and how AMD failed to respond thus giving theirs for free out of spite"

Odd that even in AMD games during DX11, Nvidia still led the performance. So, in that regard, your tinfoil-hat theory falls down. If BioShock (Infinite?), for example, is an AMD-sponsored title, it was one of Nvidia's better-performing games.

By the way, how does a high-level API directly use hardware without drivers? I think what you misread is that NV code extensions were easily used to address NV hardware. That's called development and drivers. AMD simply had hardware that was a lot harder to use in certain games. Ironically, IIRC, Crysis (Crytek) seemed to use AMD hardware quite well, tessellation conspiracy/issue aside.
 
What's interesting is AMD made the API open for NV to use, but NV refused to use it, whereas NV refuses to share their APIs. Honestly, open APIs allow everyone to win, no matter Red or Green.
 
@rvalencia My god, any instruction starting with "vk" is universal (Vulkan, like the "gl", "arb" or "ext" ones in OpenGL, maybe?); even a Haswell IGP can run them. An "nv" or "amd" one would be vendor-specific, and it is the developer's decision to implement it.

You seem to have a 290X; download GPU Caps and look at the OpenGL and Vulkan extensions. See all those AMD/ATI ones? Those are what you call a "specific AMD hardware codepath"; Nvidia doesn't have them, and they are not just one or two.
Nvidia's Vulkan has its own extensions, e.g.
GL_NV_draw_vulkan_image
VK_NV_glsl_shader <---------------------- focus on this.


https://developer.nvidia.com/unlocking-gpu-intrinsics-hlsl

None of the intrinsics are possible in standard DirectX or OpenGL. But they have been supported and well-documented in CUDA for years. A mechanism to support them in DirectX has been available for a while but not widely documented. I happen to have an old NVAPI version 343 on my system from October 2014 and the intrinsics are supported in DirectX by that version and probably earlier versions. This blog explains the mechanism for using them in DirectX.

Unlike OpenGL or Vulkan, DirectX unfortunately doesn't have a native mechanism for vendor-specific extensions. But there is still a way to make all this functionality available in DirectX 11 or 12 through custom intrinsics. That mechanism is implemented in our graphics driver and accessible through the NVAPI library.
-------
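
For reference, a rough sketch of the NVAPI mechanism described in the quote above, following the linked NVIDIA blog: the application reserves a fake UAV slot so the driver can intercept the custom intrinsics used by the HLSL code. Treat the slot number and the HLSL header name as illustrative; anyone relying on this should check the actual NVAPI headers and the blog post itself.

```cpp
#include <d3d11.h>
#include "nvapi.h"   // NVAPI SDK header; assumes the SDK is available

// Sketch: opt a D3D11 device into NVIDIA's HLSL shader intrinsics via NVAPI.
bool EnableNvIntrinsics(ID3D11Device* device)
{
    if (NvAPI_Initialize() != NVAPI_OK)
        return false;   // not an NVIDIA driver, or NVAPI unavailable

    // Reserve a fake UAV slot (u7 here) that the driver intercepts; the HLSL side then
    // does: #define NV_SHADER_EXTN_SLOT u7  and  #include "nvHLSLExtns.h"
    return NvAPI_D3D11_SetNvShaderExtnSlot(device, 7) == NVAPI_OK;
}
```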

http://www.bit-tech.net/news/hardware/2008/10/22/nvidia-gpus-support-dx10-1-features-in-far-cry-2/1

"The Ubisoft team wanted to enhance the anti-aliasing through the reading of the multisampled depth Z-buffers, explained Vincent Greco, Worldwide Production Technical Coordinator at Ubisoft. "This feature was enabled by either using DX10.1 or using a DX10.0 extension supported by Nvidia DirectX 10 GPUs."

The above example was from 2008. It shows NVIDIA's (or its agents') ability to embed a specific NVIDIA hardware code path, while AMD had to address it via drivers, i.e. a complex JIT re-compiler and post-release game profiling.

It was only recently (i.e. around this game console generation) that AMD woke up to the fact that NVIDIA wasn't playing by the same rules. ATI/AMD was too dumb to realize it earlier.

It's a no-brainer why DX12 has little benefit for NVIDIA GPUs.



Further information on NVIDIA's shader intrinsics from https://developer.nvidia.com/reading-between-threads-shader-intrinsics



Odd that even in AMD games during DX11, Nvidia still led the performance. So, in that regard, your tinfoil-hat theory falls down. If BioShock (Infinite?), for example, is an AMD-sponsored title, it was one of Nvidia's better-performing games.

By the way, how does a high-level API directly use hardware without drivers? I think what you misread is that NV code extensions were easily used to address NV hardware. That's called development and drivers. AMD simply had hardware that was a lot harder to use in certain games. Ironically, IIRC, Crysis (Crytek) seemed to use AMD hardware quite well, tessellation conspiracy/issue aside.
An AMD-sponsored title could have focused on good, effective draw-call batching to reduce AMD's DX11 driver draw-call overhead.
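
A minimal illustration of the kind of draw-call batching meant here, in D3D11 terms: the same mesh drawn N times as one instanced call instead of N separate calls. Buffer, shader and per-instance data setup are omitted; this is a sketch, not a claim about how any particular title did it.

```cpp
#include <d3d11.h>

// Naive: one draw call per object, so N trips through the driver.
void DrawNaive(ID3D11DeviceContext* ctx, UINT indexCount, UINT objectCount)
{
    for (UINT i = 0; i < objectCount; ++i)
        ctx->DrawIndexed(indexCount, 0, 0);   // per-object constants would be set here too
}

// Batched: the same geometry drawn objectCount times with a single call;
// per-object data comes from an instance buffer instead of per-draw state changes.
void DrawBatched(ID3D11DeviceContext* ctx, UINT indexCount, UINT objectCount)
{
    ctx->DrawIndexedInstanced(indexCount, objectCount, 0, 0, 0);
}
```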



https://developer.nvidia.com/dx12-dos-and-donts

On DX11 the driver does farm off asynchronous tasks to driver worker threads where possible - NVIDIA



NVIDIA's DX11 drivers have some key DX12-like speed-up methods, e.g. async tasks and multithreading. This information wasn't public knowledge prior to DX12.


Under DX12, AMD gains async compute (hardware-accelerated via ACE units) and multithreading.
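
As a rough picture of the DX11-side multithreading being referred to: D3D11 already exposes deferred contexts, where a worker thread records a command list that the immediate context replays later. Whether the driver then parallelises anything internally is entirely up to the vendor, which is the point of the quote above. Error handling is mostly omitted; this is a sketch only.

```cpp
#include <d3d11.h>

// Worker thread: record commands into a deferred context.
ID3D11CommandList* RecordOnWorker(ID3D11Device* device)
{
    ID3D11DeviceContext* deferred = nullptr;
    if (FAILED(device->CreateDeferredContext(0, &deferred)))
        return nullptr;

    // ... record this thread's draw calls / state changes on 'deferred' here ...

    ID3D11CommandList* cmdList = nullptr;
    deferred->FinishCommandList(FALSE, &cmdList);   // close the recording
    deferred->Release();
    return cmdList;
}

// Main thread: replay the recorded commands on the immediate context.
void SubmitOnMain(ID3D11DeviceContext* immediate, ID3D11CommandList* cmdList)
{
    immediate->ExecuteCommandList(cmdList, FALSE);
    cmdList->Release();
}
```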
 
VK_NV_glsl_shader is for using existing GLSL shaders, the ones used in OpenGL (they are compiled at runtime), on Vulkan, so you don't have to port them to SPIR-V (the new universal method, mostly precompiled). It's purely for easing the porting work; it even makes things run slower.
Good point on the Far Cry 2 example, but you have to remember Nvidia refused to implement DX10.1; they had to do an implementation or they would have looked slower/older than the competition.

OpenGL has had vendor-specific extensions for decades, not just since the recent console generation.
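
To make the VK_NV_glsl_shader point above concrete, a minimal sketch of the porting convenience being described: with that extension enabled at device creation, vkCreateShaderModule accepts raw GLSL text where SPIR-V words would normally go. The shader string and function name are placeholders; illustrative only.

```cpp
#include <vulkan/vulkan.h>
#include <cstring>

// Requires a VkDevice created with "VK_NV_glsl_shader" in ppEnabledExtensionNames.
VkShaderModule CreateGlslModule(VkDevice device)
{
    // Placeholder GLSL vertex shader; normally this would be an existing OpenGL shader.
    static const char* glslSource =
        "#version 450\n"
        "void main() { gl_Position = vec4(0.0, 0.0, 0.0, 1.0); }\n";

    VkShaderModuleCreateInfo info = {};
    info.sType    = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
    info.codeSize = std::strlen(glslSource);                 // GLSL text length, not SPIR-V words
    info.pCode    = reinterpret_cast<const uint32_t*>(glslSource);

    VkShaderModule module = VK_NULL_HANDLE;
    vkCreateShaderModule(device, &info, nullptr, &module);   // driver compiles the GLSL at runtime
    return module;
}
```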
 
So does, let's say, a 6700K benefit more, or does a 6800K? I couldn't find a Doom Vulkan benchmark comparing the two processors.
 
Mmm, good question; that depends on how IdTech implemented multi-threading in the engine.
Vulkan, as is, doesn't use multiple cores, but it is capable of it, so it has to be implemented. If the IdTech engine supports 8 threads in Direct3D, it supports them in Vulkan as well.
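
A hedged sketch of what "has to be implemented" means in practice: each worker thread records its own command buffer from its own VkCommandPool (pools can't safely be shared across threads), and a single thread submits the results. Queue/render-pass setup is omitted and the function names are made up for illustration; an engine has to be structured this way on purpose, nothing is automatic.

```cpp
#include <vulkan/vulkan.h>
#include <thread>
#include <vector>

// Each worker thread records into its own pool/command buffer.
void RecordWork(VkDevice device, uint32_t queueFamily, VkCommandBuffer* out)
{
    VkCommandPoolCreateInfo poolInfo = {};
    poolInfo.sType = VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO;
    poolInfo.queueFamilyIndex = queueFamily;
    VkCommandPool pool = VK_NULL_HANDLE;
    vkCreateCommandPool(device, &poolInfo, nullptr, &pool);

    VkCommandBufferAllocateInfo allocInfo = {};
    allocInfo.sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_ALLOCATE_INFO;
    allocInfo.commandPool = pool;
    allocInfo.level = VK_COMMAND_BUFFER_LEVEL_PRIMARY;
    allocInfo.commandBufferCount = 1;
    vkAllocateCommandBuffers(device, &allocInfo, out);

    VkCommandBufferBeginInfo begin = {};
    begin.sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO;
    vkBeginCommandBuffer(*out, &begin);
    // ... record this thread's share of the frame's draw calls ...
    vkEndCommandBuffer(*out);
    // (the pool is leaked in this sketch; a real engine keeps and reuses per-thread pools)
}

void RecordFrameInParallel(VkDevice device, uint32_t queueFamily, VkQueue queue, unsigned threads)
{
    std::vector<VkCommandBuffer> buffers(threads);
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < threads; ++i)
        workers.emplace_back(RecordWork, device, queueFamily, &buffers[i]);
    for (auto& t : workers) t.join();

    VkSubmitInfo submit = {};
    submit.sType = VK_STRUCTURE_TYPE_SUBMIT_INFO;
    submit.commandBufferCount = threads;
    submit.pCommandBuffers = buffers.data();
    vkQueueSubmit(queue, 1, &submit, VK_NULL_HANDLE);   // submission stays on one thread
}
```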
 