Wednesday, October 30th 2019

Microsoft Details DirectX Raytracing Tier 1.1, New DirectX 12 Features

Microsoft detailed feature additions to the DirectX 12 3D graphics API, and an expansion of its DirectX Raytracing (DXR) API to Tier 1.1. The updated APIs will be included with the Windows 10 major update scheduled for the first half of 2020; the features are already accessible to developers in Windows Insider preview builds. DXR 1.1 is the first major update to the API since its Q4-2018 launch and adds three major features. To begin with, it brings support for adding extra shaders to an existing raytracing PSO (pipeline state object), making dynamic PSO additions more efficient. Next up is ExecuteIndirect support for raytracing, described by Microsoft as "enabling adaptive algorithms where the number of rays is decided on the GPU execution timeline." This could be a hint of what to expect from NVIDIA's next-generation GPUs, which are expected next year. Lastly, the API introduces support for Inline Raytracing, which gives developers more control over ray traversal and scheduling.
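All three additions sit behind a new raytracing tier, so an application has to check for it at startup. Below is a minimal sketch (not from Microsoft's post, and it assumes a Windows SDK recent enough to define the Tier 1.1 enum) of how a D3D12 app could gate the new code paths:

#include <d3d12.h>

// Returns true when the device reports DXR Tier 1.1, which covers
// AddToStateObject (adding shaders to an existing raytracing state object),
// indirect DispatchRays via ExecuteIndirect, and inline raytracing (RayQuery).
bool SupportsDxrTier11(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1;
}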

Over in the main DirectX 12 API, Microsoft is introducing support for Mesh Shaders, which brings systemic changes to the graphics pipeline. "Mesh shaders and amplification shaders are the next generation of GPU geometry processing capability, replacing the current input assembler, vertex shader, hull shader, tessellator, domain shader, and geometry shader stages," writes Microsoft in its blog post. DirectX Sampler Feedback, meanwhile, aids memory management by letting games better understand which texture assets are accessed most frequently and need to remain resident.
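Both Mesh Shaders and Sampler Feedback are optional capabilities with their own tiers. As a rough sketch (an assumption about how an engine would use the query, not code from the blog post), the checks live alongside the raytracing one:

#include <d3d12.h>

// Queries the Mesh Shader and Sampler Feedback tiers; an engine would fall
// back to the classic vertex/geometry pipeline and conventional residency
// heuristics when either tier is absent.
void QueryMeshShaderAndSamplerFeedback(ID3D12Device* device,
                                       bool* meshShaders, bool* samplerFeedback)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    *meshShaders = *samplerFeedback = false;
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &options7, sizeof(options7))))
    {
        *meshShaders     = options7.MeshShaderTier      >= D3D12_MESH_SHADER_TIER_1;
        *samplerFeedback = options7.SamplerFeedbackTier >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
    }
}

When the mesh shader tier is present, geometry work is launched with ID3D12GraphicsCommandList6::DispatchMesh rather than through the input-assembler draw path.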
Microsoft describes the performance impact of DirectX Sampler Feedback in two key scenarios, texture streaming and texture-space shading, as follows:

Texture Streaming
Many next-gen games share the same problem: when rendering bigger and bigger worlds with higher and higher quality textures, games suffer from longer loading times, higher memory pressure, or both. Game developers either have to trim down their asset quality or load more texture data at runtime than is actually needed. When targeting 4K resolution, the entire MIP 0 of a high-quality texture takes a lot of space! It is highly desirable to be able to load only the necessary portions of the most detailed MIP levels.

One solution to this problem is texture streaming, as outlined below, where Sampler Feedback greatly improves the accuracy with which the right data can be loaded at the right times (a rough code sketch follows the list).
  • Render scene and record desired texture tiles using Sampler Feedback.
  • If texture tiles at desired MIP levels are not yet resident:
    ◦ Render current frame using lower MIP level.
    ◦ Submit disk IO request to load desired texture tiles.
  • (Asynchronously) Map desired texture tiles to reserved resources when loaded.
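The Sampler Feedback-specific part of that loop is reading back what the GPU actually sampled. The opaque MIN_MIP feedback map has to be decoded before the CPU can inspect it; a rough sketch of that step (function and resource names are placeholders, not Microsoft's sample code):

#include <d3d12.h>

// Decodes an opaque sampler-feedback map into a plain R8_UINT texture so the
// CPU (after a copy to a readback buffer) can see which MIP level each region
// of the streamed texture actually requested last frame.
// Transition barriers around the resolve are omitted for brevity.
void DecodeStreamingFeedback(ID3D12GraphicsCommandList1* cmdList,
                             ID3D12Resource* feedbackMap,   // DXGI_FORMAT_SAMPLER_FEEDBACK_MIN_MIP_OPAQUE
                             ID3D12Resource* decodedMinMip)  // small R8_UINT texture
{
    cmdList->ResolveSubresourceRegion(
        decodedMinMip, 0, 0, 0,          // destination, subresource, x, y
        feedbackMap, 0, nullptr,         // source, subresource, whole resource
        DXGI_FORMAT_R8_UINT,
        D3D12_RESOLVE_MODE_DECODE_SAMPLER_FEEDBACK);
}

From there the remaining bullets are conventional tiled-resource plumbing: compare the requested MIP levels against what is resident, issue disk IO for missing tiles, and map them into the reserved resource with ID3D12CommandQueue::UpdateTileMappings.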
Texture-Space Shading
Another interesting scenario is Texture-Space Shading, where games dynamically compute and store intermediate shader values in a texture, reducing both spatial and temporal rendering redundancy. The workflow looks like the following, where again Sampler Feedback greatly improves efficiency by avoiding the redundant work of computing parts of a texture that are never actually needed (a rough code sketch follows the list).
  • Draw geometry with simple shaders that record Sampler Feedback to determine which parts of a texture are needed.
  • Submit compute work to populate the necessary textures.
  • Draw geometry again, this time with real shaders that apply the generated texture data.
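A very rough frame structure for that workflow in host code (all pipeline, root-signature, resource and barrier setup is omitted; the pipeline names are placeholders, not an actual sample):

#include <d3d12.h>

// Three passes of a texture-space-shading frame: record feedback with cheap
// shaders, shade only the needed texels in compute, then draw for real.
void TextureSpaceShadingFrame(ID3D12GraphicsCommandList* cmdList,
                              ID3D12PipelineState* feedbackPass,   // minimal shaders that write Sampler Feedback
                              ID3D12PipelineState* shadePass,      // compute PSO filling the shading texture
                              ID3D12PipelineState* finalPass,      // real shaders sampling the shaded texture
                              UINT vertexCount, UINT groupsX, UINT groupsY)
{
    cmdList->SetPipelineState(feedbackPass);   // pass 1: record which texels will be sampled
    cmdList->DrawInstanced(vertexCount, 1, 0, 0);

    cmdList->SetPipelineState(shadePass);      // pass 2: shade only the texels marked as needed
    cmdList->Dispatch(groupsX, groupsY, 1);

    cmdList->SetPipelineState(finalPass);      // pass 3: apply the generated texture data
    cmdList->DrawInstanced(vertexCount, 1, 0, 0);
}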
Other Features
  • DRED (Device Removed Extended Data) support for PIX markers
  • New APIs for interacting with the D3D9on12 mapping layer
  • R11G11B10_FLOAT format supported for shared resources
  • New resource allocation flags that allow creating D3D12 resources without also making them resident in GPU memory, or without zero-initializing them, which can improve the performance of resource creation (see the sketch after this list)
  • D3DConfig: A new tool to manage DirectX Control Panel settings
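The new allocation behavior maps to two new heap flags. A minimal, hedged sketch (the helper and the plain buffer are placeholders; it assumes the updated SDK headers, and behavior is as the flag names suggest: zero-initialization is skipped and residency is deferred until the app calls MakeResident):

#include <d3d12.h>

// Creates a buffer whose backing memory is neither zero-initialized nor made
// resident at creation time; the application makes it resident later with
// ID3D12Device::MakeResident when it is actually needed.
HRESULT CreateLazyBuffer(ID3D12Device* device, UINT64 sizeInBytes,
                         ID3D12Resource** outResource)
{
    D3D12_HEAP_PROPERTIES heapProps = {};
    heapProps.Type = D3D12_HEAP_TYPE_DEFAULT;

    D3D12_RESOURCE_DESC desc = {};
    desc.Dimension        = D3D12_RESOURCE_DIMENSION_BUFFER;
    desc.Width            = sizeInBytes;
    desc.Height           = 1;
    desc.DepthOrArraySize = 1;
    desc.MipLevels        = 1;
    desc.Format           = DXGI_FORMAT_UNKNOWN;
    desc.SampleDesc.Count = 1;
    desc.Layout           = D3D12_TEXTURE_LAYOUT_ROW_MAJOR;

    return device->CreateCommittedResource(
        &heapProps,
        D3D12_HEAP_FLAG_CREATE_NOT_ZEROED | D3D12_HEAP_FLAG_CREATE_NOT_RESIDENT,
        &desc,
        D3D12_RESOURCE_STATE_COMMON,
        nullptr,
        __uuidof(ID3D12Resource),
        reinterpret_cast<void**>(outResource));
}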
PIX Support
PIX support for these new DirectX 12 features is coming in the next few months. We will provide more details when deep-diving into each feature in the coming weeks.
Source: Microsoft DirectX Developer Blog

11 Comments on Microsoft Details DirectX Raytracing Tier 1.1, New DirectX 12 Features

#1
The Egg
Cool stuff. I have a somewhat dumb question (or maybe it’s just me that’s dumb): Do these new features require new hardware to implement, or can any full DX12 card make use of them after a software/driver update? So far as I can tell, this is not mentioned.
#2
Solaris17
Super Dainty Moderator
TIL snapdragons use directx
#3
DeathtoGnomes
Texture Streaming works well for cards with low VRAM, and it tends to work even better in conjunction with things like distant terrain rendering, reducing micro-stuttering in several games.
#4
londiste
The Egg said: Cool stuff. I have a somewhat dumb question (or maybe it's just me that's dumb): Do these new features require new hardware to implement, or can any full DX12 card make use of them after a software/driver update? So far as I can tell, this is not mentioned.
It depends. Existing hardware may be capable of the new features, in which case it would be doable with a driver update.
Some of the new stuff does seem to be more about optimizing the API, software and their interactions, though.

Timing is also curious. 2020H1 is where current rumors put Ampere, and RDNA2 cannot be far behind.
#5
Zubasa
londiste said: It depends. Existing hardware may be capable of the new features, in which case it would be doable with a driver update. Some of the new stuff does seem to be more about optimizing the API, software and their interactions, though. Timing is also curious. 2020H1 is where current rumors put Ampere, and RDNA2 cannot be far behind.
I suspect some of these features are back-ported from whatever new developmental version of DX the upcoming Xbox will use.
So most likely something to do with RTRT on RDNA2.
#6
RH92
The Egg said: Cool stuff. I have a somewhat dumb question (or maybe it's just me that's dumb): Do these new features require new hardware to implement, or can any full DX12 card make use of them after a software/driver update? So far as I can tell, this is not mentioned.
To my understanding, yes, any full DX12 card can make use of them, BUT that doesn't mean any full DX12 card will take full advantage while using them! For instance, it has become clear that raytracing workloads require dedicated hardware for optimal results, so don't expect full DX12 cards that lack said dedicated hardware to take any serious advantage of these new features.
#8
InVasMani
awesomesauce said: Asteroids Mesh Shaders Demo

These mesh shader optimizations will be key to enabling ray tracing at acceptable performance in complex lighting/shading scenes. At least, those are my thoughts on it. Not just that, though: they'll be really useful for animation as well if they can be lumped together and utilized jointly to make animations more adaptive, in such a way that the animations can be more expressive or less dynamic as needed. Optimizing with more variable means of GPU display, compression and such is one of the best ways forward for GPU tech outside of more and faster cores/shaders, and I'm sure we'll be seeing a lot more granular GPU optimizations applied to various areas of GPU rendering. The mesh shaders demo was probably the most exciting part of the RTX launch in a lot of ways, but the least talked about. These would actually be key to making more realistic trees and foliage, for example, that sway with the wind. Once this tech gets better, we can have leaves on the ground that blow in the wind and don't look static, which will be cool, and rain on leaves that causes them to bend so the rain rolls off of them. They'll be key to making realistic raindrops as well, and that is a challenging task, especially if you had to render all those malleable raindrops.

On that note, it just shows how far away we are from fully ray-traced graphics, because there is still so much geometric power that needs to be further harnessed; once it is better harnessed, it'll make faking lighting and shading better in the first place, while also making ray tracing more challenging at the same time. What needs to happen with ray tracing is joining mesh shading with ray tracing as a means to make it more variable. For example, allow higher mesh detail for the scene geometry itself to be rendered, but use a lower-LOD mesh for the actual ray-traced rendering, along with a flexible number of light rays. In fact, if they do that and can have it selectively apply a variable number of additional light rays to denser areas of geometric detail, that would be even more ideal.
#9
Flanker
Solaris17 said: TIL snapdragons use directx
I guess it means that Windows on ARM thing is still alive
#10
Vayra86
InVasMani said: These mesh shader optimizations will be key to enabling ray tracing at acceptable performance in complex lighting/shading scenes. At least, those are my thoughts on it. [...]
Great post and very much agree
#11
InVasMani
The human eye automatically and instantly culls all that geometric mesh detail it sees that isn't visible, which shows how dumb GPUs still are. Another aspect of the mesh shader optimizations is that every pixel saved is a pixel earned, so to speak. You can always apply that wasted power and heat to other GPU tasks; just don't waste the GPU's resources rendering ghosts. Even if you're looking at them and they're named Mario, they are unfriendly and they are invisible; treat them as such, but continue to stare on a need-be basis.