Saturday, January 11th 2025

Microsoft Lays DirectX API-level Groundwork for Neural Rendering

Microsoft announced updates to the DirectX API that pave the way for neural rendering. Neural rendering is a concept where portions of a frame in real-time 3D graphics are drawn using a generative AI model working in tandem with the classic raster 3D graphics pipeline and other advancements, such as real-time ray tracing. This is different from AI-based super resolution technologies. The generative AI here is involved in rendering the input frames for a super resolution technology. One of the nuts and bolts of neural rendering is cooperative vectors, which enable an information pathway between the conventional graphics pipeline and the generative AI, telling it what the pipeline is doing, what needs to be done by the AI model, and what the ground truth for the model is.
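To make the cooperative vectors idea a little more concrete, here is a rough sketch of the kind of computation a neural shader performs: a tiny MLP evaluated per pixel on features the raster pipeline already produces. This is purely illustrative NumPy, not the actual HLSL/DirectX API; the feature layout, network sizes, and weights are all invented, and on real hardware this matrix-vector work runs on the GPU's matrix/tensor units rather than on the CPU.

```python
# Conceptual sketch only: a tiny per-pixel MLP ("neural shader") fed by data
# the raster pipeline already has (G-buffer style features). The real feature
# exposes small matrix-vector operations to HLSL shaders; this NumPy version
# just shows the shape of the computation. All sizes and weights are made up.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-pixel features from the raster pass:
# normal (3), view direction (3), roughness (1), base color (3) -> 10 floats
g_buffer_features = rng.random((720, 1280, 10)).astype(np.float32)

# A tiny two-layer MLP standing in for a trained neural material/shading model
W1 = (rng.standard_normal((10, 32)) * 0.1).astype(np.float32)
b1 = np.zeros(32, dtype=np.float32)
W2 = (rng.standard_normal((32, 3)) * 0.1).astype(np.float32)
b2 = np.zeros(3, dtype=np.float32)

def neural_shade(features):
    """Evaluate the small network for every pixel at once.

    The "cooperative vector" is essentially the small per-thread feature
    vector multiplied against shared weight matrices; exposing that pattern
    to shaders is what lets it map onto the GPU's matrix units.
    """
    hidden = np.maximum(features @ W1 + b1, 0.0)      # ReLU hidden layer
    rgb = 1.0 / (1.0 + np.exp(-(hidden @ W2 + b2)))   # sigmoid -> [0, 1]
    return rgb

frame = neural_shade(g_buffer_features)
print(frame.shape)  # (720, 1280, 3): one AI-shaded color per pixel
```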

Microsoft says that its HLSL team is working with AMD, Intel, NVIDIA, and Qualcomm to bring cross-vendor support for cooperative vectors in the DirectX ecosystem. The very first dividends of this effort will be seen in the upcoming NVIDIA GeForce RTX 50-series "Blackwell" GPUs, which will use cooperative vectors to drive neural shading. "Neural shaders can be used to visualize game assets with AI, better organize geometry for improved path tracing performance and tools to create game characters with photo-realistic visuals," Microsoft says.
Source: Microsoft DirectX Blog

20 Comments on Microsoft Lays DirectX API-level Groundwork for Neural Rendering

#1
Daven
DirectX 12 is almost 10 years old. Are we still going to be on DirectX 12 in another ten years? I mean why can't AI, RT, super sampling, upscaling, neural render, direct storage, etc. be considered enough of a change to warrant calling it DirectX 13? Does Microsoft get some sort of benefit by changing so much but keeping the version number the same?

Edit: Here are the release times for past versions:

1.0 to 2.0 1 year
2.0 to 3.0 1 year
3.0 to 5.0 1 year (4.0 never released)
5.0 to 6.0 1 year
6.0 to 7.0 1 year
7.0 to 8.0 1 year
8.0 to 9.0 2 years
9.0 to 10.0 4 years
10.0 to 11.0 3 years
11.0 to 12.0 6 years
12.0 going on 10 years now!!
Posted on Reply
#2
ZoneDymo
Daven: DirectX 12 is almost 10 years old. Are we still going to be on DirectX 12 in another ten years? I mean why can't AI, RT, super sampling, upscaling, neural render, direct storage, etc. be considered enough of a change to warrant calling it DirectX 13? Does Microsoft get some sort of benefit by changing so much but keeping the version number the same?
It's... just a name? Better question is why do you care if it gets a new name? It doesn't mean anything.


On the actual article, I'm still not clear on what this is supposed to be.
"Neural rendering", apart from being yet another stupid marketing name, seems to imply it's doing something with the actual... ya know... rendering of the frame, so maybe half is done by traditional rasterization so the AI has something to build off, and then it makes the rest of the image?

on the other hand we get this fantastic sentence:
"This is different from AI-based super resolution technologies. The generative AI here is involved in rendering the input frames for a super resolution technology."

Why does it say it's rendering the input frames... for an upscaler?
So, like, it provides the motion vectors or some crap?
Posted on Reply
#3
Daven
ZoneDymo: its....just a name? better question is why do you care if it gets a new name? it does not mean anything.
Version numbers do mean something. Why else would they exist?
Posted on Reply
#4
ZoneDymo
Daven: Version numbers do mean something. Why else would they exist?
I mean, sure, but DirectX is so, ermm, vague? idk the word for it, complex I guess, that the version numbers never meant anything to me.
It's like a game that runs on Windows 10 but not on 7: OK, good to know, but I don't know the actual reason for it. What tech does 10 have that 7 does not?

Likewise, DirectX versions have never meant anything to me; I can run Warframe from the launcher in DX11 and DX12 mode with no visual difference between them.

So when a game runs on DX12, that's just like a game running on Unreal Engine 5: it COULD use certain features but might just as well not, and the product will show that eventually. So yeah, for all I care it stays DX12 from now on, because whatever it supports doesn't mean the product will use it anyway.

And just to make clear, I'm not against any change either, it's all good for me, I just don't share the feeling that it's needed.
Posted on Reply
#5
dyonoctis
ZoneDymo: its....just a name? better question is why do you care if it gets a new name? it does not mean anything.


on the actual article, im still not clear on what this is suppose to be.
"neural rendering" apart from being yet another stupid marketing name, seems to imply its doing something with the actual...ya know.. rendering of the frame, so maybe half is done by traditional rasterization so the AI has something to build off and then it makes the rest of the image?

on the other hand we get this fantastic sentence:
"This is different from AI-based super resolution technologies. The generative AI here is involved in rendering the input frames for a super resolution technology."

why does it say its rendering the input frames.....for an upscaler?
So like it provides the motion vectors or some crap?
I think it's using AI to calculate parts of every frame, rather than just guessing frames between two frames, or enhancing a lower-resolution frame.

Nvidia seems to provide more detail about that. RTX Neural seems to be to DX neural what RTX IO is to DirectStorage: aka the same thing.
NVIDIA RTX Neural Rendering Introduces Next Era of AI-Powered Graphics Innovation | NVIDIA Technical Blog
GPU-accelerated primitive that reduces the amount of geometry necessary to render strands of hair and uses spheres instead of triangles to get a more accurate fit for hair shapes. LSS makes it possible to do ray-traced hair with better performance and a smaller memory footprint. [...]
RTX Neural Faces offers an innovative, new approach to improve face quality using generative AI. Instead of brute force rendering, Neural Faces takes a simple rasterized face plus 3D pose data as input and uses a real-time generative AI model to infer a more natural face. The generated face is trained from thousands of offline generated images of that face at every angle, under different lighting, emotion, and occlusion conditions
Traditional rendering methods don’t accurately simulate how light interacts with human skin, which can result in a plastic-like look. Subsurface Scattering (SSS) simulates how light penetrates beneath the surface of translucent materials and scatters internally, creating a softer, more natural appearance.
So you train the API on the rasterized game and use AI to improve some areas of the render. Seems promising in theory, but the demo doesn't look visually stable. But that answers a few of my questions: I was wondering how they would pull off stuff like real-time SSS or caustics, but it seems that anything too heavy will be AI-generated.
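Reading the Neural Faces bit above, my guess at the data flow is roughly this (a stubbed-out sketch, not NVIDIA's actual pipeline, and every name in it is invented): the raster pass produces a rough face crop plus 3D pose data, a trained generative model refines that crop, and the result is composited back into the frame.

```python
# Conceptual data-flow sketch of "rasterized face + pose in, inferred face out".
# The generator is a placeholder; the real thing would be a per-character
# generative model trained offline and run on the GPU's ML hardware.
import numpy as np

def run_face_generator(face_crop, pose):
    """Stand-in for the trained generative model.

    Real version: infers a more natural-looking face from the rough raster
    crop and the 3D pose. Here it just passes the crop through so the
    pipeline below is runnable.
    """
    assert pose.shape == (3,)  # e.g. yaw/pitch/roll; layout invented for the sketch
    return np.clip(face_crop, 0.0, 1.0)

def composite_neural_face(frame, face_bbox, pose):
    """Cut the rasterized face out of the frame, refine it, paste it back."""
    x0, y0, x1, y1 = face_bbox
    refined = run_face_generator(frame[y0:y1, x0:x1], pose)
    out = frame.copy()
    out[y0:y1, x0:x1] = refined
    return out

frame = np.random.default_rng(1).random((720, 1280, 3)).astype(np.float32)
result = composite_neural_face(frame, face_bbox=(600, 300, 700, 420), pose=np.zeros(3))
print(result.shape)  # same frame size, with the face region "refined"
```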

Edit: It seems that neural rendering will also accelerate RT/PT. If I understand correctly, the GPU isn't going to brute force every single bounce of light, but rather infer data from the first few bounces... and also justify Nvidia being stingy on VRAM :D
  • RTX Neural Texture Compression uses AI to compress thousands of textures in less than a minute. Their neural representations are stored or accessed in real time or loaded directly into memory without further modification. The neurally compressed textures save up to 7x more VRAM or system memory than traditional block compressed textures at the same visual quality.
  • RTX Neural Materials uses AI to compress complex shader code typically reserved for offline materials and built with multiple layers such as porcelain and silk. The material processing is up to 5x faster, making it possible to render film-quality assets at game-ready frame rates.
  • RTX Neural Radiance Cache uses AI to learn multi-bounce indirect lighting to infer an infinite amount of bounces after the initial one to two bounces from path traced rays. This offers better path traced indirect lighting and performance versus path traced lighting without a radiance cache. NRC is now available through the RTX Global Illumination SDK, and will be available soon through RTX Remix and Portal with RTX.
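A back-of-the-envelope sketch of the texture compression item above, with invented sizes and an untrained decoder, just to show where the memory saving would come from: store a small latent grid plus a tiny shared decoder MLP instead of full-resolution texels, and reconstruct texels on demand at sample time. The real format, layer counts, and compression ratios are NVIDIA's and aren't reproduced here.

```python
# Rough sketch of the neural texture compression idea: instead of storing full
# block-compressed texels, keep a small latent grid plus a tiny decoder MLP and
# reconstruct texels on demand. Sizes, layers, and weights are invented.
import numpy as np

rng = np.random.default_rng(2)

# "Compressed" representation: a low-res latent grid + shared decoder weights
latent_grid = rng.random((256, 256, 8)).astype(np.float32)   # 8 features per cell
W1 = (rng.standard_normal((8, 16)) * 0.1).astype(np.float32)
W2 = (rng.standard_normal((16, 3)) * 0.1).astype(np.float32)

def sample_neural_texture(u, v):
    """Decode one texel at normalized UV coordinates (nearest-neighbour fetch)."""
    h_idx = min(int(v * latent_grid.shape[0]), latent_grid.shape[0] - 1)
    w_idx = min(int(u * latent_grid.shape[1]), latent_grid.shape[1] - 1)
    z = latent_grid[h_idx, w_idx]                 # fetch latent features
    hidden = np.maximum(z @ W1, 0.0)              # tiny decoder MLP
    return 1.0 / (1.0 + np.exp(-(hidden @ W2)))   # RGB in [0, 1]

print(sample_neural_texture(0.25, 0.75))

# Why this can save memory (back-of-the-envelope, not NVIDIA's numbers):
# a 4K RGBA8 texture is ~64 MB uncompressed; the latent grid above is
# 256*256*8 floats (~2 MB) and the decoder weights are a few hundred bytes.
```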
Posted on Reply
#6
ZoneDymo
dyonoctis: Nvidia seems to provide more detail about that. RTX neural seems to be to DX neural what RTX IO is to direct storage: aka the same thing.
NVIDIA RTX Neural Rendering Introduces Next Era of AI-Powered Graphics Innovation | NVIDIA Technical Blog
Hey, thanks for the link, but it's still a bit vague for me; also, some of these demos are hilarious...
Like the face one is in no way better, they just made a more beautiful person lol, and that AI co-player, yikez.

Better compression is cool I guess, and if AI somehow does this well... idk, I guess that's also something I don't get about AI: it's not actively learning, right? So it's just an algorithm like any other that perhaps was trained, in AI terms, but after that it's just done, a better compression technique, which is cool, but why would anything need "AI" hardware to use such a thing?
And if it doesn't, then who gives a crap how the algorithm was made? I assume some calculations were done and now it's just here for me to use.....
Posted on Reply
#7
dyonoctis
ZoneDymo: Hey thanks for the link but its sitll a bit vague for me, also hilarious some of these demos...
Like the face one is in no way better, they just made a more beautiful person lol, and that AI co-player, yikez.

Better compression is cool I guess and if AI somehow does this well...idk I guess thats also something I dont get about AI, its not actively learning right? so its just an algorithm like any other that perhaps was trained in AI terms but after that its just done, a better compression technique, which is cool but why would anything need "AI" hardware to use such a thing?
And if it doesnt then who gives a crap of how the algorithm was made? I assume some calculations were done and now its just here for me to use.....
From what I could see, AI is better at handling dynamic stuff. It's learning in the data center, but general-purpose hardware doesn't seem to be very efficient/fast at applying that code in real time to things that move in non-scripted ways, so it's using tailor-made hardware instead, and that doesn't use the more general compute resources.

It's also my understanding that a generalist AI upscaler/frame generator doesn't exist because each vendor is using very specialized ML hardware tailor-made for their API: XeSS doesn't use the ML hardware of other vendors, but falls back on something more generalist that doesn't perform as well as native XeSS. DirectX neural is supposed to avoid that clusterfuck, so Intel and AMD have probably developed their next-gen ML hardware with the required stuff to run all those things.
Posted on Reply
#8
Prima.Vera
Most likely it will be called DirectAI or something and it will be part of DirectX 12.3
Posted on Reply
#9
GerKNG
i am really sick of this AI! AI! AI! AI! AI! AI!!!!!!!!! BS.
give me TFLOPS and optimize your spaghetticode Frankenstein Monster of a Game...
Posted on Reply
#10
TumbleGeorge
Terrible. So far DLSS, and now the sick hallucinations of some LLM. I'm not sure what you'll get on the monitor when you launch Diablo 4 on your computer and, instead of the game, episode 3141 of a Turkish TV series is displayed.
Posted on Reply
#11
windwhirl
Daven: DirectX 12 is almost 10 years old.
Well, we're technically on DX12 Ultimate, which is 5 years old, and includes RT, VRS and a couple more things.

I think internally the API is version 12.2 currently.

But it'd be no surprise if Microsoft never changes the major version number internally, since they avoid compatibility issues that way (just as Windows 11 doesn't have a major version number of 11 internally; it remains 10). Basically, software doesn't like it when you respond to a request in a manner that wasn't expected.

That aside, I'm not sure if I understand the point of this technology other than for Nvidia et al to sell us hardware we wouldn't need otherwise.
Posted on Reply
#12
theouto
The generative AI here is involved in rendering the input frames for a super resolution technology.
So Microsoft is working on nothing new that will do nothing you haven't seen before, with resolve that is still worse than that of a native image, all in the name of selling a lie.

As soon as I get into the hobby it goes to shit, lovely.
Posted on Reply
#13
Daven
theouto: So microsoft is working on nothing new that will do nothing you haven't seen before, with resolve that is still worse than that of a native image, all in the name of selling a lie.

As soon as I get into the hobby it goes to shit, lovely.
A lot of what you are talking about is due to zombie purchasing of Nvidia. There is almost nothing of real enjoyment that separates AMD and Nvidia (and now Intel), but Nvidia gets 90%+ of the market. Therefore innovation has stalled; worthless, unasked-for features permeate everything; and brand loyalists delude themselves while feasting on internet myths to justify buying more and more expensive hardware from their beloved company.

It’s hurting everyone so you are not alone.
Posted on Reply
#14
dyonoctis
Daven: A lot of what you are talking about is due to zombie purchasing of Nvidia. There is almost nothing of real enjoyment that separates AMD and Nvidia (and now Intel) but Nvidia gets 90%+ of the market. Therefore innovation has stalled, worthless, unasked for features permeate everything and brand loyalists delude themselves while feasting on internet myths to justify buying more and more expensive hardware from their beloved company.

It’s hurting everyone so you are not alone.
theouto: So microsoft is working on nothing new that will do nothing you haven't seen before, with resolve that is still worse than that of a native image, all in the name of selling a lie.

As soon as I get into the hobby it goes to shit, lovely.
Hopefully, in the future, Microsoft will do a better job of explaining what this is, because, just like you two, a lot of people seem confused about what neural rendering is going to be: it's not about upscaling or frame generation, it's about using ML to render a "real frame" more efficiently. It's going to affect how materials, textures, lighting, and geometry are handled before any kind of upscaling or frame generation is involved.
Raster is faster to begin with because it uses lots of tricks, whereas fully path-traced graphics are all about brute force. Neural rendering is about to add even more tricks to rasterization in the hope of reducing the computational load required to achieve a similar level of graphics. And that includes lowering the load for path tracing as well.
Posted on Reply
#15
bug
Daven: DirectX 12 is almost 10 years old. Are we still going to be on DirectX 12 in another ten years? I mean why can't AI, RT, super sampling, upscaling, neural render, direct storage, etc. be considered enough of a change to warrant calling it DirectX 13? Does Microsoft get some sort of benefit by changing so much but keeping the version number the same?

Edit: Here are the release times for past versions:

1.0 to 2.0 1 year
2.0 to 3.0 1 year
3.0 to 5.0 1 year (4.0 never released)
5.0 to 6.0 1 year
6.0 to 7.0 1 year
7.0 to 8.0 1 year
8.0 to 9.0 2 years
9.0 to 10.0 4 years
10.0 to 11.0 3 years
11.0 to 12.0 6 years
12.0 going on 10 years now!!
Somebody hasn't been paying attention. DX12 is "lower level" than DX11 (in a somewhat Vulkan-like way): it gives more control to the game engines, precisely so it doesn't need to be upgraded as often. The current DX12 is not the same DX12 that was released initially; several extensions have been added since: en.wikipedia.org/wiki/DirectX#Version_history
Posted on Reply
#18
Tsukiyomi91
TL;DR: this is why corpos aren't your friends and they don't deserve special treatment. Fuck their PR nonsense and slop-tier products.
Posted on Reply
#19
b1k3rdude
More AI-branded fcuk-shittery...?
Posted on Reply
#20
ModEl4
If DX12 Ultimate had been called DX13 back in 2018 with the Turing generation, AMD would probably have sold half of what it sold between then and the launch of the 6000 series. In the same way, if Microsoft gives a different DX number to the cards that support neural rendering, AMD will sell less until the launch of UDNA (which will probably support it, I hope).
Posted on Reply