
Microsoft Lays DirectX API-level Groundwork for Neural Rendering

bug

Joined
May 22, 2015
Messages
13,924 (3.95/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Yeah... except one visible artifact in the picture is enough to kill the dream and the ignorance.
True that. Initial attempts to move past supersampling were pretty cringe. Early LOD implementations resulted in those demarcation lines that shifted as you moved... I have faith current attempts will overcome their teething problems, too.
To me it all really comes down to a very simple principle: it's either convincing, or it's not. There's no in-between; anything else is just a failed attempt. The moment RT and AI-generated frames are convincing to me, I'll be an adopter. So far, it's just a live alpha that you keep paying for. F*ck that.
I feel it's far better than an alpha, but I agree that everybody will adopt (or not) at their own pace. Continuing the example above: I myself was all over the control panel trying various AA and trilinear optimization settings for quite a while. These days, I couldn't even tell you where those settings are off the top of my head.
Another aspect that's often forgotten in computer graphics is that the chase for 'realism' doesn't always make for a better picture. Cinematography is more than just 'showing things as they are'; it's often the very opposite.
Again, agreed. Realism is just a baseline: do things the default way and you'll get something as close as possible to "real life". From there on, you're free to add any artistic flavor you want.
 
Joined
Nov 22, 2023
Messages
286 (0.68/day)
Somebody hasn't been paying attention. DX12 is "lower level" than DX11 (much like Vulkan): it gives more control to the game engines, precisely so it doesn't need to be upgraded as often. The current DX12 is not the same DX12 that was released initially; several extensions have been added since: https://en.wikipedia.org/wiki/DirectX#Version_history
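You can see this in code, too: instead of a new major version, apps ask the existing device what it supports at runtime. A minimal sketch (assumes a d3d12.h new enough to define Shader Model 6.6 and an ID3D12Device created elsewhere; this is an illustration, not code from the spec):

[CODE]
#include <d3d12.h>

// Query capabilities that were added to DX12 after its initial release.
void QueryNewerDx12Features(ID3D12Device* device)
{
    // Ask for the highest shader model the app knows about; the runtime
    // lowers this value to what the driver actually supports (SM 6.0 .. 6.6).
    D3D12_FEATURE_DATA_SHADER_MODEL sm = { D3D_SHADER_MODEL_6_6 };
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_SHADER_MODEL, &sm, sizeof(sm))))
    {
        // sm.HighestShaderModel now holds the best supported model.
    }

    // Raytracing arrived the same way: as an optional tier, not as "DX13".
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(
            D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))))
    {
        bool hasDxr = (opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0);
        (void)hasDxr; // use this to pick a code path
    }
}
[/CODE]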

Yep, a close-to-metal API with fully programmable shaders in theory means it's the "last" API. Anything you add to it can *technically* be done via shaders if necessary (albeit not well), so there really isn't a need for a new version number.

That said, when GPU sales / Windows sales start slumping, MS/NV/AMD will get together and repackage everything as a DX13 that will only be supported on the latest Windows and the latest GPUs, to force obsolescence and drive up sales.
 
Joined
May 31, 2005
Messages
286 (0.04/day)
I wonder what kind of performance we would be at now for traditional programmable shading and rasterization if RT and "neural processing" weren't consuming transistor budget. We are essentially back in the days of non-unified architectures.
 
Joined
Jun 20, 2005
Messages
91 (0.01/day)
Location
Leeds, UK
System Name My PC
Processor 6700K @ 4.5GHz
Motherboard GigaByte GA-Z170XP-SLI
Cooling Pure Rock 2 + 4 Fans
Memory 2 x 16GB Corsair 3200MHz DDR4
Video Card(s) MSI RX 6900 XT Gaming X Trio
Storage PNY CS3030 NVMe 1TB, MX500 2TB x 2, 3TB WD Blue
Display(s) 27" curved 165Hz VA 1080p (Gigabyte)
Case Corsair 200R
Audio Device(s) Creative X4, AVR + Monitor Audio MASS 5.1
Power Supply Corsair RM750
Mouse Deathadder 2
Keyboard Xtrfy K4
Software W10 Pro
Benchmark Scores 14k1 (ish) Timespy (20k2 gfx 5k2 cpu)
RT was a thing in the early '90s for RF wave propagation. Hardware was even weaker then, hence .....
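For anyone curious, the core of those '90s RF tools was tiny by today's standards. A toy sketch of the classic image method (one specular bounce off a flat wall, then Friis free-space loss; the positions and frequency are made up for illustration):

[CODE]
#include <cmath>
#include <cstdio>

int main()
{
    // Transmitter and receiver positions in metres; the wall is the plane x = 10.
    const double tx_x = 0.0, tx_y = 0.0;
    const double rx_x = 4.0, rx_y = 6.0;
    const double wall_x = 10.0;

    // Image method: mirror the receiver across the wall. The length of the
    // reflected ray equals the straight-line distance to the mirror image.
    const double rx_img_x = 2.0 * wall_x - rx_x;
    const double d = std::hypot(rx_img_x - tx_x, rx_y - tx_y);

    // Friis free-space path loss in dB at 900 MHz.
    const double pi = 3.14159265358979323846;
    const double c = 299792458.0, f = 900e6;
    const double fspl_db = 20.0 * std::log10(4.0 * pi * d * f / c);

    std::printf("reflected path: %.2f m, free-space loss: %.1f dB\n", d, fspl_db);
    return 0;
}
[/CODE]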
 
Joined
Jul 4, 2023
Messages
52 (0.09/day)
Location
You wish
Hilarious. So, algorithmically correct ray tracing is a real computing headache; let's drop the hard stuff, put a model on it, teach it, and it will invent the hard parts that may or may not actually look that way around the corner.

Mission accomplished.

Forget Lucas's Industrial Light & Magic, say hello to Nvidia Gaslight.
 

bug

Hilarious. So, algorithmically correct ray tracing is a real computing headache; let's drop the hard stuff, put a model on it, teach it, and it will invent the hard parts that may or may not actually look that way around the corner.

Mission accomplished.

Forget Lucas's Industrial Light & Magic, say hello to Nvidia Gaslight.
If possible, why not?
The thing about neural networks is that they may take ages to train but, once trained, they'll recognize stuff in a jiffy. If you can leverage that, why not?
And it's not an Nvidia thing; this is DX, available to everyone.
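To make the "in a jiffy" part concrete: once trained, a small network is just a handful of multiply-adds. A back-of-the-envelope sketch with made-up weights standing in for whatever training produced (an illustration only, not anything from the DX spec):

[CODE]
#include <algorithm>
#include <array>
#include <cstdio>

int main()
{
    // Tiny MLP: 2 inputs -> 3 hidden units (ReLU) -> 1 output.
    // These weights are placeholders; producing them is what takes ages.
    const std::array<std::array<double, 2>, 3> w1 =
        {{{0.5, -0.2}, {0.1, 0.8}, {-0.3, 0.4}}};
    const std::array<double, 3> b1 = {0.1, 0.0, -0.1};
    const std::array<double, 3> w2 = {0.7, -0.5, 0.2};
    const double b2 = 0.05;

    const std::array<double, 2> x = {1.0, 2.0}; // some input

    // Inference is just these few multiply-adds and a max() per unit.
    double out = b2;
    for (int i = 0; i < 3; ++i) {
        double h = b1[i];
        for (int j = 0; j < 2; ++j) h += w1[i][j] * x[j];
        out += w2[i] * std::max(0.0, h); // ReLU, then the output layer
    }
    std::printf("inference result: %f\n", out);
    return 0;
}
[/CODE]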
 
Joined
May 26, 2023
Messages
50 (0.08/day)
DirectX 12 is almost 10 years old. Are we still going to be on DirectX 12 in another ten years? I mean, why can't AI, RT, supersampling, upscaling, neural rendering, DirectStorage, etc. be considered enough of a change to warrant calling it DirectX 13? Does Microsoft get some sort of benefit from changing so much but keeping the version number the same?

Edit: Here are the release times for past versions:

1.0 to 2.0 1 year
2.0 to 3.0 1 year
3.0 to 5.0 1 year (4.0 never released)
5.0 to 6.0 1 year
6.0 to 7.0 1 year
7.0 to 8.0 1 year
8.0 to 9.0 2 years
9.0 to 10.0 4 years
10.0 to 11.0 3 years
11.0 to 12.0 6 years
12.0 going on 10 years now!!

To be fair, DX12 Ultimate could very well have been named DX13 (there's a sketch below of what that label actually checks).

There is also the need to preserve some compatibility because of the install base. The number of people playing games was much smaller then than it is now, and hardware was much slower. We're at roughly two years per new GPU architecture now, whereas in those early days it was six months.

Much of it has also been Microsoft's fault, as game developers may not want to support newer standards. Besides the hardware side, IIRC Windows 10 didn't get 12 Ultimate for quite some time, as Microsoft wanted to push Win 11 adoption.
So developers tend to stick to a common platform until it is possible to move forward. They are not going to alienate 50%+ of the user base just because Microsoft is stubborn.
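As mentioned above, "DX12 Ultimate" isn't a new API at all; it's a badge for four optional DX12 features. A rough sketch of the check (the feature structs and enums are real DX12; the helper function itself is mine):

[CODE]
#include <d3d12.h>

// True if the device covers the four features marketed as "DX12 Ultimate":
// DXR 1.1, variable rate shading tier 2, mesh shaders, sampler feedback.
bool IsDx12Ultimate(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {}; // raytracing tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {}; // variable rate shading
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {}; // mesh shaders, sampler feedback

    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7))))
        return false;

    return o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1
        && o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2
        && o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1
        && o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
}
[/CODE]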
 

Rightness_1

New Member
Joined
Dec 30, 2024
Messages
7 (0.44/day)
A sad reflection of Microsoft's decline. This had to be in the works at NV for at least 4 years, yet it's still not ready to be rolled out via DirectX. I really wonder if any Blackwell card will render a single frame with this (outside of a demo) within the cards' first year on the market.
 
Joined
Jul 4, 2023
Messages
52 (0.09/day)
Location
You wish
If possible, why not?
The thing about neural networks is that they may take ages to train but, once trained, they'll recognize stuff in a jiffy. If you can leverage that, why not?
And it's not an Nvidia thing; this is DX, available to everyone.
Of course, but my user experience with prediction engines (colloquially known as AI) has been traumatizing, so if this "leverage" ends up with similar qualities, then no thanks.
I want empirical ray tracing.

A sad reflection of Microsoft's decline. This had to be in the works at NV for at least 4 years, yet it's still not ready to be rolled out via DirectX. I really wonder if any Blackwell card will render a single frame with this (outside of a demo) within the cards' first year on the market.
There are no people at Microsoft who geek out over high-performance graphics pipelines and low-level optimizations, because it's Microsoft.
 
Joined
Dec 25, 2020
Messages
7,264 (4.90/day)
Location
São Paulo, Brazil
System Name "Icy Resurrection"
Processor 13th Gen Intel Core i9-13900KS
Motherboard ASUS ROG Maximus Z790 Apex Encore
Cooling Noctua NH-D15S upgraded with 2x NF-F12 iPPC-3000 fans and Honeywell PTM7950 TIM
Memory 32 GB G.SKILL Trident Z5 RGB F5-6800J3445G16GX2-TZ5RK @ 7600 MT/s 36-44-44-52-96 1.4V
Video Card(s) NVIDIA RTX A2000
Storage 500 GB WD Black SN750 SE NVMe SSD + 4 TB WD Red Plus WD40EFPX HDD
Display(s) 55-inch LG G3 OLED
Case Pichau Mancer CV500 White Edition
Power Supply EVGA 1300 G2 1.3kW 80+ Gold
Mouse Microsoft Classic IntelliMouse (2017)
Keyboard IBM Model M type 1391405
Software Windows 10 Pro 22H2
Benchmark Scores I pulled a Qiqi~
DirectX 12 is almost 10 years old. Are we still going to be on DirectX 12 in another ten years? I mean, why can't AI, RT, supersampling, upscaling, neural rendering, DirectStorage, etc. be considered enough of a change to warrant calling it DirectX 13? Does Microsoft get some sort of benefit from changing so much but keeping the version number the same?

DirectX 13 is simply called "12 Ultimate". DirectX development has always been tied to the latest versions of Windows. Not to mention that DirectX 11 is still the most widely used graphics API overall, and for good reason: it's the most versatile of them, its capabilities align with most budgets and gamers' interests, its OS compatibility is good, and it isn't as low-level as DirectX 12 tends to be, allowing more programmers to work with it.

Vendors that aren't green and clad in a fancy leather or snakeskin jacket also have great trouble keeping up with the DirectX spec. In most cases they are late to adopt the latest driver or shader models, and they simply opt not to support features that aren't absolutely mandated by Microsoft (i.e., the optional extensions), unless their hand is forced by an ISV that desperately wants access to said feature.
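The abstraction gap is easy to see side by side. A simplified sketch (state setup omitted; assumes valid device objects created elsewhere; an illustration, not a complete renderer):

[CODE]
#include <d3d11.h>
#include <d3d12.h>

// DX11: the immediate context hides scheduling, hazards and submission.
void DrawTriangleDx11(ID3D11DeviceContext* ctx)
{
    ctx->Draw(3, 0); // pipeline state was bound earlier via discrete Set* calls
}

// DX12: you record commands yourself and submit them explicitly.
void DrawTriangleDx12(ID3D12GraphicsCommandList* cl, ID3D12CommandQueue* queue)
{
    cl->DrawInstanced(3, 1, 0, 0); // assumes PSO/root signature already set
    cl->Close();                   // finish recording
    ID3D12CommandList* lists[] = { cl };
    queue->ExecuteCommandLists(1, lists); // explicit submission to the GPU
}
[/CODE]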
 