Thursday, March 19th 2020
NVIDIA GeForce RTX GPUs to Support the DirectX 12 Ultimate API
NVIDIA graphics cards, starting with the current-generation GeForce RTX "Turing" lineup, will support the upcoming DirectX 12 Ultimate API. Thanks to a slide obtained by our friends over at VideoCardz, we have some information about Microsoft's next iteration of the DirectX 12 API. The new revision, called "DirectX 12 Ultimate", builds on the standard DirectX 12 API, and the leaked slide shows the improvements arriving as a handful of additions.
The GeForce RTX lineup will support the updated version of the API with features such as ray tracing, variable-rate shading, mesh shaders, and sampler feedback. While we do not know why Microsoft decided to call this the "Ultimate" version, the name is possibly meant to convey more clearly which features are supported by the hardware. The leaked slide also mentions consoles, so the API is coming to that platform as well.
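For readers curious how these feature bits surface to developers, below is a minimal sketch (not taken from the leaked slide) that queries a Direct3D 12 device for the four DirectX 12 Ultimate feature tiers via CheckFeatureSupport. The structure and enum names come from the public D3D12 headers (a recent Windows 10 SDK is assumed for the OPTIONS6/OPTIONS7 structures); error handling is kept to a minimum.

// Sketch: query a D3D12 device for the DirectX 12 Ultimate feature tiers.
// Assumes a Windows 10 SDK recent enough to declare OPTIONS6/OPTIONS7.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at the feature level 12.0 baseline.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opt5 = {};  // ray tracing tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opt6 = {};  // variable-rate shading tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opt7 = {};  // mesh shader + sampler feedback tiers
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opt5, sizeof(opt5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opt6, sizeof(opt6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opt7, sizeof(opt7));

    std::printf("Ray tracing tier:      %d\n", (int)opt5.RaytracingTier);
    std::printf("Variable-rate shading: %d\n", (int)opt6.VariableShadingRateTier);
    std::printf("Mesh shader tier:      %d\n", (int)opt7.MeshShaderTier);
    std::printf("Sampler feedback tier: %d\n", (int)opt7.SamplerFeedbackTier);
    return 0;
}

A nonzero tier in all four queries is roughly what "DirectX 12 Ultimate hardware" is expected to mean; on pre-Turing GPUs some of these will report tier 0 (not supported).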
Source:
VideoCardz
42 Comments on NVIDIA GeForce RTX GPUs to Support the DirectX 12 Ultimate API
And if they make it homebrew-capable of doing so, with an official press release saying they will allow it, I'm game too. Heck, it's the reason I bought a PS3 back in the day. A console that's just a gaming machine is useless to me.
Why is Intel doing big.LITTLE (or whatever they call it)? Power savings and thermal management: the little part of the core that can run RISC-style code at high speed, because it's essentially just lookup tables being compared, is small and can be power-gated and shut off; alternatively, the other parts of the chip that make it a full CISC core can be power-gated, making it as efficient or more so.
We could also ask why we don't have ASICs running everything, and the answer again is: I want to listen to YouTube music while typing and reading a document in an odd application-specific format, and that would require dedicated hardware for each task. And what happens when the ASIC you just bought doesn't meet standard X, Y, or Z, or when you don't have enough expansion ports for all your ASICs? You build a general-purpose CISC x86-64 processor.
The reason I raised the question is that there are some very serious concerns, and it actually seems to be a general consensus, that x86 is hitting an IPC and clock ceiling and further development won't be possible.
Because of this, a new architecture and standard will be needed at some point in the future.
I expect it to become clear in 2-3 years.
You seem fairly intelligent, but I would go read up on architectures and maybe design some logic circuits of your own to understand why we are here, and no, it's not because Intel has a "monopoly" to protect.
x86 will die when Apple, AMD, and Walmart go out of business, and right before the US ceases to exist after being conquered by North Korea.
That already looks dated, quite a feat.
When RDNA1.0 cards first came out I already said their buyers would be screwed, and now they really are. For all the hate/flak NVIDIA receives from AMD fans, NVIDIA's products have turned out to be more future-proof. What a shame.
Do I think it's a bit of a s**tty move to launch RDNA1, then RDNA2 with much better features under a year later? Yes. But on the flipside, RDNA1 cards have been cheaper than NVIDIA cards, and it's not like people have been unaware of the lack of RTRT in RDNA1 (e.g. W1zz's reviews have always called it out). It's not like AMD has forced people to buy their cards, or that those cards are bad - I would've gone for AMD this round if not for the driver issues, simply because RTRT was not compelling enough for me. (Of course, RTRT just got a lot more compelling with these console announcements.)
So I really don't think RDNA1 buyers are screwed in any way, shape, or form. The hardware is competitive, and will remain so as long as rasterisation remains the primary rendering technique, which I'm quite sure will be the case for another decade at least. The only people who are likely to be salty are those who keep graphics cards for that amount of time, and expect the "fine wine" driver treatment to keep them relevant... no amount of driver updates can add hardware RTRT.
The interesting question, I think, is what the lineup and the stack will look like for both camps. Will they fill it top to bottom with RT-capable cards? Or will it cut off hard at the midrange, at some price point, because it's just not going to be viable anyway? Will we see another split lineup, or a double one? How much of each GPU will be devoted to RT?
Interesting times! Much more so now than during Turing, IMO. It's picking up steam.