
NVIDIA Could Capitalize on AMD Graphics CoreNext Not Supporting Direct3D 12_1

Joined
Jul 13, 2010
Messages
62 (0.01/day)
Location
Slovakia
System Name Pap1er
Processor AMD Ryzen 7 5800X3D
Motherboard MSI B450 GAMING PRO CARBON AC
Cooling ARCTIC Freezer 33 eSport One - Red
Memory G.SKILL 16 GB KIT DDR4 4000 MHz CL15 Ripjaws V
Video Card(s) Sapphire NITRO+ AMD Radeon™ RX 7800 XT
Storage ADATA XPG GAMMIX S50 CORE/1TB/SSD/M.2 NVMe
Display(s) LG 29UC88 Curved UltraWide Monitor
Case Corsair 230T Graphite Series - Orange
Audio Device(s) OnBoard
Power Supply Corsair CX750M
Mouse Creative Sound BlasterX Siege M04 Gaming mouse
Software Windows 11 Pro x64
What does it mean in terms of performance whether GCN supports Direct3D 12_1 or not?
I would appreciate a short and clear explanation.
Would it affect performance at all? If so, by how much?
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Location
New Zealand
System Name MoneySink
Processor 2600K @ 4.8
Motherboard P8Z77-V
Cooling AC NexXxos XT45 360, RayStorm, D5T+XSPC tank, Tygon R-3603, Bitspower
Memory 16GB Crucial Ballistix DDR3-1600C8
Video Card(s) GTX 780 SLI (EVGA SC ACX + Giga GHz Ed.)
Storage Kingston HyperX SSD (128) OS, WD RE4 (1TB), RE2 (1TB), Cav. Black (2 x 500GB), Red (4TB)
Display(s) Achieva Shimian QH270-IPSMS (2560x1440) S-IPS
Case NZXT Switch 810
Audio Device(s) onboard Realtek yawn edition
Power Supply Seasonic X-1050
Software Win8.1 Pro
Benchmark Scores 3.5 litres of Pale Ale in 18 minutes.
What does it mean in terms of performance whether GCN supports Direct3D 12_1 or not?
I would appreciate a short and clear explanation.
Would it affect performance at all? If so, by how much?
How long is a piece of string?
If the card is saving GPU horsepower through better use of rasterization resources, then the amount of gain depends upon the scenes being rendered. Not all games or game engines are created equal, and that doesn't take into account a myriad of other graphical computation loads that also need to be considered (i.e. tessellation). Even if you could quantify the gains/deficits, they are then affected by how different architectures handle post-rasterization image quality features (post-process depth of field, motion blur, global illumination, etc.).
Basically, what you want is a set figure, when the actuality is that won't ever be the case unless every other variable becomes a constant - and every architecture, and every part within every architecture, handles every facet of a game to a varying degree.
 
Last edited:
Joined
Oct 2, 2004
Messages
13,791 (1.86/day)
So, out of this whole clusterfuck of info, when it comes to D3D12_0, old AMD GPUs still support way more than old NVIDIA GPUs (Kepler and Maxwell 1 support none of the feature levels, and the ones that do are all Tier 1). I'm excluding Maxwell 2, since it's the newest one and was built for D3D12 to begin with anyway. Now it's just a question of how far Fiji will go with support. But seeing that GCN 1.0 already supports some, we can quite safely assume it'll support more than Maxwell 2. Not doing so would be kinda stupid of AMD, considering how late they are releasing Fiji compared to Maxwell 2...
 
Joined
Aug 20, 2007
Messages
21,572 (3.40/day)
System Name Pioneer
Processor Ryzen R9 9950X
Motherboard GIGABYTE Aorus Elite X670 AX
Cooling Noctua NH-D15 + A whole lotta Sunon and Corsair Maglev blower fans...
Memory 64GB (4x 16GB) G.Skill Flare X5 @ DDR5-6000 CL30
Video Card(s) XFX RX 7900 XTX Speedster Merc 310
Storage Intel 5800X Optane 800GB boot, +2x Crucial P5 Plus 2TB PCIe 4.0 NVMe SSDs
Display(s) 55" LG 55" B9 OLED 4K Display
Case Thermaltake Core X31
Audio Device(s) TOSLINK->Schiit Modi MB->Asgard 2 DAC Amp->AKG Pro K712 Headphones or HDMI->B9 OLED
Power Supply FSP Hydro Ti Pro 850W
Mouse Logitech G305 Lightspeed Wireless
Keyboard WASD Code v3 with Cherry Green keyswitches + PBT DS keycaps
Software Gentoo Linux x64 / Windows 11 Enterprise IoT 2024
What does it mean in terms of performance whether GCN supports Direct3D 12_1 or not?
I would appreciate a short and clear explanation.
Would it affect performance at all? If so, by how much?

I can provide a short one (no offense HumanSmoke)

If only one brand supports it, as is suggested, no one in their right mind will code for it.

So short answer is no, it won't make much difference at all.
 
Joined
May 2, 2013
Messages
489 (0.11/day)
Location
GA
System Name RYZEN RECKER
Processor Ryzen 5 5600X
Motherboard Asus Prime B350-plus
Cooling Arctic Cooler 120mm CPU, Cougar case fans.
Memory 16GB (2x8GB) Corsair LPX 3200mhz
Video Card(s) XFX 6700XT Swift 309
Storage 6.5TB total across 4 different drives
Display(s) Acer 32" 170hz VA
Case Antec 900
Audio Device(s) Logitech G430 headset
Power Supply Corsair CX850m
Mouse Steel Series Sensei Ten
Keyboard Corsair K55
Software Windows 10 64-bit
Sniff....sniff.....sniff.... Does anyone else smell a turd? I think I remember that smell... Vista, is that you?

Ah yes, DirectX 10. The redheaded step-child between DX9 and DX11.

Aren't they making a new Doom? (I bought Wolfenstein: The New Order the day of release and STILL have an unused Doom beta key sitting on my desk...)
What if they came out swinging with Doom on their OpenGL-based id Tech engine when DX12 launched? They've had a good bit of time to optimize it since Rage came out.
 
Joined
Sep 5, 2004
Messages
1,959 (0.26/day)
Location
The Kingdom of Norway
Processor Ryzen 5900X
Motherboard Gigabyte B550I AORUS PRO AX 1.1
Cooling Noctua NB-U12A
Memory 2x 32GB Fury DDR4 3200mhz
Video Card(s) PowerColor Radeon 7800 XT Hellhound
Storage Kingston FURY Renegade 2TB PCIe 4.0
Display(s) 2x Dell U2412M
Case Phanteks P400A
Audio Device(s) Hifimediy Sabre 9018 USB DAC
Power Supply Corsair AX850 (from 2012)
Software Windows 10?
What kind of nonsense is this? DX12 is not even out yet,
and AMD can still have support in hardware. Nobody knew about some features of their past-generation products, like the tessellation hardware in the 2900 XT that went unused in DX10.

but wait for Windows 10, DX12 and DX12 games to find out...
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
I can provide a short one (no offense HumanSmoke)
If only one brand supports it, as is suggested, no one in their right mind will code for it.
So short answer is no, it won't make much difference at all.
No offense taken, and you're right: most PC games are developed for console - and consoles don't support FL 12_1.
The only caveats are game engines developed primarily for, or in tandem with, PC, where the features could go unused in the console version - and OpenGL game engines, of course.
and AMD can still have support in hardware. Nobody knew about some features of their past-generation products, like the tessellation hardware in the 2900 XT that went unused in DX10.
Not really. The tessellator in the R600 was known about from the moment it launched - it wasn't some kind of secret-squirrel hidden capability.
The whole point of this article and thread is that the GCN 1.x architecture cannot perform conservative rasterization in hardware. It can, however, emulate it in software using the compute capability of the architecture's built-in asynchronous compute engines.
 
Last edited:
Joined
Nov 4, 2005
Messages
12,023 (1.72/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs, 24TB Enterprise drives
Display(s) 55" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
No offense taken, and you're right: most PC games are developed for console - and consoles don't support FL 12_1.
The only caveats are game engines developed primarily for, or in tandem with, PC, where the features could go unused in the console version - and OpenGL game engines, of course.

Not really. The tessellator in the R600 was known about from the moment it launched - it wasn't some kind of secret-squirrel hidden capability.
The whole point of this article and thread is that the GCN 1.x architecture cannot perform conservative rasterization in hardware. It can, however, emulate it in software using the compute capability of the architecture's built-in asynchronous compute engines.
and AMD can still have support in hardware. Nobody knew about some features of their past-generation products, like the tessellation hardware in the 2900 XT that went unused in DX10.

but wait for Windows 10, DX12 and DX12 games to find out...


TruForm: the ATI 8500 had hardware tessellation, and no one used it, as no competitors had it or wanted to invest in it at the time.
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
TruForm: the ATI 8500 had hardware tessellation, and no one used it, as no competitors had it or wanted to invest in it at the time.
TruForm did have a reasonable amount of game support - including a number of AAA titles.

Same old ATI/AMD tune, isn't it?
At least ATI worked to get TruForm integrated as a game feature. AMD got involved and immediately turned an R600 feature into a footnote in history by tossing ATI's game development program into the nearest dumpster.
 
Joined
Jan 2, 2015
Messages
1,099 (0.30/day)
Processor FX6350@4.2ghz-i54670k@4ghz
Video Card(s) HD7850-R9290
I think I'd have to agree this means little to nothing... so game devs will continue to load textures the same way, and some will let you decide, like they have forever now. By the time gamers actually need full DX12.1+ or whatever, we'll be talking about DX13 - assuming DX is still the way to go for gaming by then.

P.S. Didn't AMD have a hand in developing GPU tessellation, and have fully supporting DX11 GPUs before NVIDIA? Wouldn't that also be around the same time GCN was proving to be more powerful than Kepler in DirectCompute and GPU acceleration?

Well, I know for sure tessellation works fine on my AMD GPUs, and AMD-optimized tessellation looks great - especially in Gaming Evolved titles.
 
Last edited:
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
I think I'd have to agree this means little to nothing... so game devs will continue to load textures the same way, and some will let you decide, like they have forever now.
It won't be a major factor, but the consensus amongst developers seems to be that 12_1 features such as conservative rasterization, rasterizer ordered views (ROV)/ order-independent transparency (OIT), voxelization, and adaptive volumetric shadow maps are the way forward for more realistic portrayal of gameplay, reduction of GPU compute overhead, and greater developer control. These may be slow in coming to fruition with DirectX (thanks to consoles not supporting the features natively, or not at all), but OpenGL already has them enabled. In a way, AMD can thank Nvidia and Intel for making Vulkan that much more relevant - how's that for irony.
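To see why ordered access matters for transparency, here's a toy sketch (my own illustration in plain Python - the numbers are made up and no vendor API is involved): unordered "over" blending gives different results depending on the order fragments happen to arrive, while imposing a fixed order per pixel - which is exactly what ROVs let a shader do - makes the result deterministic.

```python
# Toy model of order-dependent transparency blending, and how a
# rasterizer-ordered-view (ROV) style pass fixes it. Single channel only.

def blend_over(dst, frag):
    """Standard 'over' operator: frag is composited on top of dst."""
    color, alpha = frag
    return color * alpha + dst * (1.0 - alpha)

def composite(fragments, background=0.0):
    """Blend fragments in the order given (models arrival order on the GPU)."""
    dst = background
    for frag in fragments:
        dst = blend_over(dst, frag)
    return dst

def composite_rov(fragments, background=0.0):
    """ROV-style: impose a well-defined order (back-to-front by depth)
    before blending, so the result no longer depends on arrival order."""
    ordered = sorted(fragments, key=lambda f: -f[2])  # farthest first
    return composite([(c, a) for c, a, _ in ordered], background)

# Two half-transparent fragments over a black background: (color, alpha, depth)
near = (1.0, 0.5, 0.2)
far = (0.0, 0.5, 0.8)

# Unordered blending: the two possible arrival orders disagree.
a = composite([(1.0, 0.5), (0.0, 0.5)])
b = composite([(0.0, 0.5), (1.0, 0.5)])
print(a, b)  # 0.25 vs 0.5 - an order-dependent artifact

# ROV-style blending: identical no matter how the fragments arrive.
print(composite_rov([near, far]), composite_rov([far, near]))
```

On real hardware the per-pixel ordering is enforced by the rasterizer rather than an explicit sort, but the observable guarantee is the same: a stable result regardless of fragment scheduling.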
P.S. Didn't AMD have a hand in developing GPU tessellation, and have fully supporting DX11 GPUs before NVIDIA?
No and Yes.
No. ATI's TruForm was the first GPU tessellation hardware followed by Matrox's Parhelia line (N-Patch support), then ATI's Xenos (R500 / C1) graphics chip for the Xbox 360. All of these pre-date AMD's involvement with ATI.
Yes. AMD's Evergreen series were the first DirectX 11 compliant GPUs. They arrived just over six months before Nvidia's own DX11 cards.
Wouldn't that also be around the same time GCN was proving to be more powerful than Kepler in DirectCompute and GPU acceleration?
Do tell? You're starting to sound like AMD's Roy Taylor.
DirectCompute, like most computation, depends upon the workload, the software and, just as importantly, software support (drivers). It also depends heavily upon the emphasis placed upon the design by the architects. A case in point is the tessellation you seem very keen to explore. ATI pioneered it, but it went largely unused. Under AMD's regime tessellation wasn't prioritized, whereas Nvidia made Maxwell a tessellation monster. Neither DirectCompute nor tessellation on its own defines an architecture, or indicates much beyond that one facet.

By the time gamers actually need full DX12.1+ or whatever, we'll be talking about DX13 - assuming DX is still the way to go for gaming by then.
Well, if Vulkan and the new OpenGL extensions take off like people are expecting, the DirectX coding arena may have their hand forced. If the new OGL turns into the old OGL, Microsoft can probably wait ten years before updating DirectX.
 
Last edited:
Joined
Jan 2, 2015
Messages
1,099 (0.30/day)
Thanks smoke... I honestly just didn't know what to believe with all the stuff people say around here. Thinking about it, I think you shared an interview with AMD's "gaming scientist" some time ago, and he explained something of a tessellation war going on - or rather, over-tessellation. That was before I started getting into it, but still interesting.
Ten years, haha. Yeah, DX12 should be easier to work with, and I mean, what are they even going to add to make it more complex that's a real game-changer like tessellation was?
 
Joined
May 20, 2011
Messages
227 (0.05/day)
System Name Windows 10 Pro 64 bit
Processor Ryzen 5 5600 @4.65 GHz
Motherboard Asus ROG X570-E
Cooling Thermalright
Memory 32 GB 3200 MHz
Video Card(s) Asus RX 6700XT 12 GB Dual
Storage 1TB Samsung 970 EVO Plus
Display(s) SS QHD 144Hz + LG 55 Inch 4K
Case Corsair 4000D
Power Supply Superflower 850
We should run benchmarks instead of playing games.
 
Joined
Jan 2, 2015
Messages
1,099 (0.30/day)
DX12 FTW MICROSOFT!

1990 REHASHED HAHA

DX13? :)
 
Joined
Nov 3, 2011
Messages
697 (0.15/day)
Location
Australia
System Name Eula
Processor AMD Ryzen 9 7900X PBO
Motherboard ASUS TUF Gaming X670E Plus Wifi
Cooling Corsair H150i Elite LCD XT White
Memory Trident Z5 Neo RGB DDR5-6000 64GB (4x16GB F5-6000J3038F16GX2-TZ5NR) EXPO II, OCCT Tested
Video Card(s) Gigabyte GeForce RTX 4080 GAMING OC
Storage Corsair MP600 XT NVMe 2TB, Samsung 980 Pro NVMe 2TB, Toshiba N300 10TB HDD, Seagate Ironwolf 4T HDD
Display(s) Acer Predator X32FP 32in 160Hz 4K FreeSync/GSync DP, LG 32UL950 32in 4K HDR FreeSync/G-Sync DP
Case Phanteks Eclipse P500A D-RGB White
Audio Device(s) Creative Sound Blaster Z
Power Supply Corsair HX1000 Platinum 1000W
Mouse SteelSeries Prime Pro Gaming Mouse
Keyboard SteelSeries Apex 5
Software MS Windows 11 Pro
AMD's Graphics CoreNext (GCN) architecture does not support Direct3D feature level 12_1 (DirectX 12.1), according to a ComputerBase.de report. The architecture only supports Direct3D up to feature level 12_0. Feature level 12_1 adds three features over 12_0, namely Volume Tiled Resources, Conservative Rasterization, and Rasterizer Ordered Views.

Volume Tiled Resources is an evolution of tiled resources (analogous to OpenGL mega-texture), in which the GPU seeks and loads only those portions of a large texture that are relevant to the scene it's rendering, rather than loading the entire texture into memory. Think of it as a virtual memory system for textures. This greatly reduces video memory usage and bandwidth consumption. Volume tiled resources extend the idea by seeking portions of a texture not only along the X and Y axes, but along a third dimension as well. Conservative Rasterization is a means of drawing polygons with additional pixels that make it easier for two polygons to interact with each other in dynamic objects. Rasterizer Ordered Views are a means to process raster loads in the order in which they appear in an object. Practical applications include improved shadows.
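To make the conservative rasterization difference concrete, here is a small self-contained sketch (my own illustration, not the article's or any driver's code). Standard rasterization covers a pixel only when its center lies inside the triangle; conservative rasterization also covers pixels the triangle merely touches, which hardware typically achieves by pushing each edge equation outward by half a pixel.

```python
# Sketch of standard vs. conservative triangle coverage on a pixel grid.
# Illustrative only - real GPUs do this in fixed-function hardware.

def edges(tri):
    """Edge equations a*x + b*y + c >= 0 for points inside a CCW triangle."""
    eqs = []
    for i in range(3):
        (x0, y0), (x1, y1) = tri[i], tri[(i + 1) % 3]
        a, b = -(y1 - y0), (x1 - x0)
        c = -(a * x0 + b * y0)
        eqs.append((a, b, c))
    return eqs

def coverage(tri, width, height, conservative=False):
    """Return the set of (i, j) pixels covered. Pixels are 1x1 squares with
    centers at (i + 0.5, j + 0.5). Conservative mode offsets each edge
    equation by (|a| + |b|) / 2 - the distance from a pixel's center to its
    farthest corner along the edge normal - so any touched pixel passes."""
    eqs = edges(tri)
    covered = set()
    for j in range(height):
        for i in range(width):
            cx, cy = i + 0.5, j + 0.5
            inside = True
            for a, b, c in eqs:
                e = a * cx + b * cy + c
                if conservative:
                    e += (abs(a) + abs(b)) / 2.0
                if e < 0:
                    inside = False
                    break
            if inside:
                covered.add((i, j))
    return covered

tri = [(0.2, 0.2), (3.8, 0.6), (1.0, 3.5)]  # counter-clockwise vertices
std = coverage(tri, 5, 5)
con = coverage(tri, 5, 5, conservative=True)
print(len(std), len(con))  # conservative coverage is a strict superset
```

The extra pixels are exactly why the technique helps with things like voxelization and occlusion culling: no geometry can ever "fall between" pixel centers and go undetected.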



Given that GCN doesn't feature bare-metal support for D3D feature level 12_1, its implementation will be as limited as feature level 11_1's was when NVIDIA's Kepler didn't support it. This is compounded by the fact that GCN is a more popular GPU architecture than Maxwell (which supports 12_1), thanks to the new-generation game consoles. It could explain why NVIDIA dedicated three-fourths of its GeForce GTX 980 Ti press deck to talking about the features of D3D 12_1 at length. The company probably wants to make a few new effects that rely on D3D 12_1 part of GameWorks, and deflect accusations of exclusivity by pointing out that the competition (AMD) doesn't support certain API features which are open to them. Granted, AMD GPUs and modern game consoles such as the Xbox One and PlayStation 4 don't support GameWorks, but that didn't stop big game devs from implementing them.

Source: ComputerBase.de
CR (Conservative Rasterization)

Read http://devgurus.amd.com/message/1308511



Question:

I need, for my application, every draw to produce at least one pixel of output (even if it is an empty triangle = 3 identical points). NVIDIA has an extension (GL_NV_conservative_raster) to enable such a mode (on Maxwell+ cards). Is there a similar extension on AMD cards?

Answer (from AMD):

Some of our hardware can support functionality similar to that in the NVIDIA extension you mention, but we are currently not shipping an extension of our own. We will likely hold off until we can come to a consensus with other hardware vendors on a common extension before exposing the feature, but it will come in time.








Even NVIDIA's first-gen Maxwell card, the 750 Ti, which launched in early 2014, doesn't have feature level 12_1.


From Christophe Riccio
https://twitter.com/g_truc/status/581224843556843521

It seems that shader invocation ordering is proportionally a lot more expensive on GM204 than S.I. or HSW.
 
Last edited:
Joined
Nov 3, 2011
Messages
697 (0.15/day)
It won't be a major factor, but the consensus amongst developers seems to be that 12_1 features such as conservative rasterization, rasterizer ordered views (ROV)/ order-independent transparency (OIT), voxelization, and adaptive volumetric shadow maps are the way forward for more realistic portrayal of gameplay, reduction of GPU compute overhead, and greater developer control. These may be slow in coming to fruition with DirectX (thanks to consoles not supporting the features natively, or not at all), but OpenGL already has them enabled. In a way, AMD can thank Nvidia and Intel for making Vulkan that much more relevant - how's that for irony.

No and Yes.
No. ATI's TruForm was the first GPU tessellation hardware followed by Matrox's Parhelia line (N-Patch support), then ATI's Xenos (R500 / C1) graphics chip for the Xbox 360. All of these pre-date AMD's involvement with ATI.
Yes. AMD's Evergreen series were the first DirectX 11 compliant GPUs. They arrived just over six months before Nvidia's own DX11 cards.

Do tell? You're starting to sound like AMD's Roy Taylor.
DirectCompute, like most computation, depends upon the workload, the software and, just as importantly, software support (drivers). It also depends heavily upon the emphasis placed upon the design by the architects. A case in point is the tessellation you seem very keen to explore. ATI pioneered it, but it went largely unused. Under AMD's regime tessellation wasn't prioritized, whereas Nvidia made Maxwell a tessellation monster. Neither DirectCompute nor tessellation on its own defines an architecture, or indicates much beyond that one facet.


Well, if Vulkan and the new OpenGL extensions take off like people are expecting, the DirectX coding arena may have their hand forced. If the new OGL turns into the old OGL, Microsoft can probably wait ten years before updating DirectX.




AMD is aware of the extreme-tessellation issue, hence the improvements in the R9 285 (28 CUs).
 
Joined
Feb 24, 2009
Messages
3,516 (0.61/day)
System Name Money Hole
Processor Core i7 970
Motherboard Asus P6T6 WS Revolution
Cooling Noctua UH-D14
Memory 2133Mhz 12GB (3x4GB) Mushkin 998991
Video Card(s) Sapphire Tri-X OC R9 290X
Storage Samsung 1TB 850 Evo
Display(s) 3x Acer KG240A 144hz
Case CM HAF 932
Audio Device(s) ADI (onboard)
Power Supply Enermax Revolution 85+ 1050w
Mouse Logitech G602
Keyboard Logitech G710+
Software Windows 10 Professional x64
Let's have this clarified by someone who actually has some knowledge about it: Link.

Qualifications of the writer: written by Alessio Tommasini, DirectX 12 early access program member.

At this time, we are witnessing an incredible amount of disinformation about the DirectX 12 features supported by the various GPUs currently available from AMD and NVIDIA. Personally, we do not know whether some company arranged this sort of campaign, or whether it is all down to uninformed journalists.

What is for sure is that people need some clarification. First of all, Direct3D 12 is an API designed to run on currently available hardware, as long as it supports virtual memory and tiled resources.
The new API has been largely shaped around a new resource-binding model, which defines the management of textures and buffers in the physical and virtual memory (dedicated or system-shared) of the graphics hardware.

In order to ensure that Direct3D 12 could support the widest range of hardware, without significant compromises that could limit the longevity of the new API, Microsoft and its partners agreed to divide the level of support for the new resource-binding model into three "tiers".
Each tier is a superset of its predecessor: tier 1 hardware comes with the strongest constraints on the resource-binding model, tier 3 conversely has no limitations, while tier 2 represents an intermediate level of constraint.

If we talk about the hardware on sale, the situation regarding the resource-binding tiers is the following:

  • Tier 1: INTEL Haswell and Broadwell, NVIDIA Fermi
  • Tier 2: NVIDIA Kepler, Maxwell 1.0 and Maxwell 2.0
  • Tier 3: AMD GCN 1.0, GCN 1.1 and GCN 1.2
Regarding resource binding, currently only AMD GPUs come without hardware limitations, which some sources have erroneously described as "full support".

In addition to the three resource-binding tiers, Direct3D 12 exposes four "feature levels" - four levels of capability, each of which states a well-defined set of hardware rendering features.
It is important to note that these feature levels are not directly related to the resource-binding tiers; moreover, they cover only some of the rendering capabilities exposed by Direct3D 12.
Some capabilities are not covered at all, even by the highest feature level, and all the others can be individually supported by the graphics hardware (if the GPU architecture and the drivers allow them) regardless of the supported feature level.

If we talk again about the hardware on sale, the situation regarding the feature levels is the following:

  • Feature level 11.0: NVIDIA Fermi, Kepler, Maxwell 1.0
  • Feature level 11.1: AMD GCN 1.0, INTEL Haswell and Broadwell
  • Feature level 12.0: AMD GCN 1.1 and GCN 1.2
  • Feature level 12.1: NVIDIA Maxwell 2.0
The first two feature levels roughly coincide with the DirectX 11 levels of the same name (with some differences due to the new resource-binding model), while feature levels 12.0 and 12.1 are new to Direct3D 12.
At the risk of being pleonastic, it is worth restating that feature level 12.1 does not amount to an imaginary "full/complete DirectX 12 support", since it does not cover many important or secondary features exposed by Direct3D 12.

Reading the various "exclusive news" items found around the internet, additional confusion comes precisely from the individual capabilities: some of them are once again pooled into groups of two or three tiers, but it must be stressed that each of these groupings is completely unrelated to the others, to the three resource-binding tiers, and of course to the four feature levels.

In the end, as regards support for every single capability, it is currently neither possible nor appropriate to draw up a complete and well-defined table showing the support of the hardware on sale.
Unless your name is AMD, INTEL or NVIDIA, you cannot produce such a report with the drivers currently available on public channels, nor with non-NDA documentation; everything else is to be considered pure ranting.
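A quick way to internalize the article's point that resource-binding tiers and feature levels are independent axes is to lay its two tables out as data. The values below are copied from the article above; the code around them is my own illustration:

```python
# The article's two tables as one lookup: (resource-binding tier, feature
# level) per architecture. Purely illustrative - values from the text above.

SUPPORT = {
    "INTEL Haswell":      (1, "11_1"),
    "INTEL Broadwell":    (1, "11_1"),
    "NVIDIA Fermi":       (1, "11_0"),
    "NVIDIA Kepler":      (2, "11_0"),
    "NVIDIA Maxwell 1.0": (2, "11_0"),
    "NVIDIA Maxwell 2.0": (2, "12_1"),
    "AMD GCN 1.0":        (3, "11_1"),
    "AMD GCN 1.1":        (3, "12_0"),
    "AMD GCN 1.2":        (3, "12_0"),
}

def describe(arch):
    tier, fl = SUPPORT[arch]
    return f"{arch}: resource-binding tier {tier}, feature level {fl}"

# The two axes don't track each other: the hardware with the highest binding
# tier (GCN) does not hold the highest feature level (Maxwell 2.0), and
# vice versa - which is why neither number alone means "full DX12 support".
best_tier = max(SUPPORT, key=lambda a: SUPPORT[a][0])
best_fl = max(SUPPORT, key=lambda a: SUPPORT[a][1])
print(describe(best_tier))
print(describe(best_fl))
```

Any capability outside both classifications (as the article notes, several exist) would need a third, per-capability column that no one outside the IHVs can currently fill in.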
 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
Lets clarify this by someone who actually has some knowledge about this: Link.
Qualifications of the writer: Written by Alessio Tommasini, Directx 12 early access program member.
The same information - and informed opinion from programmers (including the author you just quoted, and Andrew Lauritzen of Intel) and tech reviewers such as Anandtech's Ryan Smith - is being actively discussed in a less FUD-oriented thread at B3D, if anyone is interested. (The link is to the current last page of the discussion, but I would recommend reading the whole thread.)
 
Joined
Aug 20, 2007
Messages
21,572 (3.40/day)
The same information - and informed opinion from programmers (including the author you just quoted, and Andrew Lauritzen of Intel) and tech reviewers such as Anandtech's Ryan Smith - is being actively discussed in a less FUD-oriented thread at B3D, if anyone is interested. (The link is to the current last page of the discussion, but I would recommend reading the whole thread.)

But brain hurts! I just want video game.

In seriousness, I may browse through it when I get a chance...
 
Joined
Feb 24, 2009
Messages
3,516 (0.61/day)
The same information - and informed opinion from programmers (including the author you just quoted, and Andrew Lauritzen of Intel) and tech reviewers such as Anandtech's Ryan Smith - is being actively discussed in a less FUD-oriented thread at B3D, if anyone is interested. (The link is to the current last page of the discussion, but I would recommend reading the whole thread.)

Thanks, didn't realize that discussion was going on over there.

What it seems to be is that no GPU out now or coming soon fully supports DX12. NVIDIA's 12_1 is not better than AMD's 12_0 (or vice versa); they just seem to be different feature sub-subsets.

edit: Even these sub-subset numbers (12_0 or 12_1) do not cover all DX12 features. It also seems that DX11 is a subset of DX12. Therefore, while GCN 1.0 may not support some of the new features in DX12, it still "supports" DX12. Even Fermi appears to "support" DX12, but only at the 11_0 feature level.
 
Last edited:
Joined
Nov 3, 2011
Messages
697 (0.15/day)
Thanks, didn't realize that discussion was going on over there.

What it seems to be is that no GPU out now or coming soon fully supports DX12. NVIDIA's 12_1 is not better than AMD's 12_0 (or vice versa); they just seem to be different feature sub-subsets.

edit: Even these sub-subset numbers (12_0 or 12_1) do not cover all DX12 features. It also seems that DX11 is a subset of DX12. Therefore, while GCN 1.0 may not support some of the new features in DX12, it still "supports" DX12. Even Fermi appears to "support" DX12, but only at the 11_0 feature level.
From http://www.bitsandchips.it/52-engli...out-tier-and-feature-levels-of-the-directx-12

•Tier 1: INTEL Haswell and Broadwell, NVIDIA Fermi
•Tier 2: NVIDIA Kepler, Maxwell 1.0 and Maxwell 2.0
•Tier 3: AMD GCN 1.0, GCN 1.1 and GCN 1.2

•Feature level 11.0: NVIDIA Fermi, Kepler, Maxwell 1.0
•Feature level 11.1: AMD GCN 1.0, INTEL Haswell and Broadwell
•Feature level 12.0: AMD GCN 1.1 and GCN 1.2
•Feature level 12.1: NVIDIA Maxwell 2.0


The maximum DX12 support would be Tier 3 and feature level 12.1.

Tier and feature-level support would be useless if the features in question are slow, i.e. act as a decelerator.
 
Last edited:
Joined
Feb 24, 2009
Messages
3,516 (0.61/day)
DX11 is a subset of DX12, thus you have the different feature levels. A GPU is considered to support DX12 "as long as it supports virtual memory and tiled resources", to quote the article. The feature level is not a DX version, but a classification of which DX12 features a given GPU supports. "The first two feature levels roughly coincide to the DirectX 11 levels with the same name (with some differences due the new resource binding model), while feature level 12.0 and 12.1 are new to Direct3D 12."

So Fermi, Kepler and Maxwell 1.0 all support feature level 11.0 of DX12; GCN 1.0, Haswell and Broadwell support feature level 11.1, and so on.

With all that said, "Despite being pleonastic, it is worth to restate that feature level 12.1 does not coincide with an imaginary “full/complete DirectX 12 support” since it does not cover many important or secondary features exposed by Direct3D 12."
 
Joined
Jan 2, 2015
Messages
1,099 (0.30/day)
It's been my understanding that what all DX11 GPUs will support of DX12 is the way the GPU and CPU communicate - draw calls, I believe? And if a feature is GPU-specific, then, as usual, you need a new GPU for that feature. But it doesn't seem like any gamer, especially a casual one, is going to need to rush out and upgrade even a year from now if they have a system from the past few generations.

 
Joined
Sep 7, 2011
Messages
2,785 (0.57/day)
It's been my understanding that what all DX11 GPUs will support of DX12 is the way the GPU and CPU communicate - draw calls, I believe? And if a feature is GPU-specific, then, as usual, you need a new GPU for that feature. But it doesn't seem like any gamer, especially a casual one, is going to need to rush out and upgrade even a year from now if they have a system from the past few generations.
Sort of. AMD's VLIW architectures (the HD 5000 and 6000 series) aren't compatible with DX12, but anything GCN, as well as any Nvidia DX11-capable card, is DX12 compatible. All those cards support DX12 - it is just a matter of the level of support. No card at present supports every facet of the API or its complete feature set.

The basic features of DX12 will be available to most of these GPUs (as will DirectX 11.3). It will be up to game developers which additional features they include - but I would say that if there is no broad-based support among existing cards, the additional features will be options within the game code rather than mandatory.
 