Thursday, June 4th 2015

NVIDIA Could Capitalize on AMD Graphics CoreNext Not Supporting Direct3D 12_1

AMD's Graphics CoreNext (GCN) architecture does not support Direct3D feature-level 12_1 (DirectX 12.1), according to a ComputerBase.de report. The architecture supports Direct3D only up to feature-level 12_0. Feature-level 12_1 adds three features over 12_0: Volume Tiled Resources, Conservative Rasterization, and Rasterizer Ordered Views.

Volume Tiled Resources is an evolution of Tiled Resources (analogous to OpenGL mega-texturing), in which the GPU seeks and loads only those portions of a large texture that are relevant to the scene it's rendering, rather than loading the entire texture into memory. Think of it as a virtual-memory system for textures. This greatly reduces video memory usage and bandwidth consumption. Volume Tiled Resources extends this by letting the GPU seek portions of a texture not only along the X and Y axes, but along a third dimension as well. Conservative Rasterization is a means of drawing polygons with additional pixels that make it easier for two polygons to interact with each other in dynamic objects. Rasterizer Ordered Views are a means of ordering raster loads in the order in which they appear in an object; practical applications include improved shadows.
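For readers who want to see where these capabilities surface for developers: below is a minimal sketch (illustrative only, not taken from the ComputerBase.de report) of how an application could query a Direct3D 12 device for the three capabilities above through the standard CheckFeatureSupport call. It assumes a Windows 10 SDK build environment.

// Minimal sketch: query the Direct3D 12 capabilities behind feature-level 12_1.
// Assumes Windows 10 SDK headers and linking against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter; 11_0 is the minimum level for D3D12.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device)))) {
        std::printf("No Direct3D 12 capable device found.\n");
        return 1;
    }

    // D3D12_FEATURE_DATA_D3D12_OPTIONS reports, among other things,
    // the capabilities that separate feature-level 12_0 from 12_1.
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts)))) {
        std::printf("Tiled resources tier:            %d\n", opts.TiledResourcesTier);
        std::printf("Conservative rasterization tier: %d\n", opts.ConservativeRasterizationTier);
        std::printf("Rasterizer ordered views:        %s\n", opts.ROVsSupported ? "yes" : "no");
    }
    return 0;
}

The article above ties all three to 12_1; in the API they are reported individually, so a given GPU can support some of them but not others.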
Given that GCN doesn't feature bare-metal support for D3D feature-level 12_1, its implementation will be as limited as feature-level 11_1 was when NVIDIA's Kepler didn't support it. This is compounded by the fact that GCN is a more popular GPU architecture than Maxwell (which supports 12_1), thanks to the new-generation game consoles. It could explain why NVIDIA dedicated three-fourths of its GeForce GTX 980 Ti press deck to talking about the features of D3D 12_1 at length. The company probably wants to make a few new effects that rely on D3D 12_1 part of GameWorks, and deflect accusations of exclusivity onto the competition (AMD) for not supporting certain API features that are open to them. Granted, AMD GPUs and modern game consoles such as the Xbox One and PlayStation 4 don't support GameWorks, but that didn't stop big game devs from implementing them.
Source: ComputerBase.de

79 Comments on NVIDIA Could Capitalize on AMD Graphics CoreNext Not Supporting Direct3D 12_1

#26
Folterknecht
The source for that computerbase.de "article" is an AMD guy (Robert Hallock) with whom computerbase.de had a chat.
Posted on Reply
#27
Kaotik
Random Murderer: GCN 1.0 was confirmed DX12-compliant.
And the point I was bringing up is that two of the features specifically mentioned in this article that GCN supposedly doesn't support, it does at a hardware level, unlike anything (except the latest Maxwell) NV has, specifically tiled resources and conservative rasterization. Regardless of tier, these are DX12_1 features, and they are supported on-metal by GCN, which directly contradicts this "news" article.
AMD has said it themselves too: current GCN implementations only support FL12_0 (or possibly lower in the case of 1.0). They lack some of the hardware features required for DX12's ROVs or conservative rasterization, or both - I can't remember off the top of my head. Also, volume tiled resources is different from tiled resources. GCN can do the latter for sure; there's still no clear answer on the former.
The only relevant remaining question is whether GCN 1.0 is 11_1 or 12_0.

(Also, side note: not all chips considered GCN 1.0 are necessarily 1.0 anymore. There are indications that Cape Verde and Pitcairn (Curacao) have been upgraded to 1.1 after the HD7-series: AMD's OpenCL 2.0 requires GCN 1.1, yet their list of OpenCL 2.0-supporting products includes several Cape Verde and Pitcairn based products, but not their HD7 versions.)
(GCN 1.1 in this case referring to the 3D/compute-related capabilities of the chips, not including display controller or audio controller features, which obviously aren't there.)

edit:
Regarding the source, they seem to be somewhat confused, too. Bonaire is the same generation as Hawaii, but they seem to think it's older.
Posted on Reply
#29
15th Warlock
the54thvoid: I don't read German so I can't read the source article, but it's not Nvidia saying it. Is it? Is the German article alluding to a fact not of Nvidia's fabrication, or is it pertaining to Nvidia's heavy PR on the feature set Maxwell focuses on?
Can someone clarify without any brand bias? Obviously I can see the charts posted, but has AMD stated in any press deck that they don't support tier 1?
In other words, to simplify: who the f@ck said GCN doesn't support it?
Lol, yes you're right. So this news item, is it more of an editorial then? Nowhere does the quoted article say Nvidia will use this to their advantage :shadedshu:
Posted on Reply
#30
Nosada
Just like we have the (PR) tags in front of articles that are press releases from companies, can we get a (FUD) tag for articles that have no basis in reality please?
Posted on Reply
#31
arbiter
btarunr: The company probably wants to make a few new effects that rely on D3D 12_1 part of GameWorks, and deflect accusations of exclusivity onto the competition (AMD) for not supporting certain API features that are open to them. Granted, AMD GPUs and modern game consoles such as the Xbox One and PlayStation 4 don't support GameWorks, but that didn't stop big game devs from implementing them.
HairWorks relies on DX11 tessellation, so they already do that with current effects. AMD cards royally suck at tessellation.
NC37: nVidia must be terrified of Fury if they are running this scared and attempting to deceive consumers about DX12. 6/16 is looking even more interesting.
Pfft, most reports show Fury isn't as good as people expect; a few say it will be SLOWER than a 980 Ti, which given history could be possible, but likely it's only as fast and costs more.


Kinda funny that the first page of comments is all AMD fans attacking, pretty sad.
Posted on Reply
#32
Fluffmeister
Still, it's good to know Maxwell v2 is DX12_1 compliant at least. :P
Posted on Reply
#33
HumanSmoke
My god, yet another topic that seems to be less understood the more it's talked about.

DirectX12 has both hardware level support and feature level support - they are NOT the same entities.
Resource binding tiers are a subset of DirectX 11, not 12. Tier 1 requires 11.0 support, Tier 2 requires 11.1 support, and Tier 3 requires 11.2 support. Tier support - whether 1, 2, or 3 - is mandatory for DX12. Supported architectures at each level are:


Within those tiers, the feature level support associated with - but not the same as - the hardware support (since some of the features can be emulated in software) is broken down thus:


The hardware level and feature level used, and whether any feature that isn't natively supported will be emulated, will be down to the game developers.
As I pointed out in another post a few days ago, feature level support will likely depend on which IHV's game dev program backing (if any) the game studio leverages.
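(To make that distinction concrete, here is a short illustrative sketch - an editorial example under stated assumptions, not taken from any post in this thread - showing that the hardware resource-binding tier and the maximum supported feature level come from two separate queries on the same device. It assumes a valid ID3D12Device pointer named "device", created as in the earlier sketch.)

// Illustrative only: binding tier and feature level are independent queries.
D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS, &opts, sizeof(opts));
// opts.ResourceBindingTier holds the hardware resource-binding tier (1, 2 or 3).

const D3D_FEATURE_LEVEL requested[] = {
    D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
    D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0
};
D3D12_FEATURE_DATA_FEATURE_LEVELS fls = {};
fls.NumFeatureLevels        = _countof(requested);
fls.pFeatureLevelsRequested = requested;
device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS, &fls, sizeof(fls));
// fls.MaxSupportedFeatureLevel is the highest feature level the device exposes.
// A GPU can report binding Tier 3 here while still topping out at feature-level 12_0.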
Fluffmeister: This is a good article; they indicate Feature Levels have different Tiers within them, for example:
So in short, it's perfectly reasonable to assume AMD supports DX12_0 up to Tier 3; Tier 3 doesn't automatically mean they support DX12_1.
Exactly.
If the publicity is guilty of anything, it is making the explanation too simplistic. AMD is also to blame here, since they widely circulated some not very concise slides and made little if any distinction between native support and emulation, which seems to cause some confusion regarding GCN 1.0's support of DX11.2 Tier 2 (which of course is carried over to DX12).
Octopuss: Oh, this is btarunr who posted this. Not buying into his Nvidia bullshit anymore.
It's actually btarunr quoting a German site of some repute, which interviewed AMD's head graphics marketing exec, who is telling anyone who will listen that Nvidia could include conservative rasterization features in their game dev program... which could well eventuate, although since Maxwell v2 makes up such a small part of the graphics market they would almost certainly be additional features rather than obligated ones. So, Mr/Mrs/Ms Octopuss, what this is all about is AMD alerting the world to the possible (and in some ways probable) expansion of Nvidia's GameWorks program. Maybe you should have taken the time to read the original source material.
Posted on Reply
#34
Lionheart
arbiter: HairWorks relies on DX11 tessellation, so they already do that with current effects. AMD cards royally suck at tessellation. (snip)
Dude, I wouldn't talk if I were you; you praise Nvidia like close-minded PC fanboys praise Gabe Newell!
Posted on Reply
#35
GhostRyder
Lionheart: Dude, I wouldn't talk if I were you; you praise Nvidia like close-minded PC fanboys praise Gabe Newell!
Shots fired! If anyone is biased, though, he's one of the major ones.

As for the article, it's based on what another site wrote. I don't really count any of this as showing bt is biased. He was happy to post multiple articles about the GTX 970 memory issue and the blocked overclocking, so I really would not call him biased.

Whatever the case, the truth will be revealed in time. You can lie all you want (not claiming anything specific is a lie) but in the end the truth will be shown one way or another.
Posted on Reply
#36
Xzibit
HumanSmoke
That's a user-generated table from DX 11.3 FL 11_1.



It also contradicts what Nvidia has said about pre-980/970 cards and CR, which falls in line with the other table.
HumanSmoke
Posted on Reply
#37
arbiter
Lionheart: Dude, I wouldn't talk if I were you; you praise Nvidia like close-minded PC fanboys praise Gabe Newell!
Facts are facts, even though AMD fans won't admit the truth when it hurts their favorite brand.

As for ALL the hype over AMD's latest chip, it's ALL hype. Just because it has HBM this and 4,000 GCN cores that - as proven over the last many years, just because an AMD chip has more of this or that doesn't mean it's better or faster. History proves that as fact.
Posted on Reply
#38
R-T-B
HumanSmoke: My god, yet another topic that seems to be less understood the more it's talked about. (snip)
Thanks for taking the time to explain that... I'm sure I'm not the only one here to admit it makes my head spin a bit.

Bottom line: The game devs will code at the lowest common denominator, as they always have, making ALL of this stupid FUD.
Posted on Reply
#39
Lionheart
arbiter: Facts are facts, even though AMD fans won't admit the truth when it hurts their favorite brand. (snip)
Yeah maybe I guess, will just have to wait & see :toast:
Posted on Reply
#40
renz496
15th Warlock: This is so confusing, the other day I read in this forum that GCN supports DX12_3 all the way to ver 1.1, and now this seems to contradict that. All these DX12 feature levels are making my head spin. I mean, can someone please clarify what's going on with that? :(
Except there is no such thing as 12_3; there are only 12_0 and 12_1. People are confusing tier-level support with feature-level support.
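A small, purely illustrative sketch of that point (the enum names are the standard D3D12 ones; the fallback pattern is an assumption about how an app might pick a level, not something from the thread): the only Direct3D 12 feature levels an application can ask for are 12_1, 12_0, 11_1 and 11_0 - there is no 12_3.

// Illustrative fallback: try each real feature level in descending order.
// There is no D3D_FEATURE_LEVEL_12_3; "Tier 3" refers to resource binding, not a feature level.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

Microsoft::WRL::ComPtr<ID3D12Device> CreateBestDevice()
{
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_12_1,
        D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_11_0,
    };
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    for (D3D_FEATURE_LEVEL level : levels) {
        // Stop at the first (highest) minimum feature level the adapter accepts.
        if (SUCCEEDED(D3D12CreateDevice(nullptr, level, IID_PPV_ARGS(&device))))
            return device;
    }
    return nullptr; // no Direct3D 12 capable adapter found
}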
Posted on Reply
#41
HumanSmoke
Xzibit: It also contradicts what Nvidia has said about pre-980/970 cards and CR, which falls in line with the other table.
Really?
And what did Nvidia say about Fermi and Maxwell v1 and conservative rasterization?

Just throwing shit at the wall and hoping nobody notices, or are you privy to facts you aren't willing to share? Or are you confusing full hardware support with software emulation? Because that is available to most architectures - it just isn't very resource-friendly.

renz496: Except there is no such thing as 12_3; there are only 12_0 and 12_1. People are confusing tier-level support with feature-level support.
+1....tier
R-T-B: Bottom line: The game devs will code at the lowest common denominator, as they always have, making ALL of this stupid FUD.
Yes, I doubt that it makes much of a difference in the long run (although The Witcher 3 utilizes some of the same features, which is why Maxwell v2 cards and AMD's GCN do better than Kepler, since the latter doesn't have DX 11.2 support), but I'm sure it will churn up endless debate over what probably amounts to a 3-4% performance variance, while the content of the bulk of the games themselves - uninspiring corridor shooters and thinly veiled ripoffs of earlier titles - receives a bare fraction of the same forum bandwidth. Yay for common sense.
Posted on Reply
#42
Haytch
Devon68: Well yeah, but AMD has plenty of both, especially with the new HBM cards, so I see no issues with this.
This should help Nvidia with their 4 GB - sorry, I mean 3.5 GB - GPU lineup :)
Posted on Reply
#43
R-T-B
HumanSmoke: while the content of the bulk of the games themselves - uninspiring corridor shooters and thinly veiled ripoffs of earlier titles - receives a bare fraction of the same forum bandwidth. Yay for common sense.
Glad to see I'm not the only one who finds present games rather uninspired.
Posted on Reply
#44
Redkaliber
I'm just mad that my 970s won't run the full 12_1 crap. I mean wtf, I bought 'em thinking I'd be set for a while. Now I find out there is a higher version not supported by my card? WTF M$
Posted on Reply
#45
btarunr
Editor & Senior Moderator
For those of you who don't know, Google Translate exists.

In summary, no 12_1 for GCN. Not only did a person of authority from AMD confirm it, but AMD also sour-graped about it, much like NVIDIA did about 11_1 for Kepler.





"For 'Fiji' AMD is not expressed" means that AMD guy didn't want to talk about Fiji. Any DACH guys can confirm that translation.
Posted on Reply
#46
lilhasselhoffer
This is a fun discussion. It reads first as red/green team members "proving" their team is better without ever proving anything. It then morphs into accusations that the article's author is biased. Finally, we wind up in the bizarre land of what exactly DX12 support is.

Addressing these issues in order:

When GCN and Kepler came out, DX11 was just starting to be the standard for gaming. The fact that there is any proposed support for either of these architectures to offer DX12 features is a small miracle. It'd be like expecting your 2000 Toyota Camry to have an iPod dock. Obviously DX12 is something that is meant to reach back and give some cards new features, but expecting any of these cards to have 100% compatibility is a pipe dream. Neither AMD nor Nvidia is magic; deal with it.


Btarunr and I often have differences of opinion. Look back to the editorial on everyone "needing to accept Windows 8, because it's not going away" and you can see that clearly. Despite differences of opinion, you need to offer respect where it is due. This article, and Btarunr in general, are shooting straight. There aren't any lies introduced by the author, there's no reason to believe fiscal gains are the motive, and reading the original piece, I think this is a fair representation. I disagree that reporting on reporters reporting is news (seriously, it's hard enough to report fairly when it's a one-on-one interview), but that's no reason to accuse a person of being a shill. Whenever you've got incontrovertible proof of wrongdoing, present it. Right now these accusations are hollow, and dismissed as easily as they are made.


Sniff....sniff.....sniff.... Does anyone else smell a turd? I think I remember that smell... Vista, is that you?

Jokes aside, isn't anyone else getting flashbacks of Vista when they hear about DX12? Some hardware supports it, other hardware is capable of supporting part of it, and people are selling hardware as compatible despite knowing that you'd need a half dozen upgrades to get a system running reasonably. DX12 is a complicated animal, over simplified in the media and by sales, and the ensuing fallout from the false portrayal will hurt more people than it could ever help. The end goal is just to move more hardware, as people get frustrated by partial support and the promised improvements being hollow.



Let's start some more fires. Fiji is supposed to be faster than, slower than, and as fast as the GTX 980 Ti. Zen is supposed to both compete with Intel and put AMD back into the dedicated CPU market. Intel is going to improve FPGA production and cost by incorporating Altera into its production chain, but because the market will become a monopoly, it will destroy affordable product lines. You see, FUD is fun. My last four sentences are complete speculation, yet they've started flame wars on forums. Seriously, show me the numbers before I give a crap. Until there are solid FACTS, you might as well be playing with yourself for all the energy you're wasting on going nowhere.
Posted on Reply
#47
bubbleawsome
Honestly, as long as my 280X can handle the most basic DX12 features, or at least spoof them without too much of an issue, I couldn't give less of a crap. My GPU came out in 2012; we should only expect full DX12 support from this generation or the next. The fact that old cards might still get the CPU (and other) benefits of DX12 is good enough for me.

Honestly I'll probably need to upgrade my GPU before dx12 is widespread enough to be required.
Posted on Reply
#48
snakefist
The League of Extraordinary (Wealthy) Gentlemen

This summarizes my feelings when reading these comments.

To be honest, I'll say this from the start: I like AMD better than Intel or NVIDIA - but that has never stopped me from buying the clearly better product when it makes sense. Secondly, my machine - on which I do game (though mostly on titles that I really like and spend many hours, often hundreds, with) - is getting old, almost 3 years now. Even when bought, it was of average 'strength' - though I do tend to build a lot of systems for friends who need one and have less experience than I do (I have owned various PCs since 1987, after all :) ).

But, an average comment poster seems to have something like this:
- at least i7 of the newer generation, looking for an upgrade
- Titan X or, at least 980x (preferably in SLI)
- 4k monitor (at least one), or some multi-monitor configuration

Same poster is:
- highly environment-aware, though running an 800+ W PSU, and counting every 10-30 W of extra usage as unacceptable
- liking and needing Iris Pro
- highly dependent on the yet-unreleased DirectX 12, even to the point where some small inability of the graphics card to hardware-accelerate a feature means life or death, NOW

The configuration mentioned easily surpasses $2,000, and may need replacement immediately, or very soon, because, well, some of the games released after the famous June 29th won't have hardware acceleration for every feature that would questionably be used in those games - software emulation is out of the question, naturally.

Well, MY old system is quite capable of supporting the games I play, and that's not just Solitaire - a number of AAA titles are in there.

- 4K - though I've seen enough gaming on those, I've decided that I don't need one yet
- Multi-monitor gaming - it looks... bad, in my book (though I do have a second monitor, I use it for different purposes entirely)
- Iris Pro is inherently... unneeded; a person who spends that much on a CPU and is a passionate gamer buys a REAL graphics card, even if it's just an average one, and gets 5-10x better results out of it at minimum (the whole concept of Iris Pro eludes me - it's suited to a budget CPU and budget gaming, yet it carries a premium price)

The main features of DX12 are, as my understanding goes, to better utilize CPU cores and the GPU, both existing ones and yet-unreleased ones. Other than this, there are a bunch of new features - but then again, there are PhysX (and Flex and Environmental and so on), Fur, God Rays, HairFX - also TrueAudio, TressFX, Mantle... Neither green- nor red-leaning games profited hugely from having, or missing, those - yes, they are noticeable, especially in demos, and probably more so than some cryptic "Feature Level 12_x".

That being said, I wouldn't worry too much about it for the time being - and neither should others. Buying a new GPU primarily because it supports something as vague as "Feature Level 12_x" is a pure waste of money. You can always do the same when and IF this starts to affect your experience. When it happens, chances are that prices will be lower and products more mature and better overall, perhaps to the point of there being a whole new generation.

I plan to upgrade my computer in 2016, if something important doesn't change - this goes both ways.

(Oh, and I did have the privilege of seeing several expensive components and how they affected my configuration - for me, they weren't worth buying, but mileage may vary - those who need them probably have them already.)

Sorry for the wall of text, but I felt I had to write this, since the whole discussion about replacing a GPU or profiting from having "Feature Level 12_x" seems so pointless to me.
Posted on Reply
#49
qubit
Overclocked quantum bit
Ouch, can anyone say controversial?

There's no love lost between these two, that's for sure lol.
Posted on Reply
#50
CyberBuddha
What does it mean in terms of performance if GCN does or does not support Direct3D 12_1?
I would appreciate a short and clear explanation.
Would it affect performance at all? If so, how much?
Posted on Reply