# NVIDIA Could Capitalize on AMD Graphics CoreNext Not Supporting Direct3D 12_1



## btarunr (Jun 4, 2015)

AMD's Graphics CoreNext (GCN) architecture does not support Direct3D feature-level 12_1 (DirectX 12.1), according to a ComputerBase.de report. The architecture only supports Direct3D up to feature-level 12_0. Feature-level 12_1 adds three features over 12_0: Volume Tiled Resources, Conservative Rasterization, and Rasterizer Ordered Views. 

Volume Tiled Resources is an evolution of tiled resources (analogous to OpenGL mega-texture), in which the GPU seeks and loads only those portions of a large texture that are relevant to the scene it's rendering, rather than loading the entire texture into memory. Think of it as a virtual memory system for textures; it greatly reduces video memory usage and bandwidth consumption. Volume Tiled Resources extends this seeking of texture portions beyond the X and Y axes by adding a third dimension. Conservative Rasterization is a means of drawing polygons with additional pixels that make it easier for two polygons to interact with each other in dynamic objects. Rasterizer Ordered Views are a means to optimize raster loads in the order in which they appear in an object; practical applications include improved shadows.
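As a rough sketch of the tiled-resources idea (illustrative Python, not actual Direct3D code; the tile size and function names here are made up for the example):

```python
# Illustrative sketch of tiled-resource residency (not real D3D code).
# A large texture is split into fixed-size tiles; only tiles that the
# current view touches are kept resident in video memory.

TILE = 128  # tile edge in texels (made-up figure for the example)

def tiles_for_region(x0, y0, x1, y1):
    """Return the set of (tx, ty) tile coordinates covering a texel region."""
    return {(tx, ty)
            for tx in range(x0 // TILE, (x1 - 1) // TILE + 1)
            for ty in range(y0 // TILE, (y1 - 1) // TILE + 1)}

def update_residency(resident, visible_region):
    """Load newly needed tiles, evict tiles no longer visible."""
    needed = tiles_for_region(*visible_region)
    to_load = needed - resident
    to_evict = resident - needed
    return (resident | to_load) - to_evict, to_load, to_evict

resident = set()
resident, loaded, evicted = update_residency(resident, (0, 0, 300, 300))
# Only the 9 tiles the view touches are now resident, not the whole texture.
```

Volume tiled resources apply the same residency logic along a third (Z) axis, for volumetric data.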






Given that GCN doesn't feature bare-metal support for D3D feature-level 12_1, its implementation will be as limited as feature-level 11_1 was when NVIDIA's Kepler didn't support it. This is compounded by the fact that GCN is a more popular GPU architecture than Maxwell (which supports 12_1), thanks to the new generation of game consoles. It could explain why NVIDIA dedicated three-fourths of its GeForce GTX 980 Ti press deck to talking about the features of D3D 12_1 at length. The company probably wants to make a few new effects that rely on D3D 12_1 part of GameWorks, and deflect accusations of exclusivity by pointing out that the competition (AMD) doesn't support certain API features that are open to it. Granted, AMD GPUs and modern game consoles such as the Xbox One and PlayStation 4 don't support GameWorks, but that didn't stop big game devs from implementing it.

*View at TechPowerUp Main Site*


----------



## Kaotik (Jun 4, 2015)

Saying "GCN doesn't support" is bad to start with, considering that GCN is already comprised of 3 separate generations with different feature sets.
Current GCN generations, 1.0, 1.1 and 1.2, are indeed limited to Feature Level 12_0, but that doesn't mean all GCN generations will be.
((there's a slight chance 1.0 would be limited to 11_1, but several sources suggest it can do 12_0, too))


----------



## Fluffmeister (Jun 4, 2015)

http://www.techpowerup.com/forums/t...70-more-details-revealed.212850/#post-3286732

Let's hope so then!


----------



## RejZoR (Jun 4, 2015)

Whoa whoa, hold your horses. Weren't we looking at charts 2 weeks ago where ALL GCN chips supported up to Tier 3, and now all of a sudden they don't support anything? NVIDIA, are you digging yourself out of a pile of dung again with lies?

EDIT:
http://www.techpowerup.com/forums/t...70-more-details-revealed.212850/#post-3286730

What the freaking hell is this then!?


----------



## Assimilator (Jun 4, 2015)

Kaotik said:


> Saying "GCN doesn't support" is bad to start with, considering that GCN is already comprised of 3 separate generations with different feature sets.
> Current GCN generations, 1.0, 1.1 and 1.2, are indeed limited to Feature Level 12_0, but that doesn't mean all GCN generations will be.
> ((there's a slight chance 1.0 would be limited to 11_1, but several sources suggest it can do 12_0, too))



Have to agree - this article is low on facts and high on clickbait.


----------



## 15th Warlock (Jun 4, 2015)

This is so confusing, the other day I read in this forum that GCN supports DX12 _3 all the way to ver 1.1, and now this seems to contradict that, all these DX12 feature levels are making my head spin, I mean, can someone please clarify what's going on with that?


----------



## Devon68 (Jun 4, 2015)

> This greatly reduces video memory usage and bandwidth consumption.


Well yeah, but AMD has plenty of both, especially with the new HBM cards, so I see no issues with this.


----------



## Xzibit (Jun 4, 2015)

15th Warlock said:


> This is so confusing, the other day I read in this forum that GCN supports DX12 _3 all the way to ver 1.1, and now this seems to contradict that, all these DX12 feature levels are making my head spin, I mean, can someone please clarify what's going on with that?



According to Microsoft DX 12 presentations Tier 1 = DX 12_1.   Even Intel iGP is 12_1 capable.


----------



## 15th Warlock (Jun 4, 2015)

Xzibit said:


> According to Microsoft DX 12 presentations Tier 1 = DX 12_1.   Even Intel iGP is 12_1 capable.



Thanks, you seem to be very knowledgeable on this topic, I've read about 390X supporting feature level _12, which wasn't even mentioned before, can you please help me understand about all these feature levels? Thank you


----------



## Fluffmeister (Jun 4, 2015)

15th Warlock said:


> This is so confusing, the other day I read in this forum that GCN supports DX12 _3 all the way to ver 1.1, and now this seems to contradict that, all these DX12 feature levels are making my head spin, I mean, can someone please clarify what's going on with that?



From an interview with Robert Hallock, Head of Global Technical Marketing at AMD, maybe he doesn't know either!  LOL.


----------



## Xzibit (Jun 4, 2015)

15th Warlock said:


> Thanks, you seem to be very knowledgeable on this topic, I've read about 390X supporting feature level _12, which wasn't even mentioned before, can you please help me understand about all these feature levels? Thank you



Here is the video from Microsoft saying Tier 1 = DX 12_1. It also gives an explanation of the Tiers.

*Advanced DirectX12 Graphics and Performance*


----------



## Random Murderer (Jun 4, 2015)

Xzibit said:


> According to Microsoft DX 12 presentations Tier 1 = DX 12_1.   Even Intel iGP is 12_1 capable.





15th Warlock said:


> Thanks, you seem to be very knowledgeable on this topic, I've read about 390X supporting feature level _12, which wasn't even mentioned before, can you please help me understand about all these feature levels? Thank you





Xzibit said:


> Here is the video from Microsoft saying Tier 1 = DX 12_1. Also gives explanation to Tiers
> 
> *Advanced DirectX12 Graphics and Performance*



Regardless of what this FUD article states, the only GPUs currently in production that can run DX12 tier 3 are GCN-core GPUs. Maxwell tops out at tier 2, and Fermi, Kepler, and Intel integrated fall into tier 1. I can't believe how NV is trying to spin this, considering their GPUs and drivers are going to have to emulate certain DX12 features that GCN supports on-metal, like conservative rasterization (one of the key features this article mentions).










Bottom line: We're all going to have to wait for Win10 and some DX12.1 software to sift through all this crap we're being fed about compatibility.


----------



## NC37 (Jun 4, 2015)

Xzibit said:


> Here is the video from Microsoft saying Tier 1 = DX 12_1. Also gives explanation to Tiers
> 
> *Advanced DirectX12 Graphics and Performance*



nVidia must be terrified of Fury if they are running this scared and attempting to deceive consumers about DX12. 6/16 is looking even more interesting.


----------



## the54thvoid (Jun 4, 2015)

I don't read German so I can't read the source article, but it's not Nvidia saying it. Is it? Is the German article alluding to a fact not of Nvidia's fabrication, or is it pertaining to Nvidia's heavy PR on the feature set Maxwell focuses on?
Can someone clarify without any brand bias? Obviously I can see the charts posted, but has AMD stated in any press deck that they don't support tier 1?
In other words, to simplify: who the f@ck said GCN doesn't support it?


----------



## Random Murderer (Jun 4, 2015)

the54thvoid said:


> In other words, to simplify: who the f@ck said GCN doesn't support it?


This.
Until we get a solid source, this is just FUD.


----------



## Fluffmeister (Jun 4, 2015)

This is a good article; they indicate Feature Levels have different Tiers within them, for example:



> From what we have been told DX12 Feature Level 12.0 supports tiled resources, bindless textures, and typed UAV access.
> 
> ....
> 
> From what we gather with DirectX 12 is that you can have tiled resources support in DX12 Feature Level 12.1, but then have different Tiered Levels within that Feature Set. Confused yet? Microsoft and NVIDIA aren’t talking about the Tier Levels that are within the Feature Levels just yet, but a quick glance at the DirectX Caps Viewer in the public builds of Windows 10 will show mention of tier levels.





> Everything looks as expected, but we noticed that the Tiled Resources are shown as tier 3. This appears to confirm that Feature Level 12.0 has at least three tiers of options within the feature set. The whole point of having features and tiers is so the software guys know how to code things and it looks like that just got tougher from what we can tell.



So in short, it's perfectly reasonable to assume AMD supports DX12_0 up to Tier 3; Tier 3 doesn't automatically mean they support DX12_1.

Read more at http://www.legitreviews.com/geforce...l-and-tier-details_164782#TdG5DzvvlYrloE41.99


----------



## midnightoil (Jun 4, 2015)

Wow, btarunr runs another load of anti-AMD FUD that's COMPLETELY unsubstantiated, and which I'm pretty sure he knows is for the most part bare-faced lies.  Colour me unsurprised.


----------



## Random Murderer (Jun 4, 2015)

midnightoil said:


> Wow, btarunr runs another load of anti-AMD FUD that's COMPLETELY unsubstantiated, and which I'm pretty sure he knows is for the most part bare-faced lies.  Colour me unsurprised.


A year ago, I would have defended btarunr, but the amount of stuff like this that has started being posted on the front page is just asinine. I read TPU news for NEWS, not FUD.


----------



## Fluffmeister (Jun 4, 2015)

Gee, so much hate... At the end of the day, why worry, guys? With AMD having a monopoly on the console market, devs will always target the lowest common denominator.


----------



## Octopuss (Jun 4, 2015)

What kind of nonsense is this? DX12 is not even out yet.

edit: Oh this is btarunr who posted this. Not buying into his Nvidia bullshit anymore.


----------



## erixx (Jun 4, 2015)

I never cared a flying hoopla when AMD had DX 11.1 (or .2 or whatever) and my GeForce didn't.
Let both of them capitalize on solid working products.


----------



## Devon68 (Jun 4, 2015)

This is the person that made the best argument here:


> What kind of nonsense is this? DX12 is not even out yet.


----------



## Kaotik (Jun 4, 2015)

For all those confused about "tier 3" and whatnot:

Feature tiers and Feature Levels are completely different things.
Several features come in 3 different tiers; depending on hardware capabilities, card X supports Tier 1, Tier 2 or Tier 3 of a given feature.

Feature Levels, however, are different. A Feature Level may require that a certain feature is supported at Tier 2, for example, but that doesn't mean that a card supporting Tier 3 on that feature would meet all the other requirements of that Feature Level.

GCN (1.1 and 1.2 for sure, 1.0 probably) supports Feature Level 12_0.
GCN cards may support some features at a higher Tier than, for example, Maxwell does, but they still lack the features required for Feature Level 12_1, and the features they might support at a higher Tier aren't required for 12_1.
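A toy example of the difference (the feature names and tier numbers below are made up purely for illustration, not an actual capability table):

```python
# Illustrative model: per-feature tiers vs. aggregate feature levels.
# Tier numbers and requirements below are invented for the example.

card_caps = {
    "tiled_resources": 3,      # supported at Tier 3
    "resource_binding": 3,
    "conservative_raster": 0,  # 0 = not supported
    "raster_ordered_views": 0,
}

feature_levels = {
    "12_0": {"tiled_resources": 2, "resource_binding": 2},
    "12_1": {"tiled_resources": 2, "resource_binding": 2,
             "conservative_raster": 1, "raster_ordered_views": 1},
}

def max_feature_level(caps):
    """Highest feature level whose *every* minimum tier is met."""
    met = [fl for fl, reqs in feature_levels.items()
           if all(caps.get(f, 0) >= t for f, t in reqs.items())]
    return max(met, default=None)

# Tier 3 on one feature does not imply the next feature level:
print(max_feature_level(card_caps))  # → 12_0
```

Tier 3 on an individual feature doesn't raise the card's Feature Level; the missing required features cap it.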


----------



## Random Murderer (Jun 4, 2015)

Kaotik said:


> GCN (1.1 and 1.2 for sure, 1.0 probably) supports Feature Level 12_0.
> GCN cards may support some features at a higher Tier than, for example, Maxwell does, but they still lack the features required for Feature Level 12_1, and the features they might support at a higher Tier aren't required for 12_1.


GCN 1.0 was confirmed DX12-compliant.
And the point I was bringing up is that two of the features specifically mentioned in this article that GCN supposedly doesn't support, it does at a hardware level, unlike anything NV has (except latest Maxwell): specifically tiled resources and conservative rasterization. Regardless of tier, these are DX12_1 features, and they are supported on-metal by GCN, which directly contradicts this "news" article.


----------



## el etro (Jun 4, 2015)

NvidiaPowerup.


----------



## GC_PaNzerFIN (Jun 4, 2015)

I suspect these "Feature Levels" and x.1 supports mean as much as they have in the past. Absolutely *nothing*. They never had any benefit whatsoever, but AMD and NVIDIA PR sure fire off a ton of slides every time.


----------



## Folterknecht (Jun 4, 2015)

The source for that computerbase.de "article" is an AMD guy (Robert Hallock) with whom computerbase.de had a chat.


----------



## Kaotik (Jun 4, 2015)

Random Murderer said:


> GCN 1.0 was confirmed DX12-compliant.
> And the point I was bringing up is that two of the features specifically mentioned in this article that GCN supposedly doesn't support, it does at a hardware level, unlike anything NV has (except latest Maxwell): specifically tiled resources and conservative rasterization. Regardless of tier, these are DX12_1 features, and they are supported on-metal by GCN, which directly contradicts this "news" article.


AMD has said it themselves too: current GCN implementations only support FL12_0 (or possibly lower in the case of 1.0). They lack some of the hardware features required for DX12's ROVs or conservative rasterization, or both, I can't remember off the top of my head. Also, volume tiled resources is different from tiled resources. GCN can do the latter for sure; there's still no clear answer on the former.
The only relevant remaining question is whether GCN 1.0 is 11_1 or 12_0.

(also, sidenote: not all chips considered GCN 1.0 are necessarily 1.0 anymore; there are indications that Cape Verde and Pitcairn (Curacao) have been upgraded to 1.1 after the HD7-series (AMD OpenCL 2.0 requires GCN 1.1, yet their list of OpenCL 2.0-supporting products includes several Cape Verde & Pitcairn based products (but not their HD7 versions)))
((GCN 1.1 in this case referring to the 3D/Compute-related capabilities of the chips, not including display controller or audio controller features, which obviously aren't there))

edit:
Regarding the source, they seem to be somewhat confused, too. Bonaire is the same gen as Hawaii, but they seem to think it's older.


----------



## Casecutter (Jun 4, 2015)

In other real news:
*Microsoft to Discuss Windows 10 and DirectX 12 at AMD’s E3 PC Gaming Show*
Read more: http://wccftech.com/microsoft-directx-12-windows-10-amd-e3-pc/#ixzz3c7srZRps


----------



## 15th Warlock (Jun 4, 2015)

the54thvoid said:


> I don't read German so I can't read the source article, but it's not Nvidia saying it. Is it? Is the German article alluding to a fact not of Nvidia's fabrication, or is it pertaining to Nvidia's heavy PR on the feature set Maxwell focuses on?
> Can someone clarify without any brand bias? Obviously I can see the charts posted, but has AMD stated in any press deck that they don't support tier 1?
> In other words, to simplify: who the f@ck said GCN doesn't support it?


Lol, yes you're right, so this news item, is it more of an editorial then? Nowhere does the quoted article say NVIDIA will use this to their advantage.


----------



## Nosada (Jun 4, 2015)

Just like we have the (PR) tags in front of articles that are press releases from companies, can we get a (FUD) tag for articles that have no basis in reality please?


----------



## arbiter (Jun 4, 2015)

btarunr said:


> The company probably wants to make a few new effects that rely on D3D 12_1 part of GameWorks, and deflect accusations of exclusivity to the competition (AMD) not supporting certain API features, which are open to them. Granted, AMD GPUs, and modern game consoles such as the Xbox One and PlayStation 4 don't support GameWorks, but that didn't stop big game devs from implementing them.


HairWorks relies on DX11 tessellation, so they already do that with current effects. AMD cards royally suck at tessellation.



NC37 said:


> nVidia must be terrified of Fury if they are running this scared and attempting to deceive consumers about DX12. 6/16 is looking even more interesting.


pfft, most reports show Fury isn't as good as people expect; a few say it will be SLOWER than a 980 Ti, which given history could be possible, but likely it's only as fast and costs more.


Kinda funny that the first page of comments is all AMD fans attacking. Pretty sad.


----------



## Fluffmeister (Jun 4, 2015)

Still, it's good to know Maxwell v2 is DX12_1 compliant at least.


----------



## HumanSmoke (Jun 4, 2015)

My god, yet another topic that seems to be less understood the more it's talked about.

DirectX12 has both hardware level support and feature level support - they are *NOT* the same entities.
Resource binding tiers are a subset of DirectX 11, *not* 12. Tier 1 requires 11.0 support, Tier 2 requires 11.1 support, Tier 3 requires 11.2 support. Tier support - whether 1, 2, or 3 - is mandatory for DX12. Supported architectures at each level are:






Within those tiers, the feature level support associated with - but not the same as - the hardware support (since some of the features can be emulated in software) is broken down thus:





The hardware level and feature level, and whether any feature lacking native support will be emulated, will be down to the game developers.
As I pointed out in another post a few days ago, feature level support will likely depend on whose IHV game dev program backing (if any) the game studio leverages.
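A sketch of how that choice might look from a game developer's side (hypothetical names; a real engine would query the runtime for capabilities, e.g. via CheckFeatureSupport, rather than read a dict):

```python
# Hypothetical sketch: pick a render path based on how a feature is exposed.
# "native" = hardware support, "emulated" = software fallback, absent = skip.

def choose_raster_path(support):
    """Return which conservative-rasterization path a renderer would take."""
    mode = support.get("conservative_raster")
    if mode == "native":
        return "hw_conservative"      # cheap: the rasterizer does the work
    if mode == "emulated":
        return "sw_conservative"      # works, but costs shader time
    return "standard"                 # feature skipped entirely

print(choose_raster_path({"conservative_raster": "native"}))    # hw_conservative
print(choose_raster_path({"conservative_raster": "emulated"}))  # sw_conservative
print(choose_raster_path({}))                                   # standard
```

The point being: "supported" can mean a cheap hardware path or an expensive emulated one, and the developer decides which cards get which.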


Fluffmeister said:


> This is a good article, they indicate Feature Levels have different Tiers within them, for example:
> So in short, it's perfectly reasonable to assume AMD supports DX12_0 up to Tier 3; Tier 3 doesn't automatically mean they support DX12_1


Exactly.
If the publicity is guilty of anything, it is making the explanation too simplistic. AMD is also to blame here, since they widely circulated some not very concise slides, and made little if any distinction between native support and emulation, which seems to have caused some confusion regarding GCN 1.0's support of DX11.2 Tier 2 (which of course is carried over to DX12).






Octopuss said:


> Oh this is btarunr who posted this. Not buying into his Nvidia bullshit anymore.


It's actually btarunr quoting a German site of some repute, who interviewed AMD's head graphics marketing exec, who is telling anyone who will listen that Nvidia could include conservative rasterization features in their game dev program... which could well eventuate, although since Maxwell v2 makes up such a small part of the graphics market, they would almost certainly be additional features rather than obligated ones. So, Mr/Mrs/Ms Octopuss, what this is all about is *AMD* alerting the world to the possible (and in some ways probable) expansion of Nvidia's GameWorks program. Maybe you should have taken the time to read the original source material.


----------



## Lionheart (Jun 4, 2015)

arbiter said:


> HairWorks relies on DX11 tessellation, so they already do that with current effects. AMD cards royally suck at tessellation.
> 
> 
> pfft, most reports show Fury isn't as good as people expect; a few say it will be SLOWER than a 980 Ti, which given history could be possible, but likely it's only as fast and costs more.
> ...



Dude, I wouldn't talk if I were you; you praise Nvidia like close-minded PC fanboys praise Gabe Newell!


----------



## GhostRyder (Jun 4, 2015)

Lionheart said:


> Dude I wouldn't talk if I were you, you praise Nvidia like close minded PC fanboys praise Gabe Newell!


Shots fired!  If anyone here is biased though, he's one of the major ones.

As for the article, it's an article based on what another site wrote. I really don't count any of this as showing bt is biased. He was happy to post multiple articles about the GTX 970 memory and the blocked overclocking, so I really would not call him biased.

Whatever the case, the truth will be revealed in time. You can lie all you want (not claiming anything specific is a lie), but in the end the truth will be shown one way or another.


----------



## Xzibit (Jun 4, 2015)

HumanSmoke said:


>



That's a user-generated table, from DX 11.3 FL 11_1






It also contradicts what Nvidia has said about pre-980/970 and CR, which falls in line with the other table.



HumanSmoke said:


>


----------



## arbiter (Jun 5, 2015)

Lionheart said:


> Dude I wouldn't talk if I were you, you praise Nvidia like close minded PC fanboys praise Gabe Newell!


Facts are facts, even though AMD fans won't admit the truth when it hurts their favorite brand.

As for ALL the hype over AMD's latest chip, it's ALL hype. Just because it has HBM this and 4000 GCN that; as proven over the last many years, just because an AMD chip has more of this or that doesn't mean it's better or faster. History proves that as fact.


----------



## R-T-B (Jun 5, 2015)

HumanSmoke said:


> My god, yet another topic that seems to be less understood the more it's talked about.
> 
> DirectX12 has both hardware level support and feature level support - they are *NOT* the same entities.
> Resource binding tiers are a subset of DirectX 11, *not* 12. Tier 1 requires 11.0 support, Tier 2 requires 11.1 support, Tier 3 requires 11.2 support. Tier support - whether 1, 2, or 3 - is mandatory for DX12. Supported architectures at each level are:
> ...




Thanks for taking the time to explain that...  I'm sure I'm not the only one here to admit it makes my head spin a bit.

Bottom line:  The game devs will code at the lowest common denominator, as they always have, making ALL of this stupid FUD.


----------



## Lionheart (Jun 5, 2015)

arbiter said:


> Facts are facts, even though AMD fans won't admit the truth when it hurts their favorite brand.
> 
> As for ALL the hype over AMD's latest chip, it's ALL hype. Just because it has HBM this and 4000 GCN that; as proven over the last many years, just because an AMD chip has more of this or that doesn't mean it's better or faster. History proves that as fact.



Yeah maybe I guess, will just have to wait & see


----------



## renz496 (Jun 5, 2015)

15th Warlock said:


> This is so confusing, the other day I read in this forum that GCN supports DX12 _3 all the way to ver 1.1, and now this seems to contradict that, all these DX12 feature levels are making my head spin, I mean, can someone please clarify what's going on with that?



Except there is no such thing as 12_3. There are only 12_0 and 12_1. People are confusing tier level support with feature level support.


----------



## HumanSmoke (Jun 5, 2015)

Xzibit said:


> It also contradicts what Nvidia has said about pre-980/970 and CR, which falls in line with the other table.


Really?
And what did Nvidia say about Fermi and Maxwell v1 and conservative rasterization?

Just throwing shit at the wall and hoping nobody notices, or are you privy to facts you aren't willing to share? Or are you confusing full hardware support with software emulation? Because that is available to most architectures- it just isn't very resource friendly.







renz496 said:


> Except there is no such thing as 12_3. There are only 12_0 and 12_1. People are confusing tier level support with feature level support.


+1....tier


R-T-B said:


> Bottom line:  The game devs will code at the lowest common denominator, as they always have, making ALL of this stupid FUD.


Yes, I doubt that it makes much of a difference in the long run (although The Witcher 3 utilizes some of the same features, which is why Maxwell v2 cards and AMD's GCN do better than Kepler, since the latter doesn't have DX 11.2 support), but I'm sure it will churn up endless debate over what probably amounts to a 3-4% performance variance, while the content of the bulk of the games themselves - uninspiring corridor shooters and thinly veiled ripoffs of earlier titles - receives a bare fraction of the same forum bandwidth. Yay for common sense.


----------



## Haytch (Jun 5, 2015)

Devon68 said:


> Well yeah, but AMD has plenty of both, especially with the new HBM cards, so I see no issues with this.


This should help Nvidia with their 4GB - sorry, I mean 3.5GB - GPU lineup


----------



## R-T-B (Jun 5, 2015)

HumanSmoke said:


> while the content of the bulk of the games themselves - uninspiring corridor shooters and thinly veiled ripoffs of earlier titles receive a bare fraction of the same forum bandwidth. Yay for common sense.



Glad to see I'm not the only one who finds present games rather uninspired.


----------



## Redkaliber (Jun 5, 2015)

I'm just mad that my 970s won't run the full 12.1 crap. I mean wtf, I bought 'em thinking I'd be set for a while. Now I find out there is a higher version not supported by my card? WTF M$


----------



## btarunr (Jun 5, 2015)

For those of you who don't know, Google Translate exists.

In summary: no 12_1 for GCN. Not only did a person of authority from AMD confirm it, but AMD also sour-graped about it, much like NVIDIA did about 11_1 for Kepler.











"For 'Fiji' AMD is not expressed" means that the AMD guy didn't want to talk about Fiji. Any DACH guys can confirm that translation.


----------



## lilhasselhoffer (Jun 5, 2015)

This is a fun discussion. It reads first as red/green team members "proving" their team is better without ever proving anything. It morphs into accusations that the article's author is biased. Finally, we wind up in the bizarre land of what exactly DX12 support is.

Addressing these issues in order:

When GCN and Kepler came out, DX11 was just starting to be the standard for gaming. The fact that there is any proposed support for either of these architectures to offer DX12 features is a small miracle. It'd be like expecting your 2000 Toyota Camry to have an iPod dock. Obviously DX12 is something that is meant to reach back and give some cards new features, but expecting any cards to have 100% compatibility is a pipe dream. Neither AMD nor Nvidia are magic; deal with it.


Btarunr and I often have differences of opinion. Look back to the editorial on everyone "needing to accept Windows 8, because it's not going away" and you can see that clearly. Despite differences of opinion, you need to offer respect where it is due. This article, and Btarunr in general, are shooting straight. There aren't any lies introduced by the author, there's no reason to believe fiscal gains are the motive, and having read the original piece, I think this is a fair representation. I disagree that reporting on reporters reporting is news (seriously, it's hard enough to report fairly when it's a one-on-one interview), but that's no reason to accuse a person of being a shill. Whenever you've got incontrovertible proof of wrongdoing, present it. Right now these accusations are hollow, and dismissed as easily as they are made.


Sniff....sniff.....sniff....  Does anyone else smell a turd?  I think I remember that smell...  Vista, is that you?

Jokes aside, isn't anyone else getting flashbacks of Vista when they hear about DX12?  Some hardware supports it, other hardware is capable of supporting part of it, and people are selling hardware as compatible despite knowing that you'd need a half dozen upgrades to get a system running reasonably.  DX12 is a complicated animal, over simplified in the media and by sales, and the ensuing fallout from the false portrayal will hurt more people than it could ever help.  The end goal is just to move more hardware, as people get frustrated by partial support and the promised improvements being hollow.



Let's start some more fires.  Fiji is supposed to be faster than, slower than, and as fast as the GTX 980 Ti.  Zen is supposed to both compete with Intel and put AMD back into the dedicated CPU market.  Intel is going to both improve FPGA production and cost by incorporating Altera into its production chain, and, because the market will become a monopoly, destroy affordable product lines.  You see, FUD is fun.  My last four sentences are complete speculation, yet they've started flame wars on forums.  Seriously, show me the numbers before I give a crap.  Until there are solid *FACTS* you might as well be playing with yourself for all the energy you're wasting on going nowhere.


----------



## bubbleawsome (Jun 5, 2015)

Honestly, as long as my 280X can handle the most basic DX12 features, or at least spoof them without too much of an issue, I couldn't give less of a crap. My GPU came out in 2012; we should just be expecting DX12 support in this generation or the next. The fact that old cards might still get the CPU benefits (and others) of DX12 is good enough for me.

Honestly, I'll probably need to upgrade my GPU before DX12 is widespread enough to be required.


----------



## snakefist (Jun 5, 2015)

*The League of Extraordinary (Wealthy) Gentlemen*

This summarizes my feelings when reading these comments.

To be honest, I'll say this from the start: I like AMD better than Intel or NVIDIA - but that never stopped me from buying a clearly better product when it makes sense. Secondly, my machine - on which I game (though mostly games that I really like and spend many hours, often hundreds, with) - is getting old, almost 3 years now. Even when bought, it was of average 'strength' - though I do tend to build a lot of systems for friends who need it and have less experience than I do (I've owned different PCs since 1987, after all).

But an average comment poster seems to have something like this:
- at least an i7 of the newer generation, looking for an upgrade
- a Titan X or at least a 980x (preferably in SLI)
- a 4K monitor (at least one), or some multi-monitor configuration

The same poster is:
- highly environment-aware, though the PSU is 800+W, and every 10-30W of extra usage counts as unacceptable
- liking and needing Iris Pro
- highly dependent on the yet-unreleased DirectX 12, even to the point where some small inability of the graphics card to use hardware acceleration for a feature means life or death, NOW

The configuration mentioned easily surpasses $2000, and may need replacement immediately, or very soon, because, well, some of the games released after the famous June 29th won't have every hardware optimization of every feature which would be questionably used in those game(s) - software emulation is out of the question, naturally.

Well, MY old system is very capable of supporting the games I play, and it's not Solitaire - a number of AAA titles are there.

- 4K - though I've seen enough gaming on those, I've decided that I don't need one yet
- Multi-monitor gaming - it looks... bad, in my book (though I do have a second monitor, I use it for different purposes entirely)
- Iris Pro is inherently... unneeded; a person spending that much on a CPU who is a passionate gamer buys a REAL graphics card, even if it's just an average one, and gets like 5-10x better results out of it (minimum - the whole concept of having Iris Pro eludes me; it's suitable for a budget CPU and budget gaming, yet it bears a premium price)

The main features of DX12 are, as my understanding goes, to better utilize CPU cores and the GPU, existing ones as well as unreleased ones. Other than this, there are a bunch of new features - but then again, there are PhysX (and Flex and Environmental and so on), Fur, God Rays, HairFX - also TrueAudio, TressFX, Mantle... Neither green- nor red-leaning games profited hugely from having, or missing, those - yes, they are noticeable, especially in demos, and probably more so than some cryptic "Feature Level 12_x".

That being said, I wouldn't worry too much about it for the time being - and neither should others. Buying a new GPU because it supports something as vague as "Feature Level 12_x" as the primary reason is a pure waste of money. You can always do the same when and IF this starts to affect your experience. When it happens, chances are that prices will be lower and products more mature and better overall, to the point of having a whole new generation.

I plan to upgrade my computer in 2016, if something important doesn't change - this goes both ways.

(Oh, and I did have the privilege to see several expensive components and how they affect my configuration - for me, they weren't worth buying, but your mileage may vary - those who need them probably have them already)

Sorry for the wall of text, but I felt that I had to write this, since the whole discussion about replacing a GPU or benefiting from "Feature Level 12_x" seems so pointless to me.


----------



## qubit (Jun 5, 2015)

Ouch, can anyone say controversial?

There's no love lost between these two, that's for sure lol.


----------



## Pap1er (Jun 5, 2015)

What does it mean in terms of performance if GCN supports or does not support Direct3D 12_1?
I would appreciate a short and clear explanation.
Would it affect performance at all? If so, how much?


----------



## HumanSmoke (Jun 5, 2015)

Pap1er said:


> What does it mean in terms of performance if GCN supports or does not support Direct3D 12_1?
> I would appreciate a short and clear explanation.
> Would it affect performance at all? If so, how much?


How long is a piece of string?
If the card is saving GPU horsepower through better use of rasterization resources, then the amount of gain depends upon the scenes being rendered. Not all games or game engines are created equal, and that doesn't take into account a myriad of other graphical computation loads also needing to be considered (i.e. tessellation). Even if you could quantify the gains/deficits, they are then affected by how different architectures handle post-rasterization image quality features (post-process depth of field, motion blur, global illumination, etc.).
Basically, what you want is a set figure, when the actuality is that that won't ever be the case unless every other variable becomes a constant - and every architecture, and every part within every architecture, handles every facet of the game to a varying degree.


----------



## RejZoR (Jun 5, 2015)

So, out of this whole clusterfuck of info, when it comes to D3D12_0, old AMD GPUs still support way more than old NVIDIA GPUs (Kepler and Maxwell 1 support none of the feature levels, and the ones they do support are all Tier 1). I'm excluding Maxwell 2, since it's the newest one and was built for D3D12 to begin with anyway. Now it's just a question of how far Fiji will go with support. But seeing that GCN 1.0 already supports some, we can quite safely assume they'll support more than Maxwell 2. Not doing so would be kinda stupid from AMD, considering how late they are releasing Fiji compared to Maxwell 2...


----------



## R-T-B (Jun 5, 2015)

Pap1er said:


> What does it mean in terms of performance if GCN supports or does not support Direct3D 12_1?
> I would appreciate a short and clear explanation.
> Would it affect performance at all? If so, how much?



I can provide a short one (no offense HumanSmoke)

If only one brand supports it, as is suggested, no one in their right mind will code for it.

So short answer is no, it won't make much difference at all.


----------



## _larry (Jun 5, 2015)

lilhasselhoffer said:


> Sniff....sniff.....sniff....  Does anyone else smell a turd?  I think I remember that smell...  Vista, is that you?



Ah yes, DirectX 10. The redheaded stepchild between DX9 and DX11. 

Aren't they making a new Doom? (I bought Wolfenstein: The New Order the day of release and STILL have an unused Doom beta key sitting on my desk...)
What if they came out swinging with Doom on their OpenGL-based id Tech engine when DX12 launched? They have had a good bit of time to optimize it since Rage came out.


----------



## wiak (Jun 5, 2015)

Octopuss said:


> What kind of nonsense is this? Dx12 is not even out yet.


and AMD can still have support in hardware - nobody knew about some of their past-generation features, like the tessellation in the 2900 XT that went unused in DX10.

but wait for Windows 10, DX12 and DX12 games to find out..


----------



## HumanSmoke (Jun 5, 2015)

R-T-B said:


> I can provide a short one (no offense HumanSmoke)
> If only one brand supports it, as is suggested, no one in their right mind will code for it.
> So short answer is no, it won't make much difference at all.


No offense taken, and you're right - most PC games are developed for console, and consoles don't support FL 12_1.
The only caveats are game engines developed primarily on, or in tandem with, PC, where the features could go unused in the console version - and OpenGL game engines, of course.


wiak said:


> and AMD can still have support in hardware - nobody knew about some of their past-generation features, like the tessellation in the 2900 XT that went unused in DX10.


Not really. The tessellator in the R600 was known about from the moment it launched - it wasn't some kind of secret-squirrel hidden capability.
The whole point of this article and thread is that the GCN 1.x architecture cannot undertake conservative rasterization in hardware. It can, however, emulate it in software using the compute ability of the architecture's built-in asynchronous compute engines.
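For anyone wondering what conservative rasterization actually changes: standard rasterization covers a pixel only when the triangle covers the pixel's center sample, while conservative rasterization must cover every pixel the triangle touches at all (overestimating is allowed; missing a pixel is not). A minimal software sketch of the idea - the bounding-box overlap test here is a deliberately crude conservative stand-in, not what real hardware does:

```python
def _edge(a, b, p):
    # Signed area term: which side of the directed edge a->b the point p lies on.
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def center_sample_covered(tri, px, py):
    """Standard rasterization: the pixel is owned only if its center lies inside the triangle."""
    c = (px + 0.5, py + 0.5)
    d = [_edge(tri[i], tri[(i + 1) % 3], c) for i in range(3)]
    return all(v >= 0 for v in d) or all(v <= 0 for v in d)

def conservatively_covered(tri, px, py):
    """Crude conservative test: does the triangle's bounding box touch the pixel square?
    Overestimates coverage, which is allowed - 'conservative' must never miss a pixel."""
    xs, ys = [v[0] for v in tri], [v[1] for v in tri]
    return min(xs) < px + 1 and max(xs) > px and min(ys) < py + 1 and max(ys) > py

# A sliver triangle tucked into a corner of pixel (0, 0): it misses the pixel
# center, so standard rasterization drops it, while conservative keeps it.
sliver = [(0.1, 0.1), (0.3, 0.1), (0.1, 0.3)]
```

That dropped-sliver case is exactly why the feature matters for things like voxelization and occlusion tests - and why emulating it in compute (as described above) costs shader cycles that dedicated hardware doesn't pay.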


----------



## Steevo (Jun 5, 2015)

HumanSmoke said:


> No offense taken, and you're right, most PC games are developed for console - and consoles don't support FL 12_1
> The only caveats are game engines developed primarily or in tandem with PC where the features could be unused in the console version, and OpenGL game engines of course.
> 
> Not really. The tessellator in the R600 was known about from the moment it launched - it wasn't some kind of secret-squirrel hidden capability.
> The whole point of this article and thread is that the GCN 1.x architecture cannot undertake conservative rasterization in hardware. It can, however, emulate it in software using the compute ability of the architecture's built-in asynchronous compute engines.





wiak said:


> and AMD can still have support in hardware - nobody knew about some of their past-generation features, like the tessellation in the 2900 XT that went unused in DX10.
> 
> but wait for Windows 10, DX12 and DX12 games to find out..




TruForm - the ATI 8500 had hardware tessellation, and no one used it, as no competitors had it or wanted to invest in it at the time.


----------



## HumanSmoke (Jun 5, 2015)

Steevo said:


> TruForm - the ATI 8500 had hardware tessellation, and no one used it, as no competitors had it or wanted to invest in it at the time.


TruForm did have a reasonable amount of game support - including a number of AAA titles.

Same old ATI/AMD tune, isn't it?
At least ATI worked to get TruForm integrated as a game feature. AMD got involved and immediately turned an R600 feature into a footnote in history by tossing ATI's game development program into the nearest dumpster.


----------



## xfia (Jun 6, 2015)

i think i would have to agree this means little to nothing.. game devs will continue to load textures the same way, and some will let you decide, like they have forever now. by the time gamers actually need full dx12.1+ or whatever, we will be talking about dx13. assuming dx is still the way to go for gaming by then.

p.s. didnt amd have a hand in developing gpu tessellation, and have fully supporting dx11 gpu's before nvidia? would that not also be around the same time gcn was proving to be more powerful than kepler in direct compute and gpu acceleration?

well i know for sure tessellation works fine on my amd gpu's, and amd-optimized tessellation looks great.. especially in evolved games.


----------



## HumanSmoke (Jun 6, 2015)

xfia said:


> i think i would have to agree this means little to nothing.. so game devs will continue to load textures the same and some will let you decide like forever now.


It won't be a major factor, but the consensus amongst developers seems to be that 12_1 features such as conservative rasterization, rasterizer ordered views (ROV)/ order-independent transparency (OIT), voxelization, and adaptive volumetric shadow maps are the way forward for more realistic portrayal of gameplay, reduction of GPU compute overhead, and greater developer control. These may be slow in coming to fruition with DirectX (thanks to consoles not supporting the features natively, or not at all), but OpenGL already has them enabled. In a way, AMD can thank Nvidia and Intel for making Vulkan that much more relevant - how's that for irony.


xfia said:


> p.s didnt amd have a hand in developing gpu tessellation and had fully supporting dx11 gpu's before nvidia?


No and Yes.
No. ATI's TruForm was the first GPU tessellation hardware followed by Matrox's Parhelia line (N-Patch support), then ATI's Xenos (R500 / C1) graphics chip for the Xbox 360. All of these pre-date AMD's involvement with ATI.
Yes. AMD's Evergreen series were the first DirectX 11 compliant GPUs. They arrived just over six months before Nvidia's own DX11 cards.


> would that not also be around the same time gcn was proving to be more powerful than kepler in direct compute and gpu acceleration?


Do tell? You're starting to sound like AMD's Roy Taylor.
DirectCompute, like most computation, depends upon the workload, the software and, just as importantly, software support (drivers). It also depends heavily upon the emphasis placed upon the designs by the architects. A case in point is the tessellation you seem very keen to explore: ATI pioneered it, but it went largely unused. Under AMD's regime tessellation wasn't prioritized, whereas Nvidia made Maxwell a tessellation monster. Neither DirectCompute nor tessellation on its own defines an architecture, or is an indicator of much besides that facet.







xfia said:


> by the time gamers actually need a full dx12.1+ or whatever we will be talking about dx13. assuming dx is still the way to go for gaming by then.


Well, if Vulkan and the new OpenGL extensions take off like people are expecting, the DirectX coding arena may have their hand forced. If the new OGL turns into the old OGL, Microsoft can probably wait ten years before updating DirectX.


----------



## xfia (Jun 6, 2015)

thanks smoke.. i honestly just didnt know what to believe with all the stuff people say around here. thinking about it.. i think you shared an interview with amd's "gaming scientist" some time ago, and he explained something of a tessellation war going on - or rather, over-tessellation. that was before i started getting into it, but still interesting. 
10 years haha.. yeah, dx12 should be easier to work with, and i mean, what are they even going to add to make it more complex that is a real game changer like tessellation was?


----------



## darkangel0504 (Jun 6, 2015)

We should run benchmarks instead of playing games


----------



## xfia (Jun 6, 2015)

DX12  FTW MICROSOFT!




1990 REHASHED HAHA




DX13?


----------



## rvalencia (Jun 6, 2015)

btarunr said:


> AMD's Graphics CoreNext (GCN) architecture does not support Direct3D feature-level 12_1 (DirectX 12.1), according to a ComputerBase.de report. The architecture only supports Direct3D up to feature-level 12_0. Feature-level 12_1 adds three features over 12_0, namely Volume-Tiled Resources, Conservative Rasterization and Rasterizer Ordered Views.
> 
> Volume Tiled-resources, is an evolution of tiled-resources (analogous to OpenGL mega-texture), in which the GPU seeks and loads only those portions of a large texture that are relevant to the scene it's rendering, rather than loading the entire texture to the memory. Think of it as a virtual memory system for textures. This greatly reduces video memory usage and bandwidth consumption. Volume tiled-resources is a way of seeking portions of a texture not only along X and Y axes, but adds third dimension. Conservative Rasterization is a means of drawing polygons with additional pixels that make it easier for two polygons to interact with each other in dynamic objects. Raster Ordered Views is a means to optimize raster loads in the order in which they appear in an object. Practical applications include improved shadows.
> 
> ...


CR (Conservative Rasterization)

Read http://devgurus.amd.com/message/1308511



Question:

I need for my application that every draw produces at least one pixel of output (even if it is an empty triangle = 3 identical points). NVIDIA has an extension (GL_NV_conservative_raster) to enable such a mode (on Maxwell+ cards). Is there a similar extension on AMD cards?

Answer (from AMD):

Some of our hardware can support functionality similar to that in the NVIDIA extension you mention, but we are currently not shipping an extension of our own. We will likely hold off until we can come to a consensus with other hardware vendors on a common extension before exposing the feature, but it will come in time.


Even Nvidia's first-gen Maxwell card, the 750 Ti, which launched in early 2014, doesn't have 12_1.


*From Christophe Riccio*
https://twitter.com/g_truc/status/581224843556843521

_It seems that shader invocation ordering is proportionally a lot more expensive on GM204 than S.I. or HSW._


----------



## rvalencia (Jun 6, 2015)

HumanSmoke said:


> It won't be a major factor, but the consensus amongst developers seems to be that 12_1 features such as conservative rasterization, rasterizer ordered views (ROV)/ order-independent transparency (OIT), voxelization, and adaptive volumetric shadow maps are the way forward for more realistic portrayal of gameplay, reduction of GPU compute overhead, and greater developer control. These may be slow in coming to fruition with DirectX (thanks to consoles not supporting the features natively, or not at all), but OpenGL already has them enabled. In a way, AMD can thank Nvidia and Intel for making Vulkan that much more relevant - how's that for irony.
> 
> No and Yes.
> No. ATI's TruForm was the first GPU tessellation hardware followed by Matrox's Parhelia line (N-Patch support), then ATI's Xenos (R500 / C1) graphics chip for the Xbox 360. All of these pre-date AMD's involvement with ATI.
> ...


AMD is aware of the extreme-tessellation issue, hence the improvements with the R9 285 (28 CU).


----------



## mastrdrver (Jun 6, 2015)

Let's have this clarified by someone who actually has some knowledge about it: Link.

Qualifications of the writer: Written by Alessio Tommasini, DirectX 12 early access program member.



> At this time, we are witnessing an incredible work of disinformation about the supported DirectX 12 features by the various GPUs currently available from AMD and NVIDIA. Personally, we do not know if some company arranges this sort-of campaign, or if it is just all due by some uninformed journalists.
> 
> What is for sure is that people need some clarifications. First of all, Direct3D 12 is an API designed to run on the currently available hardware, as long as it supports virtual memory and tiled resources.
> The new API has been largely shaped around a new resource-binding model, which defines the management of textures and buffers in physical and virtual memory (dedicated or system-shared) of the graphics hardware.
> ...


----------



## HumanSmoke (Jun 7, 2015)

mastrdrver said:


> Lets clarify this by someone who actually has some knowledge about this: Link.
> Qualifications of the writer: Written by Alessio Tommasini, Directx 12 early access program member.


The same information - and informed opinion by programmers (including the author you just quoted and Andrew Lauritzen of Intel), and tech reviewers such as Anandtech's Ryan Smith is being actively discussed in a less FUD-orientated thread at B3D if anyone is actually interested. (Link is the last current page of the discussion but I would recommend reading the whole thread).


----------



## R-T-B (Jun 7, 2015)

HumanSmoke said:


> The same information - and informed opinion by programmers (including the author you just quoted and Andrew Lauritzen of Intel), and tech reviewers such as Anandtech's Ryan Smith is being actively discussed in a less FUD-orientated thread at B3D if anyone is actually interested. (Link is the last current page of the discussion but I would recommend reading the whole thread).



But brain hurts!  I just want video game.

In seriousness, I may browse through it when I get a chance...


----------



## mastrdrver (Jun 7, 2015)

HumanSmoke said:


> The same information - and informed opinion by programmers (including the author you just quoted and Andrew Lauritzen of Intel), and tech reviewers such as Anandtech's Ryan Smith is being actively discussed in a less FUD-orientated thread at B3D if anyone is actually interested. (Link is the last current page of the discussion but I would recommend reading the whole thread).



Thanks, didn't realize that discussion was going on over there.

What it seems to be is that no GPU out now, or coming soon, fully supports DX12. Nvidia's 12_1 is not better than AMD's 12_0 (or vice versa); it just seems to be a different feature sub-subset.

edit: Even these sub-subset numbers (12_0 or 12_1) do not cover all of DX12's features. Also, it would seem that DX11 is a subset of DX12. Therefore, while GCN 1.0 may not support some of the new features in DX12, it still "supports" DX12. Even Fermi appears to "support" DX12, while only supporting the DX11.1 feature level.


----------



## rvalencia (Jun 8, 2015)

mastrdrver said:


> Thanks, didn't realize that discussion was going on over there.
> 
> What it seems to be is that no GPU out now or soon fully supports DX12. nVidia 12_1 is not better then AMD 12_0 (or vise versa), it just seems to be a different feature sub, sub set.
> 
> edit: Even these sub, sub set numbers (12_0 or 12_1) do not fully support all of DX12 features. Also it would seem that DX11 is a subset of DX12. Therefore, while GCN 1.0 may not support some of the new features in DX12, it still "supports" DX12. Even Fermi appears to "support" DX12, but only support DX11.1 feature level.


From http://www.bitsandchips.it/52-engli...out-tier-and-feature-levels-of-the-directx-12

•Tier 1: INTEL Haswell e Broadwell, NVIDIA Fermi
•Tier 2: NVIDIA Kepler, Maxwell 1.0 and Maxwell 2.0
•Tier 3: AMD GCN 1.0, GCN 1.1 and GCN 1.2

•Feature level 11.0: NVIDIA Fermi, Kepler, Maxwell 1.0
•Feature level 11.1: AMD GCN 1.0, INTEL Haswell and Broadwell
•Feature level 12.0: AMD GCN 1.1 and GCN 1.2
•Feature level 12.1: NVIDIA Maxwell 2.0


The max DX12 support would be Tier 3 and Feature Level 12.1.

Tier-level and feature-level support would be useless if the said features are slow, i.e. a decelerator.
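Since "tier" and "feature level" keep getting conflated in this thread, here is a toy sketch (values transcribed from the lists above; an illustration, not an authoritative capability database) showing that the two are independent axes - the architectures that top one axis don't top the other:

```python
# Resource-binding tier and feature level are independent axes of DX12 support.
# Values transcribed from the bitsandchips.it lists quoted above.
SUPPORT = {
    "Fermi":       {"binding_tier": 1, "feature_level": "11_0"},
    "Kepler":      {"binding_tier": 2, "feature_level": "11_0"},
    "Maxwell 1.0": {"binding_tier": 2, "feature_level": "11_0"},
    "Maxwell 2.0": {"binding_tier": 2, "feature_level": "12_1"},
    "GCN 1.0":     {"binding_tier": 3, "feature_level": "11_1"},
    "GCN 1.1":     {"binding_tier": 3, "feature_level": "12_0"},
    "GCN 1.2":     {"binding_tier": 3, "feature_level": "12_0"},
    "Haswell":     {"binding_tier": 1, "feature_level": "11_1"},
}

def best_on_axis(axis):
    """Return the architectures that max out one axis of support.
    (Feature-level strings like "12_1" happen to sort correctly as text.)"""
    top = max(SUPPORT.values(), key=lambda v: v[axis])[axis]
    return sorted(arch for arch, v in SUPPORT.items() if v[axis] == top)

# GCN tops the binding-tier axis; Maxwell 2.0 tops the feature-level axis.
```

So "max DX12 support" would indeed need Tier 3 *and* FL 12_1 - which, per the lists above, no single architecture has.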


----------



## mastrdrver (Jun 8, 2015)

DX11 is a subset of DX12, thus you have different tiers. A GPU is considered to support DX12 "as long as it supports virtual memory and tiled resources", to quote the article. The feature level is not a DX version support, but a classification of what features of DX12 said GPU supports. "The first two feature levels roughly coincide to the DirectX 11 levels with the same name (with some differences due the new resource binding model), while feature level 12.0 and 12.1 are new to Direct3D 12."

The feature level is a classification of what DX12 features said GPU supports. So, Fermi, Kepler, Maxwell 1.0 all support feature level 11.0 of DX12. GCN 1.0, Haswell, and Broadwell support feature level 11.1 and so on.

With all that said, "Despite being pleonastic, it is worth to restate that *feature level 12.1 does not coincide with an imaginary “full/complete DirectX 12 support”* since it does not cover many important or secondary features exposed by Direct3D 12."


----------



## xfia (Jun 8, 2015)

its been my understanding that what all dx11 gpu's will support of dx12 is the way the gpu and cpu communicate.. draw calls, i believe? and if a feature is gpu-specific, then it's just like normal and you need a new gpu for that feature. but it doesnt seem like any gamer, especially a casual one, is going to need to rush out to upgrade even a year from now if they have a system from the past few generations.


----------



## HumanSmoke (Jun 8, 2015)

xfia said:


> its been my understanding that what all dx11 gpu's will support of dx12 is the way the gpu and cpu communicate.. draw calls i believe? and if a feature is gpu specific then well its just like normal and you need a new gpu for that feature but it doesnt seem like any gamer especially casual is going to need to be rushing out to upgrade even a year from now if they have a system from the past few generations.


Sort of. AMD's VLIW architectures (HD 5000, 6000 series) aren't compatible with DX12, but anything GCN, as well as any Nvidia DX11-capable card, is DX12 compatible. All those cards support DX12 - it is just a matter of the level of support. No cards at the present time support every facet of the API or its complete feature set.

The basic features of DX12 will be available to most of these GPUs (as well as DirectX 11.3). It will be up to game developers which additional features they might include - but I would say that if there is not broad-based support among existing cards, the additional features will be options within the game code rather than mandatory.
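That "options within the game code" point is how this usually plays out in practice: the renderer queries what the device reports at startup and picks a code path. A hypothetical sketch - the capability flags and technique names below are invented for illustration, not the actual D3D12 API:

```python
# Hypothetical cached device capabilities, as a renderer might store them after
# querying the graphics API once at startup (flag names invented for illustration).
def pick_shadow_path(caps):
    """Choose a shadow technique: take the optional FL 12_1 path only when the
    device reports the feature; otherwise fall back to a baseline every DX12
    card can run. The game works either way - the feature is optional, not mandatory."""
    if caps.get("conservative_rasterization", False):
        return "raster-ordered voxel shadows"   # optional 12_1-class path
    return "shadow maps"                        # baseline path, mandatory features only

maxwell2 = {"conservative_rasterization": True}
gcn10    = {"conservative_rasterization": False}
```

This is the same pattern DX11 games already use for optional vendor effects: detect, branch, degrade gracefully.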


----------



## Wshlist (Jun 9, 2015)

People here act like somebody is fabricating stuff, but the level of support of the currently available AMD cards is *confirmed* by AMD.
But of course we don't know about the soon-to-be-released new card.

Anyway the german site has a nice table: http://www.computerbase.de/2015-06/directx-12-amd-radeon-feature-level-12-0-gcn/


----------



## rvalencia (Jun 9, 2015)

Wshlist said:


> People here act like it's somebody fabricating stuff, but the level of support of currently available AMD cards is *confirmed* by AMD.
> But of course we don't know about the about to be released new card.
> 
> Anyway the german site has a nice table: http://www.computerbase.de/2015-06/directx-12-amd-radeon-feature-level-12-0-gcn/


*Again, feature level 12.1 does not coincide with an imaginary “full/complete DirectX 12 support”* since it does not cover many important or secondary features exposed by Direct3D 12

Microsoft allocated a lecture on Resource Binding tier levels during GDC  2015.
http://channel9.msdn.com/Events/GDC/GDC-2015/Advanced-DirectX12-Graphics-and-Performance
Time stamp:  8:09


----------



## Wshlist (Jun 11, 2015)

rvalencia said:


> *Again, feature level 12.1 does not coincide with an imaginary “full/complete DirectX 12 support”* since it does not cover many important or secondary features exposed by Direct3D 12
> 
> Microsoft allocated a lecture on Resource Binding tier levels during GDC  2015.
> http://channel9.msdn.com/Events/GDC/GDC-2015/Advanced-DirectX12-Graphics-and-Performance
> Time stamp:  8:09



Did you even look at the german site's table? At all?


----------



## rvalencia (Jun 15, 2015)

Wshlist said:


> Did you even look at the german site's table? At all?


The German site only shows Feature Levels - hence, *feature level 12.1 does not coincide with an imaginary “full/complete DirectX 12 support”*.
Again, Microsoft allocated a lecture on *Resource Binding tier levels* during GDC 2015.
http://channel9.msdn.com/Events/GDC/GDC-2015/Advanced-DirectX12-Graphics-and-Performance
Time stamp: 8:09


----------



## Vayra86 (Jun 15, 2015)

Everyone should just relax, and wait for actual games to get released with said technologies.

By then we are looking at Pascal, if not later, and this whole discussion is moot anyway.

Bottom line: 2015 is the worst year in a decade to be buying a new GPU (the end of 28nm / the beginning of a double node-size reduction, the beginning of new feature sets, stagnating performance levels). Yes, Maxwell buyers, you too. That also puts AMD's second/third rebrand in a whole other light.


----------



## xfia (Jun 16, 2015)

i predict most if not all dx12 games will be multi-level dx, like how some dx11 games will let you run 10 or 9, but that's not experiencing the game at its best, so i would imagine it will be much the same way. if you have a dx12.0-capable gpu and are playing a dx12 game, but on windows 7-8.1, you're not going to get the improved rendering capability and reduced cpu overhead you would with windows 10. 
i have little to no doubt that windows 10 and dx12 capability will soon be the must-have for power users and gamers. getting a lot of companies to upgrade, though, may be a different story.. just getting the majority to drop xp took support ending and it becoming a security risk.


----------

