Thursday, June 4th 2015
NVIDIA Could Capitalize on AMD Graphics CoreNext Not Supporting Direct3D 12_1
AMD's Graphics CoreNext (GCN) architecture does not support Direct3D feature-level 12_1 (DirectX 12.1), according to a ComputerBase.de report. The architecture only supports Direct3D up to feature-level 12_0. Feature-level 12_1 adds three features over 12_0, namely Volume-Tiled Resources, Conservative Rasterization and Rasterizer Ordered Views.
Volume Tiled Resources is an evolution of tiled resources (analogous to OpenGL's mega-texture), in which the GPU seeks and loads only those portions of a large texture that are relevant to the scene it's rendering, rather than loading the entire texture into memory. Think of it as a virtual memory system for textures. This greatly reduces video memory usage and bandwidth consumption. Volume tiled resources extend this by letting the GPU seek portions of a texture not only along the X and Y axes, but along a third dimension as well. Conservative Rasterization is a means of drawing polygons with additional pixels that make it easier for two polygons to interact with each other in dynamic objects. Rasterizer Ordered Views are a means of optimizing raster loads in the order in which they appear in an object; practical applications include improved shadows.

Given that GCN doesn't feature bare-metal support for D3D feature-level 12_1, its implementation will be as limited as feature-level 11_1 was when NVIDIA's Kepler didn't support it. This is compounded by the fact that GCN is a more popular GPU architecture than Maxwell (which supports 12_1), thanks to new-generation game consoles. It could explain why NVIDIA dedicated three-fourths of its GeForce GTX 980 Ti press-deck to talking about the features of D3D 12_1 at length. The company probably wants to make a few new effects that rely on D3D 12_1 part of GameWorks, and deflect accusations of exclusivity onto the competition (AMD) for not supporting certain API features that are open to them. Granted, AMD GPUs and modern game consoles such as the Xbox One and PlayStation 4 don't support GameWorks, but that didn't stop big game devs from implementing it.
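None of this hinges on a single version switch for developers; each capability is an optional cap that has to be queried at runtime. As a minimal sketch (an illustration only, assuming the stock `CheckFeatureSupport` path from the Windows 10 SDK headers; the Tier 3 threshold for volume tiled resources reflects the documented requirement), an application could probe the three 12_1 features like this:

```cpp
// Sketch: probing the three D3D12 feature-level 12_1 capabilities at runtime.
// Assumes the Windows 10 SDK; link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Ask for the lowest baseline; the caps below are optional extras on top.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No Direct3D 12 capable device found.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &opts, sizeof(opts));

    // Volume (3D) tiled resources require tiled-resources Tier 3.
    const bool volumeTiled =
        opts.TiledResourcesTier >= D3D12_TILED_RESOURCES_TIER_3;
    // Feature-level 12_1 mandates conservative rasterization Tier 1 or higher...
    const bool conservative =
        opts.ConservativeRasterizationTier >=
        D3D12_CONSERVATIVE_RASTERIZATION_TIER_1;
    // ...and rasterizer ordered views.
    const bool rovs = (opts.ROVsSupported == TRUE);

    std::printf("Volume tiled resources:     %s\n", volumeTiled  ? "yes" : "no");
    std::printf("Conservative rasterization: %s\n", conservative ? "yes" : "no");
    std::printf("Rasterizer ordered views:   %s\n", rovs         ? "yes" : "no");
    return 0;
}
```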
Source:
ComputerBase.de
79 Comments on NVIDIA Could Capitalize on AMD Graphics CoreNext Not Supporting Direct3D 12_1
The only relevant remaining question is whether GCN 1.0 is 11_1 or 12_0.
(Also, a sidenote: not all chips considered GCN 1.0 are necessarily 1.0 anymore. There are indications that Cape Verde and Pitcairn (Curacao) were upgraded to 1.1 after the HD7-series: AMD's OpenCL 2.0 requires GCN 1.1, yet its list of OpenCL 2.0-supporting products includes several Cape Verde and Pitcairn based products, but not their HD7-versions.)
((GCN 1.1 in this case referring to the 3D/compute-related capabilities of the chips, not including display controller or audio controller features, which obviously aren't there.))
edit:
Regarding the source, they seem to be somewhat confused, too. Bonaire is the same generation as Hawaii, but they seem to think it's older.
Microsoft to Discuss Windows 10 and DirectX 12 at AMD’s E3 PC Gaming Show
Read more: http://wccftech.com/microsoft-directx-12-windows-10-amd-e3-pc/#ixzz3c7srZRps
Kinda funny that the first page of comments is all AMD fans attacking. Pretty sad.
DirectX 12 has both hardware-level support and feature-level support - they are NOT the same thing.
Resource binding tiers are a subset of DirectX 11, not 12. Tier 1 requires 11.0 support, Tier 2 requires 11.1 support, and Tier 3 requires 11.2 support. Tier support - whether 1, 2, or 3 - is mandatory for DX12. Supported architectures at each level are:
Within those tiers, the feature-level support associated with - but not the same as - the hardware support (since some of the features can be emulated in software) breaks down thus:
Which hardware level and feature level get used, and whether any natively unsupported feature will be emulated, will be down to the game developers.
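To make the two axes concrete, here is a rough sketch (not anyone's shipping code; it assumes only the stock `CheckFeatureSupport` calls from the D3D12 headers) of querying the resource-binding tier and the maximum feature level separately on the same device:

```cpp
#include <d3d12.h>

// Sketch: the binding tier (hardware capability) and the feature level
// are reported through two independent queries on the same device.
bool SupportsFeatureLevel12_1(ID3D12Device* device)
{
    // Hardware axis: resource binding tier (Tier 1, 2, or 3).
    D3D12_FEATURE_DATA_D3D12_OPTIONS opts = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS,
                                &opts, sizeof(opts));
    D3D12_RESOURCE_BINDING_TIER bindingTier = opts.ResourceBindingTier;
    (void)bindingTier; // a device can be Tier 3 here yet still cap out below 12_1

    // Feature-level axis: the highest level the driver accepts.
    D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_12_1, D3D_FEATURE_LEVEL_12_0,
        D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS fl = {};
    fl.NumFeatureLevels        = sizeof(requested) / sizeof(requested[0]);
    fl.pFeatureLevelsRequested = requested;
    device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                &fl, sizeof(fl));

    return fl.MaxSupportedFeatureLevel >= D3D_FEATURE_LEVEL_12_1;
}
```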
Exactly. As I pointed out in another post a few days ago, feature-level support will likely depend on which IHV's game dev program backing (if any) the game studio leverages.
If the coverage is guilty of anything, it is making the explanation too simplistic. AMD is also to blame here, since they widely circulated some not very concise slides and made little if any distinction between native support and emulation, which seems to cause some confusion regarding GCN 1.0's support of DX11.2 Tier 2 (which of course is carried over to DX12).
It's actually btarunr quoting a German site of some repute, which interviewed AMD's head graphics marketing exec, who is telling anyone who will listen that Nvidia could include conservative rasterization features in its game dev program... which could well eventuate, although since Maxwell v2 makes up such a small part of the graphics market, they would almost certainly be additional features rather than obligatory ones. So, Mr/Mrs/Ms Octopuss, what this is all about is AMD alerting the world to the possible (and in some ways probable) expansion of Nvidia's GameWorks program. Maybe you should have taken the time to read the original source material.
As for the article, it's an article based on what another site wrote. None of this says to me that bt is biased; he was happy to post multiple articles about the GTX 970 memory issue and the blocked overclocking, so I really would not call him biased.
Whatever the case, the truth will be revealed in time. You can lie all you want (not claiming anything specific is a lie) but in the end the truth will be shown one way or another.
It also contradicts what Nvidia has said about pre-980/970 hardware and CR, which falls in line with the other table.
As for ALL the hype over AMD's latest chip, it's ALL hype. Just because it has HBM this and 4000 GCN that... as the last many years have proven, just because an AMD chip has more of this or that doesn't mean it's better or faster. History proves that as fact.
Bottom line: The game devs will code to the lowest common denominator, as they always have, making ALL of this stupid FUD.
And what did Nvidia say about Fermi and Maxwell v1 and conservative rasterization?
Just throwing shit at the wall and hoping nobody notices, or are you privy to facts you aren't willing to share? Or are you confusing full hardware support with software emulation? Because that is available to most architectures - it just isn't very resource friendly.
+1. Yes, I doubt that it makes much of a difference in the long run (although The Witcher 3 utilizes some of the same features, which is why Maxwell v2 cards and AMD's GCN do better than Kepler, since the latter doesn't have DX11.2 support), but I'm sure it will churn up endless debate over what probably amounts to a 3-4% performance variance, while the bulk of the games themselves - uninspiring corridor shooters and thinly veiled ripoffs of earlier titles - receive a bare fraction of the same forum bandwidth. Yay for common sense.
In summary: no 12_1 for GCN. Not only did a person of authority from AMD confirm it, but AMD also sour-graped about it, much like NVIDIA did about 11_1 for Kepler.
"For 'Fiji' AMD is not expressed" means that AMD guy didn't want to talk about Fiji. Any DACH guys can confirm that translation.
Addressing these issues in order:
When GCN and Kepler came out, DX11 was just starting to become the standard for gaming. The fact that there is any proposed support for either of these architectures to offer DX12 features is a small miracle; it'd be like expecting your 2000 Toyota Camry to have an iPod dock. Obviously DX12 is meant to reach back and give some cards new features, but expecting any of these cards to have 100% compatibility is a pipe dream. Neither AMD nor Nvidia is magic; deal with it.
Btarunr and I often have differences of opinion. Look back at the editorial on everyone "needing to accept Windows 8, because it's not going away" and you can see that clearly. Despite differences of opinion, you need to offer respect where it is due. This article, and btarunr in general, are shooting straight. There aren't any lies introduced by the author, there's no reason to believe fiscal gain is the motive, and having read the original piece I think this is a fair representation. I disagree that reporting on reporters reporting is news (seriously, it's hard enough to report fairly when it's a one-on-one interview), but that's no reason to accuse a person of being a shill. Whenever you've got incontrovertible proof of wrongdoing, present it. Right now these accusations are hollow, and dismissed as easily as they are made.
Sniff....sniff.....sniff.... Does anyone else smell a turd? I think I remember that smell... Vista, is that you?
Jokes aside, isn't anyone else getting flashbacks of Vista when they hear about DX12? Some hardware supports it, other hardware is capable of supporting part of it, and people are selling hardware as compatible despite knowing that you'd need a half dozen upgrades to get a system running it reasonably. DX12 is a complicated animal, oversimplified in the media and by sales, and the ensuing fallout from the false portrayal will hurt more people than it could ever help. The end goal is just to move more hardware, as people get frustrated by partial support and the promised improvements turning out hollow.
Let's start some more fires. Fiji is supposed to be faster than, slower than, and as fast as the GTX 980 Ti. Zen is supposed to both compete with Intel and put AMD back into the dedicated CPU market. Intel is going to improve FPGA production and cost by incorporating Altera into its production chain, but because the market will become a monopoly it will destroy affordable product lines. You see, FUD is fun. My last four sentences are complete speculation, yet they've started flame wars on forums. Seriously, show me the numbers before I give a crap. Until there are solid FACTS you might as well be playing with yourself for all the energy you're wasting on going nowhere.
Honestly I'll probably need to upgrade my GPU before dx12 is widespread enough to be required.
This summarizes my feelings when reading these comments.
To be honest, I'll say this from the start: I like AMD better than Intel or NVIDIA - but that has never stopped me from buying a clearly better product when it makes sense. Secondly, my machine - on which I do game (though mostly titles that I really like and spend many hours, often hundreds, with) - is getting old, almost 3 years now. Even when bought, it was of average 'strength' - though I do tend to build a lot of systems for friends who need them and have less experience than I do (I've owned various PCs since 1987, after all :) )
But the average comment poster seems to have something like this:
- at least an i7 of a newer generation, and is looking for an upgrade
- a Titan X or at least a 980 (preferably in SLI)
- a 4K monitor (at least one), or some multi-monitor configuration
Same poster is:
- highly environmentally aware, though the PSU is 800+ W and every 10-30 W of extra usage counts as unacceptable
- liking and needing Iris Pro
- highly dependent on the yet-unreleased DirectX 12, even to the point where some small inability of the graphics card to hardware-accelerate a feature means life or death, NOW
The configuration mentioned easily surpasses $2,000, and may need replacement immediately or very soon, because, well, some of the games released after the famous July 29th won't have every hardware optimization or feature, which would be of questionable use in those games anyway; software emulation is out of the question, naturally.
Well, MY old system is perfectly capable of running the games I play, and it's not Solitaire; a number of AAA titles are in there.
- 4K: though I've seen enough gaming on those, I've decided that I don't need one yet
- Multi-monitor gaming: it looks... bad, in my book (though I do have a second monitor, I use it for different purposes entirely)
- Iris Pro is inherently... unneeded; a person who spends that much on a CPU and is a passionate gamer buys a REAL graphics card, even if it's just an average one, and gets something like 5-10x better results out of it (minimum). The whole concept of Iris Pro eludes me: it's suitable for a budget CPU and budget gaming, yet it bears a premium price.
The main features of DX12 are, as my understanding goes, better utilization of CPU cores and the GPU, both existing and yet-unreleased ones. Other than this, there is a bunch of new features - but then again, there are PhysX (and Flex and so on), Fur, God Rays, HairWorks - also TrueAudio, TressFX, Mantle... Neither green- nor red-leaning games profited hugely from having, or missing, those. Yes, they are noticeable, especially in demos, and probably more so than some cryptic "Feature Level 12_x".
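That understanding matches how the API is laid out: the headline CPU-side change in D3D12 is that command recording can be spread across worker threads instead of being serialized behind the driver, as in D3D11. A simplified sketch of the pattern (the worker count is arbitrary and no actual draw calls are recorded, so treat it as shape rather than substance):

```cpp
// Sketch: D3D12's multi-threaded command recording pattern.
// Each worker gets its own allocator + command list; all lists are
// then submitted together on a single queue.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC qd = {};
    qd.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qd, IID_PPV_ARGS(&queue));

    const int workers = 4; // arbitrary for the sketch
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    for (int i = 0; i < workers; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Record in parallel -- the part D3D11 effectively serialized.
    std::vector<std::thread> threads;
    for (int i = 0; i < workers; ++i)
        threads.emplace_back([&lists, i] {
            // ... a real renderer records this thread's draw calls here ...
            lists[i]->Close();
        });
    for (auto& t : threads) t.join();

    // Submit everything in one batch.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
    return 0;
}
```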
That being said, I wouldn't worry too much about it for the time being - and neither should others. Buying a new GPU primarily because it supports something as vague as "Feature Level 12_x" is a pure waste of money. You can always do the same when and IF this starts to affect your experience. When it happens, chances are that prices will be lower and products more mature and better overall, to the point of there being a whole new generation.
I plan to upgrade my computer in 2016, unless something important changes - and this goes both ways.
(Oh, and I did have the privilege of seeing several expensive components and how they affected my configuration; for me, they weren't worth buying, but mileage may vary - those who need them probably have them already.)
Sorry for the wall of text, but I felt that I had to write this, since the whole discussion about replacing a GPU or profiting from having "Feature Level 12_x" seems so pointless to me.
There's no love lost between these two, that's for sure lol.
I would appreciate short and clear explanation.
Would it affect performance at all? If so, how much?