Tuesday, August 18th 2015
AMD GPUs Show Strong DirectX 12 Performance on "Ashes of the Singularity"
Stardock's "Ashes of the Singularity" may not be particularly pathbreaking as an RTS in the Starcraft era, but it has the distinction of being the first game to market with a DirectX 12 renderer, in addition to its default DirectX 11 one. This gave gamers their first peek at API-to-API comparisons, putting the much-touted bare-metal optimizations of DirectX 12 to the test, and as it turns out, AMD GPUs do seem to benefit greatly.
In a GeForce GTX 980 vs. Radeon R9 390X comparison by PC Perspective, the game performs rather poorly for the R9 390X on its default DirectX 11 renderer; switched to DirectX 12, the card not only takes a big leap (in excess of 30%) in frame-rates, but also outperforms the GTX 980. A skeptical reading of these results is that the R9 390X isn't optimized for the D3D 11 renderer to begin with, and merely returns to its expected performance against the GTX 980 with the D3D 12 renderer. Comparing the two GPUs at CPU-intensive resolutions (900p and 1080p), across various CPUs (including the i7-5960X, i7-6700K, the dual-core i3-4330, FX-8350, and FX-6300), reveals that the R9 390X sees only a narrow performance drop with fewer CPU cores, and slight performance gains as the core count increases. Find the full, insightful review in the source link below.
Source:
PC Perspective
118 Comments on AMD GPUs Show Strong DirectX 12 Performance on "Ashes of the Singularity"
Pure crap for gaming? Dunno what you're smoking... With my FX 8350 I have had almost no issues playing any game on High-Ultra detail at 50-60 FPS...
Granted, MMO games DO run better on Intel, but they are still 100% playable on an AMD CPU.
Crysis 3 has honestly been the only game that kinda runs like crap, and that seems to be a game issue...
So, while buying an AMD CPU right now is basically investing in old tech, it still runs games better than you say. Keep in mind, too, that if I had the option to build another rig I would buy Intel... But I just wanted to set that comment straight, because it's false.
Either way, if AMD is this good, it will force NVIDIA to enhance their drivers (hardware-wise, I think NVIDIA is already prepared). Anyone remember the DX11 boost NVIDIA delivered when AMD started pushing Mantle more aggressively? Yeah, that :)
BTW, the comment from Relayer could also be pointed at AMD and Stardock. AMD had a big push with this dev.
Time will tell but I doubt much will change in the grand scheme of things.
A few facts (the 3DMark DX12 API overhead test is about the only repeatable, publicly available DX12 data we have):
1. AMD ran poorly in DX11 (and older) at multi-threading; Nvidia held a lead there for a long time.
2. This made AMD GPUs work better with a faster IPC/single-threaded CPU.
3. AMD does not sell CPUs with leading single-threaded performance.
4. DX12 addresses the poor multi-threading, bringing AMD GPUs up to par with Nvidia and greatly increasing performance in some cases (see the sketch after this list).
5. This also makes AMD CPUs more attractive to gamers, as the extra threads now help out.
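To make point 4 concrete, here's a minimal sketch of the multi-threaded command recording that DX12 enables. This is not Oxide's code, just an illustration of the general D3D12 pattern: each worker thread records draw calls into its own command list, and the main thread submits them all in one call, instead of everything funnelling through one driver thread as in DX11. The thread count is a placeholder and error handling is omitted.

#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

int main() {
    // Create a device on the default adapter and a direct (graphics) queue.
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    D3D12_COMMAND_QUEUE_DESC qDesc = {};
    qDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qDesc, IID_PPV_ARGS(&queue));

    // One allocator + command list per worker thread (allocators are not
    // thread-safe, so each thread gets its own).
    const int kThreads = 4;  // placeholder worker count
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
    }

    // Each worker records its own slice of the frame's draws in parallel.
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i) {
        workers.emplace_back([&, i] {
            // ... record this thread's batch of draw calls on lists[i] ...
            lists[i]->Close();
        });
    }
    for (auto& t : workers) t.join();

    // Single submission of all recorded lists from the main thread.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}

The point of the pattern: recording, the expensive CPU-side work, now scales with core count, which is why the extra threads of an FX-8350 suddenly matter.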
I don't think Nvidia tried too hard with this game's code. It seems odd that they lose some performance in DX12 where AMD gains.
Again, AMD worked closely with Stardock so Nvidia may have neglected it. However in general I imagine Nvidia will try harder once the PR starts to look bad for them.
They have the financial clout to 'buy' titles, DX12 or not. Last thing we want is Gameworks everywhere.
I bet exactly $1 that DX12 being used on the Xbox one and AMD hardware being in that console has something to do with it.
Either way it will be interesting. Love that this is shown on the 390X rather than just existing on the Fury. It means there is a slim chance my 290 could receive a performance boost in future games, adding life to it.
Keep in mind that many of the planned driver improvements for Direct3D 12 were implemented by Nvidia a while ago, and since Nvidia implements Direct3D and OpenGL through their native superset Cg, these improvements have been universal. Also remember that the R9 390X greatly surpasses the GTX 980 in theoretical performance, so it's about time AMD tried to utilize some of that potential. As for the console theory: probably not; it has to do with game engine design.
If you don't like PCPerspective's article (comments about PCPerspective being AMD-biased are at least amusing), there are at least a dozen other reviews with results that reach the same conclusions. I bet AMD doesn't have the money to buy coffee for one site, let alone to convince a dozen sites to play with the results.
They have gotten MUCH better recently, just as TR has, as I think Shrout got an immense amount of heartburn over the outcry at some point (if you've listened to their podcast over the last year, you could tell it was really getting to him). Now, they just joke about him being "BIOS'd."
...at least I think that's what they called him on the last nvidia/AIB-sponsored livestream, while clearly referencing an inside joke about haters gonna hate (which one can assume was about the exact same thing: his perceived bias against AMD).
Enough about that though...
I agree this is just one game. I agree it was obviously always meant to showcase AMD's improvements in DX12/Mantle, and was probably programmed with that in mind. Surely there are other areas of DX12 where Nvidia's hardware excels and/or their driver is more mature.
That said, I think it's great that this is Nvidia once again showcasing the mindset that absolutely nothing is their fault, as demonstrated by Roy Taylor's cutting tweet, as well as by Stardock's response to Nvidia claiming multiple times in their memo that everything is on the dev to fix (rather than immature drivers/DX runtime that should handle situations better by default than they do, granted the dev can fix them).
twitter.com/amd_roy/status/633535969015275521
twitter.com/draginol/status/632957839556874240
Makes you wonder if they aren't leaving performance on the table for existing DX11 titles.
Release notes:
Just in time for the preview of Ashes of the Singularity, this new GeForce Game Ready driver ensures you’ll have the best possible gaming experience.
Game Ready
Best gaming experience for Ashes of the Singularity preview
Honestly, Nvidia has done nothing indecorous here; so what if their driver for what is, to them, an "Alpha Preview" is still in the works? It's known that AMD has worked closely with Oxide Games and its Nitrous Engine, so AMD has undoubtedly been working hard to make their asynchronous shaders work in this game, and to leverage GCN's support for executing work from multiple queues. It's honestly not surprising it's rosy for AMD.
Nor should Nvidia be considered behind on matching AMD for what's an "Alpha Preview". Nvidia's newest hardware, "Maxwell 2", has 32 queues (composed of 1 graphics queue and 31 compute queues), so Nvidia's hardware can be made to do such work.
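For context, here is roughly what "executing work from multiple queues" means at the API level — a hedged sketch, not Nvidia's or Oxide's code: a direct (graphics) queue plus a dedicated compute queue on the same device, synchronized with a fence so graphics work can consume the compute output. Whether the GPU actually overlaps the two queues' work is up to the hardware and driver, which is exactly the GCN-vs-Maxwell question being debated.

#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device));

    // Direct queue: accepts graphics, compute, and copy work.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // Separate compute-only queue: work submitted here can, in principle,
    // run asynchronously alongside the graphics queue.
    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> cmpQueue;
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&cmpQueue));

    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

    // ... record and execute compute command lists on cmpQueue here ...

    // The compute queue signals the fence when its dispatches finish; the
    // graphics queue stalls GPU-side (not CPU-side) until that happens.
    cmpQueue->Signal(fence.Get(), 1);
    gfxQueue->Wait(fence.Get(), 1);

    // ... graphics work submitted after the Wait sees the compute results ...
}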
As to "Nvidia has had source code access for a year": the only basis for that is Ryan Shrout signifying his agreement by replying "^ this" to someone else's post. It would be good to have the veracity of such a statement confirmed. Edit: after writing this, I saw 'john_'s post above regarding this! Thanks.

Honestly, the only gaffe I see was Nvidia attaching the "Game Ready" moniker to this driver release. Had they kept the iteration "Ashes of the Singularity Preview Driver", they would've had no issue. And even the statement from Brian Burke, Senior PR Manager, could've been less cynical, just saying something like: 'according to Oxide, this Alpha-stage preview is still being optimized, and not approaching Beta. Nvidia has the resources to enhance the final "Game Ready" driver well before the game's final release.'
Honestly, it's a lot about nothing.
Performance in DX11 has been a feather in Nvidia's cap ever since the 337.50 beta drivers in April 2014. Their response to Mantle was to tighten up their DX11 performance with optimizations aimed at alleviating that specific bottleneck, something AMD never did; it essentially gave somewhat similar results to AMD using Mantle (or now DX12), though obviously not as profound as the new API itself.
I think that very much explains why Nvidia's DX11 and AMD's DX12 performance are as they appear.