Tuesday, August 18th 2015

AMD GPUs Show Strong DirectX 12 Performance on "Ashes of the Singularity"

Stardock's "Ashes of the Singularity" may not be particularly pathbreaking as an RTS in the Starcraft era, but it has the distinction of being the first game to market with a DirectX 12 renderer, in addition to its default DirectX 11 one. This gave gamers their first peek at API-to-API comparisons that put the tall claims made for DirectX 12's bare-metal optimizations to the test, and as it turns out, AMD GPUs do seem to benefit in a big way.

In a GeForce GTX 980 vs. Radeon R9 390X comparison by PC Perspective, the game performs rather poorly on the R9 390X with its default DirectX 11 renderer; switched to DirectX 12, the card not only takes a big leap in frame rates (in excess of 30%), but also outperforms the GTX 980. A skeptical way of looking at these results is that the D3D 11 renderer isn't well optimized for the R9 390X to begin with, and that the card merely returns to its expected performance relative to the GTX 980 under the D3D 12 renderer.
Comparing the two GPUs at CPU-intensive resolutions (900p and 1080p), across various CPUs (including the i7-5960X, i7-6700K, dual-core i3-4330, FX-8350, and FX-6300), reveals that the R9 390X sees only a narrow performance drop with fewer CPU cores, and slight performance gains as core counts increase. Find the full, insightful review at the source link below.
Source: PC Perspective

118 Comments on AMD GPUs Show Strong DirectX 12 Performance on "Ashes of the Singularity"

#26
Prima.Vera
This just proves even more, that AMD's CPUs are pure crap for gaming. Like junk.
#27
Jborg
Prima.VeraThis just proves even more, that AMD's CPUs are pure crap for gaming. Like junk.
Yes, they do not perform as well as Intel per core.... this is pretty obvious by now.....

Pure crap for gaming? Dunno what you're smoking.... With my FX 8350 I have had almost no issues playing any game on High-Ultra detail with 50-60 FPS...

Granted, MMO games DO run better on Intel, but they are still 100% playable on an AMD CPU.

Crysis 3 has honestly been the only game that kinda runs like crap, and that seems to be a game issue...

So, while buying an AMD CPU right now is basically investing in old tech, it still will run games better than you say. Keep in mind too, if I had the option to build another rig I would buy Intel.... But I just wanted to set that comment straight... because it's false.
#28
RejZoR
Until I see some other benchmarks, these numbers mean nothing. It has been developed with AMD in mind since day one, so the results aren't all that surprising (even though I know AMD is strong in this regard). There are also games where NVIDIA absolutely destroys AMD, but we aren't running around saying how AMD sucks and how NVIDIA is awesome.

Either way, if AMD is this good, it will force NVIDIA to enhance their drivers (I think hardware-wise NVIDIA is already prepared). Anyone remember the DX11 boost NVIDIA made when AMD started fiddling with Mantle more aggressively? Yeah, that :)
#29
the54thvoid
Super Intoxicated Moderator
RelayerI agree. Wonder what nVidia will do dealing with a dev who doesn't need their trips, free hardware, or precompiled code?
They'll ignore them, and instead focus on the AAA titles. That's not a good thing.
BTW, the comment from Relayer could also be pointed at AMD and Stardock. AMD had a big push with this dev.

Time will tell but I doubt much will change in the grand scheme of things.
#30
Mussels
Freshwater Moderator
Still surprised at the lack of information and fanboyism out there.

A few facts (you can run the 3Dmark DX12 API test for the little repeatable information on DX12 we have publicly available)

1. AMD ran poorly in DX11 (and older) for multi threading. Nvidia had a lead there for a long time.
2. This made AMD GPU's work better on a faster IPC/single threaded CPU
3. AMD do not sell leading performance single threaded CPU's
4. DX12 addresses the poor multi threading, bringing AMD GPU's up to par with Nvidia and greatly increasing performance in some cases
5. This also makes AMD CPU's more attractive to gamers, as the extra threads now help out.
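To make point 4 concrete: under Direct3D 12 an engine can record command lists on several CPU threads and hand them to a queue in one submission, whereas DX11 work was effectively funnelled through a single immediate context. Below is a minimal, illustrative C++ sketch of that pattern (not code from the game or the benchmark; it assumes the Windows 10 SDK and the default adapter, links against d3d12.lib, and omits all real draw calls):

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter (no swap chain needed for this sketch).
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0, IID_PPV_ARGS(&device))))
        return 1;

    D3D12_COMMAND_QUEUE_DESC qdesc = {};
    qdesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&qdesc, IID_PPV_ARGS(&queue));

    // One allocator + command list per worker thread: no shared lock while recording.
    const int kThreads = 4;
    std::vector<ComPtr<ID3D12CommandAllocator>> allocators(kThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(kThreads);
    for (int i = 0; i < kThreads; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr, IID_PPV_ARGS(&lists[i]));
    }

    // Record in parallel; a real engine would issue draws/barriers here.
    std::vector<std::thread> workers;
    for (int i = 0; i < kThreads; ++i)
        workers.emplace_back([&, i] { lists[i]->Close(); });
    for (auto& w : workers) w.join();

    // Submit everything in one call from the main thread.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());

    // Wait on a fence so the GPU finishes before teardown.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    HANDLE done = CreateEvent(nullptr, FALSE, FALSE, nullptr);
    queue->Signal(fence.Get(), 1);
    fence->SetEventOnCompletion(1, done);
    WaitForSingleObject(done, INFINITE);
    CloseHandle(done);

    std::puts("Recorded command lists on 4 threads and executed them on one queue.");
    return 0;
}

The submission itself still happens on one thread, but the expensive part, building the command lists, now scales across CPU cores, which is why extra threads start to matter again.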
#31
DeadSkull
Bwahahaha, nvidia failing once again and refusing to admit fault. How typical is that...


#32
RejZoR
Extra cores on AMD's CPUs don't seem to help much. However, the Core i7 5820 all of a sudden becomes a lot more interesting, I think...
#33
Prima.Vera
JborgYes, they do not perform as well as Intel per core.... this is pretty obvious by now.....

Pure crap for gaming? Dunno what you're smoking.... With my FX 8350 I have had almost no issues playing any game on High-Ultra detail with 50-60 FPS...

Granted, MMO games DO run better on Intel, but they are still 100% playable on an AMD CPU.

Crysis 3 has honestly been the only game that kinda runs like crap, and that seems to be a game issue...

So, while buying an AMD CPU right now is basically investing in old tech, it still will run games better than you say. Keep in mind too, if I had the option to build another rig I would buy Intel.... But I just wanted to set that comment straight... because it's false.
I was just talking based on those charts. When your top AMD processor is slower than a cheapo i3, then you know imo, that your CPU is crap :)
#34
Mussels
Freshwater Moderator
Prima.VeraI was just talking based on those charts. When your top AMD processor is slower than a cheapo i3, then you know imo, that your CPU is crap :)
for some titles that is true, and the i3's cost a lot more... so you know. personal preference.
#35
Assimilator
Everything we've seen so far is showing that DX12 may level the playing field for AMD, and/or that they've been playing the long game by optimising for DX12 while letting DX11 support fall behind. Either way, a few graphs from an alpha piece of software aren't going to sway me. Let's see some released game titles, some optimised for AMD and some for nVIDIA, before we draw any conclusions.
#36
Sempron Guy
Looking forward to TPU's review on this one. I personally don't trust PCPer with any AMD-related review.
#37
the54thvoid
Super Intoxicated Moderator
MusselsStill surprised at the lack of information and fanboyism out there.

A few facts (you can run the 3Dmark DX12 API test for the little repeatable information on DX12 we have publicly available)

1. AMD ran poorly in DX11 (and older) for multi threading. Nvidia had a lead there for a long time.
2. This made AMD GPU's work better on a faster IPC/single threaded CPU
3. AMD do not sell leading performance single threaded CPU's
4. DX12 addresses the poor multi threading, bringing AMD GPU's up to par with Nvidia and greatly increasing performance in some cases
5. This also makes AMD CPU's more attractive to gamers, as the extra threads now help out.
Hope that fanboy comment covers the post immediately after yours...

I don't think Nvidia tried too hard with this game code. It seems odd that they lose some perf at DX12 where AMD gain.
Again, AMD worked closely with Stardock so Nvidia may have neglected it. However in general I imagine Nvidia will try harder once the PR starts to look bad for them.
They have the financial clout to 'buy' titles, DX12 or not. Last thing we want is Gameworks everywhere.
#38
Mussels
Freshwater Moderator
AMD have an advantage with DX12 this time around, because it uses their code from mantle. Nvidia could well take the lead again with future driver updates, but most likely future hardware releases.

I bet exactly $1 that DX12 being used on the Xbox one and AMD hardware being in that console has something to do with it.
#39
R-T-B
MxPhenom 216So AMD cards shine in DX12, and Nvidia cards stay about the same.
Looks more like NVIDIA cards take a hit to me. WTF?
AMD have an advantage with DX12 this time around, because it uses their code from mantle.
Not really.
Vayra86This. Overall, this is not news, this is just a stir in the AMD-Nvidia fanrage bowl, with a game that nobody really plays or even heard of.
Ah come on man. It's plastered all over Stardock's homepage like it's Christmas. There's no way you are doing it justice with that phrase.
#40
Kemar Stewart
I find it interesting that their testing methodology doesn't elaborate on which drivers they are using for these tests. That being said, I will wait for another reviewer before I speculate.
#41
yogurt_21
So either the AMD cards have always been optimized for DirectX 12, or for some reason this game cripples the AMD cards in DX 11.

Either way it will be interesting. Love that this is shown on the 390X rather than just existing on the Fury. It means there is a slim chance my 290 could receive a performance boost in future games, adding life to it.
#42
efikkan
MusselsAMD have an advantage with DX12 this time around, because it uses their code from mantle. Nvidia could well take the lead again with future driver updates, but most likely future hardware releases.
No, AMD have an advantage primarily because this game was developed for GCN (Mantle) and then ported to Direct3D 12. The changes in Direct3D 12 allow for more tailored game engines, which will probably result in AMD-"friendly" and Nvidia-"friendly" games, depending on which vendor the developers cooperate with during development.

Keep in mind that many of the planned driver improvements for Direct3D 12 were implemented by Nvidia a while ago, and since Nvidia implements Direct3D and OpenGL through their native superset Cg, these improvements have been universal. Also remember that the R9 390X greatly surpasses the GTX 980 in theoretical performance, so it's about time AMD tried to utilize some of this potential.
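For a rough sense of that theoretical gap: peak single-precision throughput is commonly estimated as 2 FLOPs per shader per clock, so at the reference clocks the two cards work out to roughly:

R9 390X: 2 x 2816 shaders x ~1.05 GHz ≈ 5.9 TFLOPS
GTX 980: 2 x 2048 shaders x ~1.13 GHz ≈ 4.6 TFLOPS (higher at boost clocks)

Real-world results obviously depend on how well a game keeps those shaders fed, which is the whole point of the API discussion.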
MusselsI bet exactly $1 that DX12 being used on the Xbox one and AMD hardware being in that console has something to do with it.
Probably not, it has to do with game engine design.
#43
Octopuss
Isn't it a "little" too early to jump at any conclusions?...
#44
john_
I am just copying and pasting a comment from PCPerspective's article
oxidegames.com/2015/08/16/the-birth-of-a-new-api/

Baker wrote a specific section just for you.

"All IHVs have had access to our source code for over year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months."

Then, farther on:

"Often we get asked about fairness, that is, usually if in regards to treating Nvidia and AMD equally? Are we working closer with one vendor then another? The answer is that we have an open access policy. Our goal is to make our game run as fast as possible on everyone’s machine, regardless of what hardware our players have.

To this end, we have made our source code available to Microsoft, Nvidia, AMD and Intel for over a year. We have received a huge amount of feedback. For example, when Nvidia noticed that a specific shader was taking a particularly long time on their hardware, they offered an optimized shader that made things faster which we integrated into our code.

We only have two requirements for implementing vendor optimizations: We require that it not be a loss for other hardware implementations, and we require that it doesn’t move the engine architecture backward (that is, we are not jeopardizing the future for the present)."
That's for everyone talking about companies in bed, probably having Nvidia and Ubisoft in mind.

If you don't like PCPerspective's article (comments about PCPerspective being AMD-biased are at least amusing), there are at least a dozen other reviews with results that come to the same conclusions. I bet AMD doesn't have the money to buy coffee for one site, let alone to convince a dozen sites to play with the results.
#45
alwayssts
john_If you don't like PCPerspective's article (comments about PCPerspective being AMD-biased are at least amusing), there are at least a dozen other reviews with results that come to the same conclusions. I bet AMD doesn't have the money to buy coffee for one site, let alone to convince a dozen sites to play with the results.
With respect, it appears you don't understand the public history PCP has regarding AMD. What you took from his comments is the exact opposite of what anyone who has followed them over the years would assume.

They have gotten MUCH better recently, just as TR has, as I think Shrout got an immense amount of heartburn over the outcry at some point (if you've listened to their podcast over the last year, you could tell it was really getting to him). Now they just joke about him being "BIOS'd."

...at least I think that's what they called him on the last nvidia/aib-sponsored livestream, while clearly referencing an inside joke about haters gonna hate (which one can assume was about the exact same thing; his perceived bias against AMD).

Enough about that though...

I agree this is just one game. I agree this was obviously always meant to showcase amd's improvements in dx/mantle, and was probably programmed with that in mind. Surely there are other areas of dx12 where nvidia's hardware excels and/or their driver is more mature.

That said, I think it's great that this is nvidia once again showcasing the mindset that absolutely nothing is their fault, as demoed by Roy Taylor's cutting tweet, as well as by Stardock's response to nvidia claiming multiple times in their memo that everything is on the dev to fix (rather than immature drivers/dx runtime, which should handle situations better by default than they do, granted the dev can fix things).


twitter.com/amd_roy/status/633535969015275521
twitter.com/draginol/status/632957839556874240
#46
Fluffmeister
If anything I'm just really impressed with Nv's DX11 performance! :eek:
#47
Slizzo
FluffmeisterIf anything I'm just really impressed with Nv's DX11 performance! :eek:
This. This test showed how abysmal AMD's optimizations were for DX11. Now, for this game that could entirely be because the game was originally Mantle, and was ported to DX12, and thus AMD didn't pay any mind to DX11 as their users would have been using Mantle's API but....

Makes you wonder if they aren't leaving performance on the table for existing DX11 titles.
#48
Casecutter
NVIDIA GeForce 355.60 WHQL Driver:
Release notes:

Just in time for the preview of Ashes of the Singularity, this new GeForce Game Ready driver ensures you’ll have the best possible gaming experience.
Game Ready
Best gaming experience for Ashes of the Singularity preview

Honestly, Nvidia has done nothing indecorous here; so, big deal, their driver for what is to them an "Alpha Preview" is still in the works. It's known that AMD has worked closely with Oxide Games and its Nitrous Engine, so AMD has undoubtedly been working hard to make their asynchronous shader opportunities work in this game, and to leverage GCN's support for executing work from multiple queues. It's honestly not surprising it's rosy for AMD.

Nor should Nvidia be considered behind on matching AMD for what's an "Alpha Preview". Nvidia's newest hardware, "Maxwell 2", has 32 queues (composed of 1 graphics queue and 31 compute queues), so Nvidia's hardware can be made to do such work. As to "Nvidia has had source code access for a year", the only place I've seen that clarified is Ryan Shrout signifying he agreed by saying "^ this" to someone else's post. It would be good to have the veracity of such a statement confirmed. Edit: After getting this written I saw the post by 'john_' above regarding this! Thanks.
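For what it's worth, "executing work from multiple queues" maps to an engine explicitly creating more than one command queue in Direct3D 12, typically a direct (graphics) queue plus one or more compute queues; whether the submitted work actually overlaps on the GPU is up to the hardware and driver, which is exactly where the GCN vs. Maxwell 2 debate sits. A minimal, illustrative C++ sketch (the CreateQueues helper name is hypothetical, not from the game's engine; assumes the Windows 10 SDK and d3d12.lib):

#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

bool CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // DIRECT queues accept graphics, compute and copy work.
    D3D12_COMMAND_QUEUE_DESC gfx = {};
    gfx.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;

    // COMPUTE queues accept compute/copy work and may run alongside the direct queue.
    D3D12_COMMAND_QUEUE_DESC cmp = {};
    cmp.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

    return SUCCEEDED(device->CreateCommandQueue(&gfx, IID_PPV_ARGS(&graphicsQueue))) &&
           SUCCEEDED(device->CreateCommandQueue(&cmp, IID_PPV_ARGS(&computeQueue)));
}

Synchronization between the two queues is done explicitly with fences; the API only exposes the queues, it doesn't promise concurrency.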

Honestly, the only gaffe I see was Nvidia attaching the "Game Ready" moniker to this driver release. Had they kept the designation "Ashes of the Singularity Preview Driver", they would've had no issue. And even Brian Burke's (Senior PR Manager) statement could've been less cynical, just saying something like... 'according to Oxide, this Alpha-stage preview is still being optimized and isn't approaching Beta; Nvidia has the resources to deliver an enhanced, final "Game Ready" driver well before the game's final release.'

Honestly, it's a lot about nothing.
#49
alwayssts
SlizzoThis. This test showed how abysmal AMD's optimizations were for DX11. Now, for this game that could entirely be because the game was originally Mantle, and was ported to DX12, and thus AMD didn't pay any mind to DX11 as their users would have been using Mantle's API but....

Makes you wonder if they aren't leaving performance on the table for existing DX11 titles.
It's long been known that how AMD deals with DX11 CPU overhead/bottlenecking isn't exactly the greatest, and much of that was deferred to Mantle/DX12 to solve (which it appears to have done).

Performance in dx11 has been a feather in nvidia's cap ever since the 337.50 beta drivers in April 2014. Their response to mantle was to tighten up their dx11 performance with optimizations aimed at alleviating that specific bottleneck, something AMD never did; it essentially gave somewhat similar results to AMD using Mantle (or now dx12), granted obviously not as profound as the new api itself.

I think that very much explains why nvidia's dx11 and amd's dx12 performance are as they appear.
#50
HumanSmoke
the54thvoidI don't think Nvidia tried too hard with this game code. It seems odd that they lose some perf at DX12 where AMD gain.
Another point to consider might be that the Nitrous engine might require some specialized coding. From the developer himself:
We've offered to do the optimization for their DirectX 12 driver on the app side that is in line with what they had in their DirectX 11 driver.
From that comment, it sounds like optimization by a third party is a non-trivial matter - so Nvidia preferred to do the driver coding themselves. Oxide and Nvidia's coding partnership when Star Swarm launched was abysmal, and wasn't rectified until Nvidia's 337.50 arrived IIRC some months later. The reasons for Nvidia deciding to go it alone could be many and varied, but the first go round as BFF wasn't an unmitigated success.
the54thvoidAgain, AMD worked closely with Stardock so Nvidia may have neglected it. However in general I imagine Nvidia will try harder once the PR starts to look bad for them.
Might be a case of Star Swarm Redux. Initial drivers were a basket case, but once the driver team got to grips with the code, the performance picked up. Probably depends on a lot of variables though - driver coding priority by the vendor, whether the game code undergoes any significant revision before release (as per Tomb Raider), or whether the game engine is simply not tailored that well for Nvidia's architecture (I'd assume that Oxide and AMD's close relationship would make GCN a primary focus for optimization).