Monday, March 23rd 2015

AMD Bets on DirectX 12 for Not Just GPUs, but Also its CPUs

In an industry presentation on why the company is excited about Microsoft's upcoming DirectX 12 API, AMD revealed what it sees as the API's most important feature, one that could impact not only its graphics business, but also potentially revive its CPU business among gamers. DirectX 12 will make its debut with Windows 10, Microsoft's next big operating system, which will be given away as a free upgrade to _all_ current Windows 8 and Windows 7 users. The OS will come with a usable Start menu, and could lure gamers who stood their ground on Windows 7.

In its presentation, AMD touched upon two key features of DirectX 12, starting with the most important, multi-threaded command buffer recording; the other being asynchronous compute scheduling/execution. A command buffer is a list of rendering commands that the CPU records and hands to the GPU when drawing a 3D scene. Some elements of 3D graphics are still better suited to serial processing, and no single SIMD unit from any GPU architecture has managed to reach throughput parity with a modern CPU core. DirectX 11 and its predecessors are still largely single-threaded on the CPU in the way they schedule command buffers.
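To make the concept concrete, here is a minimal C++ sketch of what recording and submitting one command buffer looks like in Direct3D 12. The function name is ours, and root-signature, viewport, and render-target setup are omitted for brevity:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Records a trivial command buffer on the CPU and hands it to the GPU.
// Assumes `device`, `queue`, and `pso` were created during initialization.
void RecordAndSubmit(ID3D12Device* device, ID3D12CommandQueue* queue,
                     ID3D12PipelineState* pso)
{
    ComPtr<ID3D12CommandAllocator> allocator;
    device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                   IID_PPV_ARGS(&allocator));

    ComPtr<ID3D12GraphicsCommandList> list;
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator.Get(), pso, IID_PPV_ARGS(&list));

    list->DrawInstanced(3, 1, 0, 0); // CPU records GPU work (one triangle)...
    list->Close();                   // ...then seals the buffer.

    ID3D12CommandList* lists[] = { list.Get() };
    queue->ExecuteCommandLists(1, lists); // GPU executes the sealed buffer
}
```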
A graph from AMD on how a DirectX 11 app spreads CPU load across an 8-core CPU reveals how badly optimized the API is for today's CPUs. The API and driver code is executed almost entirely on one core, which is bad even for dual- and quad-core CPUs (in case you fundamentally disagree with AMD's "more cores" strategy). Overloading fewer cores with the API- and driver-related serial workload makes up the "high API overhead" issue that AMD believes is holding back PC graphics efficiency compared to consoles, and it has a direct and significant impact on frame rates.
DirectX 12 heralds a truly multi-threaded command buffer pathway, which scales up with any number of CPU cores you throw at it. Driver and API workloads are split evenly between CPU cores, significantly reducing API overhead and resulting in big frame-rate increases. How big that increase is in the real world remains to be seen. AMD's own Mantle API addresses this exact issue with DirectX 11, and offers a CPU-efficient way of rendering. Its performance yields are significant in CPU-limited scenarios such as APUs, but on bigger, GPU-limited setups (e.g. high-end R9 290 series graphics at high resolutions), the gains, while measurable, are not mind-blowing. In some scenarios, Mantle offered the difference between "slideshow" and "playable." Cynics have to give DirectX 12 the benefit of the doubt: it could end up doing a better job than even Mantle at pushing paper through multi-core CPUs.
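As a rough illustration of the pattern DirectX 12 enables (a sketch, not production code): each worker thread gets its own command allocator and command list, since allocators are not thread-safe; recording happens in parallel, and the sealed lists are then submitted in order. The RecordSceneChunk helper below is hypothetical:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
using Microsoft::WRL::ComPtr;

// Hypothetical helper: records this thread's slice of the scene's draw calls.
void RecordSceneChunk(ID3D12GraphicsCommandList* list,
                      unsigned chunk, unsigned numChunks)
{
    // e.g. set state and issue draw calls for objects chunk, chunk+numChunks, ...
    (void)list; (void)chunk; (void)numChunks;
}

void RecordFrameInParallel(ID3D12Device* device, ID3D12CommandQueue* queue,
                           unsigned numThreads)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(numThreads);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(numThreads);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < numThreads; ++i) {
        // One allocator per thread: allocators must not be shared across threads.
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        // Recording scales out across however many cores you have.
        workers.emplace_back([&lists, i, numThreads] {
            RecordSceneChunk(lists[i].Get(), i, numThreads);
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // Submission stays serial and ordered; only the recording was parallel.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```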

AMD's own presentation appears to agree with the way Mantle played out in the real world (benefits for APUs vs. high-end GPUs). A slide highlights how DirectX 12 and its new multi-core efficiency could step up the draw-call capacity of an A10-7850K by over 450 percent. Suffice it to say, DirectX 12 will be a boon for smaller, cheaper mid-range GPUs, and will make PC gaming more attractive to the gamer crowd at large. Fine-grained asynchronous compute scheduling/execution is the other feature to look out for. It breaks down complex serial workloads into smaller parallel tasks, and ensures that otherwise-idle GPU resources are put to work on them.
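A hedged sketch of how that looks in Direct3D 12: compute work is submitted to a second, compute-only queue so it can overlap with graphics on the main queue. Fence-based synchronization between the two queues is omitted:

```cpp
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Creates a compute-only queue alongside the main (direct) queue, so compute
// command lists can execute asynchronously on otherwise-idle GPU units.
ComPtr<ID3D12CommandQueue> CreateComputeQueue(ID3D12Device* device)
{
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE; // compute-only
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}

// Per frame: a COMPUTE-type command list (e.g. particle simulation or
// post-processing dispatches) is submitted here while the direct queue renders.
void SubmitAsyncCompute(ID3D12CommandQueue* computeQueue,
                        ID3D12GraphicsCommandList* computeList)
{
    ID3D12CommandList* lists[] = { computeList };
    computeQueue->ExecuteCommandLists(1, lists);
}
```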
So where does AMD fit into all of this? DirectX 12 support will no doubt help AMD sell GPUs. Like NVIDIA, AMD has preemptively announced DirectX 12 API support on all its GPUs based on the Graphics Core Next architecture (Radeon HD 7000 series and above). AMD's real takeaway from DirectX 12, however, will be how its cheap 8-core socket AM3+ CPUs could gain tons of value overnight. The notion that "games don't use more than 4 CPU cores" will change dramatically: any DirectX 12 game will split its command buffer and API loads between any number of CPU cores you throw at it, and AMD sells 8-core CPUs for as little as $170 (the FX-8320). Intel's design strategy of placing fewer, stronger cores on its client processors could face its biggest challenge with DirectX 12.

87 Comments on AMD Bets on DirectX 12 for Not Just GPUs, but Also its CPUs

#77
Caring1
Interesting that it showed a much more substantial improvement with the AMD graphics: almost double the improvement shown by the base NVIDIA card, which had a 600% improvement.
#78
Sony Xperia S
That's an impressive jump in draw calls. If only it also led to the same jump in frames per second. :eek:


Many major hardware news sites are reporting that Samsung is attempting to acquire AMD.

If this happens, it would be the best news in quite a while.

Samsung To Allegedly Acquire AMD To Compete With Intel And Qualcomm Head On

Read more: wccftech.com/amd-allegedly-merge-samsung/#ixzz3VVRRd3g5
#79
john_
Not gonna happen. Totally BS. And in my opinion, if it does happen, we can say goodbye to AMD CPUs/APUs and probably GPUs too. I don't think Samsung is interested in those markets; it could be interested in the business markets, and probably the patents, and nothing more. Even if they do let AMD continue its x86 and GPU designs for the desktop, we can probably forget the value for money from AMD that we used to know. Samsung is a bigger and brighter sticker than NVIDIA's, and big, bright, shiny stickers cost a lot.
#80
Solaris17
Super Dainty Moderator
荷兰大母猪: So what will the first DX12 game be?
Crysis 4
#81
oblivionlord
MrGenius: I never said it was the purpose of DX12. What I was saying is that the performance increases will be across the board, meaning ALL CPUs (again, regardless of core/thread count) will benefit greatly from DX12 or Mantle.

Sorry...but I have to laugh now...:laugh: Why?

Because my "casual gaming rig" with an ancient dual core (E8600) and a 280X runs anything you've got at more than playable speeds. And when I run Star Swarm, D3D vs. Mantle, it's the "slideshow" vs. "playable" (if it were) scenario. Soo...

BTW, I've already shifted to Windows 10. I've had it running on said machine for a couple of months now. :pimp:
Seems like you were actually implying what you claim you didn't say... Let's go over it again...
"Making super powerful mega multicore CPUs even more of a waste of money (than they are now)." To put you in your place, since you probably have absolutely horrible testing methods of your own: my Q9550 is two of your CPUs in one package. If you run any modern game in windowed mode with Task Manager open in the background, you'll see that your CPU usage is a lot higher than even mine. Mine sits at 60-70% while running any modern AAA title. I am not going to get 60 fps in BF4 at 1080p with my GTX 760, even at the lowest graphics settings, because the CPU itself is a bottleneck.

On the other hand... perhaps your monitor is just garbage, so you can't see the clarity in motion compared to better monitors, and that's why you are fine with it. If you had a more modern monitor to compare with yours side by side, then you'd see the instant clarity in motion, and you'll also see every other bit of imperfection. Furthermore, 60 fps isn't even quality compared to a consistent 120 fps framerate.

www.web-cyb.org/images/lcds/blurbusters-motion-blur-from-persistence.jpeg
This will show you the difference in clarity while in motion.
#82
WhoDecidedThat
荷兰大母猪: So what will the first DX12 game be?
Crysis 4, hopefully!
#83
xorbe
荷兰大母猪: So what will the first DX12 game be?
Half Life 3 confirmed!
#84
MrGenius
oblivionlord: Seems like you were actually implying what you claim you didn't say... Let's go over it again...
"Making super powerful mega multicore CPUs even more of a waste of money (than they are now)." To put you in your place, since you probably have absolutely horrible testing methods of your own: my Q9550 is two of your CPUs in one package. If you run any modern game in windowed mode with Task Manager open in the background, you'll see that your CPU usage is a lot higher than even mine. Mine sits at 60-70% while running any modern AAA title. I am not going to get 60 fps in BF4 at 1080p with my GTX 760, even at the lowest graphics settings, because the CPU itself is a bottleneck.

On the other hand... perhaps your monitor is just garbage, so you can't see the clarity in motion compared to better monitors, and that's why you are fine with it. If you had a more modern monitor to compare with yours side by side, then you'd see the instant clarity in motion, and you'll also see every other bit of imperfection. Furthermore, 60 fps isn't even quality compared to a consistent 120 fps framerate.

www.web-cyb.org/images/lcds/blurbusters-motion-blur-from-persistence.jpeg
This will show you the difference in clarity while in motion.
It's funny you mention that, because I actually use a 1600x1200 75 Hz CRT monitor for most of my gaming. Not for fps performance or visual acuity; I don't really care what fps my games run at, so long as it's better than 24 fps. That's when things get too slow for my taste. Plus, I won't play without v-sync, so you get what you get with that. The only reason I sometimes use 1080p is so I can chill on my couch and game without losing any significant visual clarity, instead of sitting at my "desk".

You fps whores really crack me up. Like it matters THAT much. As if. Just turn the fps monitor off and play your game. You see how much funner that is?

If it makes you feel better to have a more "powerful" CPU than mine, good for you. I'm happy for you. Whatever it does for you, that's great. Have at it. More "power" to you. Pardon the pun.

BTW, you haven't taught me anything I didn't already know. I'm not stupid. I've been doing this for a long time now. This is not my first rodeo. But I also have a life that doesn't revolve entirely around my computer. Frankly, I've got better things to do than play games...of any sort. But I've been addicted to video games since I was ~4 years old. So I doubt I'll ever stop playing them, at least occasionally.
#85
Octopuss
MrGenius: You fps whores really crack me up. Like it matters THAT much. As if. Just turn the fps monitor off and play your game. You see how much funner that is?
It does! But it's extremely game-related.
I find BF4 very playable at as low as 35 fps. I laugh at idiots like xfaptor or levelcap or whoever these "pro" players might be, running BF4 at low details just so they can have 144 fps (anything less is supposedly extremely noticeable and significantly affects gameplay; pro note: they were saying/doing the same crap when 120 Hz was the thing).
At the same time, Skyrim is absolutely unplayable for me when fps dips below 50 (I just get sick of how the image moves or something; I can't explain it).
#86
xorbe
Different strokes for different folks. It's all good, people.
#87
Corey
bobalazs: Okay, so who cares about DX12 when the majority of what's out there is still going to be DX9-11, and AMD CPUs, as we know, are lackluster in that field.
Which is why we need devs to either support their chips better or adopt an API that makes it easier. Even with DX12, I don't think people are going to run out and buy an 8350 when you can get the i5-4460 for around the same price. What I am saying is that those who have the FX chips, or any of the A8/A10 chips, will get some extended life, hopefully giving them a chance to wait for Zen so they have another cheap upgrade option.