# DirectX 12 Boosts Draw Calls By 330% On 3 Year Old GTX 670, More Details Revealed



## qubit (May 25, 2015)

If this result by a Reddit user turns out to be anywhere near true across both AMD and NVIDIA cards, low end to high end and on existing hardware, DX12 is gonna be a killer reason for gamers to get Windows 10. You can be sure that game developers will jump at the chance to use DX12 too, preventing the classic chicken-and-egg situation.


> According to him, the tests he ran show a boost of close to 400% in draw call throughput. As visible in the image below, the draw call count on DirectX 11 was 1,515,965, whereas on multi-thread it was 2,532,181. When the user switched to DirectX 12, the number of draw calls rose to 8,562,158, which is around a 330% increase in total performance.









http://wccftech.com/directx-12-draw-calls-330-3-year-gtx-670/
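For what it's worth, the arithmetic in that quote is easy to check. A quick sketch (numbers taken straight from the quote) shows the headline "330%" only works if you read it as "DX12 throughput is roughly 338% of the multi-threaded DX11 figure", i.e. about a 3.4x jump:

```python
dx11_st = 1_515_965  # DirectX 11, single-threaded draw calls (from the quote)
dx11_mt = 2_532_181  # DirectX 11, multi-threaded
dx12 = 8_562_158     # DirectX 12

def pct_increase(old, new):
    """Percentage increase going from old to new."""
    return (new - old) / old * 100

print(f"DX12 vs DX11 MT: {dx12 / dx11_mt:.2f}x ({pct_increase(dx11_mt, dx12):.0f}% increase)")
print(f"DX12 vs DX11 ST: {dx12 / dx11_st:.2f}x ({pct_increase(dx11_st, dx12):.0f}% increase)")
```

So relative to multi-threaded DX11 the gain is about a 238% increase (3.38x), and relative to single-threaded DX11 about a 465% increase; neither is literally "330% more", which is worth keeping in mind when comparing headlines.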


----------



## NC37 (May 25, 2015)

Hopin to see one of these done for Fermi sometime.


----------



## qubit (May 25, 2015)

NC37 said:


> Hopin to see one of these done for Fermi sometime.


+1 Would be nice to see the improvement on my Fermi-based cards, the GTX 580, GTX 590 and the little GT 520. I suspect the GT 520 would likely run out of bandwidth though, lol.


----------



## Caring1 (May 25, 2015)

Wasn't there a bunch of these charts posted already showing this increase, possibly in the news section when DX12 was making headlines?
Now I'll have to go search for them.
Had to dig back a bit and only came up with this link:
http://www.forbes.com/sites/jasonev...el-hardware-tested-with-awesome-improvements/


----------



## qubit (May 25, 2015)

That's a pretty good article, nice one. 

Looks like he got even better results.


----------



## AsRock (May 25, 2015)

A 670 can even do DX12? But anyway, even if it can do some commands, the answer is still no, due to the fact that there are no DX12 games lol.


----------



## Mussels (May 25, 2015)

AsRock said:


> A 670 can even do DX12? But anyway, even if it can do some commands, the answer is still no, due to the fact that there are no DX12 games lol.



Unity will go DX12, which means a lot of smaller indie games will upgrade fairly quickly. Add to that the Xbox One running Windows 10 and DX12, and I'd expect to see ports pretty quickly this time around - they just have to redo the controls and, optionally, add an HD texture pack really.


----------



## Xzibit (May 25, 2015)

It's all up to software and developers. DX12's tier system will make for the same old lazy development. Depending on tier support, there's a limit on what you're able to support and do. The higher the tier your GPU supports (Tier 1, 2 or 3), the more resources you get.
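For context, the tiers being talked about here are D3D12's resource binding tiers, which cap how many resources a shader stage can have bound at once. The sketch below encodes the commonly cited limits as I remember them from Microsoft's documentation, so treat the exact numbers as approximate rather than authoritative:

```python
# Approximate D3D12 resource binding limits by tier (higher tier = fewer fixed caps).
# "full" means limited only by the descriptor heap size, not a fixed per-stage cap.
# Note: the Tier 1 UAV cap is 8 on plain feature level 11.0 parts, 64 on 11.1+.
BINDING_TIERS = {
    "Tier1": {"cbv": 14,     "srv": 128,    "uav": 64,     "sampler": 16},
    "Tier2": {"cbv": 14,     "srv": "full", "uav": 64,     "sampler": "full"},
    "Tier3": {"cbv": "full", "srv": "full", "uav": "full", "sampler": "full"},
}

def supports(tier, resource, count):
    """True if a GPU at `tier` can have `count` resources of this kind bound at once."""
    limit = BINDING_TIERS[tier][resource]
    return limit == "full" or count <= limit

# e.g. bindless-style rendering wanting 500 textures visible at once:
print(supports("Tier1", "srv", 500))  # False: over the 128-SRV cap
print(supports("Tier3", "srv", 500))  # True: Tier 3 is effectively bindless
```

This is why Tier 3 hardware can treat the descriptor heap as one big bindless table, which is what makes less hand-tuned porting feasible.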


----------



## Mussels (May 25, 2015)

So devs will code DX12 to the nvidia limits for a generation or two, got it.


----------



## Ebo (May 25, 2015)

Xzibit

Due to the two pics you put up, I think development of games using DX12 will be fast, since the Xbox One and PS4 are already DX12-ready, as both use the Jaguar APU (I think).

To me the rest doesn't matter, only DX12 counts, and if it can provide that extra power to older systems, then it's a win-win situation for all parties.


----------



## Xzibit (May 25, 2015)

Ebo said:


> Xzibit
> 
> Due to the two pics you put up, I think development of games using DX12 will be fast, since the Xbox One and PS4 are already DX12-ready, as both use the Jaguar APU (I think).
> 
> To me the rest doesn't matter, only DX12 counts, and if it can provide that extra power to older systems, then it's a win-win situation for all parties.



That's the only saving grace: that Tier 3 hardware is inside the consoles to bankroll any PC port or effort. If it weren't, we might be looking at 5-10 yrs for Tier 3 adoption, much like how DX11, 64-bit executables and multi-threading from the DX10 era only made it in towards the end of Windows 7.


----------



## Frick (May 25, 2015)

So what's a draw call then? What does it do, and how does it affect performance?


----------



## Mussels (May 25, 2015)

Frick said:


> So what's a draw call then? What does it do, and how does it affect performance?



Short oversimplified version: it's how often the CPU can ask the GPU to draw something.
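A back-of-the-envelope way to see why that matters: if every draw call costs the CPU a fixed slice of time, the per-call overhead directly caps how many distinct objects you can issue per frame, and lowering that overhead is DX12's headline trick. (The microsecond costs below are invented for illustration, not measured values.)

```python
FRAME_BUDGET_US = 16_666  # CPU time available in one 60 FPS frame, in microseconds

def max_draws_per_frame(cost_per_call_us):
    """Draw calls that fit in a frame if each costs `cost_per_call_us` on the CPU."""
    return FRAME_BUDGET_US // cost_per_call_us

# Hypothetical per-call CPU costs: a thick driver/API path vs. a thin one.
thick_api = max_draws_per_frame(10)  # e.g. DX11-style single-threaded submission
thin_api = max_draws_per_frame(2)    # e.g. DX12-style prebuilt command lists
print(thick_api, thin_api)
```

With these made-up costs the thin path fits 5x the draw calls into the same frame budget, which is the same shape of result the benchmark in the article reports.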


----------



## Frick (May 25, 2015)

Mussels said:


> short oversimplified version: its how often the CPU can ask the GPU to draw something.



I should have realized. Language center of the brain, you've foiled me again!

EDIT: Seriously though, Google brought me this, looks promising.


----------



## NC37 (May 25, 2015)

So Radeons are already on Tier 3 while GeForce is still on Tier 2. Guess that explains why board makers are taking more interest in AMD.

Still, I imagine the next GeForce will be Tier 3 for sure and will probably give AMD a run for its money. That chart just isn't giving me much confidence for my Fermi cards. Debating whether DX12 will deliver enough performance to buy another year out of these. Guess maybe not. Gah, I wish NVIDIA would plan ahead better like AMD does.


----------



## Mussels (May 25, 2015)

NC37 said:


> So Radeons are already on Tier 3 while GeForce is still on Tier 2. Guess that explains why board makers are taking more interest in AMD.
> 
> Still, I imagine the next GeForce will be Tier 3 for sure and will probably give AMD a run for its money. That chart just isn't giving me much confidence for my Fermi cards. Debating whether DX12 will deliver enough performance to buy another year out of these. Guess maybe not. Gah, I wish NVIDIA would plan ahead better like AMD does.



Not to mention the consoles - AMD planned this out well in advance, it seems.

The console hardware could get some serious performance boosts here, and with AMD GPUs being ahead, DX12 providing CPU performance gains, and AMD having the cheapest many-core CPUs... well, it seems like they had a direction and it's paying off.


----------



## RejZoR (May 25, 2015)

So, if I'm reading the above images correctly, ALL Radeon graphics cards that feature GCN support Tier 3? That means EVERYTHING starting with the HD 7900 series and above, whereas GeForce only supports a large majority of the features with the latest Maxwell (and not even all of them). AMD always seems committed to fully supporting stuff even if it's not fully used by developers. I guess they learned this back in the Radeon 9700 days, when following the DX specs helped them entirely dominate NVIDIA's crappy GeForce FX... History is repeating. Maybe not to such an extent, but Radeon users are certainly better off here...


----------



## Mussels (May 25, 2015)

RejZoR said:


> So, if I'm reading the above images correctly, ALL Radeon graphics cards that feature GCN support Tier 3? That means EVERYTHING starting with the HD 7900 series and above, whereas GeForce only supports a large majority of the features with the latest Maxwell (and not even all of them). AMD always seems committed to fully supporting stuff even if it's not fully used by developers. I guess they learned this back in the Radeon 9700 days, when following the DX specs helped them entirely dominate NVIDIA's crappy GeForce FX... History is repeating. Maybe not to such an extent, but Radeon users are certainly better off here...



At a guess it's because of Mantle - they knew what they wanted, designed hardware around it, planned for it, and proved with Mantle that it was possible to drastically improve performance on existing hardware (theirs, at least) - and then MS got on board because "fuck yeah, now we have a reason to make people get Windows 10". So once again (this isn't the first time) AMD got the technology crown, if not the performance crown.


----------



## RejZoR (May 25, 2015)

Yeah, that could be the reason since Mantle works very similar to DX12. I think they kinda anticipated all this when they started with the GCN architecture...


----------



## Folterknecht (May 25, 2015)

Mussels said:


> ... and then MS got on board *because "fuck yeah, now we have a reason to make people get windows 10"* so once again (this isn't the first time) AMD got the technology crown, if not the performance crown.



What you say there might be one reason, but I personally see another aspect. Imagine what would have happened if Mantle (or something similar) had gotten momentum in the Linux world. If playing games on Linux becomes a thing, MS and Windows are in deep shit.


----------



## RejZoR (May 25, 2015)

I don't think that's the case. I installed the latest Ubuntu like a month ago and it was the clumsiest piece of software I've used in ages. Sure, it's free and offers tons of stuff, but it's just bad, like every single Linux distro I've seen so far. Windows isn't perfect either, but at least basic stuff works right, and whatever driver I have to install, it's a matter of a few clicks and almost zero problems, unlike installing drivers on Ubuntu...


----------



## 64K (May 25, 2015)

I was planning to put Win 10 on my next build later this year anyway, but even if I weren't doing a Skylake build I would upgrade to Win 10 for DX12.


----------



## qubit (May 25, 2015)

RejZoR said:


> I don't think that's the case. I installed the latest Ubuntu like a month ago and it was the clumsiest piece of software I've used in ages. Sure, it's free and offers tons of stuff, but it's just bad, like every single Linux distro I've seen so far. Windows isn't perfect either, but at least basic stuff works right, and whatever driver I have to install, it's a matter of a few clicks and almost zero problems, unlike installing drivers on Ubuntu...


That's really sad and a real shame. I could have cut and pasted your statement and said it 15 years ago about Linux when I was playing around with it. No wonder it hasn't gotten anywhere on the desktop against Windows, regardless of any other reasons.


----------



## RejZoR (May 25, 2015)

Microsoft and Windows may be evil and all, but since it's not a clusterfuck of all the world's knowledge condensed into a single OS, they know how to design and standardize things. And that's why Windows works and Linux doesn't. Windows has progressed, and while they've done stupid things with Vista and vanilla Win8, they fixed things with Win7 and Win10. Linux is the same confusing stuff ever since I can remember it. Sure, it has made some improvements, I'll admit that, but that's like comparing Windows 8 and Windows 8.1. Unfortunately that's how it is. I'd love to see a real alternative to Windows, but to be honest there just isn't one.


----------



## Ebo (May 25, 2015)

If AMD hadn't come up with Mantle, we wouldn't have seen DX12 for years to come, that's for sure.

AMD proved through Mantle that they could make software that improved on and outran DX11, at the same time taking Microsoft off guard and showing them it could be done.

Unfortunately AMD isn't developing Mantle further, since DX12 can do exactly the same and also gives the same to non-Radeon owners.
I think AMD has shot themselves in the foot by not moving Mantle further; that would give people looking for a new GFX card an extra push to take a serious look at what AMD has to offer.

A lot of people say that the Hawaii core in the R9 290/290X is bad due to running hot and using a lot of power. That's wrong in my book: if you own an R9 290 series card with an AIB cooling solution, then heat and throttling simply don't exist.

As for the extra power use, seen over a year that adds close to nothing to your electricity bill; it just doesn't add up in my book.

Now the R9 390 series is coming out with a brand new RAM technology which NVIDIA doesn't have, and won't have for the next 6-12 months. Since HBM is a joint venture between SK Hynix and AMD, I just don't understand why they haven't taken out a patent on the technology and just cut NVIDIA out from ever using it.


----------



## Xzibit (May 25, 2015)

Also, consider the longevity of GCN: a Tier 3 DX12 card introduced in Dec 2011/Jan 2012. AMD had a complete DX12 Tier 3 line-up by the end of Q1 2012. For those who keep their card for longer periods, there's less need to upgrade for features. Good or bad, it's easier on the pocketbook than introducing features generation by generation (for the consumer).


----------



## Ebo (May 25, 2015)

For people with older systems DX12 is a win-win situation; it's going to keep your old hardware running just a bit longer.

For us with high-end systems not much, or close to nothing, is gained by switching, but that doesn't really matter anyway. Win 10 is coming and it's a free upgrade, so what's the hassle?


----------



## 15th Warlock (May 25, 2015)

Ebo said:


> If AMD hadn't come up with Mantle, we wouldn't have seen DX12 for years to come, that's for sure.
> 
> AMD proved through Mantle that they could make software that improved on and outran DX11, at the same time taking Microsoft off guard and showing them it could be done.
> 
> ...



Mantle is still alive, only now it's called Vulkan


----------



## Ebo (May 25, 2015)

15th Warlock said:


> Mantle is still alive, only now it's called Vulkan



You are wrong, just look:
http://www.vrworld.com/2015/03/05/amds-mantle-efforts-come-end/

Vulcan is a completely different API, and they'll have to catch up when DX12 comes out with Win 10


----------



## Frag_Maniac (May 26, 2015)

Yay GCN, my now 2.5-year-old $330 7970 with a decent 3-game bundle is still holding its value! 

Drivers are slipping a bit, but nothing a WHQL rollback can't solve.


----------



## Fluffmeister (May 26, 2015)

They certainly are milking it.


----------



## Nordic (May 26, 2015)

RejZoR said:


> I don't think that's the case. I've installed latest Ubuntu like a month ago and it was the clumsiest piece of software I've used in ages. Sure its's free and offers tons of stuff, but it's just bad like every single Linux distro I've seen so far. Windows isn't perfect either, but at least basic stuff works right and whatever driver I have to install it's a matter of few clicks and almost zero problems unlike installing drivers on Ubuntu...


That is very different from my experience. Linux, for me, works really well; more often than Windows, it just works. But when it has a problem, I have less experience with Linux and don't know how to fix it. That's a matter of learning the OS, nothing else.


----------



## lilhasselhoffer (May 26, 2015)

So, for those asking about it AMD has released a list of cards that will support DX12:
http://www.guru3d.com/news-story/amd-released-list-of-compatible-directx-12-cards.html

Basically anything since the 7730 (ie, anything better than a 7730 in the 7xxx generation and all future series of cards) is on the list.  



This kinda boggles my mind. We get a long-form discussion about how AMD is going to die by 2020. We get another discussion about how AMD is looking toward the future (more so than _Nvidia_). It really gives me pause when people boil complex issues down into green team/red team, and then we get to see both sides of the coin discussed rationally elsewhere.  

No matter what the insanity, Techpowerup forums always make my day better.


----------



## Mussels (May 26, 2015)

lilhasselhoffer said:


> So, for those asking about it AMD has released a list of cards that will support DX12:
> http://www.guru3d.com/news-story/amd-released-list-of-compatible-directx-12-cards.html
> 
> Basically anything since the 7730 (ie, anything better than a 7730 in the 7xxx generation and all future series of cards) is on the list.
> ...



That makes me a little sad, since my second system has 6870 Crossfire. I thought they'd be DX12 capable :/


----------



## RejZoR (May 26, 2015)

The HD 6000 series isn't GCN-powered; it uses VLIW. It's an older architecture, so different that that's probably why they don't even bother making it support DX12. Don't get me wrong: you can still run DX12 games on HD 6000 cards, you just won't be able to use DX12-specific features.


----------



## Mussels (May 26, 2015)

RejZoR said:


> The HD 6000 series isn't GCN-powered; it uses VLIW. It's an older architecture, so different that that's probably why they don't even bother making it support DX12. Don't get me wrong: you can still run DX12 games on HD 6000 cards, you just won't be able to use DX12-specific features.



Oh, so in theory I should still get some performance improvement on the draw calls?


----------



## R-T-B (May 26, 2015)

AsRock said:


> A 670 can even do DX12? But anyway, even if it can do some commands, the answer is still no, due to the fact that there are no DX12 games lol.



Everything Fermi and newer gets the base DX12 treatment from NVIDIA.



> Gah, wish nVidia would plan ahead better like AMD does.



If I had a nickel for every time I've heard that uttered...  I'd have a nickel.



Mussels said:


> oh, so in theory i should still get some performance improvement on the draw calls?



Unfortunately no.  That comes with WDDM 2.0 I think, and you'll need a supporting driver.  You can run the games but your performance will be no better.

I think...  no way to be certain yet.


----------



## LightningJR (May 26, 2015)

I'll take all the extra performance I can get from my 670, that's for sure. Windows 10 being free makes it easier to move to it, and the DX12 performance increase is icing on the cake. I wonder what developers will be able to do now that they weren't able to do before. I'm sure there are people out there who have been in the situation where they had a vision but couldn't achieve it due to this overhead. Anyone know of an article with a developer/engine designer/graphics programmer talking about DX12?



Ebo said:


> A lot of people say that the Hawaii core in the R9 290/290X is bad due to running hot and using a lot of power. That's wrong in my book: if you own an R9 290 series card with an AIB cooling solution, then heat and throttling simply don't exist.
> 
> As for the extra power use, seen over a year that adds close to nothing to your electricity bill; it just doesn't add up in my book.



As a guy who loves overclocking and efficiency, physically hates being hot, and has his PC in a small room, I can tell you the R9 290/X is a failure. I live in Canada and the heat in the PC room already gets to be a bit much while stressing the GPU, and I have a 670. Overclocking is limited when the GPU is drawing that much power and putting off that much heat, even with an AIB cooling solution. Also, there's something about efficiency that makes me happy: doing more with less. I was stunned by NVIDIA's 600 series: low power, low heat, high performance. I'm not trying to argue for NVIDIA, I'm just trying to explain how I see it as a failure so you can understand. I was pleasantly surprised, however, with the performance of the 290s for the price (well, after the price came back down to its normal level).


Edit: I had Frick's link opened in a new tab, wrote the reply before I got to it. Sweet blog.


----------



## R-T-B (May 26, 2015)

LightningJR said:


> I'll take all the extra performance I can get from my 670, that's for sure. Windows 10 being free makes it easier to move to it, and the DX12 performance increase is icing on the cake. I wonder what developers will be able to do now that they weren't able to do before. I'm sure there are people out there who have been in the situation where they had a vision but couldn't achieve it due to this overhead. Anyone know of an article with a developer/engine designer/graphics programmer talking about DX12?
> 
> 
> 
> As a guy who loves overclocking and efficiency, physically hates being hot, and has his PC in a small room, I can tell you the R9 290/X is a failure. I live in Canada and the heat in the PC room already gets to be a bit much while stressing the GPU, and I have a 670. Overclocking is limited when the GPU is drawing that much power and putting off that much heat, even with an AIB cooling solution. Also, there's something about efficiency that makes me happy: doing more with less. I was stunned by NVIDIA's 600 series: low power, low heat, high performance. I'm not trying to argue for NVIDIA, I'm just trying to explain how I see it as a failure so you can understand. I was pleasantly surprised, however, with the performance of the 290s for the price (well, after the price came back down to its normal level).



I thought the same thing. FWIW, I'm in Washington (almost Canada, we want to secede anyway, lol) and I have an R9 290X. It's a great card and doesn't overheat, but thing is, I do. In my small room it's only really usable in comfort with a good climate control system and/or an open window in the winter. Otherwise you just sweat it out. This time of year, I'm sweating myself to death for every FPS I gain.


----------



## Frag_Maniac (May 26, 2015)

lilhasselhoffer said:


> So, for those asking about it AMD has released a list of cards that will support DX12:
> http://www.guru3d.com/news-story/amd-released-list-of-compatible-directx-12-cards.html
> 
> Basically anything since the 7730 (ie, anything better than a 7730 in the 7xxx generation and all future series of cards) is on the list.


Our older 7000 series cards only have two ACE (asynchronous compute engine) units though, so we'll likely only see half the boost the newer ones will get. I'll take it and run with it like a shameless hobo though.


----------



## AsRock (May 26, 2015)

R-T-B said:


> I thought the same thing. FWIW, I'm in Washington (almost Canada, we want to secede anyway, lol) and I have an R9 290X. It's a great card and doesn't overheat, but thing is, I do. In my small room it's only really usable in comfort with a good climate control system and/or an open window in the winter. Otherwise you just sweat it out. This time of year, I'm sweating myself to death for every FPS I gain.



WOW, I remember this last winter, and fuck, mine did not keep me warm. In fact I remember my fingers getting like icicles, and that's from someone who likes the colder seasons.


----------



## RCoon (May 26, 2015)

lilhasselhoffer said:


> This kinda boggles my mind. We get a long form discussion about how AMD is going to die by 2020. We get another discussion about how AMD is looking toward the future (more so than _Nvidia_). It really gives me moment to pause when people boil down complex issues into green/red team, then we get to see both sides of the coin discussed rationally elsewhere.
> 
> No matter what the insanity, Techpowerup forums always make my day better.



I've always found that news posts quoting anything to do with AMD/NVIDIA turn into a giant trashfest. The forums less so; only when somebody quotes news sources does it descend into madness. Probably says something about the reader base (conjecture) vs the forum base (actual experience/application).


----------



## rtwjunkie (May 26, 2015)

Microsoft really has us by the proverbial short hairs on this. On the one hand they dangle the DirectX 12 carrot in front of us, but behind their back they hold the unknown of whether W10 will be subscription based, etc.  They are counting on the masses rushing forward like lemmings off a cliff.  I hope we don't get the "gotcha" a year after release....


----------



## lilhasselhoffer (May 26, 2015)

rtwjunkie said:


> Microsoft really has us by the proverbial short hairs on this. On the one hand they dangle the DirectX 12 carrot in front of us, but behind their back they hold the unknown of whether W10 will be subscription based, etc.  They are counting on the masses rushing forward like lemmings off a cliff.  I hope we don't get the "gotcha" a year after release....



Personally, I think MS knows that the bulk of their software is pirated in the rest of the world.  In the US, and to some extent Europe, copyrights are enforced.  In China, a copyright is only as valuable as the toilet paper it is printed on.

To that end, MS wants its operating system to be ubiquitous.  They know that pirates will continue pirating the software, and that paying customers are tired of having to deal with crappy OSes every other time.  To that end, they offer a free upgrade which will both move everyone to a new standard and make piracy less of a concern in the short run.  

I'm thinking that this is a bid to move from an OS maker, into a media hub.  Honestly, consider MS offering a Steam like software store...oh wait, they already do that.  Well, what about offering a gaming experience like Steam/Origin... well, GFWL tried that and failed spectacularly.  What about integration with a gaming market they already control...is that you over there Xbone?  

I hate to say this, but MS seems to be slashing the price of their OS, and instead selling their functional integration as a means to financial ends.  It's what we saw with the Xbone (honestly, did the thing do anything but play tv?), it's inherent in the push for windows on mobile devices, and it's exemplified by the free upgrade.  I believe MS has seen Valve as an example.  Software isn't where the money is at, it's with the gate keeping of content.  If MS takes a minor loss today they can make significant sums of cash tomorrow.


The only way I see this backfiring is how you've described. If MS gets greedy and/or installs some backwards form of DRM into Windows 10 (if that keylogger doesn't disappear, I'm happy enough with DX11), they lose any trust. I'm pretty sure MS isn't stupid enough to give away a tainted product, because the repercussions of doing so would be infinitely worse than another $140 OS purchase. MS could easily survive another OS flop, but if Windows 7 lasts as long as XP did then there will have to be changes at MS. Not everyone there has a golden parachute, so hopefully the top brass at MS know better... only time will tell...


----------



## 15th Warlock (May 26, 2015)

Ebo said:


> You are wrong, just look:
> http://www.vrworld.com/2015/03/05/amds-mantle-efforts-come-end/
> 
> Vulcan is a completely different API, and they'll have to catch up when DX12 comes out with Win 10



Nope, I'm not wrong:


> AMD's Robert Hallock confirmed on a blog post that Mantle had, for the most part, been turned into the Khronos Group's Vulkan API that would supersede OpenGL.
> 
> "The cross-vendor Khronos Group has chosen the best and brightest parts of Mantle to serve as the foundation for 'Vulkan,' the exciting next version of the storied OpenGL API," Hallock wrote. "Vulkan combines and extensively iterates on (Mantle's) characteristics as one new and uniquely powerful graphics API. And as the product of an incredible collaboration between many industry hardware and software vendors, Vulkan paves the way for a renaissance in cross-platform and cross-vendor PC games with exceptional performance, image quality and features."

http://www.pcworld.com/article/2894...ises-from-the-ashes-as-opengls-successor.html
Oh, and it's Vulkan, not Vulcan btw


----------



## ChevyOwner (May 26, 2015)

> Would big DX12 performance increases encourage you to upgrade to Windows 10?



If there is a RAM Limit like Windows 7 x64 Home Basic(8GB), or Home Premium(16GB) then no.


----------



## REAYTH (May 26, 2015)

Looks like my 670s in SLI will be good for another few years at 1080p.


----------



## R-T-B (May 26, 2015)

AsRock said:


> WOW, I remember this last winter, and fuck, mine did not keep me warm. In fact I remember my fingers getting like icicles, and that's from someone who likes the colder seasons.



I live in a room that would put a soviet apartment complex to shame.  It's very small.  That and our winter was warmer than most of the states this year.


----------



## LightningJR (May 26, 2015)

AsRock said:


> WOW, I remember this last winter, and fuck, mine did not keep me warm. In fact I remember my fingers getting like icicles, and that's from someone who likes the colder seasons.





R-T-B said:


> I live in a room that would put a soviet apartment complex to shame.  It's very small.  That and our winter was warmer than most of the states this year.



I wouldn't expect an R9 290X to heat my PC room in the winter, that's just crazy. Bring on the quad-fire 290Xs then XD. I don't have AC, and keeping a room below 20C is impossible in the summer with a PC; I couldn't imagine having a 290 in those conditions.


----------



## Mussels (May 26, 2015)

LightningJR said:


> I wouldn't expect a R9 290X to heat my PC room in the winter, that's just crazy. Bring on the Quad-Fire 290X's then XD  I don't have AC and keeping a room below 20C is impossible in the summer with a PC, I couldn't imagine having a 290 in those situations.




We regularly hit 40-45C here in summer, so umm... I use an Android tablet for a lot of things in those months. DirectX 12 won't stop global warming, kids; maybe drag this back on topic?


----------



## R-T-B (May 26, 2015)

It heats my pathetic room when coupled with a hot Nehalem chip, but yes, kudos for being back on topic.  

I'm just happy my R9 290X purchase decision is paying off again; ironically, I bought it for its DirectX 11.1 support (and price, of course, it was a steal). Now I get DX12 Tier 3... woo!


----------



## TRWOV (May 26, 2015)

Damn AMD, you're making it very difficult for me to come up with a reason to upgrade from my 7970 




Xzibit said:


> It's all up to software and developers. DX12's tier system will make for the same old lazy development. Depending on tier support, there's a limit on what you're able to support and do. The higher the tier your GPU supports (Tier 1, 2 or 3), the more resources you get.


----------



## Arjai (May 26, 2015)

Ok, after voting, Not Sure, yesterday. I changed my vote today.

Reading through some links here and google links there, a total of perhaps 2 hours over two days, I am impressed.

The sooner the better!! Bring on DX12!!


----------



## Slizzo (May 26, 2015)

Gonna be upgrading to Windows 10 anyway (both my 8.1 machine and 7 machine) so any benefit I receive is gravy.


----------



## Caring1 (May 27, 2015)

Slizzo said:


> Gonna be upgrading to Windows 10 anyway (both my 8.1 machine and 7 machine) so any benefit I receive is gravy.


Same here, my Laptop with an APU needs every boost it can get.


----------



## Mussels (May 27, 2015)

Caring1 said:


> Same here, my Laptop with an APU needs every boost it can get.



have you checked if its DX12 capable? sadly despite having a kickass gaming laptop, mine falls short (6570 2GB)


----------



## Caring1 (May 27, 2015)

Mussels said:


> have you checked if its DX12 capable? sadly despite having a kickass gaming laptop, mine falls short (6570 2GB)


If that's the case then mine does too, it's a 6310, not that it matters really, I hardly use that aside from crunching.


----------



## Mussels (May 27, 2015)

Caring1 said:


> If that's the case then mine does too, it's a 6310, not that it matters really, I hardly use that aside from crunching.



It gave me man feels to realise that myself; the CPU in my laptop is better than what a lot of desktop users have, so being able to let DX12 utilise it would have been great.


edit: I did some Googling and it's hard to be sure, but DX11 hardware may still run DX12 titles with the CPU performance gains - it just can't use the new graphical stuff on the GPU side.


----------



## R-T-B (May 27, 2015)

> I did some Googling and it's hard to be sure, but DX11 hardware may still run DX12 titles with the CPU performance gains - it just can't use the new graphical stuff on the GPU side.



The more I read, the more I think it's likely you'll get some of them at least.  You'll get much more if AMD releases a WDDM 2.0 driver, which brings it up to the bare minimum needed for many of the CPU performance improvements.  AMD will likely do this given time as the 6000 series is still technically a supported product.


----------



## Mussels (May 27, 2015)

I hope so, and I hope the mobility side gets some lovin' too - they're usually rebadged older cards.

It'd be freaking nice if AMD released a driver that let us run these games with at least the CPU benefits and reduced graphics.


----------



## LightningJR (May 27, 2015)

I am going to put the Windows 10 Preview on my A8-4500M laptop and try the 3DMark draw call benchmark; we'll see how it goes. It would have been nice if AMD had supported Mantle on my APU. I find the CPU in my laptop weak: when YouTube uses VP9 I can't watch 1080p60, and I can just barely manage 720p60.


Edit: The DX12 benchmark isn't working. I don't know if I am doing something wrong or it just doesn't work for APUs right now. I'll do more research to see if I can figure it out.

Edit 2: Tried 5 different drivers with no success. My APU is not GCN; I don't know if that's why, or if the drivers simply don't support my APU yet.


----------



## Frag_Maniac (May 28, 2015)

Smaller dies end up allowing bigger OCs with not much difference in temps. Deal with it, peeps. If you're lucky it will put out just enough heat in the winter to keep you comfy, and not give you heat stroke in the summer. I'm picturing Bear Grylls laughing at all this.


----------



## DeViLzzz (May 28, 2015)

Anything that gives my GTX 670 more life I am a fan of.  I definitely will get Windows 10 for DirectX 12.


----------



## R-T-B (May 28, 2015)

Frag Maniac said:


> Smaller dies end up being bigger OCs and not much diff in temp. Deal with it peeps. If you're lucky it will put out just enough heat in the winter to keep you comfy, and not give you heat stroke in the summer. I'm picturing Bear Grylls laughing at all this.



Bear Grylls probably would die of laughter if he ever saw this website...

But yeah, Hawaii and Nehalem make for a hot build, most certainly.


----------

