Tuesday, July 1st 2008
AMD/ATI Tempts Game Developers with DirectX 10.1
With a market-position resurrection in progress, AMD/ATI looks to compete using a tried and tested tool for technological supremacy over its rivals: developer-level optimizations for games. Blizzard has been looking at implementing DirectX 10.1 in its future games. If that happens, it becomes a favorable scenario for AMD's products: Blizzard isn't in the habit of making games that run best only on the most expensive hardware, and with DirectX 10.1 it would look to implement certain DX10.1-exclusive effects. That means even mid-range users of ATI products could enjoy the best visuals the game has to offer, something NVIDIA and its users could miss out on.
It is learned that AMD is looking to team up with developers to implement DirectX 10.1 features. An 'old friend' of ATI, Valve, could well implement DirectX 10.1, with ATI apparently making sure that happens. The next major revision of the Source engine, which drives upcoming titles such as Half-Life 2: Episode 3, Portal 2, and Left 4 Dead, could well make them DirectX 10.1 titles, given the flexibility of the Source engine and the ease with which new technologies can be incorporated into it.
Game developers have a tendency to play it safe, though, and whether there will be any exclusive effects remains to be seen. There is no reason why they shouldn't implement DirectX 10.1, however: the worst-case scenario is that people with compliant hardware get a performance boost where DX10.1 makes a difference over DX10.0. On the surface, DirectX 10.1 is touted as more of a feature upgrade than a performance one.
Source:
NordicHardware
24 Comments on AMD/ATI Tempts Game Developers with DirectX 10.1
The shader/AA bug caused a huge slowdown. DX10.1 gets it back up to speed.
en.wikipedia.org/wiki/DirectX_10#DirectX_10
www.pcper.com/images/reviews/472/DirectX%2010_1%20White%20Paper%20v0.4.pdf
"DirectX 10.1 offers incremental improvements to the programming interface that address limitations of DirectX 10" — read: bugs.
Should be interesting, but that said, look at CoD 4 - DX10-style GFX, and it's still DX9. DX10 still needs to do something extraordinary to win me over.
Tell that to EA with Crysis. Some playing it safe there, and look what that did.
How does a 25-30% bump in performance with AA sound?
That is the reason Assassin's Creed dumped DX10.1 support: it would have made AMD's cards get better framerates than nVidia's, even though the game "proudly" displays the "nVidia, the way it's meant to be played!" logo.
10.1 forces 'optional' 10.0 features to become mandatory.
DX10 titles have been lacking so far, and I agree.. Call of Duty 4 had some damned awesome graphics, and it's only DX9.
So replace "performance for free" with "bug fix to remove performance fail"
well regardless, i have a 10.1 card (media PC has a 3450) so i can always use that and see what the fuss is about :P
things look good for AMD... :)
1- Many said you STILL may need that pass for many other things, so getting rid of it is not always beneficial. After reading this, I really wonder if that was the case with AC. Maybe they just scrapped the rendering pass because it wasn't needed for AA, totally forgetting that they (another developer, agnostic to the AA implementation) were using it for something else. Fits with the devs' explanation, IMO. After reading a lot about the issue, I'm all for this theory and totally against the "Nvidia paid and the developers cheated" one.
2- Many said you CAN do the same in DX10 as in DX10.1 regarding the feature that improved AA, but because it was not mandatory in DX10 it was not properly documented, so it was a lot harder to implement, though not impossible. According to them the performance in DX10 would be almost the same, but for many of them the lack of documentation meant it didn't make a lot of sense. TBH, they also say it's a lot better implemented in DX10.1, in the sense of how easy it is to use, and sometimes that is more important than anything else.
@ DarkMatter, what d'you mean by pass?
What I'm trying to explain is that apparently (at least that's what I understand from what I've read), what we saw in AC is not something we will see as much, because it only offers a performance boost in some limited cases. In fact the AC devs said that by removing that pass, the engine was not rendering the same thing, because that pass was probably used for something else besides AA under DX10, even if it's not very apparent.
EDIT: It's important to note here that the discussion I'm talking about occurred well before Assassin's Creed, and that it is me who has related those comments about DX10 and 10.1 features, and their pros and cons, to what happened with AC.
;)
Someone correct me if I'm wrong, though, but I thought AC reinstated DX10.1 with version 1.2 patch after both Ubisoft and nVidia came under a lot of fire for patch 1.1 and the removal of .1 support, and neither company wanted to give a straight answer and neither story matched up?
IMO, though, I'd like to see DX10.1 support start rolling into games. It would really help boost AMD/ATI's current growing market, and ATI could even use the opportunity to work with game developers more and help push their AMD GAME! campaign. We really need the competition in all areas of the market... everything hardware-wise has become very stale over the last 3-5 years, with too many companies holding sole dominance and a lack of real competition.
On the ATI side specifically, I'm pretty sure people were also noting how the new architecture of the R600 had a lot more growth potential, suggesting that while Nv's G80 architecture was better at the time, it would hit a wall before ATI's did. Personally, I think that's what has happened now.