Wednesday, June 3rd 2009
AMD Demonstrates World’s First Microsoft DirectX 11 Graphics Processor
At a press conference in Taipei, Taiwan today, AMD publicly demonstrated the world's first Microsoft DirectX 11 graphics processor. The series of demonstrations shed new light on the significantly improved computing experience set to debut at the end of 2009. The fusion of AMD's new ground-breaking graphics processors with the forthcoming DirectX 11 programming interface is set to forever change both applications and PC gaming for the better. To illustrate, AMD showed numerous examples of faster application performance and new game features using the world's first true DirectX 11 graphics processor.
Source:
AMD
- Get ready for a revolution: Games and other applications are about to get a lot better as a result of AMD's new graphics hardware and DirectX 11. DirectX 11 features such as tessellation will bring consumers higher-quality, superior-performing games making use of 6th-generation AMD technology. Another DirectX 11 feature, the compute shader, will enable AMD's DirectX 11 graphics cards to help make Windows 7 run faster in a wide range of applications, in a manner that's completely transparent to users, for example, in seamlessly accelerating the conversion of video for playback on portable media players through a drag-and-drop interface.
- DirectX 11 done right on AMD: The development of DirectX 11 has been broadly influenced by AMD graphics technology. Each new version of DirectX builds on the versions that came before it, and many of the capabilities of DirectX 11 were pioneered on AMD GPUs, including DirectX 10.1, tessellation, compute shaders, Fetch4, custom filter anti-aliasing and high-definition ambient occlusion shading.
- Bringing consumers DirectX 11 sooner: The preview of the world's first DirectX 11 graphics processor at Computex 2009 validates AMD's commitment to delivering leading technologies to market before anyone else, and to continuing to foster innovation in computing.
- Fueling developer demand: It's not just consumers who are excited about the prospects of DirectX 11; game developers are also incredibly enthusiastic about taking advantage of new DirectX 11 hardware to bring even better games to market, in large part due to AMD's readiness to meet their DirectX 11 needs. Many developers have indicated their commitment to building DirectX 11 games initially on AMD's DirectX 11 hardware, delivering superior performance and compatibility.
61 Comments on AMD Demonstrates World’s First Microsoft DirectX 11 Graphics Processor
Battlefield: Bad Company 2 is a DX10.1 title with ATI's tessellation support (2xxx, 3xxx, 4xxx series and Xbox). They ported it to DX11 just to "import" DX11 tessellation, which can be used by the GT300 (and RV870).
They don't use any of the other DX11 features, only DX11 tessellation.
Slides:
hardocp.com/news.html?news=Mzk5NTAsLCxoZW50aHVzaWFzdCwsLDE=
So DX10.x cards could run DX11 titles (with some eye candy and performance lost), but DX9 cards couldn't.
Just like today, you need a DX10 card to run DX10 titles.
Anyway, DX9-only cards are too weak for today's and future titles.
However, with software rendering you can render DX10/11 games on the CPU (with a DX9 card), but it will be very slow.
msdn.microsoft.com/en-us/library/dd285359.aspx
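For reference, here's a minimal C++ sketch of what that software path looks like through the API (error handling trimmed): asking Direct3D 11 for a WARP device, which rasterizes on the CPU instead of the GPU. Note that WARP currently tops out at the 10.x feature levels.

#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device*        device  = NULL;
    ID3D11DeviceContext* context = NULL;
    D3D_FEATURE_LEVEL    level;

    // D3D_DRIVER_TYPE_WARP selects the software rasterizer, so DX10-class
    // rendering can run even on a machine whose GPU is DX9-only.
    HRESULT hr = D3D11CreateDevice(
        NULL, D3D_DRIVER_TYPE_WARP, NULL, 0,
        NULL, 0,                // let the runtime pick the highest feature level
        D3D11_SDK_VERSION,
        &device, &level, &context);

    if (SUCCEEDED(hr))
    {
        // 'level' reports what WARP could actually provide.
        context->Release();
        device->Release();
    }
    return 0;
}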
NVIDIA's first DX11 GPU should be a creature of might too; can't wait for more than buzzwords to surface.
Also, about DX10 and DX11 compatibility and conversion, take the tessellation unit as an example. The DX10 SDK requires the developer to have supporting hardware (which is not there) or to create a software routine of their own, while DX11 will already have both. So let's say DX10 developers designed a software tessellation unit; now the only thing they have to do is invoke the SDK routine instead, as in the sketch below. Simple, isn't it?
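A rough sketch of that idea in C++ (the two helpers are hypothetical, named only to show the dispatch):

#include <d3d11.h>

// Hypothetical helpers, purely for illustration:
void RenderWithHullDomainShaders(ID3D11DeviceContext* ctx); // DX11's built-in path
void SubdivideOnCpu();                                      // the DX10-era hand-rolled routine

void DrawSubdivided(ID3D11DeviceContext* ctx, D3D_FEATURE_LEVEL level)
{
    // Under DX11 the tessellator is part of the pipeline, so we simply
    // invoke it; on older feature levels we keep our own software path.
    if (level >= D3D_FEATURE_LEVEL_11_0)
        RenderWithHullDomainShaders(ctx);
    else
        SubdivideOnCpu();
}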
But to be honest I can't wait to see the first images of a DX11 game because I want to see the future of gaming :D
I'll be working overtime to get my hands on one of those babies... I'm wondering if Crysis 2 will fully support DX11.
I love tessellation; I find it to be a great alternative to anti-aliasing with little performance drop, at least not as much as AA costs.
Anyway, I remember the Microsoft Flight Simulator X DX10.1 TARGET pre-rendered images. They don't even come close to the real thing. So be happy they are moving forward, and support them by buying the next king-of-the-hill video card. I know I will. And I also know I WILL be disappointed.
It's how the world works... They give you a prequel to a cookie (DX10.1), then, when you get so close to enjoying the actual cookie (DX10.1), you see that it isn't what you hoped it would be, but it's all ok because a new cookie with hazelnuts (DX11) appears and you forget all about the last cookie (DX10.1) and all the other cookies that you were forced to swallow (<= DX9) and left you with a sour aftertaste.
www.youtube.com/watch?v=Utz8D5aSK84
Anyway, I won't buy any ATI video card until they change their VGA coolers and add PhysX support, or whatever it's called.
English isn't my native language, so I don't know what you didn't understand about what I wrote regarding Crysis and DX10.
The Witcher: this is no different from how GTA IV was made for consoles and then ported to PC. If it had been made for PC first, it wouldn't run so poorly or need such excessive hardware to run.
The same goes for all the DX10 games out there: if they'd been made for DX10 from the start, they would be FASTER than DX9, not slower.
Thanks for the explanation, now I got the idea :D
Because I can't see a big difference between that and DX10. Anyway, nice find :rockout:
GEOMETRY INSTANCING From Wikipedia:
"In real-time computer graphics, geometry instancing refers to the practice of rendering multiple copies of the same mesh in a scene at once. This technique is primarily used for objects such as trees, grass, or buildings which can be represented as repeated geometry without appearing unduly repetitive, but may also be used for characters.
Although vertex data is duplicated across all instanced meshes, each instance may have other differentiating parameters (such as color, or skeletal animation pose) changed in order to reduce the appearance of repetition. By factoring out common data between instances to achieve lower memory usage, this technique is an example of the flyweight design pattern."
Geometry TESSELLATION From ExtremeTech:
The hull shader takes control points for a patch as an input. Note that this is the first appearance of patch-based data used in DirectX. The output of the hull shader essentially tells the tessellator stage how much to tessellate. The tessellator itself is a fixed function unit, taking the outputs from the hull shader and generating the added geometry. The domain shader calculates the vertex positions from the tessellation data, which is passed to the geometry shader.
It's important to recognize that the key primitive used in the tessellator is no longer a triangle: It's a patch. A patch represents a curve or region, and can be represented by a triangle, but the more common representation is a quad, used in many 3D authoring applications.
What all this means is that fully compliant DirectX 11 hardware can procedurally generate complex geometry out of relatively sparse data sets, improving bandwidth and storage requirements. This also affects animation, as changes in the control points of the patch can affect the final output in each frame.
The cool thing about hardware tessellation is that it's scalable. It's possible that low-end hardware would simply generate less complex models than high-end hardware, while the actual data fed into the GPUs remains the same, which is much more efficient.
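In API terms that boils down to something like the following sketch (shader compilation omitted; 'hs' and 'ds' are assumed to have been compiled elsewhere): switch the input assembler over to patch control points and bind the two new stages around the fixed-function tessellator.

#include <d3d11.h>

void DrawPatches(ID3D11DeviceContext* ctx,
                 ID3D11HullShader* hs, ID3D11DomainShader* ds,
                 UINT controlPointCount)
{
    // Primitives are now patches of control points, not triangles;
    // quad patches use the 4-control-point patch list.
    ctx->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_4_CONTROL_POINT_PATCHLIST);

    ctx->HSSetShader(hs, NULL, 0); // hull: decides how much to tessellate
    ctx->DSSetShader(ds, NULL, 0); // domain: positions the generated vertices

    // Sparse control-point input in, dense tessellated geometry out.
    ctx->Draw(controlPointCount, 0);
}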
Shiny Entertainment did geometry subdivision in their engine for "Messiah" and "Sacrifice"; it was all done in software by the engine.
Now we get it hardware-accelerated as part of the DirectX API.
N.B.
I may be a little off on some parts, so anyone more knowledgeable, feel free to set things straight for the greater good :)