
Mantle API presentation by AMD, DICE and Oxide - AMD Summit 2013

It's funny how people are all sceptical over Mantle, as if low-level APIs were something new. Am I really the only one who remembers 3dfx Glide and S3 Metal?
 
It's funny how people are all sceptical over Mantle, as if low-level APIs were something new. Am I really the only one who remembers 3dfx Glide and S3 Metal?

I'm skeptical because it's AMD, their PR team is horrifically bad. I remember Glide (not so much Metal) hence why I referenced it--I'm worried about a similar situation where one manufacturer depends on a low level API to maintain a performance edge. Honestly, I find it a bit confusing that AMD claims all these developers are asking them for a solution to code to the metal when back during the Glide-era developers were begging Microsoft and SGI to make a hardware agnostic API to get away from coding to the metal.

We'll see how it plays out, but I'm not willing to accept these 10x performance gains AMD is touting. I've heard similar talk in the past and it never pans out as well as intended.
 
It's funny how people are all sceptical over Mantle, as if low-level APIs were something new. Am I really the only one who remembers 3dfx Glide and S3 Metal?

Skeptics like me don't doubt that the technology can provide performance improvements, or that it can be implemented. However, we are still wondering:

  1. What magnitude of overall performance improvement will result?
  2. Will this actually catch on and have widespread adoption?
Neither one is easy to quantify at the moment. For performance figures AMD is using qualitative means to describe the overall performance improvement (e.g. "blow your socks off" CPU utilization) and only using quantitative measures for low level performance measurements (e.g. number of draws per frame) that don't directly translate into the end user experience. As far as widespread adoption, I am not willing to trust developers like EA just based upon them boasting that this will be used in all their latest games since games get cancelled and delayed all the time.

AMD has let me down a lot in the past (I'm still waiting for the Crossfire Eyefinity frame pacing driver they promised 6 months ago); if the company wants my business in the future they will have to prove that they can deliver products and technologies, not just talk about them. Send out a few finished games and beta or release drivers to independent reviewers and let them confirm these claims. If AMD's claims still hold up at that point, then I will be convinced. I've learned never to buy a product based on anticipated releases or features, since they rarely materialize or are delayed significantly. The people who bought their new graphics card because "it supports Mantle" are going to be in for a rude awakening when they realize that their card is horribly obsolete before Mantle games become widely available.
 
What right do people have to complain about inefficiencies if they cannot be bothered to make optimizations? I never hear Crytek or 4A Games complaining about graphics APIs. I've heard Carmack voice his dislike of DirectX, but he also craps all over OpenGL for not updating quickly enough. I like DICE's games, but they are far from perfect, and they have really odd priorities if they think their time is best spent working with AMD on Mantle. How about changing Battlefield's hit detection to server-side and fixing the goddamn netcode so people stop killing each other at the same time ~50% of the time?
In a perfect world all devs WOULD care a lot about optimization; CryTek, not so much. They're more interested in melting PCs, by their own words. Neither CryTek nor id are good examples of good game optimization. Carmack's horrible MegaTexture concept obviously had benefits only on their side, speeding up production, and nothing but trade-offs on the user side. The nuances of how the games are made mean little, really; it's more about the potential of the engines they build, because the rest is their personal vision versus expertise.
Glide was supported by quite a few devs as well. Hell, looking at the list EA and Interplay appear to have taken quite a liking to Glide.
Supported by is not the same as requested and co-developed by, not by a long shot. It's easy to sit back and say we'll try it if you make it, with no commitment up front.
 
In a perfect world all devs WOULD care a lot about optimization; CryTek, not so much. They're more interested in melting PCs, by their own words. Neither CryTek nor id are good examples of good game optimization. Carmack's horrible MegaTexture concept obviously had benefits only on their side, speeding up production, and nothing but trade-offs on the user side. The nuances of how the games are made mean little, really; it's more about the potential of the engines they build, because the rest is their personal vision versus expertise.

Supported by is not the same as requested and co-developed by, not by a long shot. It's easy to sit back and say we'll use it if you make it, with no commitment up front.

Crytek games are well optimised, as you can run them on a lot of computers with lower settings. When you turn everything up to 11 they melt PCs, and I believe the lack of that option is one of the things PC gamers bitch about when they talk about console games.
 
Crytek games are well optimised, as you can run them on a lot of computers with lower settings. When you turn everything up to 11 they melt PCs, and I believe the lack of that option is one of the things PC gamers bitch about when they talk about console games.

Exactly. Games can be extremely taxing and in fact should be if they are perfectly optimized--that is, taking full advantage of all the hardware available.
 
I'm skeptical because it's AMD, their PR team is horrifically bad. I remember Glide (not so much Metal) hence why I referenced it--I'm worried about a similar situation where one manufacturer depends on a low level API to maintain a performance edge. Honestly, I find it a bit confusing that AMD claims all these developers are asking them for a solution to code to the metal when back during the Glide-era developers were begging Microsoft and SGI to make a hardware agnostic API to get away from coding to the metal.

We'll see how it plays out, but I'm not willing to accept these 10x performance gains AMD is touting. I've heard similar talk in the past and it never pans out as well as intended.

You're looking at two different situations. Back then we had something like 10 graphics vendors, each doing their own thing. We had 4 or 5 different APIs, and graphics were evolving heavily; they needed something to unify everything. These days we only have 2 major graphics makers (and I'm not counting Intel) and basically just 1 API, since OpenGL isn't used much anymore and DirectX seems to have been stagnating for the last few years.

S3 Metal was exactly what AMD Mantle is. And if S3 hadn't gone bust back then and dissolved into VIA, it could most likely have lived on. It was, after all, supported by one of the most widely used engines, Unreal Engine. The performance I got out of a crappy S3 Savage3D 8MB was easily comparable to a RivaTNT2 that cost about 4 times as much in those days. It was because of S3 Metal that I could enjoy the best games of that era on a crappy graphics card: UT99, Deus Ex, Undying, The Wheel of Time, etc. All this was the good stuff.

DICE has already officially supported AMD Mantle, which is coming to BF4 in December. Since EA announced that they'll be using the Frostbite engine for pretty much all their games, that's already a massive number of games. Then there is Unreal Engine, which has also announced support; imagine how many developers use Unreal Engine, including all the indie studios. These two cover almost half of the entire game market, and the rest are proprietary engines which could potentially also support Mantle. And if AMD is planning support for other vendors as well, it will be a success no matter what you think right now.
 
Skeptics like me don't doubt that the technology can provide performance improvements, or that it can be implemented. However, we are still wondering:

  1. What magnitude of overall performance improvement will result?
  2. Will this actually catch on and have widespread adoption?

Here ya GO!
These are my own subjective speculations, nothing to do with anything else.

[image: chart of my own speculative Mantle performance-gain estimates]

That 25% at the 4x might be even less; I couldn't find a 4x comparison article to get an idea, but I guess the gain is so minimal they don't even make reviews about it.
And I also used the lowest of my own MT Boost estimates ...


------------------------
Great post I found elsewhere on the web, not only agreeing with what I was saying before, but also explaining in detail what I could not.


The common misconception seems to be that Mantle brings only CPU gains, and only helps with low end GPUs. This is not true.

Here are some examples of potential GPU gains:
  1. Bindless textures and hardware virtual memory (*) allow rendering in larger batches, thus increasing GPU utilization (the GPU partially idles at the start/end of draw/dispatch calls).
  2. Application-controlled memory management means that the GPU needs to shuffle fewer resources around (this isn't only a CPU hit; many current games have frame rate spikes because of this issue). The developer can also pack resources more tightly (multiple resources in the same page/line, increasing memory/cache utilization).
  3. With Mantle you can run multiple kernels in parallel (or a kernel and graphics in parallel) in a controlled way, and thus reduce GPU bottlenecks. For example, render a shadow map (mostly ROP and geometry setup) and an ALU-heavy compute pass (for example, lighting for the previous light source) at the same time. This results in much higher GPU utilization.
  4. Better predicates and storing GPU query results to GPU buffers (without CPU intervention) allow GPU optimization techniques that are not possible with PC DirectX.
  5. AMD also claims improvements to the indirect draw/dispatch mechanisms, but does not spill the details in the Mantle slides (these improvements could bring big GPU gains for certain advanced use cases).
  6. Direct access to MSAA data could also make deferred rendering AA much faster (more about that in the reply below).

(*) DirectX 11.2 also has partial support for hardware virtual memory (in the form of tiled resources). However, it has limitations, and the Windows 8.1 requirement basically makes the API useless right now (Mantle has a much bigger user base at the moment). Hopefully Microsoft will solve this issue and bring some of Mantle's other features to 11.3 (and/or 12.0).
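The batching point in item 1 above can be illustrated outside of any graphics API. This is a rough analogy in plain Python (no GPU involved), and `PER_CALL_OVERHEAD` is an invented stand-in for the fixed driver/validation cost of a submission: many tiny submissions pay that fixed cost every time, one big batch pays it once.

```python
import time

PER_CALL_OVERHEAD = 1e-5  # invented stand-in for fixed per-submission cost

def submit(items):
    """Simulate one 'draw call': fixed overhead plus per-item work."""
    time.sleep(PER_CALL_OVERHEAD)   # fixed cost paid on every call
    return sum(items)               # the actual per-item work

objects = [[i] for i in range(200)]

# Many small submissions: overhead paid 200 times.
start = time.perf_counter()
small = sum(submit(obj) for obj in objects)
many_calls = time.perf_counter() - start

# One batched submission: overhead paid once.
start = time.perf_counter()
batched = submit([i for obj in objects for i in obj])
one_call = time.perf_counter() - start

assert small == batched             # same result either way
print(f"200 calls: {many_calls:.4f}s, 1 call: {one_call:.4f}s")
```

The work done is identical; only the number of times the fixed cost is paid differs, which is the same reason larger GPU batches raise utilization.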

AMD announced that with Mantle we finally have full manual access to both GPUs in Crossfire. This is excellent news. I was quite worried that SLI/Crossfire would die soon, as many new graphics engines will start doing scene management and rendering decisions on the GPU side. Alternate frame rendering (with automatically synchronized memory between cards) is just not a good fit for a scenario where the data set is mutated slightly every frame (by compute shader passes that are pretty much impossible to analyze with automatic logic). AFR works best when everything is freshly generated during a single frame and there are no dependencies on existing data. However, this kind of processing is a huge waste of GPU (and CPU) time, and frankly we can do much better (and I believe that forthcoming "pure" DX11+ engines that have no legacy baggage surely will). With Mantle, supporting Crossfire is possible even in these kinds of advanced GPU-driven rendering engines. Hopefully Nvidia releases something similar as well, or they will see very bad SLI scaling in some future games/engines.
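The AFR problem described above can be sketched abstractly. This is my own toy Python model, not Mantle code: frames alternate between two GPUs, and a frame that reads state mutated by the previous frame creates a dependency that crosses GPUs and forces a synchronization, while fully self-contained frames never do.

```python
# Toy model of alternate frame rendering (AFR) across two GPUs.
# A frame that reads state mutated by the previous frame forces a
# cross-GPU copy, because the previous frame ran on the other GPU.

def schedule_afr(frames, gpu_count=2):
    syncs = 0
    for i, frame in enumerate(frames):
        gpu = i % gpu_count                      # AFR: alternate GPUs
        if frame["reads_previous_frame_state"] and i > 0:
            prev_gpu = (i - 1) % gpu_count
            if prev_gpu != gpu:                  # dependency crosses GPUs
                syncs += 1                       # must copy mutated data over
    return syncs

# Engine style A: every frame regenerates its data from scratch.
fresh = [{"reads_previous_frame_state": False} for _ in range(100)]

# Engine style B: compute passes mutate a persistent scene each frame.
mutating = [{"reads_previous_frame_state": True} for _ in range(100)]

print(schedule_afr(fresh))     # 0 syncs: AFR scales well
print(schedule_afr(mutating))  # 99 syncs: a stall on every frame
```

With two GPUs and per-frame mutation, every frame depends on the other card, which is exactly why AFR scaling collapses for GPU-driven engines unless the API exposes the GPUs explicitly.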


Deferred antialiasing will be much more efficient, assuming the "Advanced MSAA features" in the Mantle slides means that you have direct access to GPU color/depth blocks and MSAA/layer data (including coverage sample index data). With all that data available, tiled (and clustered) deferred renderers can separate pixels (different sample counts) more efficiently and recover geometry edge information (using coverage samples) in a much more precise and efficient way.


That would definitely give a big GPU boost for a deferred renderer with MSAA (especially with coverage-based EQAA/CSAA). Of course, estimating the gains is not possible right now, since AMD hasn't yet released the full Mantle API specification, so we don't know exactly how low-level your access to the MSAA and depth/color compression data is.

 
When you turn everything up to 11 they melt PCs, and I believe the lack of that option is one of the things PC gamers bitch about when they talk about console games.

This pretty much makes my point. Instead of being optimized for the consumer market they target most, they "melt PCs". It's not hard in the final phases of production to dial things in for the current top shelf hardware at that time and not have this be the case. What's the point of having high end graphics if you can't use max settings without problems and have to wait to upgrade your GPU? By then most are tired of the game. This is why most games are NOT in fact made for hardware not available yet.
 
This pretty much makes my point. Instead of being optimized for the consumer market they target most, they "melt PCs". It's not hard in the final phases of production to dial things in for the current top shelf hardware at that time and not have this be the case. What's the point of having high end graphics if you can't use max settings without problems and have to wait to upgrade your GPU? By then most are tired of the game. This is why most games are NOT in fact made for hardware not available yet.

I believe we're comparing apples and oranges.

I know I (cannot speak for Frick) am saying that the companies I mentioned optimize in the sense that they use all available resources to push the games to the edge. Yes, you usually need the highest-end hardware to run them, but shouldn't that be the case with max settings? They could do things like dial down the view distance, halve the texture resolution, and remove some lighting effects, but that detracts from the experience. It's worth noting that a game like BF3, which ran at about 60 fps for me on mid/high settings with my HD 5850, looked infinitely better than most games maxed out which ran at 60 fps, because it had access to more hardware. It was using all 4 threads of my i5-2500K, whereas most of the games I could run maxed out were only using about 2.

"Max Settings" is kind of irrelevant when comparing different titles with different standards. Diablo 3 maxed out looks worse than Battlefield 3 on med/low settings because the standards are completely different. If you use more available resources but also dial up the quality of the graphics you can turn what would have been a system pushing game at max to a system pushing game at medium/high. It's essentially just scaling the quality upwards by accessing more available resources. I always found it odd that people were turned off by only being able to use medium or high settings, and always wanted ultra or max regardless of what the quality was.
 
Optimization to the extent you're now referring to has never been CryTek's goal or result. You just can't expect a solid 60 FPS at 1080p with max settings unless you're doubling up your GPUs in SLI or Crossfire.

For the record, I was for a long time willing to settle for the best settings I could make do with when I was using a mere GTS 250 for over two years, but CryTek games are noted for the extra details they bring graphically.

It's kind of contradictory to equip yourself for such games and not use max settings, because it's only then that you get the extra eye candy that separates them from other games, but at a cost.

Anyway, back on topic, we're somewhat straying from it. Mantle at least brings to the table not just an alternate graphics API that promises more efficiency in gaming, but also better development through its debugging support, and the latter is what I think is most exciting.

Teams like CryTek that try to juice the most out of hardware should now be able to do it more predictably without going overboard. It will help them dial in graphics power much better within acceptable performance limits.

It's interesting to note, too, that CryTek started out with Nvidia and went with AMD for Crysis 3. I don't know if that had anything to do with Mantle, but it's certainly a possibility given what they're striving for with its use in development.
 
Optimization to the extent you're now referring to has never been CryTek's goal or result. You just can't expect a solid 60 FPS at 1080p with max settings unless you're doubling up your GPUs in SLI or Crossfire.

For the record, I was for a long time willing to settle for the best settings I could make do with when I was using a mere GTS 250 for over two years, but CryTek games are noted for the extra details they bring graphically.

It's kind of contradictory to equip yourself for such games and not use max settings, because it's only then that you get the extra eye candy that separates them from other games, but at a cost.

Anyway, back on topic, we're somewhat straying from it. Mantle at least brings to the table not just an alternate graphics API that promises more efficiency in gaming, but also better development through its debugging support, and the latter is what I think is most exciting.

Teams like CryTek that try to juice the most out of hardware should now be able to do it more predictably without going overboard. It will help them dial in graphics power much better within acceptable performance limits.

It's interesting to note, too, that CryTek started out with Nvidia and went with AMD for Crysis 3. I don't know if that had anything to do with Mantle, but it's certainly a possibility given what they're striving for with its use in development.


Long story short, my opinion of Crytek: we all know what happened with Crysis 2. Instead of building and improving on what they made in the beginning, they jumped on the trendy fail-train while making it seem like they knew what they were doing. Of course a new IP wouldn't be as respected in the beginning, and it was heavily pirated, but it still made 3 million sales because it was a damn great game; people kept buying it over the years, and over time it became an icon.


When I think about Mantle, there will always be one thing I want to see in this world before I die: a Crysis 1 remaster with a new engine and with Mantle! Improved AI and an even harder difficulty to make the gameplay seriously tactical and calculated. I honestly have little interest in anything multiplayer. I played over 2000 hours of Call of Duty 2, great times, and I would play a remaster of it from time to time; but beyond FPSs, the other genre for me is FPAs like Metroid, Zelda, etc., and then RTS, but RTS doesn't really critically require Mantle. Crysis would benefit immensely.

Crytek just seems like a confused company to me; they don't really know what the hell they want, and they fix things that aren't broken. The whole CryMod community doesn't exist anymore. I haven't even checked where all the assets went in two years; it's so sad and depressing I don't really want to think about it. I was there in the core of the community, such great people. The Crysis story turned completely around on itself, it's not Crysis anymore, and I simply didn't want to have anything to do with Crysis 3. I don't even have a good enough graphics card.


------------------------------------------------
------------------------------------------------

Proudly presents: an insight into who the Mantle haters are. Usually it's some stupid programmers who have no idea what they're talking about; this guy isn't even in 3D programming and has no idea what benefits consoles have in terms of code efficiency. That's who the blogger/Twitter haters are: they don't have the complete picture for their opinion to even be considered, and they show no critical thinking.


[image: screenshot of the blog post in question]
 
I think the best thing for Mantle would be if devs start using it successfully with engines other than Frostbite. Not that Frostbite isn't a good engine in its own right, but its complex coding is a big learning curve for most devs, and too much use of only Frostbite with Mantle is going to send a bad message to those wanting to implement it in their own engines.

I'm not sure Crysis 1 really NEEDS a remake with a different engine. Use of Mantle just might make it more efficient on the updated version of CryENGINE.
 
Not that Frostbite isn't a good engine in its own right, but its complex coding is a big learning curve for most devs, and too much use of only Frostbite with Mantle is going to send a bad message to those wanting to implement it in their own engines.

That doesn't make any sense.


I'm not sure Crysis 1 really NEEDS a remake with a different engine. Use of Mantle just might make it more efficient on the updated version of CryENGINE.

These 2 statements contradict each other.
 
That doesn't make any sense.

He's saying the growth of anything is dependent on the barrier to entry. Frostbite is infinitely more difficult to use than, say, Unity or Unreal. Not to mention I believe it is very cost-prohibitive, so you're unlikely to see anyone not developing for EA using it. I would assume Mantle requires alterations to be made to game engines, so who knows if these other, more readily available engines will be updated to make use of Mantle.
 
He's saying the growth of anything is dependent on the barrier to entry. Frostbite is infinitely more difficult to use than, say, Unity or Unreal. Not to mention I believe it is very cost-prohibitive, so you're unlikely to see anyone not developing for EA using it. I would assume Mantle requires alterations to be made to game engines, so who knows if these other, more readily available engines will be updated to make use of Mantle.

What on earth does Mantle have to do with Frostbite?
 
What on earth does Mantle have to do with Frostbite?

Probably the fact that Frostbite is the entire center of their marketing strategy for Mantle? If AMD just quietly released Mantle everybody would ignore it; they are pushing it hard with the Frostbite engine to show people its capabilities.
 
Probably the fact that Frostbite is the entire center of their marketing strategy for Mantle? If AMD just quietly released Mantle everybody would ignore it; they are pushing it hard with the Frostbite engine to show people its capabilities.

What the hell are you talking about? Nobody's pushing anything. They're still testing the thing, still making debug tools, still not ready to release the SDK planned for early 2014. Of course BF4 will have the first proof-point prototype, because it's DICE who co-developed the API and they're the first ones who started working with it. There are four other developers who have joined, but AMD said the initial testing teams are intentionally kept small, so it was a few guys there at the Q&A. Oxide has their own demo planned for early 2014, or it's a game, or I may have misheard that part, but it's called Nitrous or Starswarm or whatever it is.

And it has nothing to do with any other engine, any other developer, or their chances of implementing it. It happens to be Frostbite, but it could have been anyone.
 
That's just patently false. AMD is pushing Mantle, and to do that they have to prove it's actually worth investing in; that's where Frostbite comes in. If they (along with DICE) can prove it has benefits, other developers will work with it. But as it stands, with only Frostbite they can't do a ton, because EA keeps Frostbite under lock and key. The biggest issue is that it needs to be configured to work properly with additional engines, preferably ones that are readily available, like Unity or Unreal, so that more developers can have a go at it. As it stands right now, the people supporting AMD Mantle are DICE, Cloud Imperium (with Star Citizen), and Oxide Games. Oxide is a development studio, made up primarily of former Civ V devs, that is building an engine called Nitrous.
 
That's just patently false. AMD is pushing Mantle, and to do that they have to prove it's actually worth investing in; that's where Frostbite comes in. If they (along with DICE) can prove it has benefits, other developers will work with it. But as it stands, with only Frostbite they can't do a ton, because EA keeps Frostbite under lock and key. The biggest issue is that it [I assume you mean Mantle] needs to be configured to work properly with additional engines, preferably ones that are readily available, like Unity or Unreal, so that more developers can have a go at it.

That's bullshit.
 
That's bullshit.

I think you're confused here. Most game studios never touch Mantle (or DirectX) directly; they will only utilize it through game engines. So if game engines don't adopt Mantle support, Mantle will remain an interesting idea but nothing more. EA's Frostbite is a test platform to try to convince other game engine developers to use it, but Frostbite is basically only for EA games; just announcing Frostbite support and nothing else is not enough to change the market. Think about the last five years; there are countless games on Unreal Engine 3 but only a small minority on Frostbite 1/2 (basically EA only). Believe it or not there is a lot of money invested in the DirectX ecosystem, and it's going to take a big push to change the status quo and convince companies like Epic Games to invest tens to hundreds of millions of dollars to integrate Mantle support.
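The dependency chain described above (studio → engine → graphics API) is essentially a renderer-backend abstraction. Here is an illustrative sketch in Python with entirely invented class names, not any real engine's API: game code only talks to the engine interface, so a new API like Mantle is invisible to games until the engine vendor adds a backend for it.

```python
# Illustrative sketch: game code depends on the engine's renderer
# interface, never on a graphics API directly. All names are invented.

class DirectXBackend:
    def draw(self, mesh):
        return f"D3D draw: {mesh}"

class MantleBackend:
    def draw(self, mesh):
        return f"Mantle draw: {mesh}"

class Engine:
    # The engine vendor decides which backends exist; games cannot add one.
    BACKENDS = {"dx11": DirectXBackend}   # Mantle absent until integrated

    def __init__(self, api):
        if api not in self.BACKENDS:
            raise ValueError(f"engine has no {api} backend")
        self.backend = self.BACKENDS[api]()

    def render(self, mesh):
        return self.backend.draw(mesh)

game = Engine("dx11")
print(game.render("scene"))        # works today

try:
    Engine("mantle")               # fails until the engine adds support
except ValueError as e:
    print(e)

# Only the engine vendor can flip this switch:
Engine.BACKENDS["mantle"] = MantleBackend
print(Engine("mantle").render("scene"))
```

This is why "Frostbite supports Mantle" alone changes little for the wider market: every other engine needs its own backend written before the studios using that engine see any benefit.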
 
Exactly. The burden is on Developers to implement Mantle, and AMD has to entice them and prove it's worth the investment. If Mantle were something they could just tack on everyone would already be working on it, but the fact is they have to rework their game engines to support it, which will take a lot of time and money.
 
I think you're confused here. Most game studios never touch Mantle (or DirectX) directly; they will only utilize it through game engines. So if game engines don't adopt Mantle support, Mantle will remain an interesting idea but nothing more. EA's Frostbite is a test platform to try to convince other game engine developers to use it, but Frostbite is basically only for EA games; just announcing Frostbite support and nothing else is not enough to change the market. Think about the last five years; there are countless games on Unreal Engine 3 but only a small minority on Frostbite 1/2 (basically EA only). Believe it or not there is a lot of money invested in the DirectX ecosystem, and it's going to take a big push to change the status quo and convince companies like Epic Games to invest tens to hundreds of millions of dollars to integrate Mantle support.


Exactly. The burden is on Developers to implement Mantle, and AMD has to entice them and prove it's worth the investment. If Mantle were something they could just tack on everyone would already be working on it, but the fact is they have to rework their game engines to support it, which will take a lot of time and money.


You guys have a preconceived view that AMD is using DICE as PR, when in fact it's the developers who wanted Mantle, which pretty much invalidates everything you're trying to say; not the other way around. If Mantle was nothing and there weren't big developers involved, I would have detected the PR spin a long time ago. Basically I have no interest in discussing this: you're wrong, it doesn't work this way, and you're mixing things around. Frostbite is not being exploited for PR; it could have been any engine that happened to be the first one implemented. You guys are spinning this around. Nothing you have said makes sense considering everything we've heard; I suggest you go back to page 1 and check out all the links.




If Mantle were something they could just tack on everyone would already be working on it

Total BS.
 
Looks like I might not need a CPU upgrade after all :)


What I'm expecting is that if the consoles support this (mostly the XB1, I assume), then all the console ports (and their engines, which we all know will get reused a lot) will support Mantle, and then it's going to take off massively.
 
You guys have a preconceived view that AMD is using DICE as PR, when in fact it's the developers who wanted Mantle, which pretty much invalidates everything you're trying to say; not the other way around. If Mantle was nothing and there weren't big developers involved, I would have detected the PR spin a long time ago

Uhhhhh what?

They are using DICE as PR: DICE is on stage at conventions praising an AMD product, and that is PR. Your misconception is that PR is innately bad, which is not the case. PR is just things like press releases, demos, and keynotes to help market a product. AMD could just as easily have released it and used their own tech or developed a tech demo to show it off, but they didn't. They worked with DICE, so DICE was the first to implement it and advertise its benefits.

There are some developers that wanted Mantle, and that's their decision. On the flip side, there was a time when developers hated low-level APIs similar to Mantle, and that's where DirectX came from. Yes, it could have been any engine, but it would have required the developer of said engine to work Mantle support into it. We're talking about graphics APIs here: you need the engine set up to support these things or else nothing will work as intended. There's a reason games are set up to run only in OpenGL or only in DirectX; it's because the game engines were designed to use those systems.

Basically I have no interest in discussing this: you're wrong, it doesn't work this way, and you're mixing things around. Frostbite is not being exploited for PR; it could have been any engine that happened to be the first one implemented. You guys are spinning this around. Nothing you have said makes sense considering everything we've heard; I suggest you go back to page 1 and check out all the links.

That's a very pleasant attitude to have, essentially "I disagree with the things you're saying, so I don't care and don't want to hear about it." I never said Frostbite was being exploited, just that it was the first engine to showcase Mantle, and that for the technology to take off other engines need to implement support for it. Also, everything we said makes sense; you just seem to be overly defensive and/or misinterpreting it.
 