# Mantle Enables Significant Performance Improvement in Battlefield 4: AMD



## btarunr (Jan 14, 2014)

In what could explain AMD's move to include copies of one of the most GPU-intensive games with its new A-Series APUs, the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL, "enables up to 45 percent faster performance" than DirectX in Battlefield 4, the only known game with planned support for Mantle, and one of the most popular PC games of the season. AMD's claims are so tall that even a 512 stream-processor A10-7850K APU could offer acceptable frame rates at 1080p, while a $299 Radeon R9 280X could run circles around a $650 GeForce GTX 780 Ti in this particular game. If anything, it could help Battlefield 4 become a potent tech demonstrator for the API, selling it to the various game developers with whom AMD has built strong relations.





*View at TechPowerUp Main Site*


----------



## RCoon (Jan 14, 2014)

The key words here are "up to".
I will reservedly wait for actual factual benchmarks before I believe this.


----------



## Wile E (Jan 14, 2014)

RCoon said:


> The key words here are "up to".
> I will reservedly wait for actual factual benchmarks before I believe this.


Ditto RCoon. Ditto.


----------



## john_ (Jan 14, 2014)

RCoon said:


> The key words here are "up to".
> I will reservedly wait for actual factual benchmarks before I believe this.



The key word here is "45%". It is way too big even for a marketing lie. I'd take an "up to 45%" in a title like BF4 over a guaranteed "15%" any time.


----------



## Prima.Vera (Jan 14, 2014)

"Up to". Can means that you can have 100fps more than a 780 Ti when looking at the floor only. The real performance will be on average fps, let's see how much those are increased in percentages.


----------



## The Quim Reaper (Jan 14, 2014)

We don't need higher frame rates at the top end; it's minimum frame rates that need to improve. Nothing's worse than a smooth-running game that drops into the low teens or 20s the moment you enter a new area.

I'd far rather see a guaranteed minimum of 30 fps, no matter what they put on the screen, than a meaningless 100+ fps.


----------



## Sempron Guy (Jan 14, 2014)

45 percent is a huge number. This might be AMD's boldest claim since Bulldozer.


----------



## john_ (Jan 14, 2014)

Prima.Vera said:


> "Up to". Can means that you can have 100fps more than a 780 Ti when looking at the floor only. The real performance will be on average fps, let's see how much those are increased in percentages.


I think looking at the floor will give you the least performance improvement. The biggest improvements will come from the most complex scenes, where all the disadvantages of DirectX show up and hurt performance.


----------



## erocker (Jan 14, 2014)

The 45% boost most likely applies to the APUs.


----------



## Kaynar (Jan 14, 2014)

The Quim Reaper said:


> We don't need higher frame rates at the top end; it's minimum frame rates that need to improve. Nothing's worse than a smooth-running game that drops into the low teens or 20s the moment you enter a new area.
> 
> I'd far rather see a guaranteed minimum of 30 fps, no matter what they put on the screen, than a meaningless 100+ fps.



Exactly. It's most certainly +45% on maximum FPS and +0% on minimum FPS (e.g. during an explosion).


----------



## Relayer (Jan 14, 2014)

btarunr said:


> In what could explain AMD's move to include copies of one of the most GPU-intensive games with its new A-Series APUs, the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL, "enables up to 45 percent faster performance" than DirectX in Battlefield 4, *the only known game with planned support for Mantle*, and one of the most popular PC games of the season. AMD's claims are so tall, that even a 512 stream processor-laden A10-7850K APU could offer acceptable frame-rates at 1080p, while a $299 Radeon R9 280X could run circles around a $650 GeForce GTX 780 Ti at this particular game. If anything, it could help Battlefield 4 become a potent tech-demonstrator for the API, selling it to the various game developers AMD has built strong developer relations with.



You really need to get out more. They are claiming 20 titles with planned Mantle support. The next one is Thief, in February.



> “Thief on PC will also be amongst the first batch of games that support AMD’s new Mantle API for high performance graphics,” said Eidos’ Jean-Normand Bucci, who conducted the interview


----------



## the54thvoid (Jan 14, 2014)

When the reviews are released, it would be sensible to compare low-, mid- and high-end CPUs paired with low-, mid- and high-end GPUs. That would let folks see where the bigger gains are made.


----------



## kn00tcn (Jan 14, 2014)

btarunr said:


> Battlefield 4, the only known game with planned support for Mantle



once again, star citizen...


----------



## bogami (Jan 14, 2014)

A beautiful slap at NVIDIA. NVIDIA could answer, of course, using the drivers for their Quadro series that bypass DirectX, but what would then be sold under the Quadro support program!


----------



## frankfrankynumberone (Jan 14, 2014)

kn00tcn said:


> once again, star citizen...


There is even more!

Announced:
Battlefield 4 - DICE/Frostbite Team
Plants vs. Zombies: Garden Warfare - PopCap Games/Frostbite Team
Sniper Elite 3 - Rebellion
Star Citizen - Roberts Space Industries
Thief - Eidos Montreal/Nixxes Software

TBD:
Dragon Age: Inquisition - BioWare/Frostbite Team
Galactic Civilization III - Stardock
Mars (codename) - Mohawk Games/Oxide Games
Mass Effect (untitled) - BioWare/Frostbite Team
Mirror's Edge (untitled) - DICE/Frostbite Team
Need for Speed (untitled) - Ghost Games/Frostbite Team
Star Control - Stardock/Oxide Games
Star Wars: Battlefront - DICE/Frostbite Team
Star Wars (untitled) - Visceral Games/Frostbite Team

Other unannounced titles from EA/SQUARE ENIX/SEGA/(TBD). 20+ titles in development.

That's all for now. But if I learn more, I'll post it.


----------



## Fluffmeister (Jan 14, 2014)

Another year, more PowerPoint slides. 

Release it already.


----------



## darkangel0504 (Jan 14, 2014)

Sad when it doesn't work for HD 6000


----------



## librin.so.1 (Jan 14, 2014)

I, personally, would love it if they put more effort into their drivers, especially optimizing their OpenGL[1], which at the moment, unlike on the green team, is Mjr. Balle de Sukke in their drivers. ...And would drop CCC, too. Instead of trying to "win" things over with this Mantle of theirs, which, I believe, can just create more consumer-unfriendly segregation. And we already have too much of that.

*[1] *as mainly a Linux user, I care about that a lot.


----------



## john_ (Jan 14, 2014)

Vinska said:


> I, personally, would love it if they put more effort in their drivers and especially on optimizing their OpenGL[1], which at the moment, unlike on the green team, is Mjr. Balle de Sukke on their drivers. ...And would drop CCC, too. Instead of trying to "win" things over with this Mantle of theirs, which, I believe, can just create more of consumer-unfriendly segregation. And we already have too much of that.
> 
> *[1] *as mainly a Linux user, I care about that a lot.



Mantle is necessary for AMD. Going the route of OpenGL only and/or DirectX is just butter on Nvidia's bread, because as the stronger brand with more cash in hand, Nvidia can manipulate the PC gaming market easily. And they have proved, in my opinion, that they enjoy doing so. Only now, with all the consoles in its pocket, does AMD have a strong advantage over Nvidia. It would be stupid not to use it, especially when you have a competitor that makes whatever it creates proprietary and locked.


----------



## west7 (Jan 14, 2014)

Interesting. A 45% performance improvement is great, but we all have to wait for benchmarks.


----------



## mr2009 (Jan 14, 2014)

erocker said:


> The 45% boost most likely goes to the APU's.


If you look at the picture at the bottom, it's written there:


> On AMD A10-7850K APU + AMD Radeon™ R9 290X Discrete Graphics


So I believe that 45% could be achievable on an R9 290X... correct me if I'm wrong.


----------



## Deadlyraver (Jan 14, 2014)

Either way, the performance is impressive.


----------



## Octavean (Jan 14, 2014)

Impressive indeed or potentially,....

They have my interest anyway,....

I've never been one to be AMD (ATI) only or nVidia only.  I often switch between the two.


----------



## Assimilator (Jan 14, 2014)

I'm interested to know where AMD is getting those numbers from, considering the Mantle version of BF4 isn't even out yet. Maybe they have a beta build of the BF4 Mantle engine that renders to /dev/null.

45% performance improvement*
* under certain conditions including but not limited to: season, time of day, prevailing wind direction, and/or whether the number of atoms in your computer is evenly divisible by 2


----------



## omnimodis78 (Jan 14, 2014)

The question has to be asked: if Mantle is the second coming of Christ, won't that basically negate the need for high-end, heck, even mid-range cards? I don't know; I think Mantle will prove to be more hype than anything else, seeing that shareholders of a company would rather see the cash inflows from selling overpriced cards than AMD playing the white knight in shining armour and actually doing the right thing for gamers (I say that because, while I'm an NVIDIA fanboy -it's true- I still think Mantle would be a step in the right direction). Boardroom chatter always wins in the end.


----------



## W1zzard (Jan 14, 2014)

omnimodis78 said:


> The question has to be asked: if Mantle is the second coming of Christ, won't that basically negate the need for high-end, heck, even mid-range cards? I don't know; I think Mantle will prove to be more hype than anything else, seeing that shareholders of a company would rather see the cash inflows from selling overpriced cards than AMD playing the white knight in shining armour and actually doing the right thing for gamers (I say that because, while I'm an NVIDIA fanboy -it's true- I still think Mantle would be a step in the right direction). Boardroom chatter always wins in the end.



Look at it like this: Mantle will help to increase your GPU load, if your GPU load is below 100% (limited by CPU/API). It won't do anything if GPU load is already at 100%
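W1zzard's point can be sketched as a toy frame-time model (all numbers below are hypothetical; it assumes each frame costs whichever of the CPU or GPU work takes longer):

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Toy model: frame time is the slower of CPU (driver/API) work and GPU work."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound: the GPU needs 8 ms per frame but the CPU/API layer takes 14 ms,
# so the GPU sits below 100% load. Cutting CPU cost by 45% helps a lot.
print(fps(14.0, 8.0))          # ~71 fps
print(fps(14.0 * 0.55, 8.0))   # 125 fps, now GPU-bound

# GPU-bound: the GPU is already at 100% load; cutting CPU cost changes nothing.
print(fps(4.0, 10.0))          # 100 fps
print(fps(4.0 * 0.55, 10.0))   # still 100 fps
```

The model is a simplification (real pipelines overlap CPU and GPU work), but it shows why a leaner API only moves the needle when the CPU side is the limiter.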


----------



## john_ (Jan 14, 2014)

omnimodis78 said:


> The question has to be asked: if Mantle is the second coming of Christ, won't that basically negate the need for high-end, heck, even mid-range cards? I don't know; I think Mantle will prove to be more hype than anything else, seeing that shareholders of a company would rather see the cash inflows from selling overpriced cards than AMD playing the white knight in shining armour and actually doing the right thing for gamers (I say that because, while I'm an NVIDIA fanboy -it's true- I still think Mantle would be a step in the right direction). Boardroom chatter always wins in the end.



Think with 4K in mind not 1080p. Nvidia will need faster hardware to fight AMD if Mantle is a success. Also at 4K Intel can't follow.


----------



## Batou1986 (Jan 14, 2014)

just like crossfire = up to a 100% performance increase


----------



## EpicShweetness (Jan 14, 2014)

john_ said:


> Think with 4K in mind not 1080p. Nvidia will need faster hardware to fight AMD if Mantle is a success. Also at 4K Intel can't follow.


 
I was going to say: think of Mantle as a way to drive 4K. Or, with 1080p being on everything nowadays, a way to give you content at that resolution with less oomph; in other words, less power, more compact hardware, a smaller form factor, the same 1080p wonder show!
I think if you have 1080p and an R9 280X, Mantle is nothing new or impressive, because it will just give you those useless 100+ FPS; but if you have 1080p and a 7770, or you get a brand-new Kaveri APU, you're going to see the biggest improvement. 21 FPS + 45% = playable. Awesome!
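The back-of-the-envelope math in that last line checks out (the 21 fps low-end figure is the post's own hypothetical):

```python
low_end_fps = 21.0
with_mantle = low_end_fps * 1.45  # apply the claimed 45% uplift
print(with_mantle)  # ~30.45, roughly the 30 fps "playable" threshold
```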


----------



## GLD (Jan 14, 2014)

I WISH A 280X STILL ONLY COST $299!!! Damn miners have driven the prices up a hundred dollars.


----------



## omnimodis78 (Jan 14, 2014)

It will be interesting to come back to this in a year or so, and see who was right or wrong, and to what degree.  I sincerely hope Mantle won't be a flop, but I also eye any of AMD's claims with suspicious eyes.  Time will tell.


----------



## okidna (Jan 14, 2014)

EpicShweetness said:


> I think if you have 1080p and an R9 280X, Mantle is nothing new or impressive, because it will just give you those useless 100+ FPS; but if you have 1080p and a 7770, or you get a brand-new Kaveri APU, you're going to see the biggest improvement. 21 FPS + 45% = playable. Awesome!



I think it's the other way around.

High resolution (say 1080p) with weak GPUs (7700 series, R7 series, or an APU) = not much performance improvement, because a weaker GPU is already used close to its full potential (already in a high GPU-usage state).

Meanwhile, at 1080p (or lower) with a powerful or even overkill GPU (7900 series, R9 280/290 series, or a CF setup), chances are you'll be running at medium or even low GPU usage in most games. This is where Mantle could help, pushing your GPU usage as close as possible to 100% = much more performance improvement.

BTW, I'm using W1zzard's logic here :



> Look at it like this: Mantle will help to increase your GPU load, if your GPU load is below 100% (limited by CPU/API).
> It won't do anything if GPU load is already at 100%



Please correct me if I'm wrong.


----------



## librin.so.1 (Jan 14, 2014)

john_ said:


> Mantle is necessary for AMD. Going the route of OpenGL only and/or DirectX is just butter on Nvidia's bread, [...]


Still no excuse to leave their other APIs in a bad state (or in case of OpenGL, in a horrible state).



john_ said:


> It would be stupid not to use it, especially when you have a competitor that makes whatever it creates proprietary and locked.


I don't get it.



okidna said:


> BTW, I'm using W1zzard's logic here :
> 
> 
> 
> ...



That sure is right!
Actually, from a conversation I had with a friend recently:


> no matter what API anyone would think up, the GPU won't magically get more GFLOPS


----------



## Serpent of Darkness (Jan 14, 2014)

btarunr said:


> In what could explain AMD's move to include copies of one of the most GPU-intensive games with its new A-Series APUs, the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL, "enables up to 45 percent faster performance" than DirectX in Battlefield 4, the only known game with planned support for Mantle, and one of the most popular PC games of the season.



1.  Where you say "the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL," this statement is not completely accurate. D3D and OpenGL are high-level shading language APIs. AMD Mantle is not an HSL API; it is a CPU-GPU optimization API with additional perks. So you can't really say they are the same, and you can't claim it is AMD's rival API when AMD hasn't released or announced an HSL version.

2.  The 45% isn't 45% for all setups. It's 45% for an APU setup, mainly the Kaveri APU. It could be less than 45% with Richland or below. There's also a possibility it could be higher with Bulldozer, Haswell, Sandy Bridge, Ivy Bridge, Ivy Bridge-E, Haswell-E, etc...

The Star Swarm demo showed the game without AMD Mantle running at roughly 20 ms per frame, or about 50 FPS. With AMD Mantle, the demo was running at 2 to 6 ms per frame, or roughly 200 FPS.
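For reference, the frame-time-to-FPS conversion used above is just a reciprocal (values taken from the post):

```python
def frame_time_ms_to_fps(ms: float) -> float:
    """Convert a per-frame cost in milliseconds to frames per second."""
    return 1000.0 / ms

print(frame_time_ms_to_fps(20.0))  # 50.0 fps without Mantle
print(frame_time_ms_to_fps(5.0))   # 200.0 fps, the middle of the quoted 2-6 ms range
```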

If you watch the following video, I believe there is some accuracy to this to an extent.  It could be fake.  Who knows for sure at this current time.










I suspect there is a group of people testing the AMD Mantle beta with BF4, and this person is one of them. When you watch it, note that towards the end of the video this person is using GPU-Z. He hints at two things. One, I believe he is using a Haswell setup; GPU-Z shows the integrated Intel HD Graphics 4000. Two, he's got an R9-series card for his discrete graphics. Since Haswell has 4 cores at a higher core frequency, it's possible that FPS will go up based on the number of cores your CPU has and its core frequency.

Since AMD Mantle requires a driver on the user's end, the person in the video enabled AMD Mantle in the AMD Catalyst client. So, in my opinion, it's looking less fake.

If 45% is what AMD Mantle offers at 162 to 200 FPS, then the Kaveri APU alone is pushing somewhere around 113 FPS without AMD Mantle. From the video, 300 to 400 FPS is more than twice that. Now if you increase the number of cores on the CPU, this 45% will probably start to show some form of diminishing returns, but the FPS will go up. Why is that? Well, for a start, AMD Mantle, for lack of a better term, redirects commands for the GPU through the other cores, thus reducing the CPU bottleneck occurring on the first core.
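The baseline implied by an "up to 45%" claim can be recovered by dividing out the gain (a sketch using the post's own speculative numbers):

```python
def baseline_fps(observed_fps: float, gain: float) -> float:
    """FPS before an 'up to gain' improvement, given the FPS observed after it."""
    return observed_fps / (1.0 + gain)

print(round(baseline_fps(162.0, 0.45)))  # 112 fps, close to the ~113 quoted above
print(round(baseline_fps(200.0, 0.45)))  # 138 fps
```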

What's the point in having such a high FPS? If you look at it in the context of benchmarking video graphics cards, high FPS in game A, B, C, D, etc. is the x-factor. If AMD can pair their products with AMD Mantle, they can push higher fps in top-selling PC games. Higher fps equates to higher popularity versus brand B's graphics card, and revenue goes up as consumers purchase "higher-performing" products. Marketing of graphics cards is heavily dependent on third-party benchers like TechPowerUp...


----------



## john_ (Jan 14, 2014)

Vinska said:


> Still no excuse to leave their other APIs in a bad state (or in case of OpenGL, in a horrible state).


Is AMD's DirectX support in a bad state? How is that? And how horrible is OpenGL?



> I don't get it.


What is it that you don't understand?


----------



## natr0n (Jan 14, 2014)

Mantle needs less pillow talk and more action.


----------



## arbiter (Jan 14, 2014)

From what I've seen of AMD, they tend to overstate what their products can do and claim they are faster than they really are. Case in point: at a recent event AMD claimed their 8350 CPU was comparable to the i7 4770K, yet if you look at benchmarks, their 9590 is still slower even though it's overclocked ~20%. There have been several cases of that kind of stuff over the years, so I wouldn't believe their claim of 45% for a second until a third-party reviewer comes out with results.


----------



## EpicShweetness (Jan 14, 2014)

okidna said:


> I think it's the other way around.
> 
> Please correct me if I'm wrong.


 
I see what you're saying; I was thinking about how Mantle is a different API. Yes, you can't magically make a GPU work harder, but what if you change the rules on how it is told to think? Mantle might increase frame rates simply because, compared to DirectX, it can better utilize the resources given to it. So yes, if 100% load truly means not much more can be had, because the API is already utilizing everything it can and Mantle would just do the same thing, then by that logic I believe you are correct. But you can see my point too, right?


----------



## xtremesv (Jan 15, 2014)

Remember 3dfx Glide? Well it's not hard to guess how Mantle could end.


----------



## LAN_deRf_HA (Jan 15, 2014)

I'm still hoping Mantle does so well that it forces NVIDIA to get serious and release GM110 as the lead Maxwell chip, instead of that midrange-masquerading-as-high-end BS they pulled with Kepler. It could be the biggest generational performance jump in a long time.


----------



## Dent1 (Jan 15, 2014)

arbiter said:


> From what I've seen of AMD, they tend to overstate what their products can do and claim they are faster than they really are. Case in point: at a recent event AMD claimed their 8350 CPU was comparable to the i7 4770K, yet if you look at benchmarks, their 9590 is still slower even though it's overclocked ~20%. There have been several cases of that kind of stuff over the years, so I wouldn't believe their claim of 45% for a second until a third-party reviewer comes out with results.




AMD didn't make that claim, because the FX 8350 was released a YEAR BEFORE the i7 4770K. So what you said can't be true.


----------



## Nihilus (Jan 15, 2014)

Damn, this is getting exciting. The haters will hate right up until the full release. Mantle could enable an APU system to be more powerful than a PS4/XB1. APU systems will benefit the most.


----------



## happita (Jan 15, 2014)

xtremesv said:


> Remember 3dfx Glide? Well it's not hard to guess how Mantle could end.


 
No need to troll there, big guy. You're just regurgitating what pretty much every skeptic has already said about Mantle. Try contributing something "new" to this discussion, because it gets boring hearing the same garbage over and over.

Mantle will be out by the end of the month. Yes, they delayed it from last month, but shit happens, YES, even in the tech world, where we don't always get what we are promised. (Mommy, if I'm good, can I go to the arcade? Yes honey, sure.....)
The whole fiasco with BF4 being a technical mess is obvious, and it has halted AMD's progress in implementing Mantle in a timely fashion. It looks like EA/DICE has been hard at work fixing up BF4 good and proper, and once they feel it is on par with their and our expectations, I think Mantle will be ready to go.


----------



## Steevo (Jan 15, 2014)

darkangel0504 said:


> Sad when it doesn't work for HD 6000


Sad that technology progresses? Or sad that it's two generations old and new technology requires....new technology? I guess we should all be angry it isn't going to be supported on Windows 98SE, 'cause that was awesome, and I had a great time playing games on that OS, and since it doesn't need DX.........


----------



## RejZoR (Jan 15, 2014)

xtremesv said:


> Remember 3dfx Glide? Well it's not hard to guess how Mantle could end.



Glide died because 3dfx went bust through bad manufacturing and business decisions, not because Glide was crap in itself. All the bad decisions took a toll on development, so the hardware started lagging behind competitors' and Glide made less and less sense, especially since Direct3D was gaining popularity back then.

I don't care if the gains won't be exactly 45%; I still know they will be huge, knowing the performance of other such APIs. I still have great memories of S3 MeTaL, which turned my poor S3 Savage3D into something that could compete with a Riva TNT2 running D3D back then. It was crazy playing all the Unreal Engine games at insane framerates just because of that API.

Now that I think of it, maybe someone could write a Mantle renderer for the older Unreal games  Imagine UT99 at 50,000 fps


----------



## Prima.Vera (Jan 15, 2014)

To the guys saying you only need Mantle at 1440p or above, since at 1080p you already get more than 60 fps with the newer cards: I double-dare them to run any of those games at full detail with 8xSSAA and Edge Detect, and post the fps. I triple-dare them.
Btw, does anyone remember The Witcher 2 with Ubersampling? Try that with 8xAA at 1080p 
Cheers.


----------



## Mussels (Jan 15, 2014)

45% on an A10 CPU, with a 290X.


So umm, with all the people here acting like experts on Mantle, why has no one pointed out the obvious?

We won't get faster FPS on high-end systems; it's systems with an 'average' or weak CPU that see the greatest gains.

Pair a 290X with a 'slow' multi-core CPU and you'll see the greatest gains.


----------



## Prima.Vera (Jan 15, 2014)

^-- you have a point...


----------



## xorbe (Jan 15, 2014)

If fps goes up 45%, won't GPU power usage go up 45%?


----------



## Nihilus (Jan 15, 2014)

xorbe said:


> If fps goes up 45%, won't GPU power usage go up 45%?



No, you can't give more than 100%, despite what your little league coach told you. Mantle makes the GPU more efficient. Think Xbox 360 vs. ATI X1950: similar hardware, but you would not be able to run what an Xbox 360 can on an ATI X1950 card.


----------



## Frick (Jan 15, 2014)

arbiter said:


> From what i seen of AMD they tend to over state what their products can do. Claim they are faster then they really are. Case in point a recent even AMD claimed their 8350 cpu was comparable to i7 4770k, which if look at benchmarks their 9590 is still slower even though its overclocked ~20%.  Been several cases of that kinda stuff over the years as well, So their claim of 45% i wouldn't believe that for a sec til a 3rd party reviewer comes out with results.



For one thing, it's cherry-picked. Under specific circumstances an 8350 would be as fast as an i7. They never speak in general terms. I'm quite confident that 45% number is true; the question is a) the circumstances, and b) 45% over what _exactly_?


----------



## Dent1 (Jan 15, 2014)

xorbe said:


> If fps goes up 45%, won't GPU power usage go up 45%?



No, because frame rate and power consumption don't increase together linearly at the same rate.

Generally speaking, to increase the FPS the hardware needs to work harder, which can increase power consumption slightly, but it's not a 1:1 ratio.




Frick said:


> For one thing, it's cherry-picked. Under specific circumstances an 8350 would be as fast as an i7. They never speak in general terms. I'm quite confident that 45% number is true; the question is a) the circumstances, and b) 45% over what _exactly_?



What you said is 100% true, but I don't think AMD made such claims to begin with. The i7 4770K was released a year after the FX 8350, so it would be impossible for AMD to make that claim. So either arbiter is misinformed or lying. He seems like a decent gentleman, so I'm going to say misinformed.


----------



## TheHunter (Jan 15, 2014)

I would say no difference in power consumption.

For example, 3DMark 11: the GPU works at 100% and 250 W TDP. Now imagine you remove some API draw-call overhead that is stalling the driver and make it more efficient. It spends less time on driver <> API communication/calculations and uses that extra time for more rendering power.
It would still run at the same 100% GPU usage and 250 W TDP.

Actually, I think it should be lower, since GPU shader efficiency rises; kind of like PSU efficiency, 80 Plus vs. 80 Plus Titanium at the same wattage.


----------



## xorbe (Jan 16, 2014)

TheHunter said:


> I would say no difference in power consumption. For example, 3DMark 11: the GPU works at 100% and 250 W TDP. Now imagine you remove some API draw-call overhead that is stalling the driver and make it more efficient. It spends less time on driver <> API communication/calculations and uses that extra time for more rendering power. It would still run at the same 100% GPU usage and 250 W TDP. Actually, I think it should be lower, since GPU shader efficiency rises; kind of like PSU efficiency, 80 Plus vs. 80 Plus Titanium at the same wattage.



I am quoting this in case I need to refer to it later.


----------



## Aquinus (Jan 16, 2014)

It seems to me that AMD wants games to run better on machines with more cores. If what @W1zzard said is correct, I'm sure that this is the case. It's more about letting CPUs scale to the GPUs, because GPUs are already pretty good at what they do. I wonder how Mantle scales in CFX as opposed to using just your typical Direct3D libraries.

As many have said before, I would love some real numbers instead of this "up to 45%" garbage.


----------



## Melvis (Jan 16, 2014)

john_ said:


> Think with 4K in mind not 1080p. Nvidia will need faster hardware to fight AMD if Mantle is a success. Also at 4K Intel can't follow.



Mantle is open source, Nvidia can implement it if they like also.


----------



## Mussels (Jan 16, 2014)

Frick said:


> For one thing, it's cherry-picked. Under specific circumstances an 8350 would be as fast as an i7. They never speak in general terms. I'm quite confident that 45% number is true; the question is a) the circumstances, and b) 45% over what _exactly_?



They said: a 290X with an A10 APU.


All this does is remove the CPU bottleneck, like I said on the last page as well. If you have a high-end GPU and a midrange CPU, you'll see massive gains.

If you're in an RTS game that is always CPU-limited, you'll see massive gains.

The common denominator here is: if your CPU is the bottleneck, you'll see performance gains. If you aren't bottlenecked and you use Vsync, you'll just save on wattage and heat.


----------



## phanbuey (Jan 16, 2014)

Great news! I'm hoping it gets wide adoption and then forces NV to respond. 4K graphics for the masses!!!


----------



## leeb2013 (Jan 16, 2014)

Mussels said:


> 45% on an A10 CPU, with a 290x.
> 
> 
> so umm, with all the people here who are acting like experts on mantle, why has no one pointed out the obvious?
> ...



So pair an AMD 290X with an AMD CPU, and AMD Mantle will produce some gains. I see what they're doing there.


----------



## Relayer (Jan 16, 2014)

Vinska said:


> I, personally, would love it if they put more effort in their drivers and especially on optimizing their OpenGL[1], which at the moment, unlike on the green team, is Mjr. Balle de Sukke on their drivers. ...And would drop CCC, too. Instead of trying to "win" things over with this Mantle of theirs, which, I believe, can just create more of consumer-unfriendly segregation. And we already have too much of that.
> 
> *[1] *as mainly a Linux user, I care about that a lot.


Since Mantle isn't tied to DX, you will likely see it on Linux as well. Just be patient while they take care of the big dog (Windows) first. Also, Mantle will remove the game-by-game dependency on driver optimizations, as the devs will be able to optimize everything from their end.



omnimodis78 said:


> The question begs to be asked, if Mantle is the second coming of Christ, won't that basically completely and utterly negate the need for high, heck, even mid-range cards?  I don't know, I think Mantle will prove to be more hype than anything else, seeing that shareholders of a company would rather see the cash inflows from selling overpriced expensive cards, than AMD being the white knight in shining armour and actually doing the right thing for gamers (I say that because while I'm an NVIDIA fanboy -it's true- I still think that Mantle would be a step towards the right direction).  Boardroom chatter always wins in the end.


They are looking at the bigger picture: selling APUs, especially in cheap gaming laptops. I'm sure there will still be high-end features that require mega computing power, but this should dramatically expand their customer base.

There's also hi-res Eyefinity and (especially) 4K monitors, where their solutions should be far more affordable, moving what has been reserved for the lunatic fringe closer to the mainstream. A pair of 290s ($800 once the mining craze subsides) should be able to match the gaming experience of tri-SLI 780 Tis for a fraction of the price. Pair that with one of the cheaper 4K monitors on the horizon and a "cheap" (by Intel standards) $150 8-core AMD CPU, and you'll have gaming performance that last year everyone assumed would be unaffordable to most.



W1zzard said:


> Look at it like this: Mantle will help to increase your GPU load, if your GPU load is below 100% (limited by CPU/API). It won't do anything if GPU load is already at 100%


I'm not so sure you are correct (if I may be so bold ). If you look at the latest Star Swarm demo, they were dramatically changing performance by adding and subtracting IQ settings. They toggled motion blur (multi-sampled motion blur, I think it was called: a truer motion-blur effect done by rendering the frame multiple times rather than simply adding a filter effect) on and off, and FPS in DX went from playable to a slideshow while Mantle was still playable. I realize that's only a single example and doesn't mean other IQ effects/settings will behave the same, but I'm assuming that since it's a tech demo they simply chose something that would be easy to implement and demonstrate. I've also seen it reported that the AA penalty will be drastically reduced with Mantle, mainly because DX has huge AA overhead. I know it's early days and none of this proves anything conclusively, but it does look promising.

I really don't think the devs would be as excited as they are (genuinely excited, I believe) if it were only going to reduce CPU bottlenecks.



Serpent of Darkness said:


> 1.  Where you say "the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL," this statement is not completely accurate.  D3D and OpenGL are High Shader Language APIs.  AMD Mantle is not an HSL API.  It is a CPU-GPU optimization API with additional perks.  So you can't really say they are the same, and you can't claim it is AMD's rival API when AMD hasn't released or announced an HSL version.
> 
> 2.  The 45% isn't 45% for all setups.  It's 45% for an APU setup, mainly the Kaveri APU.  It could possibly be less than 45% with Richland or below.  There's also a possibility it could be higher with Bulldozer, Haswell, Sandy Bridge, Ivy Bridge, Ivy Bridge-E, Haswell-E, etc...
> 
> ...


 
I'm pretty sure _Johan_ Andersson said it was fake. I can't find the Tweet ATM.



Steevo said:


> Sad when technology progresses? Or sad that it's two generations old and new technology requires... new technology? I guess we should all be angry it isn't going to be supported on Windows 98SE, 'cause that was awesome, and I had a great time playing games on that OS, and since it doesn't need DX.........


I didn't get the impression he was hating on it because it didn't work on the 6000 series, just that he wished it did. Hopefully it means M$ won't be able to hold us hostage to buy their latest OS if we want the latest gaming features.

*Repeat after me...... "I will use the edit and quote buttons so that I don't double, triple, quadruple, or quintuple post."*


----------



## john_ (Jan 16, 2014)

Melvis said:


> Mantle is open source; Nvidia can implement it too if they like.



I don't expect them to do it. If Maxwell is a much, much better chip than Kepler, they will just wage a little price war, offering much better DirectX performance than AMD at the same price points. So in the end, for example, you will have to choose between being 10-30% faster in games that support Mantle with an AMD card, or 10-20% faster in games that don't support Mantle with an Nvidia card. AMD must be better in hardware too, to force Nvidia to support Mantle.


----------



## Relayer (Jan 16, 2014)

The only way nVidia supports Mantle is if it becomes an open standard, which I believe AMD would be willing to do. Too dangerous for them to rely on an API their main competitor controls. It would be like AMD adding PhysX to their feature stack and being at the mercy of nVidia to not make it run like crap on AMD's hardware.


----------



## arbiter (Jan 16, 2014)

Dent1 said:


> AMD didn't make that claim, because the FX 8350 was released a YEAR BEFORE the i7 4770K. So what you said can't be true.



they didn't?




Relayer said:


> The only way nVidia supports Mantle is if it becomes an open standard, which I believe AMD would be willing to do. Too dangerous for them to rely on an API their main competitor controls. It would be like AMD adding PhysX to their feature stack and being at the mercy of nVidia to not make it run like crap on AMD's hardware.



AMD already said it was open, but as for Nvidia ever using it: doubtful. On principle alone, they won't.




john_ said:


> I don't expect them to do it. If Maxwell is a much, much better chip than Kepler, they will just wage a little price war, offering much better DirectX performance than AMD at the same price points. So in the end, for example, you will have to choose between being 10-30% faster in games that support Mantle with an AMD card, or 10-20% faster in games that don't support Mantle with an Nvidia card. AMD must be better in hardware too, to force Nvidia to support Mantle.



As I said, Nvidia won't on principle alone, but Mantle still has a ton to prove. Is it really as fast as AMD claims? And one thing I've been vocal about: since it does have low-level hardware access, what kind of stability issues will come into play? Windows back in the '90s used to allow direct hardware access to everything, and that wasn't so good.


----------



## W1zzard (Jan 16, 2014)

Relayer said:


> I'm not so sure you are correct (if I may be so bold )


Please be, always 



> . If you look at the latest swarm demo they were dramatically changing performance by adding and subtracting IQ settings. They toggled motion blur (multi sampling motion blur I think it was called? A truer motion blur effect that's done by rendering the frame multiple times rather than simply adding a filter effect.) on and off and FPS in DX went from playable to slideshow while Mantle was still playable. I realize that's only a single example and doesn't mean there will be other IQ effects/settings that will have the same effect, but I'm assuming that since it's a tech demo they simply chose something that would be easy to implement and demonstrate.


I would assume that the way the DX renderer renders the motion blur introduces a bottleneck in either the DirectX API or the CPU, which goes away when running Mantle. So in the non-Mantle example the GPU was most certainly not running at 100%, while with Mantle GPU load is much higher, resulting in higher FPS.

If they implemented the DirectX motion blur naively, then this comes as no surprise. If you render and then copy the rendered frame back to the CPU, it will stall the whole GPU pipeline while the copy is in progress (for more technical info: https://www.google.com/#q=getrendertargetdata+slow)
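To make that concrete, here is a rough D3D9-style pseudocode sketch of the naive pattern (`GetRenderTargetData` is the real D3D9 call; `RenderScene`, `BlendIntoAccumulationBuffer`, and the surface setup are hypothetical helpers for illustration):

```
// Naive multi-pass motion blur: render the frame, then synchronously
// copy it back to system memory for CPU-side accumulation.
device->SetRenderTarget(0, gpuTarget);
RenderScene();

// GetRenderTargetData() is synchronous: the driver must flush and wait
// for every queued command to finish before the copy can begin, so the
// whole GPU pipeline stalls here while the CPU holds the frame.
device->GetRenderTargetData(gpuTarget, sysmemSurface);  // <-- stall
BlendIntoAccumulationBuffer(sysmemSurface);             // CPU busy, GPU idle
```

A blur done entirely on the GPU (accumulating into a second render target, or rendering multiple sub-frame passes) would avoid the readback and keep the pipeline fed, which would be one way the same effect could stay cheap under a lower-overhead API.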


----------



## Frick (Jan 16, 2014)

Mussels said:


> They said a 290X with an A10 APU.



Ah ok. But still, what games, what settings, what resolution, what levels did they use etc.



arbiter said:


> they didn't?



That's Mantle PR, which is extremely different from what you were on about in your first post.


----------



## ensabrenoir (Jan 16, 2014)

AMD... AMD... doing great things so far... hope this manifests as you claim. Don't let yourself get Bulldozer-ed again by all the hype.


----------



## Dent1 (Jan 16, 2014)

arbiter said:


> From what I've seen of AMD, they tend to overstate what their products can do and claim they are faster than they really are. Case in point: at a recent event AMD claimed their 8350 CPU was comparable to the i7 4770K, while if you look at benchmarks their 9590 is still slower even though it's overclocked ~20%. There have been several cases of that kind of stuff over the years, so their claim of 45% I wouldn't believe for a sec until a third-party reviewer comes out with results.






arbiter said:


> they didn't?
> 
> 
> 
> ...



But that is based on performance linked with Mantle. In your original post you didn't mention Mantle, which gave the impression AMD made that statement a year prior to the 4770K's release, or recently but at random.

There is a Mantle seminar video presenting the Mantle API. They managed to make rendering almost solely GPU-bound. They said that when underclocking the FX 8350 to 2GHz it performs the same, because the GPU is in control. I can't remember the exact time, but I found the video; it's worth watching throughout if you haven't already seen it.


----------



## Relayer (Jan 17, 2014)

arbiter said:


> they didn't?


This is while running Mantle, which makes use of all 8 AMD cores, whereas DX doesn't, and so lets the higher IPC of the 4770 shine.






> AMD already said it was open but as for nvidia ever using it is doubtful just on principle alone they won't.



Having it be open but still controlled by AMD wouldn't work. It would still be AMD/GCN-centric. What I'm talking about, and I might not have presented it right, is it being an open standard with a body of multiple contributors controlling it. Possibly AMD-nVidia-Intel-M$-etc., so they could all have input to cater to their own needs.






> As i said nvidia won't on principle alone, but mantle still has a ton to prove. Is it really as fast as amd claims and one thing i been vocal about is since it does have low level hardware access what kinda stability issues will come in to play with that. Windows back in 90's used to be direct hardware axx to everything and well that wasn't so good.



If it gets used in enough high profile games and nVidia gets their butts handed to them because of it, they might be forced to, like it or not. Stability is _supposed_ to be improved because the devs can optimize their code so much better. Nothing's proven as of yet, though.


----------



## Relayer (Jan 17, 2014)

W1zzard said:


> Please be, always
> 
> 
> I would assume that the way the DX renderer renders the motion blur introduces a bottleneck in either the DirectX API or the CPU, which goes away when running Mantle. So in the non-Mantle example the GPU was most certainly not running at 100%, while with Mantle CPU load is much higher, resulting in higher FPS.
> ...


Well, technically speaking, I have no clue. I have no technical knowledge to call upon; I'm just trying to absorb as much info on it as I can and make sense of it. Typically, though, with mature drivers GPU usage is over 90%. I don't know why the devs would seem so genuinely excited about it if DX could already be optimized to provide over 90% of Mantle's performance. Hopefully we don't have too much longer to wait.

It seems like BF4 is clogging up everything while they wait for DICE to fix it. FWIU, DICE was given first-release rights to Mantle because of all the work they did developing and promoting it with AMD. It actually looks like Oxide could give us something more right now, but they have to wait for the BF4 patch.


----------



## xtremesv (Jan 17, 2014)

happita said:


> No need to troll there, big guy. You're just regurgitating what pretty much every skeptic has already said about Mantle. Try contributing something "new" to this discussion, because it just gets boring hearing the same garbage over and over.
> 
> Mantle will be out by the end of the month. Yes they have delayed it from last month, but shit happens, YES even in the tech world where we don't get what we are promised. (Mommy, if I'm good, can I go to the arcade? Yes honey, sure.....)
> The whole fiasco with BF4 being a technical mess is obvious and it has halted the progress of AMD implementing Mantle in a timely fashion. It looks like EA/DICE has been hard at work fixing up BF4 good and proper and as such I think once they feel it is on par with their and our expectations, Mantle will be ready to go.



I don't see how pointing out a fact can be trolling. Mantle has gathered this much attention because it coincided with the release of new-gen consoles, not because a revolutionary idea is being implemented for the first time. Forgive me if I don't have faith in Mantle the way you do.


----------



## NeoXF (Jan 19, 2014)

Batou1986 said:


> just like crossfire = up to a 100% performance increase


When in reality, it's actually just 80-90% on average... how horribly sick and false are AMD's claims on that, right? /sarcasm

FFS, BF > CoD any day, and thank god that also means it won't be released on a yearly cycle... but STILL, the messed-up timeframes, bugs galore, and delays of features that for some are almost paramount: it's just bad news, man.

One would expect AMD would be the one dropping the ball in this, but no, it's their partner(s). Anyway, just a few more days... I hope.



arbiter said:


> they didn't?


If anything, with that "they" admitted the FX 8350 is slower than the i7 4770K, BUT Mantle makes them even in these circumstances. Also, news flash: that's an *Oxide slide*, not an AMD one (though probably approved by them first). The sloppy way they typed in the model names kinda makes my eye twitch.

You kinda failed big time here. Seems you are filled with a lot of disdain towards AMD, did they happen to run over your childhood pet or something? Chill yo...


----------



## Nihilus (Feb 3, 2014)

RCoon said:


> The key words here are "up to".
> I will reservedly wait for actual factual benchmarks before I believe this.


Seen a review on Guru3D. It gets at least a 10% boost in every situation, so not bad.



erocker said:


> The 45% boost most likely goes to the APUs.


APUs seemed to get the smallest gains, but 10% is still a nice boost. The real focus was eliminating CPU overhead. The greatest gains were seen with a good graphics card and a slower CPU. PCs can get gaming efficiency that is more like that of consoles.
In a way, this actually hurts the APU. For the price of a 7850K you can get an R250 and a cheapo Intel CPU; the latter system offers more flexibility for upgrades.


----------



## leeb2013 (Feb 3, 2014)

*Mantle Enables Significant Performance Improvement in Battlefield 4: AMD......*

no it doesn't, it causes BF4 to instantly crash when selecting graphics options

But I fixed that with help from the online community only to find.....

no it doesn't, performance doesn't improve for anything other than the new R9-xxx paired with a crappy AMD APU. Forget 7xxx-series GPUs and Intel CPUs... they may be optimised sometime... or may not. The driver will be beta forever, there will always be issues with DX9, CrossFire, and multiple displays, and it will never improve performance for anyone with an Intel CPU, because they want you to buy cheaper and inferior AMD CPUs, for which Mantle will improve AMD-sponsored games (i.e., just BF4).

On a positive note, the last 1GB update to BF4, which seemed only to check if you were running 13.12 drivers, not 13.11, and complain if you had 13.11 (even if you had 13.12, which AMD forgot to rename to 13.12, so BF4 still thought you had 13.11), causing you to lose your slot on the server whilst you found the dialogue box to select "yes please run BF4 even though I only have 13.11 (but actually 13.12)", plus changes to increase the number of crashes between rounds and a menu option for Mantle (which just detects if you have an Intel or AMD CPU and removes the artificial performance restriction if you have an AMD CPU and R9-xxx (really think BF4 needs 80% of an o/c i5 yet runs OK on XB1? Me neither)), at least it thinks I have 14.1 drivers so it complains no more, and neither shall I.


----------



## Nihilus (Feb 3, 2014)

that was epic


----------



## Wile E (Feb 3, 2014)

Cue the mass of anti-Intel programming conspiracies.


----------

