Tuesday, January 14th 2014

Mantle Enables Significant Performance Improvement in Battlefield 4: AMD

In what could explain AMD's move to include copies of one of the most GPU-intensive games with its new A-Series APUs, the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL, "enables up to 45 percent faster performance" than DirectX in Battlefield 4, the only known game with planned support for Mantle, and one of the most popular PC games of the season. AMD's claims are so tall that even a 512-stream-processor A10-7850K APU could offer acceptable frame rates at 1080p, while a $299 Radeon R9 280X could run circles around a $650 GeForce GTX 780 Ti in this particular game. If anything, it could help Battlefield 4 become a potent tech demonstrator for the API, selling it to the various game developers with which AMD has built strong relations.

74 Comments on Mantle Enables Significant Performance Improvement in Battlefield 4: AMD

#51
Dent1
xorbeif fps goes up 45% won't gpu power usage go up 45%
No, because frame rate and power consumption don't increase together linearly.

Generally speaking, to increase the FPS the hardware needs to work harder, which could increase power consumption slightly, but it's not a 1:1 ratio.
FrickFor one thing it's cherry picked. Under specific circumstances an 8350 would be as fast as an i7. They never speak in general terms. I'm quite confident that 45% number is true, the question is a) circumstances and b) 45% over what exactly?
What you said is 100% true, but I don't think AMD made such a claim to begin with. The i7 4770K was released a year after the FX 8350, so it would be impossible for AMD to make that claim. So either arbiter is misinformed or is lying. He seems like a decent gentleman, so I'm going to say misinformed.
#52
TheHunter
I would say no difference in power consumption.

For example, 3DMark 11:

The GPU works at 100% and 250 W TDP. Now imagine you remove some API draw calls that are stalling the driver and make it more efficient: it spends less time on driver <> API communication/calculations and uses that extra headroom for more rendering.
It would still run at the same 100% GPU usage and 250 W TDP.

Actually, I think it should be lower, since GPU shader efficiency rises, kinda like PSU efficiency, 80 Plus vs. 80 Plus Titanium at the same wattage.
#53
xorbe
TheHunterI would say no difference in power consumption. For example, 3DMark 11: the GPU works at 100% and 250 W TDP. Now imagine you remove some API draw calls that are stalling the driver and make it more efficient: it spends less time on driver <> API communication/calculations and uses that extra headroom for more rendering. It would still run at the same 100% GPU usage and 250 W TDP. Actually, I think it should be lower, since GPU shader efficiency rises, kinda like PSU efficiency, 80 Plus vs. 80 Plus Titanium at the same wattage.
I am quoting this in case I need to refer to it later.
#54
Aquinus
Resident Wat-man
It seems to me that AMD wants games to run better on machines with more cores. If what @W1zzard said is correct, I'm sure that this is the case. It's more about letting CPUs scale to the GPUs, because GPUs are already pretty good at what they do. I wonder how Mantle scales in CFX as opposed to using just your typical Direct3D libraries.

As many have said before, I would love some real numbers instead of this "up to 45%" garbage.
#55
Melvis
john_Think with 4K in mind not 1080p. Nvidia will need faster hardware to fight AMD if Mantle is a success. Also at 4K Intel can't follow.
Mantle is open source; Nvidia can implement it too if they like.
#56
Mussels
Freshwater Moderator
FrickFor one thing it's cherry picked. Under specific circumstances an 8350 would be as fast as an i7. They never speak in general terms. I'm quite confident that 45% number is true, the question is a) circumstances and b) 45% over what exactly?
They said: a 290X with an A10 APU.


All this does is remove the CPU bottleneck, like I said on the last page as well. If you have a high-end GPU and a midrange CPU, you'll see massive gains.

If you're in an RTS game that's always CPU limited, you'll see massive gains.

The common denominator here is that if your CPU is the bottleneck, you'll see performance gains. If you aren't bottlenecked and you use Vsync, you'll just save on wattage and heat.
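The bottleneck argument can be made concrete with a toy frame-time model (my own illustrative sketch, not anything from AMD or DICE): frame rate is capped by whichever stage is slower, so cutting per-frame CPU cost only helps when the CPU is the slower stage.

```python
def fps(cpu_ms, gpu_ms):
    # Toy model: a frame can't ship faster than the slower of the
    # CPU submission time and the GPU render time (per frame, in ms).
    return 1000.0 / max(cpu_ms, gpu_ms)

# CPU-bound: a fast GPU waits on a slow CPU, so halving CPU cost
# (roughly what lower draw-call overhead buys you) doubles the FPS.
print(fps(16.0, 8.0))  # 62.5 FPS
print(fps(8.0, 8.0))   # 125.0 FPS

# GPU-bound: the same CPU saving changes nothing.
print(fps(4.0, 20.0))  # 50.0 FPS
print(fps(2.0, 20.0))  # still 50.0 FPS
```

In this model a Vsync cap behaves the same way: the GPU just idles longer each frame, which is the wattage-and-heat saving described above.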
#57
phanbuey
Great news! I'm hoping it gets wide adoption and then forces NV to respond. 4K GFX for the masses!!!
#58
leeb2013
Mussels45% on an A10 CPU, with a 290x.


So umm, with all the people here who are acting like experts on Mantle, why has no one pointed out the obvious?

We won't get faster FPS on high-end systems; it's systems with an 'average' or weak CPU that see the greatest gains.

Pair a 290X with a 'slow' multi-core CPU and you'll see the greatest gains.
So pair an AMD 290X with an AMD CPU and AMD Mantle will produce some gains. I see what they're doing there.
#59
Relayer
VinskaI, personally, would love it if they put more effort in their drivers and especially on optimizing their OpenGL[1], which at the moment, unlike on the green team, is Mjr. Balle de Sukke on their drivers. ...And would drop CCC, too. Instead of trying to "win" things over with this Mantle of theirs, which, I believe, can just create more of consumer-unfriendly segregation. And we already have too much of that.

[1] as mainly a Linux user, I care about that a lot.
Since Mantle isn't tied to DX, you will likely see it on Linux as well. Just have to be patient while they take care of the big dog (Windows) first. Also, Mantle will remove the game-to-game dependency on driver optimizations, as the devs will be able to optimize everything from their end.
omnimodis78The question begs to be asked, if Mantle is the second coming of Christ, won't that basically completely and utterly negate the need for high, heck, even mid-range cards? I don't know, I think Mantle will prove to be more hype than anything else, seeing that shareholders of a company would rather see the cash inflows from selling overpriced expensive cards, than AMD being the white knight in shining armour and actually doing the right thing for gamers (I say that because while I'm an NVIDIA fanboy -it's true- I still think that Mantle would be a step towards the right direction). Boardroom chatter always wins in the end.
They are looking at the bigger picture: selling APUs, especially in cheap gaming laptops. I'm sure there will still be high-end features that require mega computing power, but this should dramatically expand their customer base.

There's also high-res Eyefinity and (especially) 4K monitors, where their solutions should be far more affordable, moving what has been reserved for the lunatic fringe closer to the mainstream. A pair of 290s ($800 once the mining craze subsides) should be able to match the gaming experience of tri-SLI 780 Tis for a fraction of the price. Pair that up with one of the cheaper 4K monitors on the horizon and a "cheap" (by Intel standards) $150 8-core AMD CPU, and you'll have gaming performance that last year everyone was assuming would be unaffordable to most.
W1zzardLook at it like this: Mantle will help to increase your GPU load, if your GPU load is below 100% (limited by CPU/API). It won't do anything if GPU load is already at 100%
I'm not so sure you are correct (if I may be so bold :)). If you look at the latest Star Swarm demo, they were dramatically changing performance by adding and subtracting IQ settings. They toggled motion blur (multi-sampled motion blur I think it was called? A truer motion blur effect that's done by rendering the frame multiple times rather than simply adding a filter effect) on and off, and FPS in DX went from playable to slideshow while Mantle was still playable. I realize that's only a single example and doesn't mean other IQ effects/settings will behave the same, but I'm assuming that since it's a tech demo they simply chose something that would be easy to implement and demonstrate. I've also seen it reported that the AA penalty will be drastically reduced with Mantle, mainly because DX has huge AA overhead. I know it's early days and none of this proves anything conclusively, but it does look promising.

I really don't think the devs would be as excited as they are (genuinely excited I believe) if it was only going to reduce CPU bottlenecks.
Serpent of Darkness1. Where you say "the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL," this statement is not completely accurate. D3D and OpenGL are High Shader Language APIs. AMD Mantle is not an HSL API. It is a CPU-GPU optimization API with additional perks. So you can't really say they are the same, and you can't claim that it is AMD's rival API when AMD hasn't released or announced an HSL version.

2. The 45% isn't 45% for all setups. It's 45% for an APU setup, mainly the Kaveri APU. There's a possibility it will be less than 45% with Richland or below. Also, there's a possibility that it could be higher with Bulldozer, Haswell, Sandy Bridge, Ivy Bridge, Ivy Bridge-E, Haswell-E, etc...

The Star Swarm demo showed the game without AMD Mantle running at roughly 20 ms per frame, or 50 FPS. With AMD Mantle, the demo was running at 2 to 6 ms, or roughly 200 FPS.

If you watch the following video, I believe there is some accuracy to this to an extent. It could be fake. Who knows for sure at this current time.


I suspect there is a group of people who are testing the AMD Mantle beta with BF4, and this person is one of them. Take into account, when you watch this, that towards the end of the video this person is using GPU-Z. He hints at two things. One, I believe he is using a Haswell setup; GPU-Z shows the integrated Intel HD Graphics 4000. Two, he's got an R9-series discrete graphics card. Since Haswell has 4 cores at a higher core frequency, it's possible that FPS performance will go up based on the number of cores your CPU has and its core frequency.

Since AMD Mantle requires a driver on the user's end, the person in the video enabled Mantle in the AMD Catalyst client. So, in my opinion, it's looking less fake.

If 45% is what AMD Mantle offers at 162 to 200 FPS, then the Kaveri APU alone is pushing somewhere around 113 FPS without AMD Mantle. From the video, 300 FPS to 400 FPS is more than twice that. Now if you increase the number of cores on the CPU, this 45% will probably start to show diminishing returns, but the FPS will go up higher. Why is that? Well, for a start, AMD Mantle, for lack of a better term, redirects commands for the GPU through the other cores, thus reducing the CPU bottleneck occurring on the first core.

What's the point in having such a high FPS? If you look at it in the context of benchmarking video graphics cards, high FPS performance in game A, B, C, D, etc. is the x-factor. If AMD can provide their products with AMD Mantle, they can push higher FPS in top-selling PC games. Higher FPS equates to higher popularity versus brand B's graphics card, and revenue returns go up as consumers purchase "higher-performing" products. Marketing of graphics cards is heavily dependent on third-party benchers like Techpowerup.com...
I'm pretty sure Johan Andersson said it was fake. I can't find the Tweet ATM.
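Whatever the video's authenticity, the frame-time arithmetic in the quoted post checks out. A back-of-the-envelope sketch (the 164 FPS figure is just an assumed example working the 45% claim backwards, not a measured number):

```python
def ms_to_fps(frame_ms):
    # Frame time in milliseconds -> frames per second.
    return 1000.0 / frame_ms

print(ms_to_fps(20))  # 50.0 FPS (20 ms/frame, the quoted DX figure)
print(ms_to_fps(5))   # 200.0 FPS (5 ms/frame, the Mantle ballpark)

# If ~164 FPS were the result of a 45% uplift, the baseline would be:
print(164 / 1.45)     # ~113 FPS, matching the quoted estimate
```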
SteevoSad when technology progresses? Or sad that it's two generations old and new technology requires....new technology. I guess we should all be angry it isn't going to be supported on Windows 98SE, cause that was awesome, and I had a great time playing games on that OS, and since it doesn't need DX.........
I didn't get the impression he was hating on it because it didn't work on 6000's, just wished it would. Hopefully it means M$ won't be able to hold us hostage to buy their latest OS if we want the latest gaming features.

repeat after me...... "I will use the edit and quote buttons so that I don't double, triple, quadruple and quintuple post" :)
#60
john_
MelvisMantle is open source, Nvidia can implement it if they like also.
I don't expect them to do it. If Maxwell is a much, much better chip than Kepler, they will just do a little price war, giving much better DirectX performance than AMD at the same price points. So in the end, for example, you will have to choose between being 10-30% faster in games that support Mantle with an AMD card or 10-20% faster in games that don't support Mantle with an Nvidia card. AMD must be better in hardware too to force Nvidia to support Mantle.
#61
Relayer
The only way nVidia supports Mantle is if it becomes an open standard, which I believe AMD would be willing to do. It's too dangerous for them to rely on an API their main competitor controls. It would be like AMD adding PhysX to their feature stack and being at the mercy of nVidia not to make it run like crap on AMD's hardware.
#62
arbiter
Dent1AMD didn't make that claim because the FX 8350 was released a YEAR BEFORE the i7 4770K. So what you said can't be true.


they didn't?
RelayerThe only way nVidia supports Mantle is if it becomes an open standard, which I believe AMD would be willing to do. Too dangerous for them to rely on an API their main competitor controls. It would be like AMD adding PhysX to their feature stack and being at the mercy of nVidia to not make it run like crap on AMD's hardware.
AMD already said it was open, but as for Nvidia ever using it, that's doubtful; on principle alone they won't.
john_I don't expect them do it. If Maxwell is a much much better chip than Kepler they will just do a little price war, giving much better DirectX performance than AMD at the same price points. So in the end, for example, you will have to choose between being faster by 10-30% in games that support Mantle with an AMD card or 10-20% faster in games that don't support Mantle with an Nvidia card. AMD must be better in hardware also to force Nvidia to support Mantle.
As I said, Nvidia won't on principle alone, but Mantle still has a ton to prove. Is it really as fast as AMD claims? And one thing I've been vocal about: since it has low-level hardware access, what kind of stability issues will come into play with that? Windows back in the '90s used to give direct hardware access to everything, and well, that wasn't so good.
#63
W1zzard
RelayerI'm not so sure you are correct (if I may be so bold :))
Please be, always :)
. If you look at the latest Star Swarm demo, they were dramatically changing performance by adding and subtracting IQ settings. They toggled motion blur (multi-sampled motion blur I think it was called? A truer motion blur effect that's done by rendering the frame multiple times rather than simply adding a filter effect) on and off, and FPS in DX went from playable to slideshow while Mantle was still playable. I realize that's only a single example and doesn't mean other IQ effects/settings will behave the same, but I'm assuming that since it's a tech demo they simply chose something that would be easy to implement and demonstrate.
I would assume that the way the DX renderer renders the motion blur introduces a bottleneck in either the DirectX API or the CPU, which goes away when running Mantle. So in the non-Mantle example the GPU was most certainly not running at 100%, while with Mantle CPU load is much higher, resulting in higher FPS.

If they naively implemented the DirectX motion blur then this comes as no surprise. If you render, then copy the rendered frame back onto the CPU, it will stall the whole GPU pipeline while the copy is in progress (for more technical info: www.google.com/#q=getrendertargetdata+slow)
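The stall described here can be sketched with a toy timing model (purely illustrative, not real D3D timing): a synchronous readback forces the CPU to wait for the GPU each frame, serializing work that would otherwise overlap in the pipeline.

```python
def frame_time_ms(cpu_ms, gpu_ms, sync_readback):
    # Toy model: copying the rendered frame back to the CPU each frame
    # makes the CPU wait for the GPU, so the stage times add; without
    # the readback the stages pipeline and run concurrently, so the
    # frame time is just the slower of the two.
    return cpu_ms + gpu_ms if sync_readback else max(cpu_ms, gpu_ms)

print(frame_time_ms(6, 10, sync_readback=True))   # 16 ms -> ~62 FPS
print(frame_time_ms(6, 10, sync_readback=False))  # 10 ms -> 100 FPS
```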
#64
Frick
Fishfaced Nincompoop
Musselsthey said. a 290x with an A10 APU.
Ah ok. But still, what games, what settings, what resolution, what levels did they use etc.
arbiter

they didn't?
That's Mantle PR, which is extremely different from what you were on about in your first post.
#65
ensabrenoir
AMD... AMD.... doing great things so far.... hope this manifests as you claim. Don't let yourself get Bulldozer-ed again by all the hype.
#66
Dent1
arbiterFrom what I've seen of AMD, they tend to overstate what their products can do. Claim they are faster than they really are. Case in point: at a recent event AMD claimed their 8350 CPU was comparable to the i7 4770K, yet if you look at benchmarks their 9590 is still slower even though it's overclocked ~20%. There have been several cases of that kind of stuff over the years as well, so their claim of 45% I wouldn't believe for a sec till a 3rd-party reviewer comes out with results.
arbiter

they didn't?




AMD already said it was open, but as for Nvidia ever using it, that's doubtful; on principle alone they won't.


As I said, Nvidia won't on principle alone, but Mantle still has a ton to prove. Is it really as fast as AMD claims? And one thing I've been vocal about: since it has low-level hardware access, what kind of stability issues will come into play with that? Windows back in the '90s used to give direct hardware access to everything, and well, that wasn't so good.
But that is based on performance linked with Mantle. In your original post you didn't mention Mantle, which gave the impression AMD made that statement a year prior to the 4770K's release, or recently but randomly.

There is a seminar video presentation on the Mantle API. They managed to make rendering almost solely GPU bound. They said that when underclocking the FX 8350 to 2 GHz it performs the same, as the GPU is in control. I can't remember the exact time, but I found the video; it's worth watching throughout if you haven't already seen it.

#67
Relayer
arbiter

they didn't?
This is while running Mantle, which makes use of all 8 AMD cores where DX doesn't, and allows the higher IPC of the 4770 to shine.
AMD already said it was open but as for nvidia ever using it is doubtful just on principle alone they won't.
Having it be open but still controlled by AMD wouldn't work. It would still be AMD/GCN-centric. What I'm talking about, and I might not have presented it right, is it being an open standard with a body of multiple contributors controlling it. Possibly AMD-nVidia-Intel-M$-etc., so they could all have input to cater to their own needs.
As i said nvidia won't on principle alone, but mantle still has a ton to prove. Is it really as fast as amd claims and one thing i been vocal about is since it does have low level hardware access what kinda stability issues will come in to play with that. Windows back in 90's used to be direct hardware axx to everything and well that wasn't so good.
If it gets used in enough high profile games and nVidia gets their butts handed to them because of it, they might be forced to, like it or not. Stability is supposed to be improved because the devs can optimize their code so much better. Nothing's proven as of yet, though.
#68
Relayer
W1zzardPlease be, always :)


I would assume that the way the DX renderer renders the motion blur introduces a bottleneck in either the DirectX API or the CPU, which goes away when running Mantle. So in the non-Mantle example the GPU was most certainly not running at 100%, while with Mantle CPU load is much higher, resulting in higher FPS.

If they naively implemented the DirectX motion blur then this comes as no surprise. If you render, then copy the rendered frame back onto the CPU, it will stall the whole GPU pipeline while the copy is in progress (for more technical info: www.google.com/#q=getrendertargetdata+slow)
Well, technically speaking, I have no clue. ;) I have no technical knowledge to call upon. I'm just trying to absorb as much info on it as I can and make sense of it. Typically, though, with mature drivers GPU usage is usually 90%+. I don't know why the devs would seem so genuinely excited about it if DX could already be optimized to provide over 90% of Mantle's performance. Hopefully we don't have too much longer to wait.

It seems like BF4 is clogging up everything while they are waiting for DICE to fix it. FWIU DICE was given first-release rights to Mantle because of all the work they did developing and promoting it with AMD. It actually looks like Oxide could give us something more right now, but they have to wait for the BF4 patch.
#69
xtremesv
happitaNo need to troll there, big guy. You're just regurgitating what pretty much every skeptic has already said about Mantle. Try contributing something "new" to this discussion, because it just gets boring hearing the same garbage over and over.

Mantle will be out by the end of the month. Yes, they have delayed it from last month, but shit happens, YES, even in the tech world where we don't get what we are promised. (Mommy, if I'm good, can I go to the arcade? Yes honey, sure.....)
The whole fiasco with BF4 being a technical mess is obvious and it has halted the progress of AMD implementing Mantle in a timely fashion. It looks like EA/DICE has been hard at work fixing up BF4 good and proper and as such I think once they feel it is on par with their and our expectations, Mantle will be ready to go.
I don't see how pointing out a fact can be trolling. Mantle has gathered that much attention because it coincided with the release of the new-gen consoles, not due to a revolutionary idea being implemented for the first time. Forgive me if I don't have faith in Mantle the way you do.
#70
NeoXF
Batou1986just like crossfire = up to a 100% performance increase o_O
When in reality it's actually just 80-90% on average... how horribly sick and false are AMD's claims on that, right? /sarcasm

FFS, BF > CoD any day, but that also means (and thank god, it is the case as well) that it won't be released on a yearly cycle... but STILL they mess up the timeframes; bugs galore and delays of features that for some are almost paramount are just bad news, man.

One would expect AMD would be the one dropping the ball in this, but no, it's their partner(s). Anyway, just a few more days... I hope.
arbiter

they didn't?
If anything, with that "they" admitted that the FX 8350 is slower than the i7 4770K, BUT Mantle makes them even in these circumstances. Also, news flash: that's an Oxide slide, not an AMD one (tho probably approved by them first). The sloppy way they typed in the model names kinda makes my eye twitch.

You kinda failed big time here. Seems you are filled with a lot of disdain towards AMD, did they happen to run over your childhood pet or something? Chill yo...
#71
Nihilus
RCoonThe key words here are "up to".
I will reservedly wait for actual factual benchmarks before I believe this.
I've seen a review on Guru3D. It gets at least a 10% boost in every situation, so not bad.
erockerThe 45% boost most likely goes to the APU's.
APUs seemed to get the smallest gains, but 10% is still a nice boost. The real focus was eliminating CPU overhead. The greatest gains were seen with a good graphics card and a slower CPU. PCs can get gaming efficiency that is more like that of consoles.
In a way, this actually hurts the APU. For the price of a 7850K you can get an R7 250 and a cheapo Intel CPU; the latter system offers more flexibility for upgrades.
#72
leeb2013
Mantle Enables Significant Performance Improvement in Battlefield 4: AMD......

No, it doesn't; it causes BF4 to instantly crash when selecting graphics options.

But I fixed that with help from the online community, only to find.....

No, it doesn't; performance doesn't improve for anything other than the new R9 series with a crappy AMD APU. Forget 7xxx-series GPUs and Intel CPUs......may be optimised sometime.....or may not. The driver will be beta forever, there will always be issues with DX9, CrossFire and multiple displays, and it will never improve performance for anyone with an Intel CPU, because they want you to buy cheaper and inferior AMD CPUs, for which Mantle will improve AMD-sponsored games (i.e., just BF4).

On a positive note: the last 1 GB update to BF4 seemed only to check whether you were running the 13.12 drivers rather than 13.11, and complained if you had 13.11 (even if you actually had 13.12, which AMD forgot to rename, so BF4 still thought you had 13.11), causing you to lose your slot on the server while you found the dialogue box to select "yes, please run BF4 even though I only have 13.11 (but actually 13.12)". Add the changes that increased the number of crashes between rounds, and a menu option for Mantle (which just detects whether you have an Intel or AMD CPU and removes the artificial performance restriction if you have an AMD CPU and an R9-series card; really think BF4 needs 80% of an overclocked i5 yet runs OK on XB1? Me neither). At least it now thinks I have the 14.1 drivers, so it complains no more, and neither shall I.
#74
Wile E
Power User
Cue the mass of anti-Intel programming conspiracies. :rolleyes: