Tuesday, January 14th 2014
Mantle Enables Significant Performance Improvement in Battlefield 4: AMD
In what could explain AMD's move to include copies of one of the most GPU-intensive games with its new A-Series APUs, the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL, "enables up to 45 percent faster performance" than DirectX in Battlefield 4, the only known game with planned support for Mantle and one of the most popular PC games of the season. AMD's claims are so tall that even the 512 stream processor-laden A10-7850K APU could offer acceptable frame rates at 1080p, while a $299 Radeon R9 280X could run circles around a $650 GeForce GTX 780 Ti in this particular game. If anything, it could help Battlefield 4 become a potent tech demonstrator for the API, selling it to the many game developers with whom AMD has built strong relations.
Generally speaking, to increase FPS the hardware needs to work harder, which can increase power consumption slightly, but it's not a 1:1 ratio. What you said is 100% true, but I don't think AMD made such claims to begin with. The i7-4770K was released a year after the FX-8350, so it would be impossible for AMD to have made that claim. So either arbiter is misinformed or is lying. He seems like a decent gentleman, so I'm going to say misinformed.
For example, 3DMark 11: the GPU works at 100% usage and its 250 W TDP. Now imagine you remove some of the API call overhead that stalls the driver and make it more efficient; it spends less time on driver <> API communication/calculations and uses that extra time for more rendering work.
It would still run at the same 100% GPU usage and 250 W TDP.
Actually, I think it should be lower, since GPU shader efficiency rises; kind of like PSU efficiency, 80 Plus vs. 80 Plus Titanium at the same wattage.
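To put rough numbers on that idea, here's a toy model of per-draw-call driver overhead (all constants are invented for illustration; this is not real Direct3D or Mantle code, just the shape of the argument):

```cpp
// Toy model: each API draw call pays a fixed CPU-side "driver" cost before
// the GPU sees any work. Batching N objects into one submission pays that
// cost once. kDriverOverheadUs and kGpuWorkPerObjectUs are made-up numbers.
#include <cstdio>

constexpr double kDriverOverheadUs   = 25.0;  // hypothetical cost per API call
constexpr double kGpuWorkPerObjectUs = 2.0;   // hypothetical GPU time per object
constexpr int    kObjects            = 5000;

int main() {
    // Naive path: one draw call per object, so overhead scales with N.
    double naiveCpuUs   = kObjects * kDriverOverheadUs;
    // Batched path: one submission covers all N objects.
    double batchedCpuUs = 1 * kDriverOverheadUs;
    // GPU-side work is identical either way.
    double gpuUs = kObjects * kGpuWorkPerObjectUs;

    std::printf("naive:   CPU %.0f us + GPU %.0f us per frame\n", naiveCpuUs, gpuUs);
    std::printf("batched: CPU %.0f us + GPU %.0f us per frame\n", batchedCpuUs, gpuUs);
    return 0;
}
```

The GPU still sits at "100%" in both cases; the leaner path just stops burning CPU time in the driver, so more frames fit into each second, which is the efficiency gain described above.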
As many have said before, I would love some real numbers instead of this "up to 45%" garbage.
All this does is remove the CPU bottleneck, like I said on the last page as well. If you have a high-end GPU and a midrange CPU, you'll see massive gains.
If you're in an RTS game, where it's almost always CPU-limited, you'll see massive gains.
The common denominator here is that if your CPU is the bottleneck, you'll see performance gains. If you aren't bottlenecked and you use Vsync, you'll just save on wattage and heat.
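That point can be made concrete with a trivial frame-time model (all timings invented; a frame takes roughly as long as the slower of the CPU submit work and the GPU render work):

```cpp
// Minimal sketch of the bottleneck argument: frame time ~= max(CPU, GPU).
// An API that only shrinks CPU-side submission time raises FPS only when
// the CPU is the larger of the two. All timings below are made up.
#include <algorithm>
#include <cstdio>

double fps(double cpuMs, double gpuMs) {
    return 1000.0 / std::max(cpuMs, gpuMs);
}

int main() {
    // CPU-bound: midrange CPU (20 ms) feeding a high-end GPU (8 ms).
    std::printf("CPU-bound, DX-like:     %.1f fps\n", fps(20.0, 8.0));  // 50.0
    std::printf("CPU-bound, Mantle-like: %.1f fps\n", fps(8.0,  8.0));  // 125.0
    // GPU-bound: fast CPU (6 ms) with a slower GPU (16 ms).
    std::printf("GPU-bound, DX-like:     %.1f fps\n", fps(6.0, 16.0));  // 62.5
    std::printf("GPU-bound, Mantle-like: %.1f fps\n", fps(3.0, 16.0));  // 62.5
    return 0;
}
```

In the GPU-bound case, halving CPU time changes nothing, which is exactly the "you'll just save on wattage and heat" scenario.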
There's also high-res Eyefinity and (especially) 4K monitors, where their solutions should be far more affordable, moving what has been reserved for the lunatic fringe closer to the mainstream. A pair of 290s ($800 once the mining craze subsides) should be able to match the gaming experience of tri-SLI 780 Tis for a fraction of the price. Pair that with one of the cheaper 4K monitors on the horizon and a "cheap" (by Intel standards) $150 eight-core AMD CPU, and you'll have gaming performance that last year everyone assumed would be unaffordable to most.

I'm not so sure you are correct (if I may be so bold :)). If you look at the latest Star Swarm demo, they were dramatically changing performance by adding and subtracting IQ settings. They toggled motion blur (multi-sampled motion blur, I think it was called? A truer motion blur effect that's done by rendering the frame multiple times rather than simply adding a filter effect.) on and off, and FPS in DX went from playable to slideshow while Mantle was still playable. I realize that's only a single example and doesn't mean other IQ effects/settings will have the same effect, but I'm assuming that since it's a tech demo they simply chose something that would be easy to implement and demonstrate. I've also seen it reported that the AA penalty will be drastically reduced with Mantle, mainly because DX has huge AA overhead. I know it's early days and none of this proves anything conclusively, but it does look promising.
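For what it's worth, the "truer" motion blur described there sounds like accumulation-style blur: render the scene at several sub-frame times and average the results, which multiplies the per-frame render cost by the sample count. A minimal sketch of the averaging step (plain C++, with a hypothetical renderAt standing in for a full scene render):

```cpp
// Sketch of accumulation-style motion blur: average K sub-frame renders.
// renderAt() is a hypothetical stand-in; in a real engine each call is a
// complete frame's worth of GPU work, which is why enabling the effect can
// multiply frame cost by K and turn a playable frame rate into a slideshow.
#include <cstdio>
#include <vector>

using Frame = std::vector<float>;  // one float per pixel, toy "image"

Frame renderAt(double t, int pixels) {
    return Frame(pixels, static_cast<float>(t));  // placeholder "shading"
}

int main() {
    const int kPixels = 4, kSamples = 8;
    Frame blurred(kPixels, 0.0f);
    for (int s = 0; s < kSamples; ++s) {
        // Spread samples across the frame's shutter interval [0, 1).
        Frame sub = renderAt(s / double(kSamples), kPixels);
        for (int i = 0; i < kPixels; ++i)
            blurred[i] += sub[i] / kSamples;
    }
    std::printf("blurred[0] = %.3f (average of %d sub-frames)\n",
                blurred[0], kSamples);
    return 0;
}
```

If the demo really does render several sub-frames per output frame, any per-submission API savings get multiplied by the sample count too, which would fit the DX-vs-Mantle gap being reported.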
I really don't think the devs would be as excited as they are (genuinely excited, I believe) if it were only going to reduce CPU bottlenecks.

I'm pretty sure Johan Andersson said it was fake; I can't find the tweet ATM. I didn't get the impression he was hating on it because it didn't work on the 6000 series, just that he wished it would. Hopefully it means M$ won't be able to hold us hostage into buying their latest OS if we want the latest gaming features.
Repeat after me... "I will use the edit and quote buttons so that I don't double, triple, quadruple, and quintuple post." :)
They didn't? AMD already said it was open, but as for NVIDIA ever using it, that's doubtful; on principle alone, they won't.

As I said, NVIDIA won't on principle alone, but Mantle still has a ton to prove. Is it really as fast as AMD claims? And one thing I've been vocal about: since it has low-level hardware access, what kind of stability issues will come into play? Windows back in the '90s used to give direct hardware access to everything, and that wasn't so good.
If they naively implemented the DirectX motion blur, then this comes as no surprise. If you render, then copy the rendered frame back to the CPU, it will stall the whole GPU pipeline while the copy is in progress (for more technical info: www.google.com/#q=getrendertargetdata+slow).
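As a rough illustration of that stall, here's a toy model with a worker thread standing in for the GPU command queue (this is not Direct3D code; in D3D9 the offending call would be something like GetRenderTargetData):

```cpp
// Toy model of a synchronous GPU readback. A worker thread plays the GPU,
// draining a queue of "frames"; a synchronous readback must wait for every
// queued frame to finish, stalling the submitting thread -- analogous to a
// readback forcing a full pipeline flush.
#include <chrono>
#include <condition_variable>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>

std::mutex m;
std::condition_variable cv;
std::queue<int> gpuQueue;
bool done = false;

void gpuWorker() {
    for (;;) {
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [] { return !gpuQueue.empty() || done; });
        if (gpuQueue.empty()) return;  // queue drained and shutdown requested
        lk.unlock();
        std::this_thread::sleep_for(std::chrono::milliseconds(5));  // "render"
        lk.lock();
        gpuQueue.pop();  // retire the frame only once it has "rendered"
        lk.unlock();
        cv.notify_all();
    }
}

int main() {
    std::thread gpu(gpuWorker);
    {   // Submit a few frames, then issue a synchronous "readback".
        std::lock_guard<std::mutex> lk(m);
        for (int f = 0; f < 4; ++f) gpuQueue.push(f);
    }
    cv.notify_all();

    auto t0 = std::chrono::steady_clock::now();
    {   // The readback blocks until the whole queue drains: the stall.
        std::unique_lock<std::mutex> lk(m);
        cv.wait(lk, [] { return gpuQueue.empty(); });
        done = true;
    }
    cv.notify_all();
    gpu.join();

    auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                  std::chrono::steady_clock::now() - t0).count();
    std::printf("readback stalled the CPU for ~%lld ms\n", (long long)ms);
    return 0;
}
```

The usual workaround is to defer the readback a frame or two (copy into a staging surface and map it later), so the CPU never has to wait on in-flight GPU work.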
There is a seminar video presentation on the Mantle API. They managed to make rendering almost solely GPU-bound; they said that when underclocking the FX-8350 to 2 GHz, it performs the same, since the GPU is in control. I can't remember the exact time, but I found the video; it's worth watching throughout if you haven't already seen it.
It seems like BF4 is clogging up everything while they wait for DICE to fix it. From what I understand, DICE was given first-release rights to Mantle because of all the work they did developing and promoting it with AMD. It actually looks like Oxide could give us something more right now, but they have to wait for the BF4 patch.
FFS, BF > CoD any day, but that also means (and thank god that's the case) it won't be released on a yearly cycle... but STILL they mess up the timeframes; bugs galore, and delays of features that, for some, are almost paramount. It's just bad news, man.
One would expect AMD to be the one dropping the ball here, but no, it's their partner(s). Anyway, just a few more days... I hope.

If anything, "they" admitted with that slide that the FX-8350 is slower than the i7-4770K, BUT Mantle makes them even in these circumstances. Also, news flash: that's an Oxide slide, not an AMD one (though probably approved by them first). The sloppy way they typed in the model names kinda makes my eye twitch.
You kinda failed big time here. Seems you're filled with a lot of disdain toward AMD; did they happen to run over your childhood pet or something? Chill, yo...
In a way, this actually hurts the APU. For the price of a 7850K you can get an R7 250 and a cheapo Intel CPU, and the latter system offers more flexibility for upgrades.
No, it doesn't; it causes BF4 to instantly crash when selecting graphics options.
But I fixed that with help from the online community only to find.....
No, it doesn't; performance doesn't improve for anything other than the new R9 series with a crappy AMD APU. Forget 7xxx-series GPUs and Intel CPUs... it may be optimized sometime... or may not. The driver will be beta forever; there will always be issues with DX9, CrossFire, and multiple displays; and it will never improve performance for anyone with an Intel CPU, because they want you to buy cheaper and inferior AMD CPUs, for which Mantle will improve AMD-sponsored games (i.e., just BF4).
On a positive note: the last 1 GB update to BF4 seemed only to check whether you were running the 13.12 drivers rather than 13.11, and to complain if you had 13.11 (even if you actually had 13.12, which AMD forgot to rename from 13.11, so BF4 still thought you had 13.11), costing you your slot on the server while you found the dialogue box to select "yes, please run BF4 even though I only have 13.11 (but actually 13.12)". It also brought changes that increase the number of crashes between rounds, plus a menu option for Mantle (which just detects whether you have an Intel or AMD CPU and removes the artificial performance restriction if you have an AMD CPU and an R9-series card; really think BF4 needs 80% of an overclocked i5 yet runs OK on the XB1? Me neither). At least it now thinks I have the 14.1 drivers, so it complains no more, and neither shall I.