
Mantle Enables Significant Performance Improvement in Battlefield 4: AMD

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,578 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
From what I've seen of AMD, they tend to overstate what their products can do and claim they are faster than they really are. Case in point: at a recent event AMD claimed their 8350 CPU was comparable to the i7 4770K, yet if you look at benchmarks, their 9590 is still slower even though it's overclocked ~20%. There have been several cases of that kind of thing over the years as well, so I wouldn't believe their claim of 45% for a second until a third-party reviewer comes out with results.

For one thing it's cherry-picked. Under specific circumstances an 8350 would be as fast as an i7. They never speak in general terms. I'm quite confident that 45% number is true; the question is a) the circumstances, and b) 45% over what exactly?
 
Joined
May 18, 2010
Messages
3,427 (0.65/day)
System Name My baby
Processor Athlon II X4 620 @ 3.5GHz, 1.45v, NB @ 2700Mhz, HT @ 2700Mhz - 24hr prime95 stable
Motherboard Asus M4A785TD-V EVO
Cooling Sonic Tower Rev 2 with 120mm Akasa attached, Akasa @ Front, Xilence Red Wing 120mm @ Rear
Memory 8 GB G.Skill 1600MHz
Video Card(s) ATI ASUS Crossfire 5850
Storage Crucial MX100 SATA 2.5 SSD
Display(s) Lenovo ThinkVision 27" (LEN P27h-10)
Case Antec VSK 2000 Black Tower Case
Audio Device(s) Onkyo TX-SR309 Receiver, 2x Kef Cresta 1, 1x Kef Center 20c
Power Supply OCZ StealthXstream II 600w, 4x12v/18A, 80% efficiency.
Software Windows 10 Professional 64-bit
If FPS goes up 45%, won't GPU power usage go up 45%?

No, because frame rate and power consumption don't increase together linearly at the same rate.

Generally speaking, to increase the FPS the hardware has to work harder, which can increase power consumption somewhat, but it's not a 1:1 ratio.
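
To see why, here's a toy C++ model; every number in it (baseline draw, TDP, the utilization figures) is an assumption for illustration, not a measurement. Board power has a fixed baseline plus a part that scales with GPU utilization, so FPS and watts move by different percentages:

```cpp
// Toy numbers, not measurements: board power = fixed baseline + a part that
// scales with GPU utilization, so FPS and watts don't move 1:1.
#include <cstdio>

int main() {
    const double idle_w = 50.0;   // assumed baseline board draw at 0% load
    const double tdp_w  = 250.0;  // assumed full-load board draw
    // Suppose removing CPU/API overhead lifts GPU utilization from 70% to
    // 100%, and FPS scales with utilization when nothing else limits it.
    const double util_before = 0.70, util_after = 1.00;
    const double p_before = idle_w + util_before * (tdp_w - idle_w); // 190 W
    const double p_after  = idle_w + util_after  * (tdp_w - idle_w); // 250 W
    std::printf("FPS:   +%.0f%%\n", (util_after / util_before - 1.0) * 100.0);
    std::printf("Power: +%.0f%%\n", (p_after / p_before - 1.0) * 100.0);
}
```

With these made-up numbers, FPS rises about 43% while board power rises only about 32%, because the baseline draw doesn't scale with load.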


For one thing it's cherry-picked. Under specific circumstances an 8350 would be as fast as an i7. They never speak in general terms. I'm quite confident that 45% number is true; the question is a) the circumstances, and b) 45% over what exactly?

What you said is 100% true, but I don't think AMD made such claims to begin with. The i7 4770K was released a year after the FX 8350, so it would be impossible for AMD to have made that claim at the 8350's launch. So either arbiter is misinformed or is lying. He seems like a decent gentleman, so I'm going to say misinformed.
 
Joined
Apr 10, 2012
Messages
1,400 (0.30/day)
Location
78°55' N, 11°56' E
System Name -aLiEn beaTs-
Processor Intel i7 11700kf @ 5.055Ghz
Motherboard MSI Z490 Unify
Cooling Corsair H115i Pro RGB
Memory G.skill Royal Silver 4400 cl17 @ 4403mhz
Video Card(s) Zotac GTX 980TI AMP!Omega Factory OC 1418MHz
Storage Intel SSD 330, Crucial SSD MX300 & MX500
Display(s) Samsung C24FG73 144HZ
Case CoolerMaster HAF 932 USB3.0
Audio Device(s) X-Fi Titanium HD @ 2.1 Bose acoustimass 5
Power Supply CoolerMaster 850W v2 gold atx 2.52
Mouse Razer viper 8k
Keyboard Logitech G19s
Software Windows 11 Pro 21h2 64Bit
Benchmark Scores ► ♪♫♪♩♬♫♪♭
I would say no difference in power consumption.

For example, 3DMark 11:

The GPU works at 100% and 250 W TDP. Now imagine you remove some API draw-call overhead that is stalling the driver and make it more efficient: it spends less time on driver <> API communication/calculations and uses that extra time for more rendering power.
It would still run at the same 100% GPU usage and 250 W TDP.

Actually, I think it should be lower, since GPU shader efficiency rises; kind of like PSU efficiency, 80+ vs 80+ Titanium at the same wattage.
 
Joined
Feb 14, 2012
Messages
2,355 (0.50/day)
System Name msdos
Processor 8086
Motherboard mainboard
Cooling passive
Memory 640KB + 384KB extended
Video Card(s) EGA
Storage 5.25"
Display(s) 80x25
Case plastic
Audio Device(s) modchip
Power Supply 45 watts
Mouse serial
Keyboard yes
Software disk commander
Benchmark Scores still running
I would say no difference in power consumption. For example, 3DMark 11: the GPU works at 100% and 250 W TDP. Now imagine you remove some API draw-call overhead that is stalling the driver and make it more efficient: it spends less time on driver <> API communication/calculations and uses that extra time for more rendering power. It would still run at the same 100% GPU usage and 250 W TDP. Actually, I think it should be lower, since GPU shader efficiency rises; kind of like PSU efficiency, 80+ vs 80+ Titanium at the same wattage.

I am quoting this in case I need to refer to it later.
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,171 (2.81/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
It seems to me that AMD wants games to run better on machines with more cores. If what @W1zzard said is correct, I'm sure that this is the case. It's more about letting CPUs scale to the GPUs, because GPUs are already pretty good at what they do. I wonder how Mantle scales in CFX as opposed to using just your typical Direct3D libraries.

As many have said before, I would love some real numbers instead of this "up to 45%" garbage.
 
Joined
Mar 18, 2008
Messages
5,441 (0.89/day)
Location
Australia
System Name Night Rider | Mini LAN PC | Workhorse
Processor AMD R7 5800X3D | Ryzen 1600X | i7 970
Motherboard MSi AM4 Pro Carbon | GA- | Gigabyte EX58-UD5
Cooling Noctua U9S Twin Fan | Stock Cooler (Copper Core) | Big Shairkan B
Memory 2x8GB DDR4 G.Skill Ripjaws 3600MHz| 2x8GB Corsair 3000 | 6x2GB DDR3 1300 Corsair
Video Card(s) MSI AMD 6750XT | 6500XT | MSI RX 580 8GB
Storage 1TB WD Black NVME / 250GB SSD /2TB WD Black | 500GB SSD WD, 2x1TB, 1x750 | WD 500 SSD/Seagate 320
Display(s) LG 27" 1440P| Samsung 20" S20C300L/DELL 15" | 22" DELL/19"DELL
Case LIAN LI PC-18 | Mini ATX Case (custom) | Atrix C4 9001
Audio Device(s) Onboard | Onboard | Onboard
Power Supply Silverstone 850 | Silverstone Mini 450W | Corsair CX-750
Mouse Coolermaster Pro | Rapoo V900 | Gigabyte 6850X
Keyboard MAX Keyboard Nighthawk X8 | Creative Fatal1ty eluminx | Some POS Logitech
Software Windows 10 Pro 64 | Windows 10 Pro 64 | Windows 7 Pro 64/Windows 10 Home
Think with 4K in mind, not 1080p. Nvidia will need faster hardware to fight AMD if Mantle is a success. Also, at 4K Intel can't follow.

Mantle is open source; Nvidia can implement it too if they like.
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.94/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
For one thing it's cherry-picked. Under specific circumstances an 8350 would be as fast as an i7. They never speak in general terms. I'm quite confident that 45% number is true; the question is a) the circumstances, and b) 45% over what exactly?

They said: a 290X with an A10 APU.


All this does is remove the CPU bottleneck, like I said on the last page as well. If you have a high-end GPU and a midrange CPU, you'll see massive gains.

If you're in an RTS game that's always CPU-limited, you'll see massive gains.

The common denominator here is that if your CPU is the bottleneck, you'll see performance gains. If you aren't bottlenecked and you use Vsync, you'll just save on wattage and heat.
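
To put rough numbers on that, here's a toy C++ model (the millisecond costs are invented, and "halve the CPU cost" is just a stand-in for what a thinner API might do). Frame time in a pipelined renderer is set by the slower of the CPU and GPU stages, so cutting CPU time only helps while the CPU is the slower stage:

```cpp
#include <algorithm>
#include <cstdio>

// Frame time of a pipelined renderer: the CPU prepares frame N+1 while the
// GPU draws frame N, so the slower stage sets the pace.
double fps(double cpu_ms, double gpu_ms) {
    return 1000.0 / std::max(cpu_ms, gpu_ms);
}

int main() {
    const double gpu_ms = 10.0;  // fixed GPU cost: a 100 FPS ceiling
    for (double cpu_ms : {20.0, 10.0, 5.0}) {
        // Pretend a thinner API halves the CPU's per-frame submission cost.
        std::printf("CPU %4.1f ms: %5.1f FPS -> %5.1f FPS\n",
                    cpu_ms, fps(cpu_ms, gpu_ms), fps(cpu_ms / 2.0, gpu_ms));
    }
}
```

The CPU-bound case (20 ms CPU vs. 10 ms GPU) jumps from 50 to 100 FPS; the GPU-bound cases don't move at all, which is exactly the common denominator described above.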
 
Joined
Nov 13, 2007
Messages
10,765 (1.73/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6000 CL30-36-36-76
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
Great news! I'm hoping it gets wide adoption and then forces NV to respond. 4K GFX for the masses!!!
 
Joined
Sep 23, 2013
Messages
154 (0.04/day)
System Name Trixeon
Processor Xeon E3-1230-V2
Motherboard Asrock Z77 Extreme 3
Cooling Coolmaster EVO
Memory Gskill 2133 8GB
Video Card(s) Sapphire R9-290 Tri-x o/c bios 1000/6000MHz
Storage 128GB SSD, 256GB SSD, 3TB HDD, 1TB HDD
Display(s) 3x 22" 1050 monitors 16:10
Case Xigmatec
Audio Device(s) onboard
Power Supply Xigmatec 800W centurio
Software Win7 64bit
45% on an A10 CPU, with a 290X.


So umm, with all the people here who are acting like experts on Mantle, why has no one pointed out the obvious?

We won't get faster FPS on high-end systems; it's systems with an 'average' or weak CPU that see the greatest gains.

Pair a 290X with a 'slow' multi-core CPU and you'll see the greatest gains.

So pair an AMD 290X with an AMD CPU and AMD's Mantle will produce some gains. I see what they're doing there.
 
Joined
Jan 13, 2009
Messages
424 (0.07/day)
I, personally, would love it if they put more effort into their drivers, especially into optimizing their OpenGL support[1], which at the moment, unlike on the green team, is Mjr. Balle de Sukke in their drivers. ...And I wish they would drop CCC, too. Instead of trying to "win" things over with this Mantle of theirs, which, I believe, can just create more consumer-unfriendly segregation. And we already have too much of that.

[1] As mainly a Linux user, I care about that a lot.
Since Mantle isn't tied to DX, you will likely see it on Linux as well. You just have to be patient while they take care of the big dog (Windows) first. Also, Mantle will remove the game-to-game dependency on driver optimizations, as the devs will be able to optimize everything from their end.

The question begs to be asked: if Mantle is the second coming of Christ, won't that basically completely and utterly negate the need for high-end, heck, even mid-range cards? I don't know; I think Mantle will prove to be more hype than anything else, seeing that shareholders of a company would rather see the cash inflows from selling overpriced cards than AMD being the white knight in shining armour and actually doing the right thing for gamers (I say that because, while I'm an NVIDIA fanboy - it's true - I still think Mantle would be a step in the right direction). Boardroom chatter always wins in the end.
They are looking at the bigger picture: selling APUs, especially in cheap gaming laptops. I'm sure there will be some high-end features that still require mega computing power, but this should dramatically expand their customer base.

There's also hi-res Eyefinity and (especially) 4K monitors, where their solutions should be far more affordable, moving what has been reserved for the lunatic fringe closer to the mainstream. A pair of 290s ($800 once the mining craze subsides) should be able to match the gaming experience of Tri-SLI 780 Ti for a fraction of the price. Pair that up with one of the cheaper 4K monitors on the horizon and a "cheap" (by Intel standards) $150 8-core AMD CPU, and you'll have gaming performance that last year everyone was assuming would be unaffordable to most.

Look at it like this: Mantle will help to increase your GPU load if your GPU load is below 100% (i.e., limited by CPU/API). It won't do anything if GPU load is already at 100%.
I'm not so sure you are correct (if I may be so bold :)). If you look at the latest Star Swarm demo, they were dramatically changing performance by adding and subtracting IQ settings. They toggled motion blur (multi-sampling motion blur, I think it was called? A truer motion blur effect that's done by rendering the frame multiple times rather than simply adding a filter effect) on and off, and FPS in DX went from playable to slideshow while Mantle was still playable. I realize that's only a single example and doesn't mean there will be other IQ effects/settings with the same behaviour, but I'm assuming that since it's a tech demo, they simply chose something that would be easy to implement and demonstrate. I've also seen it reported that the AA penalty will be drastically reduced with Mantle, mainly because DX has huge AA overhead. I know it's early days and none of this proves anything conclusively, but it does look promising.

I really don't think the devs would be as excited as they are (genuinely excited, I believe) if it was only going to reduce CPU bottlenecks.

1. Where you say "the company revealed that Mantle, its ambitious attempt at a 3D graphics API to rival DirectX and OpenGL," this statement is not completely accurate. D3D and OpenGL are high-level shading language (HSL) APIs. AMD Mantle is not an HSL API; it is a CPU-GPU optimization API with additional perks. So you can't really say they are the same, and you can't claim that it is AMD's rival API when AMD hasn't released or announced an HSL version.

2. The 45% isn't 45% for all setups; it's 45% for an APU setup, mainly the Kaveri APU. There is a possibility it will be less than 45% with Richland or below. Also, there's a possibility that it could be higher with Bulldozer, Haswell, Sandy Bridge, Ivy Bridge, Ivy Bridge-E, Haswell-E, etc...

The Star Swarm demo showed the game without AMD Mantle running at roughly 20 ms per frame, or 49.99 FPS. With AMD Mantle, the demo was running at 2 to 6 ms per frame, or roughly 200 FPS.

If you watch the following video, I believe there is some accuracy to this, to an extent. It could be fake; who knows for sure at this time.


I suspect there is a group of people testing the AMD Mantle beta with BF4, and this person is one of them. Take into account, when you watch this, that towards the end of the video this person is using GPU-Z, and he hints at two things. One, I believe he is using a Haswell setup: GPU-Z shows the integrated Intel HD Graphics 4000. Two, he's got an R9-series card as his discrete graphics card. Since Haswell has 4 cores at a higher core frequency, it's possible that FPS will go up based on the number of cores your CPU has and its core frequency.

Since AMD Mantle requires a driver on the user's end, the person in the video enabled AMD Mantle in the AMD Catalyst client. So, in my opinion, it's looking less fake.

If 45% is what AMD Mantle offers at 162 to 200 FPS, then the Kaveri APU alone is pushing somewhere around 113 FPS without AMD Mantle. From the video, 300 FPS to 400 FPS is more than twice that. Now, if you increase the number of cores on the CPU, this 45% will probably start to show some form of diminishing returns, but the FPS will go up higher. Why is that? Well, for a start, AMD Mantle, for lack of a better term, redirects commands for the GPU through the other cores, thus reducing the CPU bottleneck occurring on the first core.
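
To illustrate that idea, here's a conceptual C++ sketch; it is not the Mantle API, just the general pattern of recording command lists on several worker threads and submitting them from one place, instead of issuing every draw call from a single core:

```cpp
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

struct DrawCmd { int mesh_id; };
using CommandList = std::vector<DrawCmd>;

int main() {
    const int kThreads = 4, kDrawsPerThread = 1000;
    std::vector<CommandList> lists(kThreads);
    std::vector<std::thread> workers;

    // Each worker records draw commands into its own list: no shared
    // state, no locks, no single hot core doing all the submission work.
    for (int t = 0; t < kThreads; ++t) {
        workers.emplace_back([&lists, t] {
            for (int i = 0; i < kDrawsPerThread; ++i)
                lists[t].push_back({t * kDrawsPerThread + i});
        });
    }
    for (auto& w : workers) w.join();

    // One cheap, ordered submit of the pre-built lists replaces thousands
    // of individual API calls funneled through a single thread.
    std::size_t total = 0;
    for (const auto& l : lists) total += l.size();
    std::printf("submitted %zu draws recorded on %d threads\n", total, kThreads);
}
```

Each thread writes only to its own list, so there's no contention; the single ordered submit at the end is the only serial step.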

What's the point in having such a high FPS? If you look at it in context, when benchmarking video graphics cards, high FPS performance in game A, B, C, D, etc., is the X-factor. If AMD can ship their products with AMD Mantle, they can push higher FPS in top-selling PC games. Higher FPS equates to higher popularity versus the brand-B graphics card, and revenue returns go up as consumers purchase "higher-performing" products. Marketing of graphics cards is heavily dependent on third-party benchers like TechPowerUp.com...

I'm pretty sure Johan Andersson said it was fake. I can't find the Tweet ATM.

Sad when technology progresses? Or sad that it's two generations old and new technology requires... new technology. I guess we should all be angry it isn't going to be supported on Windows 98SE, 'cause that was awesome, and I had a great time playing games on that OS, and since it doesn't need DX.........
I didn't get the impression he was hating on it because it didn't work on 6000s, he just wished it would. Hopefully it means M$ won't be able to hold us hostage into buying their latest OS if we want the latest gaming features.

Repeat after me: "I will use the edit and quote buttons so that I don't double, triple, quadruple, and quintuple post." :)
 
Joined
Sep 6, 2013
Messages
3,333 (0.81/day)
Location
Athens, Greece
System Name 3 desktop systems: Gaming / Internet / HTPC
Processor Ryzen 5 5500 / Ryzen 5 4600G / FX 6300 (12 years later I got to see how bad Bulldozer is)
Motherboard MSI X470 Gaming Plus Max (1) / MSI X470 Gaming Plus Max (2) / Gigabyte GA-990XA-UD3
Cooling Noctua U12S / Segotep T4 / Snowman M-T6
Memory 32GB - 16GB G.Skill RIPJAWS 3600+16GB G.Skill Aegis 3200 / 16GB JUHOR / 16GB Kingston 2400MHz (DDR3)
Video Card(s) ASRock RX 6600 + GT 710 (PhysX)/ Vega 7 integrated / Radeon RX 580
Storage NVMes, ONLY NVMes/ NVMes, SATA Storage / NVMe boot(Clover), SATA storage
Display(s) Philips 43PUS8857/12 UHD TV (120Hz, HDR, FreeSync Premium) ---- 19'' HP monitor + BlitzWolf BW-V5
Case Sharkoon Rebel 12 / CoolerMaster Elite 361 / Xigmatek Midguard
Audio Device(s) onboard
Power Supply Chieftec 850W / Silver Power 400W / Sharkoon 650W
Mouse CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Keyboard CoolerMaster Devastator III Plus / CoolerMaster Devastator / Logitech
Software Windows 10 / Windows 10&Windows 11 / Windows 10
Mantle is open source; Nvidia can implement it too if they like.

I don't expect them to do it. If Maxwell is a much, much better chip than Kepler, they will just wage a little price war, giving much better DirectX performance than AMD at the same price points. So in the end, for example, you will have to choose between being 10-30% faster in games that support Mantle with an AMD card or 10-20% faster in games that don't support Mantle with an Nvidia card. AMD must also be better in hardware to force Nvidia to support Mantle.
 
Joined
Jan 13, 2009
Messages
424 (0.07/day)
The only way nVidia supports Mantle is if it becomes an open standard, which I believe AMD would be willing to do. Too dangerous for them to rely on an API their main competitor controls. It would be like AMD adding PhysX to their feature stack and being at the mercy of nVidia to not make it run like crap on AMD's hardware.
 
Joined
Jun 13, 2012
Messages
1,389 (0.31/day)
Processor i7-13700k
Motherboard Asus Tuf Gaming z790-plus
Cooling Coolermaster Hyper 212 RGB
Memory Corsair Vengeance RGB 32GB DDR5 7000MHz
Video Card(s) Asus Dual GeForce RTX 4070 Super (2800MHz @ 1.0 V, ~60MHz overclock, -0.1 V)
Storage 1x Samsung 980 Pro PCIe4 NVme, 2x Samsung 1tb 850evo SSD, 3x WD drives, 2 seagate
Display(s) Acer Predator XB273u 27inch IPS G-Sync 165hz
Audio Device(s) Logitech Z906 5.1
Power Supply Corsair RMx Series RM850x (OCZ Z series PSU retired after 13 years of service)
Mouse Logitech G502 hero
Keyboard Logitech G710+
AMD didn't make that claim, because the FX 8350 was released a YEAR BEFORE the i7 4770K. So what you said can't be true.



they didn't?


The only way nVidia supports Mantle is if it becomes an open standard, which I believe AMD would be willing to do. Too dangerous for them to rely on an API their main competitor controls. It would be like AMD adding PhysX to their feature stack and being at the mercy of nVidia to not make it run like crap on AMD's hardware.

AMD has already said it's open, but as for Nvidia ever using it, that's doubtful; on principle alone they won't.


I don't expect them to do it. If Maxwell is a much, much better chip than Kepler, they will just wage a little price war, giving much better DirectX performance than AMD at the same price points. So in the end, for example, you will have to choose between being 10-30% faster in games that support Mantle with an AMD card or 10-20% faster in games that don't support Mantle with an Nvidia card. AMD must also be better in hardware to force Nvidia to support Mantle.

As I said, Nvidia won't on principle alone, but Mantle still has a ton to prove. Is it really as fast as AMD claims? And one thing I've been vocal about: since it has low-level hardware access, what kind of stability issues will come into play with that? Windows back in the '90s used to give direct hardware access to everything, and, well, that wasn't so good.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,842 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I'm not so sure you are correct (if I may be so bold :))
Please be, always :)

If you look at the latest Star Swarm demo, they were dramatically changing performance by adding and subtracting IQ settings. They toggled motion blur (multi-sampling motion blur, I think it was called? A truer motion blur effect that's done by rendering the frame multiple times rather than simply adding a filter effect) on and off, and FPS in DX went from playable to slideshow while Mantle was still playable. I realize that's only a single example and doesn't mean there will be other IQ effects/settings with the same behaviour, but I'm assuming that since it's a tech demo, they simply chose something that would be easy to implement and demonstrate.
I would assume that the way the DX renderer renders the motion blur introduces a bottleneck in either the DirectX API or the CPU, which goes away when running Mantle. So in the non-Mantle example the GPU was most certainly not running at 100%, while with Mantle GPU load is much higher, resulting in higher FPS.

If they naively implemented the DirectX motion blur then this comes as no surprise. If you render, then copy the rendered frame back onto the CPU, it will stall the whole GPU pipeline while the copy is in progress (for more technical info: https://www.google.com/#q=getrendertargetdata+slow)
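
A toy C++ timing model of that stall (the millisecond costs are assumptions, purely for illustration): normally the CPU prepares the next frame while the GPU draws the current one, so frame time is the larger of the two stage costs; a synchronous readback such as GetRenderTargetData() makes the CPU wait for the GPU and the copy, so the costs add up instead:

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Assumed per-frame costs, purely for illustration.
    const double cpu_ms = 6.0, gpu_ms = 12.0, copy_ms = 4.0;

    // Pipelined: CPU and GPU overlap, the slower stage sets the frame time.
    const double overlapped = std::max(cpu_ms, gpu_ms);
    // Synchronous readback: the CPU waits for the GPU to finish and for the
    // copy back to system memory before it can start the next frame.
    const double stalled = cpu_ms + gpu_ms + copy_ms;

    std::printf("pipelined: %4.1f ms (%.0f FPS)\n", overlapped, 1000.0 / overlapped);
    std::printf("stalled:   %4.1f ms (%.0f FPS)\n", stalled, 1000.0 / stalled);
}
```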
 

Frick

Fishfaced Nincompoop
Joined
Feb 27, 2006
Messages
19,578 (2.86/day)
Location
Piteå
System Name White DJ in Detroit
Processor Ryzen 5 5600
Motherboard Asrock B450M-HDV
Cooling Be Quiet! Pure Rock 2
Memory 2 x 16GB Kingston Fury 3400mhz
Video Card(s) XFX 6950XT Speedster MERC 319
Storage Kingston A400 240GB | WD Black SN750 2TB |WD Blue 1TB x 2 | Toshiba P300 2TB | Seagate Expansion 8TB
Display(s) Samsung U32J590U 4K + BenQ GL2450HT 1080p
Case Fractal Design Define R4
Audio Device(s) Plantronics 5220, Nektar SE61 keyboard
Power Supply Corsair RM850x v3
Mouse Logitech G602
Keyboard Cherry MX Board 1.0 TKL Brown
Software Windows 10 Pro
Benchmark Scores Rimworld 4K ready!
Joined
Apr 16, 2010
Messages
2,070 (0.39/day)
System Name iJayo
Processor i7 14700k
Motherboard Asus ROG STRIX z790-E wifi
Cooling Peerless Assassin
Memory 32 gigs Corsair Vengeance
Video Card(s) Nvidia RTX 2070 Super
Storage 1TB 840 Evo, 1TB Samsung M.2 SSD, 1 & 3 TB Seagate HDD, 120 gig HyperX SSD
Display(s) 42" Nec retail display monitor/ 34" Dell curved 165hz monitor
Case O11 mini
Audio Device(s) M-Audio monitors
Power Supply LIan li 750 mini
Mouse corsair Dark Saber
Keyboard Roccat Vulcan 121
Software Window 11 pro
Benchmark Scores meh... feel me on the battle field!
AMD... AMD... doing great things so far... hope this manifests as you claim. Don't let yourself get Bulldozer-ed again by all the hype.
 
Joined
May 18, 2010
Messages
3,427 (0.65/day)
System Name My baby
Processor Athlon II X4 620 @ 3.5GHz, 1.45v, NB @ 2700Mhz, HT @ 2700Mhz - 24hr prime95 stable
Motherboard Asus M4A785TD-V EVO
Cooling Sonic Tower Rev 2 with 120mm Akasa attached, Akasa @ Front, Xilence Red Wing 120mm @ Rear
Memory 8 GB G.Skill 1600MHz
Video Card(s) ATI ASUS Crossfire 5850
Storage Crucial MX100 SATA 2.5 SSD
Display(s) Lenovo ThinkVision 27" (LEN P27h-10)
Case Antec VSK 2000 Black Tower Case
Audio Device(s) Onkyo TX-SR309 Receiver, 2x Kef Cresta 1, 1x Kef Center 20c
Power Supply OCZ StealthXstream II 600w, 4x12v/18A, 80% efficiency.
Software Windows 10 Professional 64-bit
From what I've seen of AMD, they tend to overstate what their products can do and claim they are faster than they really are. Case in point: at a recent event AMD claimed their 8350 CPU was comparable to the i7 4770K, yet if you look at benchmarks, their 9590 is still slower even though it's overclocked ~20%. There have been several cases of that kind of thing over the years as well, so I wouldn't believe their claim of 45% for a second until a third-party reviewer comes out with results.




they didn't?




AMD has already said it's open, but as for Nvidia ever using it, that's doubtful; on principle alone they won't.


As I said, Nvidia won't on principle alone, but Mantle still has a ton to prove. Is it really as fast as AMD claims? And one thing I've been vocal about: since it has low-level hardware access, what kind of stability issues will come into play with that? Windows back in the '90s used to give direct hardware access to everything, and, well, that wasn't so good.

But that is based on performance linked with Mantle. In your original post you didn't mention Mantle, which gave the impression AMD made that statement a year prior to the 4770K's release, or recently but randomly.

There is a Mantle seminar video presentation on the Mantle API. They managed to make rendering almost solely GPU-bound. They said that when underclocking the FX 8350 to 2GHz it performs the same, as the GPU is in control. I can't remember the exact timestamp, but I found the video; it's worth watching throughout if you haven't already seen it.

 
Joined
Jan 13, 2009
Messages
424 (0.07/day)


they didn't?
This is while running Mantle, which makes use of all 8 AMD cores; DX doesn't, which lets the higher IPC of the 4770 shine.




AMD has already said it's open, but as for Nvidia ever using it, that's doubtful; on principle alone they won't.

Having it be open but still controlled by AMD wouldn't work; it would still be AMD/GCN-centric. What I'm talking about, and I might not have presented it right, is it being an open standard with a body of multiple contributors controlling it. Possibly AMD-nVidia-Intel-M$-etc., so they could all have input to cater to their own needs.




As I said, Nvidia won't on principle alone, but Mantle still has a ton to prove. Is it really as fast as AMD claims? And one thing I've been vocal about: since it has low-level hardware access, what kind of stability issues will come into play with that? Windows back in the '90s used to give direct hardware access to everything, and, well, that wasn't so good.

If it gets used in enough high-profile games and nVidia gets their butts handed to them because of it, they might be forced to, like it or not. Stability is supposed to be improved because the devs can optimize their code so much better. Nothing's proven as of yet, though.
 
Joined
Jan 13, 2009
Messages
424 (0.07/day)
Please be, always :)


I would assume that the way the DX renderer renders the motion blur introduces a bottleneck in either the DirectX API or the CPU, which goes away when running Mantle. So in the non-Mantle example the GPU was most certainly not running at 100%, while with Mantle GPU load is much higher, resulting in higher FPS.

If they naively implemented the DirectX motion blur then this comes as no surprise. If you render, then copy the rendered frame back onto the CPU, it will stall the whole GPU pipeline while the copy is in progress (for more technical info: https://www.google.com/#q=getrendertargetdata+slow)
Well, technically speaking, I have no clue. ;) I have no technical knowledge to call upon; I'm just trying to absorb as much info on it as I can and make sense of it. Typically, though, with mature drivers GPU usage is usually 90%+. I don't know why the devs would seem so genuinely excited about it if DX could already be optimized to provide over 90% of Mantle's performance. Hopefully we don't have too much longer to wait.

It seems like BF4 is clogging up everything while they are waiting for DICE to fix it. FWIU, DICE was given first release rights to Mantle because of all the work they did developing and promoting it with AMD. It actually looks like Oxide could give us something more right now, but they have to wait for the BF4 patch.
 
Joined
Mar 11, 2010
Messages
120 (0.02/day)
Location
El Salvador
System Name Jaguar X
Processor AMD Ryzen 7 7700X
Motherboard ASUS ROG Strix X670E-E Gaming WiFi
Cooling Corsair H150 RGB
Memory 2x 16GB Corsair Vengeance DDR5-6000
Video Card(s) Gigabyte RTX 4080 Gaming OC
Storage 1TB Kingston KC3000 + 1TB Samsung 970 EVO Plus
Display(s) LG C1
Case Cougar Panzer EVO RGB
Power Supply XPG Core Reactor 850W
Mouse Cougar Minos XT
Keyboard Cougar Ultimus RGB
Software Windows 11 Pro
No need to troll there, big guy. You're just regurgitating what pretty much every skeptic has already said about Mantle. Try contributing something "new" to this discussion, because it just gets boring hearing the same garbage over and over.

Mantle will be out by the end of the month. Yes, they have delayed it from last month, but shit happens, YES, even in the tech world where we don't get what we are promised. (Mommy, if I'm good, can I go to the arcade? Yes honey, sure.....)
The whole fiasco with BF4 being a technical mess is obvious, and it has halted the progress of AMD implementing Mantle in a timely fashion. It looks like EA/DICE has been hard at work fixing up BF4 good and proper, and as such I think once they feel it is on par with their and our expectations, Mantle will be ready to go.

I don't see how pointing out a fact can be trolling. Mantle has gathered this much attention because it coincided with the release of the new-gen consoles, not because a revolutionary idea is being implemented for the first time. Forgive me if I don't have faith in Mantle the way you do.
 
Joined
Sep 19, 2012
Messages
615 (0.14/day)
System Name [WIP]
Processor Intel Pentium G3420 [i7-4790K SOON(tm)]
Motherboard MSI Z87-GD65 Gaming
Cooling [Corsair H100i]
Memory G.Skill TridentX 2x8GB-2400-CL10 DDR3
Video Card(s) [MSI AMD Radeon R9-290 Gaming]
Storage Seagate 2TB Desktop SSHD / [Samsung 256GB 840 PRO]
Display(s) [BenQ XL2420Z]
Case [Corsair Obsidian 750D]
Power Supply Corsair RM750
Software Windows 8.1 x64 Pro / Linux Mint 15 / SteamOS
Just like CrossFire = up to a 100% performance increase o_O When in reality it's actually just 80-90% on average... how horribly sick and false are AMD's claims on that, right? /sarcasm

FFS, BF > CoD any day, but that also means (and thank god this is the case as well) that it won't be released in a yearly cycle... but STILL, messing up the timeframes, bugs galore, and delays of features that for some are almost paramount is just bad news, man.

One would expect AMD to be the one dropping the ball in this, but no, it's their partner(s). Anyway, just a few more days... I hope.



they didn't?
If anything, "they" admited with that that FX 8350 is slower than i7 4770K, BUT Mantle makes them even in these circumstances. Also, news flash, that's an Oxide slide, not a AMD one (tho probably approved by them first). The sloppy way they typed in the model names kinda make my eye twitch.

You kinda failed big time here. Seems you are filled with a lot of disdain towards AMD, did they happen to run over your childhood pet or something? Chill yo...
 
Joined
Jul 19, 2011
Messages
540 (0.11/day)
The key words here are "up to".
I will reservedly wait for actual factual benchmarks before I believe this.
Seen a review on Guru3D; it gets at least a 10% boost in every situation, so not bad.

The 45% boost most likely goes to the APUs.
APUs seemed to get the smallest gains, but 10% is still a nice boost. The real focus was eliminating CPU overhead. The greatest gains were seen with a good graphics card and a slower CPU. PCs can get gaming efficiency that is more like that of consoles.
In a way, this actually hurts the APU: for the price of a 7850K you can get an R7 250 and a cheapo Intel CPU, and the latter system offers more flexibility for upgrades.
 
Joined
Sep 23, 2013
Messages
154 (0.04/day)
System Name Trixeon
Processor Xeon E3-1230-V2
Motherboard Asrock Z77 Extreme 3
Cooling Coolmaster EVO
Memory Gskill 2133 8GB
Video Card(s) Sapphire R9-290 Tri-x o/c bios 1000/6000MHz
Storage 128GB SSD, 256GB SSD, 3TB HDD, 1TB HDD
Display(s) 3x 22" 1050 monitors 16:10
Case Xigmatec
Audio Device(s) onboard
Power Supply Xigmatec 800W centurio
Software Win7 64bit
Mantle Enables Significant Performance Improvement in Battlefield 4: AMD......

no it doesn't, it causes BF4 to instantly crash when selecting graphics options

But I fixed that with help from the online community only to find.....

no it doesn't, performance doesn't improve for anything other than the new R9-xxx with a crappy AMD APU. Forget 7xxx-series GPUs and Intel CPUs...... they may be optimised sometime..... or may not. The driver will be beta forever, there will always be issues with DX9, CrossFire, and multiple displays, and it will never improve performance for anyone with an Intel CPU, because they want you to buy cheaper and inferior AMD CPUs, for which Mantle will improve AMD-sponsored games (i.e., just BF4).

On a positive note: the last 1GB update to BF4 seemed only to check whether you were running 13.12 drivers rather than 13.11, and complained if you had 13.11 (even if you actually had 13.12, which AMD forgot to rename from 13.11, so BF4 still thought you had 13.11), causing you to lose your slot on the server while you found the dialogue box to select "yes, please run BF4 even though I only have 13.11 (but actually 13.12)". It also brought changes that increase the number of crashes between rounds, and a menu option for Mantle (which just detects whether you have an Intel or AMD CPU and removes the artificial performance restriction if you have an AMD CPU and an R9-xxx; really think BF4 needs 80% of an o/c i5 yet runs OK on the XB1? Me neither). At least it now thinks I have 14.1 drivers, so it complains no more, and neither shall I.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.67/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Cue the mass of anti-Intel programming conspiracies. :rolleyes:
 