
Capcom to Gamers: Monster Hunter World PC Port is CPU Heavy

crazyeyesreaper

Not a Moderator
Staff member
Joined
Mar 25, 2009
Messages
9,847 (1.66/day)
Location
04578
System Name Old reliable
Processor Intel 8700K @ 4.8 GHz
Motherboard MSI Z370 Gaming Pro Carbon AC
Cooling Custom Water
Memory 32 GB Crucial Ballistix 3666 MHz
Video Card(s) MSI RTX 3080 10GB Suprim X
Storage 3x SSDs 2x HDDs
Display(s) ASUS VG27AQL1A x2 2560x1440 8bit IPS
Case Thermaltake Core P3 TG
Audio Device(s) Samson Meteor Mic / Generic 2.1 / KRK KNS 6400 headset
Power Supply Zalman EBT-1000
Mouse Mionix NAOS 7000
Keyboard Mionix
Capcom's Monster Hunter World, which has already proven to be a rousing success on console, will release on PC on August 9th. With the release date fast approaching, details on the performance of the game's PC port have begun to surface online. The game is regarded as rather demanding on hardware considering what is on offer visually. One of the major reasons for that is how Capcom's MT Framework engine, which powers the game, works.

As explained by William Yagi-Bacon, Capcom USA's vice president of digital platforms and marketing, on the ResetEra forums, Monster Hunter World, and by extension the MT Framework engine, uses the CPU quite heavily. The game loads the entire level into memory while maintaining a wide range of interactions and calculations between the player and the environment. As a result, it can effectively load all eight threads of an Intel Core i7-7700K at 4.4 GHz, meaning older CPUs will likely struggle and become the limiting factor when paired with newer, higher-end graphics cards.



"To eliminate interstitial loading during active gameplay, MHW loads the entire level into memory. In addition to managing assets loaded into memory, it keeps track of monster interactions, health status, environment/object changes, manages LOD & object culling, calculates collision detection and physics simulation, and tons of other background telemetry stuff that you don't see yet requires CPU cycle. This is in addition to supporting any GPU rendering tasks.

"While the MT Framework engine has been around for ages, it does a good job in distributing CPU cycles and load-balancing tasks across all available cores and threads. The engine itself is optimized for x86 CPU instruction set, is highly scalable, and loosely speaking, is platform agnostic regardless of PC or console platform so as long as it conforms to the x86 instruction set."

As pointed out by PC Gamer, the game will still be quite demanding on the graphics card. Performance at the highest settings with the resolution set to 3840 x 2160 averaged somewhere around 20 FPS on an i7-7700K + GTX 980 Ti system. Getting the game playable at those settings will likely require a GTX 1080 Ti. Even then, the odd frame drop or hiccup seems likely depending on the system's hardware.

PC Gamer also noted that they expected better framerates, as the visual quality was fairly similar to the PS4 Pro. As for the various graphics presets, the differences between them were minor and generally wouldn't be noticeable unless pushing higher resolutions. When testing the game with an i5-2500K and a GTX 970, PC Gamer noticed severe stuttering at medium settings. It seems the venerable processor may need to be retired if you're looking to spend a great deal of time with Monster Hunter World.

 
And people claimed more cores isn’t better :kookoo:
 
The 4K requirements aren't anything unusual or that bad.
 
Horrid port to be fair. It's a game for people with deep pockets to enjoy, since (and this is the first time I've said this) it'll be better to get a PS4 for this.
 
Horrid port to be fair. It's a game for people with deep pockets to enjoy, since (and this is the first time I've said this) it'll be better to get a PS4 for this.

It's not a bad port. The problem doesn't lie with graphics or optimization; it lies with the interactions and calculations needed for the game to work properly. Unfortunately, older CPUs aren't good at either.
 
As most people who got the preview (and others on streams) have already discovered, Volumetric Rendering Fog is quite heavy and makes the game look washed out.
PC Gamer is "PC" in name only, because they can't even be bothered to mess with the settings; they just use the stock presets and declare the game heavy at 4K.
They didn't even try to see how the game performs without Volumetric Rendering Fog, or with Z position set to 64-bit.
This is on the same level of bullshit as that other guy on reddit saying the game was heavy with presets at 1440p, when he didn't bother to tweak the settings, which is a normal thing to do ON PC.
They should stop putting PC on the same level as consoles and using stock presets without trying out the options in the settings.
PC Gamer in name only.
 
It's not a bad port. The problem doesn't lie with graphics or optimization; it lies with the interactions and calculations needed for the game to work properly. Unfortunately, older CPUs aren't good at either.

What's counted as old?
 
yet in a recent interview

Ryozo Tsujimoto said:
"When the game launches the visuals will have parity with the console versions," he says, "but we’re considering releasing a free update after launch."
 
I find it hard to believe chips like the 7700k would struggle when it evidently runs fine on PS4, with its 1.6GHz (albeit 8 core) processor. Let's double the clockspeed to make up for the 7700k only having 4 cores... so let's pretend the PS4 has a 3.2GHz quad core chip instead. Yeah, no... a stock 7700k is still >1GHz faster.

wat
 
yet in a recent interview
They probably meant a 4K resolution texture pack, like the one they released for FFXV.
 
I find it hard to believe chips like the 7700k would struggle when it evidently runs fine on PS4, with its 1.6GHz (albeit 8 core) processor. Let's double the clockspeed to make up for the 7700k only having 4 cores... so let's pretend the PS4 has a 3.2GHz quad core chip instead. Yeah, no... a stock 7700k is still >1GHz faster.
The 7700K is a lot faster than the poor thing in the PS4 or Xbox One. There is a lot more to CPUs than clock speed. The thing in both consoles is basically comparable to an Atom.

Worth noting that this is a console game.
1080p on PS4, 864p on Xbox One, struggling to hold 30 FPS. The Xbox One X and PS4 Pro both struggle at around 40 FPS at 1080p.
 
Honestly, maxing out a 7700K seems a bit... unbelievable. Or, basically, a lack of optimization.
 
I find it hard to believe chips like the 7700k would struggle when it evidently runs fine on PS4, with its 1.6GHz (albeit 8 core) processor. Let's double the clockspeed to make up for the 7700k only having 4 cores... so let's pretend the PS4 has a 3.2GHz quad core chip instead. Yeah, no... a stock 7700k is still >1GHz faster.

wat
plus the ps4 has potato IPC compared to SkyCofKab Lake
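
To put rough numbers on the clocks-versus-cores argument above, here's a quick back-of-the-envelope throughput comparison in C++. The relative IPC figure for Jaguar versus Kaby Lake is an assumed illustrative value, not a measurement, and the model ignores Hyper-Threading and memory effects entirely.

```cpp
// Back-of-the-envelope CPU throughput comparison: cores x clock x IPC.
// The relative IPC value for Jaguar is an assumption for illustration only.
#include <cstdio>

struct Cpu {
    const char* name;
    double cores;
    double ghz;
    double relative_ipc;  // Kaby Lake = 1.0 by definition here
};

int main() {
    const Cpu ps4_jaguar{"PS4 (8x Jaguar @ 1.6 GHz)", 8, 1.6, 0.4};  // assumed IPC
    const Cpu core_7700k{"i7-7700K (4C/8T @ 4.2 GHz)", 4, 4.2, 1.0};

    auto score = [](const Cpu& c) { return c.cores * c.ghz * c.relative_ipc; };

    // ~5.1 vs ~16.8: even ignoring Hyper-Threading, the desktop quad has
    // roughly 3x the raw throughput of the console CPU under these assumptions.
    std::printf("%-28s %.1f\n", ps4_jaguar.name, score(ps4_jaguar));
    std::printf("%-28s %.1f\n", core_7700k.name, score(core_7700k));
}
```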
 
It's a lack of optimization. The game doesn't look that pretty. Also, when adjusting various settings the performance obviously changes, and from what I have read from the user who tested it at ResetEra:

FluffyQuack said:
Here's a comparison of the "Volume Rendering Quality" setting at highest and off.

Highest (54 fps): https://cdn.discordapp.com/attachme...71556035044179981/582010_20180725075413_1.png
Off (78 fps): https://cdn.discordapp.com/attachme...71556022440558592/582010_20180725075426_1.png

I imagine it'll vary on the person if they like the effect or not, but I recommend this is the first thing people tinker with, especially since it has such an enormous impact on performance. I think I'll play with the setting turned to off.

A 4790 running this is no problem, especially after turning off that ugly effect.
 
like any mmo ... a typical case of "no [insert random swearing] Sherlock!"

even if it is a bit more hungry than your typical mmo
 
It's not a lack of optimization per se. The optimization targets are simply different. Primarily due to hardware constraints and the drive for prettier graphics, the compromise for most console games has been 30 FPS; that is what they are optimized for. Graphics scale a lot better to higher FPS; everything else that relies on the CPU often does not.
 
It's not a lack of optimization per se. The optimization targets are simply different. Primarily due to hardware constraints and the drive for prettier graphics, the compromise for most console games has been 30 FPS; that is what they are optimized for. Graphics scale a lot better to higher FPS; everything else that relies on the CPU often does not.
Okay, so if they can't optimize the game, they can compromise on the better graphics for something that doesn't take a dump on high-end hardware and limit who can play the game. The game should be able to run on an i5/1060 or something of the sort.
 
Just buy a PS4 and play it. They are not crazy enough to move the PS4's CPU optimisations onto the GPU for PC; too much work for too little money.
 
Just buy a PS4 and play it. They are not crazy enough to move the PS4's CPU optimisations onto the GPU for PC; too much work for too little money.
For 1080p @ 30 FPS and what currently looks like medium-ish details, a fairly modest PC will do.
 
It's not a bad port. The problem doesn't lie with graphics or optimization; it lies with the interactions and calculations needed for the game to work properly. Unfortunately, older CPUs aren't good at either.
It runs on a Jaguar CPU on the PS4, how can that be heavy for a desktop CPU?
 
So, 8 Jaguar cores on PS4 Pro are as good as 4 Kaby Lake cores + Hyperthreading at almost 3 times higher frequency? Nice. Fortran?
 
It's not a bad port. The problem doesn't lie with graphics or optimization; it lies with the interactions and calculations needed for the game to work properly. Unfortunately, older CPUs aren't good at either.

Explain to me how those measly Jaguar cores at low clocks are faster than an Intel quad with HT. Until then, I call BS; it's just a bad excuse for a bad port that all the gullible idiots will fall for.

They aren't lying, the port is CPU heavy - but that's just because it's a bad port. This message only confirms it.

"MHW loads the entire level into memory"
They bring that up as if it's something new... this is hilarious
 
As most people who got the preview (and others on streams) have already discovered, Volumetric Rendering Fog is quite heavy and makes the game look washed out.
PC Gamer is "PC" in name only, because they can't even be bothered to mess with the settings; they just use the stock presets and declare the game heavy at 4K.
They didn't even try to see how the game performs without Volumetric Rendering Fog, or with Z position set to 64-bit.
This is on the same level of bullshit as that other guy on reddit saying the game was heavy with presets at 1440p, when he didn't bother to tweak the settings, which is a normal thing to do ON PC.
They should stop putting PC on the same level as consoles and using stock presets without trying out the options in the settings.
PC Gamer in name only.

I have a Vega 64; according to most reviewers, it's not capable of 4K gaming.
Granted, competitive games at 4K are a no-go, but single-player games at 60 FPS are easily done at 4K with some OC on the Vega and reasonable settings that still far exceed any quality settings you get at 1440p.
If you can't max it out, it's "not 4K capable", which is somewhat of an odd conclusion, yet when benchmarking a 2400G, GT 1030, or GTX 1050 Ti they drop settings and say those are 1080p capable and whatnot.
 