
Post your Monster Hunter Wilds benchmark scores

7800X3D -20 CO
7900GRE mem@2380 990mv UC
Expo 32GB ram

1440p RT/Frame Gen OFF
Screenshot 2025-02-08 113630.png
 
Big fan of Monster Hunter here... and I'm pissed at this clusterf... trainwreck.
Why is DLSS forced on just to achieve a decent framerate?
This is bad, Capcom. Like, really bad.

After the first run I changed DLSS from Balanced to Quality and
the score went down to 72 fps,
then I raised anisotropic filtering to 16x,
shadow distance from medium to high,
and ambient occlusion from medium to high.

Frames stayed at 72.

Ultra native 53fps
High native 56fps

Ultra DLSS on Quality 66fps

Nvidia needs to work on their drivers too, but Capcom, really???
They didn't even consider Steam's most common systems for a moment:
a 5600/5600X or 12400-12600
with a 2060/3060.
Don't even mention the GTX 1660 from their own system requirements list.

They need to optimize this game ASAP.

fdfdfd.jpg
ffg.jpg
ggfgd.jpg
fdfdf.jpg
 
Preset High, DLSS Quality, no bloom & motion blur
Capture d'écran 2025-02-07 071454.png


Preset Ultra, FSR Performance + FG enabled
Capture d'écran 2025-02-09 062641.png
 
Ultra, FXAA+TAA, FSR and RT disabled.
 

Attachments

  • 20250206_224226.jpg
Had to go down to the High preset due to VRAM limitations on the RTX 4060, else it was a stutter fest. High preset with DLSS on Quality. No FG or RT enabled.

View attachment 383757

1440p Ultra preset with DLSS Quality and no FG or RT on. I'm definitely CPU limited here, so with a better CPU I bet I could get a higher score. Unfortunately my monitor is native 1440p and even with Nvidia DSR enabled I can't choose a higher resolution. I would have liked to see my 4K score, but it seems that's not possible.

View attachment 383758
I only have a 1080p monitor. Want to share results, but I don't have a screenshot at this time. Edit: same GPU, RTX 4060.

Using a 14700K with different CPU settings does not have an effect on the outcome of this benchmark. Like, at all.

Ran this from 4 cores at 6100 MHz all the way down to stock. Average 1080p scores with your exact settings produced almost exactly the same result, about 73 FPS.

This benchmark also seems not to care about GPU clock speed; only minor differences in scores there as well.

It's almost as if the shader count matters the most and that's it.
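
One way to back that up is to compare the spread across the runs against typical benchmark run-to-run variance (roughly 1-2% is common). A minimal Python sketch, using made-up example scores rather than the actual run numbers:

    # Compare the spread of benchmark runs against typical run-to-run noise.
    # The scores below are hypothetical examples, not measured results.
    scores = [73.1, 72.8, 73.4, 72.9]          # average FPS from repeated runs
    spread_pct = (max(scores) - min(scores)) / min(scores) * 100
    print(f"spread: {spread_pct:.2f}%")         # ~0.8% here
    print("within typical noise" if spread_pct < 2.0 else "likely a real difference")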
 

That differs from what I see on my 3070 at 1440p.

Stock 1905 MHz cores, 1750 MHz VRAM: 48.28 FPS (31 FPS observed low)
OC 2100 MHz cores, 2100 MHz VRAM: 54.06 FPS (35 FPS observed low)

I'll take a free 12% every time.
 
That's pretty good. I haven't tried this with 1440p or the 4070 Super quite yet.

I will say I was able to crash the benchmark a few times..... haha.
 
Whoops! I realized you posted at 1080p, let's see what changes:

Same High preset settings, but no upscaling, just TAA (forgot to mention that above)

Stock 1920 MHz cores, 1750 MHz VRAM: 63.08 FPS (39 FPS observed low)
OC 2115 MHz cores, 2100 MHz VRAM: 69.03 FPS (43 FPS observed low)

Only a 9.4% improvement at 1080p, but I'll still take it.
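
For reference, those scaling percentages are straightforward to reproduce. A minimal Python sketch using only the stock/OC averages posted above:

    # Percentage gain from the overclock, using the averages posted above.
    def pct_gain(stock_fps, oc_fps):
        return (oc_fps / stock_fps - 1.0) * 100.0

    print(f"1440p: +{pct_gain(48.28, 54.06):.1f}%")  # ~12.0%
    print(f"1080p: +{pct_gain(63.08, 69.03):.1f}%")  # ~9.4%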
 
7800X3D -20 CO
7900 XT mem@2630 1075mv
Expo 32GB RAM 6000 CL30

1440p RT/Frame Gen OFF

1739130811938.png
 
Ran it all at default, closed it, then ran it all at default again; the benchmark picked all the settings itself.
The RTX A2000 is the 6GB model at default clocks.

16995.png
 
I was away for the weekend, just came back and gave it another go:

without FG, same settings
Monster Hunt no FG.jpg


without FG & FSR, same settings:
Monster Hunt no FG & FSR.jpg
 
[Picture of a Steve]
"8GB of VRAM."
Games still have to run on current consoles (which is why 8GB is usable), but what everyone is talking about is 1440p and greater, as base console resolution shrinks to 1080p and lower at <60fps.
Along with (currently) mostly PC-exclusive features. This means 1440p and higher, along with those higher-end settings, require more compute power and a buffer that grows in step.
FWIW, I haven't watched any of their vids in a while (because I've been worn out from trying to decipher all the 50-series bs, not bc of them), and last I recall he/they suggested 16GB; that is a fair opinion.
That has always been the argument. Wizard is/was on the '8GB is enough' side forever (perhaps bc of the former rationale), which is the joke. People realize increasingly lower console (base) settings at diff times.
I say it truly all depends, but when you factor in things like high-rez textures, upscaling, RT, FG, GI, PT, RR, etc, 12GB becomes a limitation with the ideal closer to 18GB. That's all I'm trying to get across.
SWO is a perfect example of where we're going. That game runs at 720p (8GB is in fact enough...for 720p) on current consoles and will use 18GB with all the options turned on at 1440p.
I like using SWO, some like Hogwarts (which is >12GB at any rez w/ RT). It's pretty clear next-gen is aimed at 1440p->4k upscaled (non-RT) and 1080p->4k RT w/ 16GB or more, worst-case.
Really, probably ~14.5-17.5GB. Point is, more than 12GB and up to 18GB. Conceivably 16GB and then 18GB needed for GPUs to keep up (but probably not 18GB at first bc that would ruin the player base). >12GB is NEEDED.
It's totally fine if you don't play demanding games or use those settings, but if you want to be prepared for the reality of base 1440p or even 1080pRT upscaled to 1440p/4k, you want at least 16GB.
For the one millionth and 42nd time, this is why >$300 12GB GPUs are a sham. I don't want people believing it will be a good option for 1440p (or 1080pRT upscaled) going forward (in many instances).
It's the same way you shouldn't expect 16GB to be enough for native 4k or 1440p->4k (esp RT). This is why 5080 is a sham. I know some will argue the GDDR7 bw, so let's argue bw vs buffer and stutters.
The reason why you need a larger buffer allocation is so you don't need to refill the texture cache constantly, which can cause stutter.
RAM can be fast to swap (like on a 5080 that attempts to use bandwidth to replace the extra RAM it really needs), but if you need a texture right now and it isn't allocated in the buffer, a swap can (and often does) cause a stutter.
This is why in MH (et al) there are hallways or w/e in-between biomes: to switch the buffer so GPUs don't stutter. Not all games are designed that way and/or have that luxury, esp if large open-world.
I know, it's a struggle. I've been familiar with this since the 970 back in the day (running it in a bunch of games/scenarios) and trying to figure out wtf was happening. It's made sense ever since.
I figure this is probably why Radeons are pretty-much designed to never run out of buffer (for their absolute compute capability in gaming). I am thoroughly convinced ATi/AMD does it to avoid that.
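
To put rough numbers on the bandwidth-vs-buffer point: no matter how fast the VRAM is, an asset that isn't resident has to cross the PCIe bus first, and that transfer alone can eat several 60 fps frames. A back-of-envelope Python sketch; the ~28 GB/s effective PCIe 4.0 x16 rate and the swap sizes are assumptions for illustration:

    # How long does streaming in non-resident texture data take,
    # measured against a 16.7 ms frame budget at 60 fps?
    # All numbers are illustrative assumptions.
    PCIE_GBPS = 28.0                 # assumed effective PCIe 4.0 x16 throughput, GB/s
    FRAME_MS = 1000.0 / 60           # ~16.7 ms per frame at 60 fps

    for swap_gb in (0.25, 0.5, 1.0, 2.0):   # hypothetical amount of data to swap in
        ms = swap_gb / PCIE_GBPS * 1000.0
        print(f"{swap_gb:.2f} GB swap ≈ {ms:5.1f} ms ≈ {ms / FRAME_MS:.1f} frames at 60 fps")

Which is exactly why engines hide those swaps behind hallways and loading corridors instead of doing them mid-combat.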
4k ultra DLSS no frame interpolation.
View attachment 383418

Curious if it can keep 4K60 DLSS MAX RT mins w/ other AA. Haven't looked, are you saying it forces TAA?
Wonder if it's one of those "no hotspot monitoring" situations: forking PresentMon so you can't see the native framerate during FG (which I don't know if it's bc of bad perf or bc showing it would prove it could be done on older cards).
If it makes them look bad, they hide it by making it impossible to know. I figured the DLSS4 perf hit would impact generally-accepted settings on some cards...wonder if it's the case here? Perhaps with FG?
That said, if 4K60 mins DLSS MAX RT (non-ghosting AA or native DLAA) isn't possible on a 4090, I would argue Capcom needs to optimize the game to that standard. That's on them imho, not the 4090.
Edit: I guess what I'm wondering is what happens when using other (more-demanding) anti-aliasing methods, and whether that's why they're not there. Edited for clarification.

Apologies if this is later in the thread, I'm just taking it in and registering thoughts as I go.

I asked bc I ordered it on PS given my whole friends list is there and it's just easier. I don't need a benchmark to tell you the beta runs very 'not well' (bc the PS5 is obviously getting pretty old).
Hopefully they can polish it up decently for release. I wouldn't even mind if it dropped to 48fps mins on the regular PS5, as that's the lower VRR window, granted PRO users prolly deserve locked 60, or close.
I honestly haven't studied all their settings, but saw there's a 120hz mode on console. Is that locked 40fps in fidelity? It should be if it isn't. I always use performance (and it was still rough).
This is one of those games they should've optimized for whatever they can do @ 40 locked on a base PS5 imho, 48 lowest on pro. 60 asking a lot, esp on the base.
I guess that would potentially screw over 12GB GPU owners @ 1440p native though, and I get for this game they want the user base as large as humanly possible...so it makes sense to not do that.
That's my whole point though: It would be better if it were targeted that way, but that would then require 16GB GPUs for really nice 1440p. More games will probably be that way soon regardless, I would imagine.
The reason why: think of the PS5 CPU. If the PS5 CPU is 8 cores @ 3.5 GHz, the PS5 Pro 8 @ 3.85 GHz, and the PS6 a 12-core at around PS5 Pro clocks (or more), what's the common-sense base?
It's around 1080p40 (0.5x scale), or 720p60 (0.33x scale), right? Scale that to 1440p60 native. That would (perhaps) require more than 12GB. That's not what they did (so 12GB is okay), but that's where we're going.
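
Writing that comparison out explicitly (a naive cores x clock estimate; the PS6 figures are pure speculation from the post above, and real games don't scale linearly with either):

    # Naive cores-x-clock throughput comparison of the consoles mentioned above.
    ps5       = 8 * 3.50    # 8 cores @ 3.5 GHz
    ps5_pro   = 8 * 3.85    # 8 cores @ 3.85 GHz
    ps6_guess = 12 * 3.85   # speculative: 12 cores around PS5 Pro clocks

    print(f"PS5 Pro vs PS5:   {ps5_pro / ps5:.2f}x")    # ~1.10x
    print(f"PS6 guess vs PS5: {ps6_guess / ps5:.2f}x")  # ~1.65x

So a ~1.6-1.7x CPU uplift is roughly the jump being described from a ~1080p40 base toward 1440p60.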
This is shown by use of the high-res texture pack...
Yep, uses 18+ GB even in 1080p with the highest textures. 16 GB is actually required for this setting o_O
Is it strange, or rather a prime example of the whole point of what I've been trying to explain?

You be the judge.

I don't want to hear about 12GB struggle running high-rez (1440p?) textures when it will have the low-rez console (1080p? Less?) texture option available. The '8/12GB is enough' option. :p

I guess it's more like a "monsters-are-nightmare-inducing-rudimentary-polygons" struggle in this case, rather than stutter (likely to avoid said stutter until there can be a possible swap, if able). Same point applies.

Please don't get too hyped up about what I'm saying; it's a conversation. If you think I'm wrong, prove it (which helps everyone).
I'm pushing for better understanding/preparation for everyone; improvements made by the developers and/or hardware manufacturers for standards...that's all. It's not an attack on anyone. :love:
 
Last edited:

I've deleted it, but RT is very light; it's at least 60 fps at 4K with TAA. It doesn't force TAA, but it looks really bad with no AA.

I don't remember any portion under 60 at 4K.
 
2560x1080 Medium DLSS Performance.

Screenshot (13).png
 
I've deleted it, but RT is very light; it's at least 60 fps at 4K with TAA. It doesn't force TAA, but it looks really bad with no AA.

I don't remember any portion under 60 at 4K.

Thanks. I kinda think it looks pretty rough generally regardless (imho), but these games have always been a 'little' janky. I don't remember them looking THIS soft, though. At least not World.
Rise, IG, but that was a Switch port so w/e.

Hopefully they add DLAA/other options. TAA does in fact kind of suck (in most cases). One of those things I certainly wouldn't recommend unless you need it to make the game playable w/o looking horrible.
 
Shader Comp = ~7mins35secs
Win11Pro 23H2 Build 22631.4751

20250209204058_1.jpg

20250209204738_1.jpg

20250209205754_1.jpg

Do all Monster Hunter games look this ass?

The actual visuals popping in and lack of detail at medium range make me want to look up DF videos about past games in the franchise...
 
Thanks. I kinda think it looks pretty rough generally regardless (imho), but these games have always been a 'little' janky. I don't remember them looking THIS soft, though. At least not World.
Rise, IG, but that was a Switch port so w/e.

Hopefully they add DLAA/other options. TAA does in fact kind of suck (in most cases). One of those things I certainly wouldn't recommend unless you need it to make the game playable w/o looking horrible.

It's a very meh looking game. Nothing but maybe mods will save it.
 
Do all Monster Hunter games look this ass?

The actual visuals popping in and lack of detail at medium range make me want to look up DF videos about past games in the franchise...

Go back and look at World in general. While there is purposely lower attempted draw distance (smaller arenas/vistas etc), it really doesn't imho.
I mean, it kinda does, but MH is just 'like that'; also it was PS4/xb1, and the roots go back to mobile....so you just have to kind of roll w/ it.

I truly feel tho, and perhaps you and ox agree, this one is very much a case of their reach exceeding their grasp. The current consoles simply hinder it too much for what they're trying to accomplish, I think.
That, and perhaps culling to make memory requirements more manageable w/o performance issues.

Like I said, maybe if they were shooting for 1080p40 or 720p60 it could fix a lot of things, but shooting for 1080p60 or higher scaled to 4K on the base PS5 just really cuts down what they can do imho.

Maybe a revamp will happen for the expansion around the time of the PS6? One can hope? I dunno.

Will defer final judgement until final release... or update 1... or update 2 later.
 
4K, DLSS Quality + RT + FG

1739202061607.png


And here's without FG:

1739202832802.png


I have it running at +400 on core and +2000 on memory.
 
Last edited:
High preset using native res with upscaling, FG and RT disabled, on an OC'd system:

hunter.jpg


No CPU bottleneck to speak of. GPU utilization was at 98-99%, briefly dropping on camera change :rockout:
 
I know y'all been waiting for this, I got you covered:

Ryzen 7840HS
Radeon 780m (8GB dedicated to GPU)
32GB 5600 DDR5 (24GB for CPU)

720p TAA, Low with required settings improvements to turn top tier dogsh*t into merely your typical everyday dogsh*t:
All Shadows to Low (some are off or Lowest)
AO on Med
Models on Med
Textures on High (woo 8GB VRAM!)

33.35 FPS, 20 FPS minimum observed in grassy area

MHW 780m 720p TAA Low setting, Low Shadows + AO, Med models, Hi Textures.jpg


I could probably maybe play this if I didn't find Monster Hunter games inscrutable. I'mma try to see if 1024x576 is available via .ini file.
 