
Senua's Saga: Hellblade II Performance Benchmark

Crazy demanding: a 4090 gets sub-60 FPS at 4K, so upscaling is required for this game.

In my opinion, Unreal Engine is still very responsive even if you only get around 25 FPS on average.

I remember playing the first Hellblade on weak hardware at very low FPS, but the game was strangely (very) responsive.

Even while recording this game, it stayed responsive, which you can see here.


 
Wow, 2560x1440 with a 3090 gets 56 FPS.
I'd imagine 3440x1440 would hover around 40-45.
 
When I played the game with a 4070 Ti 12 GB, I had several FPS drops from totally playable down to an unplayable 5 FPS. Dropping graphics settings helped restore the game to playable, but to run the whole game uninterrupted by drops, my guess is that I'd have to play it on medium 1440p or high 1080p. Let's wait for some updates to the game and drivers so I can try it again.
 
I would like to request that TPU stop using max settings for 4K. High is plenty for most players at that resolution, and any of the 16+ GB GPUs from Nvidia or AMD should deliver much higher frame rates, maybe like what you are getting at 1440p max. I got the first game from Epic, but because of that I totally forgot about it. Another reason Epic is nowhere near as good as it should be, in my opinion.
This is ridiculous and on the level of 'tweak your review for my preference'. If you want performance at a specific setting or setup other than maxed out, there's YouTube.
 
In my opinion, Unreal Engine is still very responsive even if you only get around 25 FPS on average.

I remember playing the first Hellblade on weak hardware at very low FPS, but the game was strangely (very) responsive.

Even while recording this game, it stayed responsive, which you can see here.


Based on other people's reviews, I believe most are saying this game is still playable at around 40 FPS.
 
They did a pretty good job last gen in rasterization and an OK one this generation, but next generation isn't looking so good for them, likely matching a 5070 at best, lol.
I think they have it in their minds to 'sell us' FSR3 in every game by then. As if that's going to work; it's almost as bad as 'you can CrossFire two RX 480s and reach performance parity'.
 
You forgot to mention that this game is the only one in 20+ years to use software 3D sound, which sounds too good for this game, especially if you are wearing good-quality headphones. On my 4.1 speakers it reminded me of playing F.E.A.R. with EAX 5.0 all those years ago.
Pleasantly surprised by this.

 
This is the first game I've played that truly looks "next gen" since the launch of the PS5/XSX. The graphics are eerily realistic at times. It's really a work of art. As a game it's OK; it will keep you engaged for the most part.
 
FSR suffers from shimmering while moving, and it's not good even at 4K. While FSR is good in the latest Horizon and in GoT, in this game it's not. I played a few hours at 1440p High with a 7800 XT; it's not 60 FPS, but it runs OK at 40-60. Even around 30 FPS at native 4K it's still playable.
 
In my opinion, Unreal Engine is still very responsive even if you only get around 25 FPS on average.

I remember playing the first Hellblade on weak hardware at very low FPS, but the game was strangely (very) responsive.

Even while recording this game, it stayed responsive, which you can see here.



In my case I use Linux too (Xubuntu), and it runs OK, around 60 FPS without recording, with custom options like this in Engine.ini:

[SystemSettings]
r.AmbientOcclusionLevels=0
r.BloomQuality=0
r.DefaultFeature.AntiAliasing=0
r.DepthOfFieldQuality=0
r.fog=1
r.MaxAnisotropy=4
r.MotionBlurQuality=0
r.PostProcessAAQuality=0
r.SceneColorFringeQuality=0
r.ShadowQuality=0

on a Ryzen 5 4600G OC'd to 4300 MHz on all cores and a Vega 7 iGPU OC'd to 2300 MHz, with a Scythe Mugen 5 Black Edition; I use a low resolution too in a virtual desktop

and the game actually runs stable in my case


:)
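For anyone who wants to apply the same cvar overrides without hand-editing the file, a small script can merge them into an Engine.ini. This is just a sketch: the Engine.ini location differs per game and per platform (on Linux it sits inside the Proton prefix), so the path passed to `apply_tweaks` is an assumption you need to supply for your own install.

```python
# Sketch: merge the [SystemSettings] tweaks quoted above into an Engine.ini.
# NOTE: the Engine.ini path is install-specific; back up the original first.
import configparser
from pathlib import Path

TWEAKS = {
    "r.AmbientOcclusionLevels": "0",
    "r.BloomQuality": "0",
    "r.DefaultFeature.AntiAliasing": "0",
    "r.DepthOfFieldQuality": "0",
    "r.fog": "1",
    "r.MaxAnisotropy": "4",
    "r.MotionBlurQuality": "0",
    "r.PostProcessAAQuality": "0",
    "r.SceneColorFringeQuality": "0",
    "r.ShadowQuality": "0",
}

def apply_tweaks(ini_path: Path) -> None:
    """Add or update the tweaks in [SystemSettings], keeping other sections."""
    cfg = configparser.ConfigParser()
    cfg.optionxform = str  # preserve the case of UE cvar names
    if ini_path.exists():
        cfg.read(ini_path)
    if not cfg.has_section("SystemSettings"):
        cfg.add_section("SystemSettings")
    for cvar, value in TWEAKS.items():
        cfg.set("SystemSettings", cvar, value)
    with ini_path.open("w", encoding="utf-8") as f:
        cfg.write(f, space_around_delimiters=False)  # UE wants key=value
```

Note that `configparser` will choke on some UE ini constructs (duplicate or "+"-prefixed keys), so this only suits simple files like the one above.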
 
This is the first game I've played that truly looks "next gen" since the launch of the PS5/XSX. The graphics are eerily realistic at times. It's really a work of art. As a game it's OK; it will keep you engaged for the most part.

I agree; at a minimum it's the first UE5 game that gets me a little excited about the engine.
 
Well, it should have been CLEAR from the get-go that a native 1440p display, not 4K, is the "optimal" one to pair with an RTX 4090.
Because most did not grasp this, they will be left blaming game devs for "unoptimized ports" and/or forced to use gimmicks such as "frame gen" and upscalers for decent performance.
With a native 1440p display none of this is required: just a pristine native image with no visual artifacts or negative impact on input lag, just as the PC gods intended.

If you have an RTX 4090, the optimal display is something like 1440p/240 Hz. This way you can play the latest AAA RPGs from UE5, etc. at 90+ FPS, and more demanding first-person shooters (not talking esports, but games such as Halo Infinite, Destiny 2, etc.) at up to 240 FPS. It's funny how many seem to think 1440p is suddenly beneath them when in actuality it's the ideal resolution.
 
Well, it should have been CLEAR from the get-go that a native 1440p display, not 4K, is the "optimal" one to pair with an RTX 4090.
Because most did not grasp this, they will be left blaming game devs for "unoptimized ports" and/or forced to use gimmicks such as "frame gen" and upscalers for decent performance.
With a native 1440p display none of this is required: just a pristine native image with no visual artifacts or negative impact on input lag, just as the PC gods intended.

If you have an RTX 4090, the optimal display is something like 1440p/240 Hz. This way you can play the latest AAA RPGs from UE5, etc. at 90+ FPS, and more demanding first-person shooters (not talking esports, but games such as Halo Infinite, Destiny 2, etc.) at up to 240 FPS. It's funny how many seem to think 1440p is suddenly beneath them when in actuality it's the ideal resolution.
Nah, it's simply that games are always a moving target, and graphics cards are always looking for ways to upsell you on a new gen.

The 4090 isn't slow, but you can make it slow, no problem. That's what RT is for. Now with Nanite and UE5 we get to see what's really happening in graphics advances. Not some overmarketed POS that hardly anyone can use properly and that needs dev TLC; just in-engine improvements that utilize new tech. The 4090 is still fast, but it's just about as obsolete as the handful of cards below it when you consider actual maxed-out performance in FPS. All current cards are fighting for playable frames when maxed.

We've finally arrived at a new graphics paradigm, one hinted at with DX12, then Crytek's Neon Noir, and after a rocky start in UE5 we're starting to see it bear fruit. This is the way forward that the industry will carry, not this stupid RTX push that still gets scaffolded by bags of Nvidia money and marketing only.

And guess what, you don't need proprietary hardware for it. You don't need a 4090. Nobody ever did, and any recommendations regarding said card or fitting resolutions are just nonsense from people who don't understand gaming or its history. The 1440p GTX 1080 was relegated to a 1080p card within three years. The examples of this are endless. Resolution is of very little relevance to anything. Even when 4K became more feasible, the early 4K games didn't even look better, because the assets in them didn't progress with the resolution bump. So you got large empty spaces on models. Yay. On the flip side, when 4K became more feasible, developers saw the advantage of higher-fidelity assets, and you can now enjoy them even at 1080p :)

Resolution is not, will not, and cannot be a factor to measure things by when it comes to gaming. It fails. Hard. All you need is time. That's also why it's very unwise to move to 4K and consider it the norm, though there is the fortunate side effect that you can use a near-native 1080p representation.
 
Nah, it's simply that games are always a moving target, and graphics cards are always looking for ways to upsell you on a new gen.

The 4090 isn't slow, but you can make it slow, no problem. That's what RT is for. Now with Nanite and UE5 we get to see what's really happening in graphics advances. Not some overmarketed POS that hardly anyone can use properly and that needs dev TLC; just in-engine improvements that utilize new tech.

We've finally arrived at a new graphics paradigm, one hinted at with Crytek's Neon Noir, and after a rocky start in UE5 we're starting to see it bear fruit.
And guess what, you don't need proprietary hardware for it. You don't need a 4090.

Although the 4090 is the only card that performs OK in it, so while it's not needed, it definitely gives a much better experience. I would say the 4080/7900 XTX are OK at 1440p, though. If the RDNA4 GPU does end up in the ballpark of a 7900 XT/XTX, it's going to look pretty sad in any UE5 game pushing the engine harder than this one.


I am excited about what The Coalition can do with this engine; they were probably the only UE4 developer I liked for how they managed that engine.
 
Although the 4090 is the only card that performs OK in it, so while it's not needed, it definitely gives a much better experience. I would say the 4080/7900 XTX are OK at 1440p, though. If the RDNA4 GPU does end up in the ballpark of a 7900 XT/XTX, it's going to look pretty sad in any UE5 game pushing the engine harder than this one.


I am excited about what The Coalition can do with this engine; they were probably the only UE4 developer I liked for how they managed that engine.
Did you look at the comparison shots between Low and High, though? There's barely a difference, but a huge framerate gain.

You don't need a top-end card; you never did. It's a massive waste of money, and they drop in value much harder than anything else. I get the motivation to buy them, but never in the history of gaming has a top-end card defined or made anything happen. It's the midrange and some part of the high end that determines where the market's at, and therefore where the games are at.
 
Did you look at the comparison shots between Low and High, though? There's barely a difference, but a huge framerate gain.

You don't need a top-end card; you never did. It's a massive waste of money, and they drop in value much harder than anything else.

In person the difference is pretty obvious to me, even medium to High. And only DLAA looks good to me, AA-wise.

I agree it doesn't look bad on Low settings, though.
 
In person the difference is pretty obvious to me, even medium to High. And only DLAA looks good to me, AA-wise.
Sure, but that's the crucial difference between 'need this to play games' and 'want it because I have enough cash to blow'.
 
Sure, but that's the crucial difference between 'need this to play games' and 'want it because I have enough cash to blow'.

I've only played a couple of hours, but there are sections much heavier than whatever was benchmarked; the beginning section is around 20 FPS lower.
 
The game was made for the Xbox console, which is why the High settings look practically the same.
 
When I played the game with a 4070 Ti 12 GB, I had several FPS drops from totally playable down to an unplayable 5 FPS. Dropping graphics settings helped restore the game to playable, but to run the whole game uninterrupted by drops, my guess is that I'd have to play it on medium 1440p or high 1080p. Let's wait for some updates to the game and drivers so I can try it again.

 
FSR suffers from shimmering while moving, and it's not good even at 4K. While FSR is good in the latest Horizon and in GoT, in this game it's not. I played a few hours at 1440p High with a 7800 XT; it's not 60 FPS, but it runs OK at 40-60. Even around 30 FPS at native 4K it's still playable.
The shimmering is gone and overall upscaling quality is much better with the lukefz FSR mod, so the devs simply did a poor FSR implementation in this case. They also didn't bother to add FSR3 frame generation.
 
I've only played a couple of hours, but there are sections much heavier than whatever was benchmarked; the beginning section is around 20 FPS lower.

I played the first Hellblade at low FPS. In the most extreme cases it dropped to 12 FPS. Despite this, I still found the game a very fun experience, and I was able to easily complete the battles on the highest difficulty level. I feel like Unreal Engine 4 was designed in such a way that you can still have an acceptable experience at low FPS. On Arch Linux it was like that at the time.
 
I played the first Hellblade at low FPS. In the most extreme cases it dropped to 12 FPS. Despite this, I still found the game a very fun experience, and I was able to easily complete the battles on the highest difficulty level. I feel like Unreal Engine 4 was designed in such a way that you can still have an acceptable experience at low FPS. On Arch Linux it was like that at the time.

It might just be a personal thing from being used to playing games at 90-120 FPS or higher, but if I'm not getting 60 or above I'm not playing it, and if it's a game I really want to play, I'm buying new hardware.
 
It might just be a personal thing from being used to playing games at 90-120 FPS or higher, but if I'm not getting 60 or above I'm not playing it, and if it's a game I really want to play, I'm buying new hardware.

I'm not saying I wouldn't have preferred to play the game at a higher framerate. I did notice the dips to 12 FPS, and that was the point where it bordered on completely unplayable.

But I really mean it: despite that low frame rate, I still think it's one of the best games I've ever played. In the end, it seems to me to be a game that doesn't rely heavily on high FPS for a good experience.

As for the cinematic parts, you could point out that films in the cinema run at roughly 24 FPS. I found the combat fairly easy, even on the highest difficulty. For example, I found NieR: Automata on medium difficulty more challenging. Even in the fights, I never felt that the low FPS greatly lowered my experience.

For solving the puzzles, I don't think high or low FPS has a very big impact on your experience either. Overall, it doesn't seem like the type of game where very high FPS adds much to the experience.

My graphics settings were high/very high, and the game was responsive despite the lower FPS.
It seems I found nice graphics more important in this game than high FPS.

For example, playing Hellblade 2 on Low settings on an RX 580 would not be a problem for me in terms of FPS:
graphically it wouldn't be a problem either, because the 'Low setting for 1080p (no FSR)' looks better to me than Hellblade 1 at High settings.
 