Friday, October 27th 2023

PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required

"Alan Wake II," released earlier this week, is the latest third person action adventure loaded with psychological thriller elements that call back to some of the best works of Remedy Entertainment, including "Control," "Max Payne 2," and "Alan Wake." It's also a visual feast as our performance review of the game should show you, leveraging the full spectrum of the DirectX 12 Ultimate feature-set. In the run up to the release, when Remedy put out the system requirements lists for "Alan Wake II" with clear segregation for experiences with ray tracing and without; what wasn't clear was just how much the game depended on hardware support for mesh shaders, which is why its bare minimum list called for at least an NVIDIA RTX 2060 "Turing," or at least an AMD RX 6600 XT RDNA2, both of which are DirectX 12 Ultimate GPUs with hardware mesh shaders support.

There was some confusion on gaming forums over the requirement for hardware mesh shaders. Many people assumed that the game would not work at all on GPUs without mesh shader support, locking out lots of gamers. In the course of testing for our performance review, we learned that while "Alan Wake II" does rely on hardware mesh shaders, their absence does not break gameplay. You will, however, pay a heavy performance penalty on GPUs that lack the feature. On such GPUs, the game shows a warning dialog that the GPU lacks mesh shader support (screenshot below), but you can ignore the warning and play anyway. The game treats mesh shaders as a "recommended GPU feature," not a requirement. Without them, expect a severe performance loss, best illustrated by the AMD Radeon RX 5700 XT, based on the original RDNA architecture, which lacks hardware mesh shaders.
In our testing at 1080p, without upscaling, the RX 5700 XT performs worse than the GeForce GTX 1660 Ti. In most raster-only titles, the RX 5700 XT with the latest AMD drivers is known to perform about as fast as an RTX 2080; here it lags behind the GTX 1660 Ti. It's important to note that the GTX 16-series "Turing," while lacking the RT cores and Tensor cores of its RTX 20-series cousins, does feature hardware support for mesh shaders, and hence performs along expected lines. We have included a projection of how the RX 5700 XT typically fares in our testing: it lands roughly in the performance region of the RTX 3060 and RX 6600 XT. AMD's Radeon RX 6000 series (RDNA2) and current RX 7000 series (RDNA3) fully support hardware mesh shaders across all GPU models.
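For the technically curious: detecting hardware mesh shader support in Direct3D 12 is a single capability query. Below is a minimal sketch of how any D3D12 application could perform the check (our own illustration, not Remedy's actual code), assuming the Windows SDK D3D12 headers and linking against d3d12.lib:

```cpp
// Minimal sketch: query hardware mesh shader support in Direct3D 12.
// Assumes a Windows SDK new enough to define D3D12_FEATURE_D3D12_OPTIONS7.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter; 11_0 is the minimum level for
    // device creation, and the mesh shader capability is queried separately.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No Direct3D 12 capable device available.");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    const bool meshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &options7, sizeof(options7)))
        && options7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;

    // A game could branch here: warn the user and fall back to the slower
    // traditional vertex-shader geometry path, exactly the "recommended
    // feature, not a requirement" behavior described above.
    std::printf("Mesh shaders: %s\n", meshShaders ? "supported" : "NOT supported");
    return 0;
}
```

On the GTX 16/RTX 20-series and newer, and on all RX 6000/7000 cards, this query reports support; on "Pascal" and RDNA1 it comes back not supported, which is what triggers the game's warning dialog.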

That doesn't mean the RX 5700 XT delivers unplayable results. 1080p at 60 FPS is within reach at the lowest settings, or at close to maximum settings with FSR Quality, which is not a terrible tradeoff, though you still have to make compromises. We didn't spot any rendering errors or crashes.

Once we knew the RX 5700 XT works, we also wanted to test the NVIDIA side of things. Using the GeForce GTX 1080 Ti "Pascal," the flagship GPU of its generation, we were greeted with the same warning dialog as on the RX 5700 XT: that the GPU is missing support for mesh shaders. Not only does the GTX 1080 Ti vastly underperform, it yields far worse performance than the RX 5700 XT, lower by nearly two-thirds. At launch, the RX 5700 XT was slightly slower than the GTX 1080 Ti in our reviews of the time, but it has climbed since and is now a touch faster. Since the card lacks DLSS support, FSR is the only upscaling option, but even that can't save it: running at 1080p lowest with FSR 2 Ultra Performance yielded only 27 FPS.
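To put that last number in perspective: FSR 2's modes render internally at a fixed fraction of the output resolution per axis (1.5x for Quality, 1.7x for Balanced, 2.0x for Performance, 3.0x for Ultra Performance, per AMD's published FSR 2 scaling ratios), so "1080p with FSR 2 Ultra Performance" means the GPU is really rendering just 640x360 before upscaling. A quick sketch of the arithmetic:

```cpp
// Sketch: internal render resolution for FSR 2 modes at a 1080p output,
// using AMD's published per-axis scaling ratios.
#include <cstdio>

int main()
{
    struct Mode { const char* name; float ratio; };
    const Mode modes[] = {
        {"Quality",           1.5f},
        {"Balanced",          1.7f},
        {"Performance",       2.0f},
        {"Ultra Performance", 3.0f},
    };
    const int outW = 1920, outH = 1080; // 1080p output target

    for (const Mode& m : modes) {
        const int w = static_cast<int>(outW / m.ratio);
        const int h = static_cast<int>(outH / m.ratio);
        // Ultra Performance works out to 640x360, a ninth of the output pixels.
        std::printf("%-17s -> %dx%d\n", m.name, w, h);
    }
    return 0;
}
```

Even with only a ninth of the pixels to shade, the GTX 1080 Ti managed just 27 FPS, which shows how heavy the non-mesh-shader fallback path is.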

100 Comments on PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required

#51
ToTTenTranz
I think people are giving way too much importance to a game whose PC version is pretty much dead on arrival because it's an Epic Games Store exclusive.
Posted on Reply
#52
W1zzard
ToTTenTranz: I think people are giving way too much importance to a game whose PC version is pretty much dead on arrival because it's an Epic Games Store exclusive.
Check numbers at your favorite Torrent site
Posted on Reply
#53
trparky
ToTTenTranz: I think people are giving way too much importance to a game whose PC version is pretty much dead on arrival because it's an Epic Games Store exclusive.
Is the Epic Games Store like some black sheep of the gaming world?
Posted on Reply
#54
AusWolf
trparky: Is the Epic Games Store like some black sheep of the gaming world?
Many people (myself included) don't like it because of the exclusive deals and no other service provided. With that said, I just bought and installed the game on EGS (this will be my first and only ever purchase there), and noticed that the Epic app prevents my monitor from entering sleep mode. What an absolute turd that app is anyway. Zero customizability, and now this.
Posted on Reply
#55
theouto
AusWolf: Many people (myself included) don't like it because of the exclusive deals and no other service provided. With that said, I just bought and installed the game on EGS (this will be my first and only ever purchase there), and noticed that the Epic app prevents my monitor from entering sleep mode. What an absolute turd that app is anyway. Zero customizability, and now this.
Oh my, I knew it was trash, even used it, but that was long ago, I did not realize it was still trash, welp, thanks for letting me know that I should avoid it still! Bummer.
Posted on Reply
#56
Tomorrow
PapaTaipei: 16 fps on a 1080 Ti at 1080p with no upscaling or RT. This is hilarious.
Even more hilarious is the fact that a 7900XTX that costs over $1000 can only do around 40 fps in 4K with no upscaling and no RT. And that's with a 7800X3D, as per Daniel Owen's latest video.
Imagine paying four figures for a GPU (and another four figures for the platform) only to have to use upscaling to even get to 60 fps, and the game is not even running at max settings (by that I mean RT included).

What a sad state of affairs.
Posted on Reply
#57
tpa-pr
Trying to chip in with a more measured response here: I think the primary issue people have with such steep system requirements (that they aren't articulating properly) is that a lot of these modern games don't look "good" enough to justify such monstrous requirements. Fidelity improvements have plateaued compared to what people were used to back in the day, so it's hard to understand WHY we suddenly need to rely on tech such as upscaling, especially in non-RT modes, just to get the game to run decently. Remnant II, for example, really doesn't look all that different from a game from 2 or even 5 years ago, yet we have top-end cards unable to maintain a consistent framerate.

It's not necessarily a matter of people being afraid of progressing technology, it's that the progress isn't seen as impressive enough to justify the cost.
Posted on Reply
#58
mama
AusWolf: Many people (myself included) don't like it because of the exclusive deals and no other service provided. With that said, I just bought and installed the game on EGS (this will be my first and only ever purchase there), and noticed that the Epic app prevents my monitor from entering sleep mode. What an absolute turd that app is anyway. Zero customizability, and now this.
It's a game delivery device. It works to that extent so I don't see the need for partisan comparisons about which game delivery device works best. Just play the game you want whether on Steam or otherwise.
tpa-pr: Trying to chip in with a more measured response here: I think the primary issue people have with such steep system requirements (that they aren't articulating properly) is that a lot of these modern games don't look "good" enough to justify such monstrous requirements. Fidelity improvements have plateaued compared to what people were used to back in the day, so it's hard to understand WHY we suddenly need to rely on tech such as upscaling, especially in non-RT modes, just to get the game to run decently. Remnant II, for example, really doesn't look all that different from a game from 2 or even 5 years ago, yet we have top-end cards unable to maintain a consistent framerate.

It's not necessarily a matter of people being afraid of progressing technology, it's that the progress isn't seen as impressive enough to justify the cost.
True but I expect performance costs will decrease as the new standard sets in.
Tomorrow: Even more hilarious is the fact that a 7900XTX that costs over $1000 can only do around 40 fps in 4K with no upscaling and no RT. And that's with a 7800X3D, as per Daniel Owen's latest video.
Imagine paying four figures for a GPU (and another four figures for the platform) only to have to use upscaling to even get to 60 fps, and the game is not even running at max settings (by that I mean RT included).

What a sad state of affairs.
I don't have to imagine. And yes, it is a sad state of affairs. Tell me who to blame other than myself.
Posted on Reply
#59
trparky
tpa-pr: It's not necessarily a matter of people being afraid of progressing technology, it's that the progress isn't seen as impressive enough to justify the cost.
Couldn’t have said it better myself.

Lifelike is lifelike. What more do we need?
Posted on Reply
#60
Dr. Dro
AusWolf: How could I forget about Oblivion? You basically had to mod it to make it run properly on anything other than a high-end GeForce 7800 or such.

Or even then, you turned on HDR and the grass density/distance, and your PC died. :laugh:
It really shows how revolutionary G80 was, IMO. You can play games as complex as Borderlands: The Pre-Sequel on a GeForce 8800 GTX or Ultra without a problem (and if you happen to own a Quadro FX 5600 like me, the 1.5 GB helps immensely in handling 1080p smoothly as well), and even some minor recent releases like the Final Fantasy pixel remasters run flawlessly on these cards - in DirectX 11, too! Sure, it's feature level 10_0, but still, quite well supported for what it is.

That means that despite being a GPU from 2006, it's still capable of tackling basic games released well into the 2020s. Unified shaders with full programmability really were the bedrock of modern graphics.
Posted on Reply
#61
Belfaborac
tpa-pr: Trying to chip in with a more measured response here:
I'll add another, less measured one: I think a number of people are offended that games they want to play are released with sysreqs which would require upgrades they can't afford, acting as though developers are deliberately thumbing their collective noses at their lack of funds. As if playing a particular game at max settings has somehow become a universal human right.

With no particular reference to anyone in this thread; just a general thought.

In other news, I would quite like a winter home in the Bahamas or maybe Antigua and Barbuda, but my bank statements keep telling me it ain't happening anytime soon.

We all have our crosses to bear.
Posted on Reply
#62
tpa-pr
Belfaborac: I'll add another, less measured one: I think a number of people are offended that games they want to play are released with sysreqs which would require upgrades they can't afford, acting as though developers are deliberately thumbing their collective noses at their lack of funds. As if playing a particular game at max settings has somehow become a universal human right.

With no particular reference to anyone in this thread; just a general thought.

In other news, I would quite like a winter home in the Bahamas or maybe Antigua and Barbuda, but my bank statements keep telling me it ain't happening anytime soon.

We all have our crosses to bear.
There might be some of that, sure. But at the same time you have, for example, the director of a major game studio telling people with the best hardware available to "just upgrade your PC," when the studio's game just doesn't look good enough to justify running poorly on current-generation hardware. Yet it does run poorly, and it's somehow considered the fault of the consumer for not having better-than-the-best hardware?

You can't help but feel slighted considering the cost of PC parts these days.

Older gamers (including myself) were expecting basically photo-realistic graphics by this point with the hardware to match. The graphics just aren't there and yet the mythical hardware is still required.
Posted on Reply
#63
Belfaborac
I agree with all of that, and more besides. There are valid annoyances and concerns on all sides.
Posted on Reply
#64
solarmystic
Dr. Dro: It really shows how revolutionary G80 was, IMO. You can play games as complex as Borderlands: The Pre-Sequel on a GeForce 8800 GTX or Ultra without a problem (and if you happen to own a Quadro FX 5600 like me, the 1.5 GB helps immensely in handling 1080p smoothly as well), and even some minor recent releases like the Final Fantasy pixel remasters run flawlessly on these cards - in DirectX 11, too! Sure, it's feature level 10_0, but still, quite well supported for what it is.

That means that despite being a GPU from 2006, it's still capable of tackling basic games released well into the 2020s. Unified shaders with full programmability really were the bedrock of modern graphics.
I'd agree if not for the fact that D3D11 feature level 11_0 became mandatory for games to even boot in short order after DX11 came out in 2009(!) on Windows 7 (14 years old already, how time flies!), which made DX10-only cards like the G80 and G92 obsolete in many titles a lot quicker than they should have been, given the raster power they had.

It's still a sore point for me, since in 2009 I bought a laptop with a DX10-only Mobility Radeon HD 4670 1 GB which, while low-to-mid range at the time, I could tweak and tune in many contemporary games to run them fluidly, until DX11 feature level 11_0 became ingrained in more and more titles, which it couldn't boot at all.

DX10, in many ways, was a short-lived, transitional period that lasted only 3 years and had token games made using its feature set (devs mostly made games for 9.0c or 11 and just skipped 10 entirely, with some like CAPCOM making token efforts on PC, DMC4 and RE5 for example).

DX11 turned out to be the longest ever DX era, still relevant to this very day.
Posted on Reply
#65
Dr. Dro
solarmystic: I'd agree if not for the fact that D3D11 feature level 11_0 became mandatory for games to even boot in short order after DX11 came out in 2009(!) on Windows 7 (14 years old already, how time flies!), which made DX10-only cards like the G80 and G92 obsolete in many titles a lot quicker than they should have been, given the raster power they had.

It's still a sore point for me, since in 2009 I bought a laptop with a DX10-only Mobility Radeon HD 4670 1 GB which, while low-to-mid range at the time, I could tweak and tune in many contemporary games to run them fluidly, until DX11 feature level 11_0 became ingrained in more and more titles, which it couldn't boot at all.

DX10, in many ways, was a short-lived, transitional period that lasted only 3 years and had token games made using its feature set (devs mostly made games for 9.0c or 11 and just skipped 10 entirely, with some like CAPCOM making token efforts on PC, DMC4 and RE5 for example).

DX11 turned out to be the longest ever DX era, still relevant to this very day.
Agreed, although we likely have the overwhelmingly negative reception of Vista amongst gamers to thank for the 9.0c/11 split :/
Posted on Reply
#66
AusWolf
theouto: Oh my, I knew it was trash, even used it, but that was long ago, I did not realize it was still trash, welp, thanks for letting me know that I should avoid it still! Bummer.
There is ZERO innovation on the Epic store, as far as I see it. The only things keeping it alive are the exclusives and free games. I wholeheartedly wish Epic to go bankrupt so I can have their games on Steam.
mama: It's a game delivery device. It works to that extent so I don't see the need for partisan comparisons about which game delivery device works best. Just play the game you want whether on Steam or otherwise.
If the game delivery device prevents my monitor from entering sleep mode while installing, then it's not doing its job properly, is it? When I googled it, I saw posts on the EGS support site from as far back as 2018 with the same issue, and it's still not being dealt with. Besides, why can't I set it up so it only does that: deliver my games? Every other platform has a default screen which I can set to my game library. Even Ubisoft's app! Why do I have to look at the EGS storefront with a big "Alan Wake 2 out now, buy now" banner every time I open the app when I already have the game? Sorry, this level of turdiness doesn't fly with me.
Belfaborac: I'll add another, less measured one: I think a number of people are offended that games they want to play are released with sysreqs which would require upgrades they can't afford, acting as though developers are deliberately thumbing their collective noses at their lack of funds. As if playing a particular game at max settings has somehow become a universal human right.
I would generally agree, although back when PC parts were cheap, we had to upgrade every year. Now it's enough to upgrade every 5 or so years. Whether I spend 200 quid every year or 800 every 5 years makes no difference in the long run (unless I spend all the money that I should be saving for my next PC upgrade in the meantime, which, I assume, the biggest complainers do).
Posted on Reply
#67
W1zzard
AusWolf: There is ZERO innovation on the Epic store, as far as I see it
100% agree. A small dev team could add so many improvements within weeks and months, which leads me to believe that Epic simply has no interest in improving it.
Posted on Reply
#68
AusWolf
W1zzard: 100% agree. A small dev team could add so many improvements within weeks and months, which leads me to believe that Epic simply has no interest in improving it.
They don't have to if you're forced to buy their games on EGS by having no other choice. This is why I find their practice utterly disgusting. AW2 will be the first and last game I ever bought on EGS.

Not to mention, they can't even be bothered to fix age-old bugs by the looks of it, which is even sadder.
Posted on Reply
#69
Assimilator
trparky: Lifelike is lifelike. What more do we need?
The person who invented the wagon said the same thing.
Posted on Reply
#70
Vayra86
AusWolf: Does anybody remember the late '90s - early 2000s when you had to buy a new graphics card every year because of new feature sets that made new games not even start on the old card? Anyone?

Sure, they were cheaper, but instead of paying $200 every year, now we pay $500-800 every 5 years. Pascal was awesome, and the 1080 (Ti) had a long and prosperous life, but sometimes we have to move on.
The writing was on the wall for a long time already.

People are just in denial... I jumped on the 7900XT because I already saw what was gonna happen and that it would affect my gaming too - it started doing so with TW Warhammer already, and that's not even a state-of-the-art engine. For much the same reasons, I strongly recommend people get 16GB+ VRAM and solid hardware rather than buying/paying for better software, for anything midrange or up. And here we are...
Posted on Reply
#71
Tomorrow
Vayra86: The writing was on the wall for a long time already.

People are just in denial... I jumped on the 7900XT because I already saw what was gonna happen and that it would affect my gaming too - it started doing so with TW Warhammer already, and that's not even a state-of-the-art engine. For much the same reasons, I strongly recommend people get 16GB+ VRAM and solid hardware rather than buying/paying for better software, for anything midrange or up. And here we are...
And Nvidia sells a 12GB card for $800+ as if it's normal. The only 16GB card they have is horribly priced at $1200+ and is the worst-selling 80-class card in history. The next jump is 24GB for $1700+. Nothing in between. If they had a 16GB option around $800, that would at least be reasonable and would be competition for the 7900XT.
Posted on Reply
#72
Vayra86
Tomorrow: And Nvidia sells a 12GB card for $800+ as if it's normal. The only 16GB card they have is horribly priced at $1200+ and is the worst-selling 80-class card in history. The next jump is 24GB for $1700+. Nothing in between. If they had a 16GB option around $800, that would at least be reasonable and would be competition for the 7900XT.
I don't get for the life of me why people bought Ada below the 4080 to begin with. It's a complete waste of time. At least you're leading in the number of abbreviations you need to switch on to game properly, well yay.
Assimilator: The person who invented the wagon said the same thing.
If anything is built on stagnation, it's the automotive business itself though, lol. Heck, it thrives on stagnation.
W1zzard: 100% agree. A small dev team could add so many improvements within weeks and months, which leads me to believe that Epic simply has no interest in improving it.
Yep, and that's quite a worrying conclusion, because it means Epic's business doesn't float on a recurring customer base that just 'wants to be on EGS'. It's puzzling, isn't it... they throw millions into game funding, and nothing into customer experience/journey.

I used to give EGS the benefit of the doubt, but I just can't anymore. Their launcher is horrible to use; it literally detracts from playing games through EGS. I just don't trust my game library being there either, because of it.
Posted on Reply
#73
Assimilator
Vayra86: Yep, and that's quite a worrying conclusion, because it means Epic's business doesn't float on a recurring customer base that just 'wants to be on EGS'. It's puzzling, isn't it... they throw millions into game funding, and nothing into customer experience/journey.

I used to give EGS the benefit of the doubt, but I just can't anymore. Their launcher is horrible to use; it literally detracts from playing games through EGS. I just don't trust my game library being there either, because of it.
I honestly don't understand what Epic's business model is at this point. I think the problem is that they don't, either.
Posted on Reply
#74
AusWolf
Assimilator: I honestly don't understand what Epic's business model is at this point. I think the problem is that they don't, either.
I think the closest guess is "exclusivity contracts -> gamers don't have a choice -> money comes without spending on development and innovation".
Posted on Reply
#75
Prima.Vera
Those are good points.
However, can somebody explain how this game engine is good when, on an RTX 3080 GPU with a 13700K CPU, you get ~30 fps at 1440p (natively, no DLSS garbage), and WITHOUT enabling RT???
Posted on Reply