Friday, October 27th 2023

PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required

"Alan Wake II," released earlier this week, is the latest third person action adventure loaded with psychological thriller elements that call back to some of the best works of Remedy Entertainment, including "Control," "Max Payne 2," and "Alan Wake." It's also a visual feast as our performance review of the game should show you, leveraging the full spectrum of the DirectX 12 Ultimate feature-set. In the run up to the release, when Remedy put out the system requirements lists for "Alan Wake II" with clear segregation for experiences with ray tracing and without; what wasn't clear was just how much the game depended on hardware support for mesh shaders, which is why its bare minimum list called for at least an NVIDIA RTX 2060 "Turing," or at least an AMD RX 6600 XT RDNA2, both of which are DirectX 12 Ultimate GPUs with hardware mesh shaders support.

There was some confusion on online gaming forums over the requirement for hardware mesh shaders. Many people assumed that the game would not work on GPUs without mesh shader support, locking out a large number of gamers. Through the course of testing for our performance review, we learned that while "Alan Wake II" does rely on hardware mesh shader support, the lack of it does not break gameplay. You will, however, pay a heavy performance penalty on GPUs that lack it. On such GPUs, the game shows a warning dialog box stating that the GPU lacks mesh shader support (screenshot below), but you can dismiss the warning and play anyway. The game considers mesh shaders a "recommended GPU feature," not a requirement; a minimal sketch of how such a check can work follows this paragraph. Without mesh shaders, you can expect a severe performance loss, best illustrated with the AMD Radeon RX 5700 XT based on the original RDNA architecture, which lacks hardware mesh shaders.
In our testing, at 1080p without upscaling, the RX 5700 XT performs worse than the GeForce GTX 1660 Ti. In most other raster-only titles, the RX 5700 XT with the latest AMD drivers is known to perform about as fast as an RTX 2080; here it's seen lagging behind the GTX 1660 Ti. It's important to note that the GTX 16-series "Turing," while lacking the RT cores and Tensor cores of its RTX 20-series cousins, does feature hardware support for mesh shaders, and hence performs along expected lines. We have included a projection for how the RX 5700 XT typically fares in our testing; it ends up roughly in the performance region of the RTX 3060 and RX 6600 XT. AMD's Radeon RX 6000 series "RDNA2" and current RX 7000 series "RDNA3" fully support hardware mesh shaders across all GPU models.
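For readers curious how a "recommended, not required" feature check like this works under the hood, below is a minimal sketch of querying DirectX 12 mesh shader support and merely warning when it is absent. This is our illustrative code, not Remedy's, and the warning message is made up.

```cpp
// Minimal sketch: query D3D12 mesh shader support and warn instead of exiting.
// Illustrative only; not Remedy's actual code.
#include <cstdio>
#include <d3d12.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::printf("No Direct3D 12 device available.\n");
        return 1;
    }

    // Mesh shader support is reported through the OPTIONS7 feature structure.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    const bool queried = SUCCEEDED(device->CheckFeatureSupport(
        D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7)));

    if (!queried || opts7.MeshShaderTier == D3D12_MESH_SHADER_TIER_NOT_SUPPORTED) {
        // "Recommended feature" behaviour: warn the user and carry on with a
        // slower fallback geometry path rather than refusing to start.
        std::printf("Warning: this GPU lacks mesh shader support; "
                    "expect reduced performance.\n");
    } else {
        std::printf("Mesh shaders are supported in hardware.\n");
    }
    return 0;
}
```

On DirectX 12 Ultimate GPUs such as "Turing" and "RDNA2," this query reports mesh shader tier 1, which is what lets a game take its faster geometry path.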

That doesn't mean the RX 5700 XT delivers unplayable results. 1080p at 60 FPS is within reach at the lowest settings, or at close-to-maximum settings with FSR Quality, which isn't a terrible tradeoff, though you still need to make compromises. We didn't spot any rendering errors or crashes.

Once we knew that the RX 5700 XT works, we also wanted to test the NVIDIA side of things. Using the GeForce GTX 1080 Ti "Pascal," the flagship GPU of that generation, we were greeted with the same warning dialog as on the RX 5700 XT, stating that the GPU lacks mesh shader support. Not only does the GTX 1080 Ti vastly underperform, it yields far worse performance than the RX 5700 XT, lower by nearly two-thirds. At launch, the RX 5700 XT was a little slower than the GTX 1080 Ti in our reviews of the time, but it has climbed since and is now a tiny bit faster. Since the card lacks DLSS support, FSR is the only upscaling option, but even that can't save it: running at 1080p lowest with FSR 2 Ultra Performance yielded only 27 FPS.

100 Comments on PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required

#26
theouto
Honestly, after a while we have to let progress be progress and move on with new tech. It's not like most UE5 titles, where the game looks alright while having no technology that actually warrants that level of meh performance; this game has actual hardware reasons to cast certain GPUs into the flames while also looking insanely good (the geometric complexity is absurd if you stop to look at it, taking great advantage of mesh shaders).

If we don't start embracing things like mesh shaders now, we could be sacrificing performance that doesn't need sacrificing. Things like mesh shaders can enable games to either look better geometrically or run better; it's an insanely good optimization tool (if anyone compares this to DLSS or FSR, please don't, it's silly to do so). Toy sketch of the idea below.
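To show what I mean by that granularity, here's a toy sketch (my own code, not the game's) of splitting an index buffer into meshlets, the small chunks a mesh shader workgroup can process and cull independently. The 64-vertex/126-triangle limits are the commonly used DirectX meshlet sizes; the greedy packing here is purely illustrative.

```cpp
// Toy meshlet builder: pack triangles into groups of at most 64 unique
// vertices and 126 triangles. Illustrative only.
#include <cstdint>
#include <cstdio>
#include <unordered_map>
#include <vector>

struct Meshlet {
    std::vector<uint32_t> vertices;   // indices into the mesh's vertex buffer
    std::vector<uint32_t> triangles;  // meshlet-local indices, 3 per triangle
};

std::vector<Meshlet> BuildMeshlets(const std::vector<uint32_t>& indices,
                                   size_t maxVerts = 64, size_t maxTris = 126)
{
    std::vector<Meshlet> meshlets(1);
    std::unordered_map<uint32_t, uint32_t> remap;  // global -> local vertex index

    for (size_t i = 0; i + 2 < indices.size(); i += 3) {
        // Count how many new unique vertices this triangle would add.
        size_t newVerts = 0;
        for (size_t k = 0; k < 3; ++k)
            if (!remap.count(indices[i + k])) ++newVerts;

        // Start a new meshlet if this triangle would exceed either limit.
        Meshlet& m = meshlets.back();
        if (m.vertices.size() + newVerts > maxVerts ||
            m.triangles.size() / 3 + 1 > maxTris) {
            meshlets.emplace_back();
            remap.clear();
        }

        Meshlet& cur = meshlets.back();
        for (size_t k = 0; k < 3; ++k) {
            const uint32_t global = indices[i + k];
            auto it = remap.find(global);
            if (it == remap.end()) {
                it = remap.emplace(global,
                         static_cast<uint32_t>(cur.vertices.size())).first;
                cur.vertices.push_back(global);
            }
            cur.triangles.push_back(it->second);
        }
    }
    return meshlets;
}

int main()
{
    // Toy example: a strip of 500 triangles that share vertices.
    std::vector<uint32_t> indices;
    for (uint32_t t = 0; t < 500; ++t) {
        indices.push_back(t); indices.push_back(t + 1); indices.push_back(t + 2);
    }
    const auto meshlets = BuildMeshlets(indices);
    std::printf("%zu triangles packed into %zu meshlets\n",
                indices.size() / 3, meshlets.size());
}
```

The point is that each meshlet can be tested for visibility and thrown away as a group before any per-vertex work happens, which is where the performance headroom comes from.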
#27
remekra
There is always a way to optimize. The same goes for ray tracing: there are many techniques where RT can be optimized to use as few rays as possible while still keeping good image quality (a rough sketch of one such approach is at the end of this post).
AMD, for example, has a demo on their dev website for hybrid reflections, which uses both SSR and RT, with RT helping where SSR cannot work properly. It looks great and performs well. They also have a whitepaper on a global illumination system that uses RT, but rather than brute-forcing it with PT, it uses radiance caching to improve performance.
Even more recent examples in games are Metro Exodus EE and Witcher 3 Remastered, both of which used RTGI as a mix of probes and RT. Both, especially ME, had amazing lighting, and while they were costly to run, they were not as costly as PT.

For Alan Wake 2, the best example is reflections. In order to get RT reflections in this game you need to turn on PT, which is stupid. They could have added an RT reflections option pretty easily; it was not a problem in Control either.
But they chose to brute-force it with PT, probably because of the sponsorship with nvidia.

Nvidia has a habit of presenting technologies and then using them to the fullest, even beyond the point of diminishing returns and even when it tanks their own GPUs (as long as it tanks the competition more, it's fine). It happened before and it happens now. It will not become mainstream anyway because of consoles, and just as GameWorks, PhysX, etc. didn't see mainstream adoption outside of nvidia-sponsored titles, the same will happen here. Did those smoke and hair simulations look really nice? They did. Was it viable to use them in every game? Nope.

This is not to say that RT or PT will disappear; it's just that acceleration for those calculations will be used differently, to assist raster in places where it cannot improve visuals anymore.
Maybe for the next gen consoles we will see fully PT games become mainstream.
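To make the hybrid-reflection idea concrete, here's a rough toy sketch (mine, not AMD's sample code): march in screen space first, and spend a traced ray only on the pixels where SSR has nothing usable. The "SSR fails near the screen edges" rule is just a stand-in for illustration.

```cpp
// Toy classification pass for hybrid reflections: cheap screen-space result
// where possible, expensive ray-traced fallback only where SSR cannot work.
#include <cstdio>
#include <functional>
#include <vector>

enum class ReflectionSource { ScreenSpace, RayTraced };

struct PixelClassification {
    std::vector<ReflectionSource> source;  // one entry per pixel
    size_t rtCount = 0;                    // pixels that need the expensive path
};

// ssrValid(x, y) returns true when the screen-space march found a usable hit;
// anything else (ray left the screen, occluded, sky) falls back to RT.
PixelClassification ClassifyPixels(int width, int height,
                                   const std::function<bool(int, int)>& ssrValid)
{
    PixelClassification out;
    out.source.resize(static_cast<size_t>(width) * height);
    for (int y = 0; y < height; ++y)
        for (int x = 0; x < width; ++x) {
            const bool ok = ssrValid(x, y);
            out.source[static_cast<size_t>(y) * width + x] =
                ok ? ReflectionSource::ScreenSpace : ReflectionSource::RayTraced;
            if (!ok) ++out.rtCount;
        }
    return out;
}

int main()
{
    // Toy scene: pretend SSR fails near the screen edges, where reflected
    // rays tend to leave the view frustum.
    const int w = 320, h = 180;
    const auto ssrValid = [&](int x, int y) {
        return x > 16 && x < w - 16 && y > 16 && y < h - 16;
    };
    const PixelClassification c = ClassifyPixels(w, h, ssrValid);
    std::printf("RT fallback needed for %zu of %zu pixels (%.1f%%)\n",
                c.rtCount, c.source.size(), 100.0 * c.rtCount / c.source.size());
}
```

The whole point is that the expensive rays end up being a fraction of the frame instead of one (or more) per pixel, which is how those hybrid techniques keep the cost well below full PT.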
#28
AusWolf
Frick: Maybe in the mid 90's that was a thing, but definitely not in the 2000's, iirc.
Yes it was. Look at DirectX's version history. Basically every single major version, and sometimes even minor versions needed a new graphics card to run. That is: DirectX 7: 1999, DirectX 8: 2000, DirectX 8.1: 2001, DirectX 9: 2002, DirectX 9.0c: 2004. That's 5 graphics card upgrades within 5 years. Only if you wanted to run the latest games, of course - because they didn't even start without compatible hardware. And here we are, crying that you only get X FPS on a 7 year-old graphics card in the newest AAA game. :rolleyes:

20 years ago, we paid 200 bucks every year. Now, we pay 5-800 every 5 years. That's it.

Edit: Half-Life 2 was regarded as a bloody miracle for supporting DirectX 7 as well as 9, and ran on a GeForce 2 at ultra low graphics, which was a 4 year-old graphics card at that time.
#29
Fluffmeister
It's interesting people mentioning older games like Crysis and comparing how cards of the day performed, looking back W1zz's review of the GTX 680 using 1200P as he did at the time there wasn't a card that managed 60FPS:


Metro 2033 battered the cards even harder:


Just interesting to compare...
#30
Hyderz
I now understand what leather jacket meant when he said it just works, meaning it will run but he doesn’t promise it would run well
#31
Nostras
Fluffmeister: It's interesting seeing people mention older games like Crysis and comparing how the cards of the day performed. Looking back at W1zz's review of the GTX 680, tested at 1200p as he did at the time, there wasn't a card that managed 60 FPS:


Metro 2033 battered the cards even harder:


Just interesting to compare...
I do believe that back then there just wasn't that much room to lower visual settings; it would look like horse shit.
Nowadays, modern games that are supposedly "setting the bar" have a lot of room to reduce visual quality, but the performance you gain is so minimal it's not worth it.
Obviously it's because the developer does not optimize for this experience, but it does sting a bit.
I'd bet most would much prefer a nice 60 fps over a "next-gen experience".
#32
wolf
Better Than Native
Fluffmeister: Just interesting to compare...
Not to mention, for Crysis specifically, the fastest card at the time of launch, if memory serves, was the 8800 GTX. If we count node shrinks and arch changes as 'major' generational leaps, the GTX 680 is a full three generations beyond an 8800 GTX (8/9 series, 200 series, 400/500 series, then 600 series), and still not cracking 60 fps at 1200p with 4x MSAA, which admittedly was a tall bar for Crysis. I wonder if, a full three generations from now, AW2 will be playable at native 4K60+...

I never played the first Alan Wake, but am very much enjoying all the content and discussion on 2; it seems like a masterpiece for its time. Might be time to pick up the 1st game remastered while it's cheap and wait for a special on 2.
#33
Dr. Dro
Prima.Vera: Pi$$ off game engine. 16 fps @ 1080p on a 1080 Ti??? Are you kidding me?
Seriously, the video card companies are hand-in-hand with those callous game devs, so they can sucker in the plebs on buying always the latest and most powerful GPU.
Just like on mobile phones...
Disgusting.
I feel like this is an emotional overreaction, to be honest. The 1080 Ti is based on a 7-year-old GP102 processor in its worst shipping configuration. It had more than its valiant run; this was inevitable, just let it go. :)
#34
Frick
Fishfaced Nincompoop
Assimilator: No, it absolutely was. Back then it was easy to get massive hardware performance gain via a simple and cheap node shrink, then anything on top of that in terms of actual design improvements was just gravy (and there was plenty of low-hanging fruit there too).

Nowadays node shrinks are barely an improvement and hideously expensive to boot, rasterisation has been optimised to the Nth degree so there's almost no room for improvement via that avenue, and we've only barely started down the far more complex ray- and path-tracing road, where optimisations are hindered by the slow down in node shrinking.
I meant the part about having to buy a new GPU each year to play new games because the games wouldn't even launch on last year's GPU.
AusWolf: Yes it was. Look at DirectX's version history. Basically every single major version, and sometimes even minor versions needed a new graphics card to run. That is: DirectX 7: 1999, DirectX 8: 2000, DirectX 8.1: 2001, DirectX 9: 2002, DirectX 9.0c: 2004. That's 5 graphics card upgrades within 5 years. Only if you wanted to run the latest games, of course - because they didn't even start without compatible hardware. And here we are, crying that you only get X FPS on a 7 year-old graphics card in the newest AAA game. :rolleyes:

20 years ago, we paid 200 bucks every year. Now, we pay 5-800 every 5 years. That's it.

Edit: Half-Life 2 was regarded as a bloody miracle for supporting DirectX 7 as well as 9, and ran on a GeForce 2 at ultra low graphics, which was a 4 year-old graphics card at that time.
Do you have examples? I was pretty deep into games at that point (on a nothing budget!) and have absolutely no memory of this being a thing at all. Not every year anyway.
#35
swirl09
Me in twenty-****ing-thirty: Why does my 4090 not run this new graphically intense game well?!?!



*(Assuming I'm still alive)
#36
Dr. Dro
Frick: I meant the part about having to buy a new GPU each year to play new games because the games wouldn't even launch on last year's GPU.


Do you have examples? I was pretty deep into games at that point and have absolutely no memory of this being a thing at all.
Well, try running Oblivion on a Fury Maxx (which would be 7 years old by then). Right, that one needs Windows 98...

Oblivion even booted on a GeForce FX (3 years old by its release) if you disabled HDR and fell back to bloom lighting but... Can't speak for fps...

Yep, until unified shaders and I'd argue DirectX 11 cards, upgrades were essentially mandatory. Things are cozier now, support for obsolete/downlevel hardware is quite good
#37
AusWolf
wolf: I never played the first Alan Wake, but am very much enjoying all the content and discussion on 2; it seems like a masterpiece for its time. Might be time to pick up the 1st game remastered while it's cheap and wait for a special on 2.
Do that! It's one of my favourite games, ever.
#38
Assimilator
AusWolf: Do that! It's one of my favourite games, ever.
Agreed! The primary mechanic is fantastic.
#39
AusWolf
Frick: Do you have examples? I was pretty deep into games at that point (on a nothing budget!) and have absolutely no memory of this being a thing at all. Not every year anyway.
I just gave you examples. :confused:

Every new DirectX version needed a new graphics card to run. You can look up any game using the DirectX versions above, which came out every year. OpenGL games weren't any better, like Quake 3 and other id Tech 3 engine based ones. Or there's id Tech 4, which powered Doom 3 and (quoting Wikipedia) "would not even run on high end graphics cards in 2004 as the engine required at least 512 MB of video memory to display properly and at playable speeds." That is, at Ultra graphics, it required hardware that wasn't even available until about 2 years later!

Like I said above, Half-Life 2 was revolutionary in the sense that it supported 3 major DirectX versions, making it run on a 4-year-old GeForce 2, which was unheard of.

I got my first PC in 1998 which had partial support for DirectX 7 (more like 6.1). Two years later, no new game would run on it. At all. So I basically missed the 2000-2004 era entirely.

Edit: Or what about the proprietary APIs that ran only on specific hardware, like Glide on 3DFX?
Dr. Dro: Well, try running Oblivion on a Fury Maxx (which would be 7 years old by then). Right, that one needs Windows 98...

Oblivion even booted on a GeForce FX (3 years old by its release) if you disabled HDR and fell back to bloom lighting but... Can't speak for fps...

Yep, until unified shaders and I'd argue DirectX 11 cards, upgrades were essentially mandatory. Things are cozier now, support for obsolete/downlevel hardware is quite good
How could I forget about Oblivion? You basically had to mod it to make it run properly on anything other than a high-end GeForce 7800 or such.

Or even then, you turned on HDR and the grass density/distance, and your PC died. :laugh:
#40
MrDweezil
Assimilator: True gamers are gluttons for progress in evolving the state of the art. This game does that.
I'm not going to touch what "true gamers" think, but if AAA games from one year are performing the same as AAA games from the year before, graphics are stagnating and game devs should be pushing harder. As long as companies are releasing faster GPUs, devs should be releasing more demanding games.
#41
theelviscerator
I cannot play 3rd person. Too many hours playing FPS since Doom came out. Non starter.
#42
trparky
MrDweezil: I'm not going to touch what "true gamers" think, but if AAA games from one year are performing the same as AAA games from the year before, graphics are stagnating and game devs should be pushing harder. As long as companies are releasing faster GPUs, devs should be releasing more demanding games.
Do you own stock in nVidia?
#43
TheoneandonlyMrK
MrDweezil: I'm not going to touch what "true gamers" think, but if AAA games from one year are performing the same as AAA games from the year before, graphics are stagnating and game devs should be pushing harder. As long as companies are releasing faster GPUs, devs should be releasing more demanding games.
I actually agree, but you have to push beyond what your baseline is using, at the minimum spec required.

That's the AAA issue.
#44
theouto
Many times devs make ultra settings not as settings for the present, but as settings for the future. A good example of this is Kingdom Come: Deliverance, and Metro Exodus Enhanced to an extent (the maximum preset is ridiculous to run and doesn't offer that much of a visual improvement).

I suspect this is one such case, as the low-to-mid settings look absolutely stunning, and because this is Remedy Entertainment, and nothing is ever normal with these guys, they pushed, and they pushed big time.

Minimum spec is minimum spec for a reason: it's the minimum, not the recommended.
#45
MrDweezil
trparky: Do you own stock in nVidia?
Don't even have an nvidia gpu. And I'm not a frequent upgrader either. But if the state of the art isn't always moving forward then why are we even gaming on PCs? We could sit around with our PS2s and agree games could never use more capabilities than it had.
#46
trparky
MrDweezil: Don't even have an nvidia gpu. And I'm not a frequent upgrader either. But if the state of the art isn't always moving forward then why are we even gaming on PCs? We could sit around with our PS2s and agree games could never use more capabilities than it had.
To an extent, you're right. However, not all of us are made of money and can afford the latest and greatest GPUs. Or are willing to put it on a credit card to buy it. Talk about alienating a large portion of the market.
#47
Dragokar
Epic, this is really the content that you guys stand out for. Thank you for the update.
#48
PapaTaipei
In the end if the game doesn't sell well on the PC market, other devs will take note and have better optimisation. I still remember Remedy for their masterpiece Max Payne 1.
#49
tfdsaf
Assimilator: Did you make the same complaints when Crysis was released? Remedy, like Crytek, are to be praised for pushing the boundaries of what can be achieved with game engines, not condemned. When developers push these boundaries, hardware is pushed to keep up, and consumers benefit as a result.
Crysis had REVOLUTIONARY graphics; even today many games don't have the same level of graphical fidelity and technology.

Alan Wake looks great, but nothing special, nothing we haven't seen before, nothing that would warrant this level of poor performance!

Plus Crysis sold like 10 copies due to its absurd requirements, which I think will be the case for Alan Wake 2 on PC as well.

And the thing is, I think Hogwarts looks just as good, if not better in certain areas, yet the game can be run with ease by 1000-series cards at the lowest settings.
#50
Easo
swirl09: Me in twenty-****ing-thirty: Why does my 4090 not run this new graphically intense game well?!?!



*(Assuming I'm still alive)
The problem is that it is perfectly possible for the 4090 to not do that right now.
Let's not even mention other cards.

Sorry, I really do not understand you all.