Friday, October 27th 2023
PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required
"Alan Wake II," released earlier this week, is the latest third person action adventure loaded with psychological thriller elements that call back to some of the best works of Remedy Entertainment, including "Control," "Max Payne 2," and "Alan Wake." It's also a visual feast as our performance review of the game should show you, leveraging the full spectrum of the DirectX 12 Ultimate feature-set. In the run up to the release, when Remedy put out the system requirements lists for "Alan Wake II" with clear segregation for experiences with ray tracing and without; what wasn't clear was just how much the game depended on hardware support for mesh shaders, which is why its bare minimum list called for at least an NVIDIA RTX 2060 "Turing," or at least an AMD RX 6600 XT RDNA2, both of which are DirectX 12 Ultimate GPUs with hardware mesh shaders support.
There was some confusion on online gaming forums over the requirement for hardware mesh shaders. Many people assumed that the game would not work on GPUs without mesh shader support, locking out lots of gamers. Over the course of our testing for our performance review, we learned that while "Alan Wake II" does rely on hardware support for mesh shaders, the lack of it does not break gameplay. You will, however, pay a heavy performance penalty on GPUs that lack hardware mesh shader support. On such GPUs, the game shows a warning dialog that the GPU lacks mesh shader support (screenshot below), but you can choose to ignore this warning and go ahead and play the game. The game considers mesh shaders a "recommended GPU feature," not a requirement.

Without mesh shaders, you can expect a severe performance loss, best illustrated with the AMD Radeon RX 5700 XT based on the original RDNA architecture, which lacks hardware mesh shaders. In our testing at 1080p, without upscaling, the RX 5700 XT performs worse than the GeForce GTX 1660 Ti. In most other raster-only titles, the RX 5700 XT with the latest AMD drivers is known to perform about as fast as an RTX 2080; here it is seen lagging behind the GTX 1660 Ti. It's important to note that the GTX 16-series "Turing," while lacking the RT cores and Tensor cores of its RTX 20-series cousins, does feature hardware support for mesh shaders, and is hence able to perform along expected lines. We have included a projection of how the RX 5700 XT typically fares in our testing; it ends up roughly in the performance region of the RTX 3060 and RX 6600 XT. AMD's Radeon RX 6000-series "RDNA2" and current RX 7000-series "RDNA3" fully support hardware mesh shaders across all GPU models.
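For the technically curious, Direct3D 12 exposes mesh shader support as an optional feature that an engine queries at device creation. The sketch below is our illustration, not Remedy's actual code: it shows the documented CheckFeatureSupport query a title could use to pick between a meshlet geometry path and a slower fallback, surfacing exactly the kind of warning "Alan Wake II" shows.

// Minimal sketch: probing D3D12 for hardware mesh shader support.
// Assumes Windows SDK 19041+ and linking against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

static bool SupportsMeshShaders(ID3D12Device *device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                           &options7, sizeof(options7))))
        return false; // Runtime/driver too old to even report OPTIONS7.
    return options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1;
}

int main()
{
    ComPtr<ID3D12Device> device;
    // Feature level 12_0 covers "Pascal" and RDNA1, so device creation
    // succeeds on them; mesh shader support must be probed separately.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    if (SupportsMeshShaders(device.Get())) {
        std::puts("Mesh shaders supported: using the meshlet geometry path.");
    } else {
        // This is where a game can warn the user, then continue on a
        // traditional vertex-shader fallback at a heavy performance cost.
        std::puts("Warning: GPU lacks mesh shader support; expect a severe performance loss.");
    }
    return 0;
}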
That doesn't mean the RX 5700 XT delivers unplayable results. 1080p at 60 FPS is within reach at the lowest settings, or at close to maximum settings with FSR Quality, which is not such a terrible tradeoff; you just have to make some compromises. We didn't spot any rendering errors or crashes.
Once we knew the RX 5700 XT works, we also wanted to test the NVIDIA side of things. Using the GeForce GTX 1080 Ti "Pascal," the flagship GPU of that generation, we were greeted with the same warning dialog as on the RX 5700 XT: that the GPU is missing support for mesh shaders. Not only does the GTX 1080 Ti vastly underperform, it yields far worse performance than the RX 5700 XT, nearly two-thirds lower. At launch, the RX 5700 XT was a little slower than the GTX 1080 Ti in our reviews of the time, but it has climbed since and is now a tiny bit faster. Since the card lacks DLSS support, FSR is the only option, but even that can't save it: running at 1080p lowest with FSR 2 Ultra Performance yielded only 27 FPS.
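As an aside on why even FSR 2 Ultra Performance couldn't rescue the card: FSR 2's documented quality modes scale the render resolution per axis (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x), so a 1080p output in Ultra Performance mode is upscaled from just 640x360. Here is a small sketch of the arithmetic (illustrative, not AMD's actual helper code):

// Internal render resolution for FSR 2's documented quality modes.
#include <cstdio>

struct Resolution { int w, h; };

static Resolution fsr2RenderResolution(Resolution display, float perAxisScale)
{
    return { static_cast<int>(display.w / perAxisScale),
             static_cast<int>(display.h / perAxisScale) };
}

int main()
{
    const Resolution display{1920, 1080};
    const struct { const char *name; float scale; } modes[] = {
        {"Quality",           1.5f},
        {"Balanced",          1.7f},
        {"Performance",       2.0f},
        {"Ultra Performance", 3.0f},
    };
    for (const auto &m : modes) {
        Resolution r = fsr2RenderResolution(display, m.scale);
        // Ultra Performance at 1080p renders internally at only 640x360.
        std::printf("%-17s -> %dx%d\n", m.name, r.w, r.h);
    }
    return 0;
}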
Comments on PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required
Well, gaming-wise that is.
Which brings us to the initial point: Alan Wake 2 isn't going to be more than a tangentially relevant game for the people who spend more than $300 on a graphics card, or more than $900 on a gaming laptop. It's not going to sell even a tenth of what Cyberpunk 2077's Phantom Liberty DLC did.
Until it comes out on Steam, it's a mostly irrelevant game for the people who spend big money on graphics cards, yet it's probably going to be ridiculously over-represented in benchmark suites because it'll feature in Nvidia's mandatory games review guidelines. Just like Control before it.
Let's say I'd spent 900 EUR on a 4070 Ti, and now the game director and "some dudes" are telling me my theoretical almost-1K GPU is not enough? Wut?
It is quite a bit about software, you know that. If patches actually do improve the performance of all those recently released games with issues, then yes, it is software + greed, not diminishing returns on photorealistic graphics (which is debatable in the first place, considering the sometimes dated look; where did that photorealism go, exactly?).
I know voting with your wallet is close to useless, but it seems most have given up and just accepted that brute-forcing is the only way.
Why does everybody want to play everything on Ultra graphics with all kinds of graphics cards?
The game is looking great, not denying that. It's just hella inconsistent: I can turn on path tracing in some parts and get 50-55 FPS, while in others I get 20.
I do still think they could optimize it more, not to run on 10-series cards, but for current-gen GPUs that aren't a 4090.
I'm really awaiting the Avatar game that uses the new version of Snowdrop: RTGI on consoles and additional RT on PC. From the trailers it's looking great, but I wonder what its performance will be like.
On the other side, you have a game like Cities: Skylines 2 that can barely run at 50 FPS on an empty map on a 4080 with a 7950X3D at 4K low, while not looking much better than the first game.
I have worked with Nvidia for almost 20 years and I've never seen any such guidelines. They never said I must test a certain game in my reviews; only once they asked "why are you still testing Control in 2022?", which is a very reasonable question.
Alan Wake 2 is still interesting to include in my tests because it's not an Unreal Engine game. The fact that it's on EGS makes my life much more difficult, though. It's the reason I never looked at Godfall again after my initial review. OTOH, the way things are going, all games will be Unreal in a few years anyway.
Give me a break ffs.
In general, though, I think we've crossed the line where there's no more wow factor in game graphics. They look real enough that there won't be a Half-Life-to-Crysis level of improvement in the next 10 years.
Where I see the biggest potential for improving games in the future is actually not better graphics, but better physics and better AI.
Make NPCs smarter and make environments more destructible. What good are path-traced graphics if they're immediately ruined by dumb NPCs with bad lip-syncing in environments where everything is static and cannot be destroyed?
I would argue that Crysis's appeal was not just graphics; it was also physics. Being able to cut down trees, then take those chunks and use them as projectiles, is something most games these days are still not doing. Some of the greatest and most revered games have used advanced physics, like Half-Life 2, Crysis, and even Dead Space (using enemy limbs as projectiles).
But I can see why Nvidia introduced Ray Reconstruction. Without it, PT has this, so to say, temporal or smudgy look: you can see the denoising filter trying its best to smooth the image out when you move the camera or just move around the environment. That happens regardless of upscaling mode, whether FSR Quality, Performance, or native.
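To make that smudginess concrete: temporal denoisers blend each frame's noisy sample with reprojected history, so after the camera moves, the stale history takes several frames to catch up. Here's a toy exponential-moving-average accumulator showing the lag (just the principle; real path-tracing denoisers, and Ray Reconstruction, are far more sophisticated):

// Toy temporal accumulator: blend each new noisy sample into history.
#include <cstdio>

static float accumulate(float history, float noisySample, float alpha)
{
    // Small alpha = smooth but laggy (smudgy under motion);
    // large alpha = responsive but noisy.
    return history + alpha * (noisySample - history);
}

int main()
{
    float history = 0.0f;      // stale history, e.g. right after the camera moved
    const float target = 1.0f; // the "true" radiance at this pixel now
    for (int frame = 1; frame <= 8; ++frame) {
        history = accumulate(history, target, 0.1f);
        // Convergence takes many frames -> the visible trailing/smudging.
        std::printf("frame %d: %.3f\n", frame, history);
    }
    return 0;
}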
But I guess this is a lost battle...