Friday, October 27th 2023
PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required
"Alan Wake II," released earlier this week, is the latest third person action adventure loaded with psychological thriller elements that call back to some of the best works of Remedy Entertainment, including "Control," "Max Payne 2," and "Alan Wake." It's also a visual feast as our performance review of the game should show you, leveraging the full spectrum of the DirectX 12 Ultimate feature-set. In the run up to the release, when Remedy put out the system requirements lists for "Alan Wake II" with clear segregation for experiences with ray tracing and without; what wasn't clear was just how much the game depended on hardware support for mesh shaders, which is why its bare minimum list called for at least an NVIDIA RTX 2060 "Turing," or at least an AMD RX 6600 XT RDNA2, both of which are DirectX 12 Ultimate GPUs with hardware mesh shaders support.
There was some confusion on gaming forums over the requirement for hardware mesh shaders, with many people assuming the game would not work at all on GPUs without mesh shader support, locking out a large number of gamers. Through the course of testing for our performance review, we learned that while "Alan Wake II" does rely on hardware mesh shaders, their absence does not break gameplay. You will, however, pay a heavy performance penalty on GPUs that lack support. On such GPUs the game shows a warning dialog that the GPU lacks mesh shader support (screenshot below), but you can choose to ignore this warning and go ahead and play. The game considers mesh shaders a "recommended GPU feature," not a requirement.

The severity of the performance loss is best illustrated with the AMD Radeon RX 5700 XT, based on the original RDNA architecture, which lacks hardware mesh shaders. In our testing at 1080p, without upscaling, the RX 5700 XT performs worse than the GeForce GTX 1660 Ti, even though in most other raster-only titles, on the latest AMD drivers, it is known to perform about as fast as an RTX 2080. It's important to note that the GTX 16-series "Turing," while lacking the RT cores and tensor cores of its RTX 20-series cousins, does feature hardware mesh shader support, and is hence able to perform along expected lines. We have included a projection of how the RX 5700 XT typically fares in our testing: it ends up roughly in the performance region of the RTX 3060 and RX 6600 XT. AMD's Radeon RX 6000-series "RDNA2" and current RX 7000-series "RDNA3" fully support hardware mesh shaders across all GPU models.
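For readers wondering what such a capability check looks like, below is a minimal sketch (our illustration, not Remedy's actual code) of how a Direct3D 12 application can query hardware mesh shader support before choosing between the fast path and a slower fallback; the printed messages are placeholders:

// Minimal sketch: querying hardware mesh shader support in D3D12.
// Link with d3d12.lib. Error handling is trimmed for brevity.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at the D3D12 baseline level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12-capable GPU found.\n");
        return 1;
    }

    // Mesh shader support is reported through the OPTIONS7 feature struct.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &options7, sizeof(options7)))
        && options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1)
    {
        std::printf("Mesh shaders supported: fast geometry path.\n");
    }
    else
    {
        // This is where a game could warn the user (as Alan Wake II does)
        // and continue on a slower, emulated geometry pipeline.
        std::printf("No mesh shader support: expect a performance penalty.\n");
    }
    return 0;
}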
That doesn't mean the RX 5700 XT delivers unplayable results. 1080p at 60 FPS is within reach at the lowest settings, or at close to maximum settings with FSR Quality, which isn't a terrible tradeoff, though you still have to make compromises. We didn't spot any rendering errors or crashes.
Once we knew the RX 5700 XT works, we also wanted to test the NVIDIA side of things. Using the GeForce GTX 1080 Ti "Pascal," the flagship GPU of that generation, we were greeted with the same warning dialog as on the RX 5700 XT: that the GPU is missing support for mesh shaders. Not only does the GTX 1080 Ti vastly underperform, it yields far worse performance than the RX 5700 XT, delivering only around two-thirds of its frame rate. At launch, the RX 5700 XT was slightly slower than the GTX 1080 Ti in our reviews of the time, but it has climbed since and is now a touch faster. Since the card lacks DLSS support, FSR is the only upscaling option, but even that can't save it: running at 1080p lowest with FSR 2 Ultra Performance yielded only 27 FPS.
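To put that last data point in perspective, here is a small sketch that uses the per-axis scale factors AMD documents for FSR 2's quality modes to work out the internal render resolution at a 1080p output; at Ultra Performance the card is effectively rendering at just 640x360 and still only managing 27 FPS:

// Minimal sketch: internal render resolutions implied by FSR 2's
// documented per-axis scale factors at a 1080p output.
#include <cstdio>

struct FsrMode { const char* name; float scale; };

int main()
{
    const FsrMode modes[] = {
        { "Quality",           1.5f },
        { "Balanced",          1.7f },
        { "Performance",       2.0f },
        { "Ultra Performance", 3.0f },
    };
    const int outW = 1920, outH = 1080; // 1080p output, as tested above

    for (const FsrMode& m : modes)
    {
        // FSR renders internally at output/scale per axis, then upscales.
        std::printf("%-17s -> %4d x %4d internal\n", m.name,
                    (int)(outW / m.scale), (int)(outH / m.scale));
    }
    return 0;
}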
Comments on PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required
Imagine paying four figures for a GPU (and another four figures for the platform) only to have to use upscaling just to get to 60 FPS, and the game isn't even running at max settings (by that I mean RT included).
What a sad state of affairs.
It's not necessarily a matter of people being afraid of progressing technology, it's that the progress isn't seen as impressive enough to justify the cost.
Lifelike is lifelike. What more do we need?
That means that despite being a GPU from 2006, it's still capable of tackling basic games released well into the 2020s. Unified shaders with full programmability really were the bedrock of modern graphics.
With no particular reference to anyone in this thread; just a general thought.
In other news I would quite like a winter home in the Bahamas or maybe Antigua and Barbuda, but my bank statements keep telling me it ain't happening anytime soon.
We all have our crosses to bear.
You can't help but feel slighted considering the cost of PC parts these days.
Older gamers (including myself) were expecting basically photo-realistic graphics by this point with the hardware to match. The graphics just aren't there and yet the mythical hardware is still required.
It's still a sore point for me, since in 2009 I bought a laptop with a DX10-only Mobility Radeon HD 4670 1 GB which, while low-to-mid range at the time, let me tweak and tune settings in many contemporary games to run them fluidly, until DX11 feature level 11_0 became ingrained in more and more titles, which the card couldn't boot at all.
DX10 was, in many ways, a short-lived transitional period that lasted only three years and had token games made using its feature set (devs mostly made games for 9.0c or 11 and just skipped 10 entirely, with some like Capcom making token efforts on PC, DMC4 and RE5 for example).
DX11 turned out to be the longest ever DX era, still relevant to this very day.
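As an aside on the lockout described above, a minimal sketch of Direct3D 11 feature-level negotiation shows why a title that requests only feature level 11_0, as many post-2011 games did, fails outright on DX10-class hardware such as that Mobility Radeon HD 4670 (the fallback message is illustrative):

// Minimal sketch: D3D11 feature-level negotiation. Link with d3d11.lib.
#include <d3d11.h>
#include <cstdio>

int main()
{
    // Requesting only 11_0: DX10-class GPUs cannot create this device.
    const D3D_FEATURE_LEVEL wanted[] = { D3D_FEATURE_LEVEL_11_0 };
    D3D_FEATURE_LEVEL got = {};
    ID3D11Device* device = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        wanted, 1, D3D11_SDK_VERSION, &device, &got, nullptr);

    if (FAILED(hr))
    {
        // A game without a 10_x code path simply refuses to start here.
        std::printf("Feature level 11_0 unavailable on this GPU.\n");
        return 1;
    }
    std::printf("Created device at feature level 0x%x.\n", (unsigned)got);
    device->Release();
    return 0;
}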
Not to mention, they can't even be bothered to fix age-old bugs by the looks of it, which is even sadder.
People are just in denial... I jumped on the 7900 XT because I already saw what was going to happen, and that it would affect my gaming too; it started doing so with TW: Warhammer already, and that's not even a state-of-the-art engine. For much the same reasons, I strongly recommend people get 16 GB+ VRAM and solid hardware rather than buying/paying for better software, for anything midrange or up. And here we are...
I used to give EGS the benefit of the doubt, but I just can't anymore. Their launcher is horrible to use; it actively detracts from playing games through EGS. Because of that, I just don't trust keeping my game library there either.
However, can somebody explain how this game engine is good when, on an RTX 3080 GPU with a 13700K CPU, you get ~30 FPS at 1440p (natively, no DLSS garbage), and WITHOUT enabling RT?