Friday, October 27th 2023
PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required
"Alan Wake II," released earlier this week, is the latest third person action adventure loaded with psychological thriller elements that call back to some of the best works of Remedy Entertainment, including "Control," "Max Payne 2," and "Alan Wake." It's also a visual feast as our performance review of the game should show you, leveraging the full spectrum of the DirectX 12 Ultimate feature-set. In the run up to the release, when Remedy put out the system requirements lists for "Alan Wake II" with clear segregation for experiences with ray tracing and without; what wasn't clear was just how much the game depended on hardware support for mesh shaders, which is why its bare minimum list called for at least an NVIDIA RTX 2060 "Turing," or at least an AMD RX 6600 XT RDNA2, both of which are DirectX 12 Ultimate GPUs with hardware mesh shaders support.
There was some confusion on gaming forums over the requirement for hardware mesh shaders, with many people assuming that the game would not work at all on GPUs without mesh shader support, locking out a lot of gamers. Over the course of testing for our performance review, we learned that while "Alan Wake II" does rely on hardware support for mesh shaders, the lack of it does not break gameplay. You will, however, pay a heavy performance penalty on GPUs without hardware mesh shader support. On such GPUs, the game shows a warning dialog box stating that the GPU lacks mesh shader support (screenshot below), but you can choose to ignore this warning and go ahead and play the game. The game treats mesh shaders as a "recommended GPU feature," not a requirement. Without mesh shaders, you can expect a severe performance loss, best illustrated by the AMD Radeon RX 5700 XT based on the original RDNA architecture, which lacks hardware mesh shaders. In our testing at 1080p without upscaling, the RX 5700 XT performs worse than the GeForce GTX 1660 Ti. In most other raster-only titles, the RX 5700 XT with the latest AMD drivers is known to perform about as fast as an RTX 2080; here it lags behind the GTX 1660 Ti. It's important to note that the GTX 16-series "Turing," while lacking the RT cores and tensor cores of its RTX 20-series cousins, does feature hardware support for mesh shaders, and is hence able to perform along expected lines. We have included a projection of how the RX 5700 XT typically fares in our testing; it ends up roughly in the performance region of the RTX 3060 and RX 6600 XT. AMD's Radeon RX 6000-series "RDNA2" and current RX 7000-series "RDNA3" fully support hardware mesh shaders across all GPU models.
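For readers curious how a game can detect this situation and warn rather than refuse to launch, here is a minimal sketch of how a Direct3D 12 application can query hardware mesh shader support. This is an illustrative example only, not Remedy's actual code; it uses the standard D3D12_OPTIONS7 feature check from the Windows SDK.

```cpp
// Minimal sketch (not Remedy's code): detect hardware mesh shader support in D3D12
// and warn instead of exiting, treating mesh shaders as "recommended, not required."
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<ID3D12Device> device;
    // Create a device on the default adapter at a baseline feature level.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0, IID_PPV_ARGS(&device))))
    {
        std::printf("No Direct3D 12 capable GPU found.\n");
        return 1;
    }

    // Mesh shader support is reported through the OPTIONS7 feature block.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    bool hasMeshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &options7, sizeof(options7))) &&
        options7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;

    if (!hasMeshShaders)
    {
        // A game can warn here and fall back to a slower geometry path
        // instead of blocking play, which is the behavior Alan Wake II exhibits.
        std::printf("Warning: GPU lacks hardware mesh shader support; "
                    "expect reduced performance.\n");
    }
    return 0;
}
```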
That doesn't mean the RX 5700 XT delivers unplayable results. 1080p at 60 FPS is within reach at the lowest settings, or at close to maximum settings with FSR Quality, which isn't such a terrible tradeoff, though you still need to make compromises. We didn't spot any rendering errors or crashes.
Once we knew that the RX 5700 XT works, we also wanted to test the NVIDIA side of things. Using the GeForce GTX 1080 Ti "Pascal," the flagship GPU of that generation, we were greeted with the same warning dialog as on the RX 5700 XT: that the GPU is missing support for mesh shaders. Not only does the GTX 1080 Ti vastly underperform, it yields far worse performance than the RX 5700 XT, nearly two-thirds lower. At launch, the RX 5700 XT was a little slower than the GTX 1080 Ti in our reviews of the time, but it has climbed since and is now a tiny bit faster. Since the card lacks DLSS support, FSR is the only upscaling option, but even that can't save it: running at 1080p lowest with FSR 2 Ultra Performance yielded only 27 FPS.
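To put the FSR 2 Ultra Performance result in perspective, the quality presets correspond to fixed per-axis scale factors in AMD's published FSR 2 documentation, so the internal render resolution at 1080p is easy to work out. The short sketch below is illustrative only; the mode names and factors are the publicly documented ones.

```cpp
// Sketch: internal render resolutions implied by AMD's documented FSR 2 scale
// factors at a 1920x1080 output. Ultra Performance renders at just 640x360.
#include <cstdio>
#include <cmath>

struct FsrMode { const char* name; float scale; };

int main()
{
    const int outW = 1920, outH = 1080;
    const FsrMode modes[] = {
        {"Quality",           1.5f},
        {"Balanced",          1.7f},
        {"Performance",       2.0f},
        {"Ultra Performance", 3.0f},
    };

    for (const FsrMode& m : modes)
    {
        // FSR 2 renders at output resolution divided by the per-axis factor,
        // then temporally upscales back to the output resolution.
        int renderW = static_cast<int>(std::lround(outW / m.scale));
        int renderH = static_cast<int>(std::lround(outH / m.scale));
        std::printf("%-18s renders at %dx%d -> upscaled to %dx%d\n",
                    m.name, renderW, renderH, outW, outH);
    }
    return 0;
}
```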
100 Comments on PSA: Alan Wake II Runs on Older GPUs, Mesh Shaders not Required
If we don't start embracing things like mesh shaders now, we could be sacrificing performance that doesn't need sacrificing. Mesh shaders can let games either look better geometrically or run better; they're an insanely good optimization tool (and please don't compare this to DLSS or FSR, it's silly to do so).
AMD, for example, has a hybrid reflections demo on their dev website that uses both SSR and RT, with RT helping where SSR cannot work properly. It looks great and performs well. They also have a whitepaper on a global illumination system that uses RT, but rather than brute forcing it with PT, it uses radiance caching to improve performance.
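A rough sketch of the hybrid-reflections idea, with hypothetical stub functions (this is not AMD's actual sample code): march the depth buffer first, and only pay for a hardware ray where the screen-space trace cannot resolve a hit, such as off-screen geometry, disocclusions, or grazing angles. In a real renderer both traces run on the GPU in shaders; the C++ below just shows the decision logic.

```cpp
// Conceptual sketch only; traceScreenSpace/traceHardwareRay are hypothetical stand-ins.
#include <cstdio>

struct float3 { float x, y, z; };
struct Hit { bool valid; float3 color; };

// Stub for the cheap screen-space reflection march (reuses what's already on screen).
Hit traceScreenSpace(const float3&, const float3&) { return {false, {0.f, 0.f, 0.f}}; }
// Stub for the expensive hardware ray trace fallback.
Hit traceHardwareRay(const float3&, const float3&) { return {true, {0.2f, 0.3f, 0.8f}}; }

float3 shadeReflection(const float3& origin, const float3& dir)
{
    Hit ssr = traceScreenSpace(origin, dir);    // cheap path first
    if (ssr.valid)
        return ssr.color;
    return traceHardwareRay(origin, dir).color; // RT only where SSR misses
}

int main()
{
    float3 c = shadeReflection({0.f, 0.f, 0.f}, {0.f, 1.f, 0.f});
    std::printf("reflection colour: %.2f %.2f %.2f\n", c.x, c.y, c.z);
    return 0;
}
```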
An even more recent example in games: Metro Exodus EE and Witcher 3 Remastered both used RTGI, a mix of probes and RT. Both, especially ME, had amazing lighting, and while they were costly to run, they were not as costly as PT.
For Alan Wake 2, the best example is reflections. To get RT reflections in this game you need to turn on PT, which is stupid. They could have added an RT reflections option pretty easily; it wasn't a problem in Control either.
But they chose to brute force it with PT, probably because of sponsorship with nvidia.
Nvidia has a habit of presenting technologies and then using them to the fullest, even beyond the point of diminishing returns and even when it tanks their own GPUs (as long as it tanks the competition more, it's good). Happened before and happens now. It will not become mainstream anyway because of consoles, and just as GameWorks, PhysX etc. didn't see mainstream adoption outside of Nvidia-sponsored titles, the same will happen here. Did those smoke and hair simulations look really nice? They did. Was it viable to use them in every game? Nope.
It is not to say that RT or PT will disappear; it's just that acceleration for those calculations will be used differently, to assist raster in places where it cannot improve visuals anymore.
Maybe for the next gen consoles we will see fully PT games become mainstream.
20 years ago, we paid 200 bucks every year. Now, we pay 500-800 every 5 years. That's it.
Edit: Half-Life 2 was regarded as a bloody miracle for supporting DirectX 7 as well as 9, and ran at ultra-low graphics on a GeForce 2, which was a 4-year-old graphics card at the time.
Metro 2033 battered the cards even harder:
Just interesting to compare...
Nowadays, modern games that are supposedly "setting the bar" have a lot of room to reduce visual quality, but the performance you gain is so minimal it's not worth it.
Obviously it's because the developer does not optimize for this experience, but it does sting a bit.
I'd bet most would much prefer a nice 60 fps over a "next-gen experience".
I never played the first Alan Wake, but am very much enjoying all the content and discussion on 2; it seems like a masterpiece for its time. Might be time to pick up the first game remastered while it's cheap and wait for a special on 2.
*(Assuming I'm still alive)
Oblivion even booted on a GeForce FX (3 years old by its release) if you disabled HDR and fell back to bloom lighting but... Can't speak for fps...
Yep, until unified shaders, and I'd argue DirectX 11 cards, upgrades were essentially mandatory. Things are cozier now; support for obsolete/downlevel hardware is quite good.
Every new DirectX version needed a new graphics card to run. You can look up any game using the DirectX versions above which came out every year. OpenGL games weren't any better, like Quake 3 and other id Tech 3 engine based ones. Or there's id Tech 4 which ran under Doom 3, which (quoting Wikipedia) "would not even run on high end graphics cards in 2004 as the engine required at least 512 MB of video memory to display properly and at playable speeds." That is, at Ultra graphics, it required hardware that wasn't even available until about 2 years later!
Like I said above, Half-Life 2 was revolutionary in a sense that it supported 3 major DirectX versions, making it run on a 4 year-old GeForce 2, which was unheard of.
I got my first PC in 1998 which had partial support for DirectX 7 (more like 6.1). Two years later, no new game would run on it. At all. So I basically missed the 2000-2004 era entirely.
Edit: Or what about the proprietary APIs that ran only on specific hardware, like Glide on 3DFX? How could I forget about Oblivion? You basically had to mod it to make it run properly on anything other than a high-end GeForce 7800 or such.
Or even then, you turned on HDR and the grass density/distance, and your PC died. :laugh:
At the minimum spec required.
I suspect this is one such case, as the low-mid settings look absolutely stunning, and because this is Remedy Entertainment, and nothing is ever normal with these guys, they pushed, and they pushed big time.
Minimum spec is minimum spec for a reason, it's minimum, not recommended
Alan Wake looks great, but nothing special, nothing we haven't seen before, nothing that would warrant this level of poor performance!
Plus Crysis sold like 10 copies due to its absurd requirements, which I think will be the case for Alan Wake 2 on PC as well.
And the thing is, I think Hogwarts looks just as good, if not better in certain areas, yet the game can easily be run by 1000-series cards at the lowest settings.
Let's not even mention other cards.
Sorry, I really do not understand you all.