Friday, January 27th 2023
Forspoken Simply Doesn't Work with AMD Radeon RX 400 and RX 500 "Polaris" GPUs
AMD Radeon RX 400 series and RX 500 series graphics cards based on the "Polaris" graphics architecture are simply unable to run "Forspoken," as users on Reddit report. The game requires DirectX 12 feature level 12_1, which the architecture does not support. Interestingly, NVIDIA's "Maxwell" graphics architecture, which predates AMD "Polaris" by almost two years, supports FL 12_1 and is able to play the game. Popular GPUs from the "Maxwell" generation include the GeForce GTX 970 and GTX 960. Making matters much worse, AMD has yet to release an update to its Adrenalin graphics drivers with "Forspoken" optimizations for the RX Vega, RX 5000, and RX 6000 series. Its latest 23.1.2 beta drivers, which do include these optimizations, only support the RX 7000 series "RDNA 3" graphics cards. It has now been over 50 days since the vast majority of AMD's discrete GPUs last received a driver update.
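For context, a hard feature-level requirement boils down to a failed device creation. Below is a minimal, hypothetical C++ sketch (not the game's actual startup code) of the check a title that insists on FL 12_1 effectively performs; on "Polaris," which tops out at FL 12_0, it fails:

```cpp
// Hypothetical sketch, not Forspoken's actual code. Link with d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main()
{
    ID3D12Device* device = nullptr;
    // nullptr = default adapter; D3D_FEATURE_LEVEL_12_1 is the minimum the game demands.
    // RX 400/500 "Polaris" parts report 12_0, so this call fails on them, while
    // Maxwell 2, Pascal, Vega and newer succeed.
    HRESULT hr = D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_1,
                                   IID_PPV_ARGS(&device));
    std::printf("Feature level 12_1: %s\n",
                SUCCEEDED(hr) ? "supported" : "NOT supported");
    if (device) device->Release();
    return SUCCEEDED(hr) ? 0 : 1;
}
```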
Source:
xCuri0 (Reddit)
86 Comments on Forspoken Simply Doesn't Work with AMD Radeon RX 400 and RX 500 "Polaris" GPUs
You can run it with a Vulkan wrapper, or on Linux with a Polaris card, but I'm not sure why you'd want to. Either way it's a bad game and the performance is abhorrent regardless of whether it's a 580, 480, 1060, or 970.
It is too early for Polaris cards to run into these kinds of hiccups, but this game is an exception given its extremely poor optimization. It's not an indicator of the broader market.
Anyway....
Plays on an i5-4690, 16 GB DDR3-1600, 4 GB GTX 1050 Ti
528.24 Game Ready driver
Uses 3.6 GB VRAM, 11.5 GB system RAM total
Just past the tutorial area, swinging the view slowly across the map, Low settings preset
1080p: 12-18 fps
1080p, FSR 2 Quality: 17-20 fps
720p: 18-23 fps
720p, FSR 2 Quality: 21-27 fps
720p looks effing horrible in this game; there's something wrong with the in-game resolution switching. I'm familiar with dumbing down to 720p and I've never seen it take the hit I see in this game. I'll try starting the game with Windows set to 720p to see if there's a difference. FSR looks pretty good at 1080p in this game, but I'm not too picky about these things.
I'll pop the 745 in there next. I have an RX 6400, but it's elsewhere right now; I'll test that tonight in a similar i7-4790 PC. All of these are Dell OptiPlexes with Dell memory, so modest systems. It will be interesting to see whether the 6400's x4 bus (PCIe 3.0 in the OptiPlex) is an issue here or not.
Weirdly, the game is pegged at 1860 MHz on my ASUS Phoenix 1050 Ti (155 MHz OC on the core, 150 MHz OC on the VRAM), and it's not using the 68-70 W that it usually does in games, instead drawing around 58-64 W with GPU load hovering in the 80% range. Usually that core clock will dynamically reduce to keep the power under 70 W. But the CPU is frequently at 40% usage with peaks into the 70s; I will need to track this more.
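For what it's worth, if you want to log that clock/power/load behaviour over time rather than eyeballing an overlay, here is a rough sketch using NVIDIA's NVML library (the same interface nvidia-smi is built on); the calls are standard NVML, but the one-minute sampling loop is just an illustration:

```cpp
// Rough logging sketch using NVML; compile against nvml.h and link nvml.lib.
// Illustrative only - error handling is minimal.
#include <nvml.h>
#include <cstdio>
#include <thread>
#include <chrono>

int main()
{
    if (nvmlInit() != NVML_SUCCESS) return 1;

    nvmlDevice_t gpu;
    nvmlDeviceGetHandleByIndex(0, &gpu);   // first GPU (the 1050 Ti in this case)

    for (int i = 0; i < 60; ++i)           // one sample per second for a minute
    {
        unsigned int mhz = 0, milliwatts = 0;
        nvmlUtilization_t util = {};
        nvmlDeviceGetClockInfo(gpu, NVML_CLOCK_GRAPHICS, &mhz);
        nvmlDeviceGetPowerUsage(gpu, &milliwatts);       // reported in milliwatts
        nvmlDeviceGetUtilizationRates(gpu, &util);
        std::printf("core %u MHz, %.1f W, GPU %u%%, mem ctrl %u%%\n",
                    mhz, milliwatts / 1000.0, util.gpu, util.memory);
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
    nvmlShutdown();
    return 0;
}
```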
On an RX 580 8 GB, Mesa 22.3.3-3
Demo version
Proton Experimental
1080p
High Preset
VRS Off
Dynamic Resolution Off
Model Memory High
Texture Memory High
The game looks awful, though.
This wouldn't rule out the use of Polaris cards yet, but it does sound the warning sirens of doom.
I was, and I think every PC gamer outside a niche is OK with having to compromise on in-game settings for adequate FPS; I only just upgraded from a Vega 64.
But there are plenty of options; this wouldn't be the thread to debate that, though, and Polaris isn't dead yet either. A 6600 XT or 3060 Ti would make Polaris look poor at this point, and might be affordable to some.
In any case, Polaris is an aged architecture, so I really don't see why people are getting upset about the lack of features to run a game. It's not great, but even if you have a 10-year-old card, it's going to perform really badly in this game. And if you have a GTX Nvidia card, you should be thanking AMD and Intel for enabling decent upscaling technology so that you can still play some of these new games. Nvidia has left GTX users out in the cold.
I'm using a Radeon Pro W6600 with 2021 drivers, and the game runs just fine. I get a "driver warning," but I ignore it and it plays perfectly fine. Same with Fortnite; it says my driver has issues with DX12. Lol, no it doesn't, the game runs at 90 fps at almost max settings at 1440p perfectly fine. Don't believe the hype!
Wow :D Yeah? How did the last launch work out for them? It looked good in the presentation. The promised performance, however, was not unlocked on reference cards, while an OC yields big gains; there is a vapor chamber issue and several other pretty serious issues, plus it's priced relative to performance akin to Nvidia's new stack, plus there are some pretty nasty driver bugs going on, and once again we're seeing a highly spotty driver regime. Cards are missing support for some games right now whereas others got it, for example.
Support periods on hardware are shorter than the competitor's in general; feature sets are less expansive or less future-proof - we have an example here right now. The only real pro AMD has going for its newest range is the better I/O. Well, yay. You can play on their cards at a resolution that less than 0.5% of the audience will get to use anytime soon, never mind the FPS.
Honestly man, I was about to jump on the 7900 series, and then AMD happened. Again. For the umpteenth time. They apparently can't keep a GPU product smooth sailing for any longer than a single gen - right now RDNA 2 is the unicorn. The rest? Barely interesting. It's frustrating as hell, IMHO. They really need to do better. People keep wondering why, oh why, Nvidia is at 80+% share... here it is.
GCN 1 is 11_1; Kepler and Maxwell 1 are 11_0.
GCN 2 through 4 are 12_0; Maxwell 2 and Pascal are 12_1.
GCN 5 and RDNA 1 are 12_1; Turing* fully supported DirectX 12 Ultimate (12_2), whereas its contemporaries did not.
AMD caught up again with RDNA 2.
*Small Turing (GTX 16 series) is 12_1 due to the removal of the ray tracing cores, but otherwise supports the same shader instructions as the big Turing cards.
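A hedged sketch of how that breakdown translates into an actual query (standard D3D12 API, illustrative code only): ask the device for the highest feature level it supports, which on Polaris comes back as 12_0 - exactly the line Forspoken refuses to cross.

```cpp
// Illustrative only: report the maximum D3D12 feature level of the default adapter.
// Link with d3d12.lib; D3D_FEATURE_LEVEL_12_2 needs a reasonably recent Windows SDK.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main()
{
    // Create a device at the 11_0 baseline first...
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::puts("No D3D12-capable adapter found");
        return 1;
    }

    // ...then ask which of the levels listed above it can actually report.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
        D3D_FEATURE_LEVEL_12_2, // DirectX 12 Ultimate (Turing, RDNA 2 and newer)
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS query = {};
    query.NumFeatureLevels        = static_cast<UINT>(sizeof(levels) / sizeof(levels[0]));
    query.pFeatureLevelsRequested = levels;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &query, sizeof(query))))
    {
        // Polaris prints 0xc000 (12_0); Maxwell 2/Pascal/Vega print 0xc100 (12_1).
        std::printf("Max supported feature level: 0x%x\n",
                    query.MaxSupportedFeatureLevel);
    }
    device->Release();
    return 0;
}
```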
Also, I'm not a butthurt AMD fanboy (I've been around the block since the Nvidia GeForce 6 series), and I buy what's suitable for my needs. I currently have a 6900 XT and it has been absolutely incredible: a minimum of bugs and incredible performance. Though that 4090 is looking very enticing...
Enjoy your day, between all the butthurt and fanboy rhetoric.
I remember being one of the few people who kept saying these claims were nonsense. True. Drivers aren't really optimized for specific games, at least not in the way people think.
Workarounds in drivers may happen, but are very costly and sometimes even risky. But in 99.99% of cases, the drivers operate agnostic of what is running. Games don't need to be "supported" by the drivers. Drivers implement graphics APIs, and any software utilizing these APIs according to spec should just work. In reality though, the implementations aren't 100% correct, and new software tends to reveal bugs that have been there all along.
But as our friend brutlern said, GPU makers will spin this as a good thing, like claiming to "optimize" the driver for a game, when in reality they should say: we would like to thank the makers of <insert game here> for exposing an embarrassing flaw in our messy driver codebase, and we chose to resolve it with a workaround / a rewrite of the affected part of the code base.
In rare cases they may override shaders etc., but this usually creates more problems than it solves. It's not hard to figure out that AMD sells very few GPUs, at least in retail. Just look at the difference between total market share and Steam market share over the years. Unfortunately this means they ship relatively few retail cards, which is why they aren't able to truly compete with Nvidia, even when they have a decent product. They need millions of cards to make a dent in Nvidia's sales, and at least hundreds of thousands around launch, not the 10,000(?) or so they managed last launch.
This obviously is another greenscam.
Although from what I can see, Vega 64 supports the "DX12_1" feature level.
But in the past we do have cases where Nvidia was somehow involved: the DX 10.1 removal from the first Assassin's Creed game, the tessellated ocean under the city in Crysis 2, all those games with PhysX support that ran like slideshows on systems not using an Nvidia card as the primary card, or that offered poor or even non-existent physics without PhysX, all those games using GameWorks libraries that are closed code, or features like HairWorks, again closed code. The fact that Nvidia was even punishing (in a way) its own customers for using another brand's hardware as the primary video adapter, by disabling CUDA and PhysX support, is also an indication of a company that could easily work behind closed doors to make its hardware artificially look better than the competition. So, suspecting Nvidia of having something to do with making AMD look worse does have a basis. AMD sometimes manages to do damage to itself without any help from others, but sometimes it's not their fault.
In this case the choice not to implement 12_1 support is theirs alone. Until today it didn't cost them; today they get more negative headlines.