Wednesday, November 4th 2020
Godfall System Requirements List 12 GB VRAM for 4K and Ultra HD Textures
Godfall, the looter-slasher RPG being developed by Counterplay Games in close collaboration with AMD, will require 12 GB of VRAM for maxed-out settings at 4K resolution. As announced in the partner videos AMD showed when it revealed the RX 6000 series of graphics cards, Godfall is being built with DirectX 12 Ultimate and DXR in mind, and takes advantage of a number of rendering technologies from the DXR 1.1 feature set alongside AMD's FidelityFX technologies. Counterplay Games will make a 4K x 4K Ultra HD texture pack available for maxed-out settings - a requirement that sits well within the 16 GB of VRAM AMD has settled on for its RX 6900 XT, RX 6800 XT, and RX 6800 graphics cards.
Godfall features Variable Rate Shading (VRS) for increased performance with no discernible loss of visual quality, offers ray-traced shadows (platform-agnostic), and makes use of AMD's FidelityFX Contrast Adaptive Sharpening. This technology has shown great results in improving both performance and image quality - it has been benchmarked as offering performance comparable to that of DLSS 2.0 in Death Stranding, for instance, relative to a full 4K render.
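The contrast-adaptive idea behind CAS - sharpen flat areas strongly, and back off where local contrast is already high to avoid halos - can be illustrated with a simplified sketch. This is not AMD's actual shader (the real FidelityFX CAS kernel differs in its weighting math); the cross-shaped neighbourhood and weight formula below are assumptions for illustration only:

```python
import numpy as np

def cas_sharpen(img, sharpness=0.5):
    """Simplified contrast-adaptive sharpening of a grayscale image
    with values in [0, 1]. The sharpening weight shrinks where local
    contrast is high, which is the core idea behind AMD's FidelityFX
    CAS, though the real shader's math differs in detail."""
    eps = 1e-5
    # Cross-shaped neighbours: up, down, left, right (wrap at edges).
    up    = np.roll(img,  1, axis=0)
    down  = np.roll(img, -1, axis=0)
    left  = np.roll(img,  1, axis=1)
    right = np.roll(img, -1, axis=1)
    stack = np.stack([img, up, down, left, right])
    lo, hi = stack.min(axis=0), stack.max(axis=0)
    # Contrast-adaptive weight: strong in flat areas, weak at edges.
    amp = np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) / (hi + eps), 0.0, 1.0))
    # Map sharpness in [0, 1] to a negative cross-kernel weight.
    w = -amp * (sharpness * 0.125)
    out = (img + w * (up + down + left + right)) / (1.0 + 4.0 * w + eps)
    return np.clip(out, 0.0, 1.0)
```

Because the kernel normalizes by its own total weight, perfectly flat regions pass through unchanged; only areas with some (but not extreme) local contrast are sharpened.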
Source:
Videocardz
87 Comments on Godfall System Requirements List 12 GB VRAM for 4K and Ultra HD Textures
RIP me, press F.
Gotta watch out - people get bent out of shape over pretty much nothing these days. Like, offended over what? Any chance I missed something and pissed this guy off elsewhere?
Really, shit happens. I got bored of the video tbh - not my kind of game by the looks of it.
Btw, the GPU releases nowadays are starting to look more and more like what 3DFX was doing back in the day with their Voodoo 4 cards... Just saying...
If I said nobody buys the 3090, you would say there are people buying it. Likewise, there are people playing Strange Brigade - and I'm sure there are more of them than there are people willing to buy a 3090.
At least you can play Strange Brigade; actually getting a 3090 is more of a fantasy.
The game is a benchmark because reviewers often use it to benchmark Vulkan scaling performance. But as the charts show, very few actually play it. The all-time peak is under 2,000 players, and that was at launch, when the peak would naturally be at its highest. (Even Halo Wars, which came out a decade ago on the Xbox 360, for a series whose primary audience is on Xbox, not PC, has had more peak players!) The average has never been above 300 players... The game isn't popular, end of story. You're one of the few actually playing it. There's no arguing with the facts. I'm not saying the game is bad, I'm saying fewer than 500 people play it on a daily basis, which makes it a game played by "nobody".
I haven't said anything about the 3090 so your "gotcha" falls a bit flat. I don't see how it's relevant.
Anyway, this discussion is off topic. My point overall is this'll end up being another one of those benchmarking titles that nobody actually plays, like Strange Brigade and Ashes.
The 3090 was my example, not yours, and you still wouldn't say nobody has a 3090 - because in fact there are people who own the card, even if the number is small. That's my point.
I'm serious - try playing 3DMark. :) Showcasing the capabilities of new tech in a game versus in a benchmark are totally different things. And what about real-world performance? Isn't that better demonstrated in an actual game you can play than in a benchmark?
Back on topic: Godfall adds a lot of new tech that can be showcased. I think it should be added to the testing suite - if not as a general RT performance measure, then at least as a good comparison. It's a new engine, and it tests all the new tech we may see in future titles.
Now that AMD is back in the high-end game, you can rest assured they will be pushing for higher VRAM usage and the accompanying extra detail. On PC, there is now a performance/quality potential gap between AMD and Nvidia cards. If you look back in time objectively, Nvidia has made use of such gaps every single time, and not to the benefit of AMD performance either.
Remember, this was about 'the best' GPU choice this gen. I think it's clear by now that a 10 GB 3080 isn't it.
As for the game itself, that's really beside the point - I reckon it's a pretty weak title. But it is one of the console launch titles too, so it's writing on the wall.
About the performance penalty: frame-time variance does go up when your VRAM is fully allocated. It can kill smoothness.
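The smoothness point can be made concrete: two runs with identical average FPS can feel very different if one has frame-time spikes. A minimal sketch with made-up frame times in milliseconds (`frametime_stats` is a hypothetical helper, not from any benchmarking tool):

```python
import statistics

def frametime_stats(frame_ms):
    """Summarise a list of per-frame render times in milliseconds:
    average FPS, 1% low FPS, and frame-time standard deviation."""
    avg_fps = 1000.0 / statistics.mean(frame_ms)
    slowest = sorted(frame_ms, reverse=True)
    one_pct = slowest[:max(1, len(slowest) // 100)]   # slowest 1% of frames
    one_pct_low_fps = 1000.0 / statistics.mean(one_pct)
    return avg_fps, one_pct_low_fps, statistics.pstdev(frame_ms)

# Two hypothetical runs with the same average frame time (16.7 ms),
# but the second has one long stall - e.g. a VRAM eviction hitch.
steady = [16.7] * 100
spiky  = [15.0] * 99 + [185.0]
```

Both runs report the same average FPS, but the spiky run's 1% lows collapse and its frame-time deviation balloons - which is exactly what "kills smoothness" even when the averages look fine.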
Nvidia has only itself to blame for not only failing to increase VRAM on its cards but actually decreasing it over a three-generation period. My 1080 Ti is two generations old and has 11 GB of VRAM; the 3080 has 10 GB. That's pathetic - corners were cut with Ampere.
www.ebuyer.com/911027-xfx-radeon-rx-570-xxx-8gb-graphics-card-rx-570p8dfd6
£120 excluding tax, dual-fan, 8GB, and a metal backplate.
about $205 Canuck dollars. Here they are listed for about $265 w/o tax
I don't recommend AMD GPUs often, but really, save your money for an RX 6000 series card. At least those options start with 16 GB of VRAM - something that will actually max out detail in true next-gen games, not cross-gen bullcrap.
- Consoles do not have more memory: they're equipped with 16 GB of RAM, about 3 GB of which is reserved for the OS, with the rest divided between game code and VRAM. Unless you think games will be coded to run within 1 GB or something.
- Godfall is so "true next gen" it's running on Unreal Engine (see www.techpowerup.com/forums/threads/godfall-system-requirements-list-12-gb-vram-for-4k-and-ultra-hd-textures.274218/post-4385342 )
As for RAM usage, current PC games typically use 4-7 GB - I've just come here from GN, where Steve has a video investigating exactly that as part of their 2-vs-4-sticks findings - and a lot of that is used as swap space for GPU VRAM. With DirectStorage, the consoles will probably let developers allocate up to 12 GB of RAM to the GPU; if not immediately, that will eventually become the baseline minimum for developers to target once the inevitable mid-cycle console refreshes arrive with increased specs and performance.
If you don't feel comfortable with 10 GB, don't get a 3080. I see no reason to parrot cries of betrayal and whatnot about it.
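The unified-memory split described a few comments up works out as simple arithmetic. The 3 GB OS reservation is the commenter's figure; the 4 GB game-code share is purely a hypothetical assumption for illustration:

```python
total_ram_gb = 16.0    # unified memory pool on the new consoles
os_reserved_gb = 3.0   # OS reservation figure from the comment above
available_gb = total_ram_gb - os_reserved_gb

# Hypothetical split: if a game keeps ~4 GB for code and CPU-side data,
# whatever remains of the unified pool is effectively its "VRAM".
game_code_gb = 4.0
effective_vram_gb = available_gb - game_code_gb
print(effective_vram_gb)  # 9.0
```

Under those assumptions the GPU's effective share lands in the 9-12 GB range depending on how much the game keeps CPU-side, which is why neither "consoles have 16 GB of VRAM" nor "10 GB is obviously plenty" follows directly from the spec sheet.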