Wednesday, November 4th 2020

Godfall System Requirements List 12 GB VRAM for 4K and Ultra HD Textures

Godfall, the looter-slasher RPG being developed by Counterplay Games in close collaboration with AMD, will require 12 GB of VRAM for maxed-out settings at 4K resolution. As revealed in one of AMD's partner videos shown when the company announced the RX 6000 series of graphics cards, Godfall is being built with DirectX 12 Ultimate and DXR in mind, and takes advantage of a number of rendering technologies that are part of the DXR 1.1 feature set, alongside AMD's FidelityFX technologies. Counterplay Games will make a 4K Ultra HD texture pack available for maxed-out settings - well within the 16 GB of VRAM AMD has settled on for its RX 6900 XT, RX 6800 XT, and RX 6800 graphics cards.

Godfall features Variable Rate Shading (VRS) for increased performance with no discernible loss of visual quality, as well as platform-agnostic ray-traced shadows, and makes use of AMD's FidelityFX Contrast Adaptive Sharpening (CAS). That technology has shown strong results in improving both performance and image quality - in Death Stranding, for instance, it has been benchmarked as offering performance comparable to DLSS 2.0 when measured against a full 4K render.
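Neither Counterplay Games nor AMD has published engine code, but purely as an illustration of what VRS support involves at the API level, the sketch below shows how a DirectX 12 Ultimate renderer could check for Variable Rate Shading and request a coarser per-draw shading rate. The structures and calls come from the public D3D12 headers; the helper functions themselves are hypothetical, not Godfall's implementation.

// Minimal sketch (not Godfall's actual code): query D3D12 Variable Rate Shading
// support and apply a coarser per-draw shading rate on hardware that offers it.
#include <d3d12.h>

// Returns the VRS tier reported by the driver (NOT_SUPPORTED, TIER_1 or TIER_2).
D3D12_VARIABLE_SHADING_RATE_TIER QueryVrsTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))))
        return D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED;
    return options6.VariableShadingRateTier;
}

// Requests 2x2 coarse shading for subsequent draws. With PASSTHROUGH combiners the
// per-draw rate is used as-is; Tier 2 hardware could additionally combine it with a
// screen-space shading-rate image for finer control.
void Apply2x2ShadingRate(ID3D12GraphicsCommandList5* cmdList,
                         D3D12_VARIABLE_SHADING_RATE_TIER tier)
{
    if (tier == D3D12_VARIABLE_SHADING_RATE_TIER_NOT_SUPPORTED)
        return; // fall back to full-rate shading
    const D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH
    };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
}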
Source: Videocardz

87 Comments on Godfall System Requirements List 12 GB VRAM for 4K and Ultra HD Textures

#51
lexluthermiester
mechtechNow if people are worried about 1 game and have a 4k screen, then wait for a review before making a purchase.
That's a fair point. Still, it's easy to understand the concern.
mechtechebay/scalpers

Also for retailers/etailers I typically think in CND$, so my bad there for not differentiating.
Ah, gotcha.
bugAnybody else remember:
forum.beyond3d.com/threads/doom-3s-ultra-quality-mode-and-512mb-graphics-boards.18114/ DOOM 3: High vs. Ultra quality - TR Forums
I do remember that. But then again, 1GB cards were just around the corner with a new gen of GPUs from both NVIDIA and ATI (IIRC). id Software knew this and had planned to take advantage of the extra VRAM while still leaving options for older cards.
Posted on Reply
#52
bug
lexluthermiesterI do remember that. But then again, 1GB cards were just around the corner with a new gen of GPUs from both NVIDIA and ATI (IIRC). id Software knew this and had planned to take advantage of the extra VRAM while still leaving options for older cards.
And what happened when people got hold of a card capable of running that? They found out it looked exactly like the "very high" settings.
Posted on Reply
#53
wolf
Better Than Native
KeemzayRip 10gb 3080 owners
Oh no! A game I was never going to play, at a res I was never going to play at.

RIP me, press F.
Posted on Reply
#54
AsRock
TPU addict
nguyenoh wow, somehow talking about a game is offensive to you, sorry about that.

Would be pretty funny if an indie developer got paid by AMD to tune a UE4 game to the point of trashing Nvidia GPUs - HairWorks in reverse?
To a point I agree with you, but there are those who enjoy this stuff, and to say no one does is a very bold statement.

Gotta watch out, people get bent out of shape over pretty much nothing these days - "offended", WTF? Any chance I missed something and pissed this guy off elsewhere?

Really, shit happens. I got bored of the video tbh, not my kind of game by the looks of it.
Posted on Reply
#55
Prima.Vera
I'm more interested in the gameplay and story than the graphics, to be honest...

Btw, GPU releases nowadays are starting to look more and more like what 3dfx was doing back in the day with their Voodoo 4 cards... Just saying...
Posted on Reply
#56
ratirt
kayjay010101You need to calm down, dude. They never called you out for anything and only stated facts. No need to get offended. Ashes and Strange Brigade are games that literally nobody plays.
Literally nobody? I play it, so you are wrong. How is a game only a benchmark? Your "nobody" is taken out of thin air.
If I said nobody buys the 3090, you would say there are people buying it. Likewise, there are people playing Strange Brigade, and I'm sure there are more of them than people willing to buy a 3090.
At least you can play Strange Brigade, because getting a 3090 is more of a fantasy.
Posted on Reply
#57
W1zzard
bugwill take a look at it when he has some time.
Not sure. It's yet another Unreal Engine title, an EGS exclusive, and I'm not sure it's a very good game.
Posted on Reply
#58
kayjay010101
ratirtLiterally nobody? I play it, so you are wrong. How is a game only a benchmark? Your "nobody" is taken out of thin air.
If I said nobody buys the 3090, you would say there are people buying it. Likewise, there are people playing Strange Brigade, and I'm sure there are more of them than people willing to buy a 3090.
At least you can play Strange Brigade, because getting a 3090 is more of a fantasy.
Disregarding the free giveaway that happened in October, the weekly average players of SB in the last year was (drumroll please) 350 players. I'm sorry but that's pathetic lol.
The game is a benchmark because reviewers often use it to benchmark Vulkan scaling performance. But as the charts show, very few actually play it. Like, the all-time peak is under 2,000 people, and that was at launch, when the peak would be at an all-time high. (Even Halo Wars, which came out a decade ago on the Xbox 360, for a series whose primary audience is Xbox, not PC, has had more peak players!) The average has never been above 300 people... The game isn't popular, end of story. You're one of the few actually playing it. There's just no arguing facts. I'm not saying the game is bad, I'm saying fewer than 500 people actually play it on a daily basis, making the game played by "nobody".
I haven't said anything about the 3090 so your "gotcha" falls a bit flat. I don't see how it's relevant.

Anyway, this discussion is off topic. My point overall is this'll end up being another one of those benchmarking titles that nobody actually plays, like Strange Brigade and Ashes.
Posted on Reply
#59
ratirt
kayjay010101Disregarding the free giveaway that happened in October, the weekly average players of SB in the last year was (drumroll please) 350 players. I'm sorry but that's pathetic lol.
The game is a benchmark because reviewers often use it to benchmark Vulkan scaling performance. But as the charts show, very few actually play it. Like, the all-time peak is under 2,000 people, and that was at launch, when the peak would be at an all-time high. (Even Halo Wars, which came out a decade ago on the Xbox 360, for a series whose primary audience is Xbox, not PC, has had more peak players!) The average has never been above 300 people... The game isn't popular, end of story. You're one of the few actually playing it. There's just no arguing facts. I'm not saying the game is bad, I'm saying fewer than 500 people actually play it on a daily basis, making the game played by "nobody".
I haven't said anything about the 3090 so your "gotcha" falls a bit flat. I don't see how it's relevant.

Anyway, this discussion is off topic. My point overall is this'll end up being another one of those benchmarking titles that nobody actually plays, like Strange Brigade and Ashes.
So "nobody" makes no sense if there are people playing it.
The 3090 was my example, not yours, and you still wouldn't say nobody has a 3090, while in fact there are people who have the card even though the number is small. That's my point.
I'm serious. Try playing 3DMark :) Showcasing the capabilities of new tech in a game versus in a benchmark is a totally different thing. Also, what about real-world performance? Isn't that better shown in a real game you can play instead of a benchmark?

Back on topic: Godfall adds a lot of new tech that can be showcased. I think it should be added to the testing suite, if not as a general RT performance measure then at least as a good comparison. It is a new engine and it tests all the new tech we may see in future titles.
Posted on Reply
#60
Chomiq
W1zzardNot sure. It's yet another Unreal Engine title, an EGS exclusive, and I'm not sure it's a very good game.
Plus it's from Randy "Pendrive" Pitchford; that guy will say anything if the end result is a big fat check.
Posted on Reply
#61
Vayra86
bugGames can ask for whatever they want, what is unfathomable is games becoming unplayable on a 3080 in the next 3-4 years.
PC games are meant to be playable on mid-range cards and since current midrange costs ~$350, games won't even target that.

I mean, anecdotal evidence aside, we have a lot of game reviews here on TPU that show 10GB is plenty. And that many games allocate VRAM, but will run with no performance penalty on cards that have less than that.
That's looking back. Not a good idea when you're a few weeks short of a new console gen. Nobody says 'unplayable' here; what I'm saying is 'having to consider cutting back on detail', which is a strange idea with a 700-dollar card. Yes, in the first 3-4 years, maybe even five. Remember... the new midrange carries and has carried 6-8 GB for a while now. Since 2016. Moving to 10 is a pretty weak step forward for a high-end product.

Now that AMD is back to playing the high end game you can rest assured they will be pushing for higher VRAM usage and accompanying extra detail. On PC, there is now a performance/quality potential gap between AMD and Nvidia cards. If you objectively look back in time, Nvidia has made use of those gaps every single time, and not to the benefit of AMD performance either.

Remember, this was about 'the best' GPU choice this gen. I think it's clear by now a 10GB 3080 isn't it.

As for the game itself, that is really beside the point; I reckon it's a pretty weak title. But it is one of the console launch titles too, so the writing is on the wall.

About the performance penalty... frame time variance does go up when you're fully allocated. It can kill smoothness.
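For reference, here is a minimal, tool-agnostic sketch of how that kind of variance is usually summarized from captured frame times; the function and field names are illustrative, not taken from any particular benchmark suite.

// Generic illustration: given captured frame times (ms), report the average and
// the 99th-percentile frame time. A growing gap between the two is the frame-time
// variance that shows up as stutter when VRAM is over-committed.
#include <algorithm>
#include <cstdio>
#include <vector>

void ReportFrameTimeStats(std::vector<double> frameTimesMs)
{
    if (frameTimesMs.empty())
        return;

    double sum = 0.0;
    for (double t : frameTimesMs) sum += t;
    const double average = sum / frameTimesMs.size();

    std::sort(frameTimesMs.begin(), frameTimesMs.end());
    const size_t idx = static_cast<size_t>(0.99 * (frameTimesMs.size() - 1));
    const double p99 = frameTimesMs[idx];

    std::printf("avg: %.2f ms, 99th percentile: %.2f ms\n", average, p99);
}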
Posted on Reply
#62
R-T-B
Vayra86The first proof it's not is now in. And the consoles haven't even released properly.
It's not proof until the game is in hand.
Posted on Reply
#63
Athlonite
Now wouldn't it be just the shit if the game refused to launch when it detected less than the required 12GB of VRAM.
Posted on Reply
#64
evernessince
bugI'm pretty sure 3080 will run this just fine. AMD probably mandated the dev throws in some humongous textures because... well, we all know why.
Given that Hardware Unboxed is also reporting over 8GB of VRAM usage in Watch Dogs: Legion, I think it has less to do with your assumption of AMD sabotage and more to do with the fact that it was only a matter of time before game VRAM requirements increased.

Nvidia only has itself to blame for not only failing to increase VRAM on its cards but actually decreasing it over a three-generation period. My 1080 Ti is two generations old and has 11GB of VRAM; the 3080 has 10GB. That's pathetic. Corners were cut with Ampere.
Posted on Reply
#65
BoboOOZ
AthloniteNow wouldn't it be just the shit if the game refused to launch when it detected less than the required 12GB of VRAM.
Probably not; it's just that some cached data will not fit in VRAM, so it will have to be fetched over and over again from system RAM, which will cause performance to decrease.
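As a rough, generic illustration of that mechanism (nothing here is from Godfall's code), a DXGI application can compare its current VRAM usage against the budget the OS grants it; anything over that budget gets demoted to system memory and paged back over PCIe, which is where the performance hit comes from. The helper function name is hypothetical; the DXGI types and call are from the public headers.

// Generic sketch: compare an application's VRAM usage against the OS-granted budget.
// Usage above the budget spills into system memory and must be paged back over PCIe.
#include <dxgi1_4.h>
#include <cstdio>

void ReportVramPressure(IDXGIAdapter3* adapter)
{
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info)))
        return;

    std::printf("VRAM budget: %llu MiB, in use: %llu MiB\n",
                (unsigned long long)(info.Budget >> 20),
                (unsigned long long)(info.CurrentUsage >> 20));

    if (info.CurrentUsage > info.Budget)
        std::printf("Over budget: resources will be demoted to system RAM\n");
}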
Posted on Reply
#66
Vayra86
R-T-BIt's not proof until the game is in hand.
Agreed. Time will tell.
Posted on Reply
#67
r9
My dad can beat your dad ...
Posted on Reply
#68
Chrispy_
evernessinceGiven that Hardware Unboxed is also reporting over 8GB of VRAM usage in Watch Dogs: Legion, I think it has less to do with your assumption of AMD sabotage and more to do with the fact that it was only a matter of time before game VRAM requirements increased.

Nvidia only has itself to blame for not only failing to increase VRAM on its cards but actually decreasing it over a three-generation period. My 1080 Ti is two generations old and has 11GB of VRAM; the 3080 has 10GB. That's pathetic. Corners were cut with Ampere.
I've been buying a crapload of RX 570s to refurb old machines at work. 8GB cards are €125 ultra-budget solutions right now. The prospect of a 'flagship' not having significantly more RAM than that is hilarious.
Posted on Reply
#69
mechtech
Chrispy_I've been buying a crapload of RX 570s to refurb old machines at work. 8GB cards are €125 ultra-budget solutions right now. The prospect of a 'flagship' not having significantly more RAM than that is hilarious.
Where are you buying these RX 570s? Almost vapourware here in Canada... but I have not looked at the second-hand market.
Posted on Reply
#72
Solid State Soul ( SSS )
Lmao, I thought as much. When a $500 console ($400 digital edition) has more VRAM than a $700 graphics card, you know something is up. This is yet another case of games being developed for consoles first, then optimized for PC later. Godfall is the first true next-gen game on PC and it already demands more VRAM than an RTX 3080 has, lol. I wonder what the elitist legions on the "is 10GB VRAM enough for next gen" thread have to say, because they tried their damnedest to debunk that, saying even 8GB is enough for next gen lol.

I don't recommend AMD GPUs often, but really, save your money for an RX 6000 series card. At least those options start with 16GB of VRAM, something that will actually max out detail in true next-gen games, not cross-gen bullcrap.
Posted on Reply
#73
bug
Solid State Soul ( SSS )Lmao, I thought as much. When a $500 console ($400 digital edition) has more VRAM than a $700 graphics card, you know something is up. This is yet another case of games being developed for consoles first, then optimized for PC later. Godfall is the first true next-gen game on PC and it already demands more VRAM than an RTX 3080 has, lol. I wonder what the elitist legions on the "is 10GB VRAM enough for next gen" thread have to say, because they tried their damnedest to debunk that, saying even 8GB is enough for next gen lol.

I don't recommend AMD GPUs often, but really, save your money for an RX 6000 series card. At least those options start with 16GB of VRAM, something that will actually max out detail in true next-gen games, not cross-gen bullcrap.
Yeah, about that:
- Consoles do not have more memory: they're equipped with 16GB RAM, about 3GB of that is reserved for the OS and the rest is to be divided between code and VRAM. Unless you think games will be coded to run within 1GB or smth.
- Godfall is so "true next gen" it's running on Unreal Engine (see www.techpowerup.com/forums/threads/godfall-system-requirements-list-12-gb-vram-for-4k-and-ultra-hd-textures.274218/post-4385342 )
Posted on Reply
#74
Chrispy_
bugYeah, about that:
- Consoles do not have more memory: they're equipped with 16GB RAM, about 3GB of that is reserved for the OS and the rest is to be divided between code and VRAM. Unless you think games will be coded to run within 1GB or smth.
- Godfall is so "true next gen" it's running on Unreal Engine (see www.techpowerup.com/forums/threads/godfall-system-requirements-list-12-gb-vram-for-4k-and-ultra-hd-textures.274218/post-4385342 )
I think his point wasn't that consoles are going to use all 16GB RAM for games, but that they can get you 16GB RAM for $400 whilst Nvidia are only giving you 10GB RAM for $700. The GPU in the consoles may not be as good as the GPU in a 3080 but it's still a large, expensive piece of silicon since it also includes an 8C/16T CPU and all IO stuff. Yields and manufacturing cost are going to be roughly comparable to a GA102 die.

As for RAM usage, current PC games typically use 4-7GB - I've just come here from GN with a video from Steve investigating exactly that as part of their 2 vs 4 sticks findings - and a lot of that is used as swap space for GPU VRAM. DirectStorage means that the consoles will probably permit developers to allocate up to 12GB RAM to the GPU, and if not immediately, that will eventually become the baseline minimum for developers to target when the inevitable mid-cycle console refreshes come out with increased specs and performance.
Posted on Reply
#75
bug
Chrispy_I think his point wasn't that consoles are going to use all 16GB RAM for games, but that they can get you 16GB RAM for $400 whilst Nvidia are only giving you 10GB RAM for $700. The GPU in the consoles may not be as good as the GPU in a 3080 but it's still a large, expensive piece of silicon since it also includes an 8C/16T CPU and all IO stuff. Yields and manufacturing cost are going to be roughly comparable to a GA102 die.
That's apples to oranges and you know it. Consoles are sold at a loss, the money's in the services.
Chrispy_As for RAM usage, current PC games typically use 4-7GB - I've just come here from GN with a video from Steve investigating exactly that as part of their 2 vs 4 sticks findings - and a lot of that is used as swap space for GPU VRAM. DirectStorage means that the consoles will probably permit developers to allocate up to 12GB RAM to the GPU, and if not immediately, that will eventually become the baseline minimum for developers to target when the inevitable mid-cycle console refreshes come out with increased specs and performance.
So sure of the future are you...
You don't feel comfortable with 10GB, don't get a 3080. I see no reason to parrot betrayal and whatnot about it.
Posted on Reply