
Are game requirements and VRAM usage a joke today?

Regarding VRAM usage in Avatar, both the RTX 4060 Ti 8GB and the RTX 4060 Ti 16GB performed almost identically; the 8GB version was even slightly faster, probably due to memory chip configuration. It's a matter of allocation rather than actual usage.
 
Anywhere to read more about this? Can't find anything on Google or in the patch notes.



Perhaps because no hardware can render scenes with 24GB of assets in real time?

Poor implementations exist (looking at you, Cities: Skylines 2 -_- ) and always will. But I'd be cautious comparing two different games when the metric used is (subjective) aesthetic quality. The same goes for concluding that high memory utilisation is a bad thing. Optimisation prioritises performance, not memory consumption. Unused memory is useless memory 'n all...

It's possible; a friend and I did it in FF7 Remake by changing the engine settings. Not the full 24 GB, since the OS itself uses some VRAM, but circa 20 GB.

Regarding VRAM usage in Avatar, both the RTX 4060 Ti 8GB and the RTX 4060 Ti 16GB performed almost identically; the 8GB version was even slightly faster, probably due to memory chip configuration. It's a matter of allocation rather than actual usage.

Were they the same LOD quality, though? This needs examining manually; automated tools won't pick this sort of thing up.
 
Were they the same LOD quality, though? This needs examining manually; automated tools won't pick this sort of thing up.
[Chart: performance at 2560x1440]


By memory chip configuration, I'm referring to a 32-bit bus being shared by two memory chips; one 32-bit bus per memory chip is the usual setup these days.
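To put numbers on that (assuming the published 4060 Ti specs: a 128-bit bus, 18 Gbps GDDR6 and 2 GB chips), here's a rough sketch of why the two capacities end up with identical bandwidth:

```cpp
#include <cstdio>

int main() {
    // Published RTX 4060 Ti memory specs (shared by the 8 GB and 16 GB variants)
    const int    bus_width_bits = 128;   // 4 channels x 32-bit
    const double gbps_per_pin   = 18.0;  // GDDR6 data rate
    const int    chip_bus_bits  = 32;    // each GDDR6 chip sits on a 32-bit channel
    const int    chip_capacity  = 2;     // GB per 16 Gbit chip

    const int    channels  = bus_width_bits / chip_bus_bits;      // 4
    const double bandwidth = bus_width_bits * gbps_per_pin / 8.0; // 288 GB/s

    printf("Memory channels: %d, bandwidth: %.0f GB/s (same on both cards)\n",
           channels, bandwidth);
    // 8 GB card: one 2 GB chip per channel. 16 GB card: two 2 GB chips share
    // each channel (clamshell). Capacity doubles, bus width and speed do not.
    printf("8 GB variant:  %d x %d GB chips\n", channels, chip_capacity);
    printf("16 GB variant: %d x %d GB chips\n", channels * 2, chip_capacity);
    return 0;
}
```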
 
Were they the same LOD quality, though? This needs examining manually; automated tools won't pick this sort of thing up.
I ran a benchmark on my 7800 XT at 3440x1440 with various FSR options, and on my 2070 at 4K with DLSS ultra performance. The LOD looked exactly the same to my eyes.
 
[Chart: performance at 2560x1440]


By memory chip configuration, I'm referring to a 32-bit bus being shared by two memory chips; one 32-bit bus per memory chip is the usual setup these days.
The graph indicates a game heavily optimized for nVidia. The amount of vRAM does not matter when the 4070 Ti 12GB beats the 7900 XT 20GB. Moreover, with the exception of the 3090/Ti, no video card from the previous generation can provide a decent framerate at these settings, so you have two options:
1. Reduce details -> less vRAM used but a higher framerate.
2. Play in HU mode, with a 30-50 fps average and 10-20 fps minimums.

Option two obliges you to several sessions with a psychologist in order to avoid the psychiatrist later on.
Look at the 6700 XT. How does 12GB help it?
 
The graph indicates a game heavily optimized for nVidia. The amount of vRAM does not matter when the 4070 Ti 12GB beats the 7900 XT 20GB. […]
I remember someone commenting in another thread that AMD sponsored titles use excessive amounts of VRAM for bad-looking textures just to make Nvidia look bad. Ironic, isn't it? ;)
 
I remember someone commenting in another thread that AMD sponsored titles use excessive amounts of VRAM for bad-looking textures just to make Nvidia look bad. Ironic, isn't it? ;)
You remember something, but not correctly. Probably because of age.
Games that use excessive vRAM aren't any better in quality than others. If I buy 50,000 tons of ink, I'm no better than Leonardo. I'm really sure of that.
 
The graph indicates a game heavily optimized for nVidia. The amount of vRAM does not matter when the 4070 Ti 12GB beats the 7900 XT 20GB. […]
It's apparently an AMD-sponsored game, but that doesn't show in the graphs. I would bet it was sponsored for the use of AMD's FSR 3.
The point I was making is that people focus way too much on memory size and should focus more on raster performance. The 4060 Ti 8/16 GB comparison proved that; it also proves the game doesn't really use the 10+ GB of VRAM it allocates.
 
You remember something, but not correctly. Probably because of age.
Games that use excessive vRAM aren't any better in quality than others. If I buy 50,000 tons of ink, I'm no better than Leonardo. I'm really sure of that.
I think you misunderstood me. What I meant is, here's an AMD-sponsored game that doesn't need a ton of VRAM to look good. In fact, I tested it with my HTPC that has 16 GB of RAM and a 2070 (which is probably dying, but that's beside the point), and I couldn't see any stuttering or LOD quality changes compared to my main PC (in my profile).
 
Let's understand each other for once. The tons of vRAM used by some titles (with no visible visual impact; they are no lookers compared to other games) are used by AMD and its satellites as marketing. AMD advertises "as much vRAM as possible = future proof", HU fails miserably with their demonstrations, and the real world looks completely different.
Why? Because they have nothing else to offer. nVidia dominates them in every chapter, from content creation support to energy efficiency. All the "demonstrations" meant to highlight this vRAM surplus produced embarrassing framerates, acceptable for iGPUs and low-end video cards. I can't believe anyone is willing to pay hundreds of dollars (over 400 these days) to play below 50 fps. If someone is, they have a big problem.
I could write novels endlessly from the material collected in reviews, where it is unequivocally demonstrated that the GPU dies before vRAM becomes a problem. The most eloquent demonstration is still the comparison between the 4060 Ti 8GB and 16GB.

I am posting the 6700 XT 12GB and 3070 8GB data once again, collected from two reviews, one from 2021 and the other from 2024. After 2.5 years, the 3070 is still more powerful than the 6700 XT (not by much, but it is), and both video cards have lost ~35% of relative performance because new games demand more from the graphics processor. If that weren't the case, the 3070 would by now be surpassed by the 6700 XT.
[Chart: 6700 XT vs 3070 relative performance, 2021 review vs 2024 review]


Future proof? In this domain?!
In June 2019, the Ryzen 7 3700X launched, a powerful processor (at the time) at an attractive price. It is now surpassed by the E-cores of the 14700K.
The biggest stupidity I've read on forums was AMD working miracles through drivers, given time. Give the money now, I'll deliver the goods... sometime. Now it has evolved to vRAM: waiting for miracles while every new game release hits the graphics processor in the solar plexus.

[Chart: Ryzen 7 3700X vs 14700K E-cores]
 
I am impressed, my 3070Ti is hangin out with the big boi 7800XT.. noice :)
 
The gaming industry is probably less kind to incompetent devs than to gaming GPUs with 8GB of VRAM, LOL.


Making crappy-looking games with bloated hardware requirements? Yeah...
 
Let's understand each other for once. The tons of vRAM used by some titles are used by AMD and its satellites as marketing. […]
You're not entirely wrong, but
  1. Just because AMD uses badly optimised games in their marketing doesn't mean you have to buy into it. You don't have to run a game at max settings to be able to enjoy it anyway.
  2. AMD does have something else to offer: price. Every tier of AMD GPU costs about 50-100 quid less than the competing Nvidia card.
  3. Most people don't care about content creation or energy efficiency.
  4. Show me any single sub-$400 GPU that delivers above 50 FPS in the latest AAA games at max settings, regardless of how much VRAM it has.
 
Let's understand each other for once. The tons of vRAM used by some titles are used by AMD and its satellites as marketing. […]

Absolutely true, everything you say. This is what people who ate AMD's VRAM marketing don't know. Around the time this marketing came out, a few AMD-sponsored games were rushed out that ate tons of VRAM, more than they should have. I bet they did this to support the claim. Funnily enough, those games were fixed over time and VRAM usage dropped (image quality even went up slightly, and the massive FSR shimmering and artifact issues got less severe too).

AMD did this several times before. Look up the Shadow of Mordor texture pack, an AMD-sponsored game. The texture pack needed 6GB of VRAM but the textures looked identical, yet it forced many Nvidia GPUs (and many of AMD's own) to run out of VRAM, with no actual improvement in textures. -> https://www.pcgamer.com/spot-the-di...rdor-ultra-hd-textures-barely-change-a-thing/

Allocation does not mean the required amount. It's that simple. How allocation works depends on the game engine.
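To illustrate the difference (just a minimal Windows sketch of my own, not anything from a game engine): DXGI exposes both the VRAM budget the OS grants a process and the amount the process has actually allocated, and even that second number counts everything resident, not what a single frame touches.

```cpp
// Minimal sketch (Windows, MSVC): read the per-process VRAM budget and the
// amount this process has currently allocated, via DXGI 1.4.
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // first GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info{};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // Budget: how much VRAM the OS currently lets this process allocate.
    // CurrentUsage: how much this process has allocated right now. Neither
    // value says how much a single frame actually reads.
    printf("Budget:       %.2f GB\n", info.Budget       / 1073741824.0);
    printf("CurrentUsage: %.2f GB\n", info.CurrentUsage / 1073741824.0);
    return 0;
}
```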

I can only laugh when I see people cherry-picking numbers to force cards like the 3070 into having VRAM issues by running today's most demanding games at 4K/UHD, sometimes with RT on top, just to run the VRAM dry. Not a real-world scenario at all. The GPU itself would not be able to run those settings even if it had 16GB of VRAM. GPU power is the problem, not VRAM.

In most cases, the GPU is the limiting factor, not the VRAM at all. Besides, 95% of PC gamers are using 1440p or less (Steam HW Survey). Native 4K/UHD fps numbers mean little for the majority of PC gamers, and 96% of people in the Steam HW Survey have 12GB of VRAM or less. Developers make games to earn money, and you don't sell many games if you make them for a 4-5% market share.

Besides, most games today look almost identical on the high and ultra presets, while high uses a lot less VRAM. Motion blur and DoF are often pushed into the ultra presets and use more memory while looking worse to the end user. Even the medium preset looks great in most new games, and sometimes even "cleaner" than high and especially ultra, which is filled with blur, DoF and all kinds of features that tank performance without necessarily making the visuals "better".

Next-gen games look great regardless of preset. Take Alan Wake 2, for example; it looks good on low as well.

Avatar is one of the best-looking games today and it doesn't need a lot of VRAM ->


"Our VRAM testing would suggest that Avatar is a VRAM hog, but that's not exactly true. While we measured over 15 GB at 4K "Ultra" and even 1080p "Low" is a hard hitter with 11 GB, you have to consider that these numbers are allocations, not "usage in each frame." The Snowdrop engine is optimized to use as much VRAM as possible and only evict assets from VRAM once that is getting full. That's why we're seeing these numbers during testing with the 24 GB RTX 4090. It makes a lot of sense, because unused VRAM doesn't do anything for you, so it's better to keep stuff on the GPU, once it's loaded. Our performance results show that there is no significant performance difference between RTX 4060 Ti 8 GB and 16 GB, which means that 8 GB of VRAM is perfectly fine, even at 4K. I've tested several cards with 8 GB and there is no stuttering or similar, just some objects coming in from a distance will have a little bit more texture pop-in, which is an acceptable compromise in my opinion."


People ramble about VRAM for longevity all the time, partly because AMD wants people to believe this is true. In reality, though, upscaling like DLSS/FSR will matter more for longevity, and DLSS clearly beats FSR.

AMD owners said the 6700 XT would age much better than the 3070 because of its 12GB of VRAM, but even today the 3070 is faster in pretty much all games, including at 4K/UHD. See the Avatar link again -> https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html

And DLSS beats FSR in this game as well. TPU tested that too -> https://www.techpowerup.com/review/avatar-frontiers-of-pandora-dlss-2-vs-fsr-3-comparison/

So yeah, longevity doesn't come down to VRAM alone. In most cases, the GPU itself will be the limiting factor, and you won't be able to max out games on a dated, slow GPU anyway, forcing you to lower settings.
 
So yeah, longevity doesn't come down to VRAM alone. In most cases, the GPU itself will be the limiting factor, and you won't be able to max out games on a dated, slow GPU anyway, forcing you to lower settings.
I've also noticed that when people do Nvidia vs AMD side by side, Nvidia's memory compression shows up as lower VRAM usage most of the time, so 12GB on a 4070 is probably effectively more like 14GB thanks to the card's algorithm optimizing what is stored. Given the current level of detail, it will be hard to make games look better without a massive increase in GPU/CPU power and memory, IMO. This game uses 11GB on the lowest settings at 1080p and only 15GB with max details at 4K. A lot of people will only see the usage number and, without giving it much thought, draw the wrong conclusion that they need a minimum of 16GB and that anything less is unacceptable. Using all of your RAM is not a bad thing, but depending on who is looking at it, it can be made to look like one; you can see how some people decide what they need based on one outlier game whose engine uses everything it can. Windows does the same thing: it fills spare RAM with cache, so most of it is in use most of the time.
 
People ramble about VRAM for longevity all the time, partly because AMD wants people to believe this is true. In reality, though, upscaling like DLSS/FSR will matter more for longevity, and DLSS clearly beats FSR. […]
Whether you ramble on about "VRAM = longevity", or "DLSS = the saving grace of modern gaming", you're equally wrong, in my opinion. People believing marketing crap isn't new in any way. More VRAM is good to have, but won't make a weaker card magically faster, and DLSS is just a compromise between quality and performance, a means to an end, not the end itself. Even RT is just a preview feature at this point.
 
I've also noticed that when people do Nvidia vs AMD side by side, Nvidia's memory compression shows up as lower VRAM usage most of the time, so 12GB on a 4070 is probably effectively more like 14GB thanks to the card's algorithm optimizing what is stored. […]
Yeah, Nvidia has better memory compression for sure.

Also, Nvidia actually invented Neural Texture Compression, which will lower VRAM requirements while improving texture quality. Let's see if it comes to games. After all, Nvidia is great at pushing new features into AAA games.
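For a sense of scale, some rough footprint math; the BC7 figure (1 byte per texel) is standard block compression, while the neural-compression ratio in the sketch is only a placeholder assumption, since the real gains depend on the texture set:

```cpp
// Back-of-the-envelope texture footprint math. BC7 is the real number
// (1 byte per texel); the "neural" ratio is a placeholder assumption.
#include <cstdio>

int main() {
    const double texels     = 4096.0 * 4096.0;   // one 4K texture
    const double mip_factor = 4.0 / 3.0;         // full mip chain adds ~33%
    const double MiB        = 1024.0 * 1024.0;

    const double rgba8 = texels * 4.0 * mip_factor / MiB;  // 4 bytes/texel, ~85 MiB
    const double bc7   = texels * 1.0 * mip_factor / MiB;  // 1 byte/texel,  ~21 MiB
    const double ntc   = bc7 / 4.0;                        // assumed extra 4x (placeholder)

    printf("RGBA8 uncompressed:            %.0f MiB\n", rgba8);
    printf("BC7 block compressed:          %.0f MiB\n", bc7);
    printf("Hypothetical neural-compressed: ~%.0f MiB\n", ntc);
    return 0;
}
```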


Stuff like this is worth way more than just adding more VRAM.
Besides, AMD uses GDDR6, which is cheaper and slower than GDDR6X.
 
Yeah, Nvidia has better memory compression for sure.

Also, Nvidia actually invented Neural Texture Compression, which will lower VRAM requirements while improving texture quality. Let's see if it comes to games. After all, Nvidia is great at pushing new features into AAA games.
Let's see if it comes to games without the need to have an Nvidia card in your system. After all, Nvidia is great at pushing its own propaganda to lock users of other brands out.

Besides, AMD uses GDDR6, which is cheaper and slower than GDDR6X.
And that's a problem because...?
 
AMD plays with "more VRAM" today.
Nvidia played with "tessellation" before and plays with RT today.

In the end, we need the right card to play the games we like. Yes, we can complain as much as we want, but that will not change their "dirty" games.

If you have a card with more VRAM, you have something you can always use.
If you have a low-end card with weak RT, you can just turn it off, today and in the future, or maybe use it in combination with DLSS.

But yeah, we know it: AMD is crap, just buy Nvidia...
 
If you have a card with more VRAM, you have something you can always use. […]

AMD charges 60 USD more for the extra 8GB on the 7600 XT, a card that is too weak for current 1440p gaming, let alone sometime in the future.
 
AMD charges 60 USD more for the extra 8GB on the 7600 XT, a card that is too weak for current 1440p gaming, let alone sometime in the future.
Of course, there are always stupid cards. Isn't Nvidia making similar cards, too?
 
Of course, there are always stupid cards. Isn't Nvidia making similar cards, too?

Nah, I just meant that VRAM is not free, and having lots of VRAM on a weak GPU is completely useless for games (it has a better use case in a workstation).
Imagine spending the extra 60 USD on a more powerful GPU instead; that would be more useful in games.
 
Nah, I just meant that VRAM is not free, and having lots of VRAM on a weak GPU is completely useless for games (it has a better use case in a workstation).
Imagine spending the extra 60 USD on a more powerful GPU instead; that would be more useful in games.
Nobody wants more VRAM on weak cards. People want more VRAM on powerful cards, like the 4070 (Ti). Why AMD is releasing a 16 GB version of the 7600 is beyond me.
 
I ran a benchmark on my 7800 XT at 3440x1440 with various FSR options, and on my 2070 at 4K with DLSS ultra performance. The LOD looked exactly the same to my eyes.

If the game ever goes on sale I will take a peek myself.

Nah, I just meant that VRAM is not free, and having lots of VRAM on a weak GPU is completely useless for games (it has a better use case in a workstation).
Imagine spending the extra 60 USD on a more powerful GPU instead; that would be more useful in games.

From AMD's perspective, it's probably a case of "why not", given how cheap VRAM is wholesale right now. Given how VRAM-hungry modern games are, I would be doing the same. The nice thing about high texture resolution is that it's practically free in performance terms: it needs VRAM but doesn't really need rendering power.
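Roughly, that's because texture storage grows with texture area while the texels a frame actually samples are bounded by screen pixels thanks to mipmapping. A quick illustrative sketch (assuming BC7 at 1 byte per texel and a 1440p target, both my own assumptions):

```cpp
// Rough sketch of why texture resolution mostly costs memory, not frame time:
// storage grows with texture area, but mipmapping means a frame samples about
// one texel per covered screen pixel regardless of source resolution.
#include <cstdio>

int main() {
    const double MiB = 1024.0 * 1024.0;
    const double mip = 4.0 / 3.0;                             // full mip chain

    const double tex2k = 2048.0 * 2048.0 * 1.0 * mip / MiB;   // ~5.3 MiB in VRAM
    const double tex4k = 4096.0 * 4096.0 * 1.0 * mip / MiB;   // ~21.3 MiB (4x storage)

    // Upper bound on texels sampled for this material in one frame at 1440p:
    // about one per covered screen pixel, independent of source resolution.
    const double screen_px = 2560.0 * 1440.0;

    printf("2K texture in VRAM: %.1f MiB, 4K texture: %.1f MiB (4x)\n", tex2k, tex4k);
    printf("Texels sampled per frame (upper bound): ~%.1f M, same for both\n",
           screen_px / 1e6);
    return 0;
}
```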
 