
Are game requirements and VRAM usage a joke today?

So does setting the textures to low to avoid a 4GB GPU running like garbage prove anything? With textures at maximum quality it falls below an iGPU. lol
It proves that the graphics processor is at its limit. It can't do more even with 4 TB of vRAM.
Loaded textures are just a barrel of ink if the "brushes" can't be used. They can't be used effectively because the GPU doesn't have enough computing power. Look at the TPU 2023 reviews: the 6700 XT offers a decent framerate at 1440p (>60 fps) in only two titles. In another it is borderline (>50 fps), and in the rest ... a disastrous average of 20-40 fps.
The point is that the graphics processor kills you before the vRAM does.

What are you talking about? I've never even owned an AMD card. All I'm saying is, the 8 GB became a limitation on my 3070 well before the GPU itself did when playing TLOU at launch. That's when I realized VRAM is important, but that doesn't mean it's the one and only spec you look at when picking a card, just an important one. No single spec should be taken in a vacuum.

And no, I don't use ray tracing, but only because I find it a waste of electricity. Frankly, I don't see how that's relevant here.
You are probably talking about The Last of Us. That game is another masterpiece of optimization disaster, one where even the RTX 3080 offers an execrable experience at 1440p with maximum settings.
I played it on High (nowadays, the difference between High and Ultra is just a fad) and enjoyed a healthy average of 80 FPS. For such poorly optimized games, you have to dig deep into your wallet for a very powerful video card. The 6700 XT has enough vRAM for Ultra ... if you consider an average of 48 fps acceptable. For me, it is not.

I didn't say you're an AMD supporter, I just asked what you pay attention to in games: completing tasks or birds in the sky?


.................................
4060 Ti 8GB vs 4060 Ti 16GB.
Today, new games, identical chips, the only difference being the amount of vRAM.

What do we learn from this, if we haven't learned anything from recent history (580 4/8GB and others)?

[Attachment: Clipboard01.jpg (benchmark chart)]
 
You are probably talking about The Last of Us. That game is another masterpiece of optimization disaster, one where even the RTX 3080 offers an execrable experience at 1440p with maximum settings.
I played it on High (nowadays, the difference between High and Ultra is just a fad) and enjoyed a healthy average of 80 FPS. For such poorly optimized games, you have to dig deep into your wallet for a very powerful video card. The 6700 XT has enough vRAM for Ultra ... if you consider an average of 48 fps acceptable. For me, it is not.

I didn't say you're an AMD supporter, I just asked what you pay attention to in games: completing tasks or birds in the sky?

Oh my, would you stop bringing up AMD GPUs? They have nothing to do with this. It's about the 3070 and its 8 GB VRAM limitation. You said the GPU limitation comes before the VRAM limitation on 8 GB cards; I provided an example where that wasn't the case. And it's far from the only title. And no, your 'charts' do not prove otherwise, for reasons I have already explained: crashing and missing textures.

Do I watch birds in the sky in my games? Not generally, if they're crashing. If they don't crash, I often find myself playing the game.
 
You are probably talking about The Last of Us. That game is another masterpiece of optimization disaster, one where even the RTX 3080 offers an execrable experience at 1440p with maximum settings.
I played it on High (nowadays, the difference between High and Ultra is just a fad) and enjoyed a healthy average of 80 FPS. For such poorly optimized games, you have to dig deep into your wallet for a very powerful video card. The 6700 XT has enough vRAM for Ultra ... if you consider an average of 48 fps acceptable. For me, it is not.

I didn't say you're an AMD supporter, I just asked what you pay attention to in games: completing tasks or birds in the sky?
Optimisation disaster or not, if you want to play THAT game, you do, if you don't, you don't. In the first case, how well it runs matters, in the second case it doesn't.

That's why I always suggest looking at specific games in GPU reviews instead of averages. You don't play an average game. You play the one you play. It's always specific.

Whether playing at Ultra is reasonable or not is a different matter altogether (in a lot of cases, it's not, unless your hardware can run it smoothly).
 
It proves that the graphics processor is at its limit. It can't do more even with 4 TB of vRAM.
Loaded textures are just a barrel of ink if the "brushes" can't be used. They can't be used effectively because the GPU doesn't have enough computing power. Look at the TPU 2023 reviews: the 6700 XT offers a decent framerate at 1440p (>60 fps) in only two titles. In another it is borderline (>50 fps), and in the rest ... a disastrous average of 20-40 fps.
The point is that the graphics processor kills you before the vRAM does.


You are probably talking about The Last of Us. That game is another masterpiece of optimization disaster, one where even the RTX 3080 offers an execrable experience at 1440p with maximum settings.
I played it on High (nowadays, the difference between High and Ultra is just a fad) and enjoyed a healthy average of 80 FPS. For such poorly optimized games, you have to dig deep into your wallet for a very powerful video card. The 6700 XT has enough vRAM for Ultra ... if you consider an average of 48 fps acceptable. For me, it is not.

I didn't say you're an AMD supporter, I just asked what you pay attention to in games: completing tasks or birds in the sky?

Having more VRAM allows you to use better textures regardless of the rest of the settings, and saturating VRAM not only drops performance but also causes all kinds of bugs and stutters. It's clear that putting rivers of VRAM on a weak GPU doesn't help; the limit will be the GPU itself.

However, it is also true that there are many examples on the market of powerful GPUs with questionable amounts of VRAM:

3070 8 GB
3070 Ti 8 GB
4070 Ti 12 GB

Don't you agree that such expensive GPUs should have more VRAM?
 

I see you added another chart to your post. I think there may be something you don't understand about VRAM, but please correct me if I'm wrong. More VRAM (as in quantity, not MHz) does not make a game faster. VRAM quantity only starts to matter when the buffer is no longer sufficient to hold the necessary assets. So no, a 16 GB 4060 Ti is not faster than an 8 GB 4060 Ti. However, you do not want to be stuck with an 8 GB 4060 Ti when you're playing a game that needs more than 8 GB of VRAM. In that case, you're likely to experience stuttering, crashing, missing textures, or a combination of all three. Those things do not show up in averages, at least not in a significant way, yet they significantly degrade the experience.

Like I already said: VRAM is important, but it's not a stat you look at to determine a card's relative speed.
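If it helps, here's a rough back-of-the-envelope in Python showing how fast textures alone can fill a buffer. The asset counts are invented purely for illustration; the BC7 rate of 8 bits per pixel and the ~1/3 mipmap overhead are standard ballpark figures.

# Rough VRAM estimate for a texture set -- a sketch with invented
# asset counts, showing why the buffer can fill up long before the
# GPU core runs out of compute.

def texture_bytes(width, height, bits_per_pixel, mipmaps=True):
    """Size of one texture; a full mip chain adds roughly 1/3 on top."""
    base = width * height * bits_per_pixel / 8
    return base * 4 / 3 if mipmaps else base

# BC7-compressed textures store 8 bits per pixel.
one_4k = texture_bytes(4096, 4096, 8)
print(f"One 4K BC7 texture: {one_4k / 2**20:.1f} MiB")   # ~21.3 MiB

# Hypothetical scene: 150 unique 4K materials, 3 maps each
# (albedo, normal, roughness) -- illustrative numbers only.
scene = 150 * 3 * one_4k
print(f"Scene textures alone: {scene / 2**30:.1f} GiB")  # ~9.4 GiB

Add render targets, geometry, and the rest of the working set on top of that, and an 8 GiB card is already juggling evictions, which is exactly where the stutters, crashes, and missing textures come from.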
 
Don't forget the classic example: the GTX 970 and its 3.5 GB. The problem is not only RAM capacity but also throughput. The GPU arguably had enough performance (in its time) for 4+ GB of fast VRAM, but Nvidia chose 3.5 GB fast + 0.5 GB slow.
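The arithmetic behind that, sketched in Python from the commonly reported specs (7 Gbps effective GDDR5 on a 256-bit bus, with the slow 0.5 GB segment served by a single 32-bit controller):

# GTX 970 segmented-memory bandwidth, back of the envelope.
gbps_per_pin = 7.0        # effective GDDR5 data rate per pin
headline = gbps_per_pin * 256 / 8          # all 8 controllers
fast     = gbps_per_pin * 7 * 32 / 8       # 3.5 GB segment (7 of 8)
slow     = gbps_per_pin * 1 * 32 / 8       # 0.5 GB segment (1 of 8)
print(f"Headline: {headline:.0f} GB/s")          # 224 GB/s
print(f"Fast 3.5 GB segment: {fast:.0f} GB/s")   # 196 GB/s
print(f"Slow 0.5 GB segment: {slow:.0f} GB/s")   # 28 GB/s

Once allocations spill into the slow segment, effective throughput collapses: the box said 4 GB, the behaviour said 3.5 GB.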
 

Having more VRAM allows you to use better textures regardless of the rest of the settings, and saturating VRAM not only drops performance but also causes all kinds of bugs and stutters. It's clear that putting rivers of VRAM on a weak GPU doesn't help; the limit will be the GPU itself.
You didn't choose good examples. There is a physical difference between the GPUs in the 1060 3 GB and 6 GB, and early reviews concluded that this difference shows up in games.
The valid examples are the ones I presented (4060 Ti and 580), because their graphics chips are identical.
I think the 4060 Ti 8 GB vs 16 GB reviews leave no room for discussion on this topic.

Optimisation disaster or not, if you want to play THAT game, you do, if you don't, you don't. In the first case, how well it runs matters, in the second case it doesn't.

That's why I always suggest looking at specific games in GPU reviews instead of averages. You don't play an average game. You play the one you play. It's always specific.

Whether playing at Ultra is reasonable or not is a different matter altogether (in a lot of cases, it's not, unless your hardware can run it smoothly).
As I said, I played this title on High. Since the differences between High and Ultra are tiny, almost undetectable to the eye, my complaints were about other aspects of the game. I don't know, maybe many want to force the video card to maximum details at the price of a pathetic framerate. Maybe that's their pleasure. I prefer to reduce the details and obtain a fluid framerate in GPU-intensive games, an operation that also relieves vRAM pressure.
If someone wants maximum settings for The Last of Us, keep in mind that even a 3080 has big problems at 1440p.
You have three options:
1. Buy the more expensive, top-of-the-line video card.
2. Reduce the details if option 1 does not interest you.
3. Play at maximum details and an execrable framerate if options 1 and 2 do not interest you.

I play Cyberpunk with the Ray Tracing Ultra preset because it provides an average of ~80 fps and a minimum of 56 fps. Decent. If it weren't decent, I'd have no qualms about dropping to Ultra without ray tracing, or keeping ray tracing and reducing other settings.

Oh my, would you stop bringing up AMD GPUs? They have nothing to do with this. It's about the 3070 and its 8 GB VRAM limitation. You said the GPU limitation comes before the VRAM limitation on 8 GB cards; I provided an example where that wasn't the case. And it's far from the only title. And no, your 'charts' do not prove otherwise, for reasons I have already explained: crashing and missing textures.

Do I watch birds in the sky in my games? Not generally, if they're crashing. If they don't crash, I often find myself playing the game.
In the absence of other arguments, surplus vRAM is almost the only focus of AMD's marketing. The other is price.
The 3070's limitation (I use a 3070 Ti) does not exist in this game. The main reason would be the shortage of ammunition: you cannot afford to waste it, and you are forced to play at a decent frame rate. I hope you have heard of Counter-Strike and top players' appetite for low settings. According to the game's review, only the top video cards get a decent framerate if you insist on maximum details. In this game, just going from Ultra to High is enough to double the fps on a 3070. The 6700 XT doesn't do wonders with its surplus of vRAM either. It's about the only game in which it beats the 3070, with 60 fps at 1080p and 48 fps at 1440p, and even then you're doing yourself a disservice with maximum details.
 
Games have become obsessed with looking as real as possible. Not everyone wants that, even though game makers and video card manufacturers think 100% of us do. Look at WoW, for example: it is seemingly still doing pretty well looking the way it does.

This makes zero sense. A few games aiming at ultra-realism is not representative of "games" and the second sentence very handily disproves the first all by itself.
 
This makes zero sense. A few games aiming at ultra-realism is not representative of "games" and the second sentence very handily disproves the first all by itself.

How does it? The number of players on WoW proves that players are not bothered if it looks dated. It is an old game. Most newer games have an obsession with looking as real as possible, whereas WoW does not, which makes it an exception.
 
You didn't choose good examples. There is a physical difference between the GPUs in the 1060 3 GB and 6 GB, and early reviews concluded that this difference shows up in games.
The valid examples are the ones I presented (4060 Ti and 580), because their graphics chips are identical.
I think the 4060 Ti 8 GB vs 16 GB reviews leave no room for discussion on this topic.
Yeah, 11% more shaders giving almost double the performance, that makes sense. Certainly the VRAM didn't have any impact. :p

RTX 4060 Ti 8GB vs 4060 Ti 16GB vs 4070: The Ultimate Comparison in the Latest Games!!! - YouTube

There is a video full of examples showing the VRAM limitation, and better still, on the very GPU you mentioned. Looking forward to seeing what excuses you'll come up with now.
 
The ability to reflect light and being shiny aren't the same thing. You see in the image we're talking about that the road surface has turned into a giant mirror. That's not how reflection works in the real world at all.
The world reacts to light... but it's not a mirror.
Water does have specular (mirror) reflections. A waterlogged road would turn into "a giant mirror."
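A quick sketch of Schlick's Fresnel approximation makes the point; the F0 value for water is a standard ballpark figure, not something pulled from any particular game.

# Schlick's approximation: fraction of light a smooth surface
# reflects specularly, as a function of viewing angle.
import math

def schlick(cos_theta, f0):
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

F0_WATER = 0.02   # ~2% reflectance looking straight down at water

for angle_deg in (0, 45, 75, 85, 89):
    cos_t = math.cos(math.radians(angle_deg))
    print(f"{angle_deg:2d} deg off normal: {schlick(cos_t, F0_WATER):.0%} reflected")

Head-on, a puddle reflects about 2% of the light; at the grazing angles a street is usually seen from, reflectance climbs past 60% toward 100%, i.e. mirror-like. When a wet road looks wrong in a screenshot, the more likely culprit is a roughness or wetness mask applied too uniformly, not the existence of mirror reflections.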

The principle that a new effect being costly on performance means it defeats an artistic purpose, or just defeats logical and immersive game worlds, is completely ridiculous. The idea that it stands out 'too much', and that this is somehow positive, is equally a world upside down. It's the same thing as poor art, or a B movie with the excuse 'yeah, the painter only had highlighter markers for this work'... 'Thanks for the effort, it looks like shit' is all I can say about it. Might as well not have bothered. Just copy the old cliché instead and make it work, thank you.
That's unfortunately how progress happens.
Even actions as simple as rotating an image were expensive once, and hardware implementations were made to work around that, which were themselves very limited in capability. But gamedevs still used them, improved on them, and over time the cost became insignificant; now I don't think you can find a game without some sort of affine transformation (see the sketch below). You'll find the same story applying to many of even the most basic brushes, to use AusWolf's analogy, that a modern gamedev has.
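That "once expensive, now trivial" operation, as a minimal Python sketch (the point generalises to the 4x4 matrix transforms GPUs now run billions of times per frame):

# Rotating a point about the origin with a 2D affine transform.
import math

def rotate(x, y, degrees):
    c = math.cos(math.radians(degrees))
    s = math.sin(math.radians(degrees))
    return (c * x - s * y, s * x + c * y)

print(rotate(1.0, 0.0, 90))   # -> (~0.0, 1.0)

Early 2D hardware special-cased exactly this kind of sprite rotation; today it is a couple of fused multiply-adds in a vertex shader.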

As for the highlighter painter, I'm pretty sure everyone but hipsters agrees that postmodern art is shit...
 
Water does have specular (mirror) reflections. A waterlogged road would turn into "a giant mirror."
Except that the street in that image isn't waterlogged. Or if it is, then how? It's on a slope, so water should flow straight down. Also, why is everything else (including the sidewalk) bone dry?
 
Except that the street in that image isn't waterlogged. Or if it is, then how? It's on a slope, so water should flow straight down. Also, why is everything else (including the sidewalk) bone dry?
Dude! RT is complicated enough as it is without adding a hydraulic/drainage simulator to the feature list. Although I wouldn't mind a new industry to spam with resumes.

There are views that show the sidewalks, roofs, and other paved areas wet, plus some (pathetic?) rain effects.
 
There is definitely a need for real-world observation of pavements after rain: the manner in which, and for how long, water drains, soaks in, or dries on different parts of the roadway.
 
I think the 4060 Ti 8 GB vs 16 GB reviews leave no room for discussion on this topic.

The answer is hardly that simple. This product line was designed around more constraints than VRAM alone; you are looking past the whole being worth less than the sum of its parts.

Any examination beyond software controls should focus on power and efficiency, which were likely among the many determining factors in installing only 8 GB of VRAM.

These are budget cards that should form a tier below the main RTX 4xxx line, instead of having Super/Ti/etc. variants based on them.
 
Dude! RT is complicated enough as it is without adding a hydraulic/drainage simulator to the feature list. Although I wouldn't mind a new industry to spam with resumes.
I don't accept that as a reason why a few drops of rain should make roads look like they're caked in ice.

There are views that show the sidewalks, roofs, and other paved areas wet, plus some (pathetic?) rain effects.
I think the whole "water = mirror" concept looks pathetic, especially when there's no splashing, drainage, or any other effect that would show that water is actually a transparent liquid.

Like I said, I'm not against RT at all, but this kind of implementation looks like crap. Sorry, it's just my opinion.
 
In the absence of other arguments, surplus vRAM is almost the only focus of AMD's marketing. The other is price.
The 3070's limitation (I use a 3070 Ti) does not exist in this game. The main reason would be the shortage of ammunition: you cannot afford to waste it, and you are forced to play at a decent frame rate. I hope you have heard of Counter-Strike and top players' appetite for low settings. According to the game's review, only the top video cards get a decent framerate if you insist on maximum details. In this game, just going from Ultra to High is enough to double the fps on a 3070. The 6700 XT doesn't do wonders with its surplus of vRAM either. It's about the only game in which it beats the 3070, with 60 fps at 1080p and 48 fps at 1440p, and even then you're doing yourself a disservice with maximum details.

What? Have you forgotten already that I said this was at launch, when the game was very different and I wasn't playing at Ultra? I was just trying to find a good balance between visual quality and NOT CRASHING. At 1440p native with medium detail, the game would still crash, though less frequently, and back then medium looked like ass. Only DLSS saved me from the VRAM crashes. 60 fps with it, 60 fps without it. Not a GPU limitation, a VRAM limitation. Still nothing to do with AMD.
 
What if we have it all wrong? What if Nvidia packs less VRAM into their GPUs because they know they don't have the horsepower to use it all? Just because AMD packs a bunch of VRAM on their low to mid-range GPUs doesn't mean you still get good performance from it.. am I wrong?
 
What if we have it all wrong? What if Nvidia packs less VRAM into their GPUs because they know they don't have the horsepower to use it all? Just because AMD packs a bunch of VRAM on their low to mid-range GPUs doesn't mean you still get good performance from it.. am I wrong?

I mean, AMD certainly does it for marketing, no doubt about that. But the 3070 still had more life in her. The 6800 XT didn't have so much trouble playing TLOU, and didn't have missing textures in Hogwarts Legacy or crashes in Resident Evil. So yeah, with a wide-angle lens on, I'd say you're wrong. Though at time of release, Nvidia seems to give enough to play the games of the day. But the clock starts ticking real fast.
 
What if we have it all wrong? What if Nvidia packs less VRAM into their GPUs because they know they don't have the horsepower to use it all? Just because AMD packs a bunch of VRAM on their low to mid-range GPUs doesn't mean you still get good performance from it.. am I wrong?
I mean, AMD certainly does it for marketing, no doubt about that. But the 3070 still had more life in her. The 6800 XT didn't have so much trouble playing TLOU, and didn't have missing textures in Hogwarts Legacy or crashes in Resident Evil. So yeah, with a wide-angle lens on, I'd say you're wrong. Though at time of release, Nvidia seems to give enough to play the games of the day. But the clock starts ticking real fast.
You both have a point, imo. You don't need 12-16 GB on an entry/mid-range card (so that's definitely marketing), but you do need more than 8 on a mid/high-tier one.

Edit: Having more than you need is never a bad thing, either. :)
 
You both have a point, imo. You don't need 12-16 GB on an entry/mid-range card (so that's definitely marketing), but you do need more than 8 on a mid/high-tier one.

Edit: Having more than you need is never a bad thing, either. :)
Have you ever stopped to think that Nvidia and AMD continually limiting entry-level and mid-range GPUs (the most popular) to just 8 GB ever since the RX 480 also limits the quality of the textures that devs will consider using in games? Of course, Nvidia's decisions, given its dominant share of the dGPU market, are the ones with the greatest influence on developers' perceptions. Even so, there are still cases where 8 GB is pushed to the limit.

If in the next generation both AMD and Nvidia raise this minimum "VRAM bar" on entry-level GPUs, there is no doubt that the previous generation will start to suffer.
 
Have you ever stopped to think that Nvidia and AMD continually limiting entry-level and mid-range GPUs (the most popular) to just 8 GB ever since the RX 480 also limits the quality of the textures that devs will consider using in games? Of course, Nvidia's decisions, given its dominant share of the dGPU market, are the ones with the greatest influence on developers' perceptions. Even so, there are still cases where 8 GB is pushed to the limit.

If in the next generation both AMD and Nvidia raise this minimum "VRAM bar" on entry-level GPUs, there is no doubt that the previous generation will start to suffer.
I know. That's partly why I opted for a 16 GB 7800 XT even though I don't need one at the present moment.
 
Edit: Having more than you need is never a bad thing, either. :)
I only mentioned it because I noticed that in Forza Motorsport I have enough VRAM, but horsepower could be better at 3840x2160. At 2560x1440 it's plenty.
 
Yeah, 11% more shaders giving almost double the performance, that makes sense. Certainly the VRAM didn't have any impact. :p

RTX 4060 Ti 8GB vs 4060 Ti 16GB vs 4070: The Ultimate Comparison in the Latest Games!!! - YouTube

There is a video full of examples showing the VRAM limitation, and better still, on the very GPU you mentioned. Looking forward to seeing what excuses you'll come up with now.
For the first part of the comment: this was the opinion of the reviewers when testing the two video cards. What doubles is the amount of vRAM, not the performance. The performance difference at the time was ~11%. It is surprising that the difference in cores is also 11%. Could it be a coincidence? If you insist, we can look at the reviewers' conclusions.
[Attachment: NVIDIA GeForce GTX 1060 3 GB.jpg]


For the second part (the one with the video): did you see how many frames are rendered per second? Seriously! Spending $400-$500 on a video card to play at 40, 30 or even 20 fps??? Priceless!
I'm serious: only someone with stains on their brain, desperately trying to prove a ... stupidity, would call that playable.
If you want a personal opinion, whoever spent an extra $100 for the 16 GB version is simply out that money.
 
For the first part of the comment: this was the opinion of the reviewers when testing the two video cards. What doubles is the amount of vRAM, not the performance. The performance difference at the time was ~11%. It is surprising that the difference in cores is also 11%. Could it be a coincidence? If you insist, we can look at the reviewers' conclusions.
An excess of VRAM has never been about present games; it's about future games. Just look at the 2 GB 960 or the 3 GB 1060 as examples. They both ran contemporary games just fine (that's why you only see an 11% difference in the database), but after only a year or two, they started to struggle badly against their counterparts with double the amount of VRAM.

On the other hand, loading a 6500 XT with 16 GB of VRAM would be pointless, but that's not the performance class in question here.

I'm also not saying that the 8 GB 4060 Ti will turn out to be as crap as the 2 GB 960 anytime soon, but you never know. If you have more than enough VRAM, you know what you get performance-wise. Having just enough, or not enough, is a gamble.
 