
For those who think 12GB VRAM can max out everything

The game using that much of the framebuffer does not mean the game needs that much framebuffer.
That's not true. A program will use whatever it needs to do the task at hand, as required by the user. If the program in question is asked to perform at a certain level and needs a specific memory pool to do so, it will use whatever it has or stop running. Games will use VRAM first, then system RAM, then virtual memory (the swap file). So if a game has to shunt data into system RAM because VRAM is not enough, it will do so, but performance can (and often does) suffer.

So as described just a few posts above, the choices for the user are simple.
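The VRAM-then-RAM-then-swap spillover described above can be sketched as a toy tiered allocator. All capacities and access costs here are made-up illustration values, not measurements of any real game or card.

```python
# Toy sketch of how a game's working set spills across memory tiers
# (VRAM -> system RAM -> pagefile). Capacities and relative access
# costs are invented for illustration only.

TIERS = [
    ("VRAM", 12, 1),         # (name, capacity in GB, relative access cost)
    ("System RAM", 32, 10),
    ("Pagefile", 64, 100),
]

def place_working_set(size_gb):
    """Fill tiers in order; return per-tier usage and a rough cost score."""
    placement, cost = [], 0
    remaining = size_gb
    for name, capacity, access_cost in TIERS:
        used = min(remaining, capacity)
        if used > 0:
            placement.append((name, used))
            cost += used * access_cost
        remaining -= used
    if remaining > 0:
        raise MemoryError("working set exceeds all tiers")
    return placement, cost

# A 10 GB working set fits entirely in a 12 GB card's VRAM; a 16 GB one
# spills 4 GB into system RAM and the access-cost score jumps sharply.
fits, cheap = place_working_set(10)
spills, pricey = place_working_set(16)
```

The point the toy model makes is the same as the post: the game still runs after spilling, it just pays a much higher cost per gigabyte served from the slower tier.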
 
it can't

2077, motorcycle drive in Dogtown around the Obelisk/Heavy Hearts club
4060 Ti 16GB VRAM, 5700X3D, resolution 2K
capturing software: CapFrameX + RTSS
I just can't use 12GB VRAM cards anymore. Time to get over it.
2K, as in 1920x1080, right?

Because otherwise you should have written 2.5K or 1440p, in which case you should have bought a higher-end GPU, because the 4060 Ti is aimed at 1080p120.
 
it can't

2077, motorcycle drive in Dogtown around the Obelisk/Heavy Hearts club
4060 Ti 16GB VRAM, 5700X3D, resolution 2K
capturing software: CapFrameX + RTSS
I just can't use 12GB VRAM cards anymore. Time to get over it.
Here's perspective.

 
That's not true. A program will use whatever it needs to do the task at hand, as required by the user. If the program in question is asked to perform at a certain level and needs a specific memory pool to do so, it will use whatever it has or stop running. Games will use VRAM first, then system RAM, then virtual memory (the swap file). So if a game has to shunt data into system RAM because VRAM is not enough, it will do so, but performance can (and often does) suffer.

So as described just a few posts above, the choices for the user are simple.


Daft answer. What he said is absolutely right: just because it says so in the framebuffer readout doesn't mean that's what the game is actually using. That's the definition of a buffer.
It may use more or less at any point; that was not what was in question here.
 
For me, 8GB is the minimum, and it is enough. 4GB/6GB is generally too small for anything but a mobile dGPU.
 
Nvidia won't provide you more VRAM because then you might get too cocky with your AI stuff, and that's the sole reason for it. Whether 8GB, 12GB or whatever is enough doesn't matter.
Pay more for a higher-tier GPU and then you get more VRAM. Simple as that.
 
Just go with a 16GB+ card, whether it be AMD, Intel or Nvidia.
 
Just go with a 16GB+ card, whether it be AMD, Intel or Nvidia.
I had a 16gb 6800, what a beastly card!!!

My 4070 Super spanks the shit out of it with 4GB less VRAM.

So, no. Don't just buy whatever card with 16GB of VRAM. You may not be doing yourself any favors.
 
I like her hair whoever that is.

My GPU has 12GB, but I can bog it down before I run into VRAM problems.

I bet that happens more than people think, and they maybe just blame the ram.
 
Just go with a 16GB+ card, whether it be AMD, Intel or Nvidia.
The problem is, that's exactly how Nvidia wants you to think. The 4070 Ti Super has 4GB more VRAM than the 4070 Super and costs 35% more for 15% more performance. The main selling point is that extra VRAM, but having owned both 10GB and 12GB cards, I can tell you that it's really hard to find scenarios where 12GB, or even 10GB for that matter, is not enough.
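The value argument above is simple arithmetic. The baseline performance and price numbers below are hypothetical; only the +35% price / +15% performance deltas come from the post.

```python
# Rough perf-per-dollar comparison using the +35% price / +15% perf
# figures quoted above. Baseline values are hypothetical placeholders.

def perf_per_dollar(perf, price):
    return perf / price

base_perf, base_price = 100, 600      # hypothetical 4070 Super baseline
ti_s_perf = base_perf * 1.15          # +15% performance
ti_s_price = base_price * 1.35        # +35% price

# A value ratio below 1 means the pricier card delivers less
# performance per dollar than the baseline.
ratio = perf_per_dollar(ti_s_perf, ti_s_price) / perf_per_dollar(base_perf, base_price)
```

With those deltas the ratio works out to 1.15 / 1.35, roughly 0.85, i.e. about 15% worse value per dollar before the extra VRAM is weighed in.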
 
I like her hair whoever that is.

My GPU has 12GB, but I can bog it down before I run into VRAM problems.

I bet that happens more than people think, and they maybe just blame the ram.


I think it's easy to see if VRAM really is to blame: just compare VRAM usage before and after the frame spikes happen.
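That check can be sketched in a few lines: flag frametime spikes and see whether VRAM was near capacity when they happened. The data below is fabricated for illustration; real numbers would come from a CapFrameX/RTSS log.

```python
# Minimal sketch of the diagnostic suggested above. Thresholds are
# illustrative: a "spike" is any frame slower than ~30 FPS pacing,
# and "near full" means VRAM usage at 95%+ of capacity.

def vram_related_spikes(frametimes_ms, vram_used_gb, vram_total_gb,
                        spike_ms=33.3, near_full=0.95):
    """Return indices of frames that both spiked and ran with nearly full VRAM."""
    flagged = []
    for i, (ft, used) in enumerate(zip(frametimes_ms, vram_used_gb)):
        if ft > spike_ms and used / vram_total_gb >= near_full:
            flagged.append(i)
    return flagged

frametimes = [8.3, 8.5, 60.0, 8.4, 55.0]
vram_used  = [9.0, 9.2, 11.8, 9.1, 10.0]   # GB on a 12 GB card (made up)

# Frame 2 spiked with VRAM ~98% full (suspicious); frame 4 spiked with
# plenty of VRAM free, so something else is the bottleneck there.
suspects = vram_related_spikes(frametimes, vram_used, 12.0)
```

The useful part is the negative case: a spike with VRAM well below capacity points the blame somewhere other than the memory buffer.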

The problem is, that's exactly how Nvidia wants you to think. The 4070 Ti Super has 4GB more VRAM than the 4070 Super and costs 35% more for 15% more performance. The main selling point is that extra VRAM, but having owned both 10GB and 12GB cards, I can tell you that it's really hard to find scenarios where 12GB, or even 10GB for that matter, is not enough.

It would be best for everyone to have a little extra VRAM so cards can live a bit longer, but my old RX 580 never used its 8GB, so it was kind of a waste for me. Maybe now it's in good use for someone else; if it had come with 4GB it would be a lot less useful and would probably be e-waste by now. Trade-offs.
We should have more VRAM on Nvidia cards, but some complaints and clickbait videos are making things worse, with lots of weird arguments being made: "I want to play 4K all maxed on my mid-range card and can't because of the VRAM."
 
I would like to inform people that the RTX 4070 and RTX 4070 Super aren't great cards because of 12GB VRAM. They're terrible.
Fact. /thread
 
We need fact checkers here
And misinformation checks, pretty please.

The 4070 and 4070 Super are well-balanced GPUs with lots of performance and capable VRAM buffers. What makes them awful is how expensive they are. Imagine the same SKUs for 380 and 450 USD respectively. That would've left zero reason to whine about VRAM because, seriously, this much performance is awesome at that pricing, and the scenarios where it's bad to "only" have 12 GB would've been left as "whatevers" more than anything else.

4070 Ti, however, despite being just a tad faster than 4070 Super, is a little bit of a waste and not because it "only" has 12 GB but because it only has 500ish GBps bandwidth. Should've had faster VRAM chips, it's really bandwidth starved, especially at 4K. And yeah, too expensive. Even if we make it a 16 GB GPU, it's still too expensive.
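The "500ish GBps" figure is just the standard bandwidth formula. The 21 Gbps per-pin rate and 192-bit bus below match the 4070 Ti's published specs, but treat this as an illustration of the arithmetic rather than a spec sheet.

```python
# Back-of-envelope peak memory bandwidth: per-pin data rate (Gbps)
# times bus width (bits), divided by 8 bits per byte, gives GB/s.

def bandwidth_gb_s(data_rate_gbps_per_pin, bus_width_bits):
    return data_rate_gbps_per_pin * bus_width_bits / 8

ti_bandwidth = bandwidth_gb_s(21, 192)   # 21 Gbps GDDR6X on a 192-bit bus
```

That comes out to 504 GB/s, which is the "500ish" the post refers to; a wider bus or faster chips raises it proportionally.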

The market is going full monopoly mode, so what can we do other than thank AMD for constant underdelivery coupled with out-of-their-mind management, and Intel for ignoring the dGPU space for too long.
 
Smarter people already pointed out that it depends on the context. We need fact checkers here
According to my records.
Unigine Superposition, 8K Optimized.
RX 6800 = 5436 pts.
4070 S = 6513 pts.

Would you like the screenshots also?
 
According to my records.
Unigine Superposition, 8K Optimized.
RX 6800 = 5436 pts.
4070 S = 6513 pts.

Would you like the screenshots also?

That's a great game, do you play it often? How much was your 8K monitor?
 
That's a great game, do you play it often? How much was your 8K monitor?
None of that matters. It's a benchmark. It runs at that resolution even if you had a CRT.

And because it's not a game, there aren't a bunch of variables.

Unfortunately it is not a Cyberpunk 2077 benchmark, no, but I believe the 4070S has a submission in that section of the forum if you'd like to see it. I did start the thread out with a 16GB card, the 6800 mentioned earlier.


Why?
 
None of that matters. It's a benchmark. It runs at that resolution even if you had a CRT.

And because it's not a game, there aren't a bunch of variables.

Unfortunately it is not a Cyberpunk 2077 benchmark, no, but I believe the 4070S has a submission in that section of the forum if you'd like to see it. I did start the thread out with a 16GB card, the 6800 mentioned earlier.


Why?

context = what you play and how you play it

benchmarking = get the biggest number possible

I'm sure you understand that a faster card will always be a faster card, but there are scenarios, like running out of VRAM, where the slower card will outperform it. Not everyone will have that problem; it depends on what they do.
 
And misinformation checks, pretty please.

The 4070 and 4070 Super are well-balanced GPUs with lots of performance and capable VRAM buffers. What makes them awful is how expensive they are. Imagine the same SKUs for 380 and 450 USD respectively. That would've left zero reason to whine about VRAM because, seriously, this much performance is awesome at that pricing, and the scenarios where it's bad to "only" have 12 GB would've been left as "whatevers" more than anything else.

4070 Ti, however, despite being just a tad faster than 4070 Super, is a little bit of a waste and not because it "only" has 12 GB but because it only has 500ish GBps bandwidth. Should've had faster VRAM chips, it's really bandwidth starved, especially at 4K. And yeah, too expensive. Even if we make it a 16 GB GPU, it's still too expensive.

The market is going full monopoly mode, so what can we do other than thank AMD for constant underdelivery coupled with out-of-their-mind management, and Intel for ignoring the dGPU space for too long.
We agree for a large part. It is indeed mostly the price that makes the 4070 (S) bad. The 4070 Ti non-S, though, yeah, that one is truly badly balanced, much like the 4060 Ti is.

Still, buying ANY 12GB GPU with that performance is not optimal. The core is still going to outlast the VRAM. It still won't have good resale value. This is the class of GPUs I stay away from entirely, because it's bad value for money in a world where the perf/$ metric of midrange GPUs versus the top end has crawled to similar levels. In that world, what you want is something that holds value. 12GB GPUs do not. They're obsolete by the time you want to sell them. If you replace your GPUs regularly, this is the segment you don't wanna be in. Just get something better, resell it while it still holds value, and upgrade for less than the price of an x60, but still keep playing in the high end.

You can stretch that principle to larger budgets too; I just don't want to buy an x90. But realistically, look at it now. You can still sell a 4090 for 1499,-. That's the street price. In a market that's going full monopoly, I say, make the most of your $$$. Irrespective of brand.

In the end, what resolution you play at and what FPS you want are completely abstract things these days, especially with upscaling and FG. It's a non-argument. There are no 1080p or 1440p 'cards', there are just games with varying loads that you either do or don't want to play. Even at 1080p, a new one can bring your super well-picked midranger to its knees. And then what? You're gonna spend money again even if you didn't plan to? The reality often is that you will, at least sooner than you anticipated, and then you could've done better right away.
 
context = what you play and how you play it

benchmarking = get the biggest number possible

I'm sure you understand that a faster card will always be a faster card, but there are scenarios, like running out of VRAM, where the slower card will outperform it. Not everyone will have that problem; it depends on what they do.
Raw performance is big GPU core first. There is no place in ANY benchmark where I saw a 16GB RX 6800 beat a 4070 Super. None. Not one. Zero. Zip. Zilch.

And all of a sudden it's going to do better in some games because of 16GB? Nope. Not unless someone altered settings and didn't disclose it.

Want to look at a stupid card? Point fingers at the 4060 Ti. For the money, THAT card would be the worst buy right now. Definitely not a 4070...
 
When the 4070 Super runs out of VRAM it will spill to the swap file and slow down, thus the RX 6800 beats the 4070 Super.
In theory, just saying :p
By that point both cards will be in unplayable territory and I am not sure about you, but I personally have no interest in discussing the merits of two different piles of shite. Oh look, this one is slightly less lumpy than that one, very cool.
 
When the 4070 Super runs out of VRAM it will spill to the swap file and slow down, thus the RX 6800 beats the 4070 Super.
In theory, just saying :p
It goes to system RAM, not swap (the pagefile). Go practice your theories, because at this point you're trying to prove a null point like a flat-earth stooge.
 