4K Gamers, How much VRAM do you have?

  • 6 GB or less: 1,423 votes (4.4%)
  • 8 GB: 3,266 votes (10.1%)
  • 12 GB: 3,541 votes (11.0%)
  • 16 GB: 5,105 votes (15.8%)
  • More than 16 GB: 6,227 votes (19.3%)
  • I'm not gaming at 4K: 12,734 votes (39.4%)

Total voters: 32,296 (poll closed)
No option for 10 GB. I've been using the 3080 at 4K (LG OLED 4K120) for two years now and it's still a respectable beast at this resolution. DLSS shines at 4K too; I'll 100% use it in any game that supports it. VRAM has yet to cause me issues.

EDIT: come to think of it, my 6 GB RTX A2000 is also connected to a 4K display only (Sony 85" X85K, 120 Hz), and through upscaling, sometimes even a combination of NIS and DLSS, this GPU can comfortably deliver on this TV, especially at couch viewing distances. My wife and son are a lot less picky about IQ than I am :pimp:

If you use FSR/DLSS, it isn't 4K at all.
Just because the game says 3840 x 2160 doesn't mean it is.
Disagree. The output resolution is 4K, and it's been demonstrated over and over that, at least partly due to forced TAA, upscaling very often looks as good or better on a balance of IQ facets.
I have a VERY difficult time believing that 60.6% of users on TPU are gaming at 4K.
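That figure is just everything except the "I'm not gaming at 4K" option. A quick sketch in Python using the poll's own numbers above:

Code:
# Poll results as listed at the top of the thread: option -> votes
votes = {
    "6 GB or less": 1_423,
    "8 GB": 3_266,
    "12 GB": 3_541,
    "16 GB": 5_105,
    "More than 16 GB": 6_227,
    "I'm not gaming at 4K": 12_734,
}
total = sum(votes.values())                       # 32,296 voters
gaming_4k = total - votes["I'm not gaming at 4K"]
print(f"Claimed 4K share: {100 * gaming_4k / total:.1f}%")  # -> 60.6%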
These TPU polls are pretty much meaningless (apart from generating forum discussion)
I've hammered this point in the past: I find the numbers close to meaningless, and nobody should be hanging their hat on the results of a poll that anyone can vote in with no burden of proof whatsoever (like anyone and everyone voting for their preferred DLSS config while some aren't actually using RTX cards, ergo not using DLSS at all). The discussions can be interesting, however, as you say.
 
Indeed. At least Steam surveys are pretty reliable, but I haven't seen them publish a graph showing how much VRAM people have according to the resolution they use. It would be pretty interesting to see, actually.

I reckon if such a graph were made, it would show that the vast majority of 4K users are on 24 GB of VRAM (they are also in the majority in this poll, but there are obviously a lot of trash votes here, so yeah, not very reliable at all).
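If Valve ever exposed the raw per-respondent data, building that graph would be trivial. A sketch, assuming a hypothetical CSV export with resolution and vram_gb columns (Steam publishes no such file, so these names are made up):

Code:
import pandas as pd

# Hypothetical per-respondent survey export; the columns are assumptions.
df = pd.read_csv("survey.csv")  # columns: resolution, vram_gb

# Cross-tabulate VRAM size against display resolution, as row percentages.
table = pd.crosstab(df["resolution"], df["vram_gb"], normalize="index") * 100
print(table.round(1))  # e.g. what share of 3840x2160 users have 24 GB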
 
I personally use a 6600 XT in my gaming PC, so I alternate between the three main resolutions (1080p, 1440p and 4K) depending on the game. However, I invested in a 4K 144 Hz monitor at the same time as the GPU (November 2022; £500 for the monitor, £235 second-hand for the GPU), just so I know it will last 10-15 years, and when I do upgrade it will be a large one (hopefully 8K 144 Hz MicroLED for around £750-ish).
 
RTX 3070 here, so 8 GB. I often play at 4K; not every single game of course, and some need DLSS or the FSR 3 mod, but all in all almost every game I play is at 4K. Even if the render resolution drops, the UI, text and other elements get much sharper. I can't go below 4K anymore for that reason alone. World of Warcraft is a good example: change the resolution from 1080p to 4K and watch the UI get a million times better looking, icons and all. I used to have a GTX 1070, and 4K was an option there too for much older games, which I still play to this day.
 
I voted for not gaming at 4K... because frankly the expense is silly right now. That said, I drive 4K's worth of pixels across two monitors using a 3080 and a 3070 respectively. No issues with well-optimized games running well, no desire to take the RT plunge in the games that have it, and I'm still happy with moderate settings despite being able to crank them to the moon. Explicit 4K just isn't something I seek right now: two 1440p monitors at 144 Hz is plenty of screen size, a smooth experience, and more than enough to satisfy everything I want without being insanely expensive.

If the 4070 Super or the 4070 Ti were in the $500 range, this would be a different discussion, but that's the upper bound of "affordable" 4K. At the current $600 MSRP, the "entry level" 4K offerings are just not viable. Throw in the monitor cost and I can either game for years without breaking the bank, or play at 4K today.
 
I assume people who downscale from 4K to 1440p still count, as the rendering is still at 4K? I wonder how many people don't know that rendering resolution is independent of texture resolution.
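To put rough numbers on that independence: render targets scale with resolution, the texture pool doesn't. A back-of-the-envelope sketch, assuming simple 4-bytes-per-pixel buffers and ignoring compression, mips and everything else a real engine does:

Code:
# Framebuffer cost grows with render resolution; texture VRAM does not.
def render_targets_mb(width, height, targets=3, bytes_per_px=4):
    # e.g. color + depth + one intermediate buffer, 4 bytes/pixel each
    return width * height * bytes_per_px * targets / 1024**2

for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: ~{render_targets_mb(w, h):.0f} MB of render targets")
# Whatever textures the game streams in cost the same at any of these.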
 
Do you mean downsampling with DSR? If so, I'd say it counts. The load is the same whether it's done via DSR or natively, after all.
 
Native 4K rendering downscaled, e.g. playing FF15 on a 1440p display but with the rendering resolution set to 150%, so it renders at 2160p but outputs at 1440p, if that makes sense.
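The pixel math, in case anyone wonders why 150% lands exactly on 2160p (assuming a 2560x1440 display and the usual per-axis render-scale convention):

Code:
display = (2560, 1440)
scale = 1.5  # the 150% render-resolution setting, applied per axis
render = tuple(int(d * scale) for d in display)
print(render)  # (3840, 2160), i.e. native 4K
print(render[0] * render[1] / (display[0] * display[1]))  # 2.25x the pixels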
 
That's called downsampling, mate. But yes.
 
Stupidest thing, unless you're an Apple fanboy with their "Retina" fetish. That's it, you buy a MacBook Pro at 1.5-2x the cost of a plain MacBook Air, just to realize your "Retina" display is scaled above 100% by default, so you're not gaining screen estate, only "quality" you could see with a magnifier, not the naked eye... lol. I also tried this "higher than 1x rendering" in GTA V; a worthless GPU-pushing thing...

I'll play at 4K some day; I still haven't found a 4K monitor with a reasonable price/performance ratio.
At least 60 Hz doesn't cost you a Boeing wing.
 
No offense, but this is simply incorrect.

If you set it up correctly with 4x DSR and zero blur, it greatly enhances image quality. Combine it with DLSS Performance and it hardly costs more than native res, but looks substantially better.
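The arithmetic behind that claim, assuming a 1440p display (4x DSR quadruples the pixel count, and DLSS Performance renders at 50% of the target per axis):

Code:
display = (2560, 1440)
# 4x DSR = 4x the pixels, i.e. 2x on each axis
dsr_target = (display[0] * 2, display[1] * 2)        # (5120, 2880)
# DLSS Performance renders at 50% of the target on each axis
internal = (dsr_target[0] // 2, dsr_target[1] // 2)  # (2560, 1440)
# Net: internal render cost ~= native 1440p, but the image is
# reconstructed to 5120x2880 and then downsampled to the display.
print(dsr_target, internal)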
 
GeDoSaTo was created for exactly this purpose, and Nvidia later latched onto the idea with DSR; no, it's not pointless.

Even more so if TAA is used.

I could even show you some recordings of FF7 Remake. My early recordings had the PS4 Pro outputting at 1080p and capturing at 1080p; later I discovered I could configure the console to output 4K to the Elgato device while still capturing at 1080p, and the image quality in the recordings was noticeably better, as it was on my screen while playing. That's just one example. In dozens of games on my PC I render at 4K and output at 1440p, and it's better quality than flat 1440p. In FF13, for instance, it's not that noticeable when looking at something big up close, but distant moving monsters are blurry without the 4K rendering; even on a 1440p monitor the difference is quite visible.

In FF15, which uses TAA, it removed a lot of TAA's downsides and the hair looks so much nicer; it makes the hair look much nicer in Star Ocean 4 as well.
 
Downsampling has always greatly improved image quality, even before TAA, but yes, it nearly completely negates TAA's downsides.
 
6700 10 GB here... no option for me :(. But a 6800 XT is ordered, so I might as well click 16 GB :).
 
And are you using a 4K monitor?
 
Fair enough, I hadn't noticed the 4K at the beginning of the question.
 
As a large portion of the respondents no doubt haven't.
 
12 GB (4070), and I'm playing at 1080p because I like high FPS and high details in games. For my eyes, a 25" 1080p panel at 70 cm distance is ideal; no head-shaking, etc.
 
Late to the poll, but I've got 8 GB on the now rarely used RTX 2080, and 24 GB on the 7900 XTX.
 
Exactly. Not everyone plays the latest AAA titles with all the eye candy on (new games suck anyway; maybe 1-3 good releases per year).
Well, I'm glad to see that not everyone is forced to upgrade. I agree that new games have kinda sucked lately; even Elden Ring, the so-called "Game of the Year", was kinda meh. I miss games that are actually amazing like Far Cry 3 & 5, Assassin's Creed: Odyssey, The Witcher 3 and Skyrim. I also miss silly, self-aware games that are just mindless fun like Deadpool and Saints Row IV.

The devs have become so focused on graphics that the more important aspects of games are suffering (story, actual gameplay, stability, etc.). I think that the devs going all-in on graphical frills like ray-tracing has hampered games in general because there are only so many hours in a day to dedicate to different aspects of games. Thus, everything has become a trade-off in some way or another.

Ubisoft dedicated a large amount of time to graphics improvements in AC: Valhalla compared to AC: Odyssey, and while the graphics are (marginally) better, the effort required took away from the rest of the game, which is why AC: Odyssey is the better game overall. Graphics look nice, but I'd rather have a fantastic story, great gameplay and decent graphics over a game with a decent story, great gameplay and fantastic graphics.
 
4K users:

Steam: 3.54%
TPU: 60.6%

o_O :wtf: :laugh:

I game across the whole spectrum of resolutions, from 320x200 for classic DOS games, to 3840x2160 for most 3D titles.
Still, I find it difficult to believe that the majority of voters play in 4K.
 
1440p "gamer", but still 24GB (4090). So 2 answers apply for me.
 
Stupidest thing
No. Some games are designed for 4K or even higher-resolution monitors with no regard for lower-resolution ones (1440p and below). That's why rendering them at 4K on a 1080p display is vastly superior to playing the same game on the same monitor at its native resolution (given the framerate is fine in either case). There are lots of fine details you just can't see when the game runs at 1080p. Want better immersion? Increase the resolution. Of course immersion doesn't consist only of graphics, but still.
I voted for not gaming at 4K... because frankly the expense is silly right now.
It's not silly, it's fine. $300 video cards handle this resolution just fine if you don't go for the full Ultra + RT approach; many games are beyond playable on Medium or even High presets. I'm talking from my experience with a 6700 XT. With the RTX 5000 series coming soon, $ per FPS will drop even further, 4K gaming included. But what makes 4K and higher-resolution displays so nice is the aforementioned UI. The eye candy of UI elements and text, including the text you're reading on TPU, is worth every dime. There's no going back to 1080p or even 1440p for me; I'd rather not use the PC at all than torture my eyes like that.
 
At least 50% obviously didn't read the poll before voting, as is also evident from the comments people have made...
 