
4K Gamers, How much VRAM do you have?


  • 6 GB or less

    Votes: 1,423 4.4%
  • 8 GB

    Votes: 3,266 10.1%
  • 12 GB

    Votes: 3,541 11.0%
  • 16 GB

    Votes: 5,105 15.8%
  • More than 16 GB

    Votes: 6,227 19.3%
  • I'm not gaming at 4K

    Votes: 12,734 39.4%

  • Total voters
    32,296
  • Poll closed.
Why the hell do I have a 4K 27" display, yet Windows "suggests" a scaling factor larger than 100%, cropping the effective resolution?
As a user with the same diagonal and resolution, I don't find 100% usable at all (I use 200% and sit almost 5 feet from the monitor). Unless being 10 inches from a display is somehow acceptable. Yes, you lose screen real estate when Windows shifts the scale, but at least it becomes readable from afar.
3200x1800 should be a thing. Why was it never a thing?
Because manufacturers and their marketing departments never care about what's right for the customer. 3200x1800 wouldn't have made the 4090 as appealing as 3840x2160 did.

However, 3200x1800 is still quite capable. I'd prefer 4200x1800, though, as ultrawide is far better for first-person-perspective games and for work.
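For what it's worth, the render-load gap is easy to quantify: 1800p is only about 69% of 4K's pixel count, which is roughly where the "even more FPS" argument comes from. A quick back-of-the-envelope sketch (pixel counts only; actual FPS scaling won't be perfectly linear):

```python
# Compare the raw pixel counts of the resolutions discussed in this thread.
def pixels(width: int, height: int) -> int:
    return width * height

uhd = pixels(3840, 2160)       # standard 4K: 8,294,400 pixels
p1800 = pixels(3200, 1800)     # "1800p": 5,760,000 pixels
uw1800 = pixels(4200, 1800)    # ultrawide 1800p: 7,560,000 pixels

print(f"3200x1800 is {p1800 / uhd:.1%} of 4K's pixel count")
print(f"4200x1800 is {uw1800 / uhd:.1%} of 4K's pixel count")
```

So a GPU that averages 60 FPS at 4K would, all else being equal, land somewhere in the 80s at 3200x1800.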
when your opponent
See, I'm not an opponent. I'm a guy who's been accused of things he never said or did.
I explicitly said 4K is doable on $300-ish GPUs when you don't use ultra settings. Then I showed how games fare on the worst $300 GPU at ultra settings. Then I showed how changing settings from ultra to semi-high doubles the framerate from 30 to 60. This is also possible in all other games, but proving it isn't my duty; the burden of proof lies on the one who accuses.
Then you called everything I've done names, called me a goalpost-moving delusionist, and decided I'm calling 4K30 the ideal scenario. You are your own opponent.
 
5 feet: very personal.
100% not usable: very personal.
At 1 foot, 100% is pretty usable. 200% suits a big TV with you on the sofa, much farther away than 5 feet.

Because manufacturers and their marketologists never care what's right for the customer. 3200x1800 wouldn't make 4090 as appealing as 3840x2160 did.
Vice versa: 2160p makes the 4090 perfect for marketing, but 1800p makes it even better for consumers, as at that resolution it can push even more FPS, lol. Or did I miss your point and we share the same opinion? :)
 
I didn't get your point and you shared the same opinion?
Yes. I don't get why some people refuse to understand what I wrote. I thought I was using English simple enough to be idiot-proof.
5 feet - very personal.
100% not usable - very personal.
Can't say so. Medical research recommends viewing your screen from at least 1.5x its diagonal. 40.5 inches isn't quite 5 feet, but even from that distance, 100%-scaled 4K at 27" takes massive effort to read. At 150 to 200 percent, though, the letters are big enough.
at 1 feet
1 foot; and it's also ridiculously close. You can't even take in the whole screen real estate from that distance.
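The arithmetic behind this scaling back-and-forth is simple enough to sketch. The 1.5x-diagonal figure below is the rule of thumb cited above, not a hard standard:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal length."""
    return math.hypot(width_px, height_px) / diagonal_in

def min_viewing_distance(diagonal_in: float, factor: float = 1.5) -> float:
    """Recommended minimum viewing distance in inches (rule of thumb)."""
    return factor * diagonal_in

density = ppi(3840, 2160, 27)        # a 27" 4K panel: ~163 PPI
distance = min_viewing_distance(27)  # 40.5 inches, just under 3.4 feet
print(f"{density:.0f} PPI; sit at least {distance:.1f} inches away")
```

At ~163 PPI, unscaled UI elements are about 1.6x smaller than on a typical 27" 1440p panel, which is why 150-200% scaling keeps coming up.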
 

Note that you fail your own basic statements.

"Then I showed how changing settings from ultra to semi-high doubles the framerate from 30 to 60."
Where? You BS'd a discussion with "I can get playable frame rates with one setting..." but never provided data.

Likewise, you are stating something inherently stupid: that "4K Ultra HD" means ultra settings. Did you by chance not read? I'm asking because nowhere in the review do they say the presets are ultra, only that they've averaged the results between lows and highs. I'm not sure if this is abundantly clear, but when you pull data, maybe do the reading instead of assuming, because the rest of the article that you didn't link to paints a much different picture. Also note that in that article the average FPS is 29.0, which is entirely in line with an unplayable, stuttering mess at 4K.



So, I say that 4K is too expensive. You say any $300 card can do it without issues. You spend multiple pages misunderstanding the data and not reading that "Ultra HD" is the resolution, not the bells-and-whistles settings, and after all of it you accuse me of not reading, despite your own pictures literally curb-stomping the stupid argument that 4K is fine on a $300 card. Your only argument is FSR, made on page two of the review, which claims you can go from 36 to 122 FPS in one game, at undefined settings.



Please note that this isn't an Nvidia vs. AMD flame war. The 7600 and 4060 both aren't designed to do 4K, and the compromises required to make them do it result in a crappier experience. If you want to die on the hill that "just one setting fixes everything," that you can "double frame rates magically," and that going from all the extras at 144 FPS on a 2K monitor to nothing on a 4K monitor at less than half the FPS (or, for your 7600 example, 63.7 down to 34.7 on average, ignoring 1% lows that definitely make the games unplayable messes) is fine, then you do you. Nobody I care about is getting the recommendation to spend $500+ on a 4K screen and then cheap out on a $300 GPU. At that price I can buy a console ($600), a good 2K monitor (about $150), and a game, without even having to buy the rest of a PC.
Gaming is about gaming. Gaming at 4K is currently about displaying wealth. Consider me entirely happy not to throw money at 4K, because right now it's in about the same place 1920x1080 was when the original Crysis came out. The bizarre statement that you can run 4K on any $300 card is just, well, stupid. It's as stupid as buying a house with a mortgage that eats 70% of your paycheck (being "house poor"). It's as stupid as buying a Civic and spending thousands of dollars on the exhaust to make it sound like it has a bigger engine. It's as stupid as buying a white elephant, a gift meant to punish the receiver.

"You do you" is my way of saying that you are free to "feel good" about your decision instead of thinking it through. I understand that, thanks to stupid decisions and people willing to pretend 30 FPS is playable, I'll see 4K come down in price faster, because somebody else is paying the new-technology premium. I'm glad I'm not the one paying that tax.
 
All right, back on topic; less of this, please!
Take your "chat" to PM.
 
16 GB with my Arc A770 (my second one, the first one has 8 GB)

12 GB with my RX 6750 XT

Want RT? Buy a 4090. Want perfect eye candy? Get yourself cryogenically frozen for a century and a half, since we're not nearly there yet.
Good one, LOL.
 
1080p 144 Hz FreeSync
16GB 7900 GRE Nitro+ (upgraded from a 16GB Vega 10 MI25/WX9100)

All but CP'77 run MTFO at 42-144+ FPS; no FSR/XeSS needed.
 
Voted: "I'm not gaming at 4K"

But if I did, none of my three 4K-capable cards fits the poll options:

1080 TI - 11GB
2080 TI - 11GB
3080 - 10GB

Maybe change the "12GB" poll option to "10-12GB".
 
BeamNG.drive is performing very well now at 1440p! You don't need more than an RX 6700-family card at 1440p.

Here's the trick: with BeamNG.drive 0.32, let it build the cache, then close and relaunch the game; otherwise the FPS will be low and it won't fully utilize the GPU (bug).

After relaunching BeamNG.drive, I easily get 130+ FPS on my RX 6750 XT. When it's running correctly, my OC'd RX 6750 XT @ 2.9 GHz will easily pull 240 W.

I have a 4K monitor, but it's only 60 Hz.
 
1080p 144 Hz FreeSync
16GB 7900 GRE Nitro+ (upgraded from a 16GB Vega 10 MI25/WX9100)

All but CP'77 run MTFO at 42-144+ FPS; no FSR/XeSS needed.

And what did you vote?
 
@W1zzard, is there a dedicated page where we can view all previous, current and future Poll results?
 
Had a VRAM downgrade: switched from an RX 6700 XT 12GB to an RTX 3080 10GB.
 
I just need to be conservative with the texture settings and I guess I'm fine.

That's the one setting one should never lower though, especially at 4k...
 