Friday, November 9th 2018

TechPowerUp Survey: Over 25% Readers Game or Plan to Game at 4K Resolution

More than a quarter of TechPowerUp readers either already game at 4K Ultra HD resolution or plan to do so by next year, according to our front-page survey poll run over the past 50 days. We asked our readers if they are gaming at 4K. Among the 17,175 respondents at the time of this writing, 14.5 percent said that they are already gaming at 4K UHD (3840 x 2160 pixels), which means not just owning a 4K display, but also having their games render at that resolution. 3 percent say that while they have a 4K display, they game at lower resolutions or with a reduced level of detail, probably indicating that their PC hardware isn't yet capable of handling 4K.

Almost a tenth of the respondents (9.5 percent, to be precise) say that while they don't game at 4K, they plan to do so in the near future: 1.6 percent expect to go 4K within 2018, and 7.9 percent in 2019. The remaining majority, 73 percent of our readers, neither game at 4K nor plan to any time soon. These results are particularly encouraging, as a reasonably big slice of our readership is drawn to 4K, the high-end gaming resolution of this generation, which provides four times the detail of Full HD (1080p). Of the 9.5 percent lining up to upgrade, a near-proportionate share could upgrade not just their displays, but also other hardware such as graphics cards, and perhaps even the rest of their platforms, to cope with 4K.
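For readers who want to sanity-check the headline figure and the "four times the detail" claim, here is a quick back-of-the-envelope calculation in Python; the percentages are simply the poll shares quoted above:

# Poll shares quoted above (percent of the 17,175 respondents)
already_gaming_4k = 14.5   # render games at 3840 x 2160
own_4k_display = 3.0       # own a 4K display but render at lower settings
plan_within_2018 = 1.6     # plan to move to 4K within 2018
plan_in_2019 = 7.9         # plan to move to 4K in 2019

headline_share = already_gaming_4k + own_4k_display + plan_within_2018 + plan_in_2019
print(f"Gaming at 4K, own 4K hardware, or planning to: {headline_share:.1f}%")  # 27.0%, i.e. "over 25%"

# "Four times the detail" of Full HD is a literal pixel count:
print((3840 * 2160) / (1920 * 1080))  # 4.0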

99 Comments on TechPowerUp Survey: Over 25% Readers Game or Plan to Game at 4K Resolution

#76
John Naylor
Until we can game at a minimum of 80 fps (using ULMB) in every game on a 144 Hz IPS monitor with acceptable lag and response times @ 4K, for less than 1/3 of the current ~$4,500 cost in GFX cards and monitor, I'm not going to go there.
HTCThe current problem with 4K gaming is that, to be able to do it, it requires a lot of money:

- the monitor's price itself: prices have come down, but they are still expensive
- the extra cost of sync technologies: FreeSync is much cheaper vs. G-Sync, but it still adds to the cost of the monitor
- the only manufacturer (nVidia) currently able to provide good FPS @ 4K has the cards that do it at very high prices (I'm referring to 1080-and-above performance cards), thus unavailable to most people
- the mining craze did NOT help one bit, driving away further potential 4K gaming adopters
- the only alternative to nVidia's solutions for 4K gaming (@ lesser quality settings), though cheaper, still costs too much and is thus unavailable to most people
- requires a good CPU so games aren't bottlenecked by it: for gaming, Intel is still the best choice in terms of FPS, but also the most expensive, thus increasing the total cost further
- RAM is still very pricey: doesn't help one bit

No wonder the 4K gaming adoption is slow!
All true ... almost... FreeSync isn't doing anything for ya with a 144 Hz 4K monitor.

From this perspective... the two technologies are very similar.
G-Sync provides adaptive sync starting at about 30 fps and its impact trails off greatly after 60 fps.
FreeSync provides adaptive sync starting at about 40 fps and its impact trails off greatly after 60 fps.

From this perspective... the two technologies are very dissimilar.
G-Sync monitors include a hardware module, which is the reason for the difference in cost between the two technologies. When the first G-Sync-ready monitors came out, you could buy the module as an add-on for $200. Adaptive sync is intended to eliminate display issues that occur when fps is below 60. The hardware module also provides Motion Blur Reduction, which is quite useful above 60 Hz, when the problems that adaptive sync solves are no longer a significant issue. Of course, an average fps means the minimum fps is lower, so some cushion is needed; when fps averages 70 or so, the MBR technology provides a superior visual experience. The typical 1440p 144 / 165 Hz owner, assuming they have the GFX horsepower, is playing with G-Sync OFF and ULMB ON until that rare game that won't let them maintain fps above 60.
FreeSync monitors are not equipped with this hardware module. Yes, adaptive sync continues to work after 60 fps, but the effects are greatly diminished as the fps increases. And there is no high-fps alternative to switch to, because FreeSync includes no MBR hardware module.
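To make that rule of thumb concrete, here's a minimal sketch in Python; the thresholds are only the rough figures from this post, and pick_monitor_mode is just an illustrative name, not anything official:

def pick_monitor_mode(avg_fps, has_gsync_module):
    """Rough rule of thumb from the post above: adaptive sync earns its keep
    below ~60 fps, while a hardware MBR/ULMB mode is the better choice once
    you can comfortably stay above 60."""
    # An average of ~70 fps leaves a cushion so minimums rarely dip below 60.
    comfortably_above_60 = avg_fps >= 70

    if has_gsync_module and comfortably_above_60:
        return "G-Sync OFF, ULMB (motion blur reduction) ON"
    if has_gsync_module:
        return "G-Sync ON (adaptive sync from roughly 30 fps up)"
    if comfortably_above_60:
        return "FreeSync adds little here, and there is no MBR module to switch to"
    return "FreeSync ON (adaptive sync from roughly 40 fps up)"

print(pick_monitor_mode(avg_fps=75, has_gsync_module=True))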

So, when ya are ready to go 4k ....

With AMD, grab a decent 60 / 75 Hz panel and wait for a next-gen AMD card that can handle the games you want to play @ 40 - 75 fps
With nVidia, wait till ya can afford to grab what is now a $2,500 144Hz panel as well as reasonably affordable nVidia card(s) that can handle the games you want to play at 75 - 144 Hz

Personally, I won't give up 144Hz ULMB to go to 4K until I can do it at reasonable expense .... I'm thinking 2020
Posted on Reply
#77
lexluthermiester
stimpy8875% of gamers don't game at 4K, and don't even plan on upgrading!
HTCAlmost: it should be 73% of gamers don't game at 4K, and don't even plan on upgrading!
Yeah, but people change their minds. And when a compelling reason comes up, they likely will. The reality is all of us will sooner or later, because UHD will become the de facto standard. So maybe it would be better to say: "73% of gamers don't game at 4K, and don't even plan on upgrading until they have a good reason!".
Posted on Reply
#78
Mescalamba
That's an interesting way to present reality.

I know that optimism is often described as seeing the glass half full, while pessimists say it's half empty.

What is presenting it as 25% of TPU readers "almost playing at 4K"? Lying, or just bending the truth a bit?

Is your goal to become a fake news site, or what?
lexluthermiesterYeah, but people change their minds. And when a compelling reason comes up, they likely will. The reality is all of us will sooner or later, because UHD will become the de facto standard. So maybe it would be better to say: "73% of gamers don't game at 4K, and don't even plan on upgrading until they have a good reason!".
It will become the standard when GPUs are cheap enough to feed 4K without effort, and the same goes for LCDs.

Which means 2-5 years or more, depending on the situation in the market.
Posted on Reply
#79
Prima.Vera
For gaming, once you go 21:9 or 3440x1440 @ 100Hz, you cannot go back. Even if I had an 8K monitor and a GPU capable of handling 8K @ 100Hz, I would still never go back to the 16:9 format.
Posted on Reply
#80
ZeDestructor
Prima.VeraFor gaming, once you go 21:9 or 3440x1440 @ 100Hz, you cannot go back. Even if I had an 8K monitor and a GPU capable of handling 8K @ 100Hz, I would still never go back to the 16:9 format.
Someone will make you an 8K or 10K 21:9 when the time comes. Don't worry about it.
Posted on Reply
#81
Prima.Vera
ZeDestructorSomeone will make you an 8K or 10K 21:9 when the time comes. Don't worry about it.
A 10K 21:9 (10080x4320) monitor would be the ultimate thingy; however, what kind of GPU would be required to push at least 100fps in the latest games at that res??
Posted on Reply
#82
ZeDestructor
Prima.VeraA 10K 21:9 (10080x4320) monitor would be the ultimate thingy; however, what kind of GPU would be required to push at least 100fps in the latest games at that res??
You don't need to render games at 10320x4320 (that's the actual res they'll use, based on the common 3440x1440 we use right now... or 10240x4320, if they scale up 5120x2160). 5160x2160, 3440x1440 or 2580x1080 (well, more likely 2560x1080 with 10 black columns on each side) would work just fine as far as rendering games goes (based on my experience with 3K and 4K displays).
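A quick sketch of the integer-scaling arithmetic behind those render resolutions; the 10320x4320 panel is the hypothetical one from this post, not a real product:

# Hypothetical 21:9-ish "10K" panel from the post above
panel_w, panel_h = 10320, 4320

# Render resolutions that map to the panel with a clean integer scale factor,
# so each rendered pixel becomes an exact 2x2, 3x3 or 4x4 block on screen.
for factor in (2, 3, 4):
    w, h = panel_w // factor, panel_h // factor
    print(f"{factor}x scale -> render at {w} x {h}")

# 2x scale -> render at 5160 x 2160
# 3x scale -> render at 3440 x 1440
# 4x scale -> render at 2580 x 1080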

Personally, I want a 30" 16:10 8K or higher (7680x4800) screen. More useful for heavily multi-monitor work... unless they go massively all out and build a 48:10 monster (23040x4800) :D
Posted on Reply
#83
bug
Prima.VeraFor gaming, once you go 21:9 or 3440x1440 @ 100Hz, you cannot go back. Even if I had an 8K monitor and a GPU capable of handling 8K @ 100Hz, I would still never go back to the 16:9 format.
You do realize you're talking about personal preference, don't you? There's no need to take a hard stance; you know others will prefer it otherwise.
Posted on Reply
#84
stimpy88
lexluthermiesterYeah, but people change their minds. And when a compelling reason comes up, they likely will. The reality is all of us will sooner or later, because UHD will become the de facto standard. So maybe it would be better to say: "73% of gamers don't game at 4K, and don't even plan on upgrading until they have a good reason!".
Very true. But we need MS to sort the mess that is scaling in Windows at present (I actually don't think they ever will!). I also hear that HDR support is still buggy in Windows and in some games. Then we have the crazy monitor prices (if you want quality 4K HDR), then the crazy cost of building a PC capable of a true 4K 60FPS minimum. After these barriers are overcome, plus the new barrier of the RTX cards being unable to play an RTX title at true 4K 60Hz, then I think we will be ready for the next big hype to push us to 4K. But I predict it will be a good few years until this happens for most people.

I remember when I went from 1080p gaming to 1440p gaming, the only barriers for me were about $650 for a nice wide-colour-gamut, high-refresh-rate Korean monitor, and a GeForce 970. I had no other issues to deal with, as the rest of the system was good enough, so the cost was manageable. If I wanted to go after the 4K experience now, I would need a totally new PC and about $2,000 worth of monitor first. I just cannot justify that, plus the experience is simply not good enough, nor would it last long enough, to match the cost.

But yes, I have no doubt that in 10 years, 4K HDR will be the new 1080p.
Posted on Reply
#85
gamerman
Well, we all must remember that a question is always just a question. You can answer it however you like.

I think 2560x1440 is my max resolution, and right now I play at FHD, and it's OK when you have a good monitor.

Playing at 4K needs so much more cash, because besides the 4K monitor, which is very expensive, you then need a high-performance GPU, I think a 2080 Ti.

So I say very few people will go 4K for now, and 85% of people will still stay at FHD and WQHD resolutions.

4K gaming doesn't give you any better feel than FHD, it's a fact... after a few weeks of playing...
Posted on Reply
#86
lexluthermiester
stimpy88But yes, I have no doubt that in 10 years, 4K HDR will be the new 1080p.
I don't think it'll take that long. 5 years at the longest..
Posted on Reply
#87
skates
I've been gaming at 4K since the 780Ti came out, and it's all been good. I've had various 1440p and 4K monitors, and while in some games I couldn't go ultra settings, I could always get high settings at 60FPS with the 780Ti.

Frankly, I really don't understand all the talk about how 4K isn't ready. I've been hearing it for years now and have been doing it for several years with no issues.

I currently game with a 1080Ti on a 43" LG panel 4K monitor. One of the Korean ones with low input lag, built as a PC monitor, no TV tuner. I paid $500 for it 2 years ago and can game with 4:4:4 chroma and low input lag. I can't tell the difference in frame rates since it's 60Hz, but Battlefield is a consistent 100+ FPS on the 1080Ti with all settings on Ultra.

4K is glorious on a big screen and I would never go back to 1440p even with a higher Hz.
gamermanWell, we all must remember that a question is always just a question. You can answer it however you like.

I think 2560x1440 is my max resolution, and right now I play at FHD, and it's OK when you have a good monitor.

Playing at 4K needs so much more cash, because besides the 4K monitor, which is very expensive, you then need a high-performance GPU, I think a 2080 Ti.

So I say very few people will go 4K for now, and 85% of people will still stay at FHD and WQHD resolutions.

4K gaming doesn't give you any better feel than FHD, it's a fact... after a few weeks of playing...
You really don't need a lot of cash. A large-screen 4K monitor off eBay is not that expensive, and you can get a used 1080Ti and be good to go. The talk that it's expensive or not ready is just not true.
Posted on Reply
#88
lexluthermiester
skates4K is glorious on a big screen and I would never go back to 1440p even with a higher Hz.
You're not the first person who's said this kind of thing. Maybe it's time to give UHD a go.. LOL! Here I am changing my mind.
Posted on Reply
#89
ZeDestructor
stimpy88Very true. But we need MS to sort the mess that is scaling in Windows at present (I actually don't think they ever will!).
That particular issue is no longer in MS' hands. MS have built the APIs, the tooling, the support and the scaling into Windows 8 and newer, and it works extremely well if your apps make use of it (source: my high-DPI, scaled screen laptop since 2015). Sadly, you have laggards like Adobe and AutoCAD who still have shitshows when it comes to running on high DPI scaled screens. You'd expect content creation companies to be the first on it, but no, it's small time random devs that caught up first (like the devs of my IRC client, or terminal emulator, for example).

Hell, my dad only has 1 app he uses often that doesn't support scaling properly, but given he uses 250% scaling on his 13" 4K laptop, there's more than enough pixels to keep things only slightly blurry. Meanwhile, everything else he uses scales perfectly.
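For the curious, per-monitor DPI awareness is something each application has to opt into; here's a minimal sketch of how a small Python/Win32 program might do that through ctypes (Windows 8.1 or newer; the constant is the documented PROCESS_PER_MONITOR_DPI_AWARE value, and this is an illustration rather than the only route):

import ctypes
import sys

# PROCESS_DPI_AWARENESS values from the Windows 8.1+ shcore API:
#   0 = unaware (Windows bitmap-stretches the app, i.e. the blurry case)
#   1 = system DPI aware
#   2 = per-monitor DPI aware (app redraws itself when scaling changes)
PROCESS_PER_MONITOR_DPI_AWARE = 2

if sys.platform == "win32":
    try:
        ctypes.windll.shcore.SetProcessDpiAwareness(PROCESS_PER_MONITOR_DPI_AWARE)
    except (AttributeError, OSError):
        # Pre-8.1 fallback: system-wide DPI awareness only.
        ctypes.windll.user32.SetProcessDPIAware()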
Posted on Reply
#90
toyo
I have no idea how 25% of people would choose 4K. I do GFX for a living. I have a 4K Iiyama monitor. I don't play anything on it. I prefer my 144Hz 1080p monitor by far.
Posted on Reply
#91
bug
toyoI have no idea how 25% of people would choose 4K. I do GFX for a living. I have a 4K Iiyama monitor. I don't play anything on it. I prefer my 144Hz 1080p monitor by far.
Less than 20% game at 4k. The rest are thinking about going 4k. And even among those 20% I'm pretty sure there are those that have shelled out the dough and are either not seeing much of a difference or are downright disappointed. But going back would be costlier, so they just stick with it.
And even then, we're not talking 20% of all people. Just 20% of the tech enthusiasts that frequent TPU.

I'm sure we'll all game at 4k eventually. But today, it's a bridge too far for me.
Posted on Reply
#92
Kamgusta
lexluthermiesterI don't think it'll take that long. 5 years at the longest..
STEAM hardware survey says 92% of all the gamers in the World (quite a lot of people, I would say) PLAY at resolutions NOT GREATER than 1080p.
You really think next year, or the year after that, those numbers will magically change?
1080p took 15 years to become "standard" and we STILL have 1/3 of ALL the gamers in the World playing at LOWER resolutions!!!
Posted on Reply
#93
lexluthermiester
Kamgusta1080p took 15 years to become "standard"
Oh, it did not. It took less than 5 years from the time the very first 1080p (not 1080i) LCD screens for PCs hit until gaming at that res was widely accepted and preferred.
KamgustaYou really think next year, or the year after that, those numbers will magically change?
Not magically, logically. As GPU tech continues to advance and progress, as it always does, 2160p will become easier to do, and as it becomes easier, people will see it as a favorable option. The same thing happened with 1080p. At first GPUs struggled to do 1080p gaming, but as the hardware advanced it became much more viable. 2160p is viable now if you're willing to buy premium parts and turn down or turn off a few settings. It's just not cost effective for the masses. However, that time is upon us, and the whole industry is transitioning to 2160p as a standard. It will be mainstream in 5 years or less.
Posted on Reply
#94
Kamgusta
lexluthermiesterOh, it did not. It took less than 5 years from the time the very first 1080p (not 1080i) LCD screens for PCs hit until gaming at that res was widely accepted and preferred.

Not magically, logically. As GPU tech continues to advance and progress, as it always does, 2160p will become easier to do, and as it becomes easier, people will see it as a favorable option. The same thing happened with 1080p. At first GPUs struggled to do 1080p gaming, but as the hardware advanced it became much more viable. 2160p is viable now if you're willing to buy premium parts and turn down or turn off a few settings. It's just not cost effective for the masses. However, that time is upon us, and the whole industry is transitioning to 2160p as a standard. It will be mainstream in 5 years or less.
First 1080p TV came out in 2005. PC monitors already had that resolution. First 1080p (CRT) monitor came out in 1995.
2018-1995=23 years have passed. And we still have people playing at 1280x720.
Man, we all have dreams. But money is the biggest issue.
As long as our equipment works (and 1080p is easily dealt with by $200 GTX 1060s), we won't jump on the 4K bandwagon.
No need to. Simple as that.
Posted on Reply
#95
lexluthermiester
KamgustaFirst 1080p (CRT) monitor came out in 1995.
Actually, 1994. But those were $12,000 monitors and not available to the common consumer market. 1080p as a common consumer item arrived in 2003 with a plasma TV (can't remember the brand). It had a VGA input that could take 1080p signals. The first consumer 1080p LCD was January 2004. By 2007, 1080p gaming was common (I personally was gaming on an ASUS 1920x1200 LCD without issue in 2007). Thus, less than 5 years. I do not include professional, high-priced displays because they were out of reach for most consumers and thus out of context for this discussion.
Kamgustawe won't jump on the 4K bandwagon.
Then get left behind. The rest of us will enjoy UHD gaming in the near future.
Posted on Reply
#96
kanecvr
Octavean~28” 4K capable monitors start at about ~$200 USD and some name brand 32” 4K monitors with FreeSync start at about ~$300 USD. This is a significant reduction from ~$1,500 to ~$3,000 4K Monitor prices which were typical a few years ago. There are more expensive 4K monitors to be sure but generally speaking the monitor prices are not the issue. The issue is the price necessary for 4K capable GPUs and the reality is that the upper end of the GPU market has always been expensive.
Not where I'm from, mate. Here, $200 barely gets you a budget 1080p 27" monitor. The cheapest 4K 27" display I could find right now goes for ~$360, and they scale up to (and over) $1,000 depending on diagonal, response time, aspect ratio and FreeSync / G-Sync. I did the math: $420 for a 4K monitor I'd buy, + $1,499 for an RTX 2080 Ti - for that money I can get a nice-looking 2001 BMW 320i with 200k kilometers on it (leather interior, heated seats, cruise control, 18" wheels) - or a 2000 Honda CB600 Hornet. The price of the 2080 Ti alone is MADNESS: ($1 = 4.17 lei)



(oldness incoming) I remember a time when $150-200 got you a good mid-range video card which let you play ANYTHING fluently and at a good resolution, and a high-end card which had spectacular performance was $400. That $400 is still my maximum budget for a video card. I got my GTX 1080 about this time last year for $360, and I'll stick with it until something 40-50% faster for under $400 comes out. Right now I'm gaming @ 2K (Dell U2713h), and I'm in no hurry to spend lots of money on 4K gear.
Posted on Reply
#97
Tomorrow

In blind tests, people noticed 240Hz 1080p being lower res in images and games, yet the difference between 1440p 144Hz vs 4K 60Hz was much harder to spot in still images. In games, 144Hz obviously felt smoother.
Posted on Reply
#98
InVasMani
Relative to 1080p 60Hz, higher-refresh-rate or higher-resolution displays, or a cross between the two in the case of 1440p with a higher refresh rate, require more CPU, GPU, or CPU + GPU power; none of that is free and clear.
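As a rough illustration of that claim, here is the raw pixel throughput each target asks the GPU to produce, with the refresh rate used as a stand-in for the target frame rate (figures are approximate):

# Pixels per second each target asks the GPU to produce (resolution x target fps)
modes = {
    "1080p @ 60 Hz":  (1920, 1080, 60),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 60 Hz":     (3840, 2160, 60),
}

baseline = 1920 * 1080 * 60
for name, (w, h, hz) in modes.items():
    px_per_sec = w * h * hz
    print(f"{name}: {px_per_sec / 1e6:6.0f} Mpx/s ({px_per_sec / baseline:.1f}x 1080p60)")

# 1080p @ 60 Hz:    124 Mpx/s (1.0x 1080p60)
# 1440p @ 144 Hz:   531 Mpx/s (4.3x 1080p60)
# 4K @ 60 Hz:       498 Mpx/s (4.0x 1080p60)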
stimpy88Shouldn't the headline have actually read...

75% of gamers don't game at 4K, and don't even plan on upgrading!

Yeah, that does read quite differently to me...
That's a lie; they'll upgrade sooner or later, it's simply not a priority, but most gamers aren't die-hard nerds debating this stuff on tech forums in the first place.
Posted on Reply