
FHD(1080p) vs UHD(4k) for gaming: Any downside to UHD besides price?

Mworenstein

Hey all,

I recently purchased a Lenovo Y700 (15.6-inch laptop) with the UHD screen for $100 more than the FHD model. I know this is a gaming laptop, but I assume the graphics card will not be able to run new games at UHD resolution. The graphics card is a GTX 960M with 4GB.

So here is my question: if I set the display or games to 1080p, will it look and perform the same as it would on the FHD screen? Basically, I'm just wondering if I lose anything by upgrading to the UHD screen. I figure it could be a win/win: 4K resolution when I want it outside of gaming, and 1080p for most gaming. I saw a video where someone said 1080p does not look great on a 4K monitor, even though it seems like it should, since 4 pixels on a 4K screen map exactly to 1 pixel on a 1080p screen.


Thanks!

Matt
 
Theoretically, 1920x1080 on a 15.6-inch 1080p screen will look identical to 1920x1080 on a 15.6-inch 4K screen. Technically they'd be drawing the same image at the same pixel density, so yeah, you could run Windows at 4K and set your games to 1080p in the graphics settings and they'd look about the same as they would on the other screen.

You are correct in assuming your GTX 960M will not play most titles at 4K. You might get away with smaller indie titles, like MOBAs or whatever.
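
If you want to sanity-check the pixel math, here's a rough Python sketch; 15.6 inches is the Y700's panel diagonal, and the PPI figures are back-of-the-envelope:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a panel, from its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1920, 1080, 15.6)))  # ~141 PPI for the FHD panel
print(round(ppi(3840, 2160, 15.6)))  # ~282 PPI for the UHD panel

# 1080p on the UHD panel is an exact 2x integer factor in both axes,
# so each 1080p pixel maps to a clean 2x2 block of physical pixels:
print(3840 / 1920, 2160 / 1080)  # 2.0 2.0
```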
 

Thanks for this. I'm just wondering how this theoretical scenario plays out in the real world...
 
You'd only have problems if, for whatever reason, the monitor/TV happens to have a poor scaler for resolutions outside of its native one. I'm not aware of whether that is still a big issue nowadays. RCoon is basically spot on otherwise.
 
Is there much point in 4K on a 15-inch laptop screen?

trog
 
Thanks for this. I'm just wondering how this theoretical scenario plays out in the real world...

Below is basically the only potential issue. Provided the panel is a decent one (it's Lenovo, it should be), the scaler shouldn't have a problem.

if, for whatever reason, the monitor/TV happens to have a poor scaler for resolutions outside of its native one

Is there much point in 4K on a 15-inch laptop screen?

trog
Windows 8/10 scaling is really awesome, even for multi-monitor setups at different resolutions. The only issue I foresee is that game UIs are going to be horrendously tiny unless the game features a UI scaling option (I've only played a few games which feature these, most of them being Blizzard titles or MOBAs).
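
To put rough numbers on how tiny that gets, here's a quick sketch; the PPI figures are approximate for a 15.6-inch panel, and the 24-pixel element is just an assumed example size, not taken from any particular game:

```python
# Physical size of a fixed 24-pixel UI element at each pixel density.
for label, density in [("FHD, ~141 PPI", 141.2), ("UHD, ~282 PPI", 282.4)]:
    inches = 24 / density
    print(f"{label}: 24 px = {inches:.3f} in ({inches * 25.4:.1f} mm)")
# The same UI is physically half the size on the UHD panel, which is
# why a game needs a UI scaling option to stay readable.
```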
 
Windows 8/10 scaling is really awesome

If by "really awesome" you mean basically the same as Win7 and still breaks programs, then sure...

Until Windows completely redoes its scaling, high pixel density screens just won't work on it.
 
If by "really awesome" you mean basically the same as Win7 and still breaks programs, then sure...

Until Windows completely redoes its scaling, high pixel density screens just won't work on it.

It's way better than Windows 7's scaling (Windows 10 is anyway), especially for multi-monitor setups. You can have individual scaling profiles per monitor. That said, it's a discussion for another topic/thread.
 
I saw a video where someone said 1080p does not look great on a 4K monitor, even though it seems like it should, since 4 pixels on a 4K screen map exactly to 1 pixel on a 1080p screen.
I'm not surprised they said that. It should look like a normal 2K picture, but the 4K monitor is likely to apply interpolation/antialiasing to the 2K signal, which is what makes it look blurred and crappy. It's pretty stupid really. If the driver supports it, then you'll be able to get perfect 2K on 4K: it essentially makes the graphics card do the scaling without AA and then transmit a 4K signal with the 2K content in it.
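
For illustration only, here's a toy numpy sketch of why an interpolating scaler blurs where integer (nearest-neighbour) scaling doesn't; the 1-D "scanline" and the linear interpolation are my stand-ins, not how any actual driver or monitor implements it:

```python
import numpy as np

# A tiny 1-D "scanline" standing in for a 1080p row: a hard black/white edge.
row_1080 = np.array([0.0, 0.0, 1.0, 1.0])

# Integer (nearest-neighbour) 2x scaling: each source pixel just becomes
# two identical pixels, so the edge stays perfectly sharp.
print(np.repeat(row_1080, 2))          # [0. 0. 0. 0. 1. 1. 1. 1.]

# An interpolating scaler invents in-between values at the edge instead,
# which is exactly the blur people complain about at non-native res.
x_src = np.arange(4)
x_dst = np.linspace(0, 3, 8)
print(np.interp(x_dst, x_src, row_1080).round(2))
# -> intermediate greys like 0.29 and 0.71 where the sharp edge used to be
```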
 
Biggest downside for me, good luck finding a 4K 144Hz screen :P
 
Biggest downside for me, good luck finding a 4K 144Hz screen :p
They will be epic when they eventually hit the market.
 
And then you need hardware that can run 4K at 144+ fps :P Good luck with that as well :P
 
And then you need hardware that can run 4K at 144+ fps :p Good luck with that as well :p
Cya in 10 years when 4k is passé! :laugh:
 
And then you need hardware that can run 4K at 144+ fps :p Good luck with that as well :p
I can't wait for the day that this becomes standard, and I'd give it a year or two now. This will give me the equivalent framerate performance at 4K to what I enjoy now at 2K.

For a top-end graphics card, it only needs to be roughly 3-4 times more powerful than it is now, because the resolution is four times bigger, and that's not an insurmountable mountain given the new technology going into forthcoming graphics cards.
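
Here's the rough arithmetic behind that estimate, for anyone who wants to check it (a sketch only; pixel count is a crude proxy for GPU load):

```python
fhd = 1920 * 1080   # 2,073,600 pixels per frame
uhd = 3840 * 2160   # 8,294,400 pixels per frame
print(uhd / fhd)    # 4.0 -> exactly four times the pixels per frame

# So, as a first approximation, matching your current frame rate at 4K
# needs ~4x the fill rate; real-world scaling is usually a bit better
# than linear in pixel count, hence the "roughly 3-4 times" estimate.
```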
 
4K res is bullshit for gaming... it kills performance for nothing more than a little extra sharpness. 1080p is the sweet spot.
 
I can't wait for the day that this becomes standard, and I'd give it a year or two now. This will give me the equivalent framerate performance at 4K to what I enjoy now at 2K.
"2k" as in 1440p? because "1k" is 720p so 1080p is "1.5k" so "4k" 2160p is not "4k" as 2880p would be, it's "3k".... wait .... they are lying to us

sorry ... it's morning here and I hadn't had my coffee ... correcting that right now
 
GreiverBlade said:
"2k" as in 1440p? because "1k" is 720p so 1080p is "1.5k" so "4k" 2160p is not "4k" as 2880p would be, it's "3k".... wait .... they are lying to us

sorry ... it's morning here and I hadn't had my coffee ... correcting that right now

I think 2K refers to 1080p in this case, going by horizontal resolution alone. 1920 is not far from 2000 (2K), and 3840 is double that and not far off 4000 (and the cinema 4K standard, at 4096, actually surpasses 4000 horizontal pixels). Though in the usual sense, 4K should be 4 times 1080p in pixels (twice horizontal and twice vertical, 2x2=4).

720p is to 1440p as 1080p is to 4K. They should really be thought of as two separate camps.

1280 x 720
doubled in each dimension is
2560 x 1440
A total increase of 4x pixels (which is why 1440p is called QHD, Quad HD, 4 x HD)

1920 x 1080
doubled in each dimension is
3840 x 2160
A total increase of 4x pixels

HD is 66.67% of FHD, and QHD is 66.67% of UHD, in each linear dimension.

Anyway, I've come to the conclusion that there are only three applications for UHD in a monitor. First is for photographers, who get to edit their photos at closer to native resolution. Second is the natural progression of resolution in 27-28" displays, where the trend has been 1080p to 1440p. Last is maintaining pixel density in 30"+ monitors, where 1440p doesn't cut it. For gaming, 4K is an unnecessary cut to performance without any benefit. Multiple monitors arguably give the player a slight boost to situational awareness in select games; 4K does not.
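
If you want to check those doubling relationships, here's a quick Python sketch that just reproduces the arithmetic above:

```python
# The two "doubling families" from the post, checked numerically.
for name, (w1, h1), (w2, h2) in [
    ("HD  -> QHD", (1280, 720), (2560, 1440)),
    ("FHD -> UHD", (1920, 1080), (3840, 2160)),
]:
    print(f"{name}: {w2 // w1}x per axis = {(w2 * h2) // (w1 * h1)}x the pixels")

# And the cross-family ratio mentioned above:
print(f"HD/FHD width:  {1280 / 1920:.2%}")   # 66.67%
print(f"QHD/UHD width: {2560 / 3840:.2%}")   # 66.67%
```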
 
tabasco said:
I think 2K refers to 1080p in this case, going by horizontal resolution alone. [...]
i always forgot the horizontal ... :oops:
math fails me .... i need a 2nd coffee :laugh:
 
"2k" as in 1440p? because "1k" is 720p so 1080p is "1.5k" so "4k" 2160p is not "4k" as 2880p would be, it's "3k".... wait .... they are lying to us

sorry ... morning here and didn't had my coffee ... correcting that right now
i always forgot the horizontal ... :oops:
math fails me .... i need a 2nd coffee :laugh:
lol, you definitely needed that second coffee. :)

tabasco is right, I was thinking of 2K as in 1080p and 4K as in 2160p. A 980 Ti can already drive a 4K display at good framerates on its own if the game isn't too demanding (an older title helps) and the details are turned down a bit, hence I figured a 3-4x boost in performance over that would be enough for a solid 144Hz at 4K with a suitable future monitor. And those monitors are definitely coming: I've seen coverage on TPU or Wikipedia of the latest display standards, which include 4K @ 120Hz.
 
2K = 2048x1080
1080p = 1920x1080
The two are NOT interchangeable. However...
720p = 1280x720... but with TVs it is (was) typically 1366x768. There isn't another term, so the two ARE interchangeable in this case.

1440p = 2560x1440, or WQHD
4K = 4096x2160
UHD-1 = 3840x2160 (TVs' "4K" - notice how they all say UHD-1 for that res, for clarity and to prevent false advertising?)

https://upload.wikimedia.org/wikipe...s8.svg/1280px-Vector_Video_Standards8.svg.png

What it is called on a TV is typically different from the PAL/NTSC standards. Some are interchangeable, others are not.

... sorry, pet peeve to get these right as most seem to brick on it (and I am probably being too literal... but hey, a spade is not a hoe, even though they are both garden tools that move dirt! :))
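
Just to collect those in one place, here's a small lookup-table sketch (labelling 2048x1080 and 4096x2160 as DCI, since those figures come from the DCI cinema standards):

```python
# The named standards from the post, as a small lookup table.
STANDARDS = {
    "2K (DCI)":     (2048, 1080),
    "1080p / FHD":  (1920, 1080),
    "720p / HD":    (1280, 720),
    "1440p / WQHD": (2560, 1440),
    "4K (DCI)":     (4096, 2160),
    "UHD-1":        (3840, 2160),
}
for name, (w, h) in STANDARDS.items():
    print(f"{name:13s} {w} x {h}  (aspect {w / h:.2f}:1)")
# Note the DCI 2K/4K cinema formats are slightly wider (1.90:1) than the
# 16:9 (1.78:1) consumer formats -- one more reason they're not interchangeable.
```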
 

Yes, those are indeed the official standards. Thing is, everybody gets it "wrong" for all these resolutions, and most people likely aren't even aware of the correct names for them, so you'd have to correct every instance of "4K" (3840x2160) not being called the rather awkward UHD-1, for example. We therefore technically shouldn't be using 4K for 3840x2160 either, but we do, and there's no issue or confusion with it.

In the same way, it can be extended to 1920x1080 for 2K. One can just put it down to a colloquialism*, so in the context of our PCs there's no ambiguity with these informal references, especially as I've never seen an actual "2K" 2048x1080 PC monitor, and if they exist they must be very rare and expensive. I hope that scratches the pet peeve itch a little bit? :)
lol, I love your last line! :toast:

Finally, do you have the link to the article that diagram came from, please? There's no way to get back to it from the image alone.

*I had to actually google it for the correct spelling. :p
 
Yeah, I guess I just prefer to get it right. And there IS ambiguity with these informal references. People are just lazy and drop the "UHD" off of 4K, but if you look at all the specifications and labeling on the box, it's all 4K UHD.

2K for 1920x1080 is a recent forum thing and is wrong, period. It's not a colloquialism... it's just wrong, but people still use it. So there is built-in ambiguity unless you say things properly, as they are defined. Again, a spade is not a hoe! :p

Link: https://en.wikipedia.org/wiki/Display_resolution
 
Well, it's the context of PCs, when discussed on tech forums like this, which really removes the ambiguity, as I've said above, even though these definitions aren't strictly correct and you'd have to start picking up on all of them to be consistent. There's not really any more I can add to that. I can see why it might bug you though, lol. I generally like correctness too and try to define things properly.

One of my pet peeves is users running their monitors at a non-native resolution, leading to a truly awful picture with far less desktop real estate than native. Of course, these users are invariably the clueless ones who will always absolutely insist on running their monitors that way. I tried to get the PCs at work locked down to native resolution by corporate policy, but was thwarted for "health and safety" reasons, as the users "couldn't see the screen properly" otherwise. Fuckin' hell. :shadedshu:

Thanks for the link. Bookmarked. :)
 
2560x1440 for 144/165 Hz. 4K == slide show.
 
[...] I generally like correctness too and try to define things properly.
I'm doing my part... LOL!
 