# FHD (1080p) vs UHD (4K) for gaming: Any downside to UHD besides price?



## Mworenstein (Dec 11, 2015)

Hey all,

I recently purchased a Lenovo Y700 (15.6-inch laptop) with a UHD screen for $100 more than the FHD model. I know this is a gaming laptop, but I assume the graphics card, a GTX 960M with 4GB, will not be able to run new games at UHD resolution.

So here is my question: if I set the display or games to 1080p, will it look and perform the same as the FHD screen would? Basically, I am just wondering if I lose anything by upgrading to the UHD screen. I figure it could be a win/win: 4K resolution when I want it outside of gaming, and 1080p mostly when gaming. I saw a video where someone said 1080p does not look great on a 4k monitor, even though it makes sense that it would, since 4 pixels on a 4k screen equal 1 pixel on a 1080p screen.


Thanks!

Matt


----------



## RCoon (Dec 11, 2015)

Theoretically, 1920x1080 on a 15.6-inch 1080p screen will look identical to 1920x1080 on a 15.6-inch 4K screen: since 4K is exactly double 1080p in each dimension, each 1080p pixel maps to a clean 2x2 block of 4K pixels. So yeah, you could run Windows at 4K and set your games to 1080p in the graphics settings and they'd look about the same as they would on the other screen.
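For the skeptical, the 2x2 mapping claim is easy to sanity-check with plain arithmetic (Python; nothing vendor-specific here, just the resolution numbers from the thread):

```python
# FHD-on-UHD scaling: check that each 1920x1080 pixel maps to an exact
# 2x2 block of 3840x2160 pixels, so no fractional interpolation is needed.
uhd = (3840, 2160)
fhd = (1920, 1080)

sx, sy = uhd[0] / fhd[0], uhd[1] / fhd[1]
print(sx, sy)  # 2.0 2.0 -> a whole-number scale on both axes
print(sx.is_integer() and sy.is_integer())  # True
```

Whether a given panel's scaler actually does clean pixel doubling rather than filtered upscaling is a separate question, which later posts in the thread get into.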

You are correct in assuming your GTX960M will not play most titles at 4K. You might get away with smaller indie titles, like MOBAs or whatever.


----------



## Mworenstein (Dec 11, 2015)

RCoon said:


> Theoretically, 1920x1080 on a 15.6-inch 1080p screen will look identical to 1920x1080 on a 15.6-inch 4K screen: since 4K is exactly double 1080p in each dimension, each 1080p pixel maps to a clean 2x2 block of 4K pixels. So yeah, you could run Windows at 4K and set your games to 1080p in the graphics settings and they'd look about the same as they would on the other screen.
> 
> You are correct in assuming your GTX960M will not play most titles at 4K. You might get away with smaller indie titles, like MOBAs or whatever.



Thanks for this.  I'm just wondering how this theoretical scenario plays out in the real world...


----------



## Zakin (Dec 11, 2015)

You'd only have problems if, for whatever reason, the monitor/TV happens to have a poor scaler outside of its native resolution. I'm not sure whether that's still a big issue nowadays. RCoon is basically spot on otherwise.


----------



## trog100 (Dec 11, 2015)

Is there much point to 4K on a 15-inch laptop screen?

trog


----------



## RCoon (Dec 11, 2015)

Mworenstein said:


> Thanks for this.  I'm just wondering how this theoretical scenario plays out in the real world...



Below is basically the only _potential_ issue. Provided the panel is a decent one (it's a Lenovo, so it should be), the scaler shouldn't have a problem.



Zakin said:


> if for whatever reason the monitor/TV happens to have a poor scaler outside of its native resolution





trog100 said:


> Is there much point to 4K on a 15-inch laptop screen?
> 
> trog


Windows 8/10 scaling is really awesome, even for multi-monitor setups with different resolutions. The only issue I foresee is that game UIs are going to be horrendously tiny unless the game features a UI scaling option (I've only played a few games that offer one, most of them Blizzard titles or MOBAs).


----------



## newtekie1 (Dec 11, 2015)

RCoon said:


> Windows 8/10 scaling is really awesome



If by "really awesome" you mean basically the same as Win7 and still breaks programs, then sure...

Until Microsoft completely redoes Windows' scaling, high-pixel-density screens just won't work well with it.


----------



## RCoon (Dec 11, 2015)

newtekie1 said:


> If by "really awesome" you mean basically the same as Win7 and still breaks programs, then sure...
> 
> Until Microsoft completely redoes Windows' scaling, high-pixel-density screens just won't work well with it.



It's way better than Windows 7's scaling (Windows 10's is, anyway), especially for multi-monitor setups: you can have individual scaling profiles per monitor. That said, it's a discussion for another thread.


----------



## qubit (Dec 11, 2015)

Mworenstein said:


> I saw a video where someone said 1080p does not look great on a 4k monitor, even though it makes sense that it would, since 4 pixels on a 4k screen equal 1 pixel on a 1080p screen.


I'm not surprised they said that. It should look like a normal 2K picture, but the 4K monitor is likely to apply interpolation to the 2K signal, which is what makes it look blurred and crappy. It's pretty stupid, really. If the driver supports GPU scaling, you'll be able to get perfect 2K on 4K: it essentially makes the graphics card do the scaling without any filtering and then transmit a 4K signal with the 2K content in it.


----------



## RejZoR (Dec 11, 2015)

Biggest downside for me, good luck finding a 4K 144Hz screen


----------



## qubit (Dec 11, 2015)

RejZoR said:


> Biggest downside for me, good luck finding a 4K 144Hz screen


They will be epic when they eventually hit the market.


----------



## RejZoR (Dec 11, 2015)

And then you need hardware that can run 4K at 144+ fps. Good luck with that as well


----------



## zithe (Dec 11, 2015)

RejZoR said:


> And then you need hardware that can run 4K at 144+ fps. Good luck with that as well


Cya in 10 years when 4k is passé!


----------



## qubit (Dec 31, 2015)

RejZoR said:


> And then you need hardware that can run 4K at 144+ fps. Good luck with that as well


I can't wait for the day that this becomes standard and I'd give it a year or two now. This will give me the equivalent framerate performance at 4K as I enjoy now on 2K.

For a top-end graphics card, that only means being roughly 3-4 times more powerful than it is now, because the resolution is 4 times bigger, and that's not an insurmountable mountain given the new technology going into forthcoming graphics cards.


----------



## Elkhawaga (Dec 31, 2015)

4K res is bullshit for gaming... it kills performance for nothing more than a little extra sharpness. 1080p is the ideal.


----------



## GreiverBlade (Dec 31, 2015)

qubit said:


> I can't wait for the day that this becomes standard and I'd give it a year or two now. This will give me the equivalent framerate performance at 4K as I enjoy now on 2K.


"2k" as in 1440p? because "1k" is 720p so 1080p is "1.5k" so "4k" 2160p is not "4k" as 2880p would be, it's "3k".... wait .... they are lying to us 

sorry ... morning here and didn't had my coffee ... correcting that right now


----------



## tabascosauz (Dec 31, 2015)

GreiverBlade said:


> "2k" as in 1440p? because "1k" is 720p so 1080p is "1.5k" so "4k" 2160p is not "4k" as 2880p would be, it's "3k".... wait .... they are lying to us
> 
> sorry ... morning here and didn't had my coffee ... correcting that right now



I think 2K refers to 1080p in this case, going by horizontal resolution alone: 1920 is not far from 2000 (2K), while 3840 is double that and not far off from 4000 (and the TV 4K standard surpasses 4000 horizontal pixels). Though in the usual sense, 4K should be 4 times 1080p in pixels (twice horizontal and twice vertical, 2x2=4).

720p is to 1440p as 1080p is to 4K. They should really be thought of as two separate camps.

1280 x 720
times 2 either way is
2560 x 1440
A total increase of 4x pixels (which is why 1440p is called QHD, Quad HD: 4x HD)

1920 x 1080
Times 2 each way is
3840 x 2160
A total increase of 4x pixels


Linearly (on each axis), HD is 66.67% of FHD, and QHD is 66.67% of UHD.
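The arithmetic above checks out; here's a quick verification in Python (plain arithmetic on the resolutions already listed, no outside assumptions):

```python
# Verify: QHD is 4x HD in pixel count, UHD is 4x FHD, and the "66.67%"
# figures are per-axis (linear) ratios, not pixel-count ratios.
hd, fhd = (1280, 720), (1920, 1080)
qhd, uhd = (2560, 1440), (3840, 2160)

def pixels(res):
    w, h = res
    return w * h

print(pixels(qhd) / pixels(hd))   # 4.0 (why 1440p is "Quad HD")
print(pixels(uhd) / pixels(fhd))  # 4.0
print(hd[0] / fhd[0])             # 0.666... per axis (linear)
print(pixels(hd) / pixels(fhd))   # 0.444... by total pixel count
```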

Anyway, I've come to the conclusion that there are only three applications for UHD in a monitor. First is photographers, who get to edit their photos closer to native resolution. Second is the natural progression of resolution in 27-28" displays, where the trend has been 1080p to 1440p. Last is maintaining pixel density in 30"+ monitors, where 1440p doesn't cut it. For gaming, 4K is an unnecessary cut to performance without any real benefit. Multi-monitor setups arguably give the player a slight boost to situational awareness in select games; 4K does not.


----------



## GreiverBlade (Dec 31, 2015)

tabascosauz said:


> I think 2K refers to 1080p in this case, going by horizontal resolution alone: 1920 is not far from 2000 (2K), while 3840 is double that and not far off from 4000 (and the TV 4K standard surpasses 4000 horizontal pixels). Though in the usual sense, 4K should be 4 times 1080p in pixels (twice horizontal and twice vertical, 2x2=4).
> 
> 720p is to 1440p as 1080p is to 4K. They should really be thought of as two separate camps.
> 
> ...


i always forgot the horizontal ... 
math fails me .... i need a 2nd coffee


----------



## qubit (Dec 31, 2015)

GreiverBlade said:


> "2k" as in 1440p? because "1k" is 720p so 1080p is "1.5k" so "4k" 2160p is not "4k" as 2880p would be, it's "3k".... wait .... they are lying to us
> 
> sorry ... morning here and didn't had my coffee ... correcting that right now





GreiverBlade said:


> i always forgot the horizontal ...
> math fails me .... i need a 2nd coffee


lol you definitely needed that second coffee. 

tabasco is right; I was thinking of 2K as 1080p and 4K as 2160p. A 980 Ti can already drive a 4K display at good framerates on its own if the game isn't too demanding (older titles help) and the details are turned down a bit, hence I figured a 3-4x boost in performance over that would be sufficient for a solid 144Hz at 4K with a suitable future monitor. And those monitors are definitely coming: I've seen that the latest 4K standard includes 4K @ 120Hz, on TPU or Wikipedia.


----------



## EarthDog (Dec 31, 2015)

2K = 2048x1080
1080p = 1920x1080
The two are NOT interchangeable. However...
720p = 1280x720... but on TVs it is (was) typically 1366x768. There isn't another term, so the two ARE interchangeable in this case.


1440p = 2560x1440 or WQHD
4K = 4096x2160
UHD-1 = 3840x2160 (TV "4K" - notice how they all say UHD-1 for that res, for clarity and to prevent false advertising?)
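One concrete way to see why the cinema and TV labels aren't interchangeable is to compare aspect ratios (Python; resolutions exactly as listed above, with "DCI" naming assumed for the cinema 2K/4K standards):

```python
from fractions import Fraction

# The cinema (DCI) standards are slightly wider than 16:9, unlike the
# consumer standards, so "2K" really is a different shape than 1080p.
standards = {
    "1080p (FHD)": (1920, 1080),
    "WQHD": (2560, 1440),
    "DCI 2K": (2048, 1080),
    "DCI 4K": (4096, 2160),
    "UHD-1": (3840, 2160),
}
for name, (w, h) in standards.items():
    print(name, Fraction(w, h))  # 16/9 for the consumer ones, 256/135 for DCI
```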

https://upload.wikimedia.org/wikipe...s8.svg/1280px-Vector_Video_Standards8.svg.png

What it is called in a TV is typically different from the PAL/NTSC. Some are interchangeable, others are not.

... sorry, it's a pet peeve of mine to get these right, as most seem to brick on it (and I am probably being too literal... but hey, a spade is not a hoe, but they are both garden tools that move dirt!)


----------



## qubit (Dec 31, 2015)

EarthDog said:


> 2K = 2048x1080
> 1080p = 1920x1080
> The two are NOT interchangeable. However...
> 720p = 1280x720... but on TVs it is (was) typically 1366x768. There isn't another term, so the two ARE interchangeable in this case.
> ...



Yes, those are indeed the official standards. The thing is, everybody gets these names "wrong", and most people likely aren't even aware of the correct ones, so you'd have to correct every instance of "4K" being used for 3840x2160 instead of the rather awkward UHD-1, for example. Technically we shouldn't be using 4K for 3840x2160 either, but we do, and there's no issue or confusion with it.

The same extends to 2K for 1920x1080. One can just put it down to a colloquialism*, so in the context of our PCs there's no ambiguity with these informal references, especially as I've never seen an actual 2K (2048x1080) PC monitor, and if they exist they must be very rare and expensive. I hope that scratches that pet peeve itch a little bit?
lol, I love your last line! 

Finally, do you have the link to the article that diagram came from please, because there's no way to get back to it from there?

*I had to actually google it for the correct spelling.


----------



## EarthDog (Dec 31, 2015)

Yeah, I guess I just prefer to get it right. And there IS ambiguity with these informal references. People are simply lazy and drop the "UHD" from "4K", but if you look at all the specifications and labeling on the box, it's all "4K UHD".

2K for 1920x1080 is a recent forum thing and is wrong, period. It's not a colloquialism... it's just wrong, but people still use it. So there is built-in ambiguity unless you say things properly, as they are defined. Again, a spade is not a hoe!

Link: https://en.wikipedia.org/wiki/Display_resolution


----------



## qubit (Dec 31, 2015)

Well, it's the context of PCs when discussed in tech forums like this which really removes the ambiguity as I've said above, even though these definitions aren't strictly correct and you'd have to start picking up on all of them to be consistent. There's not really any more I can add to that. I can see why it might bug you though, lol. I generally like correctness too and try to define things properly.

One of my pet peeves is users running their monitors at a non-native resolution, leading to a truly awful picture with far less desktop real estate than native. Of course, these users are invariably the clueless ones who will always _absolutely insist_ on running their monitors that way. I tried to get the PCs at work locked down to native resolution by corporate policy, but was thwarted for "health and safety" reasons, as the users "couldn't see the screen properly" otherwise. Fuckin' hell.

Thanks for the link. Bookmarked.


----------



## xorbe (Dec 31, 2015)

2560x1440 for 144/165 Hz.  4K == slide show.


----------



## EarthDog (Dec 31, 2015)

qubit said:


> Well, it's the context of PCs when discussed in tech forums like this which really removes the ambiguity as I've said above, even though these definitions aren't strictly correct and you'd have to start picking up on all of them to be consistent. There's not really any more I can add to that. I can see why it might bug you though, lol. I generally like correctness too and try to define things properly.


I'm doing my part... LOL!


----------



## DarthBaggins (Dec 31, 2015)

This is why I'm happy with my 1080p monitors, but I want to step up to 144Hz versions (looking at Asus's 24" ProArt series, since I do a lot of photography work).


----------



## Gin-year-bread (Jul 4, 2016)

GreiverBlade said:


> "2k" as in 1440p? because "1k" is 720p so 1080p is "1.5k" so "4k" 2160p is not "4k" as 2880p would be, it's "3k".... wait .... they are lying to us
> 
> sorry ... morning here and didn't had my coffee ... correcting that right now



Not even close.

In a 16:9 aspect ratio, 2K is 2048x1152, so naturally twice that is 4K (4096x2304), and 5K is 5120x2880, meaning 1440p would be the equivalent of "2.5K". The "1.5K" you describe would be 1536x864.
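All of those figures follow from the same width-to-height rule for 16:9 frames; a quick check (Python, trivial arithmetic):

```python
# For a 16:9 frame, height = width * 9/16. Reproduces the figures above.
def height_16_9(width):
    return width * 9 // 16  # exact for widths divisible by 16

for w in (1536, 2048, 2560, 4096, 5120):
    print(f"{w} x {height_16_9(w)}")
# 1536 x 864, 2048 x 1152, 2560 x 1440, 4096 x 2304, 5120 x 2880
```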

Apple sells both 4K (21.5-inch) and 5K (27-inch) iMacs. They are pretty fantastic.


----------



## redeye (Jul 4, 2016)

So I have both 1440p and 2160p monitors (Acer G-Sync 270, 280). 4K is nice on GTX 980 SLI... but the frame rate is limited to 60fps. I can get a locked 60fps, but something looks off... the colors, I don't know.

(IMO, wait for a 4K monitor that can do higher than 60fps, maybe 120fps.)


What is REALLY AWESOME is 5K DSR to 1440p (a 4x DSR factor). (Yes, DSR + SLI + G-Sync finally works on GTX 980 SLI!) The problem is that pushing 5K worth of pixels is TOUGH. In GTA V you can get over 60fps using high shadows and normal post-processing; no "uber" settings, and no AA or FXAA.

Overwatch looks really nice at 5K DSR'd to 1440p.

So while a 4K monitor is 1337... DSR looks better; the jaggies just disappear.

So you really do not need 4K, just a graphics card with enough "GRUNT" to do effective DSR (GTX 1080, GTX 980 SLI, etc.).



TL;DR: to answer your question, use a 1080p monitor, but with 4K DSR'd down to 1080p. The "free" antialiasing is, IMO, far better than any AA mode. So go GTX 1080 + DSR + a 1080p monitor; the visuals are equal to, or very slightly less clear than, native 4K. Also, with a 1080p monitor you can switch to a higher frame rate by running plain 1080p instead of 4K DSR, whereas on a 4K monitor you are "stuck" at 60fps. (Yes, NEC makes a 4K60p/1080p120 monitor, but it is 3K dollars and aimed at pro color apps.)

The downside to 4K monitors: they are limited to 60fps. When 120fps 4K monitors appear, it will be a tough choice.
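As a footnote on why DSR's antialiasing feels "free": rendering above the display resolution and then averaging blocks of pixels back down softens jagged edges. A toy sketch with a plain box filter (Python/NumPy; NVIDIA's actual DSR resampling filter is different and configurable, so this is only an illustration of the idea):

```python
import numpy as np

def downsample_2x(img):
    """Average each 2x2 block: a crude stand-in for DSR's downsample."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# A hard diagonal edge rendered at the higher resolution...
hi = np.zeros((4, 4))
hi[np.triu_indices(4)] = 1.0

# ...comes out as graduated grey steps at the target resolution: the
# stairstep "jaggies" are traded for smooth intermediate values.
lo = downsample_2x(hi)
print(lo)  # edge pixels land between 0 and 1 instead of jumping
```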


----------

