# TV as computer monitor? 1080p/1080i?



## hat (Nov 25, 2014)

I used a TV as a computer monitor for the longest time, both with the VGA and later on with the HDMI input. It looked fine--at 1440x900, which was the native resolution reported in Windows for the TV (even though it was a 1080i screen... I could select 1080i but it looked like ass).

Now I have a different TV for testing purposes. It's a 1080p screen, and in Windows it shows the native resolution as 1920x1080... but it looks like ass. It looks kinda overbright, text doesn't look right, etc... but if I tune it down to 1366x768, it looks okay (I only thought of 1366x768 as it seems to be the standard resolution to run on a TV from a computer).

Is there something up with the TV? Maybe it's just not a good quality screen, or not designed to display an image from a PC? How do I choose a good TV to use with a PC? Or is it best to just use a monitor anyway?

Side question: all this 1080p/1080i nonsense. I understand p is for progressive and i is for interlaced and it refers to how the video is recorded or somesuch. Doesn't really seem to apply to computers, or even a game console for that matter (when it comes to video games, anyway). It seems to have mostly to do with TV and movies and that sort of thing. So then, is there truly any such thing as a 1080p/1080i computer monitor?


----------



## AsRock (Nov 25, 2014)

There should be options on the TV to select different formats like Full HD, 4:3, native and such, although this is from a Toshiba TV, so chances are yours will be called something else.

Other than that, I can't really help due to the lack of information given (like what TV you have).

And trust me, you don't want to use interlaced. As I understand it, each pass draws one line of the picture with a black line to follow, then another line, so it's kinda like having half the picture missing..


----------



## Jstn7477 (Nov 25, 2014)

What is the exact model of the TV, and are you using VGA or HDMI? Also, since you have a Radeon, did you change the display underscan to 0% in CCC if using HDMI? (Why AMD is dumb with HDTVs on HDMI, I never found out, and missing EDID data shouldn't mean "underscan the f*** out of the TV".) Likewise, don't use the TV to overscan the underscanned output from the Radeon if it's doing that.


----------



## AsRock (Nov 25, 2014)

Jstn7477 said:


> What is the exact model of the TV, and are you using VGA or HDMI? Also, since you have a Radeon, did you change the display underscan to 0% in CCC if using HDMI? (Why AMD is dumb with HDTVs on HDMI, I never found out, and missing EDID data shouldn't mean "underscan the f*** out of the TV".) Likewise, don't use the TV to overscan the underscanned output from the Radeon if it's doing that.



Funny you say that, as my overscan requirement went away when I changed AVs, lol. As far as I can tell, with the newer Yamaha AVs the computer sees the TV, not the AV, so Windows shows not what AV you're using but what TV you're using.


----------



## newtekie1 (Nov 25, 2014)

hat said:


> Is there something up with the TV? Maybe it's just not a good quality screen, or not designed to display an image from a PC? How do I choose a good TV that I intend to use with a PC? Or is it best to just use a monitor anyway?



In my experience a lot of TVs do a shitload of stupid post-processing to the image, and it can really make things look shitty when using the TV as a monitor. However, there is almost always a mode in the TV menu that disables all that crap. If not, then you probably have to adjust the image settings and turn the different picture effects off manually.



hat said:


> Side question: all this 1080p/1080i nonsense. I understand p is for progressive and i is for interlaced and it refers to how the video is recorded or somesuch. Doesn't really seem to apply to computers, or even a game console for that matter (when it comes to video games, anyway). It seems to have mostly to do with TV and movies and that sort of thing. So then, is there truly any such thing as a 1080p/1080i computer monitor?



From what I've gathered, the 1080i label is/was misused by TV manufacturers. Back with tube TVs, 480i meant the TV did 640x480 interlaced, so half the horizontal lines weren't used at any one time. With 480p, the TV did true 640x480 with all the horizontal lines being used. You'd think the same would apply to 1080i/p, the only difference being the resolution. But no. Instead, it seems like TV manufacturers used 1080i for any TV that had a native resolution higher than 720p but lower than 1080p. Most commonly, TVs that used a 1366x768 panel were labeled as 1080i. They would accept a 1080i signal, but downscale it to the lower resolution. I guess they figured this was OK since they were also upscaling the 720p signal, and a lot of times a 1080i signal would look better on the TV.


Signal wise, 1080i is 1920x1080 interlaced like you'd expect.
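To make the interlacing idea concrete, here's a rough Python sketch of how an interlaced signal splits a frame into two fields and how a deinterlacer can "weave" them back together (illustrative only; a real TV does this in hardware, with far more sophistication):

```python
# Illustrative sketch: a 1080i signal never sends a whole frame at once.
# One field carries the odd scanlines, the other the even scanlines, and
# the display recombines ("weaves") them into a full frame.

def split_into_fields(frame):
    """Split a progressive frame (list of scanlines) into two interlaced fields."""
    top_field = frame[0::2]      # lines 0, 2, 4, ...
    bottom_field = frame[1::2]   # lines 1, 3, 5, ...
    return top_field, bottom_field

def weave(top_field, bottom_field):
    """Recombine two fields into a full progressive frame."""
    frame = []
    for top, bottom in zip(top_field, bottom_field):
        frame.append(top)
        frame.append(bottom)
    return frame

# A toy 6-line "frame"; a real 1080i frame would have 1080 scanlines.
frame = [f"line {n}" for n in range(6)]
top, bottom = split_into_fields(frame)
assert weave(top, bottom) == frame  # weaving restores the original frame
```

In practice the two fields are captured at different moments in time, which is why weaving moving video produces combing artifacts and why TVs apply fancier deinterlacing than this.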

Any TV/Monitor that uses a native 1080p panel is truly a 1080p monitor/TV.  There aren't, AFAIK, any truly 1080i monitors or TVs.  Any TV labelled FullHD should be a native 1080p panel.


----------



## hat (Nov 25, 2014)

It's a Seiki SE24FE01-W. There is a problem with a damaged part of the screen anyway, so we're returning it. Probably just gonna pick up an actual PC monitor, as that's what it was gonna be used as anyway.

There were some options similar to what AsRock said that made the picture appear bigger or smaller. I was using HDMI; not sure about overscan or underscan in CCC... I did look at it, but I don't remember what it was set to. There was a VGA port on the TV though... Maybe this TV isn't really designed to be used as a PC monitor, at least over HDMI.

I don't get why I couldn't just select the native res of 1920x1080 and go, like I did with my TV at 1440x900...

I take it PC monitors are better suited to displaying an image from a PC than a TV anyways...


----------



## kn00tcn (Nov 25, 2014)

This thread hurts my brain.

Now consider this: if the TVs were ruining the PC image... don't you think they're also ruining the console/TV/Blu-ray images as well? Why aren't we rioting in the streets over all the BS that happens with TVs and HDMI?

There is no way 1366x768 can look 'okay'. If the TV has 1920x1080 pixels, you'd better send it a 1920x1080 signal (p, of course); anything non-native is a blur, like on every single fixed-pixel monitor in existence other than CRTs.
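As a back-of-the-envelope illustration of why non-native is blurry, this little Python sketch (hypothetical, just arithmetic) counts how many of a 1080p panel's horizontal pixels line up exactly with a 1366-wide source:

```python
# Back-of-the-envelope sketch: scaling a 1366-pixel-wide image onto a
# 1920-pixel panel. Almost no destination pixel lands exactly on a source
# pixel, so nearly every pixel gets an interpolated (blurred) value.

def source_position(dst_x, src_width, dst_width):
    """Where destination pixel dst_x samples from in the source image."""
    return dst_x * src_width / dst_width

src, dst = 1366, 1920
fractional = sum(1 for x in range(dst) if source_position(x, src, dst) % 1 != 0)
print(f"{fractional} of {dst} destination pixels need interpolation")
# → 1918 of 1920 destination pixels need interpolation
```

Only pixels 0 and 960 map cleanly (since gcd(1366, 1920) = 2); everything else is a blend of two source pixels.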

Now... aside from panel types like TN/MVA/IPS, or screen types like LCD/plasma/etc., what you as a user have to do is calibrate as best as you can:

1. Disable overscan or underscan.
2. Disable dynamic anything, and frivolous modes like 'game mode' or 'movie mode'.
3. Start the sharpness setting at ZERO, then adjust.
4. Make sure the full native resolution (p, of course, for progressive) is being sent.
5. Finally, adjust the brightness/contrast/color settings on the TV until they're decent, using reference images like http://www.lagom.nl/lcd-test/

If the TV, graphics card, or console has the option, compare RGB full vs RGB limited (a PS3 does, for example); try to see as many colors/whites/blacks as you can in Lagom's test images.

Now... after all of that tweaking... it may never be good enough. Like a crappy Insignia I've used: somehow it's still slightly blurry, and of course a lot of the whites and blacks are crushed, the gamma is way off, it's just plain irritating...

I certainly would love to have a GOOD TV... such as an IPS panel without garbage settings that can't be disabled.

But at the end of the day, what exactly does anyone need a TV for? Speakers? ATSC tuner? Coax input? Inches? Several HDMI inputs? There's nothing really proprietary about them; a TV just puts them all in a single unit, at the expense of terrible firmware most of the time.


----------



## Jstn7477 (Nov 25, 2014)

I used a 2009 Insignia advanced 22" 1080p TV for a couple of years, and had interesting HDMI problems myself. After undoing the AMD underscan crap with my HD 5770, the picture was okay, but any red colored text was super blurry. I was so happy when I purchased my ASUS VG236H monitor, as the text was a lot cleaner along with everything else on DVI, but the HDMI is still crappy even to this day with Radeons at least (had it on an R9 280X recently and all the colors were washed out and the picture was grainy). I'm trying the monitor with my laptop at the moment on HDMI (controlled by Intel HD 4600) and it actually looks fine with that, so it's probably Radeon HDMI crap going on mostly, along with potential TV processing crap too.


----------



## a_ump (Nov 25, 2014)

I too use my TV (40", 1080p) as my monitor. I have only experienced one consistent issue, and that's with text: black text seems to have a reddish hue here, a bluish hue there, etc. Idk if it has to do with the PPI or not, but the only thing I could come up with is that the pixels are so much larger than 1080p on a 22-27" screen that the eye is able to differentiate minute differences in the 3 colors that make up a pixel?..... lol, pure pulled-outta-my-arse logic, but it's the best I have.


----------



## kn00tcn (Nov 25, 2014)

Jstn7477 said:


> I used a 2009 Insignia advanced 22" 1080p TV for a couple of years, and had interesting HDMI problems myself. After undoing the AMD underscan crap with my HD 5770, the picture was okay, but any red colored text was super blurry. I was so happy when I purchased my ASUS VG236H monitor, as the text was a lot cleaner along with everything else on DVI, but the HDMI is still crappy even to this day with Radeons at least (had it on an R9 280X recently and all the colors were washed out and the picture was grainy). I'm trying the monitor with my laptop at the moment on HDMI (controlled by Intel HD 4600) and it actually looks fine with that, so it's probably Radeon HDMI crap going on mostly, along with potential TV processing crap too.



Reminded me of yet another setting... color subsampling, which is only good for video compression; I don't see why it should be an option for hardware. So go into CCC and play with the RGB/HDMI settings: avoid things like 4:2:0, go for 4:4:4, go for RGB full, etc.
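For anyone curious what 4:2:0 actually does, here's a toy Python sketch (just the averaging idea, not real YCbCr math) showing how storing one color value per 2x2 block of pixels smears a one-pixel-wide colored stroke:

```python
# Toy sketch of 4:2:0-style chroma subsampling: brightness (luma) keeps full
# resolution, but color (chroma) is stored once per 2x2 block, so fine
# colored detail gets averaged away.

def subsample_chroma(chroma, factor=2):
    """Average a color channel over factor x factor blocks, then expand back."""
    h, w = len(chroma), len(chroma[0])
    out = [[0] * w for _ in range(h)]
    for by in range(0, h, factor):
        for bx in range(0, w, factor):
            block = [chroma[y][x]
                     for y in range(by, min(by + factor, h))
                     for x in range(bx, min(bx + factor, w))]
            avg = sum(block) / len(block)
            for y in range(by, min(by + factor, h)):
                for x in range(bx, min(bx + factor, w)):
                    out[y][x] = avg
    return out

# A 1-pixel-wide "red" stroke (chroma 255) on a black background (chroma 0):
stroke = [[0, 255, 0, 0],
          [0, 255, 0, 0]]
print(subsample_chroma(stroke))
# → [[127.5, 127.5, 0, 0], [127.5, 127.5, 0, 0]]  -- the sharp edge is gone
```

Luma keeps full resolution, which is why black-on-white text survives subsampling but colored text (like the blurry red text mentioned above) doesn't. 4:4:4 skips this step entirely.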



a_ump said:


> I too use my TV (40", 1080p) as my monitor. I have only experienced one consistent issue, and that's with text: black text seems to have a reddish hue here, a bluish hue there, etc. Idk if it has to do with the PPI or not, but the only thing I could come up with is that the pixels are so much larger than 1080p on a 22-27" screen that the eye is able to differentiate minute differences in the 3 colors that make up a pixel?..... lol, pure pulled-outta-my-arse logic, but it's the best I have.



Zoom in on a screenshot of ClearType text and you will see the pixels are colored around the edges, which creates a nice illusion of increased sharpness on monitors... but if the pixels are huge, you will see their colors (alternatively, if you horizontally glide your eyesight across the image, you should see how the colors disappear).
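A rough Python sketch of the subpixel idea (illustrative only, not actual ClearType): treating each LCD pixel as three side-by-side R/G/B stripes lets an edge land at 1/3-pixel positions, at the cost of color fringes that become visible when the pixels are huge:

```python
# Illustrative sketch of subpixel rendering: each LCD pixel is three
# horizontal stripes (R, G, B), so a glyph edge can be placed with
# 1/3-pixel precision by lighting individual stripes.

def subpixel_coverage(edge_x, num_pixels=4):
    """For a vertical white-to-black edge at subpixel position edge_x
    (in units of 1/3 pixel), return per-pixel (R, G, B) intensities."""
    pixels = []
    for p in range(num_pixels):
        rgb = []
        for s in range(3):                 # stripe order: R, G, B
            stripe = p * 3 + s             # absolute stripe index
            rgb.append(255 if stripe < edge_x else 0)
        pixels.append(tuple(rgb))
    return pixels

# Edge at 4/3 of a pixel: the second pixel is lit only in its red stripe,
# which reads as a red fringe once the pixels are big enough to see.
print(subpixel_coverage(4))
# → [(255, 255, 255), (255, 0, 0), (0, 0, 0), (0, 0, 0)]
```

On a normal-DPI monitor the stripes are too small to resolve individually, so you just perceive sharper text instead of the fringe colors.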


----------



## XSI (Nov 25, 2014)

I use a 50" plasma as a monitor sometimes. I've tried it with 2 nettops, 3-4 laptops, and 2 desktops, and never had a problem, other than it's just uncomfortable (too big). And it always looked better than other screens.
Of course, that's if I use HDMI. VGA sucks badly.


----------



## newtekie1 (Nov 25, 2014)

kn00tcn said:


> There is no way 1366x768 can look 'okay'. If the TV has 1920x1080 pixels, you'd better send it a 1920x1080 signal (p, of course); anything non-native is a blur, like on every single fixed-pixel monitor in existence other than CRTs.


That simply is not true.  Both HDTVs in my home that have PCs connected to them are 1920x1080 panels, and I run both at 1360x768 when using the PC and the image is okay.  It is not as sharp as running at 1080p, but I wouldn't call it blurry either.  The text is clear and readable.  

I've seen what you are talking about in the past, where running at anything other than the native resolution was extremely blurry.  But that was on old 1280x1024 monitors running 1024x768.


----------



## GreiverBlade (Nov 25, 2014)

I'll add myself as another "TV as computer screen" user.

It's a Toshiba 32L1343DG, 1080i (no lines...? interlaced has lines, right?), used over HDMI (since it doesn't have anything other than HDMI or VGA). The funniest thing is: I can't use my 24" ASUS anymore... it feels blurry and less good than my TV (also, go find a 32" monitor for under 299 CHF... all of them cost double that, at minimum).

Only issue: with some games I have to 1. alt-tab to Windows then alt-tab back to the game (under/overscan issue, IIRC), or 2. set up borderless windowed mode (when alt-tabbing solves nothing).
I think one day I'll get a 32" (or a 27", as they seem a bit more affordable) 1440p screen for a change... maybe I'll ask Santa Claus.

All that to say: I prefer my 32" TV over any of the 24" 16/9 and 16/10 screens I've had (2ms type).

Last question: that Toshiba handles 1920x1200 at max. Is that safe or not? (Well, I assume it's not a good idea, since it's a 16/9 panel and 1200 is a 16/10 res.)


----------



## JunkBear (Nov 26, 2014)

Same thing here, but I use it at 720p so it uses less GPU power and still manages to play everything well.


----------



## kn00tcn (Nov 26, 2014)

newtekie1 said:


> That simply is not true.  Both HDTVs in my home that have PCs connected to them are 1920x1080 panels, and I run both at 1360x768 when using the PC and the image is okay.  It is not as sharp as running at 1080p, but I wouldn't call it blurry either.  The text is clear and readable.
> 
> I've seen what you are talking about in the past, where running at anything other than the native resolution was extremely blurry.  But that was on old 1280x1024 monitors running 1024x768.



I don't mean it's super blurry, it's merely not perfect (I run an Android box at 720 on a 1080 TV).

Even on old monitors, I don't mean non-native is unreadable, but you can't mathematically fit one grid onto another grid without altering the image... and I personally want it as accurate as possible.

Now, if you want to see something interesting, check out 'Majin and the Forsaken Kingdom' on 360/PS3. They basically did nearest-neighbor filtering to upscale 720 to 1080 (I tested the PS3 demo), so there is zero blur, it's real sharp, but you can see the pixel steps aren't even; it reminds me of 90s games like Magic Carpet or Doom.
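The uneven steps fall straight out of the math: 1080/720 = 1.5, so a nearest-neighbor scaler has to show some source lines once and others twice. A quick Python sketch of the line mapping:

```python
# Sketch of nearest-neighbor upscaling from 720 to 1080 lines: the 1.5x
# scale means source lines alternate between appearing once and twice,
# which keeps edges razor sharp but makes the pixel steps uneven.

def nearest_neighbor_map(src, dst):
    """For each destination line, the source line it copies from."""
    return [int(y * src / dst) for y in range(dst)]

mapping = nearest_neighbor_map(720, 1080)
print(mapping[:9])
# → [0, 0, 1, 2, 2, 3, 4, 4, 5]  (even source lines doubled, odd ones not)
```

A bilinear scaler would instead blend neighboring lines, trading the uneven steps for the blur being discussed here.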



GreiverBlade said:


> Toshiba 32L1343DG 1080i (no lines ... ? interlaced have lines right? )
> ...
> only issue : with some game i have to 1. alt-tab win then alt-tab back to the game (under/overscan issue iirc)
> ...
> last question : that Toshiba handle 1920x1200 at max it's safe or not? (well i assume it's not a good idea since it's a 16/9 panel and 1200 is a 16/10 res)



Interlaced DATA has lines in an abstract sense; obviously the TV will merge the signal or duplicate the lines to display them on screen, otherwise you'd be looking through a fence of tiny black bars.

Why aren't you set to 1080p in your gfx driver settings? Make sure all overscan settings in BOTH the gfx driver and the monitor are off or at '100%'.

Interesting that it can handle x1200... it would either crop or shrink with black bars; either way there is a loss of 'data'.


----------



## newtekie1 (Nov 26, 2014)

kn00tcn said:


> I don't mean it's super blurry, it's merely not perfect (I run an Android box at 720 on a 1080 TV).
> 
> Even on old monitors, I don't mean non-native is unreadable, but you can't mathematically fit one grid onto another grid without altering the image... and I personally want it as accurate as possible.
> 
> Now, if you want to see something interesting, check out 'Majin and the Forsaken Kingdom' on 360/PS3. They basically did nearest-neighbor filtering to upscale 720 to 1080 (I tested the PS3 demo), so there is zero blur, it's real sharp, but you can see the pixel steps aren't even; it reminds me of 90s games like Magic Carpet or Doom.



That is the thing though, the image is "okay".  It isn't perfect, but it is okay.

And most of the time the lower resolution is being used either because the person is too far away from the screen or they have poor vision... or both. Either way, in those common situations you won't notice the slight blur.


----------



## hat (Nov 26, 2014)

Well, I managed to make it look 'right' by following some suggestions made here. The image was being distorted by AMD underscanning it and the TV blowing it back up. There was no PC mode or anything; I just set overscan/underscan to 0 in CCC and set the TV picture size to 'just scan', which seems to be the native image mode. Turning sharpness to 0 helped further reduce the fucked-up look it had.


----------



## Jstn7477 (Nov 26, 2014)

hat said:


> Well, I managed to make it look 'right' by following some suggestions made here. The image was being distorted by AMD underscanning it and the TV blowing it back up. There was no PC mode or anything; I just set overscan/underscan to 0 in CCC and set the TV picture size to 'just scan', which seems to be the native image mode. Turning sharpness to 0 helped further reduce the fucked-up look it had.



Great, glad it's working better now. I figured there was some sort of issue like that, and I really wish AMD would fix the HDMI underscan-by-default crap, as it has been an issue since at least Catalyst 10.7 or so (the drivers my ancient Toshiba A665D-S6091 started off with were 10.4, I believe, and that set worked okay out of the box, I thought). NVIDIA and Intel work fine out of the box with no scaling BS; just plug in the HDMI cable and it's set.


----------



## hat (Nov 26, 2014)

My Radeon 5870 worked fine right away with my other TV too... this one seems a bit of an oddball.

At this point I was just trying to satisfy my curiosity. I just want to understand... We have to send this TV back anyway, the screen is a little damaged. But while I still had it I wanted to learn how to set it up properly.


----------



## kn00tcn (Nov 27, 2014)

AMD has an obvious solution: display a message to the user. Yet after all these years, here we are...


----------



## Mussels (Nov 27, 2014)

As someone who has been using HDMI and HDTVs on his PCs for a long time, it always boils down to one of two things:

1. A config issue on the TV, the PC, or both, for overscan. For AMD, 99% of the time the overscan slider defaults to the wrong setting. Don't know why, but it's a long-standing bug.

2. A totally garbage TV that has a panel that doesn't match its inputs (I've seen 'HD' plasmas with 1024x768 native res).


----------



## JunkBear (Nov 27, 2014)

Polaroid 19-inch LED TVs are 1366x768. Pretty strange, and cheap too.


----------



## Mussels (Nov 28, 2014)

JunkBear said:


> Polaroid 19-inch LED TVs are 1366x768. Pretty strange, and cheap too.




My old-as-hell 40" has that res; only HDMI2 supports it, and it looks amazing to this day.


----------



## newtekie1 (Nov 28, 2014)

JunkBear said:


> Polaroid 19-inch LED TVs are 1366x768. Pretty strange, and cheap too.



Actually, not that strange at all. Almost all the TVs on the market today that are "720p" actually use a 1366x768 panel.


----------



## hat (Nov 28, 2014)

How do you display 720p on 1366x768? On that thought, how did my 1080i TV display 1080i when it was 1440x900, at least on my PC?


----------



## Frag_Maniac (Nov 28, 2014)

^Any display can typically show fullscreen any common res within its aspect ratio that's below its native. Just pop in a game, set it to 1280x720, and the TV should display it.

1080i would be broadcast resolution. 1080i uses two 540-line fields that flash back and forth rapidly to create the illusion of 1080 lines. The two 540-line fields are each refreshed at 30Hz, and 1080i can only be broadcast at 30Hz. Most true HD content on TV is either 1080i or 720p, 720p being 60Hz because it's progressive vs. interlaced (one full video frame after another vs. alternating half-frames).

Almost every TV has an "info" button on its remote that will show the res of the program you're watching. TV listing sites like TV Guide and TitanTV also show what res a program is in.

The real crime of subscription TV services, IMO, is that even since the HD broadcast changeover, many stations are still 480i, which in this day and age is pathetic.

I'd rather see subscription TV go all HD before 4K comes; otherwise it's kinda pointless. If not, it could easily mean only the rich will be able to afford Ultra HD (4K) packages.


----------



## Mussels (Nov 29, 2014)

hat said:


> How do you display 720p on 1366x768? On that thought, how did my 1080i TV display 1080i when it was 1440x900, at least on my PC?




If it's for a PC, you don't. You display 1360x768.

They have a built-in scaler to upscale 1280x720 content, which is fine for movies but not good for PC content.


----------

