# TV won't go back to native resolution



## espionildo (Aug 10, 2016)

So I decided to watch an HD movie on my PC, since I use my TV as a monitor.
To make the image quality better, I changed my PC's video output resolution to 1920 x 1080, and my TV did that normal thing where it goes black for a while and then the image reappears at a different resolution.
The weird thing is, when I went to revert back to the native 1360 x 768, my computer's display changed, but the TV continued thinking it was 1920 x 1080, and now the display looks very weird. I tried resetting the PC, but that didn't work.


----------



## dorsetknob (Aug 10, 2016)

Have you tried the simplest idea of turning your TV off and then turning it on again,
then if needed going through the menu settings?


----------



## espionildo (Aug 10, 2016)

dorsetknob said:


> Have you tried the simplest idea of turning your TV off and then turning it on again,
> then if needed going through the menu settings?


yep


----------



## Caring1 (Aug 11, 2016)

Make sure you reset the display settings on the correct screen from your PC.


----------



## FR@NK (Aug 11, 2016)

The EDID of most TVs will report 1080p as supported even if the native resolution is only 768p.

What video card are you using?


----------



## Bill_Bright (Aug 11, 2016)

I bet you are using HDMI, huh? I sure wish they would get the bugs out of that interface. Of course, when they finally do, it means we all have to upgrade (buy new) all our devices to support the upgraded, bug-free interface.

Turning off the TV may not be enough. Today's TVs are basically computers with a dedicated task, and turning them off really just puts them in standby, often preserving several settings. You may have to turn it off, unplug it from the wall, wait 15 - 30 seconds, then plug it in and turn it on. If the TV has a master power switch on the back, you can flip that instead of unplugging.

And/or do the same thing (full power off) with the computer.

Do you have another TV you can try?


----------



## Dethroy (Aug 11, 2016)

Most TVs also have a factory reset option that you could try as well.


----------



## FR@NK (Aug 12, 2016)

Guys, this issue is on the PC side, not the TV side.



espionildo said:


> when I went to revert back to the native 1360 x 768, my computer's display changed, but the TV continued thinking it was 1920 x 1080



This can be caused by GPU scaling. With scaling on, no matter what resolution you set, the video card will scale the output to whatever resolution it thinks is the native one (1080p in this case).
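The behavior described above can be put into a toy model; this is just an illustration of the described symptom, not any real driver's API (the function name and assumed native mode are made up for the sketch):

```python
# Toy model of GPU scaling as described: with scaling enabled, the GPU
# always outputs what it believes is the display's native mode.
def output_mode(desktop_mode, gpu_scaling_on, assumed_native=(1920, 1080)):
    """Resolution the GPU actually sends down the cable."""
    if gpu_scaling_on:
        return assumed_native  # desktop image is rescaled to this mode
    return desktop_mode        # requested mode is passed straight through

print(output_mode((1360, 768), gpu_scaling_on=True))   # (1920, 1080)
print(output_mode((1360, 768), gpu_scaling_on=False))  # (1360, 768)
```

So even though Windows reports 1360 x 768, the signal on the cable never changes, which matches "the TV continued thinking it was 1920 x 1080".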


----------



## Dethroy (Aug 12, 2016)

FR@NK said:


> Guys, this issue is on the PC side, not the TV side.


You sure? The wording is quite ambiguous ...


----------



## dorsetknob (Aug 12, 2016)

espionildo said:


> since I use my TV as a monitor.
> To make the image quality better, I changed my PC's video output resolution to 1920 x 1080, and my TV did that normal thing where it goes black for a while and then the image reappears at a different resolution.
> The weird thing is, when I went to revert back to the native 1360 x 768, my computer's display changed, but the TV continued thinking it was 1920 x 1080, and now the display looks very weird. I tried resetting the PC, but that didn't work.





FR@NK said:


> Guys, this issue is on the PC side, not the TV side.



The OP seems pretty clear: he is continually referring to the TV.


----------



## qubit (Aug 12, 2016)

Bill_Bright said:


> I bet you are using HDMI, huh? I sure wish they would get the bugs out of that interface. Of course, when they finally do, it means we all have to upgrade (buy new) all our devices to support the upgraded, bug-free interface.
> 
> *Turning off the TV may not be enough. Today's TVs are basically computers with a dedicated task, and turning them off really just puts them in standby, often preserving several settings.* You may have to turn it off, unplug it from the wall, wait 15 - 30 seconds, then plug it in and turn it on. If the TV has a master power switch on the back, you can flip that instead of unplugging.
> 
> ...


I find it particularly annoying when they implement "intelligent" functionality that just gets in the way of the setting you're trying to set.

@espionildo The TV might not be set for 1:1 pixel mapping, which would indeed look weird, so look for a setting for it in the menu. It comes under various names such as "just scan" or "exact fit" etc. Make sure the PC is set to 1360x768 first, of course.

Also, it wouldn't hurt to give us the make and model of the TV and the PC so that specifics can be looked up. If it's a PC that you've built yourself then the graphics card model and Windows version are the pertinent information.


----------



## Bill_Bright (Aug 12, 2016)

qubit said:


> I find it particularly annoying when they implement "intelligent" functionality that just gets in the way of the setting you're trying to set.


I agree. HDMI was created by and for the home theater equipment industry to make cable management easier and much tidier - especially in surround sound setups with multiple sources like cable/DVR boxes, game consoles, DVD/Blu-Ray players and AV receivers.

HDMI was forced onto the computer industry because the major players in big screen TVs happen to be the same major players in computer monitors, and they didn't want to deal with supporting both DVI and HDMI. Yeah, the much smaller size of the HDMI connector is nice, but is anyone really happy with the sound quality of the speakers built into many computer monitors? Except for office workers who only need Windows sounds, most users use self-powered "computer speakers" and/or headsets.

I agree we don't need multiple digital video interfaces, but it sure would be nice if HDMI worked as consumers [rightfully so] expect it should.

And FR@NK, I am not convinced it is on the PC side either. It could be, but it could also be on the TV side. It appears one or the other is not cooperating with the "handshaking" needed to sync up properly. Since the computer screen syncs up fine, that suggests the computer/graphics solution does know how to sync properly.


----------



## FR@NK (Aug 13, 2016)

At first I thought it might have been something on the TV side, but as you can see, the OP replied in post 3 that he had turned the TV off and on again, and that he also went through the menu settings on the TV. I'll repost it in case anyone skipped over it by mistake.

POST 3:


dorsetknob said:


> Have you tried the simplest idea of turning your TV off and then turning it on again,
> then if needed going through the menu settings?





espionildo said:


> yep



So at this point I'm ruling out anything on the TV side based on the OP's reply. Really, what else can be done to the TV other than power cycling and changing menu settings? The TV in question is a 768p display, so it's hard to believe it's a high-end "intelligent" TV.



dorsetknob said:


> The OP seems pretty clear: he is continually referring to the TV.





Bill_Bright said:


> And FR@NK, I am not convinced it is on the PC side either. It could be, but it could also be on the TV side. It appears one or the other is not cooperating with the "handshaking" needed to sync up properly. Since the computer screen syncs up fine, that suggests the computer/graphics solution does know how to sync properly.



OK, great! Are you guys familiar with how GPU up-scaling works? Because if it were enabled, it would cause the exact symptoms the OP is experiencing.

I'll break down how it works and how it relates to the OP's problem:

When you connect a TV to a GPU, the GPU reads the EDID information from the TV. This information tells the GPU all the different resolutions that the TV supports. But the issue is that nearly all 720p/768p TVs report that they accept a 1080p signal. Why is this? Because most set-top/cable boxes and Blu-ray/DVD players output a 1080p signal, so the TVs accept this signal and scale it to their own panel resolution (this even happens on 1080p displays, and 1:1 pixel mapping is rarely ever seen).

At this point the GPU thinks 1080p is the native resolution, so when you enable the up-scaling feature, the GPU will only ever output 1080p to the display. When you set your desktop to 1360x768, the GPU will scale it to 1080p and send that to the display (hence the OP said the "TV continued thinking it was 1920 x 1080"). Then the TV sees this signal and downscales it to its panel resolution, 768p. What you end up with is a picture that has been upscaled and then downscaled at odd ratios, so you lose tons of image quality.

To the OP: can you please check your graphics driver's settings and see if up-scaling is enabled?

I wish you all the best of luck! - FR@NK


----------



## qubit (Aug 13, 2016)

FR@NK said:


> The TV in question is a 768p display, so it's hard to believe it's a high-end "intelligent" TV.


It doesn't have to be high-end to support "intelligent" features. It's as simple as programming it to perform a particular action when it sees a particular signal, which can screw up what you're trying to achieve. What I will say, though, is that a cheap TV is likely to have buggier programming, which could well make it impossible to get the right setting in some circumstances.



FR@NK said:


> So the TVs accept this signal and scale it to their own panel resolution (this even happens on 1080p displays, and 1:1 pixel mapping is rarely ever seen).


Are you saying that a 1080p TV fed a 1080p signal won't have a 1:1 pixel mapping or am I missing something here?


----------



## Bill_Bright (Aug 13, 2016)

@FR@NK - yes, I know how up-scaling works. I did not say it is NOT on the PC side. I just said I am not convinced. And it is for the reasons qubit pointed out: a possible failure to properly handshake which could still be on the TV side.

I will say that checking the up-scale setting on the PC's graphics solution is worthwhile, however, and that is a good suggestion. I hope it is as simple as that.


----------



## FR@NK (Aug 13, 2016)

qubit said:


> Are you saying that a 1080p TV fed a 1080p signal won't have a 1:1 pixel mapping or am I missing something here?



Correct. By default most TVs overscan even if the source is the same resolution as the panel. If you disable overscanning and any scaling settings on the TV, you can normally get 1:1 pixel mapping to work. Back when HD signals became popular, there was noise in the picture at the edges, so overscanning made it invisible.
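A typical overscan figure is somewhere around 5% (this number varies by set and picture mode; it is an illustrative assumption here). The arithmetic is simple: the TV crops the outer edges of the incoming frame and stretches what remains back to fill the panel:

```python
def overscan_crop(width, height, percent=5.0):
    """Source region a TV keeps when it overscans by `percent`.
    The cropped region is then stretched back to fill the full panel,
    so the picture is rescaled even at the panel's own resolution."""
    visible = 1 - percent / 100
    return round(width * visible), round(height * visible)

# With an assumed ~5% overscan, a 1080p frame loses its outer edges:
print(overscan_crop(1920, 1080))  # (1824, 1026)
```

That rescale from 1824x1026 back up to 1920x1080 is why a 1080p TV fed a 1080p signal still isn't pixel-perfect until overscan is turned off.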



Bill_Bright said:


> a possible failure to properly handshake which could still be on the TV side.



There is no handshake! The source reads EDID data from the display to get the resolutions it supports, and the source decides which one to use. This data is read-only (write-protected) and can even be read when the display is powered off. The display doesn't send any other information, and it doesn't receive any information from the source other than the display signal.

This "handshake" you are referring to doesn't happen.


----------



## qubit (Aug 13, 2016)

FR@NK said:


> Correct. By default most TVs overscan even if the source is the same resolution as the panel. If you disable overscanning and any scaling settings on the TV, you can normally get 1:1 pixel mapping to work. Back when HD signals became popular, there was noise in the picture at the edges, so overscanning made it invisible.


Hmmm, I don't think they overscan most of the time, or I'd soon notice it with a fuzzy picture.


----------



## FR@NK (Aug 13, 2016)

qubit said:


> Hmmm, I don't think they overscan most of the time, or I'd soon notice it with a fuzzy picture.



They do, even the high-end TVs. You don't notice it when you are watching TV.


----------



## qubit (Aug 13, 2016)

I think I would.


----------



## Bill_Bright (Aug 13, 2016)

Of course it's handshaking!





FR@NK said:


> The source reads EDID data from the display to get the resolutions it supports and the source decides which one to use.


The source decides which one to use based on the information provided, which indicates the "native" or preferred resolution. The graphics card then sets that resolution. It may set it again once the graphics card drivers are loaded. That's handshaking.

This handshaking occurs several times every time a PC boots. When you first boot, before any boot drive is touched and card drivers are read, the system syncs with the monitor at the standard resolutions used by the BIOS and Safe Mode. You can typically see this on many monitors as they flip back and forth between inputs looking for a VGA/analog or digital signal. Again, that is handshaking.

All graphics solutions, including cards and integrated graphics, and all monitors know how to "handshake" and sync up using standard protocols. This is essential, or else monitors and motherboards could not communicate before any OS is installed. Then they handshake again when the OS and drivers load, setting up the final resolution and refresh rates.

The definition of "handshaking": *"In information technology, telecommunications, and related fields, handshaking is an automated process of negotiation that dynamically sets parameters of a communications channel established between two entities before normal communication over the channel begins. It follows the physical establishment of the channel and precedes normal information transfer."*


----------



## bogmali (Aug 14, 2016)

Either take your arguments to PM, or you can both argue while on vacation; your choice. There's no need to steer this thread in the wrong direction. Housecleaning is done, and I don't like doing that. The next time I do it, holiday passes will be issued.


----------

