# Does a 1080 tv ever display an image correctly...



## Lazzer408 (Oct 11, 2012)

...that's coming from a PC?

I recently purchased two HDTVs: a 42" Hisense and a 24" Sansui.  Most computer monitors, as you know, will look like $hit unless they're running at native resolution. At native, the image is nice and sharp because every pixel generated lands on a pixel in the LCD panel.  LCD TVs, on the other hand, seem to -ALL- look like crap and need lots of tweaking and manual adjustment to look 'ok' at best. Why is this? Do HDTVs use some oddball LCD panels that aren't actually 1920x1080? Every one I've tried to date leaves a black border around the desktop when using HDMI set to 1920x1080@60. It also looks as if the sharpness is cranked to infinity, leaving the image very grainy, and the contrast/gamma makes everything washed out.

It's not just the HDTVs I've tried recently; it's every HDTV I've ever tried to use with a computer. I've tried onboard graphics, Nvidia, and ATI/AMD.

I'm using the Sansui at the moment and the gamma is so far off my eyes are watering.

Anyone know what I'm talking about? What's different in a 1920x1080 TV vs. a 1920x1200 monitor? What I wouldn't give to go back to 4:3, when everything worked and ran higher than this marketing magic that is "HD".


----------



## theonedub (Oct 11, 2012)

All 3 of my Vizio 1080p HDTVs correctly receive and display 1080p from a PC via HDMI and VGA. I don't see any color or sharpness issues either. Maybe it's an AMD thing? Guess it's just bad luck?


----------



## Maban (Oct 11, 2012)

A lot of the cheapo HDTVs accept a 1080P input but aren't actual 1080P displays.


----------



## Lazzer408 (Oct 11, 2012)

These claim to be 1080 panels.  I'm reading about something called 4:4:4 chroma subsampling that may have something to do with it. http://www.avsforum.com/t/1381724/official-4-4-4-chroma-subsampling-thread/30
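For a sense of what chroma subsampling throws away: the panel keeps full-resolution brightness (luma) but stores color (chroma) at reduced resolution unless the set passes 4:4:4. A rough back-of-the-envelope sketch, illustrative numbers only (real HDMI signaling differs in the details):

```python
# Approximate bytes per frame for 8-bit Y'CbCr at 1920x1080 under
# common chroma subsampling schemes. 4:4:4 keeps full-resolution
# color; 4:2:2 halves the chroma samples; 4:2:0 quarters them.
# Subsampled chroma is what smears one-pixel-wide colored text edges.

def frame_bytes(width, height, scheme):
    """Bytes per frame of 8-bit Y'CbCr with the given subsampling."""
    luma = width * height                        # one Y sample per pixel
    chroma_factor = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[scheme]
    chroma = 2 * luma * chroma_factor            # Cb and Cr planes together
    return int(luma + chroma)

for scheme in ("4:4:4", "4:2:2", "4:2:0"):
    print(scheme, frame_bytes(1920, 1080, scheme), "bytes/frame")
```

At 4:2:0 the set carries only a quarter of the color detail, which is fine for film but visibly fuzzes sub-pixel-antialiased text, which would explain a ClearType rainbow effect.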

dub - you're just lucky

I have a saying for technology these days: everything is built "good enough", as in, 90% of consumers won't or don't notice the defects. When I buy a 1080 TV, I assume it's 1920x1080, and that if I put a 1920x1080 signal into it, 1920x1080 pixels light up. The dozen or so TVs I've had (since HDMI came out) have never worked right with a computer. Any TV I've owned with a DVI input worked fine with a PC.

Here's a shot from the 24".

EDIT - Turning off ClearType text got rid of the rainbow effect, but it's still fuzzy.


----------



## cdawall (Oct 15, 2012)

The fix for every one of my crappy-looking TVs, via HDMI/VGA/DVI input, has been to disable scaling within the TV. The issue is that the TV tries to scale every image, including incoming 1080p, to cover the possibility of inputs not outputting a full 1920x1080.


----------



## xBruce88x (Oct 15, 2012)

My roommate is running a 46" Philips on the Athlon X4 rig I built him, and it seems to do 1080p over VGA just fine.


----------



## Protagonist (Oct 15, 2012)

Most of the HDTVs I've seen are advertised as 1080p, but the sad truth is most of them don't actually have 1920x1080 pixels. What I have found in the manuals that come with those TVs is this:

They have something like 1366x768 pixels or less, and they upscale if you play a 1080p clip or run a game at 1080p. The sad truth is most of them don't have 1920x1080.

Most manuals state a working resolution of 1366x768 that can upscale to 1920x1080.

Only a few HDTVs actually have 1920x1080.

On the other hand, if a monitor is advertised as 1920x1080 then it actually has that. That's why monitors are better than HDTVs.


----------



## 3870x2 (Oct 15, 2012)

Most TVs I have seen just require some tweaking, either on the TV itself or in your own graphics drivers.


----------



## erixx (Oct 15, 2012)

One rule for all: run native res and check the display modes (movie, photo, natural, game, etc.) until you find the one that shows computer text and icons well. On my Philips 32" (a real 1920x1080 panel) all modes but one are good.
Of course, you can tweak the gamma, contrast, etc. yourself...


----------



## Steevo (Oct 15, 2012)

My 46" Toshiba is just as sharp as my monitors.


----------



## erixx (Oct 15, 2012)

do not believe


----------



## Sasqui (Oct 15, 2012)

@ Lazzer, what are you using for a cable?

I've never seen anything remotely like that, either on my 27" Samsung 1080p TV/monitor, my Dell 2405FP monitor, or my 40" Sony Bravia 1080p TV.

If the TV isn't true 1080 pixels, then scaling certainly could account for the defective image.


----------



## Nordic (Oct 15, 2012)

I had the problem you described but I thought it was hdmi exclusive. I used the scaling tool in ccc and now the picture fills the entire screen just as I want.


----------



## Steevo (Oct 15, 2012)

erixx said:


> do not believe



I think it is, and I guess that's all that matters to me. It's actually a Samsung panel; look at the firmware files. My laptop passed the 4:4:4 tests too.


----------



## patrico (Oct 15, 2012)

hiya, my LG 42" works nice for me over HDMI or RGB


----------



## Krazy Owl (Oct 16, 2012)

Irico 32" ED 60hz 1080P here paired with a GT8600 512megs.  Everything  at max out in BF2 and no flickers at all even at full FPS.


----------



## Lazzer408 (Oct 18, 2012)

Sasqui said:


> @ Lazzer, what are you using for a cable?



HDMI to HDMI  -or-  DVI(pc side) to HDMI(tv side)



james888 said:


> I had the problem you described but I thought it was hdmi exclusive. I used the scaling tool in ccc and now the picture fills the entire screen just as I want.



We shouldn't have to adjust anything to force a 1080 image to fit a 1080 screen. Everything I can find online says this 42" is a 1080, but what's interesting is that one of my games was trying to run at 1366x768, stating that was the native resolution of the display, like st.bone was saying. If I set the desktop resolution to 1366x768, the image extends past the borders of the screen.

I took the 26" back to the store.

This is a fairly good read, with a bunch of other people having the exact same issue on a variety of TVs. It's an old thread, but it has a lot of things to try if you're having the same issue. I agree with what most of them are saying about the manufacturers screwing it up.
http://www.tomshardware.com/forum/267972-33-1080-hdmi-hdmi-connected-quality


----------



## Nordic (Oct 18, 2012)

I should have stated this: in my case it was the pre-12.8 bug that AMD cards have with HDMI.


----------



## Lazzer408 (Oct 18, 2012)

james888 said:


> I should have stated this: in my case it was the pre-12.8 bug that AMD cards have with HDMI.



That must be a lot of them. I've tried an HD 5450 (some oddball I had), a 4850, 4870, 4970, 5870, and 7770. They all have this bug? Also 8000-, 9000-, and 400-series Nvidia cards that do the EXACT same thing. Did I mention the Intel onboard also does it? lol 

If ATI/AMD fixed anything, great. I'll try it tomorrow. I think I currently have 12.6.


----------



## Nordic (Oct 18, 2012)

Lazzer408 said:


> If ATI/AMD fixed anything great. I'll try it tomorrow. I think I currently have 12.6.


It went away for me in 12.8; it's even in the changelog. I had the problem on 12.6.


----------



## LAN_deRf_HA (Oct 18, 2012)

I see lots and lots and lots of people using HDTVs as monitors, and very rarely does it look good by my standards. For it to have a shot at working like a real monitor, you need a good TV and to turn off any scaling on the VGA end. Then you have to find the scaling setting on your TV and hope it has the right one. On my Panasonic plasma it's called "HD size 2", which the description says cuts things down to 95%. If it all works out, it should be as sharp as a monitor and not run off screen. 

I thought maybe the VGA scaling blur was just an Nvidia problem, but my friend had the same issue with his AMD card and a high-end LG LCD, and again the fix was the same: turn off VGA scaling and find the right TV setting.
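That 95% cut is also why scaled output can never be truly sharp: at any scale other than exactly 1:1, almost no source pixel lands on an integer destination position, so nearly everything gets resampled. A toy sketch of the idea, with illustrative numbers not tied to any particular TV's scaler:

```python
from fractions import Fraction

# Count how many of a frame's source columns map exactly onto a
# destination pixel at a given horizontal scale. Any column that
# doesn't land on an integer position must be interpolated across
# two destination pixels, which is what blurs text.

def aligned_columns(width, scale):
    """Number of source columns whose scaled position is an integer."""
    hits = 0
    for x in range(width):
        dest = x * scale              # exact rational position
        if dest.denominator == 1:     # lands exactly on a pixel
            hits += 1
    return hits

full = aligned_columns(1920, Fraction(1))          # 1:1 mapping
cut95 = aligned_columns(1920, Fraction(95, 100))   # a 95% overscan cut
print(full, cut95)
```

At 1:1 every column aligns; at 95% only one column in twenty does, and the other nineteen get smeared.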


----------



## erixx (Oct 18, 2012)

Bugs and technical issues like scaling aside, it all comes down to what you do with the HDTV (writing, design, games, or movies...), plus working distance and resolution.

My 32" 1920x1080 HDTV is brilliant for everything but only good enough for text; it's just 99% sharp at 80 cm distance.

It is (or should be) quite obvious that a 24" 1920x1080 monitor is much sharper, so people using a monster 40" HDTV as a monitor (how the hell!?) would need a huge resolution like 2500x1600 (just an invented example) to see sharp text at close distance, or else work 2 meters away from the screen.


----------



## Lazzer408 (Oct 18, 2012)

james888 said:


> Went away for me in 12.8. Even in the change log. I had the problem on 12.6



12.8 didn't change anything for me. I think the panel in this TV might be some oddball resolution. Either that, or something between the input and the panel is scaling the image and I can't disable it. No matter what I try, I have to use scaling via CCC to make it fit the entire screen, and it's still not in alignment with the panel pixels. Oh well. Next time I'll be sure to get a 4:4:4 TV.

EDIT - So I had to try something. I forget if I tried it before, but here's what WORKS: a DVI-to-VGA adapter at the PC, a VGA cable to the TV, and a custom 1920x1080@60 resolution = RAZOR SHARP TEXT!! HDMI is over-rated, over-hyped, and under-developed. That said, Walmart's $368 42" Hisense LED TV works great with a PC!

http://www.walmart.com/ip/Hisense-42-Class-LED-LCD-1080p-60Hz-HDTV-F42K20E/20549808


----------



## v12dock (Oct 19, 2012)

I've run AMD and Nvidia cards to Vizio and Samsung TVs, and they both displayed 1080p perfectly.


----------



## Nordic (Oct 19, 2012)

Mine's a Vizio. Same as the one in my specs <--


----------



## Solaris17 (Oct 19, 2012)

cdawall said:


> The fix for every one of my crappy-looking TVs, via HDMI/VGA/DVI input, has been to disable scaling within the TV. The issue is that the TV tries to scale every image, including incoming 1080p, to cover the possibility of inputs not outputting a full 1920x1080.



I ran into this problem with Nvidia cards. I had to go into the TV menu and put it on Just Scan instead of specifying a resolution and/or aspect ratio.


----------



## newtekie1 (Oct 19, 2012)

st.bone said:


> Most of the HDTVs I've seen are advertised as 1080p, but the sad truth is most of them don't actually have 1920x1080 pixels. What I have found in the manuals that come with those TVs is this:
> 
> They have something like 1366x768 pixels or less, and they upscale if you play a 1080p clip or run a game at 1080p. The sad truth is most of them don't have 1920x1080.
> 
> ...



A TV advertised as 1080p uses a 1920x1080 panel.  What you are talking about is 1080i TVs, they use 1366x768 panels.  Technically, HD is 720p, so a 1366x768 panel is used and can be sold as an HDTV.  Too many people assume that just because it says HDTV that it is 1920x1080, but technically anything 720p or greater can be sold as an HDTV.

As for the OP's problems, I haven't had a 1080p TV yet that didn't look good.  However, they usually require some tweaking to get a good image.  For example, here is what the image looks like on my 1080p 42" Vizio that is ~4 years old:


----------



## boogerlad (Oct 19, 2012)

Check your underscan/overscan settings. In my experience, Nvidia cards handle it automatically, but AMD needs manual adjustment.



newtekie1 said:


> A TV advertised as 1080p uses a 1920x1080 panel.  What you are talking about is 1080i TVs, they use 1366x768 panels.  Technically, HD is 720p, so a 1366x768 panel is used and can be sold as an HDTV.  Too many people assume that just because it says HDTV that it is 1920x1080, but technically anything 720p or greater can be sold as an HDTV.
> 
> As for the OP's problems, I haven't had a 1080p TV yet that didn't look good.  However, they usually require some tweaking to get a good image.



Uhh, no. 1080p is a 1920*1080 @ 60hz. 1080i is 1920*1080 @ 30hz. 720p is 1280*768 @ 60hz, etc...


----------



## newtekie1 (Oct 19, 2012)

boogerlad said:


> Uhh, no. 1080p is a 1920*1080 @ 60hz. 1080i is 1920*1080 @ 30hz. 720p is 1280*768 @ 60hz, etc...



1080i is 1080 interlaced, so each field is 1920x540.  Most 1080i material is still 60Hz; however, some material actually combines two fields into one frame to give a 1080p signal at 30Hz, then doubles each frame to match the 60Hz the TV displays.

And 720p is 1280x720 @ 60Hz.

However, most 720p HDTVs actually use 1366x768 panels, and these 1366x768 panels are also somehow able to be labeled as 1080i, as I said.
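The arithmetic behind the i/p distinction can be sketched in code. Nominal 60 Hz rates only (real broadcast uses 59.94 Hz, and this ignores blanking intervals):

```python
# Pixel throughput of progressive vs interlaced 1080 at a nominal
# 60 Hz. Interlaced video sends alternating half-height fields
# (odd lines, then even lines), so 1080i at 60 fields/s carries
# half the pixels per second of 1080p60, even though both are
# nominally "1920x1080".

def pixels_per_second(width, height, rate_hz, interlaced=False):
    lines = height // 2 if interlaced else height  # a field is half the lines
    return width * lines * rate_hz

p60 = pixels_per_second(1920, 1080, 60)                    # 60 full frames/s
i60 = pixels_per_second(1920, 1080, 60, interlaced=True)   # 60 fields/s
print(p60, i60)
```

Progressive carries exactly twice the pixel rate, which is why "1080" alone on a box says nothing about what the panel actually shows.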


----------



## eidairaman1 (Oct 19, 2012)

EDID tables are screwy


----------



## Lazzer408 (Oct 19, 2012)

It's all marketing strategy. Before 1080 was marketed like candy to a bunch of sweet-toothed sheep, I had no trouble finding a 1920x1200 24" monitor a few years ago. Ah yes, good old DVI. You plugged it in and it worked. A few days ago I walked past Best Buy's monitor display and noticed they were all 1080. Corporations LOVE confusing customers, don't they? The 1920x1200 monitors are gone, but don't worry, they'll be back in the form of "new" UHDTV so they can be marketed to you all over again. I'm well aware that many 1920x1200 monitors are still available, at a price, but when I bought my 24" 1920x1200 (8 years ago?) it was on sale at Office Depot for $199. Same goes for CPUs. A few years ago a Core 2 Duo desktop was a few hundred at Walmart. 8-10 years later I find systems with B960s for hundreds more. Anyone else get the feeling that technology advancement has slowed significantly over the last 10 years? I'm seeing a trend of products being marketed as something new when in fact I could have bought the equivalent 10 years ago for a lot less.

As for the HDMI/PC/HDTV issues: WHY would a manufacturer deliberately flaw a TV's ability to properly display 1080 content entering the set in a "native format" (for lack of a better term)? If the problem were limited to a few makes or models, I could understand it as an oversight, and I would expect a firmware update to correct it. Fact is, the problem plagues the TV industry. A simple Google search for "fuzzy text, hdtv, 1080, PC, etc." brings up result after result of people having this problem. Yes, some people "get lucky", like v12dock did, but there clearly is a problem.



newtekie1 said:


> A TV advertised as 1080p uses a 1920x1080 panel.



Wrong. I've seen MANY 1080i/p TVs that have 768 panels in them. It was VERY common for early plasma sets. As long as it could display an image from a 1080 signal, they put 1080 on the box. It internally converted to 768, but they won't tell you that.


----------



## eidairaman1 (Oct 19, 2012)

i recall 1920x1200 being ideal



Lazzer408 said:


> It's all marketing strategy. Before 1080 was marketed like candy to a bunch of sweet-toothed sheep, I had no trouble finding a 1920x1200 24" monitor a few years ago.
> 
> ...


----------



## Lazzer408 (Oct 19, 2012)

eidairaman1 said:


> EDID tables are screwy



EDID shouldn't matter, because you can force any resolution you want. If the TV has a 1920x1080 panel, then a custom 1920x1080 resolution SHOULD fit dot-for-dot through HDMI. It doesn't fit over HDMI. That's the problem. VGA is working great, but a digital signal would look better.
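One way to sanity-check a forced custom resolution is to compare its pixel clock against the standard timing. For 1920x1080@60 the CEA-861 timing adds 280 pixels of horizontal blanking and 45 lines of vertical blanking, giving a 148.5 MHz pixel clock; a hand-built modeline that lands far from that is a hint the mode isn't what the panel expects. The blanking figures below are the standard CEA values, not read from any particular TV's EDID:

```python
# Pixel clock implied by a video mode: total pixels per frame
# (active + blanking) times the refresh rate.

def pixel_clock_mhz(h_active, h_blank, v_active, v_blank, refresh_hz):
    """Pixel clock in MHz for the given active/blanking geometry."""
    h_total = h_active + h_blank     # e.g. 1920 + 280 = 2200
    v_total = v_active + v_blank     # e.g. 1080 + 45  = 1125
    return h_total * v_total * refresh_hz / 1e6

# Standard CEA-861 1080p60 timing:
clk = pixel_clock_mhz(1920, 280, 1080, 45, 60)
print(f"{clk:.1f} MHz")
```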



eidairaman1 said:


> i recall 1920x1200 being ideal



I love the cheapo 24" I use at home that runs 1920x1200. At work I have another 24" Samsung monitor that also runs 1920x1200. I only use it for testing because it has all 3 inputs (HDMI, DVI, VGA), so I just grab the cord I need rather than searching for adapters. Tonight we built a mid-range gaming PC for a customer, and his jaw dropped at the detail his games had through that monitor.


----------



## eidairaman1 (Oct 19, 2012)

The laptop I have runs at 1920x1200, actually. It's good for movies, and at the time it was decent for games, though I know it can't run anything but movies now.



Lazzer408 said:


> EDID shouldn't matter because one can force any resolution they want. If the TV has a 1920x1080 panel then a 1920x1080 custom resolution SHOULD fit dot-for-dot through HDMI.
> 
> ...


----------



## Steevo (Oct 19, 2012)

I just tested my Toshiba, and it is displaying 4:4:4 over HDMI. Using the testing tools, my camera at pixel level shows magenta-cyan-red in true color space down to the single pixel.


----------



## Kwod (Oct 19, 2012)

james888 said:


> I had the problem you described but I thought it was hdmi exclusive. I used the scaling tool in ccc and now the picture fills the entire screen just as I want.



Exactly. My old 4850, my current 6850, and my AMD-powered laptop all scale, via CCC, any PC signal correctly and proportionally to my 1024x768 plasma or my 1080p HDTV.
CCC has an overscan slider; that's the key.
I think on my 768p plasma I sent 720p over HDMI and then used CCC to scale it perfectly.

I have my 26in 1920x1200 PC LCD and my 1080p HDTV connected permanently: the LCD via DVI and the HDTV via HDMI. I use the HDTV for all DVD/HD and the LCD for games and the web, though I use the HDTV for web surfing as well.


----------



## Kwod (Oct 19, 2012)

Lazzer408 said:


> Wrong. I've seen MANY 1080i/p TVs that have 768 panels in them.



Do you live in Nigeria or something? These days nearly everything is FULL HD, aka 1920x1080p, i.e. the displays will deinterlace.
Yes, there are still some cheap 768p panels in both plasma and LCD, but usually only in the smallest and cheapest models, though I think Samsung, LG, and Panasonic still make 768p up to 50in on plasma, with an additional 4-5 1080p models on top.

1080p resolution is also superior to 768p. For example, my Bruce Springsteen Hyde Park Blu-ray looks sensational on either my 768p plasma or my 1080p LCD, but long-range fine detail is more visible on the 1080p screen. Still, I consider plasma to be a far superior TV technology.


----------



## Protagonist (Oct 19, 2012)

newtekie1 said:


> A TV advertised as 1080p uses a 1920x1080 panel. What you are talking about is 1080i TVs, they use 1366x768 panels. Technically, HD is 720p, so a 1366x768 panel is used and can be sold as an HDTV. Too many people assume that just because it says HDTV that it is 1920x1080, but technically anything 720p or greater can be sold as an HDTV.



Thanks, this makes a lot of sense, but honestly, here in Kenya HDTVs have the 1080p label on them even though most only have 1366x768.

I have heard of HDTVs with 1920x1080 being showcased in the country, but I'm yet to walk into a shop and find one. I know they are around, but they're very rare in my country unless you import one personally.


----------



## Lazzer408 (Oct 19, 2012)

Kwod said:


> CCC has an overscan slider, that's the key.



You shouldn't have to adjust anything. You shouldn't have to FORCE a 1920x1080 signal to fit a native 1920x1080 screen. Tweaking the overscan slider changes the 1920x1080 signal into something else your TV likes, to fit it on the screen. Maybe if you adjusted the horizontal and vertical one pixel at a time it might work, but the slider snaps to something very close that's unlikely to be dot-for-dot correct. This is math: 1.99999999999~ isn't 2. 



Kwod said:


> Do you live in Nigeria or something?.....these days, nearly everything is FULL HD aka 1920x1080p



Illinois. His statement was "A TV advertised as 1080p uses a 1920x1080 panel." That isn't ALWAYS true. Maybe "most" TVs "these days" are what they say, but let's not confuse people by saying ALL 1080 TVs have 1080 panels.


----------



## Protagonist (Oct 19, 2012)

Lazzer408 said:


> Illinois. His statement was "A TV advertised as 1080p uses a 1920x1080 panel" This isn't ALWAYS true. Maybe "most" TVs "These days" are what they say, but lets not confuse people by saying ALL 1080 TVs have 1080 panels.



True


----------



## Kwod (Oct 19, 2012)

Lazzer408 said:


> You shouldn't have to adjust anything. You shouldn't have to FORCE a 1920x1080 signal to fit a native 1920x1080 screen.



My main concern is having what appears to be a 1:1 pixel-perfect image, and that's what I get, so I'm not sure what the fuss is.


----------



## newtekie1 (Oct 19, 2012)

Lazzer408 said:


> Wrong. I've seen MANY 1080i/p TVs that have 768 panels in them. It was VERY common for early plasma sets. As long as it could display an image from a 1080 signal, they put 1080 on the box. It internally converted it to 768 but they won't tell you that.



I'd like you to show me one, because I've never seen a single TV advertised as 1080p that used a 1366x768 panel.  I've seen plenty of 1080i sets (early plasmas were notorious here), but never a 1080p.  Of course, nowadays they label the 1366x768 panels as 720p sets instead of 1080i, even if they are capable of accepting a 1080i signal (and most are).



Lazzer408 said:


> Illinois. His statement was "A TV advertised as 1080p uses a 1920x1080 panel" This isn't ALWAYS true. Maybe "most" TVs "These days" are what they say, but lets not confuse people by saying ALL 1080 TVs have 1080 panels.



Yes it is, at least in the U.S.  Even going years back, a 1080p set had to have a 1920x1080 panel.  The 1080i label was what was used on 1366x768 panels, as I said.  It confused people in the early days of HDTV because not many knew the difference between 1080i and 1080p.  And to add to the confusion, some set makers simply labeled them 1080, dropping the i, so you really had to read closely what the capabilities of the TV were when buying.  However, I have never seen a TV labeled 1080p that didn't use a 1920x1080 panel; they simply can't do it, as it's considered false advertising.  For a set to be labeled 1080p it has to be 1920x1080.

And I never said all 1080 TVs have 1080 panels; I said all 1080*p* TVs have 1080 panels, because they do.  1080*i* TVs used 1366x768 panels.


----------



## Lazzer408 (Oct 19, 2012)

newtekie1 said:


> I'd like you to show me one, because I've never seen a single TV advertised as 1080p that used a 1366x768 panel.  I've seen plenty of 1080i sets, early plasmas were notorious here, but never a 1080p.  Of course now a days they label the 1366x768 panels as 720p sets instead of 1080i, even if they are capable of accepting a 1080i signal(and most are).
> 
> 
> 
> ...



I quoted you as saying 1080p. I know it exists because I've owned one. I don't remember the brand (likely Samsung or Dell) or the model number. It was years ago, but I can clearly remember being pissed off about it. That's when my hatred for anything "HD" began, back when I paid an arm and a leg and got the shaft. They'll get it right some day, but it hasn't happened yet. Until then, a large majority of us are not getting what we paid for.


----------



## newtekie1 (Oct 19, 2012)

Lazzer408 said:


> I quoted you as saying 1080p. I know it exists because I've owned one. I don't remember the brand (likely Samsung or Dell) or the model number. It was years ago, but I can clearly remember being pissed off about it. That's when my hatred for anything "HD" began, back when I paid an arm and a leg and got the shaft. They'll get it right some day, but it hasn't happened yet. Until then, a large majority of us are not getting what we paid for.



Again, that is 1080i; 1080p is always a 1920x1080 panel.  1080i was 1366x768. You probably saw 1080 and assumed 1080p, but it was really 1080i.  When HDTVs first started to hit the consumer market they used plain "1080" a lot; you had to read closely to see that it was really 1080i using a 1366x768 panel.  But 1080p always means a 1920x1080 panel is used.  It was a common "trick" back then to slap 1080i, or just 1080, on the box in big letters on a TV that had a 1366x768 panel to get people's attention.  People would buy them expecting a 1920x1080 panel, and that wasn't what they got.


----------



## Lazzer408 (Oct 20, 2012)

newtekie1 said:


> 1080p is always a 1920x1080 panel.



So why didn't they call it 720i or "768i" if they really wanted to be honest about it? Point is, "1080(x)" was supposed to indicate a resolution of 1080 pixels. Whether it's progressive or interlaced shouldn't matter; that's just how it's drawn on the screen. Bottom line is they took advantage of people. Fortunately, I knew enough about it to find the actual panel resolution before buying a TV.

Back to the issue at hand. 

HDMI = no workee.
VGA = workee, with a forced resolution and analog losses.


----------



## eidairaman1 (Oct 20, 2012)

Lazzer408 said:


> So why didn't they call it 720i or "768i" if they really wanted to be honest about it? Point is, "1080(x)" was supposed to indicate a resolution of 1080 pixels. Whether it's progressive or interlaced shouldn't matter; that's just how it's drawn on the screen. Bottom line is they took advantage of people. Fortunately, I knew enough about it to find the actual panel resolution before buying a TV.
> 
> Back to the issue at hand.
> 
> ...



http://en.wikipedia.org/wiki/Interlaced_video

http://en.wikipedia.org/wiki/Progressive_scan


----------



## Mussels (Oct 20, 2012)

I've always had Samsung and LG HDTVs look correct.

You often need to change settings (force 16:9, disable overscan, use the specific HDMI port labelled DVI, etc.).

As for the size of the screen: anything over 42" tends to look like crap. 1080p can only be stretched so far without aliasing all the text.



newtekie: 1080i was never 1366x768.  1366x768 panels were replacements for 720p screens (they were cheaper to make, re-using 1024x768 production equipment). They supported 1080i as a requirement for HDMI certification/compatibility, but that does NOT mean they were 1080i screens; they just supported it as a fallback. (I know this because I own one. 1080i looks like ass if it's not the native res of the screen.)


----------



## newtekie1 (Oct 20, 2012)

Lazzer408 said:


> So why didn't they call it 720i or "768i" if they really wanted to be honest about it. Point is... "1080"(x) was supposed to indicate the resolution as being 1080 pixles. Wether it's progressive or interlaced shouldn't matter. That's just how it's drawn on the screen. Bottom line is they took advantage of people. Fortunatly I knew enough about it to find the actual panel resolution before buying a TV.



Well, 720i or 768i would both be incorrect, as the panels were capable of 720p/768p.  You need to look up what difference the i and the p make at the end of these terms, because it seems obvious you don't know.

You are arguing about 1080 in general, when I clearly said 1080*p* means 1920x1080.  I never once said that 1080i, or simply 1080, means 1920x1080; in fact I've directly said those often referred to panels that were actually 1366x768.  That letter at the end makes a huge difference, it's the reason you and others get taken, and it seems you still don't realize that.

And that leads us back to the original problem: do you actually have a 1080p TV, or is it a 1080i?  If it's a 1080i that really has a 1366x768 panel in it, then you're going to have the problems you're seeing, because you aren't using the native resolution.  The TV will accept the signal and then scale the image, making it look like crap.  Find the model numbers of your TVs and look up their actual panel specs; it isn't difficult.  I can tell you right now, from looking at the Sansui site, that they have two 24" TVs available: one has a native 1920x1080 panel and one is 1366x768.  Both will accept a 1080p signal, but only one is really a 1080p monitor.  If you got the 1366x768 one, it will look like crap unless you feed it a 1366x768 signal.


----------



## Mussels (Oct 20, 2012)

newtekie: you're confusing the fact that 768p screens work at 1080i with what they're sold as. 1080i screens did exist (if uncommonly). Just because 768p screens work at 1080i is no reason to call them 1080i screens; it just adds to the confusion.

Yes, you could get 768p screens advertised as 1080i-capable, but that's why they're called HD (720p/1080i) and Full HD (1080p).

The OP needs to find out whether his screen is HD or Full HD, and go from there. Even if it's Full HD/true 1080p, that doesn't mean it's going to look good. Cheap HDTVs often have crappy post-processing effects or overscan issues that make them useless for PC use.


----------



## Lazzer408 (Oct 20, 2012)

newtekie1 said:


> do you actually have a 1080p TV



Yes. 1920x1080 PROGRESSIVE works fine through the VGA port, but Windows didn't detect the display, so I had to create a custom resolution. It looks near perfect and fits dot-for-dot, but a digital signal through HDMI should look even better, and it doesn't.



Mussels said:


> newtekie: you're confusing the fact that 768p screens work at 1080i, with what they're sold as. 1080i screens did exist (if uncommon). just because 768p screens work at 1080i is no reason to call them 1080i screens, it just adds to the confusion.
> 
> 
> Yes, you could get 768p screens advertised as 1008i capable - but thats why they call them HD (720p/1080i) and full HD (1080p)
> ...



1920x1080 looks great using the VGA port on the TV. The HDMI input looks like crap and must be scaling somewhere between the input and the panel. There's no PC or game mode in the menu. I have 3 HDMI ports, but none are labeled HDMI (DVI) or HDMI (game), etc. I only noticed port 3 yesterday because it sits apart from the others, so I'll try it tomorrow. Maybe there's something special about it.


----------



## Mussels (Oct 20, 2012)

Lazzer408 said:


> Yes. 1920x1080 PROGRESSIVE works fine through the VGA port but Windows didn't detect the display. I had to create a custom resolution.
> 
> ...



use one of the DVI ports, and then look for the button that changes between 4:3/16:9/just scan/1:1 pixel, whatever it's called. Odds are one mode there will look correct (on my old Samsung it's Just Scan, on my housemate's new Samsung it's 16:9 - so it varies even within the same brand)

Once you've got that done, scour every last menu on the TV and in the video card drivers for over/underscan. AMD defaults to 15% underscan, for example.


----------



## newtekie1 (Oct 20, 2012)

Mussels said:


> newtekie: you're confusing the fact that 768p screens work at 1080i, with what they're sold as. 1080i screens did exist (if uncommon). just because 768p screens work at 1080i is no reason to call them 1080i screens, it just adds to the confusion.



1366x768 screens were often sold as 1080i TVs with 1080i advertised on the box. Yes, it was confusing and incorrect, but that is how it was. I'm not confusing this; they really were sold as 1080i TVs. However, 1080p has not been that way: when they say 1080p, the TV has a 1920x1080 panel.


----------



## Mussels (Oct 20, 2012)

newtekie1 said:


> 1366x768 screens were often sold as 1080i TVs with 1080i advertised on the box. Yes, it was confusing and incorrect, but that is how it was. I'm not confusing this; they really were sold as 1080i TVs. However, 1080p has not been that way: when they say 1080p, the TV has a 1920x1080 panel.



that would be a marketing problem in your area then.

They should have been advertised as 1080i capable (or 1080i inputs), with their native res listed somewhere.


----------



## newtekie1 (Oct 20, 2012)

Mussels said:


> that would be a marketing problem in your area then.
> 
> They should have been advertised as 1080i capable (or 1080i inputs), with their native res listed somewhere.



Oh, the native res was always listed somewhere, but that didn't stop them from slapping huge 1080i stickers on the packaging and advertising them as 1080i TVs. The front of the box would have huge 1080i logos on it, and the real native resolution would be listed in a small table of specs on the side of the box. But that was the beginning of the HDTV era; now that trick doesn't really work anymore, so they label the TVs 720p, and the term 1080i has pretty much died.


----------



## Mussels (Oct 20, 2012)

Well, it's irrelevant since it works at 1080p over VGA. We just need to focus on the settings that are screwing with his HDMI.


----------



## Lazzer408 (Oct 21, 2012)

The 3rd HDMI input is the same as the others.

To show the problem that's happening, I've set up the TV's HDMI and VGA inputs as cloned multi-monitor outputs, both running 1920x1080p. I have both a VGA cable and an HDMI cable running from an HD7770 to the TV, so I can use the TV remote to switch between the two and compare. To get the HDMI signal to fit the screen, I have to use ATI's underscan feature found in CCC.

There may be some moire distortion from my camera. Please ignore that.
The images are as follows.

First image: HDMI 1080p full screen.
Second image: HDMI 1080p close up of problem with fine lines.
Third image: VGA 1080p full screen
Fourth image: VGA 1080p close up of fine lines properly displayed.
Fifth image: 1920x1080 test pattern. View full screen to test your display. It should look like the fourth image. If not, you're scaling.
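If anyone wants to reproduce a pattern like the fifth image, a single-pixel grille is easy to generate. Here's a Python sketch that writes one as a binary PPM (the filename is arbitrary; any viewer that shows it full screen unscaled will do):

```python
# Sketch: generate a 1920x1080 single-pixel test pattern (alternating
# black/white vertical lines) as a binary PPM. On a dot-for-dot display
# the lines stay crisp; if the signal is being scaled anywhere, the
# pattern turns into grey mush or moire instead.

def make_pattern(path, width=1920, height=1080):
    row = bytearray()
    for x in range(width):
        v = 255 if x % 2 == 0 else 0     # alternate white/black columns
        row += bytes((v, v, v))          # R, G, B per pixel
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (width, height))
        f.write(bytes(row) * height)     # every row is identical

make_pattern("scaling_test.ppm")
```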


----------



## qubit (Oct 21, 2012)

Lazzer408 said:


> ...that's coming from a PC?
> 
> I recently purchased two HDTVs.  A 42" Hisense and a 24" Sansui.  Most computer monitors, as you know, will look like $hit unless it's running at native resolution. At native, the image is nice and sharp because every pixel generated lands on a pixel in the LCD panel.  LCD TVs, on the other hand, seem to -ALL- look like crap and need lots of tweaking and manual adjustment to get it to look 'ok' at best. Why is this? Do HDTVs use some odd-ball LCD panels that aren't actually 1920x1080? Everything to date leaves a black border around the desktop when using HDMI set to 1920x1080@60. It also looks as if the sharpness is cranked to infinity, leaving the image very grainy, and the contrast/gamma makes everything washed out. I saw that you also had problems with nvidia. However, did you set the driver scaling to 1:1? (It's got the option for it)
> 
> ...



I've not read the whole thread, so I'm sorry if I've duplicated anything here.

There seem to be two problems here. Firstly, it's the TVs themselves, as they seldom seem to display a correct 1080p PC picture without some fiddling and faffing, even when connecting HDMI-to-HDMI. This includes TVs with a true 1920x1080 resolution. The two most common problems are that the picture isn't mapped properly 1:1 to the display pixels, and that the sharpness tends to be off.

The second problem appears to be with AMD drivers, yet again. With my nvidia cards, I can achieve the correct 1:1 pixel mapping by tweaking the TV and video drivers, getting a good, sharp picture. However, with AMD cards, I can never get them to map 1:1, giving a frankly shit picture. This was true of discrete graphics cards and of my laptop with AMD graphics and an HDMI port. It's a really stupid problem, so I dunno why it continues to be a problem at all. Surely, just generating a virgin 1080p signal and displaying it 1:1 without pointless post-processing is the easiest thing?! :shadedshu

Now, it's been a good year or so since I tried this, but from what I see here, the problem still isn't resolved and with the way AMD is going these days, I'm not surprised.

In the case of your specific TVs, if they weren't 1080p then that would explain why the picture looked crap, even though they were technically "compatible" with it.


----------



## Mussels (Oct 21, 2012)

qubit said:


> I've not read the whole thread, so I'm sorry if I've duplicated anything here.
> 
> There seem to be two problems here. Firstly, it's the TVs themselves, as they seldom seem to display a correct 1080p PC picture without some fiddling and faffing, even when connecting HDMI-to-HDMI. This includes TVs with a true 1920x1080 resolution. The two most common problems are that the picture isn't mapped properly 1:1 to the display pixels, and that the sharpness tends to be off.
> 
> ...




Lazzer: hunt through the video menus and try everything. There's got to be some setting (game mode, or similar) to turn off the post-processing. Hell, maybe it's just the sharpness option.

qubit: the AMD problem is the overscan in the CCC. It's common, but easily fixed. Some TVs do look crap no matter what you do.


----------



## qubit (Oct 21, 2012)

Mussels said:


> qubit: the AMD problem is the overscan in the CCC. It's common, but easily fixed. Some TVs do look crap no matter what you do.



Thanks M, I might try another fiddle, lol, even though I'm sure I had a go with that at the time. Weirdly, you don't get any of this shit when connecting AMD cards to 1080p monitors. Why a 1080p monitor with a tuner and remote control, then called a "TV" should be so different is a mystery to me.


----------



## Mussels (Oct 21, 2012)

qubit said:


> Thanks M, I might try another fiddle, lol, even though I'm sure I had a go with that at the time. Weirdly, you don't get any of this shit when connecting AMD cards to 1080p monitors. Why a 1080p monitor with a tuner and remote control, then called a "TV" should be so different is a mystery to me.



because true HDMI devices don't have EDID info. They just assume the signal is one of three modes (720p, 1080i, 1080p).

Then the manufacturers add in their image 'enhancements' to make video look better, plus overscan to compensate for something that doesn't exist.
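For what it's worth, on the PC side a monitor's "preferred mode" comes from the first 18-byte detailed timing descriptor in its EDID. Here's a rough sketch of how the resolution fields of one decode; the sample bytes are hand-built for CEA-861 1080p60, not dumped from a real TV:

```python
# Sketch: decoding resolution and pixel clock from an EDID detailed
# timing descriptor. Only the first 8 of its 18 bytes are needed for
# that; sync widths and offsets in the remaining bytes are skipped.

def parse_dtd(d):
    clock_mhz = (d[0] | d[1] << 8) / 100           # stored in 10 kHz units, LE
    h_active = d[2] | (d[4] >> 4) << 8             # low 8 bits + high nibble
    h_blank  = d[3] | (d[4] & 0x0F) << 8
    v_active = d[5] | (d[7] >> 4) << 8
    v_blank  = d[6] | (d[7] & 0x0F) << 8
    refresh = clock_mhz * 1e6 / ((h_active + h_blank) * (v_active + v_blank))
    return h_active, v_active, round(refresh, 2)

# Hand-constructed descriptor for 1080p60 (148.5 MHz, 2200x1125 total).
dtd = bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40])
print(parse_dtd(dtd))  # (1920, 1080, 60.0)
```

If a TV's HDMI input really ignores or omits this block, the PC can only guess at the panel, which fits the symptoms described in this thread.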


----------



## qubit (Oct 21, 2012)

Mussels said:


> because true HDMI devices don't have EDID info. They just assume the signal is one of three modes (720p, 1080i, 1080p).
> 
> *Then the manufacturers add in their image 'enhancements' to make video look better, plus overscan to compensate for something that doesn't exist.*



What can I do but facepalm? :shadedshu


----------



## Lazzer408 (Oct 21, 2012)

Mussels said:


> the manufacturers add in their image 'enhancements' to make video look better



That's exactly it, and usually there's a game mode or PC mode in the TV's settings, or even a particular port designed for 1:1. What's aggravating are the 'enhancements' fiddling with the resolution/scaling/sampling/scanning/whatever itself. A 1080 signal should match dot4dot to a 1080 panel REGARDLESS of any enhancements.

I've contacted the manufacturer and will hear back from them Monday. Being "a retailer who's considering their products" gets me a little more attention from them. I told them: fix the issue and I'll resell your products, but I can't sell defective products.


----------



## techpun (Oct 22, 2012)

*DLP TV's..*

Do they fall into this category? I always thought my text was blurry and unreadable from far distances because of the DLP, but it's good to know it may be because I have an ATI vid card.


----------



## Mussels (Oct 22, 2012)

techpun said:


> Do they fall into this category? I always thought my text was blurry and unreadable from far distances because of the DLP, but it's good to know it may be because I have an ATI vid card.



the AMD bug is a simple slider in the drivers.


----------



## Lazzer408 (Oct 22, 2012)

techpun said:


> Do they fall into this category? I always thought my text was blurry and unreadable from far distances because of the DLP, but it's good to know it may be because I have an ATI vid card.



Yes.  A DLP also needs a dot4dot native signal to be sharp, BUT you will have to underscan your image anyway to make it fit the screen, so dot4dot won't help you. A DLP ALWAYS overscans the projected image so the screen is completely filled.



Mussels said:


> the AMD bug is a simple slider in the drivers.



What bug?  1920x1080 is 1920x1080. If there's a "bug" it's probably the TV's fault.



Here's what the manufacturer had to say about my issue.

"Sir,
           Thank you for your comments.  I understand your question and have heard this from other people as well, but unfortunately what you are trying to accomplish is not what this TV was designed or intended for.  Hisense clearly states in the owner’s manual that this unit is intended for use as an LCD TV, not a monitor.  

Thank you"


No name or anything. Man, what a cop-out.  So others are having the same problem.  Since Hisense is a manufacturer for other brands as well, I wonder how many other TVs have the problem.

I read about a guy who walked into a TV store with his laptop and told the salesman that if any of the TVs would properly display his laptop without tweaking, he would buy it. NONE of them did.


----------



## newtekie1 (Oct 22, 2012)

Lazzer408 said:


> What bug?  1920x1080 is 1920x1080. If there's a "bug" it's probably the TV's fault.



AMD drivers have a bug that sets the over/underscan option to a value other than 0 when using HDMI with an HDTV, so there is a black border around the image on the TV and the image looks like crap. That is what he is talking about.
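To put numbers on why that border also makes things blurry: with underscan, the 1920x1080 desktop is rescaled into a smaller box, so desktop pixels stop landing on exactly one panel pixel each. A quick sketch (the 8% and 15% figures are just the values mentioned in this thread):

```python
# Sketch: why any non-zero underscan ruins sharpness. The desktop is
# rendered at 1920x1080 and then rescaled into a smaller box, so most
# desktop pixels no longer map 1:1 onto panel pixels.

def underscanned_size(width, height, percent):
    scale = 1 - percent / 100
    return round(width * scale), round(height * scale)

for pct in (0, 8, 15):
    w, h = underscanned_size(1920, 1080, pct)
    note = "dot-for-dot" if pct == 0 else "rescaled, blurry"
    print(f"{pct:2d}% underscan -> {w}x{h} ({note})")
```

At 15% the 1920 desktop columns get squeezed into roughly 1632 panel columns, a non-integer ratio, which is exactly the grainy/soft look being described.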


----------



## qubit (Oct 22, 2012)

newtekie1 said:


> AMD drivers have a bug that sets the over/underscan option to a value other than 0 when using HDMI with an HDTV, so there is a black border around the image on the TV and the image looks like crap. That is what he is talking about.



You'd think they could fix a tiddly little problem like that, wouldn't you?


----------



## Lazzer408 (Oct 23, 2012)

Someone mentioned they fixed that in 12.8. That's easy enough to adjust to zero, but at zero the image is too big for the screen.

I'm interested to hear what Hisense says about this issue. If they are telling me "unfortunately what you are trying to accomplish is not what this TV was designed or intended for" then I guess their 1080p TV is not intended to display a 1920x1080 signal? Idiots. I won't stop until they fix it. >.<


----------



## eidairaman1 (Oct 23, 2012)

guess that means a Monitor works better than a TV despite the Pixels on screen being the same


----------



## Lazzer408 (Oct 23, 2012)

I guess. How many companies use TVs as displays for marketing and conferencing?  It should be "standard" for any 1080p TV to -correctly- play 1080p content...right?  I mean, my car starts when I turn the key. That's fairly standard in the automotive industry.


----------



## newtekie1 (Oct 23, 2012)

Lazzer408 said:


> That's easy enough to adjust to zero but at zero it's too big for the screen.



That's messed up.  It shouldn't do that unless the image is being scaled by the TV.


----------



## Lazzer408 (Oct 23, 2012)

newtekie1 said:


> That's messed up.  It shouldn't do that unless the image is being scaled by the TV.



IT IS!

Thanks for reading.


----------



## Mussels (Oct 23, 2012)

Lazzer408 said:


> Someone mentioned in 12.8 they fixed that. That's easy enough to adjust to zero but at zero it's too big for the screen.
> 
> I'm interested to hear what Hisense says about this issue. If they are telling me "unfortunately what you are trying to accomplish is not what this TV was designed or intended for" then I guess their 1080p TV is not intended to display a 1920x1080 signal? Idiots. I won't stop until they fix it. >.<



If 0 is too big for the screen, your screen has overscan enabled, and that's the cause of your image problems. If you don't have it at zero in the AMD drivers, you will NEVER fix the image problems.


----------



## Lazzer408 (Oct 23, 2012)

I can't change the fitment options on the TV when using HDMI, so whatever they did, it's fixed at overscan.


----------



## Mussels (Oct 23, 2012)

Lazzer408 said:


> I can't change the fitment options on the TV when using HDMI, so whatever they did, it's fixed at overscan.



It's buried in there somewhere. On my Samsung there's a button on the remote to change the aspect ratio - 4:3, 16:9, zoom, etc. Try that in every mode and see if one works. And to reinforce it - with AMD overscan at 0.


----------



## newtekie1 (Oct 24, 2012)

On all of my HDTVs the only way to change the setting is a button on the remote. I can't get to the option through the normal menu system.


----------



## eidairaman1 (Oct 24, 2012)

Lazzer408 said:


> I guess. How many companies use TVs as displays for marketing and conferencing?  It should be "standard" for any 1080p TV to -correctly- play 1080p content...right?  I mean, my car starts when I turn the key. That's fairly standard in the automotive industry.



NTSC/PAL/SECAM

http://en.wikipedia.org/wiki/NTSC

http://en.wikipedia.org/wiki/PAL

http://en.wikipedia.org/wiki/SECAM

VS

http://en.wikipedia.org/wiki/Computer_display_standard


----------



## Lazzer408 (Oct 24, 2012)

eidairaman1 said:


> NTSC/PAL/SECAM
> 
> http://en.wikipedia.org/wiki/NTSC
> 
> ...



What does NTSC etc. have to do with HD resolutions?


----------



## Mussels (Oct 24, 2012)

Lazzer408 said:


> What does NTSC etc. have to do with HD resolutions?



The HDTVs sold in each region still need to support them, so pointless 'optimisations' get added in that shouldn't be needed.


----------



## Techtu (Nov 12, 2012)

Can I jump in please? 

I used to run my TV just fine until I moved to AMD... Now I also have the black border.

Thing is, sometimes it's there and other times it isn't. One game where it always goes away is Sleeping Dogs, but I have to game at 1366x768 to achieve that.

HELP


----------



## Mussels (Nov 13, 2012)

Techtu said:


> Can I jump in please?
> 
> I used to run my TV just fine until I moved to AMD... Now I also have the black border.
> 
> ...



It's a slider in the drivers. It's been mentioned heaps of times in this thread alone.


----------



## BUCK NASTY (Nov 13, 2012)

Mussels said:


> It's a slider in the drivers. It's been mentioned heaps of times in this thread alone.


Thanks, guys! The slider was set to 8%, so I got a nice screen-size boost by setting it to zero.


----------

