# HDTVs and PCs - Info and help thread



## Mussels (Dec 28, 2011)

*Plasma Vs LCD*
I've never owned a plasma, so my information here is second-hand. The short version is that while plasmas can look better (they don't have to run at their native resolution to look their best), they still often fudge what resolution they actually run at, and may not even be widescreen (I've personally seen dozens of 1024x768 plasmas advertised as '1080p capable')

Most of this article is about LCD HDTVs (the most common type), so if you have a plasma, parts of it may not apply.

*Size of the screen:*

Size means nothing on its own. Just because a screen is bigger doesn't mean it takes more CPU or GPU power to run.

Even Intel's integrated GPUs can manage a 56" plasma TV without breaking a sweat these days, yes, even for 1080p movie playback. (This belongs in a separate thread, but even my 1.6GHz Intel Atom netbook can play 720p HD content fine - it's really not very demanding, and it uses the CPU almost exclusively.)

HDTVs work fine for PC gaming, but you need to be aware of motion sickness - which is one reason console games often use a different FoV (field of view) to PC games.

*Resolution (720p, 768p, 1080p etc):*
Resolution does matter, but TV manufacturers lie. Just because a TV accepts a 1080p input doesn't mean it actually has a 1080p panel - it could be a 720p or 768p panel that takes the 1080p signal and discards/shrinks part of the image to make it fit. This is quite common on those 'too good to be true' bargain-priced 1080p HDTVs you see heavily discounted, and pretty obvious when you actually try a PC on the TV and find the image is blurry and grainy (especially on text) no matter what you do.
If your TV just looks blurry no matter what you do, this is probably why: you'll never be able to select the panel's true native resolution, so it will always look bad. Sometimes using a different connection (VGA instead of HDMI) can yield better results.
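To see why a non-native signal can never look sharp, here's a rough sketch (a minimal illustration of the arithmetic, not any TV's actual scaler) of what happens when a 1080-line signal is squeezed onto a 768-line panel:

```python
# Mapping 1080 source lines onto a 768-line panel: the ratio (768/1080)
# is not a whole number, so almost every panel row falls between two
# source lines and has to be blended from both - which is the blur you see.

source_lines = 1080
panel_lines = 768

# Panel rows that line up exactly with a source line, using integer maths
# to avoid floating-point fuzz: row * 1080 / 768 must be a whole number.
exact_hits = sum(
    1 for row in range(panel_lines)
    if (row * source_lines) % panel_lines == 0
)

print(f"scale factor: {panel_lines / source_lines:.3f}")
print(f"panel rows aligned exactly with a source line: {exact_hits} of {panel_lines}")
```

Only 24 of the 768 rows line up exactly; every other row is interpolated from two source lines, so fine detail like text can never be pixel-sharp.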


*Input (HDMI, DVI, VGA etc)*
Use HDMI when possible, and only try other inputs for troubleshooting. HDMI is the only one that carries audio and video in a single cable, so it's the technically superior (and most common) choice.

It's worth noting that some TVs treat their HDMI ports differently - my Samsung has three, yet only one (HDMI 2) supports its native 1366x768 resolution. The rest do 720p and 1080p, and even forcing 1366x768 looks blurry because the TV scales it down to 720 automatically.

Because of these oddities, I've found my 40" 1366x768 Samsung actually looks better than most modern cheap 1080p TVs - despite their pixel density advantage, they still look blurrier and very 'soft' for text in Windows.

As mentioned earlier, trying other connections to troubleshoot can be worth it. The TV may stick to the HDMI spec resolutions despite its non-spec panel, and VGA might be the only way to get a decent, unmodified image at the panel's native resolution. Make sure to test all resolutions, not just 720p/1080p.

*Overscan*
TVs, monitors, and video cards just *love* enabling overscan by default. This crops off the edges of your image, and is a bad idea for a PC user. Disable it first, before messing around with other settings - it can also be a cause of a blurry image.
On my Samsung 23.6" monitor this was called "AV mode", and it caused me all sorts of headaches while setting up the monitor on HDMI.
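As a rough illustration of the cost (5% per edge is a typical default, not a standard figure - actual TVs vary):

```python
# What a typical ~5% overscan throws away on a 1920x1080 desktop.
# The TV zooms the picture so the outer edges hang off-screen, and the
# remainder gets rescaled - so you lose screen area AND gain blur.

width, height = 1920, 1080
overscan_pct = 5  # percent cropped per axis; varies by TV

visible_w = width * (100 - overscan_pct) // 100
visible_h = height * (100 - overscan_pct) // 100
lost = width * height - visible_w * visible_h

print(f"visible desktop: {visible_w}x{visible_h}")
print(f"pixels cropped: {lost} ({100 * lost // (width * height)}% of the screen)")
```

Nearly a tenth of the desktop (including the taskbar and window edges) simply falls off the screen, which is why disabling overscan should always be the first step.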


*Image "enhancements"*
Manufacturers (I'm glaring at you, Sony) love adding things to make TV and movies look better (arguably), but for an uncompressed digital signal such as a PC or game console, they often make things look much, much worse.

So look for any settings like denoise, mosquito noise reduction, deblocking or digital noise reduction (DNR), and turn them off. A good test I found is to display a still image of a solid colour (I like black) and look for artifacting - DNR on a Sony 42" I tested gave the black a repeating pattern that looked like a game of Tetris in progress, and it was clearly visible in dark scenes in TV shows and movies as well.


These enhancements can also increase response times, causing delay and lag for gaming. Some HDTVs (like my Samsung) have a "Game Mode" which forces all enhancements off to minimise lag and make the TV better suited to gaming on PC/console.

*Refresh rates - 50/60/120/200/400hz etc*

Another misleading term that Sony loves to advertise - the TV repeats (or interpolates) frames to make some content look smoother, most noticeably on low-framerate sources like interlaced TV or movies (DVD, in other words). But over HDMI or VGA you're stuck sending 60Hz, no matter what, so don't get tricked into thinking '400Hz' means 400Hz will show up as an available refresh rate.
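The smoothing trick is easiest to see with film content. A 24fps movie doesn't divide evenly into a 60Hz panel, so frames get repeated in a 3:2 pattern - the classic "pulldown". This sketch shows the arithmetic only, not any particular TV's implementation:

```python
# 24fps film on a 60Hz panel: 60 / 24 = 2.5, so no frame can be shown a
# whole number of times. Classic 3:2 pulldown alternates showing frames
# 3 times and 2 times. "200Hz"/"400Hz" TVs interpolate extra in-between
# frames instead, but the input signal over HDMI is still capped at 60Hz.

film_fps, panel_hz = 24, 60

pattern = [3 if i % 2 == 0 else 2 for i in range(film_fps)]

print("repeat counts for the first 8 frames:", pattern[:8])
print("refreshes used per second:", sum(pattern))  # adds up to panel_hz
```

The uneven 3-2-3-2 cadence is the judder that motion smoothing tries to hide - but whatever the box says, the extra frames are manufactured by the TV, not delivered by your PC.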

*LED HDTVs*
Just to clarify a common misconception: 'LED' refers to the lighting behind the panel, replacing the old CCFL method. It can be brighter, and it definitely saves power and runs cooler - but it means nothing for image quality.

A good five-year-old HDTV can look better than a cheap modern LED HDTV. Keep that in mind when purchasing: just because the box says LED 1080p doesn't mean it's a quality screen.



Here is an example photo of an HDTV that would be utterly horrible for HTPC use - see what you can find wrong with it:


----------



## Mussels (Dec 29, 2011)

quite a few likes, but no suggestions on what to expand upon/add next? c'mon guys!


----------



## BumbleBee (Dec 29, 2011)

driver pictorial?


----------



## Mussels (Dec 29, 2011)

BumbleBee said:


> driver pictorial?



I can only cover ATI/AMD - would need someone else to provide NVIDIA/Intel pics.


----------



## BumbleBee (Dec 29, 2011)

troubleshooting?

wookie blues?


----------



## Black Panther (Dec 29, 2011)

LCD vs LED / OLED?


----------



## CJCerny (Dec 29, 2011)

It's been my experience that an HDMI input on a TV isn't necessarily the "best" input to use, at least when it comes to connecting a PC. Many TVs have a "PC Input" connector - typically a VGA connector and a 3.5mm stereo audio jack - that results in a more pleasing display. Now, before we all get flaming crazy, let me explain... all TVs do some processing on their HDMI inputs. That processing is optimised for input from video game consoles, Blu-ray players, etc. The result is that desktop text displayed over an HDMI input is much less sharp than it is over the dedicated PC input, to which none of that processing is applied. It is also likely that the "PC Input" only accepts a much lower resolution (typically 1366x768) than the HDMI input does (1080p). Some models surface some or all of the processing options - some don't surface much of anything.

Long story short... if your TV has a "PC Input", try it and compare it to the HDMI input. There's no harm in doing that; stick with the one you like better. Every TV model is probably a little different, depending on what the TV's firmware does with each input and how much you can control it. Don't worry about a resolution trade-off if you find one input more visually pleasing than the other.


----------



## dank1983man420 (Dec 29, 2011)

Mussels said:


> quite a few likes, but no suggestions on what to expand upon/add next? c'mon guys!



How about refresh rates (and how many manufacturers cheat with their 120Hz and 240Hz refresh rates)

Response times and how they affect input lag while gaming

Also maybe mention how to tell an S-IPS panel from a plain TFT based on the shape of the pixels, etc.


----------



## digibucc (Dec 29, 2011)

CJCerny said:


> It's been my experience that using a HDMI input on a TV isn't necessarily the "best" input to use, at least when it comes to connecting a PC. Many TV's have a "PC Input" connector on them



Wow. I've laughed every time I saw that VGA port on my TV - why on earth would I use that when I've got HDMI? Not that I use the audio; it goes to the receiver from the PC sound card anyway 

Well, as you said, it doesn't hurt to try - and I've got to agree with you. At least on this TV (I'll try the other one later) it is much crisper, and the colours are finally right! I have fought with the settings to get the colours right over HDMI since day one. I really can't believe I never even tried it - good advice!  I'll try some video later, but really I use it more often for lazy browsing


----------



## CJCerny (Dec 29, 2011)

Yeah... here's the thing... I've messed with at least a dozen brand new TVs in the past year - both low end and high end - and I haven't run into one yet on which I could get a nice sharp desktop display over an HDMI connection, despite trying all the driver and TV-based adjustments available to me. People need to be aware that this may very well be an issue for them too if they want to use their TV as a monitor.


----------



## digibucc (Dec 29, 2011)

Yeah, I couldn't understand why 1080p didn't look as clear (accounting for size) on the TV as it did on the monitor. Now it does  I feel stupid, but at the same time you would think that since HDMI can handle it, the TV manufacturers wouldn't screw it up.

My monitors have HDMI and they look great with it, so you're definitely right imo that it's the picture processing the TV does on its HDMI inputs.


----------



## Goodman (Dec 29, 2011)

Mussels said:


> quite a few likes, but no suggestions on what to expand upon/add next? c'mon guys!



Well... what about pics (text, video, etc...) to show the difference between an HDTV (real 1080p) and a PC monitor?

Even if it's only a 720p HDTV, it would still give us a better idea of what to expect 

Thanks!



CJCerny said:


> It's been my experience that an HDMI input on a TV isn't necessarily the "best" input to use, at least when it comes to connecting a PC. Many TVs have a "PC Input" connector - typically a VGA connector and a 3.5mm stereo audio jack - that results in a more pleasing display. Now, before we all get flaming crazy, let me explain... all TVs do some processing on their HDMI inputs. That processing is optimised for input from video game consoles, Blu-ray players, etc. The result is that desktop text displayed over an HDMI input is much less sharp than it is over the dedicated PC input, to which none of that processing is applied. It is also likely that the "PC Input" only accepts a much lower resolution (typically 1366x768) than the HDMI input does (1080p). Some models surface some or all of the processing options - some don't surface much of anything.
> 
> Long story short... if your TV has a "PC Input", try it and compare it to the HDMI input. There's no harm in doing that; stick with the one you like better. Every TV model is probably a little different, depending on what the TV's firmware does with each input and how much you can control it. Don't worry about a resolution trade-off if you find one input more visually pleasing than the other.



See my Thread --->http://www.techpowerup.com/forums/showthread.php?t=157595

What you just wrote got me a step closer to buying a bigger monitor rather than the 1080p HDTV I was looking at, but I'd still like to see some pics or video comparing the two before I make my final decision


----------



## digibucc (Dec 29, 2011)

Goodman said:


> Well... what about pics (text ,video ,etc...) to see the difference between HDTV (real 1080) vs PC monitor?



If you look at the pictures on your PC monitor they won't mean anything - unless I'm missing your point? i.e. you can't see the difference between 1080p and 720p on a 720p monitor. edit: but on a 1080p or higher monitor you could, so that may be it. nvm.
And "real 1080" vs PC monitor? I think you mean a real "HDTV", but it's not some magical feature. If your display has 1080 (or even 720) vertical lines of resolution, you have high-definition capability, whether there's an HDTV stamp on the box or not.


----------



## Mussels (Dec 30, 2011)

I did cover HDMI vs VGA in the article/guide, and why this happens - it's up/down scaling to weird panel resolutions to fit the HDMI specs.


Refresh rates is a good one to cover, but some of the rest can't be done.


How can I show you a picture? Take photos? Your own monitor would change how it looks then, people on small-res screens wouldn't see much difference, etc. I'll look into it, but it might not be doable. I should at least be able to find some example images of how overscan works.


LED is a simple one to add in.


edit: mostly done.

Also, lol at 'real 1080' - PC monitors are the ones really doing it, because they actually give you 1920x1080 panels - not 1024x768 and a downscaler like a lot of cheap HDTVs (especially plasmas)


And someone said you can tell panels apart by the shape of the pixels? Wth? Never heard of that one before.


----------



## Mussels (Dec 30, 2011)

damnit i added it all in, then firefox erased it >.>

time to retype


----------



## digibucc (Dec 30, 2011)

There are different pixel shapes, but I don't know anything about them.

As for HDMI, you should be a little clearer that it's the obvious choice for an HTPC, but for fine text, if you don't need audio in the same cable, VGA may work better. I don't think it's just scaling - I tried many variations of zoom (scaling on the TV) and CCC scaling from the GPU, and none of them looked as good as VGA on the first try. I really think it has to do with colour and contrast filters or something similar. 

Now to just get rid of the tearing with HD movies  hint hint


----------



## Mussels (Dec 30, 2011)

Yeah, but VGA adds its own problems, since it can't do 1:1 pixel mapping. Everything can be set up perfectly and still a few pixels will blur or shift, since it's an analogue signal.


note to self: mention aspect ratio/1:1 mapping options


----------



## Goodman (Dec 30, 2011)

digibucc said:


> and "real 1080" vs pc monitor? i think you may mean real "hdtv" , but it's not some magical setting. if your display can do above 1080 vertical lines you have High definition video capabilities, whether there is an hdtv stamp or not.



Of course I meant HDTV when I wrote "real 1080" - what else could it be?
I'm simply saying that 720p & 1080p are both HDTV, & I don't care about 720p HDTVs - only the ones that can do 1080p (1920x1080 resolution), as I would use it as a desktop monitor

What I would like to see is something like a small video or pictures of a PC monitor & an HDTV plugged into a computer running everything at the same time, to see the difference between a TV & a monitor, especially with text

A long time ago (~10 years) I tried using a CRT TV for games, but text was unreadable & the resolution was really low & bad 

I would try that myself right now, but all 5 TVs in my house are CRTs & the only LCD I own is my PC monitor - that is why I want to see the difference between the two 
Now that TVs (HDTVs) can do 1920x1080 resolution, I think they might finally be a good choice for desktop use, maybe?


----------



## Mussels (Dec 30, 2011)

Goodman: you meant "real HDTV's, at 1080p" where we read "real 1080p" vs "false 1080p"


----------



## Goodman (Dec 30, 2011)

Mussels said:


> Goodman: you meant "real HDTV's, at 1080p" where we read "real 1080p" vs "false 1080p"



Well yeah! OK...
HDTV *at* 1080p vs PC monitor 

Anyway, as far as I know 720p is also HDTV, & I don't care much about 1366x768 resolution - that is why I am asking about HDTVs that can do 1920x1080 for PC use

----------



## Mussels (Dec 30, 2011)

A PC monitor will always look better due to pixel density: the same resolution over a larger area = bigger pixels, worse quality at the same distance.


Getting a big screen at 1080p just means you'd better sit further back.
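The pixel-density point is easy to put numbers on. A minimal sketch (the screen sizes are just examples):

```python
# Pixel density (PPI): the same 1920x1080 spread over a bigger diagonal
# means bigger pixels, which is why a 40" TV looks coarser than a 24"
# monitor at the same viewing distance.
import math

def ppi(h_pixels: int, v_pixels: int, diagonal_inches: float) -> float:
    """Pixels per inch for a given resolution and diagonal size."""
    return math.hypot(h_pixels, v_pixels) / diagonal_inches

for size in (24, 32, 40):
    print(f'{size}" 1080p screen: {ppi(1920, 1080, size):.0f} PPI')
```

At 1080p, a 24" monitor works out to roughly 92 PPI while a 40" TV is around 55 PPI - the same pixels, spread over nearly three times the area.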


----------



## Goodman (Dec 30, 2011)

Mussels said:


> PC monitor will always look better due to pixel density. same res over larger area = bigger pixels, worse quality at same distance.
> 
> 
> getting a big screen at 1080p just means you'd better sit back further.



I know that, & that's why I wouldn't go too big for PC use. I think 32" is the biggest someone should use (with the exception of a 3-monitor setup). It's just that prices on 30"+ monitors are f*in insane  & a 27" is expensive but might be worth getting into debt for a few months...

The thing is, now that most 32" HDTVs can do 1920x1080 resolution & are "cheap" to buy - less than half the price of a 30" monitor - I figure a good 32" HDTV would be good enough, & I probably won't see much of a difference between the two, if any?

Anyhow, if things keep going this way, monitor prices will have to drop big time, as HDTVs are getting much closer to what a monitor can do quality- and resolution-wise. I guess in about 3-5 years there will be no difference whatsoever between a monitor & a TV except for price, maybe?


----------



## Mussels (Dec 30, 2011)

Goodman: I've covered it a few times - the problem is that these cheap 1080p TVs still have crap image quality. They use cheap panels, so 1080p does not mean they look as good as a PC monitor would at 1080p.

The heavily discounted prices on the budget models come at the cost of image quality, as several people in the thread have already stated.


----------



## AlienIsGOD (Dec 30, 2011)

http://www.samsung.com/ca/consumer/...dex.idx?pagetype=prd_detail&tab=specification - that's my TV, and I find the image quite pleasing through HDMI. I have never bought an HDTV before, so I don't even know if that's a half-decent Samsung model.


----------



## keling (Jan 26, 2012)

Nice thread. I have been using the living room's 42" HDTV occasionally and am currently looking for one for my PC. I enjoyed the big screen for flight sims and racing games, but for multiplayer FPS I do feel it's hindering me.

Is there any list of recommended HDTVs from the TPU forum? A poll of the users, perhaps?


----------



## cdawall (Feb 12, 2012)

Do you want some extra plasma info? I currently run a plasma (720p) for my home theatre and an LCD (1080p) in my bedroom.


----------



## BumbleBee (Feb 12, 2012)

this is the guide of all guides when it comes to televisions.

http://www.tweakguides.com/HDTV_1.html


----------



## Mussels (Feb 12, 2012)

All info is good - I just find that threads like these help consolidate it and stop the same questions being asked over and over again.


----------



## TheoneandonlyMrK (Feb 12, 2012)

Mussels said:


> quite a few likes, but no suggestions on what to expand upon/add next? c'mon guys!



I use a 32" Sony Bravia and read your guide with interest. Most of it I knew already, but it's good to have someone else agree with my thoughts. As far as comments and improvements go:

my HDTV has 4 available pixel formats

YCbCr 4:4:4
YCbCr 4:2:2
RGB 4:4:4
RGB 4:2:2

It always auto-sets itself to YCbCr 4:2:2, but I always change this to RGB 4:4:4, standard PC RGB 

over HDMI of course. I'm not entirely sure of the advantages of each, or which would be best
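For what it's worth, the numbers in those mode names describe chroma subsampling: 4:2:2 stores colour at half the horizontal resolution of brightness, which video content hides well but coloured PC text shows up immediately. A rough sample-count comparison (this is the general Y'CbCr scheme, not anything specific to that Bravia):

```python
# Chroma subsampling: 4:4:4 keeps colour at full resolution, while 4:2:2
# halves the horizontal colour resolution (fine for movies, visibly
# fringey on coloured text). Counts are raw samples per 1080p frame.

width, height = 1920, 1080

def samples_per_frame(w: int, h: int, chroma_h_divisor: int) -> int:
    luma = w * h                                 # full-res brightness (Y)
    chroma = 2 * (w // chroma_h_divisor) * h     # Cb + Cr planes
    return luma + chroma

full = samples_per_frame(width, height, 1)   # 4:4:4 - no subsampling
sub = samples_per_frame(width, height, 2)    # 4:2:2 - half horizontal chroma

print(f"4:4:4 samples per frame: {full}")
print(f"4:2:2 samples per frame: {sub} ({100 * sub // full}% of 4:4:4)")
```

That missing third of the colour information is exactly the detail that sharp, coloured desktop text needs, which is why forcing RGB/4:4:4 for PC use is the right call.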


----------



## monte84 (Feb 12, 2012)

I have dealt a lot with TVs as PC monitors (using a 32in Samsung right now over HDMI and it looks really nice - very clear text). If you're wanting 1920x1080, then make sure it can do 1920x1080 over VGA (an easy way to tell that it is a true 1080p TV - most will only do 1366x768 but claim to be a 1080p TV). When using HDMI you need to have 1:1 pixel mapping enabled (this setting is labelled differently by each manufacturer when supported; for me it was the "Screen Fit" mode). If not, the image will be scaled and blurry. And adjust the scaling option in your video driver control panel.


----------



## TheoneandonlyMrK (Feb 12, 2012)

monte84 said:


> When using HDMI you need to have 1:1 pixel mapping enabled (this setting is labelled differently by each manufacturer when supported; for me it was the "Screen Fit" mode). If not, the image will be scaled and blurry. And adjust the scaling option in your video driver control panel.



Erm, most of what you said is in the OP, but your 1:1 mapping bit isn't - so why are you saying this? What leads you to say it (facts)? I've obviously tried all four pixel modes and the only noticeable difference is a slight colour hue change, and I'm after reasons/facts, not just an opinion. No offense - I would like to know


----------



## monte84 (Feb 12, 2012)

Simple: 1:1 pixel mapping was not listed in this thread, and it's good information to have. It does not change the colour hue; it changes the stretching of the picture on the screen. It's usually grouped with settings like Wide, Zoom, etc., and is directly related to overscanning. More info over at the AVS forums, along with a good list of TVs that support this feature: http://www.avsforum.com/avs-vb/showthread.php?t=748074


----------



## Mussels (Feb 13, 2012)

theoneandonlymrk said:


> I use a 32" Sony Bravia and read your guide with interest. Most of it I knew already, but it's good to have someone else agree with my thoughts. As far as comments and improvements go:
> 
> my HDTV has 4 available pixel formats
> 
> ...




IIRC RGB 4:4:4 is best for PC use, since it keeps the colour at full resolution (the 4:2:2 modes halve the horizontal colour detail). I don't know the full details on that one.


And really, I missed 1:1? I thought that was covered


----------



## Fitseries3 (Feb 13, 2012)

Not sure if this fits in this thread, but I think it does... 

Simple question...

If my motherboard has "good/great" onboard audio - like how Gigabyte and ASUS boards have X-Fi and other good onboard audio chips, not regular Realtek/SoundMAX chips - but I use HDMI from my video card (an ASUS 6950 DCUII), do I get the X-Fi-quality sound to my TV/receiver? Or does the video card have its own audio chip and its own sound quality?

I'm using a Samsung 8000 series 55" LED TV and a Harman Kardon receiver


----------



## FordGT90Concept (Feb 13, 2012)

If you're using HDMI, you're sending a digital feed, so it really doesn't matter. What matters is the DAC (Digital-to-Analog Converter) at the receiving end. The Harman Kardon receiver likely has good DACs, so I'd send the audio signal there.

Alternatively, you could put the video signal anywhere and plug the analog audio cables from the X-Fi into the receiver.  It is more complicated to do that and creates a wire mess but it gives you more control over the audio via the X-Fi chip.

If you never use the features of the X-Fi chip (like CMSS, environmental effects, etc.) I'd just keep it simple and use HDMI.


----------



## Mussels (Feb 13, 2012)

What Ford said. The video card has its own digital-only sound device, so the onboard audio is completely ignored/bypassed.


----------



## erixx (Aug 26, 2012)

So if I order a
PHILIPS 32PFL5007H LED 32", FULL HD, 400HZ PMR - am I losing money?

(Presently I have a 32" 100Hz LCD Philips and I love it, but it runs HOT and is not 100% good for text reading)


----------

