# 100Hz LCD TV Screens (Worth It?)



## twicksisted (Oct 25, 2009)

Okay, as the title implies... *100Hz LCD TV Screens* (worth it?) I want to hear your opinions on this topic.
Standard LCD TVs are 60Hz (capable of 60fps max, to my understanding)... 100Hz screens are now all the rage, with advertising claiming you'll see faster frames for sports etc., even though the signal received is only running at 24, 29.97 or 30fps.

Personally I don't see how this makes any sense... Blu-ray movies are generally 24fps, with different movie formats coming out at 29.97 or 30fps. There are also other formats direct from digital cameras (like the one I own), such as 50 or 60fps, but these are not commercial formats that are broadcast or even put on DVD or Blu-ray.

So my question is: why spend the extra on a 100Hz TV screen? I say marketing gimmick, but I'd like to be proven wrong, and I hope there is a point to it.

(Edit: *PLEASE KEEP THIS TOPIC ON TV SCREENS*... not on using the screen for gaming etc., as that's obviously going to make a difference with the higher refresh rate and higher FPS.)


----------



## Galaxy (Oct 25, 2009)

I really don't know why, but the picture seems a lot smoother on a 100Hz TV. My friend owns a Philips 100Hz 42" LCD and it seems that everything is "faster" - just a little...

I own an LG Scarlet 32LG6000. Maybe I'm just used to it - but you can see a difference between an "ordinary" LCD and a 100Hz LCD.


----------



## wolf (Oct 25, 2009)

My mate just bought a 40"... I can't remember the brand, but it advertised 100Hz for sure.

I was completely unable to manually set 100Hz anywhere on my PC; however, there is a 100Hz button on the remote that made games MORE blurry, imagine that.

I had high hopes, but really I think it's a gimmick - they try to emulate what 100Hz is like or something :shadedshu

EDIT: sorry, this was more gaming related. TV looks sweet on it, 100Hz or 60Hz.


----------



## DaedalusHelios (Oct 25, 2009)

twicksisted said:


> Okay, as the title implies... *100Hz LCD TV Screens* (worth it?) I want to hear your opinions on this topic.
> Standard LCD TVs are 60Hz (capable of 60fps max, to my understanding)... 100Hz screens are now all the rage, with advertising claiming you'll see faster frames for sports etc., even though the signal received is only running at 24, 29.97 or 30fps.
> 
> Personally I don't see how this makes any sense... Blu-ray movies are generally 24fps, with different movie formats coming out at 29.97 or 30fps. There are also other formats direct from digital cameras (like the one I own), such as 50 or 60fps, but these are not commercial formats that are broadcast or even put on DVD or Blu-ray.
> ...




Are you using it for HD TV?

Latency is more important than refresh rate for TV viewing. Some Vizio-branded TVs are around 14ms or worse. I suggest reading independent, in-depth reviews of a model before buying one. The range of quality is vast.


----------



## twicksisted (Oct 25, 2009)

No, I'm not using it - I have a very nice Samsung 40" 1080p 60Hz... I have been advising people that, to my knowledge, for TV usage (DVD, Blu-ray & digital satellite transmission) it's not going to make any difference compared to normal 60Hz televisions, and to save their cash instead or buy a better-quality 60Hz panel...

Now this is something that to me makes perfect sense, but reading all the marketing bullshit online and in ads, I wanted to hear from others who could give me technical reasons why it's actually going to be better, or why I'm wrong in thinking this.


----------



## Binge (Oct 25, 2009)

http://forums.techpowerup.com/showthread.php?t=106528

Follow the link above for an interesting look at screens with above 60Hz refresh rate.

A personal opinion/suggestion... Evo, the Street Fighter tournament, uses ASUS VH223H LCDs as its competition screens, as they have zero command latency, which is imperative for gaming. This is far more important to me than a 2ms GTG response or refresh rates above 60Hz.

You could measure command latency by running a CRT and an LCD in a dual-monitor setup and then watching/recording/photographing the two with a large-scale clock on each screen that reads into the milliseconds. You'd expect the CRT and the LCD to show the same time, but the CRT may get ahead of the LCD at times. When this happens, it means the display signal is being preprocessed too much and you lose some frames.


----------



## human_error (Oct 25, 2009)

I have 60Hz and 100Hz TV setups, and the 100Hz is better. The way it works is that the TV knows the previous frame and the next frame, so it can interpolate where the various visual components should be in the intermediate frames (the delay is minimal - especially if you're not playing games, in which case the sound is also delayed and you'd never know). This means the images look smoother, as the "jumps" in position of moving objects are much smaller when they move the same distance over 100 frames as opposed to 60.

Now, 100Hz isn't a feature I'd say you _need_ - you won't miss it if you get a 60Hz screen. However, once you've got it, or have had it, you grow accustomed to it and will notice when it's gone. It is something that is nice to have in a TV, as it does make everything a lot "smoother" in motion.


----------



## twicksisted (Oct 25, 2009)

human_error said:


> I have 60Hz and 100Hz TV setups, and the 100Hz is better. The way it works is that the TV knows the previous frame and the next frame, so it can interpolate where the various visual components should be in the intermediate frames (the delay is minimal - especially if you're not playing games, in which case the sound is also delayed and you'd never know). This means the images look smoother.
> 
> Now, 100Hz isn't a feature I'd say you _need_ - you won't miss it if you get a 60Hz screen. However, once you've got it, or have had it, you grow accustomed to it and will notice when it's gone. It is something that is nice to have in a TV, as it does make everything a lot "smoother" in motion.



Okay, but if the source material is only Blu-ray (24fps) or TV (30fps)... then why would having 100fps as opposed to 60fps be visibly different? Surely if the source is 24/30fps you're not getting any extra information, as there aren't any extra frames to display?


----------



## human_error (Oct 25, 2009)

twicksisted said:


> Okay, but if the source material is only Blu-ray (24fps) or TV (30fps)... then why would having 100fps as opposed to 60fps be visibly different? Surely if the source is 24/30fps you're not getting any extra information, as there aren't any extra frames to display?



You do - you get the current frame and the next frame. By looking at the different visual parts of the image, the TV can identify which parts move between the frames and add in where they would be if there were a frame in between.

For example, take the letter f to be an object (e.g. a ball in a sport), and each time it is shown below is its position in a new frame. The TV can see where the ball was in frame 1, and it knows where it is in frame 2 from the source, so it can divide the distance travelled between those frames by four, add that distance to where it was in frame 1 to get frame 1 1/4, then add that distance to frame 1 1/4 to get frame 1 2/4, and then again to get frame 1 3/4, before showing source frame 2.

This is how it would look - each printed letter is its position on the screen, and only one f can be seen at a time. As you increase the number of frames shown with calculated positions of the object, you can see that it would look a lot smoother, as the distance travelled between each frame is reduced.

<---visual distance across a screen-->
30fps:
f---f---f---f---f---f---f---f---f---f--
60fps:
f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-f-
100fps:
ffffffffffffffffffffffffffffffffffffffffffffffffff

To make the movement look smoother, the aim is to reduce the distance the object has visually moved between each frame, which is how 100Hz and even faster screens look smoother from the same source. As I said above, they can predict where an object is halfway between two source frames, and can divide that up further to get the desired positions for the added-in frames.

I hope the above makes sense - if not, I'll try to explain better.
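The spacing idea in those diagrams can be generated with a few lines of Python - purely illustrative (the screen width, frame rates, and function name are all made up, not from any real TV):

```python
# Toy rendering of the diagrams above: the higher the frame rate, the smaller
# the gap the object jumps between consecutive frames as it crosses the screen.

SCREEN_WIDTH = 42  # characters across our pretend screen

def motion_trail(fps, base_fps=30):
    """Mark the object's position at each frame, assuming it takes the same
    real time to cross the screen regardless of frame rate."""
    gap = max(round(4 * base_fps / fps), 1)  # 30fps -> every 4th char ('f---')
    return "".join("f" if i % gap == 0 else "-" for i in range(SCREEN_WIDTH))

for fps in (30, 60, 100):
    print(f"{fps}fps: {motion_trail(fps)}")
```

At 30fps the object jumps four characters per frame, at 60fps two, and at 100fps roughly one - the same cumulative motion, in smaller steps.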


----------



## twicksisted (Oct 25, 2009)

Yeah, that does make sense, but only if the material being played had all of those frames to begin with... if you know what I mean?
So let's say, as in your diagram, you record something at 30fps:
30fps:
f---f---f---f---f---f---f---f---f---f--

When you play it back on a television... you can't change the fact that the "f" is only on those frames, because those frames have specifically set the f in those locations, if you get what I mean?
If the original source was recorded (as in your diagram) at:
100fps:
ffffffffffffffffffffffffffffffffffffffffffffffffff

then I could see the point in playing it back at the same framerate, but it's not... it's recorded and broadcast at 29.97fps. Having more frames isn't going to make it look any better... *kind of like taking an MP3 and rendering it to 24-bit WAV - it's not going to sound any better, even though 24-bit WAV is a better format.* The source material is what it is.


----------



## Clouds4brains (Oct 25, 2009)

100/120Hz TVs are awesome - I love the way that everything you watch on one looks like a soap opera. My opinion is that these faster TVs make good actors look bad, because nothing looks real anymore.
My gf's dad has one and it looks so silly and unwatchable. It makes me wonder if it's only people with these TVs that big them up, because the alternative is to say "yeah, I spent loads of money on a crap TV and it looks like crap - here, come watch my film, it looks the same as a soap opera gone bad."

Games console looked good, TV was total rubbish!!!!


----------



## human_error (Oct 25, 2009)

twicksisted said:


> Yeah, that does make sense, but only if the material being played had all of those frames to begin with... if you know what I mean?
> 
> So let's say, as in your diagram, you record something at 30fps:
> 30fps:
> ...



It is very easy to predict where everything would be between two frames. The TV can look at frame 1, which was:
(each new line is a new frame for these examples)

f----

and then look at frame 2, which was

----f

and from that it can see that it can add (assuming object f is moving at a steady speed, for ease in this example):
-f---
--f--
---f-
between frames 1 and 2 (this can be done extremely quickly, with practically no delay, as a 60Hz TV would just be waiting to render the next frame anyway, while the 100Hz TV is doing the calculating and adding the extra frames).

So your end result, instead of:
f----
----f

is:
f----
-f---
--f--
---f-
----f

It's all simple maths: you know where an object was, and you know where it is x time later, so you have the distance moved as well, meaning to get one extra frame in you do:

new position = original position + (distance moved / 2)

You just divide it up further to add as many extra frames as needed. Obviously the system is slightly more complicated, to deal with objects moving at varying speeds and with scene-change detection so it doesn't mess any images up, but essentially this is how the concept works.
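That position maths can be sketched in Python - a toy example of the concept only, not anything from real TV hardware (the function name and the five-character screen are illustrative):

```python
# Toy motion interpolation: a single object moving at a steady speed across a
# 5-character "screen" (positions 0-4), rendered as ASCII frames like above.

def interpolate_frames(start_pos, end_pos, extra_frames, width=5):
    """Linearly interpolate object positions between two source frames and
    render each position as one ASCII frame."""
    step = (end_pos - start_pos) / (extra_frames + 1)
    positions = [round(start_pos + step * i) for i in range(extra_frames + 2)]
    return ["-" * p + "f" + "-" * (width - p - 1) for p in positions]

for frame in interpolate_frames(0, 4, extra_frames=3):
    print(frame)
# prints f----, -f---, --f--, ---f-, ----f (one per line)
```

The two outer frames come from the source; the three in between are the "invented" ones.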


----------



## twicksisted (Oct 25, 2009)

Nice explanation, thanks... but with that explanation it's almost like saying it doesn't matter what the source input is. What if it's only 15fps... will the 100Hz screen still make objects appear to move more smoothly? Surely it couldn't, as you're still only getting 15fps (like if you were playing a game on a screen and only getting 15fps... it doesn't matter if your screen is set at 100Hz, it won't appear to move any smoother).

If it does, then my next question is:
can you really notice the difference with your eyes? For instance, if in your explanation the source material is 30fps, then:

a 60Hz screen will draw at 2x the frame rate (60fps)
a 100Hz screen will draw at just over 3x the frame rate (100fps)
(a 30Hz screen would draw at 1x, for example (30fps))

This seems very marginal really... if this technology is better, then 200Hz would be a much bigger difference (almost 7x), no matter what the source input material, again?
I'm not trying to prove anyone wrong here, I'm just trying to understand this better for myself.


----------



## a_ump (Oct 25, 2009)

I think twicksisted has a valid point, and I understand what he's saying, as that is what makes sense to me as well. I've read this whole page (and some parts again) and it's interesting.

But I too don't see how a TV can create frames 1.25, 1.5, 1.75, etc. based off frames 1 and 2. As twick said, by making these other frames it is generating more frames, so technically, if it does do as you (human_error) say, then it would give whatever you're viewing a greater FPS than what the source outputs. I don't believe it would make them 100fps, because the TV has to create these new frames itself, which in turn, with higher resolutions like 1080p, would require more processing power and more time than, say, a 480p video.

I personally had no idea a TV had the processing power to create frames. In fact, if a TV can do this, I'd think there'd be software out for PC by now that could do this - know of any? And as twick stated, I'm not saying you're wrong - I'm very intrigued by what I've read and simply want to better understand.


----------



## DRDNA (Oct 25, 2009)

Higher Hz = less ghosting... less blur... crisper fast movement... this is in theory!


----------



## human_error (Oct 25, 2009)

twicksisted said:


> Nice explanation, thanks... but with that explanation it's almost like saying it doesn't matter what the source input is. What if it's only 15fps... will the 100Hz screen still make objects appear to move smoothly? Surely it couldn't, as you're still only getting 15fps (like if you were playing a game on a screen and only getting 15fps... it doesn't matter if your screen is set at 100Hz, it won't appear to move any smoother).
> 
> If it does, then my next question is:
> can you really notice the difference with your eyes? For instance, if in your explanation the source material is 30fps, then:
> ...



Well, the TV does have limits - to avoid ruining scenes there is a limit to the distance the TV will interpolate across, so if the fps is too low, the TV may as well be running at the source speed, as it can't add in the extra detail - the "jumping" of the objects may be intentional. I only notice it when something is moving smoothly - an actor turning their head, or a tennis ball, is a good example. Also, due to the higher framerate, the "ghosting" that can happen on screens is reduced: it may take 3 or 4 refreshes to completely remove the previous position of an object, so at 100Hz this happens a lot faster than at 60Hz, helping reduce blurriness as well.

You won't notice the difference if it only showed one frame - you can only see the difference on things which are moving a relatively small distance across the screen between frames. If it was a 15fps source, then the TV couldn't calculate the transition between:
f---------------------
and
---------------------f
properly, as there is too much unknown detail - the scene will have changed too much for the TV to accurately calculate what should go where, so it does have its limits.

As I said in my original post, my opinion on 100Hz is that it is nice, but it is not a "killer feature", so other factors such as screen clarity, contrast ratio (to a limit), colour reproduction and sticking within your budget must all come first. If you can meet all of those and get the choice between 100Hz and 60Hz within your budget, then go for the 100Hz, as you will get some benefit from it - but you won't miss it if you can't get it, as you can't notice it not being there (if that makes sense).

Really, if you want to see whether you think it is worth it, go to a brick-and-mortar store and look at two screens next to each other (one 100Hz, one 60Hz) with the same source and see if you notice any difference - these things always come down to personal preference at the end of the day, as some people will say it's very important while others won't notice the difference.

**edit**

As for the complexity: the TV is looking at two bitmaps, quickly scanning for similar patterns very close to each other in the bitmaps, and creating intermediary bitmaps to display. With dedicated silicon designed around the problem it isn't too complex, but as I said, it does have limits, and a lot of the benefit in sports especially is the reduction in ghosting, as the more refreshes per second, the faster the previous image's ghosting is removed, reducing blurriness. This is of course how I understand the technology to work - I could be mistaken.
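That "limit" can be sketched as a threshold check - a toy illustration only (the threshold value and names are assumptions, not anything from real firmware):

```python
# Toy version of the safety limit: interpolate only when the object moved a
# small distance between source frames; otherwise repeat the old frame rather
# than guess (e.g. across a scene cut or a deliberate jump).

MAX_MOTION = 2  # max per-frame displacement (in positions) we trust

def intermediate_position(pos_a, pos_b, fraction):
    """Linear interpolation between two source positions, falling back to a
    repeat of the old position when the motion is too large to trust."""
    if abs(pos_b - pos_a) > MAX_MOTION:
        return pos_a  # too much changed: repeat instead of inventing a frame
    return pos_a + (pos_b - pos_a) * fraction

print(intermediate_position(3, 4, 0.5))   # small motion: interpolated -> 3.5
print(intermediate_position(0, 21, 0.5))  # big jump: repeated -> 0
```

The fallback branch is what stops the 15fps case (objects jumping large distances) from being "smoothed".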


----------



## twicksisted (Oct 25, 2009)

Thanks for your input here... I do appreciate it, as I'm trying to understand this myself, and I see where you're coming from... but it's not really explaining the technology itself - more explaining the concept and what you think it's doing...

Saying that at 15fps (or another non-smooth video format) it's not going to work its magic, as in the above post, but that at 30fps it's going to make things smoother by adding in extra frames where it thinks they should go, doesn't really explain anything (if you know what I mean)... surely if the technology can work at 30fps and make it appear like 100fps, it could make 15fps 3x better? There are 200Hz tellies out now... does that mean those should technically do even better?

Also, if this technology works as you've explained... what about playing video that itself has frames purposely offset as a video effect? It would surely cancel that out and make it smooth, messing up what the video intended.

This is what I'm getting at... I'm hearing a lot of hype and ad-speak, but no actual proof as to how it could be better technically. Obviously a screen running at 100Hz is going to be better than the same screen running at 60Hz - that's not disputable - but will you actually be able to notice it with the current formats out today?

Surely 30fps is 30fps... it's not going to get any faster just because your screen is capable of a higher framerate. If it can, then the future isn't in graphics cards - just buy a faster screen and you'll get higher fps instead... (obviously that makes no sense).


----------



## a_ump (Oct 25, 2009)

I understand better now, though it still amazes me that, even if only under certain circumstances, a TV can do this. But now I wonder why they don't have computer monitors doing this... you'd think it'd be a hit with gamers. Well, never mind - I forgot that in a game the next frame isn't set in stone.

EDIT:
I love your first sentence, twick: "it's not explaining the technology itself, only the concept", which is very true. Surely there'd be some technical differences that could be dug up on one of these 100Hz TVs compared to a 60Hz TV. Though if you think about it, why would a 100Hz TV have this tech, but not a 60Hz one, which is still at least 2x the FPS of anything you'd view? Shouldn't 60Hz TVs be able to up the FPS of a sports game from 30 to at least 45? And if 60Hz TVs don't have the tech implemented, then these 100Hz TVs must have specific hardware components that can do this new-frame magic, as I'll call it. Which should mean we can look up what that hardware chip is and get more details.


----------



## Binge (Oct 25, 2009)

The feature most TVs with 100Hz refresh rates have is a processing element that interpolates the missing frames. Most 60Hz LCDs will change colours slowly compared to TVs with higher refresh rates, and this causes a blur effect. Today, even without the extra interpolation of frames, a 100Hz set will show a moving object with less blurring/streaking than a set that runs at 60Hz. This is not to say the sets of tomorrow couldn't change colours faster and eliminate blur on a 60Hz set, but with how things are, I doubt we'll see that without a reason. I believe the manufacturers want to sell a product, so they would give you 600Hz along with the fast-changing pixels.


----------



## twicksisted (Oct 25, 2009)

Hmmm, I see no blur on my 1080p 40" Samsung... and the colour vibrancy and everything is perfect (best out of the whole shop when I bought it)... the pixel-changing speed is down to the pixel latency (GTG), not how many Hz the panel is running at...

I believe my screen is 5ms GTG... now, if what you're saying is correct, then a 100Hz one would be faster... and a 200Hz+ one would be quicker still...?

It's surely the same technology as an LCD panel for computer games, and those go as low as 2ms GTG at 60Hz... so that would imply that at 200Hz it would be, what, 0.5ms GTG? I don't think so - those 100Hz tellies are also 5ms GTG etc...


----------



## human_error (Oct 25, 2009)

twicksisted said:


> Thanks for your input here... I do appreciate it, as I'm trying to understand this myself, and I see where you're coming from... but it's not really explaining the technology itself - more explaining the concept and what you think it's doing...
> 
> Saying that at 15fps (or another non-smooth video format) it's not going to work its magic, as in the above post, but that at 30fps it's going to make things smoother by adding in extra frames where it thinks they should go, doesn't really explain anything (if you know what I mean)... surely if the technology can work at 30fps and make it appear like 100fps, it could make 15fps 3x better? There are 200Hz tellies out now... does that mean those should technically do even better?
> 
> ...



With the 15fps example, where I said it wouldn't do this, that was because I assumed that at 15fps the objects would have moved quite a distance between frames. If the objects were moving very short distances between frames (as in slow motion), then the TV would add in the extra frames - you just wouldn't notice, as the source is moving so slowly that the extra frames would only be adding very tiny distances (<1 pixel per frame, perhaps). The limitation on whether the TV adds frames or not is the distance the objects move between source frames - the TV only adds the extra frames if the objects moved very small distances, otherwise, as you said, it would ruin scenes where things jump around for dramatic effect (it's a limitation placed on the algorithm to prevent adding frames which should not be added).

The distance the TV is prepared to interpolate across between refreshes is so small that you couldn't point at a 60Hz TV and show where the added frames would go on a 100Hz TV (without slowly going through each frame and looking at which objects had moved within the maximum distance). This is why people say it looks smoother but can't easily explain why - the effect is cumulative over a lot of frames (24-30 frames per second from the original source), so the human eye can see it is smoother, but the viewer couldn't say why, as the distance between individual source frames is too small to identify as the image plays back at normal speed.

**edit**



> I love your first sentence, twick: "it's not explaining the technology itself, only the concept", which is very true. Surely there'd be some technical differences that could be dug up on one of these 100Hz TVs compared to a 60Hz TV. Though if you think about it, why would a 100Hz TV have this tech, but not a 60Hz one, which is still at least 2x the FPS of anything you'd view? Shouldn't 60Hz TVs be able to up the FPS of a sports game from 30 to at least 45? And if 60Hz TVs don't have the tech implemented, then these 100Hz TVs must have specific hardware components that can do this new-frame magic, as I'll call it. Which should mean we can look up what that hardware chip is and get more details.



Absolutely - the TVs have extra dedicated chips for this purpose, which is why 100Hz TVs cost more than 60Hz ones (beyond just marketing) - more hardware is needed to make the interpolation possible, as well as faster-switching panels. This technology sits inside things like Sony's "Bravia Engine" and the other "brands" each company puts on its TVs (a 60Hz Bravia and a 100Hz Bravia set have different chips, for example).


----------



## oily_17 (Oct 25, 2009)

a_ump said:


> but not a 60Hz one, which is still at least 2x the FPS of anything you'd view? Shouldn't 60Hz TVs be able to up the FPS of a sports game from 30 to at least 45? ...




I remember reading this -



> NTSC is 29.97 Frames per second (Fps) which is 59.94 interlaced fields per second (fps). Standard US TV's are therefore usually said to be 60Hz field rate and as European TV is 25 Frames per second which is 50 fields per second it is usually referred to as 50Hz.



So maybe confusing frames and fields per second is where some people go wrong.
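The quoted relationship is just arithmetic - interlaced video carries two fields (odd and even scan lines) per frame:

```python
# Frames vs fields: interlaced video sends two fields per frame, which is
# where the "60Hz" and "50Hz" labels come from.

ntsc_frames = 29.97  # NTSC frames per second
pal_frames = 25      # PAL frames per second

print(ntsc_frames * 2)  # 59.94 interlaced fields per second -> "60Hz" sets
print(pal_frames * 2)   # 50 fields per second -> "50Hz" sets
```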


----------



## niko084 (Oct 25, 2009)

I might make a side note here that not all Blu-ray is 24fps!
100Hz is an odd number - it's 120Hz LCDs now...
120Hz is a pretty sweet number to be at; it's divisible by 24/30/60 for various movies.

Now remember, plasmas are between 400-600+ Hz.
They are well known to outperform LCDs in many aspects, *except power draw and cost*.

The higher the Hz the better, but you kind of want it to be a multiple of your common viewing fps too.
Some sets will also step down for 24fps media.

As always, if you're looking for a lot of in-depth info, you may be better off looking at avsforum.com.
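The divisibility point is easy to check (whole-number division only, purely illustrative):

```python
# Which refresh rates divide evenly by the common source frame rates?
# 120Hz handles 24, 30 and 60fps cleanly; 100Hz handles none of them.

sources = [24, 30, 60]
for refresh in (60, 100, 120):
    even = [fps for fps in sources if refresh % fps == 0]
    print(refresh, "Hz divides evenly by:", even)
```

This is why 120Hz is the "sweet number": each source frame can simply be repeated or interpolated a whole number of times, with no pulldown judder.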


----------



## twicksisted (Oct 25, 2009)

I'm sorry that I'm having a hard time getting this... it's just that it sounds good on paper, but the actual technology that is supposedly doing these operations doesn't seem viable or possible.

In order to fill in the missing frames, it needs to know what the next frame is going to be, and it'd need to buffer it... but with a digital broadcast or a DVD, the processing happens inside the decoder or player, not the TV... then that signal is played on the screen directly... no time to mess with the signal, or it would be out of sync with the sound coming out of the decoder/DVD player that's attached to your 5.1 surround system...

If it's going to assume where something is going to be... it'll have to have frames buffered up to know where the next frame is in order to fill in the rest... you can't buffer with real-time media.


----------



## niko084 (Oct 25, 2009)

twicksisted said:


> I'm sorry that I'm having a hard time getting this... it's just that it sounds good on paper, but the actual technology that is supposedly doing these operations doesn't seem viable or possible.
> 
> In order to know where something is going to be, you'd need to buffer it... but with a digital broadcast or a DVD, the processing happens inside the decoder or player, not the TV... then that signal is played on the screen... no time to mess with the signal, or it would be out of sync with the sound coming out of the decoder/DVD player that's attached to your 5.1 surround system...
> 
> If it's going to assume where something is going to be... it'll have to have it buffered up to know where the last frame is in order to fill in the rest... you can't buffer in real time



What you don't realize is that it IS out of sync. It doesn't need to be far out of sync - you are sending digital signals over two different cables, processed by two different decoders.

This is truly why analog is FTW to any true audio/videophile - even if it's not noticeable, it is still simple fact.


----------



## human_error (Oct 25, 2009)

twicksisted said:


> I'm sorry that I'm having a hard time getting this... it's just that it sounds good on paper, but the actual technology that is supposedly doing these operations doesn't seem viable or possible.
> 
> In order to know where something is going to be, you'd need to buffer it... but with a digital broadcast or a DVD, the processing happens inside the decoder or player, not the TV... then that signal is played on the screen... no time to mess with the signal, or it would be out of sync with the sound coming out of the decoder/DVD player that's attached to your 5.1 surround system...
> 
> If it's going to assume where something is going to be... it'll have to have it buffered up to know where the last frame is in order to fill in the rest... you can't buffer in real time



But it does have the time - while a 60Hz TV would be idling between frames, a 100Hz TV is able to calculate the extra frames and send them for rendering. There is some buffering involved - but you wouldn't notice a 1ms delay on a frame, and this is also why most decent source decoders (e.g. Blu-ray players, digital TV decoders, surround-sound kits) have options to delay audio output by 5, 10, 50 etc. milliseconds so the video and audio are perfectly in sync (if the audio comes from the TV, then the TV delays the audio automatically).

This interpolation is all extremely fast and is done with a chip designed to interpolate and only interpolate - dedicated silicon with carefully designed algorithms and instruction sets makes it very efficient to process the data, as it only compares bitmaps for similar shapes which don't move far between source frames. It is possible, as it is done today.

The reason PC monitors don't do it is to keep costs down - 100Hz TVs easily cost £600+ for bottom-end models - and PC monitor makers don't want extra chips consuming power, creating heat and eating into their margins when the PC should be creating a source that matches the screen's capability. TV is limited to 25-30fps max; the source can't be improved, as the transmission technology would also need an overhaul for little gain, so it needs to be done on the TV, where the source can't be changed. On the PC, the source can be changed to match the monitor, so it is better to do that than have the monitor approximate the extra frames.
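The timing claim can be sanity-checked with simple arithmetic (the 25fps/100Hz numbers are just an illustrative PAL-style example, not measured from any set):

```python
# How much time sits between source frames for the TV to do interpolation?

source_fps = 25          # typical PAL broadcast rate
refresh_hz = 100         # panel refresh rate

source_interval_ms = 1000 / source_fps      # 40 ms between source frames
refresh_interval_ms = 1000 / refresh_hz     # 10 ms between displayed frames
extra_frames = refresh_hz // source_fps - 1 # interpolated frames per source frame

print(f"{source_interval_ms:.0f} ms between source frames")
print(f"{refresh_interval_ms:.0f} ms per displayed frame")
print(f"{extra_frames} interpolated frames inserted per source frame")
```

So the interpolator has tens of milliseconds per source frame to work in - an eternity for dedicated silicon - at the cost of buffering at least one frame ahead.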


----------



## niko084 (Oct 25, 2009)

human_error said:


> This interpolation is all extremely fast and is done with a chip designed to interpolate and only interpolate - dedicated silicon with carefully designed algorithms and instruction sets makes it very efficient to process the data, as it only compares bitmaps for similar shapes which don't move far between source frames. It is possible, as it is done today.



Take a look at the processing and the different abilities of high-end sets; you will quickly see that they have some pretty capable processors to do all sorts of things to your movie.

Watch a movie on a few different sets and you can very easily start to discern the differences.

For instance, on a fast pan, some screens will judder and some won't. Some screens come with smoothing, where they fill in empty frames, or re-process and edit frames to clean them up. Some people don't even like it - they say they like their film looking like film, which is the whole reason the 24fps Blu-ray stuff started anyway: everything got too real, too clean, and people didn't like it.


----------



## twicksisted (Oct 25, 2009)

human_error said:


> But it does have the time - while a 60Hz TV would be idling between frames, a 100Hz TV is able to calculate the extra frames and send them for rendering. There is some buffering involved - but you wouldn't notice a 1ms delay on a frame, and this is also why most decent setups have options to delay audio output by 5, 10, 50 etc. milliseconds so the video and audio are perfectly in sync.
> 
> This interpolation is all extremely fast and is done with a chip designed to interpolate and only interpolate - dedicated silicon with carefully designed algorithms and instruction sets makes it very efficient to process the data, as it only compares bitmaps for similar shapes which don't move far between source frames. It is possible, as it is done today.



if you can't notice a 1ms delay with your eyes... then how can you see the difference between 60Hz and 100Hz?

How do illusionists and other magicians work? Sleight of hand... it's a known fact that our eyes don't process that fast - that's how most magic tricks work. If the TV is essentially processing faster than our eyes perceive, how could that look any better when we can't see it? (it is happening... but you cannot see it)

A processor calculates sums faster than someone's own brain... but that's pointless if you don't know / can't see what it's doing... that's all I'm getting at... the output can't be noticeable to the human eye as it's operating faster than what you can process... faster still than the original media it was broadcast in
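The interpolation human_error describes can be sketched very crudely in Python. Real TV chipsets do motion estimation in dedicated silicon; a plain per-pixel blend (an assumption for illustration, not how any actual chipset works) just shows what it means to synthesise an in-between frame:

```python
def interpolate(frame_a, frame_b, t):
    """Blend two frames: t=0 gives frame_a, t=1 gives frame_b.

    A per-pixel weighted average - a stand-in for the far smarter
    motion-compensated interpolation a real 100Hz TV performs.
    """
    return [[round((1 - t) * a + t * b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

# Two tiny 2x2 "frames": a bright pixel moving one pixel to the right.
f0 = [[200, 0], [0, 0]]
f1 = [[0, 200], [0, 0]]

# A 50Hz source shown at 100Hz needs one synthesised frame per source frame.
mid = interpolate(f0, f1, 0.5)
print(mid)  # [[100, 100], [0, 0]]
```

A blend like this smears the moving pixel across both positions, which is why real sets track motion vectors instead of averaging - but the principle of manufacturing frames between the real ones is the same.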


----------



## Binge (Oct 25, 2009)

twicksisted said:


> hmmm i see no blur on my 1080p 40" samsung.... and the colour vibrancy and everything is perfect (best out of the whole shop when i bought it)..... the pixel changing speed is down to the pixel latency GTG not how many HZ the panel is running...
> 
> I believe my screen is 5ms GTG.... now if what youre saying is correct then a 100Hz would be faster... and a 200+ would be even quicker...?
> 
> Its surely the same technology as an LCD panel for computer games and those go as low as 2ms GTG in 60hz.... so that would imply that at 200hz is would be what 0.5ms GTG? I dont think so? as those 100hz tellys are also 5ms GTG etc...



You know GTG stands for grey to grey, right? Colour-to-colour transitions are much slower than changes between tones of grey.  Lemme just quote the source I posted earlier...



> Pixel Response (below): Frame transitions were captured with a 1/1000 camera shutter showing new frames overlapping old ones on the LCD - old pixels are slow to fade away.
> LCDs have improved their response times greatly, so the overlap below didn't last long enough to be seen (by me). Pixel response doesn't seem to be an issue with the Samsung.
> The pic below can even be compared with those of the fastest LCDs at Behardware here or here
> Samsung probably uses the same "double overdrive" mentioned in BeHardware's review of the 100hz LE4073BD, - the tip-off being a white outline behind the 120hz car.
> ...



The next time you assume there's no blur on your 1080p set, take a camera and take a picture of motion in a video game, and then do the same with a CRT.




human_error said:


> But it does have the time - while a 60hz tv would be idling between frames a 100hz tv is able to calculate the extra frames and send them for rendering. There is a buffering involved - but you wouldn't notice a 1ms delay on a frame, and this is also why most decent source decoders (eg blu-ray players, digital tv decoders, surround sound kits) have options to delay audio output by 5,10,50 etc miliseconds so the video and audio is perfectly in sync (if the audio comes from the TV then the TV delayes the audio automatically).



I've heard a lot of people preach about how humans can't visually register a millisecond... then why can I watch a ms counter and, using a beat machine, match the rate at which the numbers are moving?  Why can world-class Street Fighter players combo one-frame links and use frame traps to confuse opponents?  People may not be able to react immediately to an interval of 1ms, but they are able to sense it even if they aren't consciously aware of it.


----------



## twicksisted (Oct 25, 2009)

niko084 said:


> Take a look at the processing and different abilities of high end sets, you will quickly see that they themselves have pretty cool processors to do all sorts of stuff to change your movie.
> 
> Watch a movie on a few different sets and you can very easily start to discern the differences.
> 
> For instance on a fast pan, some screens will judder, some wont. Some screens are coming with smoothing where they fill empty frames or re-process them and edit them so clean them up. Some people don't even like it, they say they like their film looking like a film, which is the whole reason they started the whole 24fps bluray stuff anyways, everything got too real, too clean and people didn't like it.



Yes, these are realtime DSPs for rendering effects... similar to DSPs for audio effects (reverbs, delays etc.)... but they are not going to make an actual difference to the source - a perceptual difference, yes... but if the media is flawed, it's flawed... in this case realtime TV is flawed as it's only 30fps... while the panel can work at 100fps

Surely then, by what you're saying, it's not how many Hz a TV has got, it's down to its DSP chips


----------



## human_error (Oct 25, 2009)

twicksisted said:


> if you cant notice a 1ms delay with your eyes... then how can you see the difference between 60hz & 100hz?
> 
> How do illusionists work and other magicians... sleight of hand... its a known fact that eyes dont process that fast thats how most magic tricks work... essentially processing faster than our eyes percieve then and how could that look any better when we cant see it? (it is happening... but you cannot see it then)
> 
> A processor works faster calculating sums than someones own brain... but thats pointless if you dont know / cant see what its doing... thats all im getting at.... the output cant be noticeable to the human eye as its operating faster than what you process... faster still than the original media it was broadcast in



It is a cumulative effect, not an individual frame-by-frame effect: you don't notice extra frame #1052, but you do notice that there are 3x more frames. This is how you can see the difference between 30 and 60fps and beyond - it isn't that you see _every_ frame, but when your brain decodes the images it receives, it catches the screen mid-refresh less often, so it doesn't notice the re-renders as much (it's all about timing).


----------



## twicksisted (Oct 25, 2009)

Binge said:


> The next time you assume there's no blur on your 1080p set, take a camera and take a picture of motion in a video game, and then do the same with a CRT.



This is what I'm getting at... if I need a fast-shutter camera to see the difference, then surely it's pointless... I can't see it with my own eyes... and surely the whole point of the technology is that you can see the difference?


----------



## niko084 (Oct 25, 2009)

twicksisted said:


> if you cant notice a 1ms delay with your eyes... then how can you see the difference between 60hz & 100hz?
> 
> How do illusionists work and other magicians... sleight of hand... its a known fact that eyes dont process that fast thats how most magic tricks work... essentially processing faster than our eyes percieve then and how could that look any better when we cant see it? (it is happening... but you cannot see it then)



What you are not thinking about is the larger picture..

Your eye doesn't catch a small object making a very small move from a decent distance.
You may not see a bullet fly by your face at 500mph, but you sure as heck will see a semi do it!

There is way too much to really even start going into it all.

Simply put, it's common knowledge among people in the know that higher Hz = better. Of course there are other things that come into play, but a 100Hz screen will look smoother than a 60Hz set.

Do us and yourself a favour and walk into an electronics store that sells nice TVs and ask someone there to show you the difference first-hand. Any place that sells higher-end stuff should be ready and willing to do so, and on top of that you'll get a good idea of which company's screens you prefer - they do all look a bit different.

I notice a good difference myself; if you honestly don't, great - you get to save a bunch of money!


----------



## Binge (Oct 25, 2009)

twicksisted said:


> This is what im getting at... if i need a slow motion fast shutter camera to see the difference, then surely its pointless... i cant see it with my own eyes... surely thats the whole point of the technology is that you can see the difference?



Whether you want to notice it or not is up to you.  It depends entirely on your want/need to have that level of clarity in front of you.  When I play competitive FPS I use a CRT, because even if it is just placebo I know I am seeing the picture with absolute clarity at the quickest rate even a camera could capture cleanly.

I have to ask... why would you be using a fast-shutter camera instead of a digital one to take a single picture?  I'm only asking because digital cameras don't need to expose film, therefore no shutter speed.


----------



## twicksisted (Oct 25, 2009)

niko084 said:


> What you are not thinking about is a larger picture..
> 
> Your eye doesn't catch a small object making a very small move from a decent distance.
> You may not see a bullet fly by your face at 500mph, you sure as heck will see a semi do it!
> ...



Yes, I'm not debating that the technology is better... obviously it is... what I'm getting at is: if you are running a TV with a 100Hz panel as opposed to a 60Hz panel, and you are feeding it normal sources (satellite, DVD, Blu-ray), will you notice a difference? I'm not debating that running it from a PC graphics card is going to be faster etc...

I am very keen on my technology and spent a lot of time checking out a lot of different panels before I bought my 1080p 40" Samsung television... I'm not asking this question on the forum as someone who doesn't have a clue 

Basically I wanted a technical explanation as to why it's going to be better for TV/DVD/Blu-ray usage... not just "it looks better because I saw another panel at my mate's house and it looked better than my normal one at home", if you know what I mean... all the explanations have been good so far but I'm still not really sold...

anyone else understand what I mean?


----------



## twicksisted (Oct 25, 2009)

Binge said:


> I'm only asking because digital cameras don't need to expose film, therefore no shutter speed.



of course a digital camera uses a shutter.... DSLR cameras are exactly the same as SLR cameras (at least my Nikon is)... instead of film it uses a sensor and an analogue-to-digital converter to digitise the image instead of burning it to film
anyways, that's beside the point


----------



## Binge (Oct 25, 2009)

human_error said:


> It is a cumulative effect not an individual frame-by-frame effect, so you don't notice extra frame #1052 but you notice that there are 3x more frames (this is how you can see the difference between 30 and 60fps and beyond, it isn't that you see _every_ frame but when your brain decodes the images it sees it doesn't hit spots on a re-render of the image as often so it doesn't notice the re-renders as much (it's all about timing).



This is going to be pure argument at best, but you're contradicting yourself by saying the brain can tell 30 from 60fps and beyond.  If a brain can register a change in display, and if I can describe it in terms of clarity of motion, then it means I am experiencing each frame.  You may not see a distortion every time the screen refreshes, but it can be experienced.  If you play on a 60Hz CRT long enough you'll notice the refresh rate bothering your eyes.  Turn the refresh rate up to 75Hz and magically the irritation is gone, and somehow something played at 75fps seems a bit smoother than the same thing at 60fps.  This is because you're experiencing it to a degree.  To be fair, they may not be 75 frames I could draw or recall individually - my brain isn't really engaged with that level of detail.  Motion quality, however - I can say with certainty that I can experience frame rates in the 40-70 range and notice a difference.


----------



## twicksisted (Oct 25, 2009)

Binge said:


> This is going to be pure argument at best, but you're proving yourself wrong by saying that the brain can understand 30 vs 60fps and beyond.  If a brain can understand a change in display, and if I can describe it in terms of clarity of motion, then it means I am experiencing each frame.  You may not see a distortion as the screen refreshes all the time, but it can be experienced.  If you play on a 60Hz CRT enough you'll notice that the refresh rate is bothering your eyes.  Turn the refresh rate to 75Hz and magically the irritation is gone, and somehow something being played at 75fps seems a bit more smooth than the same at 60fps.  This is because you're experiencing it to a degree.  To be fair it may not be 75 frames that I could draw or recall, but I'm no autistic kid with my brain really involved with those kinds of details.  Motion quality, however, I can say with certainty that for the time being I can experience high 40-70 frames and notice a difference.



fair enough... I'm not seriously trying to argue here... I just want to know whether it's advertising hype or an actual benefit... these panels are being sold as televisions (not gaming monitors), and television is broadcast in frame rates all covered by 60Hz panels... to date there isn't a broadcast format that exceeds, or even reaches, the need for 100Hz, 120Hz, 200Hz etc... so I'm asking: is there a point, and if so, why... that's all 

it's like selling a Ferrari to do the school run.... if you see what I'm getting at


----------



## Binge (Oct 25, 2009)

Your rationality is spot on, and I can only respond with my usual harping about high frame count in video games.  Without respect to games it's the motion blur I described above, but even then I wouldn't use an LCD if I was bothered by that motion blur.


----------



## twicksisted (Oct 25, 2009)

Binge said:


> Your rationality is spot on, and I can only respond with my usual harping about high frame count in video games.  Without respect to games it's the motion blur I described above, but even then I wouldn't use an LCD if I was bothered by that motion blur.



cool - well, what I really wanted to know with this thread is whether I'm mis-advising people who want to buy a new panel by steering them towards the best-quality 60Hz panel they can afford rather than a standard commercial 100Hz panel (on the logic that because it's got more Hz it must be better etc...)

after googling and reading many threads online about similar topics I can't really find a good reason myself... so I guess I posted here as I respect a lot of the members' opinions - they are based on fact and backed up with good technical explanation and proof... 

I'm the typical upgrade fanatic and I always try to buy the best for myself if it is technologically actually better... and because of this a lot of friends and family come to me for advice on whatever technological piece of kit they want to buy... I just wanted to know whether there was something I was missing on this


----------



## niko084 (Oct 25, 2009)

twicksisted said:


> its like selling a ferrari to do the school run.... if you see what im getting at



I understand what you are saying now.

OK, to be more clear:

100Hz is going to be a bit smoother - whether that's noticeable in itself I couldn't say. You get more still frames, and your set's anti-retention (if it has it) is going to move much quicker and be less noticeable.

In the US we have 24/30/60fps sources, therefore 120Hz is the ticket: it's divisible by 24, 30 and 60.
In Europe it's 24/25/50, so 100Hz: it's divisible by 25 and 50, and nearly by 24.

The frame rates dividing evenly is the most important part when you are discussing Hz vs fps. When you watch a Blu-ray movie on a 60Hz screen the 24 frames can't be spread evenly across the 60 refreshes, which causes a judder - one some people don't notice; personally it drives me insane! Low Hz can also cause flicker; having 2x or more the Hz of the fps helps eliminate this completely. It's also part of why plasmas are so clean - huge multipliers, so the frame switch is very fast.

On a 60Hz / 60fps set, the switch from frame 1 to frame 2 takes a 60th of a second.
On a 120Hz / 60fps set, the switch from frame 1 to frame 2 takes a 120th of a second.
That is what flicker comes down to; some people are more susceptible to it than others.
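The divisibility point can be sketched in a few lines of Python - a rough illustration (the function and its name are mine, not any TV's actual logic) of how many refreshes each source frame gets held for:

```python
def hold_pattern(refresh_hz, source_fps, n_frames=6):
    """How many consecutive refreshes each source frame is shown for."""
    counts, shown = [], 0
    for frame in range(1, n_frames + 1):
        due = frame * refresh_hz // source_fps  # refreshes elapsed after this frame
        counts.append(due - shown)
        shown = due
    return counts

print(hold_pattern(60, 24))   # [2, 3, 2, 3, 2, 3]  uneven hold -> judder
print(hold_pattern(120, 24))  # [5, 5, 5, 5, 5, 5]  even hold -> smooth
print(hold_pattern(100, 25))  # [4, 4, 4, 4, 4, 4]  PAL 25fps fits 100Hz exactly
```

The alternating 2/3 hold at 60Hz is exactly the 3:2-pulldown judder being described; at 120Hz for 24fps, or 100Hz for 25fps, every frame is held for the same number of refreshes.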

Now, if flicker isn't an issue for you, I wouldn't jump to buy the faster screen - but then we come to the other part: most screens with higher Hz are also much better screens all around - better colour, faster response, better blacks, etc.

That's about as technical as I know about sets, and it was the main reasoning behind purchasing my plasma, beyond the blacks being black. It's also why I am waiting very impatiently for an OLED monitor for my computer.

Something you may notice more easily (again, back to the gaming side): play a game you get over 100fps in, and you'll occasionally notice some judder when you move quickly. Now lock your fps at 30 and play it again: you will probably get no judder, but you will get some blurring. I know this is gaming and not television, but it is the same issue, just taken a bit further in something you are paying much closer attention to.


----------



## twicksisted (Oct 25, 2009)

I too have noticed frame judder on Blu-ray playback... but having said that, it was only when I used my PC's Blu-ray drive to play back movies... as soon as I got a decent standalone Blu-ray player the judder went completely (having said that, my GF couldn't see the judder on the PC in the first place and I still spent the £300 on a player lol)...

anyways... I'm not so sure it's a question of getting a 100Hz panel so much as getting a good-quality panel... for instance Pioneer and Sony make some "videophile" (if that word's been coined yet) displays that run at 60Hz... with fast pixels, vibrant colours etc... and these will be noticeably different from a standard mainstream LCD screen... loads of cheap screens look crap, that goes without saying.

So, advising someone to buy an excellent-quality 60Hz panel as opposed to spending the same money on a bog-standard run-of-the-mill 100Hz panel just because it's 100Hz... will the 100Hz in itself make for a better visual experience?


----------



## Binge (Oct 25, 2009)

twicksisted said:


> i have too noticed frame judder on blueray playback... but having said that it was only when i used my PC's blueray player to playback blueray movies... as soon as i got a decent standalone blueray player that judder went completely (having said that my GF couldnt see the judder on the pc in the first place and i spent the £300 on a player lol)...
> 
> anyways... im not so sure its a question of getting a 100hz panel as getting a good quality panel... for instance pioneer and sony make some "videophile" (if that words been coined yet) displays that run at 60hz... with fast pixels, vibrant colours etc... and these will be noticeably different to a standard mainstream LCD screen.... loads of cheap screens look crap that goes without saying.
> 
> So advising someone to buy a excellent quality 60hz panel as opposed to spending the same money on a bog standard run of the mill 100hz just because its 100hz... will that in itself make it a better visual experience?



I vote for whichever has the better IQ/motion clarity.  By your example you can get a 60Hz screen with fast pixels and vibrant colours.  That alone would make me feel more secure than buying into a 100Hz screen where I may or may not be getting the same quality panel.


----------



## niko084 (Oct 25, 2009)

twicksisted said:


> for instance pioneer and sony make some "videophile" (if that words been coined yet) displays that run at 60hz...



If you talk about hardcore videophiles there is only one set... the Pioneer Kuro plasma, end of discussion - it's one of VERY few items the extremists have no argument about 

I believe the Kuros are 600Hz.

***
I would say you are better off investing in a high-end set than jumping for the Hz, yes.
The Hz shouldn't be the absolute deciding factor.

My set is 480Hz. I picked it over a Kuro mainly due to cost; it's still so incredibly clean and crisp it nearly drives me nuts.


----------



## farlex85 (Oct 25, 2009)

niko084 said:


> If you talk about hardcore Videophiles there is only 1 set... Pioneer Kuro Plasma, end of discussion, it's one of VERY few items to the extremists there is no argument about
> 
> I believe the Kuro's are 600hz.
> 
> ...



Kuros actually have a 72Hz mode for 24p material (Blu-ray and such), but for all other sources they use 60Hz. 600Hz belongs to the new Panasonics, but it isn't a refresh in the same sense and it isn't for the same effect. And yes, they are godly, although the latest top Panasonics and Samsungs come mighty close.

IMO 120Hz (and now 240Hz) produces no noticeable effect in and of itself. The only "advantage" of TVs that carry this refresh is the ability to interpolate and thus reduce judder; however, this has an effect that I and many others actually find quite negative, reducing a movie to a soap-opera feel (some source material, such as sports, can benefit from it). The more important thing is a mode that is a multiple of 24, preferably at least 72, so that 24p material can be played back smoothly in its original format. Other than that, Hz is pure marketing.


----------



## niko084 (Oct 25, 2009)

farlex85 said:


> Kuro's actually have a 72HZ mode for 24p material (blue-ray and such) but for all other sources use 60hz. 600hz belongs to that of the new Panasonics, but it isn't a 600hz refresh in the same way, and it isn't for the same effect. And yes, they are godly, although the latest best Panasonics and Samsungs come mighty close.



You are right - the Kuro has a cool step-down feature on the subfield. Actually it's the same with the Panasonics claiming 480-600Hz; the Kuros did the same thing. It's a bit of marketing hype - it sounds like a much bigger difference than it is.

The biggest thing about plasma vs LCD is response time: an LCD at 2ms is SCREAMING, while with plasma we are talking about 0.001ms, maybe..



> IMO 120hz (and now 240hz) produces no noticeable effect in of itself. The only "advantage" of TVs that carry this refresh is the ability to interpolate and thus reduce judder, however this has an effect that I and many others find to be actually quite negative, reducing a movie to a soap opera type feel (some source material, such as sports, can benefit from it). The more important thing is a mode that comes in a multiple of 24, preferably at least 72, so that 24p material can be played back smoothly in it's original format. Other than that HZ is pure marketing.



I can agree with that - some people do like a bit of judder; it keeps that "movie" feeling.

That's really why I say, when it comes down to it: go sit down, watch a bunch of them, and decide what you really like, because the coolest, fastest technology may not be for you.


----------



## farlex85 (Oct 25, 2009)

niko084 said:


> You are right, the Kuro has a cool step down feature on the subfield, actually it's the same with the Panasonics claiming 480-600hz, Kuro's did the same thing, it's a bit of marketing hype, it sounds like a lot bigger difference than it is.
> 
> The biggest thing about Plasma vs LCD is the response time, LCD 2MS is SCREAMING, Plasma we are talking about .001ms maybe..
> 
> ...



Pioneer never claimed a refresh rate higher than 72Hz; they were 60Hz all the way. Newer high-end Panasonics (V10s) have a similar feature where they refresh at 96Hz for 24p material. What they advertise on the cards in the store is just to compete with LCDs at 120-240Hz; it's a different sort of refresh. It's almost the same thing as contrast ratio: the displayed contrast in no way reflects any real number that can be used to compare TVs (everyone uses different standards and methods to jack up CRs, so they aren't comparable). You just have to read reviews to find out which ones have the inkiest blacks (still the Pioneer, although as I said the V10 and Samsung's 8500 come mighty close), and usually the high-end sets will have some sort of 3:2 pulldown handling, which is really the only refresh feature one needs to look at - unless one enjoys de-judder interpolation.

I agree you should try before you buy; unfortunately many big-box stores don't set things up properly for you to really compare. Try some local home-theatre stores, or friends, for a real look.


----------



## FordGT90Concept (Oct 25, 2009)

100+ Hz is a joke.

30 fps = 30Hz minimum = at most a 1/30th-second delay before a new frame is displayed

60Hz = 1/60th-second delay between refreshes
120Hz = 1/120th-second delay between refreshes

Your eyes only operate at ~24Hz. 60Hz makes sense because every time your eyes refresh, the screen has also refreshed at least once. 120Hz is far more than your eyes can detect and is therefore pointless - unless you are in the media industry, where cameras record at 30fps / 60Hz for the same reason 60Hz is good enough for human eyes.
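Whatever one makes of the "~24Hz eye" claim (it is disputed further down the thread), the refresh-interval arithmetic itself is easy to check - a trivial sketch, with a helper name of my own invention:

```python
# Interval between display refreshes - the worst-case wait before a
# newly arrived source frame can appear on screen.
def refresh_interval_ms(hz):
    return 1000.0 / hz

for hz in (30, 60, 120):
    print(f"{hz:3d} Hz -> a new refresh every {refresh_interval_ms(hz):.2f} ms")
```

So going from 60Hz to 120Hz halves the refresh interval from ~16.7ms to ~8.3ms; the argument in this thread is over whether that halving is visible when the source is only 24-30fps.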


----------



## Binge (Oct 26, 2009)

FordGT90Concept said:


> 100+ Hz is a joke.
> 
> 30 fps = 30 Hz minimum = maximum 1/30th second delay between new display
> 
> ...



Source that isn't wiki?


----------



## farlex85 (Oct 26, 2009)

Binge said:


> Source that isn't wiki?



His facts are indeed incorrect, but the main sentiment seems good enough. There is no single number of frames the human eye can detect, nor a common refresh rate (that statistic doesn't even really make sense when referring to human vision - our eyes don't "refresh"). As usual with Ford, don't take his statistics as actual science, but sometimes he can get things going in the right direction. It is true most people won't notice the difference between 120Hz and 60Hz on equivalent 30fps material, although to be sure some will.


----------



## FordGT90Concept (Oct 26, 2009)

The eyes are always receiving visual information, but the brain only processes a still image about 24-28 times per second.  It is relatively easy to test - just show an image for 1/20th to 1/30th of a second and see if the individual saw anything.  It can also be tested with a strobe light, by setting the frequency of the strobe so that it appears constantly on even though it is not.

Video cameras are the same in that the optics are always receiving light, but an image is only saved every x milliseconds.

When it comes to recreating the image, the display has to refresh faster than the "eye" (be it human or device) does.  If it does not, an update of the frame may be skipped by the "eye", making it appear to jump (i.e. lag).  Everything has to happen in sequence, and at a rate faster than the "eye", in order to appear smooth.  Anything greater than that rate is not useful.  As such, if you have no problems with a 60Hz display, you won't gain anything by getting something greater than 60Hz, all things being equal.


By the way, this phenomenon is called "persistence of vision."  LCDs don't have this problem because the technology itself makes sure the "vision" is always there.  Higher refresh rates are completely useless unless the frame rates are higher as well.


----------



## farlex85 (Oct 26, 2009)

FordGT90Concept said:


> The eyes are always receiving visual information but the brain only processes a still image about 24-28 times per second.  It is realtively easy to test--just show an image for 1/20th to 1/30th of a second and see if the individual saw anything.
> 
> Video cameras are the same in that the optics are always receiving a visual but it only saves an image every x number of milliseconds.
> 
> When it comes to recreating the image, the display has to refresh faster than the "eye" (be it human or device) does.  If it does not, an update of the frame may be skipped by the "eye" making it appear to jump (e.g. lag).   Everything has to happen in sequence and at a rate faster than the "eye" in order for it to appear smooth.  Anything greater than that rate is not useful.  As such, if you have no problems with a 60 Hz display, you won't gain anything by getting something greater than 60 Hz, all things being equal.



Selective attention and the way the brain processes visual stimuli make it more complicated than your proposed example. There is no evidence of a set number of fps the brain can detect across our entire species. But yes, if 60Hz looks good, it is likely 120Hz won't look tremendously better. The primary reason for its advent was to enable interpolation (de-judder, which is very noticeable) and to compensate for LCD's previously poor response times. Then it became a marketing ploy.


----------



## Easy Rhino (Oct 26, 2009)

FordGT90Concept said:


> 100+ Hz is a joke.
> 
> 30 fps = 30 Hz minimum = maximum 1/30th second delay between new display
> 
> ...



I don't think it really matters what our eyes can detect. 120Hz offers far smoother video playback during intense action sequences; video is much more life-like at 120Hz than it is at 60Hz.


----------



## FordGT90Concept (Oct 26, 2009)

Your eyes can't detect "smoother video playback."  As farlex said, the difference isn't the Hz but the extra processing (interpolation) done with the extra refreshes.  "All things being equal," 120Hz is useless.


----------



## farlex85 (Oct 26, 2009)

Easy Rhino said:


> i dont think it really matters what our eyes can detect. 120hz offers far smoother video playback during intense action sequences. video is much more life-like at 120hz than it is at 60hz.



Video is much more "life-like" when you turn on interpolation. With the new Samsung sets you can turn on 120Hz without turning on interpolation (the LED ones and the refreshed 750+ series; most other sets automatically use de-judder once 120Hz is turned on). Interpolation's effect is dramatic, especially on a Samsung; 120Hz on its own, not so much.


----------



## Easy Rhino (Oct 26, 2009)

FordGT90Concept said:


> Your eyes can't detect "smoother video playback."  As farlex said, the difference isn't the Hz but extra steps (interpolation) they are doing with the extra refreshes.  "All things being equal," 120 Hz is useless.



so I'm not noticing the difference between my 60Hz TV and my 120Hz TV? It may not be the Hz itself that matters, but the technology in the new 120Hz TVs certainly does. So to the OP: get a 120Hz TV, you won't be disappointed.


----------



## farlex85 (Oct 26, 2009)

Easy Rhino said:


> so im not noticing the difference between my 60hz tv and my 120hz tv? it may not be the hz that matters, but certainly the technology in the new 120 hz tvs that does matter. so to the OP, get a 120 hz tv, you wont be disappointed.



Which model of 120hz do you have?


----------



## Easy Rhino (Oct 26, 2009)

farlex85 said:


> Which model of 120hz do you have?



both TVs are made by LG. I don't know the model numbers.


----------



## farlex85 (Oct 26, 2009)

Easy Rhino said:


> both tvs are made my lg. i dont know the model numbers.



OK, well LG uses a more subtle interpolation engine than AMP (Samsung's), but it is still very noticeable. What your TV is doing is roughly "estimating" where another frame would go in between two actual frames. This gives it a "life-like" or, as I like to say, "soap opera" effect. Some people love it; some people (like me) hate it. You really have to try it to see which camp you fall into - some people hate it at first and grow to love it. It totally ruins watching a movie for me, although I will admit it is cool in some circumstances. At any rate, this is not the effect of 120Hz but of interpolation (which, for all practical purposes, might as well be the same thing for the majority of the market and consumers). Whatever makes it good for ya.


----------



## Easy Rhino (Oct 26, 2009)

farlex85 said:


> Ok, well LG uses a more subtle interpolation engine than AMP (Samsung's) but it is still very noticeable. What your TV is doing is roughly "estimating" where another frame would go in between 2 actual frames. This gives it a "life-like" or as I like to say "soap opera" effect. Some people love it, some people (like me) hate it, you really have to try it to see which camp you would fall into. Some people hate it and grow to love it. It totally destroys watching a movie for me, although I will admit it is cool in some circumstances. At any rate, this is not the effect of 120hz, but rather interpolation (which it's true for all practical purposes might as well be the same thing with the majority of the market and consumers). Whatever makes it good for ya.



ah, yea i see what you are talking about. interesting that companies choose to market the hz rate of their televisions rather than all the other fancy (and more important) technology inside them. it probably has something to do with dumb consumers... but to the OP's original point, buy a 120hz tv ONLY AFTER YOU'VE COMPARED THE TWO WITH YOUR OWN EYES! my point is you wont find any of the new goodies that make video smoother during action sequences in a 60hz tv. i really dont think they will keep making 60hz televisions now that the hz rating is the marketing gimmick.

side note: i hate watching television on the 120hz tv but i really like watching movies on the 120hz tv.


----------



## farlex85 (Oct 26, 2009)

Easy Rhino said:


> ah, yea i see what you are talking about. interesting that companies choose to market the hz rate of their televisions rather than all the other fancy (and more important) technology inside them. it probably has something to do with dumb consumers...



Yeah, simplicity sells. You use fancy-schmancy words like "interpolate" and people stare at you blankly as if you're talking complex astrophysics, whereas if you just say "here, this number is bigger than this number, it's totally awesome" they start wetting their pants and calling all their friends to come watch. It makes you wonder how we came up with all this technology in the first place.......


----------



## Easy Rhino (Oct 26, 2009)

farlex85 said:


> Yeah, simplicity sells. You use fancy-schmancy words like "interpolate" and people stare at you blankly as if you're talking complex astrophysics, whereas if you just say "here, this number is bigger than this number, it's totally awesome" they start wetting their pants and calling all their friends to come watch. It makes you wonder how we came up with all this technology in the first place.......


----------



## FordGT90Concept (Oct 26, 2009)

But a higher 120 Hz rating also doesn't necessarily mean you are getting interpolation, so it is just another thing to check before buying a TV (on top of the dozens of other things already: contrast, dynamic contrast, latency, inputs, outputs, on-screen TV guide, smartsound, cabinet size, display size, panel type, weight, etc.).

Don't you remember when it was easy to buy a TV?  SmartSound (yes|no), tube size, and connectivity.


----------



## Mussels (Oct 26, 2009)

i actually have a 100Hz screen, my 24"


it only has 60Hz inputs so i have the usual limit to 60Hz/60FPS, but the screen internally "doubles" the image like human_error said with all his F's


it makes no real difference to quality, buy a quality screen based on viewing angles, inputs, outputs, and response times! not refresh rate! (remember that many games/consoles/HDMI devices ONLY work at 60Hz)


----------



## FordGT90Concept (Oct 26, 2009)

HDMI can do 75 Hz but yeah, most are at 60 Hz or less.


----------



## niko084 (Oct 26, 2009)

FordGT90Concept said:


> HDMI can do 75 Hz but yeah, most are at 60 Hz or less.



I'm really starting to wonder where your electronics and biology background comes from....
Get a monitor to run at 24hz and watch your desktop; if you don't see an overly noticeable flicker, you probably need brain surgery.
This is why even film projectors in newer cinemas flash each frame three times, for 72 flashes per second.

HDMI can do 1000hz or more if the output device will send it.
Ever see the marketed "120hz HDMI" cables... Ya, they are a joke...

Cable + Hz doesn't really mean anything; the two have absolutely nothing to do with each other. It's a completely meaningless rating or measurement.

That's like saying how much hard drive space your cpu can fill...


----------



## Mussels (Oct 26, 2009)

FordGT90Concept said:


> The eyes are always receiving visual information but the brain only processes a still image about 24-28 times per second.  It is relatively easy to test--just show an image for 1/20th to 1/30th of a second and see if the individual saw anything.  It can also be tested by using a strobe light--setting the frequency of the strobe to make it appear constantly on even though it is not.



there is a link i cant be stuffed finding right now, that shows the air force did tests and found their pilots could see an image of an aircraft shown for just one frame out of over 200 frames per second, AND identify it


the human eye can see an infinite number of frames per second, it all comes down to how the mind processes it - and that varies between people.


----------



## FordGT90Concept (Oct 26, 2009)

If you were just showing static text on it 24/7, 24 Hz is plenty for an LCD.  This is why HDMI 1.4 spec includes the 4096x2160 resolution at 24Hz.  HDMI 1.3 spec includes the 1920x1200 resolution at 75 Hz (which is a common display resolution and frequency).

Most LCD panels aren't fast enough to "flicker," at least like a CRT does.


It isn't "completely meaningless" because HDMI cables have a peak bandwidth that the data can't exceed.  An HDMI cable, for instance, can't do 4096x2160 at 75 Hz even if your devices supported it, because the cable bandwidth just isn't there.
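Back-of-the-envelope math makes the point. Assuming HDMI 1.3's roughly 10.2 Gbit/s TMDS ceiling, 24 bits per pixel, and ignoring blanking intervals and link encoding overhead (all simplifications), the raw pixel bandwidth alone already rules the mode out:

```python
# Rough feasibility check: does a video mode's raw pixel bandwidth fit in
# HDMI 1.3's ~10.2 Gbit/s? Assumes 24 bits/pixel and ignores blanking
# intervals and TMDS encoding overhead, which real links also need.

HDMI_1_3_GBPS = 10.2  # total TMDS bandwidth, Gbit/s

def needed_gbps(width, height, hz, bits_per_pixel=24):
    """Raw bandwidth in Gbit/s for an uncompressed video mode."""
    return width * height * bits_per_pixel * hz / 1e9

for w, h, hz in [(1920, 1200, 75), (4096, 2160, 24), (4096, 2160, 75)]:
    g = needed_gbps(w, h, hz)
    verdict = "fits" if g <= HDMI_1_3_GBPS else "too much for HDMI 1.3"
    print(f"{w}x{h}@{hz}Hz needs ~{g:.1f} Gbit/s -> {verdict}")
```

4096x2160 at 24 Hz comes in around 5.1 Gbit/s, which fits; the same resolution at 75 Hz needs roughly 15.9 Gbit/s, well past the cable's limit even before overhead.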




Mussels said:


> there is a link i cant be stuffed finding right now, that shows the air force did tests and found their pilots could see an image of an aircraft shown for just one frame out of over 200 frames per second, AND identify it


Pretty impressive.  At the same time, most people don't qualify to be an air force pilot because of their vision.  That figure, therefore, is likely to be towards the upper end of the spectrum rather than average.


----------



## Binge (Oct 26, 2009)

Mussels said:


> there is a link i cant be stuffed finding right now, that shows the air force did tests and found their pilots could see an image of an aircraft shown for just one frame out of over 200 frames per second, AND identify it
> 
> 
> the human eye can see an infinite number of frames per second, it all comes down to how the mind processes it - and that varies between people.



+1 and I've said things to the same effect before in this thread.  Standard TV/HD movies may not have the frames to make such a set really worth it, but I bet my nuts that just about anyone would notice a difference if they were playing their favorite racing/flight sim/shooter.


----------



## niko084 (Oct 26, 2009)

FordGT90Concept said:


> If you were just showing static text on it 24/7, 24 Hz is plenty for an LCD.  This is why HDMI 1.4 spec includes the 4096x2160 resolution at 24Hz.  HDMI 1.3 spec includes the 1920x1200 resolution at 75 Hz (which is a common display resolution and frequency).


It's not being sent in Hz, it's being sent in FRAMES.
Even though the images are sent at 24 FRAMES per second, the actual viewing device (TV, monitor, etc.) running at, say, 72hz will show that same frame 3 times before the next one arrives.
You are confusing Hz with FPS, two VERY different terms.
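The 24fps-onto-72hz case can be sketched like this (the refresh schedule is illustrative; 72 is an exact multiple of 24, so every frame gets an even hold):

```python
# Map a 24fps source onto a 72 Hz panel: each source frame is simply
# shown for 3 consecutive refreshes (72 / 24 = 3, an even multiple,
# so no pulldown trickery is needed).

def refreshes_for(fps_source, hz_panel, n_frames):
    """Return, per panel refresh, which source frame index is on screen."""
    repeat = hz_panel // fps_source  # assumes hz is an exact multiple of fps
    schedule = []
    for frame in range(n_frames):
        schedule.extend([frame] * repeat)
    return schedule

print(refreshes_for(24, 72, 3))
# -> [0, 0, 0, 1, 1, 1, 2, 2, 2]: 9 refreshes, but only 3 distinct frames
```

The refresh count (hz) goes up, but the number of distinct images per second (fps) is unchanged, which is exactly the Hz-vs-FPS distinction being argued here.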


> Most LCD panels aren't fast enough to "flicker," at least like a CRT does.


This is true, they blur instead because they still dim and re-light the image, although being it's not a strong change in contrast it's not noticeable in the same way.



> It isn't "completely meaningless" because HDMI cables have a peak bandwidth that too much data cant surpass.  An HDMI cable, for instance, can't do 4096x2160 at 75 Hz even if your devices supported it because the cable bandwidth just isn't there.


Again it's 4096x2160 @ 75FPS not HZ.



> Pretty impressive.  At the same time, most people don't qualify to be an air force pilot because of their vision.  That figure, therefore, is likely to be towards the upper end of the spectrum rather than average.


Also not true; most people fail due to stress-handling ability... Everyone is pretty used to the idea that if your eyesight isn't good enough you can't be a fighter pilot.


----------



## FordGT90Concept (Oct 26, 2009)

niko084 said:


> It's not being sent in Hz, it's being sent in FRAMES.
> Even though the images are sent at 24 FRAMES per second, the actual viewing device (TV, monitor, etc.) running at, say, 72hz will show that same frame 3 times before the next one arrives.
> You are confusing Hz with FPS, two VERY different terms.
> 
> Again it's 4096x2160 @ 75FPS not HZ.


That's not how it works.  The output device is always going to send a signal at regular intervals (Hz) regardless of whether or not it changed.  A new "frame" starts when the display buffer changes from the previous "frame."  Displays use Hz for bandwidth figures because displays usually don't have their own display buffer.  They need an image to be sent for rendering constantly; that is, the display is updated via the output device regardless if it changed or not.

No change = refresh
Change = frame (includes refresh)
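That refresh-vs-frame distinction can be sketched directly (buffer snapshots here are just placeholder labels): the output device transmits at its fixed rate whether or not the buffer changed, and a "frame" is simply a refresh whose contents differ from the last one.

```python
# The output device transmits at a fixed rate (Hz) regardless of changes;
# a new "frame" is just a refresh whose contents differ from the previous one.

def classify_refreshes(buffer_snapshots):
    """Label each transmitted refresh as a new frame or a repeat."""
    labels = []
    prev = None
    for snap in buffer_snapshots:
        labels.append("frame" if snap != prev else "refresh")
        prev = snap
    return labels

# 6 refreshes sent at the fixed rate, but only 3 distinct frames:
print(classify_refreshes(["A", "A", "B", "B", "B", "C"]))
# -> ['frame', 'refresh', 'frame', 'refresh', 'refresh', 'frame']
```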





niko084 said:


> Also not true; most people fail due to stress-handling ability... Everyone is pretty used to the idea that if your eyesight isn't good enough you can't be a fighter pilot.


The details are here:
http://usmilitary.about.com/cs/genjoin/a/pilotvision.htm


----------



## niko084 (Oct 26, 2009)

FordGT90Concept said:


> That's not how it works.  The output device is always going to send a signal at regular intervals (Hz) regardless of whether or not it changed.  A new "frame" starts when the display buffer changes from the previous "frame."  Displays use Hz for bandwidth figures because displays usually don't have their own display buffer.  They need an image to be sent for rendering constantly; that is, the display is updated via the output device regardless if it changed or not.
> 
> No change = refresh
> Change = frame (includes refresh)



There is no such thing as "regular intervals" when it comes to digital either; that's what we call jitter, and your screen is also susceptible to it.

Hertz: a unit of frequency equal to one cycle per second.
If you take that as literally as you can, then on a digital scale (0s/1s, to keep it simple) 60hz would mean 60 0s and/or 1s in a second. That doesn't equal 24/25/30/50/60 FPS; there is WAY more involved, especially now that you have to consider audio on top of it. HDMI 1.3b is 10.2 Gbit/s, and if you say a single 0 or 1 is 1 bit of data, then "hz on an HDMI cable" works out to 10,952,166,604.8 bit/s. Since an HDMI cable has 19 wires, something like 576,429,821.30hz per conductor is more like the reality of the cable's rate of frequency.

Hertz and FPS are two VERY different terms.
*Although they are also very commonly misused in the electronics "marketing" field as most of us know*
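The per-conductor arithmetic above does check out, as long as you read the 10.2 "Gbit/s" with a binary prefix and accept the 19-wire framing (both are that post's assumptions; the HDMI spec actually carries video on 3 TMDS data channels, not all 19 conductors):

```python
# Reproduce the per-wire division above. The 10.2 "Gbit/s" is being read
# as 10.2 * 2**30 bits/s (binary prefix); the HDMI spec actually splits
# video across 3 TMDS data channels, not all 19 conductors.

total_bits_per_s = 10.2 * 2**30    # 10,952,166,604.8 bit/s
per_wire = total_bits_per_s / 19   # naive per-conductor rate

print(f"{total_bits_per_s:,.1f} bit/s total")
print(f"{per_wire:,.2f} Hz per conductor")
```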

After further review you are correct: LCD displays do hold the same image until sent a new one; Plasma and CRT do not share this.

The increased hz on an LCD is simply there to reduce judder, vs. using 3:2 pulldown and other such methods.

Plasma and CRT screens, on the other hand, use it for very important reasons beyond judder, because they actually flash.
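The 3:2 pulldown mentioned above is what a 60hz set has to do with 24fps film: alternate frames are held for 3 and 2 refreshes, and that uneven cadence is the judder. At 120hz every frame gets an even 5 refreshes, so no pulldown is needed. A sketch:

```python
# 3:2 pulldown: fit 24fps film onto a 60 Hz display by holding frames
# for 3, 2, 3, 2, ... refreshes. The uneven hold times are the "judder";
# at 120 Hz every frame gets an even 5 refreshes (120 / 24 = 5).

def pulldown_32(n_frames):
    """Per-refresh source-frame index for 24fps content on a 60 Hz panel."""
    schedule = []
    for frame in range(n_frames):
        hold = 3 if frame % 2 == 0 else 2  # alternate 3-refresh / 2-refresh holds
        schedule.extend([frame] * hold)
    return schedule

print(pulldown_32(4))
# -> [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]: 10 refreshes per 4 frames = 60 Hz for 24 fps
```

Every pair of film frames occupies 5 refreshes (3 + 2), so 24 frames fill exactly 60 refreshes, but half the frames sit on screen 50% longer than the other half.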


----------



## WarEagleAU (Oct 26, 2009)

Not sure about where ya'll are at but the rage is *120*Hz TVs here in the states and they go for a bit more. We are now seeing a flood (HHGregg, Best Buy, Target, WALLY WORLD, etc) of 240Hz Tvs. To me, I have a 60hz 720p but when I see a 120 or 240 (LED, OLED, etc) or whatever, it looks leaps and bounds better than mine.


----------



## niko084 (Oct 26, 2009)

WarEagleAU said:


> Not sure about where ya'll are at but the rage is *120*Hz TVs here in the states and they go for a bit more. We are now seeing a flood (HHGregg, Best Buy, Target, WALLY WORLD, etc) of 240Hz Tvs. To me, I have a 60hz 720p but when I see a 120 or 240 (LED, OLED, etc) or whatever, it looks leaps and bounds better than mine.



There is a valid reason for the higher hz, and it's been said: it's all about not needing pulldown for the various streams, since the higher rate is an even multiple of all the above frame rates.

The other side of that is, the screens themselves are the higher-end screens, physically better in other areas, which makes much more difference than the refresh rate itself, which isn't a lot on its own.

The giant marketing push on hz is simply that, marketing. Does it make a difference? Yes, but it's nowhere near the difference the rest of the screen makes; it just gives the average consumer a number they can relate to.

As I think might have been said above, explaining to a customer that a plasma is 480-600hz vs a 60-120hz LCD is really easy; explaining that it has a .001ms response time and 8-10 60hz sub-fields is a little more than most care to hear. Keep it simple.

Same reason companies market their "home theater in a box kits" as 1000, 1500, 2000 watts... BS it only draws 300 from the WALL! It sounds big, and that's what people like.

800hp doesn't mean a thing with 5ft/lbs of torque


----------



## farlex85 (Oct 26, 2009)

WarEagleAU said:


> Not sure about where ya'll are at but the rage is *120*Hz TVs here in the states and they go for a bit more. We are now seeing a flood (HHGregg, Best Buy, Target, WALLY WORLD, etc) of 240Hz Tvs. To me, I have a 60hz 720p but when I see a 120 or 240 (LED, OLED, etc) or whatever, it looks leaps and bounds better than mine.



There are many things that factor into making a television look good, and the refresh rate isn't really one of them. It can make you ooh and ah when they turn on interpolation though, so if you mean the life-like effect then yeah, it's cool. Other factors, primarily contrast ratio (and, in a big-box store, brightness), give an image its pop and crispness, even more so if the settings are correctly calibrated. TV tech has been advancing fast and today's TVs are much better than their 3-4 year old brethren, but that has very little to do w/ 120hz.


----------



## niko084 (Oct 26, 2009)

farlex85 said:


> TV tech has been advancing fast and today's TVs are much better than their 3-4 year old brethren, but that has very little to do w/ 120hz.



Indeed, backlighting on LCD's is probably the single biggest thing, these new LED screens are ELITE.

Sony's triple panel screens that layer RGB are really nice too!

On top of all sorts of other things that are not really marketed or talked about among general users and consumers, stuff I don't even keep up with.
*Like the ram chips on a Sapphire card vs a Powercolor*


----------



## farlex85 (Oct 26, 2009)

niko084 said:


> Indeed, backlighting on LCD's is probably the single biggest thing, these new LED screens are ELITE.
> 
> Sony's triple panel screens that layer RGB are really nice too!
> 
> ...



Yeah LED backlighting is pretty awesome, once it finally comes to full fruition at a reasonable price it will likely finally be the nail in the coffin for plasmas. Although Panasonic may be able to pull out something cool to keep them around. I'm still hoping SED's and Laser TVs become a reality.


----------



## niko084 (Oct 26, 2009)

farlex85 said:


> Yeah LED backlighting is pretty awesome, once it finally comes to full fruition at a reasonable price it will likely finally be the nail in the coffin for plasmas. Although Panasonic may be able to pull out something cool to keep them around. I'm still hoping SED's and Laser TVs become a reality.



OMG Laser! Yes..

Ya, LED LCDs are nice. I don't think they will take out plasmas though, at least not for some time, but it would make for a nice monitor!

Laser and/or SED very well could though...

Now if they could get the power draw and maybe the weight on plasmas down a bit, that would be nice... My 42" takes about 160-180 watts with a peak at something like 485...


----------



## FordGT90Concept (Oct 26, 2009)

niko084 said:


> There is no such thing as "regular intervals" when it comes to digital either, that's what we call Jitter, and your screen is also susceptible to it.
> 
> ...


FPS only exists in the context of a display adapter (it produces frames as fast as it can).  Monitors only care that they receive an image to display so many times per second (hertz).  Again, very few displays have a display buffer internally (they can't "store" an old frame to use over and over).


----------

