# HDR and 120Hz refresh rate coming to TV



## qubit (Dec 12, 2014)

It's the higher 100/120Hz refresh rates that I'm really excited about. Combine this with a strobing backlight and you've got liquid-smooth, blur-free movement. We've been stuck at 50/60Hz for far too long.

http://www.theregister.co.uk/2014/12/11/breaking_fad_high_dynamic_range_hdr_tv_roadmap


----------



## puma99dk| (Dec 12, 2014)

Well, we have 100Hz HDTVs now and they've been out for a while. The problem is more or less that the HDMI ports on TVs only support 60Hz input, but maybe they will put DisplayPort into TVs later so we can actually use the panel technology they build in. Otherwise 100, 200, 600 and 800Hz or whatever they claim isn't really usable, since movies are more or less 24, 25 or 29.97fps.


----------



## qubit (Dec 12, 2014)

That's an interpolated 100/120Hz that we have now. While some TVs are very good at it, we're still stuck with the same Nyquist sampling limit imposed by the low refresh rate of the input signal. The new standard will introduce a true 120Hz input, like we enjoy with our computers, doubling it.

And yeah, so much content is stuck at a crappy 24Hz or so for no good reason. Content will have to match the new standard, and then we'll finally get judder-free programming.


----------



## The Von Matrices (Dec 12, 2014)

I can maybe see sporting event broadcasts using 120Hz, since people are already accustomed to watching 720p60 HDTV, but will consumers actually say "I want 120Hz video", or will they instead say "I just want to watch my team/TV show"?  I have trouble believing that more than a few technology enthusiasts will consistently notice the difference and demand 120Hz video.  4K broadcasts (and 3D before them) suffer a similar issue.  HDTV succeeded only because it was such a dramatic and perceptible difference from 480i.  Convincing people to upgrade from 1080i30/720p60 broadcasts to 4K/120 might be possible, but if 4K60 catches on first, then 4K120 will never become popular, because it is only an incremental upgrade.

Convincing filmmakers to use HFR is, in my opinion, impossible.  In modern filmmaking you could call 24Hz video a cinematographic effect, because there's no good reason to use it otherwise.  The only reason 24Hz is "cinematic" is that it's what cinemas have been using for people's entire lifetimes, and therefore people associate any 24Hz video with cinema.  I saw _The Hobbit_ films in 4K 3D HFR at the local theater and thought they looked spectacular, but in contrast all the friends with whom I went thought 48Hz looked "fake" or "too much like a TV show".  I just don't see how that battle can be overcome to convince people to give up the effect; I equate it to eliminating the huge fireballs from film explosions (because explosives don't actually generate conflagrations) - it's standard practice and continues to be used because that's what people want to look at and what filmmakers have been using all their lives.

HDR images are a slightly different problem, because you need a dark room to actually make use of the technology.  If you look at HDR images in a bright room, with natural sunlight or even just the lights on, you can't see the detail in the lower range; it just becomes black.  Just like with 3D content, you need specific viewing circumstances to make the most of it, and from what I've experienced, most people aren't going to bother creating ideal viewing conditions or even worry about picture quality - otherwise we wouldn't see masses of people buying $150 LCD TVs with 200:1 contrast ratios.

In the end, the success of such a technology isn't technical at all.  The success all comes down to whether consumers demand the content, and while I support the idea, I just don't see the average consumer caring enough.


----------



## qubit (Dec 13, 2014)

Von, sadly I agree with your assertion that the general population is just too dumb to care (I know you didn't use those words). Yeah, as long as there is a noise or a picture coming from their device, they don't care - and I don't care for such mediocrity.

Still, judging by that article, higher frame rates are coming anyway, so it's not all bad. No doubt the marketing will take care of the great unwashed's doubts. 

People, even movie directors, often dislike higher frame rates, complaining of the so-called "soap opera effect" instead of appreciating the greatly improved picture quality they bring, especially the elimination of judder, which is my pet hate. I think they should do more to increase frame rate than resolution, though. For example, while 4K does indeed look better than 2K, the difference isn't as obvious as doubling the frame rate would be.

Here are some links about higher frame rates and what people think of them:

http://reviews.cnet.com/8301-33199_7-57569102-221/what-is-the-soap-opera-effect/

http://televisions.reviewed.com/New...d-why-480Hz-looks-terrible-on-your-new-TV.htm

http://news.cnet.com/8301-33620_3-5...ct-when-your-tv-tries-to-be-smarter-than-you/

http://news.cnet.com/8301-17938_105-9942886-1.html


----------



## Blue-Knight (Dec 13, 2014)

qubit said:


> We've been stuck at 50/60Hz for far too long.


Because it is (more than) enough.



Spoiler: Stupid comments (DO NOT quote)

TV video frame rates are 60 right here, and most movies are only 24fps...

Funny thing is that if I say I bought a GTX 980 for gaming at 640x480, with the frame rate capped at 25, at the lowest possible settings, people will accuse me of "overkill". But a 120Hz TV displaying video at 24Hz somehow is not.

Incomprehensible!

Just my opinion.


----------



## qubit (Dec 13, 2014)

Indeed BN, there's mountains of content filmed at 24Hz (films, dramas, documentaries and more) so all one can do with it is interpolate to the higher frame rates. Some TVs are really good at it, too.


----------



## twilyth (Dec 13, 2014)

I'm pretty sure the current crop of 3D UHD sets accommodate both 120Hz and HDMI 2.0.  The only real issue, I think, is which compression standard is going to be used: Netflix is using HEVC while Amazon seems to favor VP9 - at least I think I've got that right.  And both have some 4K content that you can stream right now.  Not much, but some.

The problem, from what I've read, is HDCP, which means you can't play 4K content from your PC yet, until they work out the DMCA issues with it.  But if you use Amazon Fire I'm pretty sure it's not an issue.  IDK for sure, since I don't really keep up with this stuff, but I've been trying to do some research since I'm looking into getting a UHD set soon.

This is just a guess, but is it possible that higher frame rates help with compression?  I know that MPEG-4 encodes the difference between frames rather than trying to compress each one separately.  If you have a higher frame rate, less information changes from one frame to the next, so maybe you get better overall compression even though technically you have twice as many frames.  IDK, just a thought.


----------



## qubit (Dec 14, 2014)

twilyth said:


> This is just a guess, but is it possible that higher frame rates help with compression?  I know that MPEG-4 encodes the difference between frames rather than trying to compress each one separately.  If you have a higher frame rate, less information changes from one frame to the next, so maybe you get better overall compression even though technically you have twice as many frames.  IDK, just a thought.


That's an interesting point. Twice the information is being sent, so you'd need twice the bandwidth for uncompressed video. Even with compressed video you'd need more bandwidth, but likely significantly less than twice as much, due to the smaller differences between frames, as you say. No doubt the compression could be improved further still by comparing several frames together, perhaps even hundreds, where a scene is fairly static. That would take a faster processor and more memory, of course.
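A toy sketch can make that intuition concrete (this is my own illustration, not how any real codec works): a block moving at a fixed speed changes half as many pixels per frame at double the frame rate, so the total delta data per second stays roughly constant.

```python
# Toy model: an 8-pixel-wide block moving at 240 px/s across a 1-D "frame".
# Compare total pixel changes per second at 30 fps vs 60 fps, as a stand-in
# for the inter-frame delta data a difference-based codec would encode.

def frame(width, block_pos, block_w=8):
    """Render a 1-D frame: 1 where the block is, 0 elsewhere."""
    return [1 if block_pos <= x < block_pos + block_w else 0 for x in range(width)]

def changed_pixels_per_second(fps, width=1000, speed=240):
    """Sum of per-frame pixel differences over one second of motion."""
    total = 0
    prev = frame(width, 0)
    for i in range(1, fps + 1):
        cur = frame(width, round(i * speed / fps))
        total += sum(a != b for a, b in zip(prev, cur))
        prev = cur
    return total

print(changed_pixels_per_second(30))  # twice the change per frame...
print(changed_pixels_per_second(60))  # ...but the same total per second
```

Both calls print 480 here: doubling the frame rate doubles the frame count but halves the per-frame difference, so the per-second delta payload is unchanged in this idealized case. Real codecs won't break even exactly, but it shows why the overhead is much less than 2x.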


----------



## erocker (Dec 14, 2014)

qubit said:


> the general population is just too dumb to care



If aliens were to visit, there should be a large sign next to the planet stating this^.  Earth's slogan.


----------



## marmiteonpizza (Dec 14, 2014)

I swear the maximum refresh rate of our eyes is 60 though...? So wouldn't over that be impossible to notice a difference? May be completely wrong though.


----------



## qubit (Dec 14, 2014)

Joel Charig said:


> I swear the maximum refresh rate of our eyes is 60 though...? So wouldn't over that be impossible to notice a difference? May be completely wrong though.


The human visual system* doesn't have a refresh rate in the sense that a computer does, so there's no maximum refresh rate as such; it works in an analog way. The maximum perceptible refresh rate therefore depends on the person, the equipment used, the environment, and even the person's health and the time of day.

Between 60Hz and 120Hz the difference is clearly visible and dramatic (I speak from experience with my own equipment, both CRTs and LCDs), and as you go higher it gets less and less obvious, hence less benefit. I can see a small but perceptible difference between 120Hz and 144Hz, for example, which my current Asus VG278HE monitor can do. Presumably there would be little benefit above 200Hz or so, but I don't have the gear to test the limits of human perception for a more definitive answer.

*I'm including the whole thing here: eyes, brain, optic nerve and everything in between.
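A quick bit of arithmetic shows why the returns diminish: each step up in refresh rate saves less absolute frame time than the one before. A minimal sketch (the rate list is just the figures mentioned in this thread):

```python
# Frame time (ms) at common refresh rates, and the absolute saving per step.
# The 60 -> 120Hz jump saves far more time per frame than 120 -> 144Hz,
# which matches the diminishing perceptual returns described above.

rates = [60, 120, 144, 200, 240]
frame_ms = {hz: 1000 / hz for hz in rates}

for a, b in zip(rates, rates[1:]):
    saved = frame_ms[a] - frame_ms[b]
    print(f"{a} -> {b}Hz: {frame_ms[a]:.2f} ms -> {frame_ms[b]:.2f} ms "
          f"(saves {saved:.2f} ms per frame)")
```

Going from 60Hz to 120Hz cuts about 8.3ms per frame, while 120Hz to 144Hz cuts only about 1.4ms, which is one plausible reason the second jump looks so much subtler.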


----------



## Rowsol (Dec 14, 2014)

Joel Charig said:


> I swear the maximum refresh rate of our eyes is 60 though...? So wouldn't over that be impossible to notice a difference? May be completely wrong though.



God I'm tired of hearing this.

Anyway, it's amazing how long video has been limited to 24fps when TVs have been pushing at least 60Hz, usually 75Hz, for a decade or two (rough guess, probably longer).  I just don't get it.


----------



## twilyth (Dec 14, 2014)

Film stock is expensive, but now that movie theaters are changing over to digital projection, frame rates should probably start to change.  I guess it depends on how long it will be before film is completely extinct.

About perception - the center of your vision is a lot less sensitive to motion than the periphery.  So as you move even slightly off center from the fovea, you notice motion a lot more.  You may have noticed this with some fluorescent lights.  If you look directly at them, they seem to be on continuously, but if you look off to the side, you can see them blink.


----------



## The Von Matrices (Dec 14, 2014)

Rowsol said:


> Anyway, it's amazing how long video has been limited to 24 fps when tvs have been pushing at least 60, usually 75 fps for a decade or 2 (rough guess, probably longer).  I just don't get it.


You need to eliminate the public mindset that lower FPS=a more professional/expensive production before we will see high framerate video become common.  Just take a look at the links that @qubit posted in his reply if you want an idea of how widespread the idea is.


----------



## ...PACMAN... (Dec 14, 2014)

When I get drunk, my eyes need vsync turned on else there's just too much tearing going on...


----------



## Rowsol (Dec 14, 2014)

...PACMAN... said:


> When I get drunk, my eyes need vsync turned on else there's just too much tearing going on...



haha.


----------



## Nordic (Dec 14, 2014)

I have a 144Hz monitor, and with LightBoost it is great for gaming; the smoothness is greatly increased. Still, 120/240Hz TVs just look off and too quick. That's probably because I grew up with 24fps, but I really do enjoy the higher rates more.


----------



## Silas Woodruff (Dec 14, 2014)

Joel Charig said:


> I swear the maximum refresh rate of our eyes is 60 though...? So wouldn't over that be impossible to notice a difference? May be completely wrong though.


[embedded video]

Go from 5:16 and watch closely from there.


----------



## qubit (Dec 14, 2014)

That's a great video that really explains it well for the most part, and the Half-Life 2 run-through was a pleasant distraction that took me back a decade. And that accent! Indeed, the framerate can never be too high.

He's a little bit wrong about how we perceive motion blur, though. We see it in everyday life due to persistence of vision in the retina, or in other words, the response time of the cells in the retina. However, show that same motion on a strobing-backlight monitor at 60 or 120fps and the blur is eliminated. He should have pointed this out (the video was posted in 2014, so he would know about such monitors). This _does_ result in superfluid motion on such monitors and makes everything so fabulously clear. I love it, but many people don't, because it's not what they're used to.

Perhaps the brain will eventually show motion blur if the motion is fast enough, but I'm not sure.


----------



## marmiteonpizza (Dec 14, 2014)

qubit said:


> He's a little bit wrong about how we perceive motion blur, though. We see it in everyday life due to persistence of vision in the retina, or in other words, the response time of the cells in the retina. However, show that same motion on a strobing-backlight monitor at 60 or 120fps and the blur is eliminated. He should have pointed this out (the video was posted in 2014, so he would know about such monitors). This _does_ result in superfluid motion on such monitors and makes everything so fabulously clear. I love it, but many people don't, because it's not what they're used to.


Just out of interest, how do you know so much about this? =P


----------



## Silas Woodruff (Dec 14, 2014)

The brain does show motion blur if the motion is fast enough. He gives an example in the video (I think) of fighter pilots who were shown a random image for 1/10th of a second: they saw it, and while they didn't know what it was, they knew something had been shown.

He also goes on to say that if you move your hand fast enough in front of your face, you will notice motion blur.


----------



## Aquinus (Dec 14, 2014)

qubit said:


> Indeed BN, there's mountains of content filmed at 24Hz (films, dramas, documentaries and more) so all one can do with it is interpolate to the higher frame rates. Some TVs are really good at it, too.



I think I would be happy with 1080p60 content. I say this because 120Hz at 1080p or higher is going to require a massive amount of bandwidth, not just for the video but for the interconnect between the device and the display. Right now, if you have a 1080p24 video, a 60Hz TV has to do 3:2 pulldown, where every 2 film frames are stretched across 5 display refreshes: one frame is shown for 3 refreshes, the next for 2. As a result, there will always be a little judder, since the frames aren't displayed for equal durations, whereas a 120Hz panel simply shows each 24p frame for exactly 5 refreshes.

All in all, if we have 30p or 60p content, I think we would solve a lot of the smoothness issues by getting refresh rates and video frame rates to align properly, whereas 24p doesn't align to 60Hz very well.

See: http://en.wikipedia.org/wiki/Motion_interpolation
Or: http://en.wikipedia.org/wiki/Telecine#Telecine_judder
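The pulldown cadence can be computed directly. Here's a small sketch (my own, assuming the display simply repeats the most recent film frame at every refresh):

```python
# How many display refreshes each 24 fps film frame occupies at 60Hz vs 120Hz.
# Uneven repeat counts at 60Hz are the source of telecine judder; at 120Hz
# every frame repeats exactly 5 times, so motion is even.

from fractions import Fraction
from math import ceil

def repeat_counts(film_fps, display_hz, n_frames=6):
    """Refreshes per film frame: frame f covers refresh indices in
    [f * Hz/fps, (f+1) * Hz/fps), computed exactly with Fraction."""
    ratio = Fraction(display_hz, film_fps)
    return [ceil((f + 1) * ratio) - ceil(f * ratio) for f in range(n_frames)]

print(repeat_counts(24, 60))   # [3, 2, 3, 2, 3, 2] -> uneven durations, judder
print(repeat_counts(24, 120))  # [5, 5, 5, 5, 5, 5] -> even durations, no judder
```

The alternating 3/2 pattern at 60Hz means consecutive film frames are on screen for 50ms and 33ms respectively, which is exactly the unevenness described above.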


----------



## Blue-Knight (Dec 14, 2014)

Joel Charig said:


> I swear the maximum refresh rate of our eyes is 60 though...? So wouldn't over that be impossible to notice a difference? May be completely wrong though.


I can notice the difference between 60 and 120Hz, but beyond that I do not know... Maybe I couldn't tell the difference from 120 to 240Hz.

I never tried.


----------



## qubit (Dec 14, 2014)

Joel Charig said:


> Just out of interest, how do you know so much about this? =P



This interests me, so it's a mixture of general knowledge and experimenting with my own computer setups.



Aquinus said:


> I think I would be happy with 1080p60 content. I say this because 120Hz at 1080p or higher is going to require a massive amount of bandwidth, not just for the video but for the interconnect between the device and the display. Right now, if you have a 1080p24 video, a 60Hz TV has to do 3:2 pulldown, where every 2 film frames are stretched across 5 display refreshes: one frame is shown for 3 refreshes, the next for 2. As a result, there will always be a little judder, since the frames aren't displayed for equal durations, whereas a 120Hz panel simply shows each 24p frame for exactly 5 refreshes.
> 
> All in all, if we have 30p or 60p content, I think we would solve a lot of the smoothness issues by getting refresh rates and video frame rates to align properly, whereas 24p doesn't align to 60Hz very well.
> 
> ...



Indeed, the bandwidth is massive and sounds a bit daunting with the equipment we have now, but that's why technology constantly advances to meet these new demands. Think about a 1080p DVR with a 2TB HDD: such a device wouldn't have been possible as little as 10 years ago, but now it's so yesterday.

Soon enough 4K at 120Hz won't seem so amazing any more. We'll all take it for granted and be looking forward to the next improvement. Personally, I don't think there's much point in improving refresh or resolution further than this, as the visual differences will be so incremental that it won't be worth all that extra bandwidth and hardware expense to do it. But they will anyway, lol.

You're right about that 3:2 pulldown, which makes for some very uneven, juddery motion. From what I've read, more expensive studio equipment can interpolate to smooth some of that judder out, but it never looks quite right.

EDIT:

[embedded video]

Everyone, this is a great video about games companies BSing us about lower frame rates being better, when it's really just an excuse for underperformance.


----------



## Aquinus (Dec 14, 2014)

qubit said:


> Everyone, this is a great video about games companies BSing us about lower frame rates being better, when it's really just an excuse for underperformance.


I find it funny when people say things like this. You make it sound like it's so easy to make things run faster and look good at the same time. Making any task run better across more cores is daunting. The reality is that it's just not that easy.

If it's an "excuse", maybe you could enlighten us as to how they can do their job better.


----------



## Blue-Knight (Dec 14, 2014)

qubit said:


> Everyone, this is a great video about games companies BSing us about lower frame rates being better, when it's really just an excuse for underperformance.


It depends on how much lower... 30fps is enough.

But playing some games at 60fps or higher can give the player an advantage, just as playing at 10fps would be a huge disadvantage, so I can't agree that lower fps is better.


----------



## qubit (Dec 14, 2014)

Aquinus said:


> I find it funny when people say things like this. You make it sound like it's so easy to just make things run faster and to look good at the same time. To make any task run better with more cores is a daunting task. The reality is that it's not that easy.
> 
> If it's an "excuse", maybe you could enlighten us as to how they can do their job better.


Where did I say it was easy? It is not. I think you've misunderstood what the video is saying and my comment about it. The point is quite clear and straightforward, so please watch it again and then come back to me if you still have questions.


----------

