I noticed it with BD, playing back a Futurama movie.
BD player -> TV was definitely frame-rate boosted: crystal clear, with a hint of the soap opera effect, but no artifacting and zero stutter or jitter while panning.
Playback on PC, even after ripping it and playing the raw file (as well as H.264 and H.265 attempts), had the same thing I get with anime playback: stutter/jitter while panning.
What I can't figure out is that it's the same TV, the same HDMI port, same resolution and refresh rate, same TV settings, etc. There's no reason they should be different, but something is crippled when it's played by anything other than a DVD/BD player.
The best I've found from looking online is that it may actually be running at 50Hz rather than 60Hz to compensate?
The TV is a 100Hz panel (but 60Hz over HDMI), so it's possible it's doing 24fps -> 50fps, doubled to 100Hz?
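To make the cadence idea concrete, here's a rough sketch I put together (the `cadence` helper is my own, not from any player or TV firmware) of how many refresh ticks each source frame gets held for. 24fps into 60Hz forces alternating 2/3-tick holds (3:2 pulldown judder), while the 50Hz path is even because PAL-style playback runs film about 4% fast at 25fps:

```python
from fractions import Fraction

# Sketch of the "even cadence" idea: for each source frame, count how many
# refresh ticks it gets held for on an ideal scheduler.
def cadence(fps: int, hz: int, frames: int = 8) -> list[int]:
    step = Fraction(hz, fps)  # refresh ticks per source frame
    ticks = [int(step * i) for i in range(frames + 1)]
    return [b - a for a, b in zip(ticks, ticks[1:])]

print(cadence(24, 60))   # [2, 3, 2, 3, ...] uneven holds = classic 3:2 judder
print(cadence(25, 50))   # [2, 2, 2, ...]    even (PAL runs film ~4% fast at 25fps)
print(cadence(24, 120))  # [5, 5, 5, ...]    even, no judder
```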
Edit: I need to test how this works at various refresh rates on my displays now, ugh.
24 divides neatly into 144Hz, but not 165Hz
Then you get the monitors that expose both 60Hz and 59.94Hz (and other near-identical refresh rates), and one of those may well hold the answer.
120Hz may be the perfect media refresh rate for a panel, since 24, 30 and 60 all divide evenly into it.
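For what it's worth, a quick check of my own confirms the divisibility picture (120Hz is the only common rate here that evenly fits 24, 30 and 60):

```python
# Which common refresh rates hold which frame rates with an even cadence?
for hz in (50, 60, 100, 120, 144, 165):
    fits = [fps for fps in (24, 25, 30, 50, 60) if hz % fps == 0]
    print(f"{hz:>3}Hz evenly fits: {fits or 'nothing common'}")
```

165Hz comes back empty, which matches the stutter I see there.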
Edit edit: When my screens turn on HDR, they all lock to 120Hz. I'm starting to suspect why.