I've noticed that on monitors at 120 Hz and 240 Hz, full-motion video starts to look strange. Not just high-resolution animated renderings, but also legacy live-action recordings pre-dating HD standards.
Old NTSC and PAL recordings, and even black-and-white TV shows like I Love Lucy or Leave It to Beaver, take on this interpolated, ultra-modern, slick feeling of remastered image post-processing.
I can't tell what's going on, but I see it on certain monitors, at friends' houses, in department stores, and at some fast-food restaurants that have a TV in the dining room.
Re-runs of TV shows that I've watched hundreds of times on CRTs and first-generation flat screens have a certain quality and leave a distinctive impression, one that a subset of recent monitors augments, contaminates, and tampers with.
I can't tell what's going on, but I know it when I see it. I suspect that there may be some software between the transcoder circuit and the final illuminated raster that attempts to reduce flicker, inserting virtual frames, automatically tweened, when a low frame rate is encountered.
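If that's what's happening, it's usually called motion interpolation ("motion smoothing" in most TV menus). Here's a minimal sketch of the tweening idea, using plain linear blending between two consecutive frames; real sets estimate per-block motion vectors and warp pixels along them, which is far more complex, but the principle of synthesizing frames the source never contained is the same. The function name and frame shapes are just for illustration:

```python
import numpy as np

def tween(frame_a, frame_b, n_virtual):
    """Yield n_virtual synthetic frames between two real ones.

    Naive cross-fade, for illustration only; actual motion smoothing
    estimates motion vectors rather than blending in place.
    """
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    for i in range(1, n_virtual + 1):
        t = i / (n_virtual + 1)                  # blend weight in (0, 1)
        yield ((1.0 - t) * a + t * b).astype(np.uint8)

# Example: 24 fps film on a 120 Hz panel -> insert 4 virtual frames
# between each real pair, five times the original frame count.
real_a = np.zeros((480, 640, 3), dtype=np.uint8)
real_b = np.full((480, 640, 3), 255, dtype=np.uint8)
virtual_frames = list(tween(real_a, real_b, 4))
```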
Few people agree with me or notice a difference, but it's there, man.
You're not alone, I hate these processed, added frames! Sometimes it even introduces weird blocking artifacts. I even suspect it sometimes fails to recognize a scene cut (and tries to interpolate movement across it...)
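That failure mode makes sense: an interpolator has to detect cuts and fall back to repeating a real frame, or it will happily morph two unrelated shots into each other. A sketch of the kind of crude cut check a smoother might apply; the threshold is an arbitrary made-up number, not anything a real TV documents:

```python
import numpy as np

def looks_like_cut(frame_a, frame_b, threshold=40.0):
    """Crude scene-cut detector: mean absolute pixel difference.

    A hard cut produces a large jump in average pixel difference;
    a smoother that skips this check will tween across the cut.
    The threshold is illustrative only.
    """
    diff = np.abs(frame_a.astype(np.float32) - frame_b.astype(np.float32))
    return float(diff.mean()) > threshold
```

Across a detected cut, the safe fallback is to repeat the last real frame instead of tweening, which is presumably what the sets that get this wrong are failing to do.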
People can't see this because the answer to oculusthrift's "At what threshold does the human eye no longer notice an increase in frame rate" depends on the person. Most people around me can't tell the difference above 30 fps. I know I can see the difference between 25 and 50 fps.
Eyes don't see in frames per second; they see light intensity over time. That is why you can see a lightning strike, and why the image persists in your vision for a few seconds after.
The question is not what fps the human eye sees at (a nonsensical question), but "At what threshold does the human eye no longer notice an increase in frame rate?"
That question also doesn't make sense in any realistic scenario, because the answer depends entirely on how much motion is involved and whether you're rendering simulated motion blur. Significant inter-frame difference is what you notice.
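To make the motion dependence concrete, here's a quick back-of-the-envelope; the screen width and pan speed are arbitrary assumptions, not perceptual constants. An object crossing the screen in one second jumps a different number of pixels per frame at each refresh rate, and the size of that jump is the inter-frame difference you perceive:

```python
# Per-frame displacement for an object panning 1920 px in one second.
speed_px_per_s = 1920
for fps in (24, 30, 60, 120, 240):
    jump = speed_px_per_s / fps
    print(f"{fps:3d} fps -> {jump:5.1f} px jump between frames")
```

At 24 fps that pan jumps 80 px between frames; at 240 fps, 8 px. Slow the motion down and the jumps shrink at every rate, which is why no single fps threshold answers the question.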