Hacker News

You may already know this, but the "Game Mode" on modern TVs is just turning off the 120Hz/240Hz interpolation that is done to "improve" the picture (or is required for certain 3D systems to work).

The interpolation introduces significant latency that is obvious (and frustrating) during gaming.

I actually run my TV in game mode by default, because the interpolation done in 120/240Hz mode makes everything look like it is slightly unreal and shot on video. Definitely uncanny valley territory.



This.

Some people apparently can't tell, but it absolutely bothers the heck out of me. It's probably the reason I'm not a fan of 48fps either.

I still think it would be useful to have a variable rate player. 24fps for normal scenes and 48fps for action.
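A variable-rate player like the one proposed could be sketched roughly as follows. This is a hypothetical illustration, not any real player's API: the scene labels, the `schedule` helper, and the 24/48 split are all assumptions for the sake of the example.

```python
# Hypothetical sketch of per-scene frame timing for a variable-rate player:
# 24 fps for normal scenes, 48 fps for action scenes.

def frame_duration(fps):
    """Seconds each frame stays on screen at a given frame rate."""
    return 1.0 / fps

def schedule(scenes):
    """Given (label, kind) pairs, pick a frame rate per scene and
    return (label, fps, frame_duration) tuples."""
    rate = {"normal": 24, "action": 48}  # assumed mapping for illustration
    return [(label, rate[kind], frame_duration(rate[kind]))
            for label, kind in scenes]

print(schedule([("dialogue", "normal"), ("chase", "action")]))
```

The point is only that the per-frame hold time halves (≈41.7 ms to ≈20.8 ms) in action scenes, which is where judder is most visible.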


Is there a detectable (to a human) difference between a 24fps screen and a 48fps screen where the image only changes every other frame? I can see how this would matter with film-based projectors, but my understanding of TVs is that pixels are always on and simply change states between frames, so 'changing' to an identical frame should have no effect.


Having the picture change only every second frame would mean the screen is updating the image only half the time, whereas normally it is changing the color of some pixel on every refresh.

I'm not sure this isn't just a pure win.


It affects the amount of motion blur that has to be applied to give a feeling of fluid movement, especially in action scenes.


Very glad I'm not the only one to notice it.



