SvennoJ said:
LCD doesn't flicker itself, just the image updates, but yes, the corner of your eye can still pick that up. The 300 fps figure is based on tests where fighter pilots were shown images for 1/300th of a second and asked to identify them. You certainly won't consciously register every one of those frames, but motion still feels more natural. The human eye doesn't perceive a fixed frame rate, so to avoid temporal aliasing effects or judder (seeing things jump instead of move) it's best for the source material's frame rate to be as high as possible.
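To put a rough number on that judder, here's a quick back-of-the-envelope sketch in Python (the 1920 px screen width and one-second pan are made-up figures, just for illustration) showing how far an object jumps between frames at various frame rates:

```python
# How far does a panning object "jump" between frames?
# Bigger per-frame jumps read as judder; smaller ones as smooth motion.
screen_width_px = 1920   # assumed display width
pan_time_s = 1.0         # assumed time for a full-screen pan

for fps in (24, 48, 60, 120, 300):
    jump_px = screen_width_px / (fps * pan_time_s)
    print(f"{fps:3d} fps -> {jump_px:5.1f} px jump per frame")
```

At 24 fps that's an 80-pixel jump every frame; at 300 fps it's down to about 6 pixels, which is why the higher rate reads as motion rather than a series of hops.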
The eye is also very good at tracking moving objects, which is why it's best not to introduce unnecessary motion blur. A very simple experiment: when you stare out of the passenger window of a moving car, you generally see the ground as a blur as the car speeds up, but now and then you catch a flicker of the actual street as it rushes by.
Brightness is also important: the darker the image, the lower the frame rate at which motion looks fluid. Cinemas run their screens at about 15 fL (foot-lamberts); TVs are more in the range of 50-70 fL. I can see the difference with the same material on my 52" LCD and my 92" projector (at about 17 fL): 24 fps looks a lot better on the big screen in the dark, while it feels jerky on the TV.
It's a shame The Hobbit on Blu-ray is only 24 fps, I would have liked to see the comparison at home. Guess that will have to wait until the 4K spec for HDMI comes through.
As for cheap FSAA-style AA, there are some tricks like rotated-grid sampling or using a sinc filter. Useful for creating a smooth image, but small details suffer, as they were never rendered in the first place.
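If anyone's curious what rotated-grid sampling actually does, here's a minimal sketch (the shade function is a placeholder scene, not anything from a real renderer): the 2x2 sample grid is rotated so every sample lands on a distinct row and column, which handles near-horizontal and near-vertical edges better than an axis-aligned grid.

```python
# Rotated-grid supersampling (RGSS) sketch with the classic 4-sample pattern.
# Each sample's x and y offsets are unique, so near-horizontal and
# near-vertical edges get four distinct coverage estimates.

def shade(x, y):
    # Placeholder scene: a shallow diagonal edge (white above, black below).
    return 1.0 if y > 0.3 * x else 0.0

# Sub-pixel offsets of the 4-sample rotated grid.
RGSS_OFFSETS = [(0.125, 0.625), (0.375, 0.125),
                (0.625, 0.875), (0.875, 0.375)]

def sample_pixel(px, py):
    # Average the colour at the four rotated-grid positions inside the pixel.
    return sum(shade(px + dx, py + dy) for dx, dy in RGSS_OFFSETS) / 4

print(sample_pixel(10, 3))  # pixel straddling the edge -> 0.75, a grey step
```

The sinc filter would come in when resolving a supersampled buffer down to screen resolution; it keeps the result sharp, but as said, detail finer than a sample was never rendered, so no filter can bring it back.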
I was referring to the flicker of the fluorescent tubes that were lighting the screen behind the LCDs. I figured I was seeing that flicker - it does flicker very rapidly, right?