SvennoJ said:
WereKitten said:
Whoa, where does that enormous resolution come from? You don't need a TV screen to be 300 DPI unless you're going to watch it from the same distance you keep your smartphone or tablet, which I suppose is about 10 to 15 inches from your eyes.
"Retina density" is an Apple PR buzzword. What they actually meant is: "pixel density so high that your human eyes can't distinguish two adjacent pixels" - but that obviously depends on how far you are from the display: Steve Job's magical threshold is 300 DPI at 11 inches away.
That's about (unless I used my calculator wrong) one minute of arc of angular size for a single pixel, or, to put it in human terms, about 1/30th of the angular size of the moon or sun in the sky.
Once again, thanks to our trusty trigonometry, we get that a 1080p, 52" diagonal screen gives that same angular size for each pixel if watched from a distance of 80 inches or more. In other words, that screen is already "retina", unless you watch it from much closer than 80 inches.
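For anyone who wants to double-check that trigonometry, here's a quick Python sketch (assuming a 16:9 panel and the 300 DPI / 11 inch figures above) that reproduces both numbers:

```python
import math

ARCMIN = math.radians(1 / 60)  # one minute of arc, in radians

# One pixel of a 300 DPI display seen from 11 inches
pitch_phone = 1 / 300                      # inches per pixel
angle_phone = math.atan(pitch_phone / 11)  # angular size of that pixel, in radians
print(angle_phone / ARCMIN)                # ~1.04 arcminutes, roughly 1/30 of the moon's ~30'

# 1080p panel, 52" diagonal, 16:9 aspect ratio (assumed)
width = 52 * 16 / math.hypot(16, 9)        # ~45.3 inches of visible width
pitch_tv = width / 1920                    # ~0.024 inches per pixel

# Distance at which one TV pixel also subtends one arcminute
print(pitch_tv / math.tan(ARCMIN))         # ~81 inches
```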
And to come back to the original question: the higher the resolution, the less important AA becomes. By definition, in a "retina" situation you can't distinguish two adjacent pixels, so you should see no 1px jaggies at all. In your example, a 1px-wide phone wire in the distance might flicker somewhat, but it would be at the threshold of what your human eyeballs can resolve anyway, so your biology would probably blur it into a barely visible line, with no technical intervention needed :)
|
Studies have shown people can still tell a difference up to 100 cycles per degree, or 200 pixels per degree. So a 52" diagonal screen at 80 inches isn't 'retina' until you have 6491 x 3651 pixels. That's when anti-aliasing becomes irrelevant. At 30 cycles per degree you start to lose the ability to see the pixels themselves, but you will still see aliasing occurring.
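If you want to verify that figure, a rough Python sketch using the same arc-length approximation (and assuming a 16:9 panel) lands on the same numbers:

```python
import math

# 52" diagonal, 16:9 panel (assumed), viewed from 80 inches
width = 52 * 16 / math.hypot(16, 9)     # ~45.3 inches
height = 52 * 9 / math.hypot(16, 9)     # ~25.5 inches

# Angular span of the screen, using a simple arc-length approximation
h_deg = math.degrees(width / 80)        # ~32.5 degrees wide
v_deg = math.degrees(height / 80)       # ~18.3 degrees tall

# 100 cycles per degree = 200 pixels per degree (one cycle is a light/dark pixel pair)
ppd = 200
print(int(h_deg * ppd), "x", int(v_deg * ppd))  # 6491 x 3651
```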
There are also temporal aliasing effects. Movement starts to become fluid above 20 fps, yet the human brain can still perceive a difference in certain situations up to 300 fps.
Then, last but not least, there's color and brightness information. 8-bit color is not enough to cover the range of the human eye, and color banding is still an issue that anti-aliasing cannot fix.
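As a rough illustration of the banding point (a sketch, assuming a hypothetical 3840-pixel-wide gradient): 8-bit output only has 256 levels to spread across the screen, so each level smears into a visible band, while 16 bits leaves less than a pixel per level.

```python
# Full-width horizontal gradient on an assumed 3840-pixel-wide display:
# with 8-bit output each of the 256 levels repeats across a band of
# neighbouring pixels, and that repetition is the visible banding.
width_px = 3840
print(width_px / 2 ** 8)    # 15.0  -> each 8-bit level covers a 15-pixel-wide band
print(width_px / 2 ** 16)   # ~0.06 -> with 16 bits, less than one pixel per level
```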
Still a ways to go before we have 300 fps, 16-bit RGB, 8K displays.
|
Thanks for understanding what I was trying to ask. Especially as things move, you'd get a moiré pattern from the structure of the pixels themselves. I think anti-aliasing stays useful for the time being. Although I remember hearing a few years ago about a new way of doing it, well beyond 8x sampling, that was supposedly exponentially better with the same burden on the GPU. But it might have been a marketing gimmick, as I don't recall much follow-up.
Wow, 300 fps! That is amazing! I almost flipped out when I realized I could see my 120 Hz display flickering out of the corner of my eye. I did some research and found that some people can perceive up to 200 Hz, but 300, wow!
Incidentally, did anyone else see The Hobbit at 48p, and what did you think? I definitely thought it looked better, but it still flickered. But I also saw it on a DLP, which is torturous for me because of the flickering. The next Cameron Avatar movie is rumored to be at 60p. I am looking forward to experiencing that.