Zappykins said:
kain_kusanagi said: This might be slightly off topic, but do you think that real-time graphics in extreme HD like 4K will still need anti-aliasing? I mean, if you compare aliasing on SD games to aliasing on HD games, the aliasing at 1080p is far less noticeable than it is at 480p. I'm talking about PC because you can toggle the setting off and on to see the difference, but the same would obviously apply to any gaming system. I wonder if anti-aliasing will get left behind when games are rendered at such high resolutions that we can't even see the aliasing with our naked eye. |
Wouldn't it become MORE important? Especially with larger screens.
Think of how your character is looking over a nice vista: long draw distance, lots of little things far away, and something like a telephone wire. Without AA the wire would flicker horribly as you panned around (since an object in the distance would be smaller than 1 pixel).
Now, if we get to retina-type density on a 52" screen at 300 DPI (I calculated the resolution at 13,650x7,650, well beyond 4K; let's call it 16K), then it might not matter. But until we are at such amazing resolutions, I think AA and similar tech will become more important.
Can CGI-Quality answer this please?
|
Whoa, where does that enormous resolution come from? You don't need a TV screen to be 300 DPI unless you're going to watch it from the same distance you hold your smartphone or tablet, which I suppose is about 10 to 15 inches from your eyes.
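(To be fair, the number itself does check out: it's just 300 DPI slapped onto a 52" panel. A quick Python sketch of that arithmetic, assuming a 16:9 aspect ratio, which is my assumption and explains why my figures land a touch below 13,650x7,650:)

```python
import math

# Back-of-the-envelope check of the quoted "16K" figure:
# a 52" diagonal, 16:9 panel at 300 DPI.
diagonal_in = 52.0
aspect_w, aspect_h = 16.0, 9.0
dpi = 300.0

diag_units = math.hypot(aspect_w, aspect_h)          # ~18.36
width_in = diagonal_in * aspect_w / diag_units       # ~45.3 in
height_in = diagonal_in * aspect_h / diag_units      # ~25.5 in

print(f"{width_in * dpi:.0f} x {height_in * dpi:.0f} pixels")  # ~13596 x 7648
```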
"Retina density" is an Apple PR buzzword. What they actually meant is: "pixel density so high that your human eyes can't distinguish two adjacent pixels" - but that obviously depends on how far you are from the display: Steve Job's magical threshold is 300 DPI at 11 inches away.
That works out to (unless I used my calculator wrong) about one minute of arc of angular size for a single pixel, or, to put it in human terms, about 1/30th of the angular size of the moon or sun in the sky.
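If anyone wants to double-check my calculator work, here's a rough Python sketch of that back-of-the-envelope math (the 11 inch viewing distance and the ~31 arcmin moon diameter are my assumed round numbers):

```python
import math

# Angle subtended by one pixel of a 300 DPI display seen from 11 inches away.
dpi = 300.0
viewing_distance_in = 11.0            # assumed phone/tablet viewing distance

pixel_pitch_in = 1.0 / dpi
pixel_arcmin = math.degrees(math.atan(pixel_pitch_in / viewing_distance_in)) * 60.0

moon_arcmin = 31.0                    # average angular diameter of the moon (~0.52 deg)

print(f"one pixel subtends ~{pixel_arcmin:.2f} arcmin")                  # ~1.04
print(f"roughly 1/{moon_arcmin / pixel_arcmin:.0f} of the moon's disc")  # ~1/30
```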
Once again, thanks to our trusted trigonometry, we get that a 1080p, 52" diagonal screen gives that same angular size per pixel when watched from a distance of 80 inches or more. In other words, that screen is already "retina", unless you watch it from much closer than 80 inches.
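Same trigonometry for the TV case, as a rough Python sketch (assuming a 16:9 panel, 1920 horizontal pixels, and the one-arcminute threshold from above):

```python
import math

# How far away does a 52" 16:9 1080p screen have to be before
# one pixel shrinks to about one arcminute ("retina" by the above definition)?
diagonal_in = 52.0
aspect_w, aspect_h = 16.0, 9.0
horizontal_pixels = 1920.0

screen_width_in = diagonal_in * aspect_w / math.hypot(aspect_w, aspect_h)  # ~45.3 in
pixel_pitch_in = screen_width_in / horizontal_pixels                       # ~0.024 in

one_arcmin_rad = math.radians(1.0 / 60.0)
retina_distance_in = pixel_pitch_in / math.tan(one_arcmin_rad)

print(f"'retina' from ~{retina_distance_in:.0f} inches (~{retina_distance_in / 12:.1f} ft) and beyond")  # ~81 in
```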
And to come back to the original question: the higher the resolution, the less important AA becomes. By definition, in a "retina" situation you can't distinguish two adjacent pixels, so you should see no 1px jaggies at all. In your example, a 1px-wide phone wire in the distance might still flicker somewhat, but it would be at the threshold of what your human eyeballs can resolve anyway, so your biology would probably blur it into a barely visible line, with no technical intervention needed :)