
Understanding Anti-Aliasing

SvennoJ said:
WereKitten said:
SvennoJ said:
 

Studies have shown people can still tell a difference up to 100 cycles per degree, or 200 pixels per degree.
So a 52" diagonal screen at 80 inches isn't 'retina' until you have 6491 x 3651 pixels. That's when anti-aliasing becomes irrelevant.
At 30 cycles per degree you start to lose the ability to see the pixels themselves, but you will still see aliasing occurring.
...

Thanks for the quantitative info. I referred in my post to "Steve Jobs' magic number" because Zappykins started from 300 DPI, so I assumed that's where his calculations came from, but I'm no optometrist :)

I have read from multiple sources that 20/20 sight maps to one minute of arc due to the way the Snellen charts are built. For example, the 20/20 samples are built on 5x5 grids of 5 minutes of arc total width, so they can contain 5 "pixels" or 2.5 cycles per main direction.

Your numbers point to 3x as much, but are they for statistical outliers - i.e. exceptional sight - or for "normal" vision?

That 100 cpd figure is based on tests NHK has done, showing people two images side by side at different cpd and asking which one looks or feels more real. They determined 100 cpd as the upper limit. http://www2.tech.purdue.edu/Cgt/courses/cgt512/discussion/Chastain_Human%20Factors%20in%20UDTV.pdf

20/20 vision is the limit at which you can determine detail, or read text. You will not be able to read pixel font text at 100 cpd. Yet you can still tell a difference, which probably also has to do with the grid pattern of display pixels.
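To make that concrete, here is a rough sketch (Python; the helper name, the 16:9 aspect ratio and the flat small-angle approximation are my own assumptions) of how the 6491 x 3651 figure quoted above for a 52" screen at 80 inches falls out of 100 cpd, and what 30 cpd requires instead:

```python
import math

def required_resolution(diagonal_in, distance_in, cycles_per_degree, aspect=16 / 9):
    """Pixels a 16:9 screen needs to hit a cycles-per-degree target
    at a given viewing distance (2 pixels per cycle)."""
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    height_in = width_in / aspect
    pixels_per_degree = 2 * cycles_per_degree
    # small-angle ("flat screen") approximation: angle ~ size / distance
    width_deg = math.degrees(width_in / distance_in)
    height_deg = math.degrees(height_in / distance_in)
    return round(width_deg * pixels_per_degree), round(height_deg * pixels_per_degree)

print(required_resolution(52, 80, 100))  # ~(6491, 3651): the figure quoted above
print(required_resolution(52, 80, 30))   # ~(1947, 1095): roughly 1080p territory
```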

Btw, how can we tell when anti-aliasing is perfect? I see aliasing effects in real life too. For example, when you walk towards a baseball field from a distance with two chain-link fences overlapping at opposite sides of the field, you see all kinds of moiré and stairstepping patterns going on. Editing all that out doesn't represent reality truthfully.

The latter, though, is not a sampling artifact. It's down-to-earth physical interference... We will need ray-tracing for that, before we worry about removing it. Nice use of GPU resources.



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

WereKitten said:
SvennoJ said:

Btw, how can we tell when anti-aliasing is perfect? I see aliasing effects in real life too. For example, when you walk towards a baseball field from a distance with two chain-link fences overlapping at opposite sides of the field, you see all kinds of moiré and stairstepping patterns going on. Editing all that out doesn't represent reality truthfully.

The latter, though, is not a sampling artifact. It's down-to-earth physical interference... We will need ray-tracing for that, before we worry about removing it. Nice use of GPU resources.

True, but do we really need ray-tracing for that? It should already occur with transparent textures or when rendering power lines, for example. FXAA is pretty destructive in those situations.

I'm looking forward to ray-tracing; anti-aliasing won't be a problem for now. Current hardware is far from powerful enough to render a real-time, high-res, noise-free image. Realistic lighting still eludes the best CGI in movies. I recently watched Samsara and Life of Pi. CGI still can't compete with 70mm film.



SvennoJ said:

True, but do we really need ray-tracing for that? It should already occur with transparent textures or when rendering power lines, for example. FXAA is pretty destructive in those situations.

...

It was partly ironic, as in "aren't we happy that our most fancy rendering tech should bring more visual artifacts to be realistic" :)

But it's not exactly the same as with overlapping transparent textures. The "moiré effect" you get in real life when you overlap two chain-link fences is magnified by the optical diffraction in the overlapped pattern corners, the same phenomenon that brings you fringes of light and dark around the border of real shadows.

Stand in front of a window, close one eye, raise your hand between your eyes and the light at 10-15 inches from your face and "pinch the air" between your index finger and thumb, trying to get them as close as you can without them touching. You'll see the light between them decreasing sharply when they're really close but still separated. You might even observe some dark fringes in that slice of light, if you use your other hand to keep the fingers steady.

In the same way, when you look at some moiré patterns in real life you get "dark blots" in the corners that are actually "optically thicker" than the overlapped grids (obviously that light is not lost, it is diffracted over a wider angle, just less evident to the eye). There might also be biological effects in there, due to how our eye responds to very fine separation between two light spots or dark spots, but I'm ignorant on the subject.

If you want to simulate/render that, I suppose that you either simulate light wavefronts in spherical harmonics, or compute optical ray curving due to interference in ray tracing.



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

SvennoJ said:
WereKitten said:

Whoa, where does that enormous resolution come from? You don't need a TV screen to be 300 DPI unless you're going to watch it from the same distance you keep your smartphone or tablet, which I suppose is about 10 to 15 inches from your eyes.

"Retina density" is an Apple PR buzzword. What they actually meant is: "pixel density so high that your human eyes can't distinguish two adjacent pixels" - but that obviously depends on how far you are from the display: Steve Job's magical threshold is 300 DPI at 11 inches away.

That's about - unless I used my calculator wrong - a minute of arc of angular size for a single pixel, or to put it in human terms, about 1/30th of the angular size of the moon or sun in the sky.

Once again, thanks to our trusted trigonometry, we get that a 1080p, 52" diagonal screen gives that same angular size for each pixel if watched from a distance of 80 inches or more. In other words, that screen is "retina" already, unless you watch it much closer than 80 inches.

And to come back to the original question: the higher the res the less important AA becomes. By definition in a "retina" situation you can't distinguish two adjacent pixels, and as such you should see no 1px-jaggies at all. In your example, a 1px wide phone wire in the distance might somewhat flicker, but it would be at the threshold of the capabilities of your human eyeballs anyway, so your biology would probably blur it into a barely visible line - with no technical intervention needed :)
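For what it's worth, the arcminute figures in the quote above check out; a quick sketch (Python, function name and the ~45.3" panel width are mine):

```python
import math

def pixel_arcminutes(pixel_pitch_in, distance_in):
    """Angle one pixel subtends at the eye, in minutes of arc."""
    return math.degrees(math.atan2(pixel_pitch_in, distance_in)) * 60

# "Retina" phone: 300 DPI viewed from 11 inches
print(pixel_arcminutes(1 / 300, 11))      # ~1.04 arcmin

# 52" 1080p: horizontal width ~45.3", so pitch = 45.3/1920, viewed from 80"
print(pixel_arcminutes(45.3 / 1920, 80))  # ~1.01 arcmin

# For scale: the full Moon spans roughly 30 arcmin, i.e. about 30 such pixels
```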

Studies have shown people can still tell a difference up to 100 cycles per degree, or 200 pixels per degree.
So a 52" diagonal screen at 80 inches isn't 'retina' until you have 6491 x 3651 pixels. That's when anti-aliasing becomes irrelevant.
At 30 cycles per degree you start to lose the ability to see the pixels themselves, but you will still see aliasing occurring.

There are also temporal aliasing effects. Movement starts to become fluid above 20 fps, yet the human brain can still perceive a difference in certain situations up to 300 fps.

Then, last but not least, there are color information and brightness. 8-bit color is not enough to satisfy the range of the human eye, and color banding is still an issue that anti-aliasing cannot fix.

Still a ways to go before we have 300 fps, 16-bit RGB, 8K displays.

Thanks for understanding what I was trying to ask. Especially as things move – you'd get a moiré pattern from the structure of the pixels themselves. I think anti-aliasing stays useful for the time being. Although, I remember hearing a few years ago about a new way of doing it well beyond 8x sampling, that was exponentially better with the same burden on the GPU. But it might have been a marketing gimmick, as I don't recall much follow up.

Wow 300 fps! That is amazing! I almost flipped out when I realized I could see my 120Hz display flickering out of the corner of my eye. I did some research and found that some people can perceive up to 200Hz, but 300, wow!

Incidentally, did anyone else see The Hobbit at 48p, and what did they think? I definitely thought it looked better, but it still flickered. But I also saw it on a DLP, which is torturous for me because of the flickering. The next Cameron Avatar movie is rumored to be at 60p. I am looking forward to experiencing that.



 

Really not sure I see any point of consoles over PCs since Kinect, Wii and other alternative ways to play have been abandoned.

Top 50 'most fun' game list coming soon!

 

Tell me a funny joke!

Heavenly_King said:
Weedlab said:
Lafiel said:
Weedlab said:
Brings me back to PS2 days. Oh so jaggy.

Have you played inFamous 1? That's next-gen jagginess ;) Great game though


That is true. inFAMOUS was an ugly game, for a lot of reasons. Sucker Punch admitted as much in an interview within the last two years. They said they made a lot of mistakes with the first game and only later realized the error of their ways - as they say, hindsight is 20/20. They definitely learned their lesson, since the second game is infinitely better from a visual perspective. But that aside, you are right, it is a great game.

 

who is the woman in your avatar?? she looks really beautiful and sexy :3

Looks like the same girl from my avatar. If so her name is Mya. Just guessing though.



"Common sense is not so common." - Voltaire

Platinumed Destiny, Vanquish, Ninja Gaiden Sigma Plus, Catherine, and Metal Gear Rising. Get on my level!!



SvennoJ said:

20/20 vision is the limit at which you can determine detail, or read text. You will not be able to read pixel font text at 100 cpd. Yet you can still tell a difference, which probably also has to do with the grid pattern of display pixels. 

It's a common misconception; 20/20 vision is not perfect, hell, it's not even good. It's adequate to get a driver's license or so. Most teenagers' eyesight (if they don't wear lenses) is 20/12, which is nearly twice as good as 20/20. Somehow over the years, I assume because 20/20 equals 1/1 equals 100%, people have misunderstood it to mean perfect. I can easily see the difference between pixels on an iPhone's retina display from more than 12" away.



All bow to me the VGChartz current reigning 3DS prediction champion 

Bet with tbone51: Pokemon X & Pokemon Y will not sell more than 8 million in 2013

 jarrod said:The Xbox360 or ps3 will not sell more than 75million units

July 2009 daveJ said: True, the wii has a large lead now but by 2017 the most likely result will be 1. ps3 2. xbox360 3. wii <-- wii's successor launched in 2011 effectively killing sales of the wii

2009 daveJ said: The wii will not break the 50% marketshare barrier, it will go below the 40% marketshare barrier though in the future. VGChartz members: Impossible, you're an idiot that knows nothing about sales

RazorDragon said:
Weedlab said:
Brings me back to PS2 days. Oh so jaggy.


Not if you played on a CRT screen.


Not if you use an emulator. Spider-Man 2 doesn't make my eyes bleed anymore! *Tears of joy*

 

OT: This is one of the most informative threads I've ever read on this site.



I am the Playstation Avenger.

   

Zappykins said:

Thanks for understanding what I was trying to ask. Especially as things move – you'd get a moiré pattern from the structure of the pixels themselves. I think anti-aliasing stays useful for the time being. Although, I remember hearing a few years ago about a new way of doing it well beyond 8x sampling, that was exponentially better with the same burden on the GPU. But it might have been a marketing gimmick, as I don't recall much follow up.

Wow 300 fps! That is amazing! I almost flipped out when I realized I could see my 120Hz display flickering out of the corner of my eye. I did some research and found that some people can perceive up to 200Hz, but 300, wow!

Incidentally, did anyone else see The Hobbit at 48p, and what did they think? I definitely thought it looked better, but it still flickered. But I also saw it on a DLP, which is torturous for me because of the flickering. The next Cameron Avatar movie is rumored to be at 60p. I am looking forward to experiencing that.

An LCD doesn't flicker itself, just the image updates, but yes, the corner of your eye can still see that. The 300 fps figure is based on tests that showed fighter pilots images for 1/300th of a second and asked them to identify them. You certainly won't experience all of those frames, but it still feels more natural in motion. The human eye doesn't perceive a constant fps, and to avoid temporal aliasing effects or judder (seeing things jump instead of move) it's best for the source material's frame rate to be as high as possible.

The eye is also very good at tracking moving objects, which is why it's best not to introduce unnecessary motion blur. For example a very simple experiment: When you stare out of the passenger window of a moving car you generally see the ground move as a blur as the car speeds up, but now and then you get a flicker of the actual street as it speeds by.

Brightness is also important. The darker the image, the sooner it looks fluid. Cinemas have their screen brightness at about 15 fL, TVs are more in the range of 50-70 fL. I can see the difference with the same material when viewed on my 52" LCD and 92" projector (at about 17 fL).
24 fps looks a lot better on the big screen in the dark, while it feels jerky on the TV.

It's a shame The Hobbit on Blu-ray is just 24 fps, I would have liked to see the comparison at home. Guess that will have to wait until the 4K spec for HDMI comes through.

As for cheap FSAA-looking AA, there are some tricks like rotated-grid sampling or using a sinc filter. Useful for creating a smooth image, but small details suffer as they were never rendered.
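For anyone curious what rotated-grid sampling actually does, here is a toy sketch (Python; the test edge, image size and names are made up, and the four offsets are simply the commonly cited "four rooks" rotated-grid pattern, not any particular GPU's):

```python
# Toy 4x rotated-grid supersampling of a hard diagonal edge.
# Each pixel is shaded by the fraction of its samples that land inside the edge.

def scene(x, y):
    """Analytic test scene: 1.0 below a shallow diagonal edge, else 0.0."""
    return 1.0 if y < 0.23 * x + 2.0 else 0.0

# 4x rotated-grid ("four rooks") sample offsets inside a unit pixel
RGSS_OFFSETS = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]

def render(width, height, offsets):
    image = []
    for py in range(height):
        row = []
        for px in range(width):
            coverage = sum(scene(px + ox, py + oy) for ox, oy in offsets) / len(offsets)
            row.append(coverage)
        image.append(row)
    return image

aliased  = render(16, 8, [(0.5, 0.5)])   # 1 sample per pixel: hard 0/1 steps
smoothed = render(16, 8, RGSS_OFFSETS)   # 4x RGSS: fractional coverage along the edge
print(aliased[3])
print(smoothed[3])
```

The single-sample row jumps straight from 0 to 1, while the RGSS row passes through intermediate coverage values on the pixels the edge crosses, which is where the smoothing comes from; the downside mentioned above remains, since sub-pixel details still vanish if no sample happens to hit them.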



daveJ said:
SvennoJ said:

20/20 vision is the limit at which you can determine detail, or read text. You will not be able to read pixel font text at 100 cpd. Yet you can still tell a difference, which probably also has to do with the grid pattern of display pixels. 

It's a common misconception; 20/20 vision is not perfect, hell, it's not even good. It's adequate to get a driver's license or so. Most teenagers' eyesight (if they don't wear lenses) is 20/12, which is nearly twice as good as 20/20. Somehow over the years, I assume because 20/20 equals 1/1 equals 100%, people have misunderstood it to mean perfect. I can easily see the difference between pixels on an iPhone's retina display from more than 12" away.

I checked their retina claim once and came to this:
"The iPad 4 retina display needs to be viewed at a distance of 43.4" to make it a true retina display."
At that distance you see it at 100 cpd.

For comparison, a 1080p 40" set needs to be viewed from 17ft4" away to see it at 100 cpd. That's the upper limit, beyond which anti-aliasing becomes irrelevant.
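Both distances are reproducible with the same one-pixel-per-1/200th-degree criterion; a small sketch (Python; the iPad 4's ~264 PPI and a 16:9 40" panel are my assumptions):

```python
import math

def retina_distance_in(pixel_pitch_in, pixels_per_degree=200):
    """Distance at which one pixel subtends 1/ppd degree (100 cpd = 200 ppd)."""
    return pixel_pitch_in / math.tan(math.radians(1 / pixels_per_degree))

# iPad 4: 2048x1536 on a 9.7" panel -> ~264 PPI
print(retina_distance_in(1 / 264))           # ~43.4 inches

# 40" 16:9 1080p: width ~34.9", so pitch = 34.9/1920
print(retina_distance_in(34.9 / 1920) / 12)  # ~17.3 feet, i.e. about 17'4"
```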



WereKitten said:

But it's not exactly the same as with overlapping transparent textures. The "moiré effect" you get in real life when you overlap two chain-link fences is magnified by the optical diffraction in the overlapped pattern corners, the same phenomenon that brings you fringes of light and dark around the border of real shadows.

Stand in front of a window, close one eye, raise your hand between your eyes and the light at 10-15 inches from your face and "pinch the air" between your index finger and thumb, trying to get them as close as you can without them touching. You'll see the light between them decreasing sharply when they're really close but still separated. You might even observe some dark fringes in that slice of light, if you use your other hand to keep the fingers steady.

In the same way, when you look at some moiré patterns in real life you get "dark blots" in the corners that are actually "optically thicker" than the overlapped grids (obviously that light is not lost, it is diffracted over a wider angle, just less evident to the eye). There might also be biological effects in there, due to how our eye responds to very fine separation between two light spots or dark spots, but I'm ignorant on the subject.

If you want to simulate/render that, I suppose that you either simulate light wavefronts in spherical harmonics, or compute optical ray curving due to interference in ray tracing.

I think it has more to do with biological effects. The eye has its own edge enhancement built in. There are lots of optical illusions floating about to demonstrate that. Physical optical diffraction doesn't happen until you get down to the wavelength of light, 300-700 nm, not really possible with fingers :) http://bowiestie.wordpress.com/2010/02/18/moire-pattern-gifs/ no diffraction there.

Even more off-topic (sorry cgi) but optical illusions are just too cool :) (right-click view for the full effect)