
Understanding Anti-Aliasing

Kynes said:
TheJimbo1234 said:
Kynes said:
TheJimbo1234 said:
And here I was expecting you to talk about Fourier transforms. Ah well, maybe next time.


But we're talking about anti-aliasing in video game graphics, not anti-aliasing in a mathematical sense. I don't think this forum is the best place to talk about the Shannon sampling theorem or signal reconstruction.


...it's done using the same method. How do you think they code it? "Please blur this pixel? Cheers buddy!" =p


I know, I know, but as haxxiy says, I'm sure that 99% of the people here don't know what an FFT is, so that's not the way to try to explain it. "Intelligent blur" AA isn't really AA; it's not a mathematically correct way to do AA, but it's a good enough substitute. Real AA like MSAA or SSAA does it the "correct" way, taking more samples to correct the undersampling.
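
To make the "more samples" point concrete, here is a minimal Python/numpy sketch of the supersampling idea: render at a multiple of the target resolution, then average each block of samples down to one output pixel. The ssaa_downsample helper and the random stand-in frame are made up for illustration; real MSAA/SSAA code in drivers and engines is far more involved.

# A minimal sketch of the SSAA idea (illustration only, not any engine's real code):
# render the scene at N times the target resolution, then box-filter each
# N x N block of samples down to one output pixel.
import numpy as np

def ssaa_downsample(hi_res, factor):
    """Average factor x factor blocks of samples into one output pixel."""
    h, w, c = hi_res.shape
    blocks = hi_res.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# Stand-in for a 4x supersampled 1080p frame (7680x4320), averaged back down:
frame_4x = np.random.rand(4320, 7680, 3)
frame = ssaa_downsample(frame_4x, 4)      # shape (1080, 1920, 3)

MSAA gets its speed advantage by shading each pixel only once per triangle while still storing several coverage samples, so the extra work is concentrated at geometry edges; the final resolve step is the same kind of averaging.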

Exactly. When I read about complex numbers and other such stuff on a site like this, I'm like wtf, especially when the person just mentions the name and does nothing to explain how it impacts anti-aliasing. It comes across as "look at me, I'm clever", even though the person may not have meant it that way.



Research shows video games help make you smarter, so why am I an idiot?

TheJimbo1234 said:
haxxiy said:
TheJimbo1234 said:
And here I was expecting you to talk about Fourier transforms. Ah well, maybe next time.

Yeah, because talking about integrals and transforms will surely make for much better conversations here on a video game forum, right?

People just need to know the core of anti-aliasing programming so they don't go around spreading bullshit and misleading others, that's all. 


Very true. I just thought it was obvious, hence the presumption he would go into how it's done, the mathematical variants of MSAA, standard AA, etc., and why the performance demands increase so rapidly.

If you want to go into it, then go ahead and post. I wouldn't mind reading about it in greater depth (just try to keep it relatively low level, lol).



crissindahouse said:

Bad aliasing problems have to be seen in motion; I think that's one of those things you have to see to be annoyed by it. When I look at screens without anti-aliasing, or with only a little, I see the edges but they aren't really a problem. If you see it in motion, with all the flickering, it's very annoying in my opinion.

But the explanation is good if people don't know much about it.

Yes! That flickering movement is so annoying. Sometimes you can see where someone forgot to turn it on for one type of object in the background.



 

Really not sure I see any point of consoles over PCs since Kinect, Wii and other alternative ways to play have been abandoned.

Top 50 'most fun' game list coming soon!

 

Tell me a funny joke!

Scoobes said:
TheJimbo1234 said:
haxxiy said:
TheJimbo1234 said:
And here I was expecting you to talk about Fourier transforms. Ah well, maybe next time.

Yeah, because talking about integrals and transforms will surely make for much better conversations here on a video game forum, right?

People just need to know the core of anti-aliasing programming so they don't go around spreading bullshit and misleading others, that's all. 


Very true. I just thought it was obvious, hence the presumption he would go into how it's done, the mathematical variants of MSAA, standard AA, etc., and why the performance demands increase so rapidly.

If you want to go into it, then go ahead and post. I wouldn't mind reading about it in greater depth (just try to keep it relatively low level, lol).


Well, that is the thing: besides simply using a formula to change the value of the pixels around it to a brightness lower than that of the original, I don't know how it works, hence why I was disappointed.



kain_kusanagi said:
This might be slightly off topic, but do you think that real-time graphics in extreme HD like 4K will still need anti-aliasing? I mean, if you look at aliasing in SD games and compare it to aliasing in HD games, aliasing at 1080p is far less noticeable than it is at 480p. I'm talking about PC because you can turn the setting off and on to see the difference, but the same would obviously apply to any gaming system.

I wonder if anti-aliasing will get left behind when games are rendered at such high resolutions that we can't even see the aliasing with our naked eye.

Wouldn't it become MORE important? Especially with larger screens.

Think of your character looking over a nice vista: long draw distance, lots of little things far away, and something like a telephone wire. Without AA the wire would flicker horribly as you panned around (since an object in the distance would be smaller than 1 pixel).

Now, if we get to retina-type density on a 52" screen at 300 DPI (I calculated the resolution at 13,650x7,650, well beyond 4K; let's call it 16K), then it might not matter. But until we reach such amazing resolutions, I think AA and similar tech will become more important.
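
For what it's worth, that figure roughly checks out. Assuming a 16:9 panel (an assumption, the post doesn't say), a quick back-of-the-envelope calculation gives about 13,600 x 7,650 pixels:

# Back-of-the-envelope check of the 52" / 300 DPI figure, assuming a 16:9 panel.
import math

diag_in, dpi = 52, 300
diag_units = math.hypot(16, 9)               # ~18.36 "aspect units" on the diagonal
width_in = diag_in * 16 / diag_units         # ~45.3 inches
height_in = diag_in * 9 / diag_units         # ~25.5 inches
print(round(width_in * dpi), 'x', round(height_in * dpi))   # ~13596 x 7648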

Can CGI-Quality answer this please?



 


TheJimbo1234 said:
Scoobes said:
TheJimbo1234 said:
haxxiy said:
TheJimbo1234 said:
And here I was expecting you to talk about Fourier transforms. Ah well, maybe next time.

Yeah, because talking about integrals and transforms will surely make for much better conversations here on a video game forum, right?

People just need to know the core of anti-aliasing programming so they don't go around spreading bullshit and misleading others, that's all. 


Very true. I just thought it was obvious, hence the presumption he would go into how it's done, the mathematical variants of MSAA, standard AA, etc., and why the performance demands increase so rapidly.

If you want to go into it, then go ahead and post. I wouldn't mind reading about it in greater depth (just try to keep it relatively low level, lol).


Well, that is the thing: besides simply using a formula to change the value of the pixels around it to a brightness lower than that of the original, I don't know how it works, hence why I was disappointed.


Well, here you have the source code of SMAA, as you are a very intelligent guy, you should understand how it works:

https://github.com/iryoku/smaa/blob/master/SMAA.h
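
That header is the full shader (edge detection, blending-weight calculation, neighborhood blending), so it's heavy reading. As a much cruder illustration of what the "intelligent blur" family of filters does, here is a toy luma-based edge blend in Python. To be clear, this is not SMAA or FXAA, and toy_edge_blend is a made-up name; it's just the simplest version of the idea: find pixels whose brightness contrasts strongly with their neighbors and nudge them toward a local average.

# Toy "post-process AA" sketch: blend pixels toward their 4-neighborhood wherever
# the luma contrast is high. Illustration only, NOT the SMAA algorithm.
import numpy as np

def toy_edge_blend(img, threshold=0.1):
    """img: float array of shape (H, W, 3) with values in [0, 1]."""
    luma = img @ np.array([0.299, 0.587, 0.114])             # (H, W) luminance
    diff = np.zeros_like(luma)
    diff[1:-1, 1:-1] = np.maximum.reduce([
        abs(luma[1:-1, 1:-1] - luma[:-2, 1:-1]),             # up
        abs(luma[1:-1, 1:-1] - luma[2:, 1:-1]),              # down
        abs(luma[1:-1, 1:-1] - luma[1:-1, :-2]),             # left
        abs(luma[1:-1, 1:-1] - luma[1:-1, 2:]),              # right
    ])
    blurred = img.copy()
    blurred[1:-1, 1:-1] = (img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1] +
                           img[1:-1, :-2] + img[1:-1, 2:]) / 5.0
    edge = (diff > threshold)[..., None]                     # where contrast is high
    return np.where(edge, blurred, img)                      # blend only on edges

smoothed = toy_edge_blend(np.random.rand(1080, 1920, 3))     # example usage

Real filters like FXAA and SMAA are much smarter about where and how they blend (they estimate the direction and length of the edge rather than just local contrast), which is why they blur far less than this toy version would.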



Zappykins said:
kain_kusanagi said:
This might be slightly off topic, but do you think that real-time graphics in extreme HD like 4K will still need anti-aliasing? I mean, if you look at aliasing in SD games and compare it to aliasing in HD games, aliasing at 1080p is far less noticeable than it is at 480p. I'm talking about PC because you can turn the setting off and on to see the difference, but the same would obviously apply to any gaming system.

I wonder if anti-aliasing will get left behind when games are rendered at such high resolutions that we can't even see the aliasing with our naked eye.

Wouldn't it become MORE important? Especially with larger screens.

Think of your character looking over a nice vista: long draw distance, lots of little things far away, and something like a telephone wire. Without AA the wire would flicker horribly as you panned around (since an object in the distance would be smaller than 1 pixel).

Now, if we get to retina-type density on a 52" screen at 300 DPI (I calculated the resolution at 13,650x7,650, well beyond 4K; let's call it 16K), then it might not matter. But until we reach such amazing resolutions, I think AA and similar tech will become more important.

Can CGI-Quality answer this please?

Whoa, where does that enormous resolution come from? You don't need a TV screen to be 300 DPI unless you're going to watch it from the same distance you keep your smartphone or tablet, which I suppose is about 10 to 15 inches from your eyes.

"Retina density" is an Apple PR buzzword. What they actually meant is "pixel density so high that your eyes can't distinguish two adjacent pixels", but that obviously depends on how far you are from the display: Steve Jobs' magical threshold is 300 DPI at 11 inches away.

That's (unless I used my calculator wrong) about a minute of arc of angular size for a single pixel, or, to put it in human terms, about 1/30th of the angular size of the moon or sun in the sky.

Once again, thanks to our trusted trigonometry, we get that a 1080p, 52" diagonal screen gives that same angular size for each pixel if watched from a distance of 80 inches or more. In other words, that screen is "retina" already, unless you watch it from much closer than 80 inches.
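
Both numbers are easy to reproduce. A quick check in Python (assuming a 16:9 panel for the 52" screen) gives roughly one arc minute per pixel at 11 inches, and roughly 81 inches as the distance where a 52" 1080p screen reaches the same pixel angle:

# Checking the two claims above: a 300 DPI pixel at 11" subtends about one arc
# minute, and a 52" 1080p screen reaches the same pixel angle at roughly 80" away.
import math

pixel_pitch = 1 / 300                                    # inches per pixel at 300 DPI
angle_arcmin = math.degrees(math.atan(pixel_pitch / 11)) * 60
print(angle_arcmin)                                      # ~1.04 arc minutes

width_52 = 52 * 16 / math.hypot(16, 9)                   # ~45.3" wide for a 16:9 panel
pitch_1080p = width_52 / 1920                            # ~0.0236" per pixel
distance = pitch_1080p / math.tan(math.radians(1 / 60))  # distance where 1 px = 1 arcmin
print(distance)                                          # ~81 inches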

And to come back to the original question: the higher the resolution, the less important AA becomes. By definition, in a "retina" situation you can't distinguish two adjacent pixels, and as such you should see no 1px jaggies at all. In your example, a 1px-wide phone wire in the distance might flicker somewhat, but it would be at the threshold of your eyeballs' capabilities anyway, so your biology would probably blur it into a barely visible line, with no technical intervention needed :)



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

WereKitten said:
Zappykins said:
kain_kusanagi said:
This might be slightly off topic, but do you think that real-time graphics in extreme HD like 4K will still need anti-aliasing? I mean, if you look at aliasing in SD games and compare it to aliasing in HD games, aliasing at 1080p is far less noticeable than it is at 480p. I'm talking about PC because you can turn the setting off and on to see the difference, but the same would obviously apply to any gaming system.

I wonder if anti-aliasing will get left behind when games are rendered at such high resolutions that we can't even see the aliasing with our naked eye.

Wouldn't it become MORE important? Especially with larger screens.

Think of your character looking over a nice vista: long draw distance, lots of little things far away, and something like a telephone wire. Without AA the wire would flicker horribly as you panned around (since an object in the distance would be smaller than 1 pixel).

Now, if we get to retina-type density on a 52" screen at 300 DPI (I calculated the resolution at 13,650x7,650, well beyond 4K; let's call it 16K), then it might not matter. But until we reach such amazing resolutions, I think AA and similar tech will become more important.

Can CGI-Quality answer this please?

Whoa, where does that enormous resolution come from? You don't need a TV screen to be 300 DPI unless you're going to watch it from the same distance you keep your smartphone or tablet, which I suppose is about 10 to 15 inches from your eyes.

"Retina density" is an Apple PR buzzword. What they actually meant is "pixel density so high that your eyes can't distinguish two adjacent pixels", but that obviously depends on how far you are from the display: Steve Jobs' magical threshold is 300 DPI at 11 inches away.

That's (unless I used my calculator wrong) about a minute of arc of angular size for a single pixel, or, to put it in human terms, about 1/30th of the angular size of the moon or sun in the sky.

Once again, thanks to our trusted trigonometry, we get that a 1080p, 52" diagonal screen gives that same angular size for each pixel if watched from a distance of 80 inches or more. In other words, that screen is "retina" already, unless you watch it from much closer than 80 inches.

And to come back to the original question: the higher the resolution, the less important AA becomes. By definition, in a "retina" situation you can't distinguish two adjacent pixels, and as such you should see no 1px jaggies at all. In your example, a 1px-wide phone wire in the distance might flicker somewhat, but it would be at the threshold of your eyeballs' capabilities anyway, so your biology would probably blur it into a barely visible line, with no technical intervention needed :)

Studies have shown people can still tell a difference up to 100 cycles per degree, or 200 pixels per degree.
So a 52" diagonal screen at 80 inches isn't 'retina' until you have 6491 x 3651 pixels. That's when anti-aliasing becomes irrelevant.
At 30 cycles per degree you start to lose the ability to see the pixels themselves, but you will still see aliasing occurring.
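
Those pixel counts follow from the same geometry as before. A rough check (again assuming a 16:9 panel, and using the small-angle approximation for the field of view) lands on almost exactly the same numbers:

# Rough check of the 6491 x 3651 figure: 52" 16:9 screen at 80", 200 pixels per degree.
import math

width_in = 52 * 16 / math.hypot(16, 9)           # ~45.3 inches
height_in = 52 * 9 / math.hypot(16, 9)           # ~25.5 inches
view_dist_in = 80
h_deg = math.degrees(width_in / view_dist_in)    # horizontal field of view, ~32.5 deg
v_deg = math.degrees(height_in / view_dist_in)   # vertical field of view, ~18.3 deg
print(round(h_deg * 200), 'x', round(v_deg * 200))   # ~6492 x 3652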

There are also temporal aliasing effects. Movement starts to become fluid above 20 fps, yet the human brain can still perceive a difference in certain situations up to 300 fps.

Then, last but not least, there is color information and brightness. 8-bit color is also not enough to satisfy the range of the human eye, and color banding is still an issue that anti-aliasing cannot fix.

Still a ways to go before we have 300 fps, 16-bit RGB, 8K displays.



SvennoJ said:

Studies have shown people can still tell a difference up to 100 cycles per degree, or 200 pixels per degree.
So a 52" diagonal screen at 80 inches isn't 'retina' until you have 6491 x 3651 pixels. That's when anti-aliasing becomes irrelevant.
At 30 cycles per degree you start to lose the ability to see the pixels themselves, but you will still see aliasing occurring.
...

Thanks for the quantitative info. I referred in my post to "Steve Jobs' magic number" because Zappykins started from 300 DPI, so I thought that's where he was starting for his calculations, but I'm no optometrist :)

I have read from multiple sources that 20/20 sight maps to a minute of arc because of the way the Snellen charts are built. For example, the 20/20 optotypes are built on 5x5 grids spanning 5 minutes of arc in total, so they contain 5 "pixels", or 2.5 cycles, per main direction.
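
Put as numbers, that Snellen-based limit works out to roughly 60 pixels, or 30 cycles, per degree (the same 30 cpd figure as above):

# The Snellen arithmetic: a 20/20 optotype spans 5 arc minutes with a 5-stroke grid,
# i.e. about 1 arc minute per "pixel".
arcmin_per_pixel = 5 / 5                      # one arc minute per stroke
pixels_per_degree = 60 / arcmin_per_pixel     # 60 pixels per degree
cycles_per_degree = pixels_per_degree / 2     # 30 cpd (a cycle = one dark + one light stroke)
print(pixels_per_degree, cycles_per_degree)   # 60.0 30.0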

Your numbers point to about 3x that, but are they for statistical outliers (i.e. exceptional sight) or for "normal" vision?



"All you need in life is ignorance and confidence; then success is sure." - Mark Twain

"..." - Gordon Freeman

WereKitten said:
SvennoJ said:
 

Studies have shown people can still tell a difference up to 100 cycles per degree, or 200 pixels per degree.
So a 52" diagonal screen at 80 inches isn't 'retina' until you have 6491 x 3651 pixels. That's when anti-aliasing becomes irrelevant.
At 30 cycles per degree you start to lose the ability to see the pixels themselves, but you will still see aliasing occurring.
...

Thanks for the quantitative info. I referred in my post to "Steve Jobs' magic number" because Zappykins started from 300 DPI, so I thought that's where he was starting for his calculations, but I'm no optometrist :)

I have read from multiple sources that 20/20 sight maps to a minute of arc because of the way the Snellen charts are built. For example, the 20/20 optotypes are built on 5x5 grids spanning 5 minutes of arc in total, so they contain 5 "pixels", or 2.5 cycles, per main direction.

Your numbers point to about 3x that, but are they for statistical outliers (i.e. exceptional sight) or for "normal" vision?

That 100 cpd figure is based on tests NHK did, showing people two images side by side at different cpd and asking which one looks/feels more real. They determined 100 cpd as an upper limit. http://www2.tech.purdue.edu/Cgt/courses/cgt512/discussion/Chastain_Human%20Factors%20in%20UDTV.pdf

20/20 vision is the limit at which you can determine detail, or read text. You will not be able to read pixel-font text at 100 cpd, yet you can still tell a difference, which probably also has to do with the grid pattern of display pixels.

Btw, how can we tell when anti-aliasing is perfect? I see aliasing effects in real life too. For example, when you walk towards a baseball field from a distance, with two chain-link fences overlapping on opposite sides of the field, you see all kinds of moiré and stair-stepping patterns going on. Editing all that out doesn't represent reality truthfully.
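
That fence moiré is a nice real-world analogue of what happens inside the renderer: whenever a pattern is finer than the sampling grid can represent, the samples reproduce a slower, false pattern instead. A tiny one-dimensional illustration of that frequency aliasing (nothing fence-specific, just the same beating effect):

# Sampling a pattern finer than half the sample rate produces a slower false
# pattern (an alias), which is what jaggies, shimmer and moire come down to.
import numpy as np

x = np.arange(100)                     # 100 "pixels" across the screen
coarse = np.cos(2 * np.pi * 0.45 * x)  # 0.45 cycles per pixel (below Nyquist)
fine   = np.cos(2 * np.pi * 0.55 * x)  # 0.55 cycles per pixel (above Nyquist)
# Sampled once per pixel, the 0.55-cycle pattern is indistinguishable from the
# 0.45-cycle one, so the fine detail shows up as a slower, false pattern:
print(np.allclose(coarse, fine))       # True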