fatslob-:O said:
Why do you have this crusade against FXAA? It practically imitates 4x MSAA really well without much of a performance hit. Sure, some textures may get blurred, but that can easily be fixed by turning on anisotropic filtering.
Because I hate blurred textures/assets. And it can't be fixed on consoles anyway. On PC the only way to "fix" the blurred textures is to combine FXAA with a sharpen effect. But that comes at its own price too: invented artifacts, and all the textures end up with the same grainy look.
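To show what I mean by "its own price", here is a toy sketch (Python/NumPy, with a hypothetical fxaa_pass in the comments, not any shipping FXAA code): an unsharp-mask style sharpen applied after a blur boosts all local contrast back indiscriminately, edges and texture noise alike, which is where the uniform grainy look and halo artifacts come from.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def sharpen(image: np.ndarray, amount: float = 0.8, radius: float = 1.0) -> np.ndarray:
    """Unsharp mask: add back the high-frequency residual (image - blur)."""
    low = gaussian_filter(image, sigma=radius)
    high = image - low                       # residual: edges AND texture grain alike
    return np.clip(image + amount * high, 0.0, 1.0)

# blurred = fxaa_pass(frame)                 # hypothetical FXAA-like blur pass
# sharpened = sharpen(blurred)               # re-amplifies everything uniformly
```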
I recently read yet another post from an enthusiast Wii U/PS3 owner who preferred the sharper version of Batman: Arkham City on PS3 over the same game, with the same textures, in its FXAA-blurred version on Wii U. The irony of this story is that even cheap, lousy FXAA needs GPU time, so the sharper look on PS3 was actually less expensive for the PS3 GPU. There are so many testimonies on the Internet from people explaining that they hate FXAA and try to avoid it at all costs, notably on PC.
And don't forget FXAA is not really designed to smooth aliased edges (like MSAA, SMAA, MLAA or other real anti-aliasing techniques); it just blurs every high-contrast pixel, whether it belongs to an aliased edge or to texture sub-detail. FXAA belongs to the blur-filter family, like Gaussian blur or Quincunx. The future on consoles lies with morphological AA like MLAA or SMAA (Ryse).
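A minimal sketch of that point (Python/NumPy, invented names, a naive box blur standing in for FXAA's directional blend): the filter keys off a luma contrast test on the final colour buffer, so it has no way of knowing whether the contrast it finds is a geometric edge or texture detail.

```python
import numpy as np

LUMA = np.array([0.299, 0.587, 0.114])

def fxaa_like_blur(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """rgb: HxWx3 float image in [0,1]. Blend high-contrast pixels with a 3x3 mean."""
    luma = rgb @ LUMA
    # local luma range over the 4-neighbourhood
    pad = np.pad(luma, 1, mode="edge")
    n, s = pad[:-2, 1:-1], pad[2:, 1:-1]
    w, e = pad[1:-1, :-2], pad[1:-1, 2:]
    local_max = np.maximum.reduce([luma, n, s, w, e])
    local_min = np.minimum.reduce([luma, n, s, w, e])
    mask = (local_max - local_min) > threshold           # "edge" = any contrast at all
    # naive 3x3 box blur stands in for the directional blend
    padded = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    box = sum(padded[i:i + rgb.shape[0], j:j + rgb.shape[1]]
              for i in range(3) for j in range(3)) / 9.0
    return np.where(mask[..., None], box, rgb)
```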
I don't hate FXAA in itself, I hate the fact that developers apply it lazily, at too strong a setting. It can have a positive effect on some games ONLY if developers test the effect on their assets at the FXAA level they use. You can use FXAA without blurring textures (as in Far Cry 3 on PS3, where the FXAA doesn't touch the high-res textures). But most developers don't bother to check that their assets aren't blurred.
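One way a developer could keep the blur off texture detail (just an illustration with invented names, not how Far Cry 3 actually does it) is to gate the filter on geometric edges, for example depth discontinuities from the G-buffer, instead of raw colour contrast. Tuning that gate and the filter strength per asset is exactly the testing work I'm talking about.

```python
import numpy as np

def geometric_edge_mask(depth: np.ndarray, threshold: float = 0.01) -> np.ndarray:
    """depth: HxW linear depth. True where neighbouring depths differ sharply."""
    pad = np.pad(depth, 1, mode="edge")
    dx = np.abs(pad[1:-1, 2:] - pad[1:-1, :-2])
    dy = np.abs(pad[2:, 1:-1] - pad[:-2, 1:-1])
    return np.maximum(dx, dy) > threshold

# aa_frame = np.where(geometric_edge_mask(depth)[..., None],
#                     fxaa_like_blur(frame),   # blur only real silhouettes
#                     frame)                   # leave texture detail untouched
```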