fatslob-:O said:

Ok, now you're making up lies. Batman wasn't sharper on the PS3, it was sharper on the Wii U. FXAA only takes about 1 millisecond to filter the image on a $100 video card, compared to 4x MSAA taking roughly 5 milliseconds. Do you have any statistics from AMD or Nvidia showing what type of anti-aliasing PC gamers prefer?

From Eurogamer's Digital Foundry Batman: Arkham City face-off:

"The only real exception comes in the form of the addition of NVIDIA's FXAA post-processing technology. The original console versions of Arkham City operated at native 720p with no anti-aliasing employed at all so we might expect a welcome bump in image quality from the addition of the AA tech on Wii U. However, the arrival of FXAA is something of a double-edged sword. On the one hand, high contrast edges are smoothed significantly - a welcome addition. Unfortunately, on the flipside, the additional blurring detracts from the quality of the artwork, with specular highlights in particular dulled significantly."

What is this sharpening effect you speak of?

Some PC games silently combine a sharpening pass with FXAA to "fix" the blur. That's why most PC gamers think FXAA doesn't blur textures; it's a common illusion.
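To make the idea concrete, here's a minimal sketch of that kind of post-FXAA sharpening pass, written as a plain unsharp mask. The kernel weights and the 0.5 strength are illustrative assumptions, not values taken from any particular game:

```python
import numpy as np
from scipy.ndimage import convolve

def unsharp_mask(image, strength=0.5):
    """Sharpen a greyscale image by adding back high-frequency detail."""
    blur_kernel = np.array([[1, 2, 1],
                            [2, 4, 2],
                            [1, 2, 1]], dtype=np.float32) / 16.0
    blurred = convolve(image, blur_kernel, mode='nearest')
    detail = image - blurred  # high-frequency component lost to the blur
    return np.clip(image + strength * detail, 0.0, 1.0)

# fxaa_output would be the post-AA frame (values in 0..1); the sharpening
# pass partially restores the texture detail and specular highlights that
# the FXAA blur dulled, which hides the blur from the player.
# sharpened = unsharp_mask(fxaa_output)
```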

FYI, MLAA and FXAA are virtually identical in every respect.

No, they are not. You can think of FXAA as an advanced blur, whereas MLAA and the other morphological AA techniques like SMAA try to smooth more precise patterns, such as long aliased edges, rather than random sub-pixel texture detail.

All of these methods use edge detection for anti-aliasing except for MSAA.

MLAA is a morphological AA; it tries to detect and smooth only aliased edges, yes. But FXAA will blur things every time it detects too much contrast between neighbouring pixels. What counts as "too much contrast" is precisely what developers can tune.
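Here's a minimal sketch of the kind of contrast test being described: compare each pixel's luma against its neighbours and only filter where the local contrast exceeds a developer-tunable threshold. This mimics the spirit of FXAA's edge threshold, not the actual FXAA shader, and the 0.125 default is just an illustrative assumption:

```python
import numpy as np

def luma(rgb):
    """Perceptual luma approximation used for the contrast test."""
    return rgb @ np.array([0.299, 0.587, 0.114])

def contrast_mask(image, edge_threshold=0.125):
    """Return a boolean mask of pixels whose local luma contrast is 'too much'.

    image: H x W x 3 float array in 0..1.
    edge_threshold: developer-tunable; lower values flag (and blur) more of
    the frame, including texture detail, higher values leave more aliasing
    untouched.
    """
    l = luma(image)
    # Luma of the four axis neighbours (borders wrap here for simplicity).
    up    = np.roll(l,  1, axis=0)
    down  = np.roll(l, -1, axis=0)
    left  = np.roll(l,  1, axis=1)
    right = np.roll(l, -1, axis=1)
    local_max = np.maximum.reduce([l, up, down, left, right])
    local_min = np.minimum.reduce([l, up, down, left, right])
    # FXAA-style test: filter only where the contrast range is large enough.
    return (local_max - local_min) > edge_threshold
```

Because the test is pure contrast, high-frequency texture detail gets flagged just as readily as geometric edges, which is why FXAA blurs textures; MLAA/SMAA go on to classify edge shapes before blending instead of filtering everything the threshold catches.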