globalisateur said:
ethomaz said:

From GAF.

 

http://www.neogaf.com/forum/showthread.php?t=764678

Glad the 4xMSAA is back... no more PS3 MLAA or FXAA... this was the biggest news ever for me.

1080p with 4xMSAA is MEGATON to me.


Everyone on the Internet is mistaking the meaning of this tweet. It's not the resolution that isn't set in stone, it's the 4xMSAA.

And you know they could still combine the great 4xMSAA with the shitty FXAA and completely wreck the image, and they'll probably do just that IMO. Many PC games (like AC4 on PC) use this horrible combination.

 

EDIT: I have just seen the post about the Ready at Dawn specific AA. My fears are perfectly founded, as it seems to be just a presentation of a new "blur" algorithm to reduce specular aliasing. Not even morphological, apparently. Just a smart Gaussian blur. Why do they even try to reinvent the wheel? Just pick the best morphological AA out there: SMAA. I would even prefer a light FXAA, which at least handles sub-pixel aliasing, over any Gaussian blur (vaseline).
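For what it's worth, here's a toy sketch of the kind of contrast-gated blur I'm worried about (plain Python/NumPy, entirely my own illustration; Ready at Dawn haven't published their actual algorithm, and this is not SMAA or FXAA code either). Every pixel whose neighbourhood has high luma contrast gets averaged with its neighbours, which does calm specular shimmer but also softens legitimate high-frequency detail, whereas morphological AA (MLAA/SMAA-style) reconstructs edge shapes and only blends across the detected edge direction:

```python
# Toy contrast-gated Gaussian blur, NOT anyone's shipping AA.
# Illustrates why a blur-based "specular AA" tends toward the vaseline look.
import numpy as np

GAUSS_3x3 = np.array([[1, 2, 1],
                      [2, 4, 2],
                      [1, 2, 1]], dtype=np.float32) / 16.0

def luma(rgb):
    # Rec. 709 luma weights
    return rgb @ np.array([0.2126, 0.7152, 0.0722], dtype=np.float32)

def blur_based_aa(img, threshold=0.25):
    """img: HxWx3 float array in [0,1]. Blur any pixel whose 3x3 luma range
    exceeds `threshold` (a crude stand-in for 'shimmering specular' pixels)."""
    h, w, _ = img.shape
    out = img.copy()
    y = luma(img)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch_y = y[i - 1:i + 2, j - 1:j + 2]
            if patch_y.max() - patch_y.min() > threshold:
                # Every high-contrast pixel gets softened, edges and detail alike.
                patch = img[i - 1:i + 2, j - 1:j + 2, :]
                out[i, j] = np.tensordot(GAUSS_3x3, patch, axes=([0, 1], [0, 1]))
    return out
```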


I would expect 1080p with FXAA to look better than 800p with 4xMSAA. I know from my experience as a PC gamer that resolution matters more than AA. But I trust Ready At Dawn; whatever they do will be for the best.
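Rough numbers behind that gut feeling (my own back-of-the-envelope figures, assuming "800p" means a letterboxed 1920x800 frame; a 16:9 1422x800 frame would make the gap even bigger):

```python
# Back-of-the-envelope pixel/sample budget, not figures from the thread.
full_1080p = 1920 * 1080        # 2,073,600 shaded pixels
letterboxed_800p = 1920 * 800   # 1,536,000 shaded pixels (assumed aspect)

print(full_1080p / letterboxed_800p)   # ~1.35x more shaded pixels at 1080p
# 4xMSAA adds coverage/depth samples per pixel, not extra shaded resolution:
print(letterboxed_800p * 4)            # 6,144,000 coverage samples at 800p + 4xMSAA
print(full_1080p)                      # 2,073,600 pixels fed to FXAA at 1080p
```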

 

I do agree that FXAA can give a "milky" look, though.