DonFerrari said:
globalisateur said:
DonFerrari said:
Better to have no AA than blurry, messy shittyA. Let it keep evolving.


Yes, yes, and yes. But developers are afraid of the 10% of people complaining "I can see the jaggies".

So they just completely wreck their game by adding a big vaseline filter.

 

But you know, not all developers do that. Some really care about image quality, hopefully because they understand that clarity and sharpness are really important in gaming.

I would rather have them put those resources into better textures, better models, better fps, or leave them unused, than into lazy AA just to tick a box on a checklist. Sony could use that 3.5 GB of reserved RAM to provide a decent standard AA, so shitty devs don't make a worse game just because it's cool to have AA.



You still don't understand. What's the point of using 10% higher-resolution textures and 10% more polygons on the models if those are destroyed (yes, destroyed) by the FXAA? Those GPU resources are never seen by the gamers because those details are wrecked and smeared into a blur by the FXAA (or other blurring methods like motion blur, Quincunx, or TXAA).

By its nature, FXAA will blur any high contrast between neighbouring pixels. The problem is that high-resolution textures and high-polygon models mean high contrast between pixels. In fact FXAA will barely touch low-resolution textures, which is kind of ironic.
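To make it concrete, here's a rough sketch of the contrast test that FXAA-style filters are built around. It is not the real NVIDIA shader, just a toy approximation (the threshold value and the 3x3 neighbourhood blend are my own assumptions), but it shows the point: the filter only kicks in where neighbouring pixels differ a lot in brightness, which is exactly where high-res textures and fine geometry live.

```python
# Toy sketch of the luma-contrast test behind FXAA-style filtering.
# NOT the actual FXAA shader: the threshold and blend weights here are
# made-up illustration values.

def luma(rgb):
    """Approximate perceived brightness of an RGB pixel (0.0-1.0 range)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def fxaa_like_pixel(center, neighbors, edge_threshold=0.125):
    """Blend the center pixel toward its neighbours only where local
    luma contrast is high -- i.e. exactly where fine detail lives."""
    lumas = [luma(p) for p in neighbors] + [luma(center)]
    contrast = max(lumas) - min(lumas)
    if contrast < edge_threshold:
        return center  # flat, low-detail area: left untouched
    # High contrast detected: blend center with the neighbourhood average.
    avg = [sum(channel) / len(neighbors) for channel in zip(*neighbors)]
    return tuple(0.5 * c + 0.5 * a for c, a in zip(center, avg))

# A bright texel next to dark texels (a sharp texture detail) gets softened...
print(fxaa_like_pixel((1.0, 1.0, 1.0), [(0.1, 0.1, 0.1)] * 4))
# ...while a flat, low-contrast region passes through unchanged.
print(fxaa_like_pixel((0.5, 0.5, 0.5), [(0.52, 0.5, 0.49)] * 4))
```

The first call blends the bright pixel toward its dark neighbours (detail softened); the second returns the pixel untouched because the contrast never crosses the threshold, which is why flat, low-res areas come through looking the same.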

FXAA is a bad AA solution, period. The more detailed a game is, the more FXAA will wreck and blur it.

Anyway, it's normal that most people don't understand this reasoning. Most developers, who work very hard and are smarter than most people, still don't understand it.