Squilliam said:
ethomaz said:

Ok. You can think whatever you want about me... but I'm not making things up or flaming.

For Microsoft/360 it's good because some games use 2xMSAA... so 2xMLAA would free up some GPU time for graphics... but for games that already use 100% of the GPU for graphics it has no use... a better alternative would be to do it on the CPU, but someone told me the 360 CPU can't handle that.


Comparing the ATI implementation to a console implementation is apples and oranges. The former is pretty much a driver-level hack, whereas the latter is built into the game code. The actual method used to produce the effect is likely different too.

Why would they move a 'graphics effect' off the GPU? It's the Sony solution that's backwards here. It's quite possible they can fit it into the GPU frame without much, if any, performance penalty, depending on how it is scheduled.

So an HD 6870 has penalties but the 360's R500-derived GPU doesn't? For me, Sony's method of putting MLAA on the Cell is better than leaving it on the GPU on consoles.

My first question was whether Microsoft could put it on the CPU, and someone said it's not possible... so the best option for Microsoft was optimising it for the GPU... that's great, but it could be better.

What do 360 games use the tri-core 360 CPU for? Physics only? If Microsoft could put this work on the CPU, the GPU would be free to work only on graphics.

MLAA is lighter than MSAA and SSAA... but few games use AA on the 360 (or PS3) because of the GPUs' limited power... I wanted an implementation without graphics penalties... the Cell on the PS3 does it better.
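For anyone unfamiliar with why MLAA can run on a CPU at all: unlike MSAA, which needs extra GPU samples per pixel, MLAA is a post-process over the finished frame. The sketch below shows the core idea only (detect luminance discontinuities, then blend across them); it is a toy illustration, not Sony's actual Cell/SPU implementation, and the `THRESHOLD` and `weight` values are made up for the example.

```python
# Minimal sketch of the core MLAA idea (toy version, NOT the real Cell code):
# 1) find discontinuity edges between horizontally adjacent pixels,
# 2) blend a fraction of each neighbor across every detected edge.
# The "image" is just a list of rows of 0..255 grayscale ints.

THRESHOLD = 32  # hypothetical edge-detection threshold


def find_vertical_edges(img):
    """Return {(y, x)} where the jump between columns x and x+1 exceeds THRESHOLD."""
    edges = set()
    for y, row in enumerate(img):
        for x in range(len(row) - 1):
            if abs(row[x] - row[x + 1]) > THRESHOLD:
                edges.add((y, x))
    return edges


def blend_vertical_edges(img, edges, weight=0.25):
    """Soften each detected edge by mixing a fraction of the opposite pixel in."""
    out = [row[:] for row in img]
    for (y, x) in edges:
        a, b = img[y][x], img[y][x + 1]
        out[y][x] = round(a * (1 - weight) + b * weight)
        out[y][x + 1] = round(b * (1 - weight) + a * weight)
    return out


# A hard vertical edge: black | white, two rows tall.
img = [[0, 0, 255, 255],
       [0, 0, 255, 255]]
edges = find_vertical_edges(img)
smoothed = blend_vertical_edges(img, edges)
```

Because it only reads and writes the final framebuffer, this kind of pass parallelises well across the Cell's SPUs (each SPU taking a horizontal strip of the image), which is why it can be moved off the GPU entirely.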