| selnor said:
The 360 does 4xMSAA free. That's the point. The PS3 would use considerable resources from the Cell to do 4xMSAA, meaning less can be used for other things. That's why it's so great that the 360 has a specially built chip just for AA. It means the entire system is never compromised. And it's not much more than 16xMSAA as you quote. It's MLAA, which is software driven. Less reliable, with many issues of further blur. Less performance hit, like QSAA, but worse image on textures, shadows etc. Only good for polygon jaggies. |
For the bolded part: as far as I know the eDRAM chip provides "free" MSAA only up to its 10MB internal memory limit, after which you have to tile and incur heavy performance hits. For example, once you use real 720p with HDR, even 2xMSAA breaks that limit, which is why with the Halo 3 engine Bungie actually chose to render sub-HD. The rough numbers below show why.
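Here's a back-of-the-envelope sketch of that arithmetic. The buffer formats are my assumptions for illustration (32-bit colour such as FP10 plus 32-bit depth/stencil per sample, one colour target), not Bungie's or Remedy's actual layouts:

```c
#include <stdio.h>

#define EDRAM_BYTES (10L * 1024 * 1024) /* 360 eDRAM capacity */

/* Assumed per-sample cost: 32-bit colour (e.g. FP10) + 32-bit
 * depth/stencil. Illustrative only, not any shipped game's setup. */
static void budget(const char *label, int w, int h, int msaa)
{
    long bytes = (long)w * h * msaa * (4 + 4);
    int tiles = (int)((bytes + EDRAM_BYTES - 1) / EDRAM_BYTES);
    printf("%-26s %6.2f MB -> %s (%d tile%s)\n",
           label, bytes / (1024.0 * 1024.0),
           bytes <= EDRAM_BYTES ? "fits, 'free' MSAA" : "must tile",
           tiles, tiles == 1 ? "" : "s");
}

int main(void)
{
    budget("720p, no MSAA",          1280, 720, 1);
    budget("720p, 2xMSAA",           1280, 720, 2);
    budget("720p, 4xMSAA",           1280, 720, 4);
    budget("1152x640, no MSAA",      1152, 640, 1);
    return 0;
}
```

Under those assumed formats, 720p with 2xMSAA lands around 14MB, blowing past the 10MB limit and forcing the frame to be split into tiles (with geometry crossing the seams submitted more than once), while a sub-HD resolution pulls the footprint back under the limit, which fits what Bungie described for Halo 3.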
Thus, when you say that U2 is "only" doing 2xMSAA, you're talking about an engine that actually provides something the eDRAM can't provide "for free". I'll wait for a proper tech analysis of the final Alan Wake to see what tradeoff they chose to overcome the eDRAM limitations, but it will either a) not use real 720p all the time, or b) not use "proper" 10-bit HDR, or c) incur the tiling performance hit, which they might mitigate with other engine tradeoffs.
As for your speculation that SPU-based anti-aliasing techniques drain "HUGE" resources from the Cell, or that they're "less reliable" and "blurry", I'd like to see some basis for that, considering that:
1) the Cell's SPUs are designed very much like dedicated stream processors, which makes them well suited to exactly this kind of post-processing;
2) such techniques won't blur the textures for any mystical reason. Different techniques simply bring different results (see the sketch after this list).
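For illustration, here's a minimal sketch in C of the kind of edge-detection pass an MLAA-style filter starts with. It's my own simplification (the actual SPU implementation isn't public, the threshold is an assumed tunable, and the blend step is a crude placeholder; real MLAA classifies edge shapes and blends with computed coverage weights). The point it shows: pixels are only touched where there's a sharp luminance discontinuity, which is why such filters mainly smooth polygon silhouettes rather than smearing every texel:

```c
#include <math.h>
#include <stdio.h>

#define W 6
#define H 6
#define THRESHOLD 0.1f /* assumption: tunable contrast cutoff */

/* Toy 6x6 luminance image: a bright diagonal "polygon edge"
 * over a dark background. */
static const float img[H][W] = {
    {0.1f, 0.1f, 0.1f, 0.1f, 0.1f, 0.9f},
    {0.1f, 0.1f, 0.1f, 0.1f, 0.9f, 0.9f},
    {0.1f, 0.1f, 0.1f, 0.9f, 0.9f, 0.9f},
    {0.1f, 0.1f, 0.9f, 0.9f, 0.9f, 0.9f},
    {0.1f, 0.9f, 0.9f, 0.9f, 0.9f, 0.9f},
    {0.9f, 0.9f, 0.9f, 0.9f, 0.9f, 0.9f},
};

int main(void)
{
    float out[H][W];
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++) {
            /* Flag an edge only where luminance jumps against the
             * left or top neighbour; flat detail below the threshold
             * is left completely untouched. */
            int edge = (x > 0 && fabsf(img[y][x] - img[y][x - 1]) > THRESHOLD)
                    || (y > 0 && fabsf(img[y][x] - img[y - 1][x]) > THRESHOLD);
            if (edge) {
                /* Placeholder blend: average with the 4-neighbourhood.
                 * Real MLAA instead reconstructs the edge shape and
                 * blends with exact coverage weights. */
                float sum = img[y][x];
                int n = 1;
                if (x > 0)     { sum += img[y][x - 1]; n++; }
                if (x < W - 1) { sum += img[y][x + 1]; n++; }
                if (y > 0)     { sum += img[y - 1][x]; n++; }
                if (y < H - 1) { sum += img[y + 1][x]; n++; }
                out[y][x] = sum / n;
            } else {
                out[y][x] = img[y][x];
            }
        }
    }
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++)
            printf("%.2f ", out[y][x]);
        printf("\n");
    }
    return 0;
}
```

Note that the filter leaves the flat 0.1 and 0.9 interiors completely alone; how "blurry" the output looks depends on the quality of the blend step, not on some inherent property of software AA.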
Finally, the proof-of-the-pudding argument.
I've seen you refer to ME2 as having the "best graphics ever on console". As far as I know, that's a game that uses no AA at all, except in cutscenes.
On the other hand, Uncharted 2 has the most lavish graphics I've ever witnessed on a console with its "mere" 2xMSAA, HD colour processing and lots of action happening onscreen; and I include in that comparison everything I've seen of Alan Wake so far, even though you say it sports 4xMSAA.
And from what I've seen, GOWIII seems to up the ante. Judging from what I know and from the videos (though I'm ready to stand corrected once I try the final version of the game), it doesn't seem to suffer from "HUGE" performance penalties either, as it sports incredible scale, dozens of AI characters, a lot of fast physical interaction and continuous data streaming.