
Forums - Gaming - Xbox 360 Reportedly getting a slight Graphical Boost

trasharmdsister12 said:
BenVTrigger said:
So I don't get it since I'm not the best tech guy on here, but is this a software / development type thing or a hardware thing? I'm guessing it can't be a hardware addition because not all 360s would be compatible.

There seems to be some confusion here so I'll try to basically explain what this is and what it means for any future games that may utilize it as well as performance and graphical fidelity changes it may bring.

In layman's terms, anti-aliasing is the process of smoothing out jagged edges in an image (in this case a rendered frame from the game engine) to improve image quality. It has less to do with graphical fidelity and effects than with image quality (IQ), which defines how clear, defined, and smooth an image appears. Morphological anti-aliasing (MLAA) is just another method of doing so, and depending on the source image it will produce a different result than multi-sample anti-aliasing (MSAA); sometimes better and sometimes worse. To answer your first question: it's a software thing. It's a set of mathematical calculations run on an image to ideally locate and remove jaggies one way or another.
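Roughly what a morphological filter does can be sketched in a few lines of toy Python (a hypothetical illustration only, not AMD's or Sony's actual algorithm, which classify edge shapes far more cleverly): find hard luminance steps in the finished frame and blend across them.

```python
# Toy post-process anti-aliasing sketch (hypothetical, not real MLAA):
# detect hard luminance edges in a completed frame, then blend the pixels
# on either side of each edge to soften the "staircase".

def smooth_edges(frame, threshold=0.5):
    """frame: 2D list of luminance values in [0, 1]. Returns a new frame."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h):
        for x in range(w - 1):
            # A large horizontal luminance step marks a jagged vertical edge.
            if abs(frame[y][x] - frame[y][x + 1]) > threshold:
                avg = (frame[y][x] + frame[y][x + 1]) / 2
                # Pull both sides toward the average to soften the step.
                out[y][x] = (frame[y][x] + avg) / 2
                out[y][x + 1] = (frame[y][x + 1] + avg) / 2
    return out

# A hard black/white vertical edge...
frame = [[0.0, 0.0, 1.0, 1.0] for _ in range(3)]
smoothed = smooth_edges(frame)
print(smoothed[0])  # [0.0, 0.25, 0.75, 1.0] — edge pixels blended
```

The key point is that it operates purely on the final image, which is why it can run as "math on pixels" with no help from the renderer.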

Now, the PS3 uses the Cell's heavily parallel architecture and specialized processing units to achieve this task very efficiently. Until recently, the manner in which this process was accomplished was a poor fit for more standard hardware like that found in the PC and 360. What these folks have done is "port" these tasks to allow them to run on the GPU (which is incredibly parallel in this day and age) instead of the CPU.

The PS3's architecture and GPU didn't allow for simple implementation of the standard MSAA technique, so they opted for MLAA, which IMO has produced mixed results depending on its use (absolutely perfect in God of War III, but it really mucked things up in Killzone 3). Again, that all depends on the source image (things like art style, contrast, colours in use, shapes of edges), and any technique can outperform the others under certain conditions or when dealing with certain patterns. The 360 never really needed MLAA, as it could *potentially* get "free" MSAA from the eDRAM (this is sorta getting off topic so I'm not gonna elaborate at the moment) and the results were generally good enough.

With MLAA requiring ~2.5 ms of the GPU's full attention (assuming a standard-sized image for the system), that would leave ~30.8 ms of the ~33.3 ms frame for the rest of the GPU work in a game running at 30 FPS. That isn't a huge hit, but it is a hit nonetheless, so I'd assume developers will opt for the "free" MSAA where they can. If they can't, though, then I'll be interested to see how they choose to implement this technique and rework their engines to take advantage of any benefits it may bring to their workflow.
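To put that frame budget in concrete numbers (the ~2.5 ms cost is just the figure quoted above, not something I've measured):

```python
# Frame-budget arithmetic: a game at a given FPS has 1000/fps milliseconds
# per frame; whatever a post-process pass costs comes out of that budget.

def remaining_budget_ms(fps, post_process_ms):
    frame_time_ms = 1000.0 / fps
    return frame_time_ms - post_process_ms

# At 30 FPS a frame gets ~33.3 ms of GPU time; 2.5 ms of MLAA leaves ~30.8 ms.
print(round(remaining_budget_ms(30, 2.5), 1))  # 30.8
```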

 

Also this is the third time in the last month that I've seen an article on this on VGChartz.

Thanks for sharing. This was a good explanation on what MLAA actually does.




Oblivion86 said:
Raider84 said:
6 years late to the party with a watered-down version of anti-aliasing. I don't think anyone will notice the changes; they will be so slight and subtle.

Yeah, even though their version of MLAA is said to be up to 3 times as fast as the Cell's. I guess not everyone can read correctly though.


Excuse me? The changes will be so subtle they will most likely be unnoticeable to the human eye. I don't understand why you are bringing up the Cell? I didn't say anything regarding that. I guess you're right, not everyone can read correctly (;



Raider84 said:
Oblivion86 said:
Raider84 said:
6 years late to the party with a watered-down version of anti-aliasing. I don't think anyone will notice the changes; they will be so slight and subtle.

Yeah, even though their version of MLAA is said to be up to 3 times as fast as the Cell's. I guess not everyone can read correctly though.


Excuse me? The changes will be so subtle they will most likely be unnoticeable to the human eye. I don't understand why you are bringing up the Cell? I didn't say anything regarding that. I guess you're right, not everyone can read correctly (;

That's the big change, and that's why it's going to be such a boost: the GPU version computes it in ~1.2 ms, while the Cell, which has had the tech for a few years, computes it in ~3.8 ms. I'm saying this is not a watered-down version; it's an improvement on what consoles can do.
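Those two timings (taken as quoted above; I haven't verified them) also line up with the "up to 3 times as fast" claim earlier in the thread:

```python
# Speedup implied by the quoted timings: Cell MLAA at 3.8 ms per frame
# versus the GPU implementation at 1.2 ms per frame.
cell_ms, gpu_ms = 3.8, 1.2
speedup = cell_ms / gpu_ms
print(round(speedup, 1))  # 3.2 — roughly the "up to 3x" figure
```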



It doesn't matter; I have a PS3 and jagged edges give me the sh!ts. Every game I play has some sort of jagged edges, which is really annoying since I'm picky with my graphics :P especially L.A. Noire :(



I don't think it makes much of a difference, due to the fact that the PS3 hardly edges out the 360 in terms of graphics anyway, so this little upgrade won't do much, and unless you're an omega nerd you won't notice any graphical differences.




An article by Digital Foundry was put up yesterday ( http://www.gamesindustry.biz/articles/digitalfoundry-tech-focus-mlaa-heads-for-360-pc ) explaining the process. It was also accompanied by a video comparison of numerous games running side by side with and without it, and it does make a fairly large difference; definitely noticeable.



Potable_Toe said:

An article by Digital Foundry was put up yesterday ( http://www.gamesindustry.biz/articles/digitalfoundry-tech-focus-mlaa-heads-for-360-pc ) explaining the process. It was also accompanied by a video comparison of numerous games running side by side with and without it, and it does make a fairly large difference; definitely noticeable.

The Digital Foundry article was put up in Oct 2010 ( http://www.eurogamer.net/articles/digitalfoundry-mlaa-360-pc-article ). This is old news.

And the reason why MLAA is not easy to use for PC/360 games (except Fable 3):

"AMD's solution has much in common with Sony's, but is fundamentally different in many ways. The fact that the God of War III MLAA operates on SPU has some very specific advantages - the Cell's satellite processors are far more flexible in terms of how they can be programmed, leading some to believe that GPU implementations will struggle to match the quality level.

More obvious to the end user, AMD's approach is a post-process filter that works through the entire completed frame, including the HUD and any on-screen text. This results in exactly the same kind of artifacting on text as seen with The Saboteur on PS3. As the MLAA algorithm works on the whole screen, it simply doesn't know the difference between a genuine edge and text, resulting in a noticeable impact to quality, along with occasional dot-crawl on HUD elements.

Artifacts can be minimised by running at higher resolutions, and the cards that the MLAA mode runs on should be able to cope with most - if not all - games at 1080p and higher anyway. However, the effects can never be eliminated with the current implementation. Sony's MLAA tech, in contrast, works on the frame before the HUD and text are added, producing a noticeably cleaner result. "
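The ordering difference the article describes can be sketched like this (hypothetical pipeline stage names, made up purely for illustration — the point is only *where* the filter runs relative to HUD compositing):

```python
def compose(frame, hud):
    # Hypothetical composite: HUD "pixels" overwrite frame pixels where present.
    return [h if h is not None else f for f, h in zip(frame, hud)]

def mlaa(frame):
    # Stand-in for the edge filter: tags every pixel it touches.
    return [f"blurred({p})" for p in frame]

scene = ["sky", "wall"]
hud = [None, "ammo:30"]

# AMD-style: filter runs on the completed frame -> HUD text gets filtered too.
amd = mlaa(compose(scene, hud))
# Sony-style: filter runs before the HUD is added -> HUD text stays crisp.
sony = compose(mlaa(scene), hud)
print(amd)   # ['blurred(sky)', 'blurred(ammo:30)']
print(sony)  # ['blurred(sky)', 'ammo:30']
```

Same filter, same scene; the artifacting on text comes entirely from running the pass after compositing.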



nikosx said:
Potable_Toe said:

An article by Digital Foundry was put up yesterday ( http://www.gamesindustry.biz/articles/digitalfoundry-tech-focus-mlaa-heads-for-360-pc ) explaining the process. It was also accompanied by a video comparison of numerous games running side by side with and without it, and it does make a fairly large difference; definitely noticeable.

The Digital Foundry article was put up in Oct 2010 ( http://www.eurogamer.net/articles/digitalfoundry-mlaa-360-pc-article ). This is old news.

And the reason why it's not easy to use for PC/360 games (except Fable 3):

"AMD's solution has much in common with Sony's, but is fundamentally different in many ways. The fact that the God of War III MLAA operates on SPU has some very specific advantages - the Cell's satellite processors are far more flexible in terms of how they can be programmed, leading some to believe that GPU implementations will struggle to match the quality level.

More obvious to the end user, AMD's approach is a post-process filter that works through the entire completed frame, including the HUD and any on-screen text. This results in exactly the same kind of artifacting on text as seen with The Saboteur on PS3. As the MLAA algorithm works on the whole screen, it simply doesn't know the difference between a genuine edge and text, resulting in a noticeable impact to quality, along with occasional dot-crawl on HUD elements.

Artifacts can be minimised by running at higher resolutions, and the cards that the MLAA mode runs on should be able to cope with most - if not all - games at 1080p and higher anyway. However, the effects can never be eliminated with the current implementation. Sony's MLAA tech, in contrast, works on the frame before the HUD and text are added, producing a noticeably cleaner result. "


So it was; well, I noticed it yesterday when it popped up on the homepage of GI.biz... Either way, it details the technique for people and shows the results of MLAA processed on a GPU. Plus it makes for a good read.



If the MLAA looks similar to the PS3's, then no, you won't notice any difference.