| brendude13 said: Hmm yeah. I know a fair amount about technology and the PS3's SPUs and stuff, but I have no idea what the 10 MB eDRAM does. And a guy just a few pages back said that the so-called "PS3 gameplay" was from an old trailer and was therefore not polished and tweaked etc. All I think we can do is wait for the LoT and DF comparisons. |
FYI you can change the settings so you can have up to 100 comments per page if you want.
The 360's memory architecture is unified and built around eDRAM. It has 512 MB shared between CPU and GPU, which developers can allocate as they see fit, plus a further 10 MB of eDRAM. That gives the 360 absolutely tons of bandwidth, which should make programming a breeze when it comes to adding extras like motion blur, AA, transparencies etc. But there's an issue with the 10 MB size: it isn't big enough for a 720p image (the industry HD standard) plus all the extras. So developers either go sub-HD to enable AA, motion blur, transparencies, dynamic shadows etc, or go 720p and downgrade or eliminate this, that or the other.

PGR 3/4, Alan Wake, Splinter Cell: Conviction, Crysis 2 and Halo 3/ODST/Reach are prime examples of developers deciding to go sub-HD but keep all the other bells and whistles. Gears 1/2/3, Bulletstorm, Mass Effect 1 & 2, GTA IV and RDR are prime examples of developers taking the 720p route but drastically cutting back on the AA, motion blur etc.

As to which games look better, 360 gamers tend to be divided. Many think Alan Wake is one of the best-looking 360 games despite running at 540p, while others think 720p is a must for 7th-gen gaming.
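To see why 10 MB is so tight, here's some back-of-the-envelope arithmetic (a Python sketch; the byte counts are assumptions, i.e. 4 bytes of colour plus 4 bytes of depth/stencil per sample, and real games juggle extra buffers on top of this):

```python
# Rough framebuffer sizes vs. the 360's 10 MB of eDRAM.
# Assumption: 4 bytes colour + 4 bytes depth/stencil per sample;
# MSAA multiplies the number of stored samples per pixel.

EDRAM_MB = 10

def framebuffer_mb(width, height, msaa=1, bytes_per_sample=8):
    """Size of colour + depth/stencil buffers in MiB."""
    return width * height * msaa * bytes_per_sample / (1024 * 1024)

for label, w, h in [("720p", 1280, 720), ("sub-HD", 1024, 600), ("540p", 960, 540)]:
    for msaa in (1, 2, 4):
        size = framebuffer_mb(w, h, msaa)
        verdict = "fits" if size <= EDRAM_MB else "needs tiling"
        print(f"{label:7s} {msaa}xMSAA: {size:5.1f} MB -> {verdict}")
```

The sub-HD rows show the trade-off: drop the resolution and 2xMSAA squeezes into the 10 MB in a single pass, while 720p with any MSAA does not.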
There is a way round the eDRAM framebuffer issue, and that's the predicated tiling programming method, where the frame is rendered in tiles that each fit in the 10 MB. The problem with tiling is that it takes time and eats into CPU resources, since geometry has to be re-submitted for each tile, so most developers avoid this route. Good thing too, unless you want to be playing games at 15-20 fps; the 360's CPU is already being pushed to its limit.
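For a rough idea of the cost, here's a hypothetical tile-count estimate (same assumed 8 bytes per sample as above; the real predicated tiling scheme splits the render target into screen-space rectangles and re-runs the relevant draw calls once per tile):

```python
import math

# Hypothetical estimate of how many eDRAM-sized tiles a framebuffer
# splits into under predicated tiling. Each extra tile means the
# scene geometry is submitted an extra time.

EDRAM_BYTES = 10 * 1024 * 1024

def tiles_needed(width, height, msaa, bytes_per_sample=8):
    """Number of 10 MB eDRAM tiles the framebuffer requires."""
    total_bytes = width * height * msaa * bytes_per_sample
    return math.ceil(total_bytes / EDRAM_BYTES)

print(tiles_needed(1280, 720, 1))  # 1 -> single pass
print(tiles_needed(1280, 720, 2))  # 2 -> geometry submitted twice
print(tiles_needed(1280, 720, 4))  # 3 -> geometry submitted three times
```

That per-tile geometry re-submission is exactly where the extra CPU/GPU work comes from, which is why so many studios chose sub-HD over tiling.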