fatslob-:O said:
curl-6 said:
fatslob-:O said:

The performance hit of tiling is small, but once it goes past 16 or so tiles it can become problematic. No developer to my knowledge has ever attempted that many; most of the time they opted for 2 or 4 tiles. Why would a CPU be important for the purposes of rendering?

You don't know the definition of a "bottleneck"... A bottleneck is something that limits a system's potential, not just anything that holds it back. 10MB of eDRAM is enough as it is for 720p resolutions using a tiling solution. A 720p framebuffer is around 7.3MB, and the depth buffer is always the same size as the framebuffer, even when multisampled. So a 720p 2x multisampled framebuffer with its depth buffer can be rendered in the 10MB of eDRAM using 3 tiles.
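To put rough numbers on it, here's a back-of-the-envelope sketch. It assumes the depth buffer is the same size as the colour buffer and both scale with the MSAA sample count; the ~7.3MB figure above corresponds to 8 bytes per pixel (e.g. FP16 HDR), while common 32-bit colour works out to half that:

```c
#include <stdio.h>
#include <math.h>

/* Rough framebuffer sizing in decimal megabytes. Assumptions: the
   depth buffer is the same size as the colour buffer, both scale with
   the MSAA sample count, and tiles are 10MB (Xbox 360 eDRAM). */
static double buffer_mb(int w, int h, int bytes_per_px, int samples)
{
    return (double)w * h * bytes_per_px * samples / 1e6;
}

int main(void)
{
    const double edram_mb = 10.0; /* Xbox 360 eDRAM */

    for (int bpp = 4; bpp <= 8; bpp += 4) {
        double colour = buffer_mb(1280, 720, bpp, 2); /* 720p, 2x MSAA */
        double total  = colour * 2;                   /* + equal depth */
        printf("%d B/px colour: %4.1f MB colour+depth -> %d tiles\n",
               bpp, total, (int)ceil(total / edram_mb));
    }
    return 0;
}
```

At 4 bytes per pixel that comes to about 14.7MB and 2 tiles; at 8 bytes per pixel it's about 29.5MB and 3 tiles.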

So really, when you think about it, the Xbox 360's solution is arguably better.

10MB does limit potential on the system; with 32MB it could do more.

On Wii U, a 1080p frame takes only half the eDRAM.
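That's easy to check, assuming 32-bit (4-byte) colour and an equally sized depth buffer with no MSAA:

```c
#include <stdio.h>

/* Minimal check: 1080p colour + depth against the Wii U's 32MB of
   eDRAM, assuming 4 bytes each per pixel, decimal megabytes. */
int main(void)
{
    double frame_mb = 1920.0 * 1080 * (4 + 4) / 1e6; /* ~16.6 MB */
    printf("1080p colour+depth: %.1f MB of 32 MB (%.1f MB left)\n",
           frame_mb, 32.0 - frame_mb);
    return 0;
}
```

Roughly 16.6MB of the 32MB, i.e. about half, leaving ~15.4MB free.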

Once again, 10MB of eDRAM is NOT a bottleneck. It doesn't limit potential; it adds more potential. More memory =/= better memory.

Despite the fact that the Wii U's eDRAM has higher bandwidth, it's still not better by any means, because the 8 ROPs on the Xbox 360's eDRAM give it the edge.

Yes, the Wii U may be able to store a full 1080p frame, but that's moot because most Wii U games end up at 720p or lower, so having 32MB of eDRAM was in vain.

32MB was not in vain; there are 1080p Wii U games, and with a 720p frame there is even more space left over for framebuffer operations or CPU/GPU tasks. 

Having to use an inefficient workaround just to render higher resolution frames (as on the 360) is a bottleneck. With more eDRAM, it could render those frames intact and still have bonus space to use for other things.