
Digital Foundry: Hands-on with Bayonetta 2

SubiyaCryolite said:

You're right, there's no way it'll get patched so close to launch. I guess I was trying to be "optimistic" to avoid being labelled a hater right out of the gate. What I don't understand is how the Xbox 360 was able to handle MSAA in 720p/60fps games like Forza and Street Fighter IV with just 10MB of eDRAM. What are devs using the Wii U's 32MB of eDRAM for? Why won't Nintendo turn on decent AA in its own offerings?

Because Nintendo has an incredibly conservative approach to graphics technology, born of working with DX7-era hardware up until two years ago. Hell, if the Wii U were as powerful as the PS4, their games would probably still have simple or no AA; they'd just be 1080p.

On 360, devs like Turn 10 actively tried to push the hardware as far as it would go. On Wii U, almost nobody is even interested in exploring its limits. Nobody with the technical prowess to actually do so, anyway.
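As context for the eDRAM question quoted above, here's a rough back-of-the-envelope sketch of raw 720p framebuffer sizes. The RGBA8 color and D24S8 depth formats are illustrative assumptions; actual games vary:

```python
# Ballpark 720p framebuffer sizes vs. eDRAM capacity.
# Assumes 4 B/px RGBA8 color and 4 B/px D24S8 depth/stencil;
# real titles may pick other formats, so treat these as rough numbers.
WIDTH, HEIGHT = 1280, 720
BYTES_COLOR = 4
BYTES_DEPTH = 4

def fb_mib(msaa_samples):
    """Color + depth for one 720p render target at the given MSAA level."""
    return WIDTH * HEIGHT * (BYTES_COLOR + BYTES_DEPTH) * msaa_samples / 1024**2

for samples in (1, 2, 4):
    print(f"{samples}x MSAA: {fb_mib(samples):.1f} MiB")
# -> 1x: 7.0 MiB, 2x: 14.1 MiB, 4x: 28.1 MiB
```

By this math, 720p with even 2x MSAA overflows the 360's 10MB, which is why 360 games typically split the frame into tiles to apply MSAA; the Wii U's 32MB could in principle hold a 4x MSAA target in one piece.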



fatslob-:O said:
SubiyaCryolite said:

You're right, there's no way it'll get patched so close to launch. I guess I was trying to be "optimistic" to avoid being labelled a hater right out of the gate. What I don't understand is how the Xbox 360 was able to handle MSAA in 720p/60fps games like Forza and Street Fighter IV with just 10MB of eDRAM. What are devs using the Wii U's 32MB of eDRAM for? Why won't Nintendo turn on decent AA in its own offerings?

I don't know ... There are all sorts of potential issues on the Wii U. The most expensive stage in a graphics workload is pixel shading, and this stage depends on more than just raw shading power; bandwidth matters too. It's a definite possibility that the Wii U is facing a lack of shading power, a lack of bandwidth to feed the shading units, or both at once, and those constraints could be why Nintendo's developers and partners omit anti-aliasing altogether in order to meet their frame time budget of roughly 16.7ms at 60fps. 12.8 GB/s of bandwidth isn't what any developer would call ideal conditions for a platform. Another reason, which ties into the first, could be that even though FXAA is fairly cheap to apply elsewhere, that may not be the case on the Wii U, since a lack of shading power could stop it from being applied in a timely fashion.

Anti-aliasing depends on more than just the amount of eDRAM. What's the point in even having more memory when the rest of the system is, to put it lightly, "frail"? It's like people buying into the hype that a graphics card with a puny die and 4GB of video memory must be superior to a graphics card with a monstrously large die and only 3GB of video memory.

12.8 GB/s is only for main RAM; eDRAM bandwidth will be vastly higher.

Also, plenty of Wii U games have shown it can handle methods like FXAA and MSAA.
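To put the 12.8 GB/s figure and the 60fps frame budget from the quoted post in perspective, a minimal sketch; the FXAA tap count is an invented illustrative number:

```python
# Per-frame main RAM budget for a 60 fps title on a 12.8 GB/s bus.
# Cache hits, eDRAM traffic, and texture compression mean real games
# see very different effective numbers; this is only ballpark math.
MAIN_RAM_BW = 12.8e9            # bytes/second
FPS = 60
frame_time_ms = 1000 / FPS      # ~16.7 ms
bytes_per_frame = MAIN_RAM_BW / FPS

print(f"Frame budget: {frame_time_ms:.1f} ms")
print(f"Main RAM traffic budget per frame: {bytes_per_frame / 1e6:.0f} MB")

# A single 720p RGBA8 buffer is ~3.7 MB. A post-process pass like FXAA
# that samples the color buffer several times per pixel eats a visible
# slice of that budget if the buffer can't stay in eDRAM.
fb_bytes = 1280 * 720 * 4
taps = 5                        # hypothetical average taps per pixel
fxaa_traffic = fb_bytes * taps
print(f"FXAA read traffic at ~{taps} taps: {fxaa_traffic / 1e6:.0f} MB "
      f"({fxaa_traffic / bytes_per_frame:.0%} of the per-frame budget)")
```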



curl-6 said:

12.8 GB/s is only for main RAM; eDRAM bandwidth will be vastly higher.

Also, plenty of Wii U games have shown it can handle methods like FXAA and MSAA.

The main RAM matters a lot, since that's where you'll be storing the textures and materials.

I realize that there are a few Wii U games which have FXAA and MSAA ...



curl-6 said:

Because Nintendo has an incredibly conservative approach to graphics technology, born of working with DX7-era hardware up until two years ago. Hell, if the Wii U were as powerful as the PS4, their games would probably still have simple or no AA; they'd just be 1080p.

On 360, devs like Turn 10 actively tried to push the hardware as far as it would go. On Wii U, almost nobody is even interested in exploring its limits. Nobody with the technical prowess to actually do so, anyway.

I don't know about that, since the PS4 is pretty much capable of pulling off FXAA in less than a millisecond per frame. You might have a point with MSAA, but I find that fairly doubtful given that the PS4 has 32 ROPs and tons of bandwidth to handle it.
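A quick sanity check on the FXAA/MSAA point, using the commonly cited PS4 figures (800MHz GPU clock, 32 ROPs, 176 GB/s GDDR5); the rest is back-of-the-envelope assumption:

```python
# Rough headroom check for MSAA on PS4-class hardware.
GPU_CLOCK = 800e6        # Hz, commonly cited figure
ROPS = 32
BANDWIDTH = 176e9        # bytes/second, GDDR5
FPS = 60
pixels_1080p = 1920 * 1080

peak_fill = ROPS * GPU_CLOCK                        # pixels/second
fullscreen_writes = peak_fill / FPS / pixels_1080p  # per 60 fps frame
print(f"Peak full-screen 1080p writes per frame: {fullscreen_writes:.0f}")

# 4x MSAA color at 1080p, RGBA8: 4 samples * 4 bytes per pixel.
msaa4_color = pixels_1080p * 4 * 4
resolve_bw_share = msaa4_color * FPS / BANDWIDTH
print(f"4x MSAA 1080p color buffer: {msaa4_color / 1e6:.0f} MB "
      f"(~{resolve_bw_share:.1%} of bandwidth to read it once per frame)")
```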



fatslob-:O said:
curl-6 said:

12.8 GB/s is only for main RAM; eDRAM bandwidth will be vastly higher.

Also, plenty of Wii U games have shown it can handle methods like FXAA and MSAA.

The main RAM matters a lot, since that's where you'll be storing the textures and materials.

I realize that there are a few Wii U games which have FXAA and MSAA ...

But eDRAM is where you (should) be storing data that requires very fast access. It's big enough that you can use it for a lot more than just stashing the framebuffer, and on Wii U both the CPU and GPU have direct access to eDRAM.

For example, Shin'en have talked about how they used it for: "the actual framebuffers, intermediate framebuffer captures, as a fast scratch memory for some CPU intense work and for other GPU memory writes."
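As a purely hypothetical illustration of how those categories might carve up 32MB at 720p; none of the sizes below come from Shin'en:

```python
# One invented way to budget 32 MiB of eDRAM along the lines Shin'en
# describe. None of these sizes are from Shin'en; they only show that
# the categories fit comfortably at 720p.
edram_budget_mib = {
    "720p color + depth, double buffered": 14.0,
    "intermediate framebuffer capture":     3.5,
    "CPU scratch memory":                   4.0,
    "misc GPU write targets":               6.0,
}

for name, size in edram_budget_mib.items():
    print(f"{name:38s} {size:5.1f} MiB")
print(f"{'total':38s} {sum(edram_budget_mib.values()):5.1f} MiB of 32 MiB")
```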



fatslob-:O said:
curl-6 said:

Because Nintendo has an incredibly conservative approach to graphics technology, born of working with DX7-era hardware up until two years ago. Hell, if the Wii U were as powerful as the PS4, their games would probably still have simple or no AA; they'd just be 1080p.

On 360, devs like Turn 10 actively tried to push the hardware as far as it would go. On Wii U, almost nobody is even interested in exploring its limits. Nobody with the technical prowess to actually do so, anyway.

I don't know about that, since the PS4 is pretty much capable of pulling off FXAA in less than a millisecond per frame. You might have a point with MSAA, but I find that fairly doubtful given that the PS4 has 32 ROPs and tons of bandwidth to handle it.

That's kind of what I mean though, even where the hardware is capable, Nintendo don't seem to really take advantage of that capability.



curl-6 said:

But eDRAM is where you (should) be storing data that requires very fast access. It's big enough that you can use it for a lot more than just stashing the framebuffer, and on Wii U both the CPU and GPU have direct access to eDRAM.

For example, Shin'en have talked about how they used it for: "the actual framebuffers, intermediate framebuffer captures, as a fast scratch memory for some CPU intense work and for other GPU memory writes."


I'm almost certain that most of the graphics workload doesn't depend on low-latency access; otherwise you'd see PC developers going apeshit about the memory latencies of GDDR5. A few workloads that land on the CPU, though, do depend on memory latency.

I'm not saying that the eDRAM is useless per se, but it's mostly ideal for framebuffer and CPU performance situations, so I still think that main RAM is more important for rendering in both the Wii U's and the X1's case.
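The latency-tolerance point can be made with a crude model; both cycle counts below are invented for illustration, not measured figures for any particular chip:

```python
# Crude model of why GPUs tolerate high memory latency: the scheduler
# swaps in other wavefronts while one waits on memory. Both numbers
# below are illustrative assumptions.
MEM_LATENCY_CYCLES = 400   # assumed round trip to memory
ISSUE_CYCLES = 4           # assumed cycles each wavefront holds the SIMD

wavefronts_needed = MEM_LATENCY_CYCLES / ISSUE_CYCLES
print(f"Wavefronts needed to hide {MEM_LATENCY_CYCLES} cycles: "
      f"{wavefronts_needed:.0f}")

# A CPU chasing pointers can't hide latency this way: each load's
# address depends on the previous load, so the latency is paid in full.
```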



fatslob-:O said:
curl-6 said:

But eDRAM is where you (should) be storing data that requires very fast access. It's big enough that you can use it for a lot more than just stashing the framebuffer, and on Wii U both the CPU and GPU have direct access to eDRAM.

For example, Shin'en have talked about how they used it for: "the actual framebuffers, intermediate framebuffer captures, as a fast scratch memory for some CPU intense work and for other GPU memory writes."

I'm almost certain that most of the graphics workload doesn't depend on low-latency access; otherwise you'd see PC developers going apeshit about the memory latencies of GDDR5. A few workloads that land on the CPU, though, do depend on memory latency.

I'm not saying that the eDRAM is useless per se, but it's mostly ideal for framebuffer and CPU performance situations, so I still think that main RAM is more important for rendering in both the Wii U's and the X1's case.

Being able to use it for non-framebuffer GPU writes and intermediate framebuffer captures seems pretty useful. Shin'en also pointed out it's great for deferred rendering because at 720p you can comfortably store multiple framebuffers in eDRAM.
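For a rough sense of the numbers behind that, a sketch with a typical-looking G-buffer layout; the targets and formats are assumptions, not any specific Wii U title's setup:

```python
# Why 720p deferred rendering fits comfortably in 32 MiB of eDRAM.
# The layout (four RGBA8 targets + depth + an RGBA16F light buffer)
# is an illustrative assumption.
MIB = 1024 ** 2
pixels = 1280 * 720

gbuffer = pixels * (4 * 4 + 4)   # four RGBA8 targets + D24S8 depth
light_accum = pixels * 8         # RGBA16F light accumulation

print(f"G-buffer:           {gbuffer / MIB:5.1f} MiB")
print(f"Light accumulation: {light_accum / MIB:5.1f} MiB")
print(f"Total:              {(gbuffer + light_accum) / MIB:5.1f} MiB of 32 MiB")
```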



curl-6 said:

Being able to use it for non-framebuffer GPU writes and intermediate framebuffer captures seems pretty useful. Shin'en also pointed out it's great for deferred rendering because at 720p you can comfortably store multiple framebuffers in eDRAM.

Where did they say that?



fatslob-:O said:
curl-6 said:

Being able to use it for non-framebuffer GPU writes and intermediate framebuffer captures seems pretty useful. Shin'en also pointed out it's great for deferred rendering because at 720p you can comfortably store multiple framebuffers in eDRAM.

Where did they say that?