
Forums - Nintendo Discussion - Digital Foundry: Hands-on with Bayonetta 2

Shin'en games look like crap. They can say whatever they want, but they haven't delivered anything impressive on Wii U.

Moderated,

-Mr Khan



curl-6 said:

Yes, it's great that you can store multiple render targets in the eDRAM, but the performance you get ultimately comes down to how fast you can shade the surfaces in the frame, and to do that you must fetch the material data stored in main RAM and pass it through a bunch of SIMD lanes to run the pixel shading program. What Shin'en is describing is a fine and dandy speedup of the lighting step in deferred rendering, but what truly bogs down the graphics pipeline is the shading, so the main RAM is still more important to the main process. 
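That bandwidth cost can be roughed out with some back-of-envelope arithmetic. Every figure below (fetch counts, overdraw, texture format) is an illustrative assumption, not a real Wii U spec:

```python
# Back-of-envelope estimate of per-frame material texture traffic in a
# deferred renderer, to show why main RAM bandwidth matters for shading.
# Every figure here is an illustrative assumption, not a real Wii U spec.

WIDTH, HEIGHT = 1280, 720           # 720p framebuffer
BYTES_PER_TEXEL = 4                 # e.g. RGBA8 material textures
FETCHES_PER_PIXEL = 8               # albedo, normal, specular maps, etc.
OVERDRAW = 1.5                      # average shaded fragments per pixel
FPS = 60

pixels = WIDTH * HEIGHT
bytes_per_frame = pixels * OVERDRAW * FETCHES_PER_PIXEL * BYTES_PER_TEXEL
gb_per_second = bytes_per_frame * FPS / 1e9

print(f"material fetch traffic: ~{gb_per_second:.2f} GB/s from main RAM")
```

Even with these modest assumed numbers, the material fetches alone chew through a couple of GB/s, which is why a fast lighting pass in eDRAM doesn't make main RAM speed irrelevant.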



fatslob-:O said:
curl-6 said:

Yes, it's great that you can store multiple render targets in the eDRAM, but the performance you get ultimately comes down to how fast you can shade the surfaces in the frame, and to do that you must fetch the material data stored in main RAM and pass it through a bunch of SIMD lanes to run the pixel shading program. What Shin'en is describing is a fine and dandy speedup of the lighting step in deferred rendering, but what truly bogs down the graphics pipeline is the shading, so the main RAM is still more important to the main process. 

The main performance hitch in Wii U games (besides a lack of optimization) doesn't so much seem to be heavy use of shaders as scenes with tons of AI/animations/physics going off at once, which comes back to its lower clocked CPU.

Bayo 2 vs the 360 version of Bayo 1 seems to be a fairly good representation of the two consoles; same resolution/AA setup, but the Wii U's RAM and GPU advantages net it larger and more detailed worlds, and v-sync.



curl-6 said:

The main performance hitch in Wii U games (besides a lack of optimization) doesn't so much seem to be heavy use of shaders as scenes with tons of AI/animations/physics going off at once, which comes back to its lower clocked CPU.

Bayo 2 vs the 360 version of Bayo 1 seems to be a fairly good representation of the two consoles; same resolution/AA setup, but the Wii U's RAM and GPU advantages net it larger and more detailed worlds, and v-sync.

You're underestimating the impact of shading performance ... Almost every effect you see in games, such as lighting, shading, ambient occlusion/global illumination, depth of field, and even post-process anti-aliasing, is done by the pixel shader! Hell, some GPUs are smart enough that they're capable of ordering the fragments in the UAVs to provide programmable blending through the pixel shader! 

Sure, the CPU could be an issue, but animations are mostly handled through the vertex shading stage of the GPU, and physics can be GPU-accelerated so that the CPU doesn't have to worry about that aspect. 

Bayonetta 2's world can be attributed to having more RAM, but as for the GPU, there was a clear trade-off in framerate in order for that to also happen. 
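The claim that animation runs in the vertex stage refers to skeletal skinning, where each vertex is blended between its bones' transforms. A minimal CPU-side sketch of that per-vertex math (the bone matrices and weights here are made up purely for illustration) looks like:

```python
# A tiny CPU reference of linear blend skinning, the per-vertex math that the
# vertex shader stage runs when skeletal animation is moved onto the GPU.
# The bone matrices and weights below are made up purely for illustration.

def skin_vertex(v, bone_matrices, weights):
    """Blend a vertex by its bone transforms: v' = sum_i w_i * (M_i @ v)."""
    out = [0.0, 0.0, 0.0]
    for m, w in zip(bone_matrices, weights):
        # Apply a 3x4 bone matrix (rotation + translation) to the vertex.
        for row in range(3):
            out[row] += w * (m[row][0] * v[0] + m[row][1] * v[1]
                             + m[row][2] * v[2] + m[row][3])
    return out

# Two bones: identity, and a +1.0 translation on x, weighted 50/50.
identity = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]]
shifted  = [[1, 0, 0, 1], [0, 1, 0, 0], [0, 0, 1, 0]]
print(skin_vertex([2.0, 0.0, 0.0], [identity, shifted], [0.5, 0.5]))
```

On a GPU this blend runs once per vertex in the vertex shader, with the bone matrices uploaded per frame, which is why the per-vertex cost doesn't land on the CPU.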



fatslob-:O said:
curl-6 said:

The main performance hitch in Wii U games (besides a lack of optimization) doesn't so much seem to be heavy use of shaders as scenes with tons of AI/animations/physics going off at once, which comes back to its lower clocked CPU.

Bayo 2 vs the 360 version of Bayo 1 seems to be a fairly good representation of the two consoles; same resolution/AA setup, but the Wii U's RAM and GPU advantages net it larger and more detailed worlds, and v-sync.

You're underestimating the impact of shading performance ... Almost every effect you see in games, such as lighting, shading, ambient occlusion/global illumination, depth of field, and even post-process anti-aliasing, is done by the pixel shader! Hell, some GPUs are smart enough that they're capable of ordering the fragments in the UAVs to provide programmable blending through the pixel shader! 

Sure, the CPU could be an issue, but animations are mostly handled through the vertex shading stage of the GPU, and physics can be GPU-accelerated so that the CPU doesn't have to worry about that aspect. 

Bayonetta 2's world can be attributed to having more RAM, but as for the GPU, there was a clear trade-off in framerate in order for that to also happen. 

The CPU still has to run the animation skeletons and AI though. Dips generally correlate with large numbers of NPCs.

And even with bigger and more detailed worlds, higher poly characters, and more elaborate setpieces, Bayo 2 isn't that far off the 360 version of Bayo 1 in framerate, but with none of the rampant screen tearing that plagued the 360. (Which makes sense; a more robust GPU wouldn't have to submit incomplete frames to keep the refresh rate up)




All I read was like
FPS, FPS, FPS, FPS.



Pocky Lover Boy! 

curl-6 said:

The CPU still has to run the animation skeletons and AI though. Dips generally correlate with large numbers of NPCs.

And even with bigger and more detailed worlds, higher poly characters, and more elaborate setpieces, Bayo 2 isn't far off the 360 version of Bayo 1 in framerate, but with none of the rampant screen tearing that plagued the 360. (Which makes sense; a more robust GPU wouldn't have to submit incomplete frames to keep the refresh rate up)

Skeletal animations can be performed on the GPU ... You can run ANY vertex program on the GPU, so you don't have to worry about animations or transformations. 

"Dips generally correlate with large numbers of NPCs." Yet Bayonetta 2 is having framerate issues with what appears to be less than 10 enemies at a time ? 

Depends on what you mean by "far off" ... Bayonetta on the Xbox 360 has a 10 FPS advantage over Bayonetta 2 on the Wii U in terms of gameplay. 

@Bold What you're saying makes no sense ... Screen tearing is caused by the display's refresh rate being out of synchronization with the video card's frame rate, not by the fact that a frame isn't fully drawn. 

In case you haven't noticed, v-sync is a double-edged sword that can cause stuttering in games with variable framerates, so a higher framerate doesn't necessarily translate to a better gameplay experience: frame pacing issues will plague controller response, since v-sync forces the display to refresh at 16 ms or 33 ms intervals. In some cases v-syncing a variable framerate isn't all that bad, as long as the game maintains a consistent 50+ FPS; stuttering becomes less noticeable when most frames are delivered in a 16 ms frame time rather than 33 ms, which masks the frame pacing issues, so games like COD Ghosts or The Last of Us Remastered can get a pass on using both v-sync and a variable framerate. However, games like Infamous Second Son and Bayonetta 2 need to be put on a 30 FPS leash, since they fail to maintain a comfortable 50+ FPS average, in order to give a better gameplay experience with v-sync. 
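The 16 ms / 33 ms quantisation being described can be sketched in a few lines: a frame that misses one refresh has to wait for the next. The render times below are made up for illustration:

```python
import math

# Sketch of why v-sync quantises frame delivery: a frame that misses one
# ~16.7 ms refresh must wait for the next, so any render time between
# 16.7 ms and 33.3 ms is presented at 33.3 ms. Render times are made up.

REFRESH_MS = 1000 / 60  # one 60 Hz display refresh

def presented_interval(render_ms):
    """Smallest whole number of refresh intervals that covers the render time."""
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render_ms in (12.0, 20.0, 30.0):
    shown = presented_interval(render_ms)
    print(f"{render_ms:.0f} ms render -> presented after {shown:.1f} ms")
```

Notice that 20 ms and 30 ms renders both land on 33.3 ms, which is exactly the frame pacing jump between "60 FPS" and "30 FPS" frames that makes an unlocked v-synced game feel stuttery.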



fatslob-:O said:
curl-6 said:

The CPU still has to run the animation skeletons and AI though. Dips generally correlate with large numbers of NPCs.

And even with bigger and more detailed worlds, higher poly characters, and more elaborate setpieces, Bayo 2 isn't far off the 360 version of Bayo 1 in framerate, but with none of the rampant screen tearing that plagued the 360. (Which makes sense; a more robust GPU wouldn't have to submit incomplete frames to keep the refresh rate up)

Skeletal animations can be performed on the GPU ... You can run ANY vertex program on the GPU, so you don't have to worry about animations or transformations. 

"Dips generally correlate with large numbers of NPCs." Yet Bayonetta 2 is having framerate issues with what appears to be less than 10 enemies at a time ? 

Depends on what you mean by "far off" ... Bayonetta on the Xbox 360 has a 10 FPS advantage over Bayonetta 2 on the Wii U in terms of gameplay. 

@Bold What you're saying makes no sense ... Screen tearing is caused by the display's refresh rate being out of synchronization with the video card's frame rate, not by the fact that a frame isn't fully drawn. 

In case you haven't noticed, v-sync is a double-edged sword that can cause stuttering in games with variable framerates, so a higher framerate doesn't necessarily translate to a better gameplay experience: frame pacing issues will plague controller response, since v-sync forces the display to refresh at 16 ms or 33 ms intervals. In some cases v-syncing a variable framerate isn't all that bad, as long as the game maintains a consistent 50+ FPS; stuttering becomes less noticeable when most frames are delivered in a 16 ms frame time rather than 33 ms, which masks the frame pacing issues, so games like COD Ghosts or The Last of Us Remastered can get a pass on using both v-sync and a variable framerate. However, games like Infamous Second Son and Bayonetta 2 need to be put on a 30 FPS leash, since they fail to maintain a comfortable 50+ FPS average, in order to give a better gameplay experience with v-sync. 

The CPU doesn't just sit there doing nothing, it's there for a reason. And I said "generally", not "in every case".

I'm aware of what causes screen tearing, but the reason it goes out of sync is because the GPU can't keep up. That's why it's associated with processing overload. The 360 tears quite heavily throughout Bayonetta 1's action sequences (and sometimes even when nothing's happening), in addition to dropping in actual framerate. The Wii U, on the other hand, maintains v-sync.
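The mechanism can be modeled crudely: with v-sync off, a buffer swap that lands mid-scanout splits the image at that scanline, which is where the visible tear appears. The timings and the 720-line screen below are assumptions for illustration only:

```python
# Toy model of tearing with v-sync off: a buffer swap that lands mid-scanout
# splits the displayed image at that scanline. Timings and the 720-line
# screen are assumptions for illustration only.

REFRESH_MS = 1000 / 60   # one 60 Hz scanout takes ~16.7 ms
SCANLINES = 720

def tear_line(swap_time_ms):
    """Scanline where the new frame takes over, for a swap t ms into scanout."""
    return int((swap_time_ms % REFRESH_MS) / REFRESH_MS * SCANLINES)

# A swap ~8 ms into the scanout tears roughly mid-screen.
print(tear_line(8.0))
```

A GPU that always finishes before the scanout starts never swaps mid-refresh, which is why a stronger GPU can hold v-sync where a weaker one tears.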

If both were v-sync'd we would likely see 360 dropping far more severely than Wii U despite running smaller and less detailed worlds, lower poly characters, and less ambitious setpieces.

Personally, I'll take 40-60 FPS performance over a locked 30 FPS for this kind of game.



curl-6 said:
amak11 said:
maximo said:


The Wii U GPU has fewer GFLOPS than the 360/PS3, and a weaker CPU. Even though it has some advantages, it also has some disadvantages. It's really a poor excuse for a next generation console; it's more like a last gen console with a GamePad.


It has a GPU that makes up for the weaker CPU, so I suggest you rephrase what you said before you get chewed out by every other member here. You don't want your first post to be your last. 

It's significantly better than a 360, and it shows. 

It's not his first post, he's the 19th alt account of a truly tragic individual whose life seems to revolve around bashing the Wii U.

Ah, makes sense.



SubiyaCryolite said:

It's an improvement over its predecessor with a few framerate hiccups; hopefully these can be addressed before release. Bayonetta 2 is another Wii U game with no AA and poor anisotropic filtering. It seems that no dev will bother to use the 32MB of eDRAM for MSAA, which is a shame. Lastly, it's been confirmed to be 720p, so those unreasonable 1080p rumours can be put to rest.
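Whether 720p MSAA would even fit in 32MB can be sanity-checked with simple arithmetic. The buffer formats below (RGBA8 colour, 24/8 depth-stencil) are common assumptions, not confirmed Wii U details:

```python
# Rough check of what fits in a 32 MB on-chip buffer at 720p. The buffer
# formats (RGBA8 colour, 24-bit depth + 8-bit stencil) are common
# assumptions, not confirmed Wii U details.

MB = 1024 * 1024
WIDTH, HEIGHT = 1280, 720
pixels = WIDTH * HEIGHT

color = pixels * 4                     # RGBA8 back buffer
depth = pixels * 4                     # 24-bit depth + 8-bit stencil

plain  = (color + depth) / MB
msaa4x = (color * 4 + depth * 4) / MB  # 4x MSAA multiplies both buffers

print(f"720p no AA:   {plain:.2f} MB")
print(f"720p 4x MSAA: {msaa4x:.2f} MB")
```

By this estimate a 720p 4x MSAA setup (~28MB) would squeeze into 32MB, which is why the lack of MSAA reads as a developer choice rather than a hardware limit.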


Can someone explain to me what the 32MB of eDRAM is used for? I've seen it mentioned a couple of times but I don't understand the technical aspects of it. How can it be utilized? How can it improve the game?