
Shin'en: If you can't make great looking games on Wii U, it's not the hardware's fault

curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:

Err, you realize that Bayonetta 1 was 720p at 60fps on the Xbox 360.

And it had smaller, less detailed environments, and simpler lighting, shading, and visual effects compared to what we've seen of Bayonetta 2.

Meh.

The technical difference is quite distinct. Bayonetta 1 is one of my favourite games of the 7th gen, but Wii U's superior RAM and GPU really show when you compare it to Bayo 2, even in the latter's unfinished state.

I don't know about the superior GPU, but it does have more RAM, although slower RAM.



fatslob-:O said:
[...]

I don't know about the superior GPU, but it does have more RAM, although slower RAM.

Shin'en have already confirmed the GPU as being several generations (GPU generations, obviously) ahead of PS3/360. 

http://www.nintendolife.com/news/2013/05/shinen_wii_u_has_enough_power_for_years_to_come_gpu_is_several_generations_ahead_of_current_consoles

And the OP already addressed the slower RAM issue; "Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency."

Having more than 3 times as much eDRAM as 360 also alleviates the issue, because you can use that for things that require fast RAM access.
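
As a rough illustration of the sizes involved (back-of-envelope figures assuming 32-bit colour and 32-bit depth buffers, not official specs):

[code]
/* Back-of-envelope framebuffer cost at 720p, assuming 32-bit
 * colour (RGBA8) and 32-bit depth/stencil. Illustrative only. */
#include <stdio.h>

int main(void) {
    const double MB = 1024.0 * 1024.0;
    double colour = 1280 * 720 * 4 / MB;  /* ~3.5 MB */
    double depth  = 1280 * 720 * 4 / MB;  /* ~3.5 MB */
    printf("720p colour+depth: %.1f MB\n", colour + depth);  /* ~7.0 MB */
    /* ~7 MB nearly fills the 360's 10MB of eDRAM; in 32MB the same
     * buffers leave ~25 MB free for extra render targets and other
     * data that needs fast access. */
    return 0;
}
[/code]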



fatslob-:O said:
snowdog said:
http://www.neogaf.com/forum/showpost.php?p=74557043&postcount=7333

Yes, R700s are DX10.1, but as I've already mentioned, Latte bears absolutely no resemblance whatsoever to any R700. All you need to do is look at the die shot to see that. It appears to resemble Brazos more closely, with a bit of Llano thrown in too.

I've found a link to that changelog info now. There was also another changelog that showed both the PS4 and Wii U builds sharing another DX11-equivalent feature.

As for my fixed-function idea and third party engines: we know that Nintendo were working very closely with the big boys, so if they have included fixed functions you can bet that they'll integrate well with them, even if it means those engines have less freedom to use their own HDR, depth of field, etc.

Could be a reason why Epic have thrown their toys out of the pram regarding UE4 support for the Wii U, perhaps...?

@Bold LOL, that thing isn't anything like Brazos or Llano; otherwise, why are the CPU and GPU separate, when those processors have their GPUs integrated on the same die?

What part of "tessellation and depth of field aren't automatically handled by the Wii U's API" did you not understand this time? I said that every developer would need to implement their own solutions by modifying the engine to allow such features.



Have a look at the die shot comparisons in the Latte thread on GAF and all will become clear. It looks nothing like an R700. The closest in terms of the layout of transistors is Brazos.

And we have no idea how depth of field and HDR are handled by Latte. If they are handled by an evolved TEV unit, developers would have two choices: use the fixed functions, or spend valuable shaders implementing their own. Going by the analysis on GAF we're probably looking at 160-320 shaders, which isn't a great deal.

Nintendo must have taken steps to bridge the power gap between the Wii U, PS4 and One, and fixed functions fit the bill. There's a great deal about Latte that's a complete mystery.
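
To put those two options in concrete terms, here's a minimal sketch of the per-pixel maths a shader-based depth of field effect starts from; plain C is standing in for shader code, and the parameter names are invented, since nothing about Latte's actual API is public:

[code]
/* Hypothetical sketch: per-pixel circle of confusion (CoC) for a
 * shader-based depth of field pass. In a real engine this runs in
 * a pixel shader over the depth buffer; names are made up here. */
#include <math.h>

float circle_of_confusion(float depth,       /* pixel's scene depth  */
                          float focal_dist,  /* distance in focus    */
                          float focal_range, /* fully-sharp range    */
                          float max_coc)     /* clamp on blur radius */
{
    float d = fabsf(depth - focal_dist) - focal_range;
    if (d < 0.0f) d = 0.0f;             /* inside focus: stay sharp */
    float coc = d / focal_dist;         /* blur grows with distance */
    return coc > max_coc ? max_coc : coc;
}
[/code]

The CoC then drives how wide a blur kernel each pixel samples, which is where the per-pixel ALU cost comes from; a fixed-function unit doing the same job would leave those 160-320 shaders free for other work.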



snowdog said:
[...]

Have a look at the die shot comparisons in the Latte thread on GAF and all will become clear. It looks nothing like an R700. The closest in terms of the layout of transistors is Brazos.

And we have no idea how depth of field and HDR are handled by Latte. If they are handled by an evolved TEV unit, developers would have two choices: use the fixed functions, or spend valuable shaders implementing their own. Going by the analysis on GAF we're probably looking at 160-320 shaders, which isn't a great deal.

Nintendo must have taken steps to bridge the power gap between the Wii U, PS4 and One, and fixed functions fit the bill. There's a great deal about Latte that's a complete mystery.

I wouldn't put too much stock in NeoGAF's speculation; they're working without a lot of solid evidence. It should be viewed as what it is: guesswork.



It's not guesswork when they've been comparing and contrasting the Latte and Brazos die shots though, mate. The guesswork comes in when trying to identify groups of transistors, ROPs, ALUs, etc.

Latte doesn't look exactly like Brazos, but it's a great deal closer than an R700. Like I've said, it looks like Nintendo and AMD have made a Frankenstein creation, lol. It's a completely custom chip and we know next to nothing about it.



snowdog said:
It's not guesswork when they've been comparing and contrasting the Latte and Brazos die shots though, mate. The guesswork comes in when trying to identify groups of transistors, ROPs, ALUs, etc.

Latte doesn't look exactly like Brazos, but it's a great deal closer than an R700. Like I've said, it looks like Nintendo and AMD have made a Frankenstein creation, lol. It's a completely custom chip and we know next to nothing about it.

That's my point: some of it is by necessity estimation, because the actual specs are under NDA.



curl-6 said:
[...]

Shin'en have already confirmed the GPU as being several generations (GPU generations, obviously) ahead of PS3/360. 

http://www.nintendolife.com/news/2013/05/shinen_wii_u_has_enough_power_for_years_to_come_gpu_is_several_generations_ahead_of_current_consoles

And the OP already addressed the slower RAM issue; "Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency."

Having more than 3 times as much eDRAM as 360 also alleviates the issue, because you can use that for things that require fast RAM access.

Sure, the GPU can do more shader operations and output somewhat higher primitive counts for better character models and environments, but it doesn't beat the PS3 in texture fillrate.

GPU caches can only do so much. Plus, Microsoft tried the same approach, and look what happened to them. Eventually the GPU will need to access main memory, and it ain't gonna be pretty; the Wii U can't always depend on the eDRAM having the data it needs, considering it can only cache about 32MB at a time.

Storing textures in the eDRAM is out of the question unless they attempt tiled deferred rendering.
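
In outline, the tiling idea looks something like this (a hand-wavy sketch with invented function names, not any console's real API):

[code]
/* Hypothetical outline of tiled rendering: split the screen into
 * strips small enough that each strip's render targets fit in fast
 * on-chip memory (eDRAM), render strip by strip, then copy the
 * results out to main RAM. All function names below are invented. */
#define SCREEN_W 1280
#define SCREEN_H 720
#define TILE_H   240  /* chosen so one tile's buffers fit on-chip */

void bind_edram_render_target(int y, int w, int h);  /* hypothetical */
void draw_geometry_clipped_to_tile(int y, int h);    /* hypothetical */
void resolve_tile_to_main_ram(int y, int w, int h);  /* hypothetical */

void render_frame(void) {
    for (int y = 0; y < SCREEN_H; y += TILE_H) {
        bind_edram_render_target(y, SCREEN_W, TILE_H);
        draw_geometry_clipped_to_tile(y, TILE_H);
        resolve_tile_to_main_ram(y, SCREEN_W, TILE_H);
    }
}
[/code]

The catch is that geometry overlapping several tiles gets processed once per tile, which is exactly the overhead the 360's predicated tiling was criticised for.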



Third parties suck sometimes! (Looking at you, EA, Capcom, and Square)



Proud gamer of Nintendo and Sony consoles since 2003.

fatslob-:O said:
[...]

Sure, the GPU can do more shader operations and output somewhat higher primitive counts for better character models and environments, but it doesn't beat the PS3 in texture fillrate.

GPU caches can only do so much. Plus, Microsoft tried the same approach, and look what happened to them. Eventually the GPU will need to access main memory, and it ain't gonna be pretty; the Wii U can't always depend on the eDRAM having the data it needs, considering it can only cache about 32MB at a time.

Storing textures in the eDRAM is out of the question unless they attempt tiled deferred rendering.

What's your source for the PS3 GPU having better fillrate?

The 360 ran into eDRAM problems because 10MB simply wasn't enough; you frequently had to render in multiple passes or at sub-HD resolutions. The Wii U has 32MB of eDRAM, so it won't suffer the same issues.
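
The arithmetic behind that (again rough figures, assuming 32-bit colour and 32-bit depth):

[code]
/* Why 10MB forced tiling on the 360: at 720p with 4xMSAA, colour
 * and depth are both multisampled. Rough illustrative figures. */
#include <stdio.h>

int main(void) {
    const double MB = 1024.0 * 1024.0;
    double no_aa  = 1280 * 720 * (4 + 4) / MB;  /* ~7.0 MB  */
    double msaa4x = no_aa * 4;                  /* ~28.1 MB */
    printf("720p 4xMSAA: %.1f MB -> ~3 tiles in 10MB of eDRAM,\n", msaa4x);
    printf("but a single pass in 32MB.\n");
    return 0;
}
[/code]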



curl-6 said:
[...]

What's your source for the PS3 GPU having better fillrate?

The 360 ran into eDRAM problems because 10MB simply wasn't enough; you frequently had to render in multiple passes or at sub-HD resolutions. The Wii U has 32MB of eDRAM, so it won't suffer the same issues.

It's based on the fact that Chipworks took a die shot and Digital Foundry did an analysis of it. http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

The similarities to the AMD RV770 basically mean it has 16 TMUs compared to the PS3's 24 TMUs.

Like I said, these small caches are fine if you attempt tiled rendering (software-wise, of course).
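
For reference, the fillrate comparison works out as peak texture fillrate = TMU count x core clock; the figures below assume the commonly reported (not officially confirmed) specs of ~550MHz with 16 TMUs for Latte and 500MHz with 24 TMUs for the PS3's RSX:

[code]
/* Peak texture fillrate = TMU count x core clock. Clock and TMU
 * figures are the commonly reported ones, not confirmed specs. */
#include <stdio.h>

int main(void) {
    double latte = 16 * 550e6 / 1e9;  /* ~8.8 GTexels/s  */
    double rsx   = 24 * 500e6 / 1e9;  /* 12.0 GTexels/s  */
    printf("Latte: %.1f GTexels/s, RSX: %.1f GTexels/s\n", latte, rsx);
    return 0;
}
[/code]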