
Forums - Nintendo Discussion - Shin'en: If you can't make great looking games on Wii U, it's not the hardware's fault

fatslob-:O said:
curl-6 said:
fatslob-:O said:

Sure, the GPU can do more shader operations and output somewhat higher primitive counts for better character models and environments, but it doesn't beat the PS3 in texture fill rate.

GPU caches can only do so much. Microsoft tried the same approach, and look what happened to them. Eventually the GPU will need to access main memory, and it ain't gonna be pretty. Plus, the Wii U can't always depend on the eDRAM to have the data it needs, considering it can only hold about 32MB at a time.

Storing textures in the eDRAM is out of the question unless they attempt tiled deferred rendering.

What's your source for the PS3 GPU having better fill rate?

The 360 ran into eDRAM problems because 10MB simply wasn't enough; you frequently had to render in multiple passes or at sub-HD resolutions. The Wii U has 32MB of eDRAM, so it won't suffer the same issues.

It's based on the fact that Chipworks took a die shot and Digital Foundry did an analysis of it. http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Its similarity to the AMD RV770 basically means it has 16 TMUs, compared to the PS3's 24 TMUs.

Like I said, these small caches are fine if you attempt tiled rendering (software-wise, of course).

Digital Foundry's article is based on incomplete evidence. It's speculation presented as hard proof.

There's still lots about the chip that only devs and Nintendo themselves know.
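The fill-rate disagreement above comes down to simple arithmetic: peak texture fill rate is just TMU count times core clock. A minimal sketch, using the figures from the Digital Foundry analysis plus assumed clock speeds (none of these numbers are officially confirmed by Nintendo):

```python
# Back-of-envelope peak texture fill rate: TMUs x core clock.
# Assumed figures (unconfirmed): Wii U "Latte" at 550 MHz with
# 16 TMUs per the Digital Foundry die-shot analysis; PS3 "RSX"
# at 500 MHz with 24 TMUs.

def texture_fillrate_gtexels(tmus, clock_mhz):
    """Peak bilinear texels per second, in gigatexels."""
    return tmus * clock_mhz / 1000.0

latte = texture_fillrate_gtexels(16, 550)  # 8.8 GTexels/s
rsx = texture_fillrate_gtexels(24, 500)    # 12.0 GTexels/s
print(f"Latte: {latte} GT/s, RSX: {rsx} GT/s")
```

On these assumed numbers the RSX does come out ahead on raw texture fill rate, which is the claim being argued; whether that gap matters in practice depends on memory bandwidth and cache behaviour, which the die shot alone can't settle.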




I agree!



I have to agree. Just look at X or DKCTF; you need good developers to make good-looking games.



curl-6 said:
fatslob-:O said:
curl-6 said:
fatslob-:O said:

Sure, the GPU can do more shader operations and output somewhat higher primitive counts for better character models and environments, but it doesn't beat the PS3 in texture fill rate.

GPU caches can only do so much. Microsoft tried the same approach, and look what happened to them. Eventually the GPU will need to access main memory, and it ain't gonna be pretty. Plus, the Wii U can't always depend on the eDRAM to have the data it needs, considering it can only hold about 32MB at a time.

Storing textures in the eDRAM is out of the question unless they attempt tiled deferred rendering.

What's your source for the PS3 GPU having better fill rate?

The 360 ran into eDRAM problems because 10MB simply wasn't enough; you frequently had to render in multiple passes or at sub-HD resolutions. The Wii U has 32MB of eDRAM, so it won't suffer the same issues.

It's based on the fact that Chipworks took a die shot and Digital Foundry did an analysis of it. http://www.eurogamer.net/articles/df-hardware-wii-u-graphics-power-finally-revealed

Its similarity to the AMD RV770 basically means it has 16 TMUs, compared to the PS3's 24 TMUs.

Like I said, these small caches are fine if you attempt tiled rendering (software-wise, of course).

Digital Foundry's article is based on incomplete evidence. It's speculation presented as hard proof.

There's still lots about the chip that only devs and Nintendo themselves know.

Anandtech also did a little benchmarking and found that the Wii U's processor doesn't obliterate tablet processors. http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

And by obliterate I mean being able to get something like 2 to 3 times more performance than the Motorola Droid.

You realize that Digital Foundry's article has some evidence.
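The eDRAM argument quoted in this exchange can be checked with back-of-envelope framebuffer sizes. A sketch for a 1280x720 target with 32-bit colour and 32-bit depth/stencil; the figures are illustrative only, since real render-target layouts add alignment padding and other overheads:

```python
# Render-target footprint for a 1280x720 frame: 32-bit colour plus
# 32-bit depth/stencil. Illustrative sizes only; real hardware
# layouts add alignment and compression overheads.

def framebuffer_mb(width, height, bytes_per_pixel):
    """Size of one render target in mebibytes."""
    return width * height * bytes_per_pixel / (1024 * 1024)

color = framebuffer_mb(1280, 720, 4)  # ~3.5 MB
depth = framebuffer_mb(1280, 720, 4)  # ~3.5 MB

no_msaa = color + depth          # ~7.0 MB: barely fits in the 360's 10 MB
msaa_4x = (color + depth) * 4    # ~28.1 MB: forces tiled rendering on the
                                 # 360, but still fits in the Wii U's 32 MB
```

This is the arithmetic behind both sides of the argument: a plain 720p frame squeezes into 10MB, but anti-aliasing or extra render targets blow past it, which is where the 360's tiling requirement came from and where the Wii U's 32MB gives more headroom.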



fatslob-:O said:
curl-6 said:

Digital Foundry's article is based on incomplete evidence. It's speculation presented as hard proof.

There's still lots about the chip that only devs and Nintendo themselves know.

Anandtech also did a little benchmarking and found that the Wii U's processor doesn't obliterate tablet processors. http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

And by obliterate I mean being able to get something like 2 to 3 times more performance than the Motorola Droid.

You realize that Digital Foundry's article has some evidence.

That benchmark is a test of browser speed, not processor speed.

And yes, Digital Foundry has some evidence, but they don't have the whole picture, which they admit in the article.




I'm starting to notice a pattern here. 

Every time there's a Shin'en thread, the post count goes incredibly high.

Further analysis is required.



Nintendo and PC gamer

curl-6 said:
fatslob-:O said:
curl-6 said:

Digital Foundry's article is based on incomplete evidence. It's speculation presented as hard proof.

There's still lots about the chip that only devs and Nintendo themselves know.

Anandtech also did a little benchmarking and found that the Wii U's processor doesn't obliterate tablet processors. http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

And by obliterate I mean being able to get something like 2 to 3 times more performance than the Motorola Droid.

You realize that Digital Foundry's article has some evidence.

That benchmark is a test of browser speed, not processor speed.

And yes, Digital Foundry has some evidence, but they don't have the whole picture, which they admit in the article.

I'm pretty sure it tests processor speed, not internet speed; that benchmark was used to exercise miscellaneous CPU workloads. What's more, looking at the tables, it's slower than every phone processor. I just noticed lower is better.

Edit: Now I know why 4A Games claimed the IBM Espresso was slow.



fatslob-:O said:
curl-6 said:
fatslob-:O said:

Anandtech also did a little benchmarking and found that the Wii U's processor doesn't obliterate tablet processors. http://www.anandtech.com/show/6465/nintendo-wii-u-teardown

And by obliterate I mean being able to get something like 2 to 3 times more performance than the Motorola Droid.

You realize that Digital Foundry's article has some evidence.

That benchmark is a test of browser speed, not processor speed.

And yes, Digital Foundry has some evidence, but they don't have the whole picture, which they admit in the article.

I'm pretty sure it tests processor speed, not internet speed; that benchmark was used to exercise miscellaneous CPU workloads. What's more, looking at the tables, it's slower than the phone processors. I just noticed lower is better.

The article blames the lower benchmark performance on the browser using an older version of WebKit, saying it's likely a browser software issue, not a processing power issue. It's not surprising that a console browser would underperform compared to tablets, where web browsing is more of a priority.



@fatslob

Espresso is slow compared to the PS3/360 if you make it do floating-point-heavy tasks. Its only method for SIMD is paired singles, and at this clock speed it likely doesn't match those two. That said, benchmarks done by a NeoGAF member (blu) showed that the Broadway architecture isn't bad at these tasks at all, just likely not at the PS3/360's level. But it's a processor with strengths in areas like general-purpose and integer code, something the PS3/360 are simply terrible at. Calling Espresso "slow" isn't fair at all, since it isn't designed for heavy floating point work. That's probably why Iwata encourages GPGPU: it could take that load off Espresso.
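The paired-singles point can be put in rough numbers: theoretical peak FLOPS is cores times clock times float ops per cycle. A sketch using commonly cited but unofficial figures; every value here is an illustrative assumption, not a spec:

```python
# Rough peak-FLOPS estimate: cores x clock x float ops per cycle.
# Assumed, unofficial figures: Espresso at 1.24 GHz, 3 cores, with
# paired-single fused multiply-add = 4 float ops/cycle/core; Xenon
# at 3.2 GHz, 3 cores, with 4-wide VMX FMA = 8 float ops/cycle/core.

def peak_gflops(cores, clock_ghz, flops_per_cycle):
    """Theoretical peak single-precision GFLOPS."""
    return cores * clock_ghz * flops_per_cycle

espresso = peak_gflops(3, 1.24, 4)  # ~14.9 GFLOPS
xenon = peak_gflops(3, 3.2, 8)      # ~76.8 GFLOPS
```

Even on these rough numbers the SIMD gap is large, which is consistent with the post above: Espresso's 2-wide paired singles at a low clock can't keep up in float throughput, while its strengths lie elsewhere, in general-purpose and integer code.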



osed125 said:

I'm starting to notice a pattern here. 

Every time there's a Shin'en thread, the post count goes incredibly high.

Further analysis is required.

It's because they have positive things to say about the Wii U's hardware, and we simply cannot have that! ;)