padib said:

@Bonzo. That's why it has a GPGPU.

oniyide said:

What other reasons? Besides the one EA actually gave, which is not supporting the Wii U because it wasn't doing well, which it hasn't and isn't.

You're contradicting yourself. You just said that Sony has worked closely with 3rd parties, so that would be another reason that you yourself suggested.

That's why I dislike arguing with you: you just want to say that what I post is wrong, even when you've just said the same thing.

That GPGPU argument is incredibly weak. For a start, the Radeon in the Wii U is no more than 352 GFLOPS, more likely 176 GFLOPS, so there isn't going to be much spare capacity. More importantly, a GPU can only assist with a minority of CPU workloads; it's no substitute for a more powerful CPU. Surely we have reached the stage, with not one CPU-intensive game working well on the Wii U, where we can stop using that ridiculous fanboy defence. The Radeon GPU's die area in the Wii U is also shared with the original Wii GPU, the main 32 MB of eDRAM, etc.

Anyway, the point is we know the CPU is weak, weaker than the 360's and PS3's by a long way. There is no technical or honest defence of this, so whatever the GPU situation, we know the Wii U performs below the 360 and PS3 in CPU performance, and this is easily verified anyway by looking at game performance. Any fair-minded person would never simply claim the Wii U is more powerful when the evidence absolutely destroys that argument for the many genres of games which require higher CPU resources.

It seems you can never be too surprised at how low Nintendo will go in performance nowadays, and I think that is the issue for many defenders of the Wii U: they are in denial about it. However, if they really believe they have a defence, it needs to focus on why they think three old 32-bit PPC cores, designed in the last century and running at 1.25 GHz, are competitive with more modern CPU designs, and also explain how that performance holds up when the CPU has to share very low 12.8 GB/s memory bandwidth with the GPU. It will be interesting to read, if nothing else.
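For anyone who wants to check the numbers above, here's a quick back-of-envelope sketch. The shader counts and clocks are assumptions (the commonly floated estimates for the Wii U's GPU and its DDR3 main memory), not confirmed specs:

```python
# Rough peak-throughput arithmetic for the figures cited above.
# Shader counts and clocks are assumed estimates, not confirmed specs.

def gpu_gflops(shaders: int, clock_ghz: float, ops_per_cycle: int = 2) -> float:
    """Peak single-precision GFLOPS: shaders x ops per cycle (FMA = 2) x clock."""
    return shaders * ops_per_cycle * clock_ghz

def memory_bandwidth_gbs(effective_mts: float, bus_bits: int) -> float:
    """Peak bandwidth in GB/s: effective transfers/s x bytes per transfer."""
    return effective_mts * (bus_bits / 8) / 1000

# Assumed 320 shaders at 550 MHz -> 352 GFLOPS; 160 shaders -> 176 GFLOPS
print(gpu_gflops(320, 0.550))  # 352.0
print(gpu_gflops(160, 0.550))  # 176.0

# Assumed DDR3-1600 on a 64-bit bus -> the 12.8 GB/s figure
print(memory_bandwidth_gbs(1600, 64))  # 12.8
```

So the two GFLOPS figures in the argument correspond to a 160- vs 320-shader part at the same clock, and the 12.8 GB/s figure is exactly what a single 64-bit DDR3-1600 channel gives, shared between CPU and GPU.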