Katakis08 said:
No, I am not Esa-Petteri (who the hell is that?)
And this is my first and only account here, but I've been following this great page for years.
@HappySqurriel "The Gamecube/XBox were more similar in performance to the Geforce 3 which was a 2000/2001 graphics card, and the Wii is more powerful than the Gamecube and is (probably) more similar to the Geforce 4 in real world performance."

Comparing the Gamecube with the Xbox is risky. The Xbox had much more GPU power than the Cube, and it still has more than the Wii. The Xbox featured shader model 1.1 and had MSAA (multisampling), and you can't compare real shaders with the Wii's TEV units. TEV stages are pretty similar to the render stages from DirectX 7.

But the pixel operations aren't the Wii's biggest problem, the vertex unit is. You don't get GPU-powered soft skinning on the Wii because it only has vertex lighting (and even that is a fake, since it lacks perspective correction). The Wii supports neither 720p nor 1080i/p. And I stand by it: the Wii has the power of a GeForce 2 MX. I don't blame Nintendo for saving money on the GPU; they did a great job introducing new controller techniques.
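To make the vertex-lighting point concrete, here is a minimal sketch (plain Python, not Wii/GX code; all numbers are illustrative) of the difference between vertex (Gouraud) lighting, which evaluates the light at each vertex and interpolates the resulting intensities, and per-pixel lighting, which interpolates the normal and evaluates the light per fragment:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, light_dir):
    # Simple diffuse term, clamped to zero.
    return max(0.0, dot(normalize(normal), normalize(light_dir)))

light = (0.0, 0.0, 1.0)

# Three vertex normals of one triangle, fanning away from the light.
normals = [(0.8, 0.0, 0.6), (-0.8, 0.0, 0.6), (0.0, 0.8, 0.6)]

# Vertex lighting: light each vertex, then interpolate the *intensities*
# at the triangle's centroid (barycentric weights 1/3 each).
vertex_lit = sum(lambert(n, light) for n in normals) / 3.0

# Per-pixel lighting: interpolate the *normal* to the centroid first,
# renormalize inside lambert(), then light once at that point.
centroid_normal = tuple(sum(c) / 3.0 for c in zip(*normals))
pixel_lit = lambert(centroid_normal, light)

print(round(vertex_lit, 3), round(pixel_lit, 3))  # the two disagree
```

The two paths give different intensities at the same surface point, which is why highlights and skinned geometry look flatter under vertex lighting.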
Features, maybe; raw power, not really. The Xbox 1 has nothing on the Wii in raw power, that's already well known. I don't get how people come up with the "Xbox is as powerful as the Wii and more powerful than the Cube" stuff. It's not about power, it's about features; get your facts straight. The Cube's raw power was already above the Xbox 1, but it lacked shader model capabilities, which would have been a nice feature to have.
PS: you can't look at consoles the same way you'd look at programming PC games. DX7 is a Windows library that has absolutely nothing to do with the Wii's TEV. You'd do more low-level coding on the Wii for good-looking games instead of using middleware, and western devs are just too lazy.
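For readers comparing TEV with DX7 texture stages: a single TEV color stage is commonly described by the combiner formula out = d ± ((1 − c)·a + c·b), with optional bias, scale, and clamp. Here is a hedged sketch of that formula in plain Python (an illustration only, not actual GX API code; the function name and defaults are made up):

```python
# One TEV-style color combiner stage, per the commonly cited GX formula:
#   out = d (+/-) ((1 - c) * a + c * b), then bias, scale, clamp to [0, 1].
# All inputs here are scalars in [0, 1]; real hardware works on RGB channels.
def tev_stage(a, b, c, d, sub=False, bias=0.0, scale=1.0, clamp=True):
    blend = (1.0 - c) * a + c * b            # lerp between a and b by c
    out = (d - blend if sub else d + blend) + bias
    out *= scale
    return min(1.0, max(0.0, out)) if clamp else out

# A DX7-style MODULATE stage (texture * color) falls out as a special case:
# with a = 0 and d = 0, the formula reduces to c * b.
tex, color = 0.5, 0.8
print(tev_stage(0.0, color, tex, 0.0))  # modulate: 0.5 * 0.8
```

This is why the TEV-versus-DX7 comparison keeps coming up: each TEV stage is a fixed arithmetic combiner like a DX7 texture stage, not a programmable shader.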