Soriku said:
Katakis08 said:
>>Wii has a bit weaker graphics

That's a bit of an understatement :D

Wii uses graphics hardware from 2000/2001, roughly equivalent to a GeForce2 MX 200.
It doesn't have shaders, and it's far from HD resolutions.

But the average Wii consumer doesn't care; he just wants to play Mario Kart with his children from time to time ;)

 

Esa-Petteri, is that you?

Hint: If you wanna make an ALT account, don't make yourself so obvious.

No, I am not Esa-Petteri (who the hell is that?)

 

And it's my first and only account here, but I've been following this great site for years.

 

@HappySqurriel

"The Gamecube/XBox were more similar in performance to the Geforce 3 which was a 2000/2001 graphics card, and the Wii is more powerful than the Gamecube and is (probably) more similar to the Geforce 4 in real world performance."

Comparing the Gamecube with the Xbox is misleading. The Xbox had much more GPU power than the Cube, and it still has more power than the Wii. The Xbox featured Shader Model 1.1 and supported MSAA (multisample anti-aliasing).

And you can't compare real shaders with the Wii's TEV units. TEV stages are pretty similar to the render stages of DirectX 7. But pixel operations aren't the Wii's biggest problem; the vertex unit is. You don't get GPU-powered soft skinning on the Wii, because it only has vertex lighting (and even that is something of a fake, since it lacks perspective correction). The Wii supports neither 720p nor 1080i/p.
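For anyone who hasn't worked with fixed-function combiners: a TEV stage basically blends four inputs with a lerp, much like a DirectX 7 texture stage, rather than running an arbitrary shader program. Here's a rough Python sketch of one such stage — my own simplification for illustration, not Nintendo's actual GX API, which operates on RGBA registers and only offers fixed bias/scale choices:

```python
def tev_stage(a, b, c, d, bias=0.0, scale=1.0):
    """One fixed-function combine stage (simplified):
    out = (d + lerp(a, b, c) + bias) * scale, clamped to [0, 1].

    a, b, c, d stand in for the stage's color inputs; real hardware
    works per RGBA channel and restricts bias/scale to a few values.
    """
    lerp = (1.0 - c) * a + c * b   # c selects between a and b
    out = (d + lerp + bias) * scale
    return max(0.0, min(1.0, out)) # hardware clamps the result

# Example: modulate a texture sample by the vertex color, DX7-style.
# With a=0 and d=0, the stage reduces to out = texture * vertex_color.
texel = 0.8
vertex_color = 0.5
print(tev_stage(0.0, texel, vertex_color, 0.0))  # 0.4
```

Chaining a handful of stages like this gets you multitexturing effects, but you can't express data-dependent loops or arbitrary math the way even Shader Model 1.1 can — which is the whole point of the comparison above.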

And I stick to it: the Wii has the power of a GeForce2 MX 200.

I don't blame Nintendo for saving money on the GPU; they did a great job by introducing new controller technology.