
Many people don’t seem to understand the multiple factors that combined to make the Wii the system it is today ...

First and foremost, in 2005/2006 the hardware that could go into a $200 to $300 console with minimal loss could either produce “Wii quality” graphics (figuratively speaking) at HD resolutions or produce advanced graphics at standard definition. Sony and Microsoft dealt with this by pushing the price of their consoles to new highs ($400 to $600) while still taking very large losses on each system sold; and, given that most of their best-looking games struggle to approach 720p @ 30fps, it could be argued that those systems fall somewhat shy of being true HD consoles that display advanced graphics.
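To put rough numbers on that tradeoff, here’s a quick pixel-count sketch (the resolutions are the standard ones; the point is that 720p asks the GPU to shade roughly three times as many pixels per frame as 480p, so at a fixed hardware budget something has to give):

```python
# Pixel counts behind the SD-vs-HD tradeoff: at a fixed hardware
# budget, every extra pixel per frame cuts into per-pixel effects.

RESOLUTIONS = {
    "480p (SD)":  (640, 480),
    "720p (HD)":  (1280, 720),
    "1080p (HD)": (1920, 1080),
}

sd_pixels = 640 * 480  # 307,200

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / sd_pixels:.1f}x SD)")
```

Running this gives 720p at 3.0x and 1080p at 6.8x the pixels of 480p, which is the core of the bind the 2005/2006 hardware was in.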

To add to this, the Wii was a very large gamble, and Nintendo was (probably) very concerned that it could be a failure. In general, they probably wanted to keep all start-up costs of the system as low as possible and to make a healthy profit on the hardware, because there was the potential that the system could have a very short (one-year) lifespan; that meant keeping their R&D investment and hardware licensing costs an order of magnitude lower than what Sony or Microsoft were willing to pay.

In hindsight we can say that Nintendo probably didn’t need to be so cautious and could have released a console with conventional processing power, but it should be understandable why the Wii became the system it is. Now, I could be wrong, but I suspect that Nintendo’s next system will be much more conventional when it comes to processing power; if you assume a 2011/2012 release date, that would work out to roughly 4 to 8 times the processing power of the Xbox 360. When a system performs in that range, there really is no reason why it wouldn’t output at HD resolutions.
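A relative back-of-envelope check on that last claim, using only the pixel counts above and the speculated 4x to 8x multiplier (no absolute hardware specs assumed): even at the low end of the range, a successor rendering full 1080p would still have more per-pixel headroom than a 360-class machine has at 720p.

```python
# Relative per-pixel budget of a hypothetical successor vs. a
# 360-class baseline rendering at 720p. The 4x/8x throughput
# scaling is the speculation from the paragraph above, not a spec.

pixels_720p = 1280 * 720     # 921,600
pixels_1080p = 1920 * 1080   # 2,073,600

for scale in (4, 8):
    at_720p = scale                                # same resolution, scale-x budget
    at_1080p = scale * pixels_720p / pixels_1080p  # extra pixels eat into the gain
    print(f"{scale}x hardware -> {at_720p:.1f}x per-pixel budget at 720p, "
          f"{at_1080p:.2f}x at 1080p")
```

At 4x the budget per 1080p pixel comes out to about 1.8x the baseline, and at 8x about 3.6x, so HD output really would be the obvious default.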