As a quick answer: coding changes radically when you can split your load across many threads. But I've suggested several times that we set the technical specs aside; I don't think that discussion would lead anywhere interesting in this context.
I don't understand what you mean when you say that I approach HD the way MS and Sony did. This isn't only about HD resolution, but let's suppose for a moment that we include processing power under the HD moniker.
You seem to believe that back in 2002-2005 (when these consoles were being designed) there were cheap options Nintendo could have chosen. I'm saying that the 360 and PS3 were given architectures radically different from your average PC precisely so they could be competitive in processing power by the time they came out, while remaining cheaper than the equivalent PC components.
This brought costs for developing new tools and retraining developers, but it succeeded in bridging PC gaming to console gaming, so that nowadays many PC developers are expanding or migrating to the growing console market.
For the Wii to be in the same ballpark, Nintendo would have had to absorb similar costs and make a similar break from the past. In a sense, the 360 and the PS3 did what you think Nintendo could have done: provide a "just good enough" HD solution.
The 360, when it came out, was as cheap as it reasonably could be - some would say even cheaper than it should have been, as its cooling design revealed itself to be rushed - and yet the 360 and PS3 still weighed down console/PC development with new requirements.
I can't see how this "cheap HD" Nintendo console could be born in 2005.
Unless you mean basically a Wii capable of exactly the same games (due to the memory/CPU bottleneck) but with a beefier GPU capable of 720p output. That was probably the only cheap upgrade at the time that could have been retrofitted to the existing development tools and GameCube library. But it's very uncertain how much multiplatform development it would have encouraged.