
Forums - Nintendo Discussion - What should Nintendo have done instead of Wii U?

Pemalite said:
JWeinCom said:

Sure, but how would an HD revision, which would probably drive up both hardware and software costs, increase profit margins?  

Newer chips don't always cost more; they can often be smaller, faster, and cheaper to manufacture, and they require less power delivery and cooling, reducing the overall bill of materials.

The Xbox 360 had roughly 5 times as much RAM as the Wii, which is important if you're loading HD textures.  The Wii's power consumption was approximately 1/10 that of the 2009 Xbox 360 model, which would indicate a significantly smaller need for cooling.  The 360's processor had 3 cores and ran at about 5 times the clock speed, in addition to having other advantages.  The Wii's memory bandwidth was about 1/7 of the 360's.  The Wii doesn't support HDMI output.  And the Wii itself is less than half the size of even the smaller 360 models.

To be clear, are you suggesting that Nintendo could have gotten the Wii anywhere close to the threshold for HD while lowering power consumption, cooling requirements, and costs?  



JWeinCom said:
Pemalite said:

Newer chips don't always cost more; they can often be smaller, faster, and cheaper to manufacture, and they require less power delivery and cooling, reducing the overall bill of materials.

The Xbox 360 had roughly 5 times as much RAM as the Wii, which is important if you're loading HD textures.  The Wii's power consumption was approximately 1/10 that of the 2009 Xbox 360 model, which would indicate a significantly smaller need for cooling.  The 360's processor had 3 cores and ran at about 5 times the clock speed, in addition to having other advantages.  The Wii's memory bandwidth was about 1/7 of the 360's.  The Wii doesn't support HDMI output.  And the Wii itself is less than half the size of even the smaller 360 models.

To be clear, are you suggesting that Nintendo could have gotten the Wii anywhere close to the threshold for HD while lowering power consumption, cooling requirements, and costs?  

Hm, not sure what Pemalite had in mind, but my original idea when I said Wii HD was a Wii capable of outputting Wii-level visuals (or slightly better, with better AA for example) in 720p, not a PS360-level console. So something like 2.5x the Wii, which is still way below the PS360. Miyamoto himself said he wished the Wii was HD and that they were caught by surprise by HDTV adoption rates.



HoloDust said:
JWeinCom said:

The Xbox 360 had roughly 5 times as much RAM as the Wii, which is important if you're loading HD textures.  The Wii's power consumption was approximately 1/10 that of the 2009 Xbox 360 model, which would indicate a significantly smaller need for cooling.  The 360's processor had 3 cores and ran at about 5 times the clock speed, in addition to having other advantages.  The Wii's memory bandwidth was about 1/7 of the 360's.  The Wii doesn't support HDMI output.  And the Wii itself is less than half the size of even the smaller 360 models.

To be clear, are you suggesting that Nintendo could have gotten the Wii anywhere close to the threshold for HD while lowering power consumption, cooling requirements, and costs?  

Hm, not sure what Pemalite had in mind, but my original idea when I said Wii HD was a Wii capable of outputting Wii-level visuals (or slightly better, with better AA for example) in 720p, not a PS360-level console. So something like 2.5x the Wii, which is still way below the PS360. Miyamoto himself said he wished the Wii was HD and that they were caught by surprise by HDTV adoption rates.

I didn't say it needed those exact specs; that was just to give an idea of what HD consoles were doing.  Someone with more tech knowledge can correct me if I'm wrong, but I think getting even in the ballpark would require a pretty major hardware revision rather than an update with a cheaper chipset.  Unless you're talking about some real barebones upscaling, and I dunno if even that could be accomplished while cutting costs.  

As for whether or not the idea is a good one in general, I don't think so.  The PS4 Pro makes sense, for instance, because a significant portion of the PS4 audience is interested in top-quality visuals and 4K, whereas I don't think the Wii audience was mostly interested in visual fidelity.  I'm sure there were some people who would have preferred an HD Wii, but I don't think there would have been enough of them to make it a profitable venture, especially if it meant developing separate versions of all their games to run on both systems, or making Wii HD exclusive games, which would fragment the market.  A Wii HD would have made sense later on, but 2009 was still way too early IMO.  



JWeinCom said:
HoloDust said:

Hm, not sure what Pemalite had in mind, but my original idea when I said Wii HD was a Wii capable of outputting Wii-level visuals (or slightly better, with better AA for example) in 720p, not a PS360-level console. So something like 2.5x the Wii, which is still way below the PS360. Miyamoto himself said he wished the Wii was HD and that they were caught by surprise by HDTV adoption rates.

I didn't say it needed those exact specs; that was just to give an idea of what HD consoles were doing.  Someone with more tech knowledge can correct me if I'm wrong, but I think getting even in the ballpark would require a pretty major hardware revision rather than an update with a cheaper chipset.  Unless you're talking about some real barebones upscaling, and I dunno if even that could be accomplished while cutting costs.  

As for whether or not the idea is a good one in general, I don't think so.  The PS4 Pro makes sense, for instance, because a significant portion of the PS4 audience is interested in top-quality visuals and 4K, whereas I don't think the Wii audience was mostly interested in visual fidelity.  I'm sure there were some people who would have preferred an HD Wii, but I don't think there would have been enough of them to make it a profitable venture, especially if it meant developing separate versions of all their games to run on both systems, or making Wii HD exclusive games, which would fragment the market.  A Wii HD would have made sense later on, but 2009 was still way too early IMO.  

What I'm talking about is indeed just rendering internally at a higher res with somewhat better AA and AF, no additional effort really. That 2.5x is about what you need for that, and the tech inside the Wii was very heavily outdated even when it released, let alone in 2009/2010.

I'd argue that the difference between a Wii and a Wii HD would be much more noticeable than between the PS4 and PS4 Pro. The Wii looks good on SDTVs, but it really falls apart on HDTVs.

I think it would give an option to those interested in a crisper image, and the Wii would probably have had better legs, ending up maybe in the 120 million range combined.
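For what it's worth, here's the back-of-the-envelope pixel math behind that 2.5x figure. It's a deliberate simplification that assumes rendering cost scales with pixel count and that the Wii outputs 640x480:

```python
# Pixel counts for the Wii's output vs. 720p, as a rough proxy for GPU load.
wii_pixels = 640 * 480    # 307,200 pixels
hd_pixels = 1280 * 720    # 921,600 pixels

ratio = hd_pixels / wii_pixels
print(ratio)  # 3.0 -> 720p pushes ~3x the pixels of 480p
```

So roughly 2.5-3x the raw throughput gets you the same scene at 720p, well short of what a PS360-class machine needed.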



JWeinCom said:

The Xbox 360 had roughly 5 times as much RAM as the Wii, which is important if you're loading HD textures.  The Wii's power consumption was approximately 1/10 that of the 2009 Xbox 360 model, which would indicate a significantly smaller need for cooling.  The 360's processor had 3 cores and ran at about 5 times the clock speed, in addition to having other advantages.  The Wii's memory bandwidth was about 1/7 of the 360's.  The Wii doesn't support HDMI output.  And the Wii itself is less than half the size of even the smaller 360 models.

To be clear, are you suggesting that Nintendo could have gotten the Wii anywhere close to the threshold for HD while lowering power consumption, cooling requirements, and costs?  

The Wii was running HD textures. Some 360 games had 4K textures (4096x4096).
Texture resolution is independent of the display output resolution.

The original Xbox was running at HD resolutions in many games, and the Wii was in a similar ballpark in terms of overall capability.
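To put rough numbers on why texture resolution is a memory question rather than an output-resolution question, here's a quick sketch. The figures assume uncompressed 32-bit RGBA and an 8:1 DXT1-style block compression ratio; they're illustrative, not specs of any particular game:

```python
def texture_mib(size, bytes_per_pixel=4, compression_ratio=1):
    """Memory footprint of a square size x size texture, in MiB."""
    return size * size * bytes_per_pixel / compression_ratio / 2**20

print(texture_mib(4096))                       # 64.0 -> uncompressed 4K texture
print(texture_mib(4096, compression_ratio=8))  # 8.0  -> with DXT1-style 8:1
```

A single uncompressed 4096x4096 texture would nearly fill the Wii's ~88 MB of total RAM on its own, which is why the 360's larger memory pool, not its output resolution, is what makes textures that big practical.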

What I am getting at is that fabricating old, large chips isn't always cheaper. Companies like to retool their fabs to newer process nodes, and as they do, there tend to be fewer fabs left on older process nodes; those older nodes tend to get used for specialized chips/controllers for specialized markets and thus command a premium. Thus, building a console chip on an antiquated node can actually start to increase in cost as that node is wound down.

The same thing goes for RAM. RAM is a commodity and thus suffers the wrath of market forces like supply and demand. After a DRAM technology has hit full market saturation, it tends to be at its lowest price point; from there, as other markets shift to newer DRAM technologies, supply switches to the newer DRAM, and older DRAM technologies tend to go up in price as manufacturing for them stops. Consoles don't tend to make any changes on the RAM front.

Ergo, older hardware isn't necessarily cheaper or more cost-effective than newer, faster hardware.

Right now, I can guarantee a Raspberry Pi is not only faster than the Wii but would also end up being cheaper to manufacture, for example.



--::{PC Gaming Master Race}::--


I think they should have just done Wii U right, instead of doing something else.

And by "right", I don't mean anything like processing power.
So what do I mean?

---

- I mean proper naming, identity and marketing. Even "Super Wii" would have been better than "Wii U". And why not bring back the two polite Japanese men from the Wii ads and have them say something like "See You on Wii U", if they absolutely HAD to use that name?

- I mean sticking to ONE HARDWARE CONFIG AND ONE HARDWARE CONFIG ONLY at launch, particularly after seeing how the two-versions approach hurt the initial sales of both the 360 and the PS3. Wii U BASIC SETs rotted on shelves for about a year while no new DELUXE SETs were being ordered, because store managers didn't understand how fundamental the difference was. Eventually, most department stores around Europe just stopped carrying the Wii U, before its second holiday season.

- I mean designing a fast and functional main system interface. Instead we got a complete mess that was also unpleasantly slow.

- I mean having the system work on release. To even get a launch-window Wii U up and running, you needed a firmware update. But hold on: the firmware was so borked, it couldn't even automatically set up internet connections. And the console didn't have an Ethernet port. Most people had to figure out for themselves to go to Nintendo's online guide for how to locate and manually type in their router's MAC address. An online guide that wasn't provided in all that many languages.

- I mean sticking to what they know, and NOT trying to be a TV service, a social network, and a virtual city tour.

- I mean packing in something more easily decipherable and slightly less self-aggrandizing than "Nintendo Land". The game is great, but nobody understands what it is from looking at the box.

- I mean coordinating your use of cords. HDMI, a proprietary power plug, a second proprietary power plug for the GamePad, a third proprietary power plug for the IR bar, and, would you believe it, a FOURTH type of power cable for the Pro Controller, which, while not proprietary, was the old USB micro type that everyone except GPS manufacturers had phased out.

- I mean pricing things fairly. Charging people to upgrade their Virtual Console games so they run on the Wii U directly, instead of having to open the Wii menu? After going through the HELL that is transferring from a Wii to a Wii U?

- I mean letting people play the games they paid for. Many of Nintendo's own releases received mandatory patches. For Mario Kart 8, a certain update primarily advertised Mercedes-Benz, as well as inserting taunting links to the eShop directly on the character and stage select screens.

---

And really, I mean thinking for a moment. I could write a book about the things Nintendo did wrong in selling people on the Wii U.

When Sony saw that the PS3 was limping during its first year, they devised an extensive relaunch campaign: new console design, new logo, new ads, new packaging, new price point, new chipset configuration. And it worked.

Nintendo just didn't even try to save the Wii U, and this is probably what saddens me the most.

Last edited by Podings - on 09 January 2020

HoloDust said:
JWeinCom said:

I didn't say it needed those exact specs; that was just to give an idea of what HD consoles were doing.  Someone with more tech knowledge can correct me if I'm wrong, but I think getting even in the ballpark would require a pretty major hardware revision rather than an update with a cheaper chipset.  Unless you're talking about some real barebones upscaling, and I dunno if even that could be accomplished while cutting costs.  

As for whether or not the idea is a good one in general, I don't think so.  The PS4 Pro makes sense, for instance, because a significant portion of the PS4 audience is interested in top-quality visuals and 4K, whereas I don't think the Wii audience was mostly interested in visual fidelity.  I'm sure there were some people who would have preferred an HD Wii, but I don't think there would have been enough of them to make it a profitable venture, especially if it meant developing separate versions of all their games to run on both systems, or making Wii HD exclusive games, which would fragment the market.  A Wii HD would have made sense later on, but 2009 was still way too early IMO.  

What I'm talking about is indeed just rendering internally at a higher res with somewhat better AA and AF, no additional effort really. That 2.5x is about what you need for that, and the tech inside the Wii was very heavily outdated even when it released, let alone in 2009/2010.

I'd argue that the difference between a Wii and a Wii HD would be much more noticeable than between the PS4 and PS4 Pro. The Wii looks good on SDTVs, but it really falls apart on HDTVs.

I think it would give an option to those interested in a crisper image, and the Wii would probably have had better legs, ending up maybe in the 120 million range combined.

Would have been fine for consumers, but I just don't see it being worth it for Nintendo.  

Pemalite said:
JWeinCom said:

The Xbox 360 had roughly 5 times as much RAM as the Wii, which is important if you're loading HD textures.  The Wii's power consumption was approximately 1/10 that of the 2009 Xbox 360 model, which would indicate a significantly smaller need for cooling.  The 360's processor had 3 cores and ran at about 5 times the clock speed, in addition to having other advantages.  The Wii's memory bandwidth was about 1/7 of the 360's.  The Wii doesn't support HDMI output.  And the Wii itself is less than half the size of even the smaller 360 models.

To be clear, are you suggesting that Nintendo could have gotten the Wii anywhere close to the threshold for HD while lowering power consumption, cooling requirements, and costs?  

The Wii was running HD textures. Some 360 games had 4K textures (4096x4096).
Texture resolution is independent of the display output resolution.

The original Xbox was running at HD resolutions in many games, and the Wii was in a similar ballpark in terms of overall capability.

What I am getting at is that fabricating old, large chips isn't always cheaper. Companies like to retool their fabs to newer process nodes, and as they do, there tend to be fewer fabs left on older process nodes; those older nodes tend to get used for specialized chips/controllers for specialized markets and thus command a premium. Thus, building a console chip on an antiquated node can actually start to increase in cost as that node is wound down.

The same thing goes for RAM. RAM is a commodity and thus suffers the wrath of market forces like supply and demand. After a DRAM technology has hit full market saturation, it tends to be at its lowest price point; from there, as other markets shift to newer DRAM technologies, supply switches to the newer DRAM, and older DRAM technologies tend to go up in price as manufacturing for them stops. Consoles don't tend to make any changes on the RAM front.

Ergo, older hardware isn't necessarily cheaper or more cost-effective than newer, faster hardware.

Right now, I can guarantee a Raspberry Pi is not only faster than the Wii but would also end up being cheaper to manufacture, for example.

Can you explain a bit more how and why 360 games would have 4k textures?  That doesn't make sense to me.

I know that chips get cheaper to produce over time, but not enough to make significant graphical leaps while cutting costs.  When have we ever seen a console revision that significantly boosted performance without an increase in cost?



JWeinCom said:
HoloDust said:

What I'm talking about is indeed just rendering internally at a higher res with somewhat better AA and AF, no additional effort really. That 2.5x is about what you need for that, and the tech inside the Wii was very heavily outdated even when it released, let alone in 2009/2010.

I'd argue that the difference between a Wii and a Wii HD would be much more noticeable than between the PS4 and PS4 Pro. The Wii looks good on SDTVs, but it really falls apart on HDTVs.

I think it would give an option to those interested in a crisper image, and the Wii would probably have had better legs, ending up maybe in the 120 million range combined.

Would have been fine for consumers, but I just don't see it being worth it for Nintendo.  

Well, in that alternate timeline I think the Wii would have had longer legs thanks to the Wii HD; as I said, probably making it to 120 million... which would have been worth it for Nintendo.



HoloDust said:
JWeinCom said:

Would have been fine for consumers, but I just don't see it being worth it for Nintendo.  

Well, in that alternate timeline I think the Wii would have had longer legs thanks to the Wii HD; as I said, probably making it to 120 million... which would have been worth it for Nintendo.

I think an HD version would boost legs, but why launch that in 2009 as opposed to 2011?



Pemalite said:
JWeinCom said:

The Xbox 360 had roughly 5 times as much RAM as the Wii, which is important if you're loading HD textures.  The Wii's power consumption was approximately 1/10 that of the 2009 Xbox 360 model, which would indicate a significantly smaller need for cooling.  The 360's processor had 3 cores and ran at about 5 times the clock speed, in addition to having other advantages.  The Wii's memory bandwidth was about 1/7 of the 360's.  The Wii doesn't support HDMI output.  And the Wii itself is less than half the size of even the smaller 360 models.

To be clear, are you suggesting that Nintendo could have gotten the Wii anywhere close to the threshold for HD while lowering power consumption, cooling requirements, and costs?  

The Wii was running HD textures. Some 360 games had 4K textures (4096x4096).
Texture resolution is independent of the display output resolution.

The original Xbox was running at HD resolutions in many games, and the Wii was in a similar ballpark in terms of overall capability.

What was it exactly (aside from the output hardware) that limited the Wii to 480p? Was it the 3MB of eDRAM that made a bigger framebuffer unfeasible?
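A back-of-the-envelope check suggests the embedded framebuffer would indeed be the bottleneck. This sketch assumes 24-bit color plus a 24-bit Z-buffer and that roughly 2 MiB of the 3 MB embedded memory held the framebuffer (the rest being texture cache); those figures are an approximation of the GameCube/Wii layout, used here for illustration:

```python
def framebuffer_bytes(width, height, color_bits=24, z_bits=24):
    """Bytes needed for a color buffer plus depth buffer at the given size."""
    return width * height * (color_bits + z_bits) // 8

EFB = 2 * 1024 * 1024  # assumed embedded framebuffer budget, ~2 MiB

print(framebuffer_bytes(640, 480) <= EFB)   # True  -> 480p fits
print(framebuffer_bytes(1280, 720) <= EFB)  # False -> 720p would not
```

A 1280x720 color + depth buffer needs over 5 MiB at those bit depths, so an HD target would have required either a redesigned memory layout or rendering the frame in tiles.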