HoloDust said:
JWeinCom said:
I didn't say it needed those exact specs; that was just to give an idea of what HD consoles were doing. Someone with more tech knowledge can correct me if I'm wrong, but I think getting even in the ballpark would require a pretty major hardware revision rather than an update with a cheaper chipset. Unless you're talking about some real barebones upscaling, and I dunno if even that could be accomplished while cutting costs.
As for whether or not the idea is a good one in general, I don't think so. The PS4 Pro makes sense, for instance, because a significant portion of the PS4 audience is interested in top-quality visuals and 4K, whereas I don't think the Wii audience was mostly interested in visual fidelity. I'm sure there were some people who would have preferred an HD Wii, but I don't think there were enough to make it a profitable venture, especially if it meant developing separate versions of all their games to run on both systems, or making Wii HD exclusive games, which would fragment the market. A Wii HD would have made sense later on, but 2009 was still way too early IMO.
|
What I'm talking about is indeed just rendering internally at a higher resolution with somewhat better AA and AF, no additional developer effort really. That ~2.5x is about what you'd need for it, and the tech inside the Wii was very heavily outdated even when it released, let alone in 2009/2010.
I'd argue that the difference between the Wii and a Wii HD would be much more noticeable than PS4 to PS4 Pro. The Wii looks good on SDTVs, but it really falls apart on HDTVs.
I think it would have given an option to those interested in a crisper image, and the platform would probably have had better legs, maybe ending up in the 120 million range combined.
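For what it's worth, a quick back-of-the-envelope pixel count roughly supports that ~2.5x figure. These are my own illustrative numbers (standard 480p and 720p framebuffer dimensions), not figures from the thread:

```python
# Pixel-count ratios for rendering internally at 720p instead of 480p.
# Illustrative framebuffer sizes only; actual consoles vary.
sd_ws = 854 * 480    # widescreen 480p
sd_43 = 640 * 480    # 4:3 480p
hd = 1280 * 720      # 720p

print(round(hd / sd_ws, 2))  # widescreen 480p -> 720p: ~2.25x the pixels
print(round(hd / sd_43, 2))  # 4:3 480p -> 720p: 3.0x the pixels
```

So "about 2.5x" sits right between the widescreen and 4:3 cases, which fits the claim that roughly 2.5x the raw throughput covers a straight internal-resolution bump.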
|
Would have been fine for consumers, but I just don't see it being worth it for Nintendo.
Pemalite said:
JWeinCom said:
The Xbox 360 had roughly 5 times as much RAM as the Wii, which is important if you're loading HD textures. The Wii's power consumption was approximately 1/10 that of the 2009 Xbox 360 model, which would indicate a significantly lower cooling requirement. The 360's processor had 3 cores and ran at about 5 times the clock speed, in addition to having other advantages. The Wii's memory bandwidth was about 1/7 of the 360's. The Wii doesn't support HDMI output, and the console itself is less than half the size of even the smaller 360 models.
To be clear, are you suggesting that Nintendo could have gotten the Wii anywhere close to the threshold for HD while lowering power consumption, cooling requirements, and costs?
|
The Wii was running HD textures. Some 360 games had 4K (4096x4096) textures. Texture resolution is independent of the display output resolution.
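To make that distinction concrete: texture memory and framebuffer memory are separate budgets, so a "4K texture" says nothing about the output resolution. A minimal sketch, using my own illustrative assumption of uncompressed RGBA8 (4 bytes per pixel; real games use compressed formats that are far smaller):

```python
def image_bytes(width, height, bytes_per_pixel=4):
    """Uncompressed size of one image (texture level or framebuffer) in bytes."""
    return width * height * bytes_per_pixel

# A single 4096x4096 RGBA8 texture is 64 MiB uncompressed...
tex = image_bytes(4096, 4096)
# ...while a 640x480 RGBA8 framebuffer is only about 1.2 MiB.
fb = image_bytes(640, 480)

print(tex // (1024 * 1024))  # 64 (MiB)
print(round(fb / (1024 * 1024), 2))  # ~1.17 (MiB)
```

The point being made above is exactly this: a game can sample huge textures while still outputting a standard-definition image, and vice versa.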
The original Xbox was running at HD resolutions in many games, and the Wii was in a similar ballpark in terms of overall capability.
What I am getting at is that fabricating old, large chips isn't always cheaper. Companies like to retool their fabs to newer process nodes, and as they do, fewer fabs remain on the older nodes. Those older nodes tend to get used for specialized chips/controllers for niche markets and thus command a premium. So building a console chip on an antiquated node can actually start to increase in cost as that node is phased out.
The same goes for RAM. RAM is a commodity and thus subject to market forces like supply and demand. After a DRAM technology has hit full market saturation, it tends to be at its lowest price point; from there, as other markets shift to newer DRAM technologies, supply moves to the newer DRAM, and the older DRAM tends to go up in price as manufacturing for it winds down. And consoles don't tend to make changes on the RAM front.
Ergo, older hardware isn't always cheaper or more cost-effective than newer, faster hardware.
Right now, I can guarantee a Raspberry Pi is not only faster than the Wii but would also end up being cheaper to manufacture, for example.
|
Can you explain a bit more how and why 360 games would have 4k textures? That doesn't make sense to me.
I know that chips get cheaper to produce over time, but not enough to make significant graphical leaps while cutting costs. When have we ever seen a console revision that significantly boosted performance without an increase in cost?