JWeinCom said:
Pemalite said:

The Wii was running HD textures. Some 360 games had 4K textures (4096x4096).
Texture resolution is independent of the display output resolution.
Texture resolution is independent of the display output resolution.

The original Xbox was running at HD resolutions in many games, and the Wii was in a similar ballpark in terms of overall capability.

What I am getting at is that fabricating old, large chips isn't always cheaper. Companies like to retool their fabs to newer process nodes, and as they do, there tend to be fewer fabs left on the older nodes; those older nodes tend to get used for specialized chips/controllers for niche markets and thus command a premium. Building a console chip on an antiquated node can therefore actually increase in cost as that node is deprecated.

Same thing goes for RAM. RAM is a commodity and thus subject to market forces like supply and demand. Once a DRAM technology has hit full market saturation, it tends to be at its lowest price point; from there, as other markets shift to newer DRAM technologies, supply switches over and the older DRAM tends to go up in price as manufacturing for it winds down. Consoles don't tend to make any changes on the RAM front.

Ergo, older hardware isn't necessarily cheaper or more cost-effective than newer, faster hardware.

Right now, for example, I can guarantee a Raspberry Pi is not only faster than the Wii but would also end up being cheaper to manufacture.

Can you explain a bit more how and why 360 games would have 4K textures? That doesn't make sense to me.

I know that chips get cheaper to produce over time, but not enough to make significant graphical leaps while cutting costs.  When have we ever seen a console revision that significantly boosted performance without an increase in cost?

The higher the texture resolution, the sharper, crisper, and more detailed textures will look in a game, regardless of whether you are running at 480p, 720p, 1080p, or 4K.

The output resolution is how many pixels make up the entire image displayed on your screen.

Texture resolution is how many pixels (texels) make up the texture sitting on top of a surface (e.g. the side of a building).

A higher display resolution will of course bring out more detail in high-resolution textures, as there are more pixels to resolve smaller details... which is why a lot of "enhanced" Xbox and Xbox 360 games on the Xbox One X have more detail "pop". They are the same textures as in the original games' releases.
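To make the independence of the two resolutions concrete, here is a minimal back-of-the-envelope sketch in C. It assumes uncompressed 32-bit RGBA for simplicity; real games use compressed formats (DXT/S3TC and the like) and mipmaps, which change the exact numbers but not the point.

```c
#include <stdio.h>

/* Back-of-the-envelope memory math: a texture's resolution is
   independent of the framebuffer (display) resolution.
   Assumes uncompressed 32-bit RGBA; real games use compressed
   formats, which shrink these numbers roughly 4-8x. */

static double megabytes(long long w, long long h, int bytes_per_pixel)
{
    return (double)(w * h * bytes_per_pixel) / (1024.0 * 1024.0);
}

int main(void)
{
    /* A single 4096x4096 ("4K") texture, as mentioned above. */
    printf("4096x4096 RGBA texture:   %.1f MB\n", megabytes(4096, 4096, 4));

    /* The entire 480p framebuffer it might be drawn into. */
    printf("640x480 RGBA framebuffer: %.1f MB\n", megabytes(640, 480, 4));

    return 0;
}
```

This prints roughly 64.0 MB for the texture versus 1.2 MB for the whole 480p frame: a single texture can hold far more pixel data than the output image, and the sampler simply picks the appropriate mip level for however many screen pixels the surface covers.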

curl-6 said:
Pemalite said:

The Wii was running HD textures. Some 360 games had 4K textures (4096x4096).
Texture resolution is independent of the display output resolution.
Texture resolution is independent of the display output resolution.

The original Xbox was running at HD resolutions in many games, and the Wii was in a similar ballpark in terms of overall capability.

What was it exactly (aside from the output) that limited the Wii to 480p? Was it the 3MB of eDRAM that made a bigger framebuffer unfeasible?

Well, you aren't obligated to use the 3MB of eDRAM, so you can skip that entirely.

You can also take a tiled rendering approach to use that 3MB for higher resolutions anyway... And of course you can use it as a general cache as well.
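As a rough sketch of the tiling arithmetic: the 6 bytes per pixel (24-bit colour plus 24-bit depth) and the ~2 MB working budget below are assumptions for illustration, not the Wii's actual configuration, but they show how a small on-chip buffer can render a frame larger than itself in strips.

```c
#include <stdio.h>

/* Illustrative tiled-rendering math: render a frame larger than
   the on-chip buffer by splitting it into full-width horizontal
   strips that each fit in the buffer. The 6 bytes/pixel and
   ~2 MB budget are assumptions for the sake of the example. */

int main(void)
{
    const long long budget = 2 * 1024 * 1024;   /* on-chip bytes available  */
    const int bpp    = 6;                       /* colour + depth per pixel */
    const int width  = 1280, height = 720;      /* target 720p frame        */

    /* Tallest full-width strip that fits within the budget. */
    int strip_h = (int)(budget / (width * bpp));
    int strips  = (height + strip_h - 1) / strip_h;   /* round up */

    printf("Strip height: %d lines -> %d strips "
           "(geometry re-submitted per strip)\n", strip_h, strips);
    return 0;
}
```

Under these assumptions you get 273-line strips, so three passes cover 720p. The trade-off is that the geometry has to be submitted and clipped once per strip, which is the usual cost of tiling: you spend extra vertex work to save framebuffer memory and bandwidth.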

I would assume it's just a firmware restriction rather than a technical one... as component video, which the Wii supports, can carry 720p just fine. Obviously the higher resolution would result in fewer/lower-quality effects, of course.



--::{PC Gaming Master Race}::--