TheBardsSong said:
curl-6 said:
It really depends on the kind of game being made.

Let's assume for a second that the devs have mastered both consoles. (Xbox and Gamecube)

If you wanted to make a game with large areas and heavy use of normal mapping, the Xbox would be the better choice because of its larger RAM, hard drive streaming, and programmable pixel shaders. Halo 1 & 2, and Chronicles of Riddick are prime examples of this.

If you wanted to make a game that pushes lots of polygons and effects at once, like the Rogue Squadron games, the Gamecube would work better because its GPU is better suited to handling that kind of thing.

Then why does it seem the average GC game is rather low poly? Hell, it's obvious a lot of PS2 exclusives push more polygons than your average Gamecube game.

Read the bolded again.

This is the "developer effort" effect/argument. Like how a lot of Wii games look worse than Rogue Squadron.

Last gen the Gamecube was the console developers put the least effort into. It had low market share (similar to the Xbox's) and was harder to develop for than the Xbox (the PS2 was probably the hardest of the three, but it also had by far the largest market share). On Xbox, most developers could port over PC code with ease (and therefore cheaply) and still have the game look half-decent. The Gamecube required more work and optimisation than most developers were willing to put in, and it made little financial sense to invest that effort for such a small market share. Only a few developers truly pushed the console for this reason, but when they did, the results were incredibly impressive.