fazz said: makingmusic476 said: Guys, the Wii is quite a bit more powerful than the PS2. However, you can't say that the difference between these two:


Equates to the difference between these two:


And let's not forget that GoW2 had much larger environments than SSBB, and could be played in 720p. |
Yeah, 720p at a much lower framerate.
Also, I have much better examples: [image] vs. [image]. If someone can't see the difference in geometry, shaders and textures, they're really blind.

Now, let's get technical. The PlayStation 2 and the GameCube (we'll start with the GC) had very different architectures that required deep analysis to understand how they compared to each other. The PS2 had to rely on its rather powerful CPU to overcome its primitive GPU. Since that GPU lacked MANY features the Xbox's and GC's GPUs had, the CPU had to brute-force all of those tasks, and so all of its processing power got eaten up, nullifying its raw advantage.

First of all, let's go with RAM. The PS2 had 36MB to work with for graphics, while the GameCube had just 27MB (yes, the other 16MB could only be used for audio and as a DVD buffer). If the GC had less RAM, how could it have better textures, models and effects? Answer: the GameCube had 6:1 S3 Texture Compression. This means that textures that would take 24MB on the PS2 would take just 4MB on the GC. So, effectively, the GC had around 4X more memory for textures. Now take the Wii, which has a Unified Memory Architecture and can use all of its 88MB of RAM for anything the developer wants: it has around 12X more memory available for textures... and I'm not even taking into account that the Wii's memory is much faster than the Cube's; if I did, the difference would be much higher.

Oh, and since we're talking about RAM, and seeing that some people love Wikipedia facts so much: the Wii uses the very same GDDR3 RAM as the Xbox 360, on (according to your beloved Wikipedia) a 128-bit bus. The Wii's MEM2 (that's the name of the GDDR3 pool in a game-development environment) can have as much bandwidth as the Xbox 360's. That's 22GB per second. Are we being too generous with that piece of crap called the Wii? OK, let's cut the speed in half, i.e. 700MHz effective. Now we have 11GB per second. That's over 3X the bandwidth of the Cube. Want more?
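To make the arithmetic above concrete, here's a quick sketch of the memory and bandwidth numbers. It uses the figures quoted in the post itself (36MB/27MB/88MB, 6:1 compression, 128-bit bus, 700MHz effective clock) — those are the post's assumptions, not verified official specs:

```python
# Effective texture memory with 6:1 S3 Texture Compression (figures from the post)
ps2_texture_ram_mb = 36      # PS2 RAM usable for graphics
gc_texture_ram_mb = 27       # GameCube RAM usable for graphics (excluding 16MB ARAM)
wii_ram_mb = 88              # Wii unified memory pool
s3tc_ratio = 6               # 6:1 S3 Texture Compression on GC/Wii

# Textures worth 24MB uncompressed fit in 24/6 = 4MB on the GC
print(24 / s3tc_ratio)                               # -> 4.0 MB

# Effective texture capacity relative to the PS2
gc_effective = gc_texture_ram_mb * s3tc_ratio        # 162 MB-equivalent
print(gc_effective / ps2_texture_ram_mb)             # -> 4.5, i.e. the "around 4X" claim

wii_effective = wii_ram_mb * s3tc_ratio              # 528 MB-equivalent
print(wii_effective / ps2_texture_ram_mb)            # ~14.7x, same ballpark as the post's "around 12X"

# GDDR3 bandwidth = (bus width in bytes) * effective clock
bus_bits = 128
effective_clock_hz = 700e6                           # the post's conservative "half speed" figure
bandwidth_gb_s = (bus_bits / 8) * effective_clock_hz / 1e9
print(bandwidth_gb_s)                                # -> 11.2 GB/s, matching the "11GB per second" figure
```

At the full 1400MHz effective clock the same formula gives 22.4GB/s, which is where the 22GB/s number comes from.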
Everyone says "The Wii's GPU is just an overclocked GC GPU!"... but wrong, my little droogies. The Wii's GPU is radically different from the one in the Cube, so much so that it has now been split into two chips :shock: If Nintendo had just used a die-shrunk Flipper in the Wii, the chip would have taken up less than 30% of the physical space of one of the chips in the package. Way to waste money on unused chip space, Nintendo!... All sarcasm aside, we can assume that ATi has doubled the pixel pipelines, texture units, TEV units and/or embedded frame/texture buffers. Not to mention that the TEV units have been improved to be much more flexible and programmable, in order to achieve advanced shader effects... of which, BTW, the PS2 had none. The bottom line: the Wii is as far from the PS2 as the Xbox 360 is from the Wii... but if you don't agree with me, you'll ignore everything I said above ;> |
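For anyone wondering what a TEV (Texture EnVironment) unit actually does: each stage performs a fixed color-combine operation, and stages can be chained so one stage's output feeds the next — which is how GC/Wii games faked shader-like effects. Here's a rough, simplified sketch; the function name, the float math, and the bias/scale handling are illustrative only (real hardware works per RGB channel in fixed point through a dedicated API), not the actual hardware interface:

```python
def tev_stage(a, b, c, d, bias=0.0, scale=1.0):
    """One simplified TEV color-combine stage (sketch, not the real hardware):
    out = clamp((d + lerp(a, b, c) + bias) * scale) in [0, 1]."""
    lerp = a * (1.0 - c) + b * c        # blend inputs a and b by factor c
    return max(0.0, min(1.0, (d + lerp + bias) * scale))

# Chaining stages: the output of one stage becomes an input to the next,
# which is how multi-stage "shader" effects were built on GC/Wii hardware.
base = tev_stage(a=0.8, b=0.2, c=0.5, d=0.0)    # blend two texture inputs
lit = tev_stage(a=base, b=1.0, c=0.25, d=0.1)   # mix in a lighting/additive term
print(lit)
```

The key point versus the PS2 is that this per-stage combine logic is configurable and chainable in hardware, whereas the PS2's Graphics Synthesizer had no comparable programmable combiner, so equivalent effects had to be brute-forced with multi-pass rendering on the CPU/VU side.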