windbane said:
fazz said:
Woah there souixan, the GC didn't use DDR3, it used 1T-SRAM or something like that. The Wii is the one using GDDR3 RAM on a 128-bit bus (in a few words: the same as the PS3 and 360).

Also, nobody knows the ACTUAL clock rates of the Wii's CPU, GPU and RAM yet. And we still don't know the actual architecture of the GPU, which could be the same as the GC's, but it could also be the same as the 360's. Mario Galaxy is a good example of graphical achievement on the Wii; it uses many shader effects that I'm certain were not possible on the GC... and it still runs flawlessly at 60 frames per second. So in short, the Wii is far above any console of the past generation.

And no, the Wii doesn't make GC games perform any better.

Graphically, Galaxy pales in comparison to PS3 and 360 games. According to Wikipedia, which cites sources:

Processors:

 

Several developers have compared it to the Xbox. The Wii is not far better than last-gen systems.

I'm not sure why people have to argue this. Just accept the graphics and move on. The games can still be fun.


First of all, you lost any credibility when you went with "according to Wikipedia", which is basically saying "according to some guy who also has a computer". You are far better off citing the sources directly.

More disturbing than people defending the Wii's graphics are those who feel the need to demean them. The real problem is the complete lack of hands-on or practical experience among most of the people making these claims. The usual internet banter about which console has more oomph is nothing new, and it has never gone anywhere.

 

When it boils right down to it, most people don't have a clue what the different numbers cited mean. For example, one of the most common comparisons is to take the Hz of two chips and declare one the victor on that basis alone. If this sort of test made any practical sense, then AMD would be long dead. But MHz and GHz aren't the be-all and end-all of the computing world, and indeed many people have very little knowledge of what these numbers are referring to in the first place. I would actually go so far as to say that, in a practical sense, high clock speeds are becoming the enemy of the engineers of these machines, since higher clock speeds do in fact equate to higher temperatures. This is why they try to get more done in each clock cycle in addition to actually increasing the clock speed.

Just the fact that the Wii was released several years after the Xbox means that its architecture is almost certainly getting more done with every clock cycle than the Xbox's... this is just the way processors, and indeed computers and electronics in general, work. So right out of the gate (no pun intended for those in the know) I can almost guarantee you that 1 Hz (Wii) > 1 Hz (Xbox), and instantly that should tell you that any comparison based purely on Hz is going to be faulty unless you first examine the architectures themselves.
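The clock-versus-work-per-cycle point can be sketched with some toy arithmetic. The numbers below are invented purely for illustration; real per-cycle figures for these consoles' chips were never published:

```python
# Toy model: effective throughput = clock rate x work done per cycle.
# All numbers here are hypothetical, NOT real Wii or Xbox specs.

def effective_throughput(clock_hz, work_per_cycle):
    """Rough 'useful operations per second' estimate for a hypothetical chip."""
    return clock_hz * work_per_cycle

older_chip = effective_throughput(733e6, 1.0)  # slightly higher clock, less work per tick
newer_chip = effective_throughput(729e6, 1.5)  # slightly lower clock, more work per tick

# The lower-clocked chip still does more total work:
print(newer_chip > older_chip)  # True
```

The point is simply that the Hz column alone can rank two chips backwards once per-cycle efficiency differs.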

Now you might ask: OK, how can we modify the comparison to be more accurate? The answer is that, without knowing the specifics of the designs, the only way to tell is with controlled, practical, real-world comparisons, otherwise known as benchmarks. The upside of benchmarks is that they cut through some of the BS and go straight to the end results; the downside is that a poorly designed benchmark can produce completely misleading results.

With that said, we have very little info to go on for the Wii, so benchmarks and practical comparisons, flawed as some may be, are the only option.
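For what it's worth, the basic shape of a benchmark is easy to sketch. The workload here is a made-up stand-in; a real console comparison would have to exercise the actual hardware, not a scripting loop:

```python
import time

def benchmark(fn, repeats=5):
    """Time fn() several times and keep the best run.

    Taking the best of several runs reduces noise from other processes --
    one of the design choices that separates a decent benchmark from a
    misleading one."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

def workload():
    # Hypothetical CPU-bound task; not representative of any real console.
    return sum(i * i for i in range(100_000))

print(f"best of 5: {benchmark(workload):.4f} s")
```

Even this toy shows where benchmarks go wrong: pick an unrepresentative workload, or time a single noisy run, and the "end result" is garbage.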

 



To Each Man, Responsibility