
Forums - Nintendo Discussion - I have a couple of Wii graphics questions.

Wii is about as powerful as an Xbox.  Many developers have commented on that.

Most Wii games can be ported easily to PS2, just like Xbox games were.




OK, to the one above me: the specs of the GameCube were revealed shortly after its release, and it was not less than 600MHz. The official specs said the GCN was in fact 648MHz and the Xbox was 668MHz, not much different except for shaders. All Nintendo has released about the Wii is that it is 2-3 times more powerful than the GameCube, so for power, take roughly 650MHz and multiply. To say the specs have been released for the Wii is false; show me the official document from Nintendo that states what is in the Wii. Otherwise, any attempt to discuss the power of the Wii beyond "2 to 3 times more powerful than the Cube" is irrelevant.



dick cheney loves me, he wants to take me hunting

 

mkwii code- 1977-0565-0049

Woah there souixan, the GC didn't use DDR3, it used 1T-SRAM or something like that. The Wii is the one using GDDR3 RAM on a 128-bit bus width (in a few words, the same as the PS3 and 360).

Also, nobody knows yet the ACTUAL clock rates of the Wii's CPU, GPU and RAM. And we still don't know the actual architecture of the GPU, which could be the same as the GC's, but it could be the same as the 360's too. Mario Galaxy is a good example of graphical achievement on the Wii; it uses many shader effects that I'm certain were not possible on the GC... and it still runs flawlessly at 60 frames per second. So in short, the Wii is far above any console of the past generation.

And no, the Wii doesn't make GC games perform any better.

EDIT

@scorptile: The GameCube's CPU was 485MHz and the Xbox 1's was 733MHz, but the GC's was more powerful because it was 64-bit and the Xbox's was 32-bit (note that the PS2's CPU was 128-bit). BUT the Xbox 1's GPU was faster and more advanced, so it was more powerful than the GC's GPU. And the PS2's GPU was the least advanced and powerful of the three.

The Wii's CPU clock speed is rumored to be 729MHz because that number is printed on the CPU's heatspreader... but my GC's CPU doesn't say its clock speed on its heatspreader, and neither does the PS3's Cell BE, so the 729MHz figure is just a random assumption. Personally I think that having a 90nm processor running at that speed is a waste. Of course it MUST be slower than the 360's and PS3's, but think of notebook computers: they usually run at around 1.6~1.8GHz. So I'd put my money on the Wii's CPU running at over 1GHz but less than 1.5GHz.



fazz said:
Woah there souixan, the GC didn't use DDR3, it used 1T-SRAM or something like that. The Wii is the one using GDDR3 RAM on a 128-bit bus width (in a few words, the same as the PS3 and 360).

Also, nobody knows yet the ACTUAL clock rates of the Wii's CPU, GPU and RAM. And we still don't know the actual architecture of the GPU, which could be the same as the GC's, but it could be the same as the 360's too. Mario Galaxy is a good example of graphical achievement on the Wii; it uses many shader effects that I'm certain were not possible on the GC... and it still runs flawlessly at 60 frames per second. So in short, the Wii is far above any console of the past generation.

And no, the Wii doesn't make GC games perform any better.

Graphically, Galaxy pales in comparison to PS3 and 360 games. According to Wikipedia, which cites sources:

Processors:

 

Several developers have compared it to the Xbox. The Wii is not far better than last-gen systems.

I'm not sure why people have to argue this.  Just accept the graphics and move on.  The games can still be fun. 



Rugger08 said:
@souixan

Do you know the stats for the original Xbox? Were they higher or lower than the GC's?

It was faster, but the original Xbox also wasted a good deal of resources on its OS, if I remember correctly. CPU: 733MHz Coppermine-based Mobile Celeron. GPU: 233MHz "NV2A" ASIC, with 64MB DDR SDRAM at 200MHz shared system memory. I'm not sure about the rest. And yeah, sorry about DDR3; I was talking to someone about a GPU prior to writing that up and must have mixed it up in my head.

Edit: The original Xbox had a similar number of pixel pipelines compared to the GameCube, but the Xbox had 2 texture units per pipeline compared to the GameCube's 1, making it far more capable when constructing textures. Both were capable of the same texture filtering, though.

Edit again: Fun fact, the Wii's GPU clocks slightly faster than the Xbox's, and its CPU slightly slower. The original Xbox was able to output in 480i, 576i, 480p, 720p and 1080i; the Wii only outputs in 480i, 576i and 480p. I wonder why they don't support a higher resolution; it seems like they could.
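The pipelines-times-texture-units point above can be sketched as a back-of-the-envelope texel fill rate. Note the pipeline counts and GPU clocks below are commonly cited figures for Flipper and NV2A, not official published specs:

```python
# Rough peak texel fill rate = pixel pipelines * texture units per pipeline * GPU clock.
# The counts and clocks here are commonly cited figures, not official specs.
def texel_fill_rate(pipelines, tmus_per_pipeline, clock_mhz):
    """Return peak megatexels per second."""
    return pipelines * tmus_per_pipeline * clock_mhz

gamecube = texel_fill_rate(4, 1, 162)  # Flipper: 4 pipelines, 1 TMU each
xbox     = texel_fill_rate(4, 2, 233)  # NV2A: 4 pipelines, 2 TMUs each

print(gamecube)  # 648 megatexels/s
print(xbox)      # 1864 megatexels/s
```

Even with a similar pipeline count, doubling the texture units per pipeline (plus the higher clock) roughly triples the Xbox's peak texturing rate in this simplified model.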

fazz said:
Woah there souixan, the GC didn't use DDR3, it used 1T-SRAM or something like that. The Wii is the one using GDDR3 RAM on a 128-bit bus width (in a few words, the same as the PS3 and 360).

Also, nobody knows yet the ACTUAL clock rates of the Wii's CPU, GPU and RAM. And we still don't know the actual architecture of the GPU, which could be the same as the GC's, but it could be the same as the 360's too. Mario Galaxy is a good example of graphical achievement on the Wii; it uses many shader effects that I'm certain were not possible on the GC... and it still runs flawlessly at 60 frames per second. So in short, the Wii is far above any console of the past generation.

And no, the Wii doesn't make GC games perform any better.

As far as the Wii's GPU goes, yeah, Nintendo hasn't said anything, but you could get specs on the model the Wii is based on through ATI. It's pretty much the same as the GCN's except with more shaders, a higher clock, and a few other differences.

windbane said:
fazz said:
stuff

Graphically, Galaxy pales in comparison to PS3 and 360 games. According to Wikipedia, which cites sources:

Processors:

 

Several developers have compared it to the Xbox. The Wii is not far better than last-gen systems.

I'm not sure why people have to argue this. Just accept the graphics and move on. The games can still be fun.


First of all, Wikipedia is not exactly a good source, and neither are IGN and their "insiders". Who could that be? Ubi? EA? Just consider that some of those guys' games run worse on the PS3 compared to the 360, despite it being a known fact that the PS3 is more powerful than the 360. I wouldn't trust them for anything related to hardware. And as a GC owner who played many GC games, I must say that nothing on the GC compares to Mario Galaxy. And no, I'm not saying it looks better than 360 games, eh!

@souixan: Maybe Nintendo doesn't unlock the HD resolutions because the Wii lacks the clock speed to render them at an acceptable framerate, especially considering the things some 3rd-party developers have done on the Wii (PS2 graphics with a low framerate). On another note, Gran Turismo 4 supported 1080i on the PS2... at what framerate, I don't know.
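The framerate argument can be made concrete with raw pixel counts per frame. This is a deliberate simplification that ignores fill rate, bandwidth, and deinterlacing details:

```python
# Pixels drawn per frame (or per field, for interlaced modes) at common
# output resolutions. For interlaced modes each field draws half the lines.
modes = {
    "480p": 640 * 480,          # 307,200 pixels per frame
    "720p": 1280 * 720,         # 921,600 pixels per frame
    "1080i field": 1920 * 540,  # 1,036,800 pixels per field
}

for name, pixels in modes.items():
    print(f"{name}: {pixels:,} pixels ({pixels / modes['480p']:.2f}x 480p)")
```

720p means pushing 3x the pixels of 480p every frame, which is one plausible reason a console tuned for 480p output would struggle to hold an acceptable framerate at HD resolutions.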



I believe Tourist Trophy (another Polyphony game) also ran in 1080i on the PS2. How well, I do not know.



Hates Nomura.

Tagged: GooseGaws - <--- Has better taste in games than you.

The N64 had great graphics... didn't sell well... PS1 won..

The Gamecube and Xbox had great graphics.... but PS2 won...

The PS3 and Xbox360 have awesome graphics.... but...



Understanding is the key.

 

fazz said:

Woah there souixan, the GC didn't use DDR3, it used 1T-SRAM or something like that. The Wii is the one using GDDR3 RAM on a 128-bit bus width (in a few words, the same as the PS3 and 360).

Also, nobody knows yet the ACTUAL clock rates of the Wii's CPU, GPU and RAM. And we still don't know the actual architecture of the GPU, which could be the same as the GC's, but it could be the same as the 360's too. Mario Galaxy is a good example of graphical achievement on the Wii; it uses many shader effects that I'm certain were not possible on the GC... and it still runs flawlessly at 60 frames per second. So in short, the Wii is far above any console of the past generation.

And no, the Wii doesn't make GC games perform any better.

EDIT

@scorptile: The GameCube's CPU was 485MHz and the Xbox 1's was 733MHz, but the GC's was more powerful because it was 64-bit and the Xbox's was 32-bit (note that the PS2's CPU was 128-bit). BUT the Xbox 1's GPU was faster and more advanced, so it was more powerful than the GC's GPU. And the PS2's GPU was the least advanced and powerful of the three.

The Wii's CPU clock speed is rumored to be 729MHz because that number is printed on the CPU's heatspreader... but my GC's CPU doesn't say its clock speed on its heatspreader, and neither does the PS3's Cell BE, so the 729MHz figure is just a random assumption. Personally I think that having a 90nm processor running at that speed is a waste. Of course it MUST be slower than the 360's and PS3's, but think of notebook computers: they usually run at around 1.6~1.8GHz. So I'd put my money on the Wii's CPU running at over 1GHz but less than 1.5GHz.

So does that make the PS2 faster than the Xbox?

You have made a mistake that I saw Kwaad make: equating the integer width of a processor with power. Let me make this 100% clear: using a 64-bit (or 128-bit, or 256-bit, etc.) processor does not in any way make a machine more powerful. This common misconception came from the old 8-bit/16-bit/32-bit days, when there was a meaningful relationship between integer width and the quality of games that could be produced.

The only advantage a 64-bit machine has over a 32-bit machine is in dealing with numbers that cannot be represented as a 32-bit number (numbers above 2.1 billion for signed arithmetic, or 4.2 billion for unsigned). Obviously, in games, these numbers come up very rarely. Despite many years, we still haven't gone past 32-bit colour, because using 24 bits (8 bits per RGB channel) for colour information is about as good as our eyes can cope with. Sound has gone past 32 bits, but the CPU doesn't need to do arithmetic there.
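The 32-bit limit described above can be sketched directly: a value past the signed 32-bit range wraps around, while a 64-bit integer holds it fine, and 24-bit RGB colour already fits comfortably inside 32 bits.

```python
import ctypes

# 3 billion exceeds the signed 32-bit maximum (~2.1 billion), so it wraps;
# a signed 64-bit integer holds it without trouble.
big = 3_000_000_000
as_signed32 = ctypes.c_int32(big).value  # wraps around to a negative number
as_signed64 = ctypes.c_int64(big).value  # still 3,000,000,000

print(as_signed32)  # -1294967296
print(as_signed64)  # 3000000000

# "32-bit colour" is 24 bits of actual RGB (8 per channel) plus 8 bits of
# alpha/padding, so one 32-bit integer covers a full colour value.
r, g, b = 255, 128, 0
packed = (r << 16) | (g << 8) | b  # fits within 24 bits
print(hex(packed))  # 0xff8000
```

This is the whole practical difference in integer width: range, not speed, which is why it says nothing about which console renders games faster.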

Back in the NES days, the leap from 8 bits to 16 bits and then to 32 bits was massive: you went from a maximum of 256 colours, to 65,536 colours, to 16 million colours. Same with sound. But the leap from 32-bit to 64-bit is very small, and it certainly doesn't make the GameCube more powerful than the Xbox.
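The colour-count jumps above are just powers of two of the bits per pixel:

```python
# Maximum distinct colours representable at common framebuffer depths.
for bits in (8, 16, 24):
    print(f"{bits}-bit colour: {2 ** bits:,} colours")
# 8-bit colour: 256 colours
# 16-bit colour: 65,536 colours
# 24-bit colour: 16,777,216 colours
```

Each doubling of bit depth squares the colour count, which is why the early generational leaps were visually dramatic while later ones hit diminishing returns.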

The GameCube, despite its lower clock speed, is about as powerful as the Xbox because it uses a PowerPC core, which on average can run through more instructions per clock cycle than the x86 core the Xbox used. It has nothing to do with the width of the integer.
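The clock-versus-IPC point can be illustrated with made-up IPC figures (the real per-workload numbers were never published, so these values are purely hypothetical): rough throughput is clock speed times instructions per cycle.

```python
# Illustrative only: the IPC values below are hypothetical, since real
# instructions-per-cycle depends on the workload and was never published.
def throughput_mips(clock_mhz, ipc):
    """Very rough millions of instructions per second."""
    return clock_mhz * ipc

gamecube = throughput_mips(485, 1.5)  # assumed higher-IPC PowerPC core
xbox     = throughput_mips(733, 1.0)  # assumed lower-IPC x86 core

print(gamecube)  # 727.5
print(xbox)      # 733.0
```

Under these assumed IPC figures the two machines land within a percent of each other, matching the post's claim that a slower-clocked core with higher IPC can keep pace with a faster-clocked one.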

 



Help! I'm stuck in a forum signature!