
Wii vs PS3/Xbox 360 [Technically]

sc94597 said:
leo-j said:
sc94597 said:
leo-j said:
sc94597 said:
selnor said:
PS2



The following are the basics of the PS2 specs:

CPU: 128 Bit "Emotion Engine"
System Clock: 300 MHz
System Memory: 32 MB Direct Rambus
Memory Bus Bandwidth: 3.2 GB per second
Co-Processor: FPU (Floating Point Multiply Accumulator x 1, Floating Point Divider x 1)
Vector Units: VU0 and VU1 (Floating Point Multiply Accumulator x 9, Floating Point Divider x 1)
Floating Point Performance: 6.2 GFLOPS
Compressed Image Decoder: MPEG2

Graphics
Clock Frequency: 150 MHz
DRAM Bus Bandwidth: 48 GB per second
DRAM Bus Width: 2560 bits
Pixel Configuration: RGB:Alpha:Z-Buffer (24:8:32)
Maximum Polygon Rate: 75 million polygons per second
3D CG Geometric Transformation: 66 million polygons per second

Audio
Number of voices: ADPCM: 48 channel on SPU2 plus definable by software
Sampling Frequency: 44.1 KHz or 48 KHz (selectable)

I/O
CPU Core: Current PlayStation CPU
Clock Frequency: 33.8 MHz or 37.5 MHz (selectable)
Sub Bus: 32 Bit
Interface Types: IEEE1394, Universal Serial Bus (USB)
Communication: via PC-Card PCMCIA
Disc Media: DVD-ROM (CD-ROM compatible)

- These are all of the Sony PS2 Specs.
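
For anyone wondering where spec-sheet figures like 3.2 GB/s and 48 GB/s come from, they fall straight out of bus width times transfer rate. A minimal back-of-envelope sketch in Python (the two-channel, 800 MT/s layout for the Rambus memory is my assumption; the 2560-bit, 150 MHz figures are from the list above):

```python
# Peak bus bandwidth = bus width in bytes * transfers per second.
def bandwidth_gb_s(bus_bits: int, transfers_per_sec: float) -> float:
    """Peak bandwidth in (decimal) GB/s."""
    return (bus_bits / 8) * transfers_per_sec / 1e9

# Main memory: Direct Rambus, assumed here as two 16-bit channels
# at 400 MHz DDR (800 MT/s); the channel layout is an assumption.
print(bandwidth_gb_s(bus_bits=2 * 16, transfers_per_sec=800e6))  # 3.2

# Graphics Synthesizer eDRAM: 2560-bit bus at 150 MHz, per the specs.
print(bandwidth_gb_s(bus_bits=2560, transfers_per_sec=150e6))    # 48.0
```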
And this proves my point: the difference between the PS2 and the Wii is about the same as the difference between the Wii and the HD consoles.

 

Yes it does, 128MB of RAM = 512MB of RAM, and a 3.5GHz processor = 750MHz, and we can also say the Xbox 360 is just 1.2 times more powerful than the original Xbox.

 


Show me in that post where I say the Wii = HD consoles.


Let me tell you the reality: the Wii is not FAR MORE POWERFUL than the PS2. From what I have seen of the Wii, it's about 3 times more powerful at most, and even that doesn't quite add up.

Sony said the PS3 is almost 30 times more powerful than the PS2, so the difference as you try to paint it in your fanboy dream is fiction.

If anything, the PS3 is 10 times more powerful than the Wii.


Wow, you are judging first-year Wii graphics against sixth-year PS2 graphics. Good job. People who argue "console X had x and console Y has y, so console Y > console X" really need to learn about things called bottlenecks, and about how much of a difference a more efficient processor can make. If you consider these bottlenecks, the Wii is about 6 times more powerful than the PS2. Now, the 360/PS3 are around that much more powerful than the Wii. So yes, the differences are similar, if not exactly the same.


There's not much difference in graphics between SMS and SMG, 'nuff said.





So there isn't a difference between this
(Super Mario Galaxy screenshots at IGN.com)
and this
(Super Mario Sunshine screenshots at IGN.com)?



makingmusic476 said:
fazz said:
makingmusic476 said:
fazz said:
makingmusic476 said:
Guys, the Wii is quite a bit more powerful than the ps2. However, you can't say that the difference between these two:

And let's not forget that GoW2 had much larger environments than SSBB, and could be played in 720p.


Yeah, 720p at a much lower framerate

Also, I have much better examples:

vs.

and

If someone can't see the difference in geometry, shaders and textures, they're really blind.

Now, let's get technical. The PlayStation 2 and the GameCube (we'll start with the GC) had very different architectures that required deep analysis to understand how they compared to each other. The PS2 had to rely on its rather powerful CPU to overcome its primitive GPU. Since the GPU lacked MANY features that the Xbox's and GC's GPUs had, the CPU had to brute-force all of those tasks, and therefore much of its processing power was wasted.

First of all, let's go with RAM. The PS2 had 36MB to work with for graphics, while the GameCube had just 27MB (yes, the other 16MB could only be used as an audio/DVD buffer). If the GC had less RAM, how could it have better textures, models and effects? Answer: the GameCube had 6:1 S3 Texture Compression. This means textures that would take 24MB on the PS2 would take just 4MB on the GC. So, effectively, the GC had around 4X more memory for textures. Now the Wii, which has a Unified Memory Architecture and can use all of its 88MB of RAM for anything the developer wants, would have around 12X more memory available for textures... and I'm not even taking into account that the Wii's memory is much faster than the Cube's; if I did, the difference would be much higher.
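
To make that compression arithmetic concrete, here is a minimal sketch of the effective-texture-memory claim (the pool sizes are the ones quoted in the post; the flat 6:1 ratio is the rate claimed above for S3TC, and the multiplier only applies to texture data):

```python
# Effective texture budget under 6:1 S3TC, per the argument above:
# a compressed texture occupies 1/6 of its raw size, so a given
# pool "feels" six times larger for texture data.
S3TC_RATIO = 6

def effective_texture_mb(pool_mb: float, ratio: float = S3TC_RATIO) -> float:
    return pool_mb * ratio

ps2_pool = 36  # MB usable for graphics on the PS2 (no S3TC), per the post
gc_pool  = 27  # MB usable for graphics on the GameCube
wii_pool = 88  # MB of unified RAM on the Wii

print(effective_texture_mb(gc_pool) / ps2_pool)   # ~4.5x  -> "around 4X"
print(effective_texture_mb(wii_pool) / ps2_pool)  # ~14.7x -> "around 12X"
```

(The numbers land a bit above the post's "4X" and "12X", so if anything the post is rounding down.)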

Oh, now that we're talking about RAM, and seeing that some people love Wikipedia data so much: the Wii uses the very same GDDR3 RAM as the Xbox 360, on (according to your beloved Wikipedia) a 128-bit bus. The Wii's MEM2 (that's the name of the GDDR3 in a game-development environment) can have as much bandwidth as the Xbox 360's. That's 22GB per second. Are we being too generous with that piece of crap called the Wii? OK, let's cut the speed in half, i.e. 700MHz effective. Now we have 11GB per second. That's still over 3X the bandwidth of the Cube.
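
Those 22 GB/s and 11 GB/s figures are just the 128-bit bus multiplied by the effective transfer rate. A quick sketch (the 1.4 GHz effective rate for the 360's GDDR3 is the commonly cited figure; Nintendo never published the Wii's, which is why the post halves it):

```python
# Peak GDDR3 bandwidth = bus width in bytes * effective transfer rate.
def gddr3_bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(gddr3_bandwidth_gb_s(128, 1400))  # ~22.4 GB/s, the Xbox 360 figure
print(gddr3_bandwidth_gb_s(128, 700))   # ~11.2 GB/s, the halved "worst case"
```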

Want more? Everyone says "the Wii's GPU is just an overclocked GC GPU!"... but wrong, my little droogs. The Wii's GPU is radically different from the one in the Cube, so much so that it has now been split into two chips :shock: If Nintendo had just used a process-shrunk Flipper in the Wii, the chip would have taken up less than 30% of the physical space of one of the chips in the package. Way to waste money on unused chip space, Nintendo!... All sarcasm aside, we can assume that ATi has doubled the pixel pipelines, texture units, TEV units and/or embedded frame/texture buffers. Not to mention the TEV units have been improved to make them much more flexible and programmable, to achieve advanced shader effects... which, BTW, the PS2 had none of.
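
The "less than 30% of the space" claim is the usual square-law of process scaling: linear features shrink with the node ratio, so die area shrinks with its square. A rough sketch, assuming ideal scaling from Flipper's 180 nm process to Hollywood's 90 nm (real shrinks do a bit worse than ideal):

```python
# Ideal die-area scaling: area goes with the square of the feature size.
def shrink_area_fraction(old_nm: float, new_nm: float) -> float:
    return (new_nm / old_nm) ** 2

print(shrink_area_fraction(180, 90))  # 0.25 -> a straight Flipper shrink
                                      # lands at ~25% of the old die area
```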


I'm not saying that the Wii isn't more powerful than the PS2. It's much more powerful than the ps2. I even said so in the first line of the post you quoted. What I'm saying is that the difference between the ps3/360 and Wii is greater than the difference between the Wii and ps2.


Explain why, my friend.

And while we're at it, let's post a first-gen PS2 game:

Done by none other than the graphics pioneers at Squaresoft.


You chose that craptastic game to show off the PS2? Yet you use first-party Nintendo titles to show off the Wii?

GT3, released less than a year after the ps2 came out in Japan:

Compared to MK Wii, a Nintendo title, released a year and a half after the Wii came out:

Compared to GT5P, released a year after the ps3 came out in Japan:

That is why, my friend.


I've always seen Sony fans praise Square for their ambition in the graphics department; that's why I chose that one...

And how can we compare the cartooniest racing game EVER with a realistic one? What's there to compare? I can see more detail, a higher polygon count and better texture resolution in the Kart shot than in the GT3 one.

And GT3 was released 13 months after PS2's launch... just saying. 



fazz said:
makingmusic476 said:
[the exchange above, quoted in full; snipped]


First, I don't own a single Square game, and I've never played a single Final Fantasy. *puts up flame shield*

Second, you can notice things like polygon counts and texture resolution in any shot, regardless of art style, and you have done just that.

Third, yes, the MK shot looks better than the GT3 shot. I've said it time and time again: I'm not saying that the Wii isn't more powerful than the PS2. It is. Easily. I'm saying that the gap between the Wii and the PS3/360 is larger than the gap between the PS2 and the Wii. MKWii looks a lot better than GT3 (and GT4), but GT5P looks WAY better than MKWii.

And finally, yeah, I kinda switched April and March in my mind.



The Wii is much more powerful than the PS2, not just more powerful. I wish people could understand that, just as I wish people would understand that the PS3 and Xbox 360 won't have a problem playing Crysis (even if strong PCs might have some advantage with DX10) or even more graphically demanding games.



makingmusic476 said:
[the exchange above, quoted in full; snipped]


Um, realistic games are always meant to push the hardware, while cartoony games aren't. That's what fazz was getting at.



I haven't read this topic, nor do I intend to. I just wanted to give a lol at the original post, which in essence says "I don't know anything about what I'm talking about, but can someone reaffirm my opinion for me?"




fazz said:
makingmusic476 said:
Guys, the Wii is quite a bit more powerful than the ps2. However, you can't say that the difference between these two:





Equates to the difference between these two:





And let's not forget that GoW2 had much larger environments than SSBB, and could be played in 720p.


[fazz's technical reply, quoted in full above; snipped]

The bottom line: the Wii is as far from the PS2 as the Xbox 360 is from the Wii... but if you don't agree with me, you'll just ignore everything I said above ;>


So, for those of you who don't know what he's on about: the GPU in the Wii also looks after the motion-sensing controls, so it does need to have more power.

Hollywood is the name of the Graphics Processing Unit (GPU) used in Nintendo's Wii video game console. It was designed by ATI Technologies and is manufactured using the same 90 nm CMOS process[1] as the "Broadway" processor. Very few official details have been released to the public by Nintendo, ATI, or IBM. Unofficial reports claim that it is an enhanced revision of the 162 MHz Nintendo GameCube LSI "Flipper" and that it is clocked 50% faster at 243 MHz. None of the clock rates have been confirmed by Nintendo, IBM, or ATI. It is not known how many pixel pipelines or shader units Hollywood possesses.[2]

The Hollywood is a Multi-Chip Module package composed of two dies within the cover. One of the two chips, codenamed Napa, controls the I/O functions, RAM access, and the actual GPU with its 3 MB of embedded DRAM (1MB texture cache, 2MB framebuffer), and measures 8 × 9 mm. The other, codenamed Vegas, holds the Audio DSP and the 24 MB of "internal" 1T-SRAM and measures 13.5 × 7 mm. Hollywood has a 128-Bit memory interface between the VRAM and GPU.[3]

The Hollywood also contains an ARM926 core, which has been unofficially nicknamed the Starlet[4]. This embedded microprocessor performs many of the I/O functions, including controlling the wireless functionality, USB, the disc drive, and other miscellaneous functions. It also acts as the security controller of the system, performing encryption and authentication functions. The Hollywood includes hardware implementations of AES and SHA-1 to speed up these functions. Communication with the main CPU is accomplished via an IPC mechanism. The Starlet performs the WiiConnect24 functions while the Wii console is in standby mode.
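
For a sense of what the Starlet's AES and SHA-1 hardware offloads, here is the software equivalent of the two primitives, sketched in Python (hashlib is standard library; the AES half is left as a comment because it needs a third-party library, and every key and data value here is a made-up placeholder):

```python
# What the Starlet's fixed-function crypto does, in software terms:
# SHA-1 for content verification, AES for content decryption.
import hashlib

content = b"some disc/NAND content block"  # placeholder data

# SHA-1 digest, as used to check content against a signed hash.
print(hashlib.sha1(content).hexdigest())

# The AES-CBC decryption half would look roughly like this with a
# third-party library such as pycryptodome (key/iv are dummies):
#   from Crypto.Cipher import AES
#   plain = AES.new(key_16_bytes, AES.MODE_CBC, iv_16_bytes).decrypt(data)
```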

sc94597 said:
makingmusic476 said:
[the exchange above, quoted in full; snipped]

Um, realistic games are always meant to push the hardware, while cartoony games aren't. That's what fazz was getting at.


So you're saying games like SMG don't push the Wii's hardware? Or games like R&C for the PS3?



selnor said:
[selnor's post on the Hollywood GPU, quoted in full above; snipped]

He's talking about those, not the extra ARM processor.