selnor said:
So for those of you who don't know what he's on about: the GPU in the Wii looks after the motion-sensing controls, so it does need to have more power.

Hollywood is the name of the Graphics Processing Unit (GPU) used in Nintendo's Wii video game console. It was designed by ATI Technologies and is manufactured using the same 90 nm CMOS process[1] as the "Broadway" processor. Very few official details have been released to the public by Nintendo, ATI, or IBM. Unofficial reports claim that it is an enhanced revision of the 162 MHz Nintendo GameCube LSI "Flipper" and that it is clocked 50% faster, at 243 MHz. None of the clock rates have been confirmed by Nintendo, IBM, or ATI. It is not known how many pixel pipelines or shader units Hollywood possesses.[2]

The Hollywood is a multi-chip module package composed of two dies within the cover. One of the two chips, codenamed Napa, controls the I/O functions, RAM access, and the actual GPU with its 3 MB of embedded DRAM (1 MB texture cache, 2 MB framebuffer), and measures 8 × 9 mm. The other, codenamed Vegas, holds the audio DSP and the 24 MB of "internal" 1T-SRAM and measures 13.5 × 7 mm. Hollywood has a 128-bit memory interface between the VRAM and GPU.[3]

The Hollywood also contains an ARM926 core, which has been unofficially nicknamed the Starlet.[4] This embedded microprocessor performs many of the I/O functions, including controlling the wireless functionality, USB, the disc drive, and other miscellaneous functions. It also acts as the security controller of the system, performing encryption and authentication functions; the Hollywood includes hardware implementations of AES and SHA-1 to speed up these functions. Communication with the main CPU is accomplished via an IPC mechanism. The Starlet performs the WiiConnect24 functions while the Wii console is in standby mode.
He's talking about those specs, not the extra ARM processor.












