
The second sentence pretty much says it all. Still, everyone has their own opinion on the 'power' of the Wii, so I decided to post this article; I found it very insightful.

http://www.nintendowiifanboy.com/2007/07/24/revolutionary-respectable-specs/ 

Nintendo still hasn't confirmed the technical specs of the Wii hardware in detail, and we wouldn't recommend you hold your breath until they do. They don't want people making assumptions about what the system can or can't do based on arbitrary numbers and jargon. We do know that the Wii is much more than "two Gamecubes taped together." In addition to the revolutionary controllers, we get integrated Wi-Fi, two USB 2.0 ports, 48MB more RAM, internal flash storage, an SD card slot, full-sized DVD disc capacity, and a new operating system and GUI that brings us software like the Mii Channel, Photo Channel, Forecast Channel, and Internet Channel. The Wii Shop and Virtual Console could not have been done on the Gamecube, and with support for component output re-integrated, we can enjoy our old and new games in glorious 480p. That's a pretty long list of upgrades over the Gamecube, all in a smaller, more attractive package.

With GPU, CPU, and memory speeds roughly 50% faster than the Gamecube's, Nintendo didn't make as large a leap beyond its prior console as Sony and Microsoft did, but the design decisions were made to keep production yields high and prices low. That's not to say the hardware can't produce epic games with great graphics.
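That 50% figure lines up with the commonly cited Gamecube clock speeds ("Gekko" at 486MHz, "Flipper" at 162MHz). A quick sanity check in Python (the Gamecube numbers here come from public reporting, not an official Nintendo spec sheet):

```python
# Commonly cited Gamecube clock speeds (MHz); the Wii parts run 1.5x faster.
gc_cpu_mhz = 486   # "Gekko" CPU
gc_gpu_mhz = 162   # "Flipper" GPU

wii_cpu_mhz = gc_cpu_mhz * 1.5
wii_gpu_mhz = gc_gpu_mhz * 1.5

print(f"Broadway:  {wii_cpu_mhz:.0f}MHz")  # matches the 729MHz in the spec list
print(f"Hollywood: {wii_gpu_mhz:.0f}MHz")  # matches the 243MHz in the spec list
```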

729MHz IBM PowerPC-derived CPU, codename "Broadway," manufactured on a 90nm SOI process
243MHz ATi-designed GPU (ATi is now part of AMD), codename "Hollywood," manufactured on a 90nm process
3MB eDRAM and 24MB 1-T SRAM on the GPU package
64MB GDDR3 external "A-RAM"
512MB NAND flash memory
SD card reader for storage expansion
8.5GB DVD9 optical disc drive
802.11b internal Wi-Fi adapter
Bluetooth internal wireless I/O adapter

These days, a 729MHz single-core CPU practically warrants a "pshaw" and a dismissive wave of the hand. Some developers wouldn't get out of bed for less than 2GHz. But (as CPU manufacturers have been forced to admit in recent years) clock speed is not all that matters. Broadway is a RISC design, so by nature it can do more with its roughly 730MHz than the CISC CPU in the original Xbox could. When we first heard the Nintendo 64 would deviate from console tradition by using a RISC processor, we were directed to look at Macs as an example of one such architecture. To put the Wii's power in perspective, I started learning Photoshop in college on a 366MHz PowerPC G3 Mac with only 64MB of RAM. (And we liked it!) Doing my homework on a 550MHz Intel system didn't make filters and effects render any faster. A ~730MHz PowerPC-family chip is no slouch when it comes to general-purpose applications, and apps like the Internet Channel, the Photo Channel, and Everybody Votes can be considered general purpose. The Mac platform was never strongly marketed for its games, but that's not for lack of power. On the contrary, id Software gave the grand unveiling of Quake 3, one of their most popular PC games, on a PowerPC-based Mac. And while the "competition" has since adopted PowerPC-based architectures, they're still taking a more brute-force approach and pricing themselves right out of the hands of eager gamers.

"Hollywood" is a dual-chip package, which at a glance looks similar to the Xbox 360's "Xenos" GPU. However, the two chips that comprise "Xenos" are both dedicated to graphics processing, while in "Hollywood" one chip ("Napa") handles I/O, RAM access, and graphics processing, and the other ("Vegas") integrates the audio DSP with the 24MB pool of 1-T SRAM. It's definitely not in the same class as the "state of the art" GPUs in the PS3 and Xbox 360, but it's not quite the relic some would have you believe. Its TEV (Texture Environment) engine performs the functions we would normally expect of the pixel shaders and texture units in more common GPU architectures. It can work with as many as 8 texture layers per pass (specular maps, diffuse maps, normal maps, etc.), so it's quite possible to have bumpy, shiny graphics at least equivalent to what the Xbox can render. We've seen developers making use of the hardware's EMBM (environment-mapped bump mapping) effect, and Star Fox Adventures already showed us how well the hardware handles fur shader effects.
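To picture what that kind of fixed-function stage cascade does, here's a toy sketch in plain Python (this is not the real GX API; the stage operations and sample values are made up purely for illustration) of combining a few of those per-pass texture layers:

```python
# Toy illustration of a TEV-style stage cascade: each "stage" takes the
# previous stage's output plus one texture sample and applies a simple op,
# all conceptually within a single rendering pass.
def modulate(a, b):
    """Multiply two RGB colors component-wise."""
    return tuple(x * y for x, y in zip(a, b))

def add(a, b):
    """Add two RGB colors component-wise, clamping to 1.0."""
    return tuple(min(1.0, x + y) for x, y in zip(a, b))

# Per-pixel samples from three of the up-to-8 texture layers the hardware
# can read per pass (values invented for this example):
diffuse_map  = (0.8, 0.4, 0.2)   # base color
light_map    = (0.9, 0.9, 1.0)   # baked lighting
specular_map = (0.3, 0.3, 0.3)   # shiny highlight

stage0 = modulate(diffuse_map, light_map)  # tint the base color by lighting
stage1 = add(stage0, specular_map)         # layer the highlight on top
print(stage1)
```

The real hardware does this with dedicated combiner circuitry per pixel, which is why it can stand in for simple pixel shaders.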

Now, it's a little confusing that the 24MB of 1-T SRAM that served as main system memory in the Gamecube has been moved onto the GPU package in the Wii. The configuration has also flipped: the external auxiliary memory now has greater capacity than the internal system memory. That increase, along with an increase in bandwidth, makes the external memory far more versatile. The 16MB pool of A-RAM in the Gamecube was fairly limited in functionality, and commonly thought of as a buffer for audio samples. It seems a strange prioritization to have 16MB for audio and only 3MB for graphics, but there just wasn't enough bandwidth to the A-RAM to use it for more demanding tasks. That problem has been addressed in the Wii, where the A-RAM bandwidth matches that of the system memory.

Both Xbox 360 and PS3 have multiple gigabytes of hard drive space to use as a cache for loading data faster than is possible directly from the optical drive. Wii's A-RAM can be used in the same way. 64MB isn't much, but it's four times what the Gamecube had, and it allows even faster access than the hard drives in those other consoles. What does that mean for games? While maintaining the same quick and/or infrequent load times we had on the Gamecube, we could have larger levels, or levels filled with more objects.
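A rough back-of-the-envelope comparison shows why a 64MB prefetch buffer matters. (Every rate below is my own ballpark assumption for illustration, not a published figure.)

```python
# Ballpark assumptions, not official specs:
disc_mb_per_s = 8.0    # assumed sustained sequential read from the DVD drive
aram_gb_per_s = 3.2    # assumed A-RAM bandwidth
aram_size_mb  = 64

# Time to prefetch a full 64MB of level data from disc into A-RAM
# (done once, e.g. during a loading screen or streamed in the background)...
prefetch_s = aram_size_mb / disc_mb_per_s

# ...versus the time to pull that same data back out of A-RAM on demand:
readback_ms = aram_size_mb / (aram_gb_per_s * 1024) * 1000

print(f"fill A-RAM from disc:    {prefetch_s:.1f} s")
print(f"read it back from A-RAM: {readback_ms:.2f} ms")
```

The orders-of-magnitude gap between the two numbers is the whole point: pay the disc cost once up front, then serve level data at memory speed.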

What about higher resolution textures? Well, not so much ... Yes, we can pull those off the disc into the A-RAM, but when it's time to apply the textures to models and the world, we're still stuck with the same 3MB of eDRAM on the "Hollywood" as we saw in Gamecube's "Flipper." Usually, it's divvied up 1MB for texture caching and 2MB for z-buffering and frame buffers. It doesn't necessarily curse us to textures worse than on the PS2. Although the PS2 has 4MB eDRAM for graphics, it doesn't employ texture compression. Wii and GC's S3TC can make the 1MB of texture cache seem more like 8MB, and with a greater range of colors than we see in PS2 textures.
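The "1MB seems more like 8MB" claim follows from S3TC's fixed block encoding: DXT1 (the variant the Gamecube/Wii compressed-texture format resembles) stores each 4x4 block of pixels in 8 bytes. A quick check of the ratios:

```python
# S3TC/DXT1 packs each 4x4 pixel block into 8 bytes, regardless of content.
pixels_per_block = 4 * 4
dxt1_block_bytes = 8

# Compression ratio versus uncompressed 24-bit RGB (3 bytes/pixel)
# and 32-bit RGBA (4 bytes/pixel):
rgb_ratio  = (pixels_per_block * 3) / dxt1_block_bytes
rgba_ratio = (pixels_per_block * 4) / dxt1_block_bytes

cache_mb = 1
print(f"RGB ratio:  {rgb_ratio:.0f}:1")
print(f"RGBA ratio: {rgba_ratio:.0f}:1")
print(f"1MB of texture cache holds the equivalent of "
      f"~{cache_mb * rgba_ratio:.0f}MB of uncompressed RGBA textures")
```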

We could have lived with the slow CPU and GPU. We could have bought a wi-fi network adapter and wireless Remote controllers. We could have played games spanning multiple 1.5GB mini-DVDs. But what really makes the Wii more than just a "Gamecube 1.5" is the internal flash memory. For the first time, we can upgrade and instantly get new games for our Nintendo systems without leaving the house. The flash memory can store system software updates and new channels that enhance the functionality of the system. We can create levels and trade them through WiiConnect24, or potentially download extra content from the game developers. And it makes the Virtual Console possible.

When we take in the whole picture, it's evident that Nintendo doesn't intend for the world of Wii possibilities to be populated with just PS2 ports and minigames. Epic games with online multiplayer, downloadable content, and improved graphics are all apparent in Nintendo's upgrades to the Gamecube base hardware. "So where are these games?" you cry. "They're comin'!" was Nintendo's reply at E3. You may not have heard it over the sound of them patting themselves on the back, but they did show more of Super Mario Galaxy, gave us release dates for it and Smash Brothers Brawl, and thankfully didn't announce any more delays to Metroid Prime 3.

It's pretty clear in my mind that when Iwata-san was presaging our first impressions of the Wii's graphics rendering aptitude, he was thinking of Super Mario Galaxy. The graphics certainly make us say "WOW," followed by a *thud* and then silence as our jaws hit and remain on the floor. Nearly every surface is coated with sharp detail textures and EMBM effects. The artistic use of rim lighting plays well with the celestial motif and makes Mario look seriously out of this world!


(Image: Bump mapped for your pleasure)

Hope that sheds some more light on the issue.