
Nintendo Wii: Respectable Specs

The second sentence pretty much says it all. However, everyone has their own opinion on the 'power' of the Wii. Therefore, I decided to post this article; I found it very insightful.

http://www.nintendowiifanboy.com/2007/07/24/revolutionary-respectable-specs/ 

Nintendo still hasn't confirmed any of the technical specs of the Wii hardware in detail, and we wouldn't recommend you hold your breath until they do. They don't want people making assumptions about what the system can or can't do based on arbitrary numbers and jargon. We do know that the Wii is much more than "2 Gamecubes taped together." In addition to the revolutionary controllers, we get integrated Wi-Fi, 2 USB 2.0 ports, 48MB more RAM, internal flash storage, an SD card slot, full-sized DVD disc capacity, and a new operating system and GUI that brings us software like the Mii Channel, Photo Channel, Forecast Channel, and Internet Channel. Wii Shop and Virtual Console could not have been done on the Gamecube, and with support for component output re-integrated, we can enjoy our old and new games in glorious 480p. That's a pretty long list of upgrades over the Gamecube, and it's in a smaller, more attractive package.

With 50% faster GPU, CPU, and memory speeds, Nintendo didn't make quite as large a leap beyond their prior console as Sony and Microsoft did, but they made their design decisions to keep production high and prices low. That's not to say the hardware can't produce epic games with great graphics.

729MHz IBM PowerPC-derived CPU, codename "Broadway", manufactured on a 90nm SOI process
243MHz ATi-designed GPU (now AMD), codename "Hollywood", manufactured on a 90nm process
3MB eDRAM and 24MB 1-T SRAM on the GPU
64MB GDDR3 external "A-RAM"
512MB NAND flash memory
SD card reader for storage expansion
8.5GB DVD9 optical disc drive
802.11b/g internal wi-fi adapter
Bluetooth internal wireless I/O adapter
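
For a quick sanity check of that "50% faster" figure, here's the arithmetic against the commonly cited Gamecube clocks (485MHz "Gekko" CPU, 162MHz "Flipper" GPU). Note that those Gamecube numbers are my addition for comparison; they aren't part of the spec list above.

```python
# Sanity check of the "50% faster" claim, comparing the Wii clocks listed
# above to the commonly cited Gamecube clocks (assumed here, not from the list).
gamecube = {"cpu_mhz": 485, "gpu_mhz": 162}   # Gekko CPU, Flipper GPU
wii = {"cpu_mhz": 729, "gpu_mhz": 243}        # Broadway CPU, Hollywood GPU

for part in ("cpu_mhz", "gpu_mhz"):
    ratio = wii[part] / gamecube[part]
    print(f"{part}: {gamecube[part]} -> {wii[part]} ({ratio:.2f}x)")

# cpu_mhz: 485 -> 729 (1.50x)
# gpu_mhz: 162 -> 243 (1.50x)
```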

These days, a 729MHz single-core CPU practically warrants a "pshaw" and a dismissive wave of the hand. Some developers wouldn't get out of bed for less than 2GHz. But (as CPU manufacturers have been forced to admit in recent years) clock speed is not all that matters. The Broadway is a RISC architecture, so by nature it can do more with its roughly 730MHz than the CISC Xbox CPU can. When we first heard the Nintendo 64 would be deviating from console tradition by using a RISC processor, we were directed to look at Macs as an example of one such architecture. To put the Wii's power in perspective, I started learning Photoshop in college on a 366MHz PowerPC G3 Mac, which had only 64MB of RAM. (And we liked it!) Doing my homework on a 550MHz Intel system didn't make filters and effects render any faster. A ~730MHz PowerPC-family chip is no slouch when it comes to general-purpose applications, and apps like the Internet Channel, the Photo Channel, and Everybody Votes can be considered general purpose. The Mac platform was never strongly marketed for its games, but that's not for lack of power. On the contrary, id Software chose a PowerPC-based Mac for the grand unveiling of Quake 3, one of their most popular PC games. And while the "competition" has since adopted PowerPC-based architectures, they're still taking a more brute-force approach and pricing themselves right out of the hands of eager gamers.

"Hollywood" is a dual-chip configuration, which doesn't look dissimilar to the Xbox 360's "Xenos" GPU. However, the 2 chips that comprise the "Xenos" package are both dedicated to graphics processing, while one chip ("Napa") handles I/O, RAM access, and graphics processing, and the other ("Vegas") integrates the audio DSP with the 24MB pool of 1-T SRAM. It's definitely not in the same class as the "state of the art" GPUs in the PS3 and Xbox 360, but it's not quite the relic some would have you believe. Its TEV engine performs the functions we would normally expect of the pixel shaders and texture units in more common GPU architectures. It can work with as many as 8 texture layers per pass (specular maps, diffuse maps, normal maps, etc.), so it's quite possible to have bumpy, shiny graphics at least equivalent to what the Xbox can render. We've seen developers making use of the EMBM effect of the hardware, and Starfox Adventures already showed us how well the hardware can handle fur shader effects.

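To make that "8 texture layers per pass" idea a little more concrete, here's a rough, purely illustrative sketch (not the actual TEV API or its real blend modes) of how a fixed-function combiner stacks layers for a single pixel in one pass:

```python
# Illustrative only: stacking up to 8 texture layers for one pixel in a
# single pass, the way a fixed-function combiner chains its stages.
def combine_layers(base, layers):
    """base: (r, g, b) in 0..1; layers: list of (sample, mode) pairs, max 8."""
    color = base
    for sample, mode in layers[:8]:          # at most 8 layers per pass
        if mode == "modulate":               # e.g. diffuse map * lightmap
            color = tuple(c * s for c, s in zip(color, sample))
        elif mode == "add":                  # e.g. specular highlight layer
            color = tuple(min(c + s, 1.0) for c, s in zip(color, sample))
    return color

# Example: a diffuse texture, a baked lightmap, then an additive specular map.
pixel = combine_layers((1.0, 1.0, 1.0),
                       [((0.8, 0.6, 0.4), "modulate"),
                        ((0.5, 0.5, 0.5), "modulate"),
                        ((0.2, 0.2, 0.2), "add")])
print(pixel)   # roughly (0.6, 0.5, 0.4)
```
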
Now, it's a little confusing that the 24MB of 1-T SRAM that was the main system memory for the Gamecube has been moved onto the GPU package in the Wii. We also have a flip-flop in the configuration, where the external auxiliary memory is now greater in capacity than the internal system memory. That increase, along with an increase in bandwidth, gives the external memory more versatility. The Gamecube's 16MB pool of auxiliary A-RAM was fairly limited in functionality, and commonly thought of as a buffer for audio samples. It's a strange prioritization having 16MB for audio and 3MB for graphics, but there just wasn't enough bandwidth in the A-RAM to use it for much more demanding tasks. That problem has been addressed in the Wii, where the A-RAM bandwidth matches that of the system memory.

Both the Xbox 360 and PS3 have multiple gigabytes of hard drive space to use as a cache for loading data faster than is possible directly from the optical drive. The Wii's A-RAM can be used in the same way. 64MB isn't much, but it's four times what the Gamecube had, and it allows even faster access than the hard drives in those other consoles. What does that mean for games? While maintaining the same quick and/or infrequent load times we had on the Gamecube, we could have larger levels, or levels that are filled with more objects.
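
As a sketch of how a game might use that, think of the 64MB pool as a prefetch cache sitting between the disc and main memory. All names and sizes here are hypothetical, not anything from a real SDK:

```python
# Hypothetical sketch: using the 64MB auxiliary pool as a prefetch cache
# between the optical disc and main memory. Names and sizes are made up.
ARAM_BYTES = 64 * 1024 * 1024

assets = {"area2_textures": 12 * 1024 * 1024,   # asset name -> size in bytes
          "area2_geometry": 6 * 1024 * 1024}

aram_cache = {}   # asset name -> bytes already staged in A-RAM
aram_used = 0

def read_from_disc(name):
    """Stand-in for a slow optical-drive read."""
    return b"\x00" * assets[name]

def prefetch(name):
    """Stage an asset into A-RAM ahead of time, if it fits."""
    global aram_used
    if name not in aram_cache and aram_used + assets[name] <= ARAM_BYTES:
        aram_cache[name] = read_from_disc(name)   # slow path, done early
        aram_used += assets[name]

def load(name):
    """Fast path if the asset was prefetched; otherwise fall back to the disc."""
    return aram_cache[name] if name in aram_cache else read_from_disc(name)

for name in assets:            # stage the next area while the player is busy
    prefetch(name)
data = load("area2_textures")  # served from A-RAM, no disc seek at load time
```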

What about higher-resolution textures? Well, not so much ... Yes, we can pull those off the disc into the A-RAM, but when it's time to apply the textures to models and the world, we're still stuck with the same 3MB of eDRAM on the "Hollywood" as we saw in the Gamecube's "Flipper." Usually, it's divvied up as 1MB for texture caching and 2MB for z-buffering and frame buffers. That doesn't necessarily condemn us to textures worse than on the PS2. Although the PS2 has 4MB of eDRAM for graphics, it doesn't employ texture compression. The Wii and GC's S3TC support can make the 1MB of texture cache seem more like 8MB, and with a greater range of colors than we see in PS2 textures.
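
Where does that "more like 8MB" figure come from? S3TC-style compression (DXT1, or CMPR on GC/Wii) stores a 4x4 block of texels in 8 bytes, i.e. 4 bits per texel versus 32 bits for uncompressed RGBA8, which works out to 8:1. (Against 24-bit RGB it would be closer to 6:1, so treat the exact multiplier as a rough estimate.)

```python
# Back-of-the-envelope: S3TC/DXT1-style blocks pack a 4x4 texel block into
# 8 bytes, i.e. 4 bits per texel, versus 32 bits per texel for RGBA8.
bits_per_texel_rgba8 = 32
bits_per_texel_s3tc = (8 * 8) / 16            # 64 bits / 16 texels = 4 bits
ratio = bits_per_texel_rgba8 / bits_per_texel_s3tc

texture_cache_mb = 1
print(f"compression ratio: {ratio:.0f}:1")                      # 8:1
print(f"effective cache:   ~{texture_cache_mb * ratio:.0f}MB")  # ~8MB of RGBA8
```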

We could have lived with the slow CPU and GPU. We could have bought a wi-fi network adapter and wireless Remote controllers. We could have played games spanning multiple 1.5GB mini-DVDs. But what really makes the Wii more than just a "Gamecube 1.5" is the internal flash memory. For the first time, we can upgrade and instantly get new games for our Nintendo systems without leaving the house. The flash memory can store system software updates and new channels that enhance the functionality of the system. We can create levels and trade them through WiiConnect24, or potentially download extra content from the game developers. And it makes the Virtual Console possible.

When we take in the whole picture, it is evident that Nintendo doesn't intend for the world of Wii possibilities to be populated with just PS2 ports and minigames. Epic games with online multiplayer, downloadable content, and improved graphics are all apparent in Nintendo's upgrades to the Gamecube base hardware. "So where are these games?" you cry. "They're comin'!" was Nintendo's reply at E3. You may not have heard it over the sound of them patting themselves on the back, but they did show more of Super Mario Galaxy and gave us release dates for it and Smash Brothers Brawl, and thankfully didn't announce any more delays to Metroid Prime 3.

It's pretty clear in my mind that when Iwata-san was presaging our first impressions of the Wii's graphics rendering aptitude, he was thinking of Super Mario Galaxy. The graphics certainly make us say "WOW," followed by a *thud* and then silence as our jaws hit and remain on the floor. Nearly every surface is coated with sharp detail textures and EMBM effects. The artistic use of rim lighting plays well with the celestial motif and makes Mario look seriously out of this world!


Bump mapped for your pleasure

 

Hope that sheds some more light on the issue.

 




Doesn't the new PSP have 64MB of RAM? RAM doesn't matter for graphics though; all it is is loading speed, and it makes the internet go somewhat faster the more RAM you have. The Wii is basically a great console, spec-wise.



 

leo-j said:
Doesn't the new PSP have 64MB of RAM? RAM doesn't matter for graphics though; all it is is loading speed, and it makes the internet go somewhat faster the more RAM you have. The Wii is basically a great console, spec-wise.

 

You really have no idea how computers work, do you?

 



leo-j said:
Doesn't the new PSP have 64MB of RAM? RAM doesn't matter for graphics though; all it is is loading speed.
Did you even read the article? Or did you just judge the specs, as usual? It says that it is all about loading speed...

 



Great post, a very interesting read about the Wii and its potential.




Ditto, a must read for those who consistently belittle the Wii's potential. Haters better rekanize.



"Whenever you find a man who says he doesn't believe in a real Right and Wrong, you will find the same man going back on this a moment later."   -C.S. Lewis

"We all make choices... but in the end, our choices... make us."   -Andrew Ryan, Bioshock

Prediction: Wii passes 360 in US between July - September 2008. (Wii supply will be the issue to watch, and barring any freak incidents between now and then as well.) - 6/5/08; Wow, came true even earlier. Wii is a monster.

All I got out of that was you say everything is 2x faster, including the memory, and due to the way power scales, that means it can render a 50% better image than the GameCube.
The graphics chip on the GameCube was on par with a GeForce2. Double the power and you have a GeForce3/4. That makes the Wii's power on par with that of a 4 or 5 year old computer. Both GPU and CPU. Look at the PC games that came out around that time. UT2003(came out in 2002) is a good example. If you look at that, and add in the modern pixel shaders that are used so heavily today, you have what a Wii game will look like. (And actually, if you think about it, it's the truth)



PSN ID: Kwaad


I fly this flag in victory!

@Plague of Locust 

Very true, though you could still draw negative conclusions from this.... or almost anything for that matter.



There was another writeup I saw a while back about the particular differences between the Wii's CPU and GPU and the GC's. I wish I could find it, but it mentioned about 50 extra processor instructions per chip, which added all the shader and anti-aliasing capabilities of the other two juggernauts.



Witty signature here...

Wii: 14 million by January  I sold myself short

360: 13 million by January I sold microsoft short, but not as bad as Nintendo.

PS3: 6 million by January. If it approaches 8 mil i'll eat crow  Mnn Crow is yummy.

With these results, I've determined that I suck at long term predictions, and will not long term predict anything ever again. Thus spaketh Crono.

Kwaad said:
All I got out of that was you say everything is 2x faster, including the memory, and due to the way power scales, that means it can render a 50% better image than the GameCube.
The graphics chip on the GameCube was on par with a GeForce2. Double the power and you have a GeForce3/4. That makes the Wii's power on par with that of a 4 or 5 year old computer. Both GPU and CPU. Look at the PC games that came out around that time. UT2003(came out in 2002) is a good example. If you look at that, and add in the modern pixel shaders that are used so heavily today, you have what a Wii game will look like. (And actually, if you think about it, it's the truth)

Kwaad, why do you even pretend to know what you're talking about?

ArtX was a company founded by former SGI employees, and it was the company Nintendo contracted to design the Gamecube's GPU. The Flipper was largely based on a GPU ArtX designed and sold to the US military for flight simulators, and it bears very little similarity to any nVidia GPU; there is a reason the military was willing to spend $10,000+ per GPU on those flight simulators rather than buy an off-the-shelf graphics card like the GeForce 2. ATI bought ArtX and integrated a large portion of ArtX's technology into its main GPU line, which is one of the reasons the Radeon 9800 series was so successful for ATI.