
WiiU Confirmed specs from SDK and GPU info: DirectX 11 features!

Tuesday, July 03, 2012

Wii U - Confirmed Specs from SDK & GPU Info

 


I did not post this back when the Wii U specs were leaked because they were still tagged as rumor at the time. Since then, however, a known developer/programmer with SDK access has confirmed that the spec sheet from the Wii U development kit was copied, pasted, and leaked to the Internet one day before E3, and that it genuinely comes from Nintendo. Keep in mind that these specs are based on earlier dev kits from late last year and are still missing some information from Nintendo (such as the CPU clock frequencies and probably some unique features of the GPU), but they are no longer speculation or rumor. The most recent dev kit for the Wii U is undoubtedly even more powerful than these specs show.


Main Application Processor

PowerPC architecture.
Three cores (fully coherent).
3MB aggregate L2 Cache size.
core 0: 512 KB
core 1: 2048 KB
core 2: 512 KB
Write gatherer per core.
Locked (L1d) cache DMA per core.

Main Memory

Up to 3GB of main memory (CAT-DEVs only). Note: the retail machine will have half the devkit memory.
Please note that the quantity of memory available from the Cafe SDK and Operating System may vary.

Graphics and Video

Modern unified shader architecture.
32MB high-bandwidth eDRAM, supports 720p 4x MSAA or 1080p rendering in a single pass (a rough size check follows this list).
HDMI and component video outputs.
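
As a quick aside (this is my own back-of-the-envelope math, not part of the leaked sheet): assuming 4 bytes of RGBA color and 4 bytes of depth/stencil per sample, which the sheet does not actually state, both of the quoted cases do fit inside 32MB:

    #include <cstdio>

    // Rough framebuffer sizing under assumed formats: RGBA8 color (4 bytes)
    // plus a 4-byte depth/stencil buffer, both stored per sample.
    int main() {
        const double MiB = 1024.0 * 1024.0;
        double p720_4x  = 1280.0 * 720  * 4 /* samples */ * (4 + 4);
        double p1080_1x = 1920.0 * 1080 * 1               * (4 + 4);
        std::printf("720p 4x MSAA: %.1f MiB\n", p720_4x  / MiB); // ~28.1 MiB
        std::printf("1080p 1x:     %.1f MiB\n", p1080_1x / MiB); // ~15.8 MiB
    }

At roughly 28.1 MiB and 15.8 MiB respectively, both renderings fit in the 32MB of eDRAM without touching main memory, which is presumably why the sheet calls out exactly these two cases.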

Features

Unified shader architecture executes vertex, geometry, and pixel shaders
Multi-sample anti-aliasing (2, 4, or 8 samples per pixel)
Read from multi-sample surfaces in the shader
128-bit floating point HDR texture filtering
High resolution texture support (up to 8192 x 8192)
Indexed cube map arrays

8 render targets
Independent blend modes per render target
Pixel coverage sample masking
Hierarchical Z/stencil buffer
Early Z test and Fast Z Clear
Lossless Z & stencil compression
2x/4x/8x/16x high quality adaptive anisotropic filtering modes
sRGB filtering (gamma/degamma)
Tessellation unit
Stream out support
Compute shader support

GX2 is a 3D graphics API for the Nintendo Wii U system (also known as Cafe). The API is designed to be as efficient as GX(1) from the Nintendo GameCube and Wii systems. Current features are modeled after OpenGL and the AMD r7xx series of graphics processors. Wii U’s graphics processor is referred to as GPU7.
Sound and Audio

Dedicated 120MHz audio DSP.
Support for 6 channel discrete uncompressed audio (via HDMI).
2 channel audio for the Cafe DRC controller.
Monaural audio for the Cafe Remote controller.

Networking

802.11 b/g/n Wi-Fi.

Peripherals

2 USB 2.0 host controllers with 2 ports each.
SDCard Slot.

Built-in Storage

512MB SLC NAND for System.
8GB MLC NAND for Applications.

Host PC Bridge

Dedicated Cafe-to-host PC bridge hardware.
Allows File System emulation by host PC.
Provides interface for debugger and logging to host PC.

So what do all these numbers and words mean? The parts that stand out to me are "Compute shader support" and "Tessellation unit", both features that are very prominent in DirectX 11 graphics cards. Compute shading is most likely what the GPUs in the PS4 and the next Xbox will use to take strain off the CPU without hampering the performance of the GPU (I realize this can be referred to as GPGPU, but since all modern PC GPUs have this functionality, it's not really necessary to call it that). Tessellation, meanwhile, was possible on the Xbox 360/PS3 (at a high performance cost) but has been made much more efficient in DirectX 11-class hardware. These features alone would put the Wii U's custom "GPU7" far ahead of what was possible on the Xbox 360 and PS3.
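
To make the tessellation point concrete, here is a rough CPU-side sketch of what a tessellation unit does (my own illustration in plain C++, not Nintendo's or AMD's actual implementation): it amplifies one input triangle into a grid of smaller ones on the GPU, so the extra detail never has to be generated by the CPU or sent over the bus:

    #include <cstdio>
    #include <vector>

    struct Vec3 { float x, y, z; };

    // Uniform tessellation of one triangle: factor n turns 3 input vertices
    // into (n+1)(n+2)/2 vertices and n*n triangles. Each barycentric sample
    // is evaluated to a position (plain interpolation here; real shaders
    // would add displacement mapping on top).
    std::vector<Vec3> tessellate(Vec3 a, Vec3 b, Vec3 c, int n) {
        std::vector<Vec3> out;
        for (int i = 0; i <= n; ++i) {
            for (int j = 0; j <= n - i; ++j) {
                float u = float(i) / n, v = float(j) / n, w = 1.0f - u - v;
                out.push_back({u * a.x + v * b.x + w * c.x,
                               u * a.y + v * b.y + w * c.y,
                               u * a.z + v * b.z + w * c.z});
            }
        }
        return out;
    }

    int main() {
        const Vec3 a{0, 0, 0}, b{1, 0, 0}, c{0, 1, 0};
        const int factors[] = {1, 4, 16};
        for (int n : factors) {
            std::printf("factor %2d: %3zu vertices, %3d triangles\n",
                        n, tessellate(a, b, c, n).size(), n * n);
        }
    }

The point is the amplification: the CPU submits 3 vertices, and at factor 16 the GPU turns them into 256 triangles on its own. On the Xbox 360/PS3 generation, getting that kind of density generally meant burning CPU or vertex shader time, which is the "high performance cost" mentioned above.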
Here is a good explanation of the benefits that compute shading brings (a small code sketch of the idea follows the quote):
Compute Shaders are programs that are executed on the graphics processor. With DirectX 11 and DirectCompute, developers are able to use the massive parallel processing power of modern GPUs to accelerate a much wider range of applications that were previously only executable on CPUs. Compute Shaders can be used to enable new graphical techniques to enhance image quality (such as order independent transparency, ray tracing, and advanced post-processing effects), or to accelerate a wide variety of non-graphics applications (such as video transcoding, video upscaling, game physics simulation, and artificial intelligence). In games, Compute Shader support effectively enables more scene details and realism:

  • Optimized post-processing effects – apply advanced lighting techniques to enhance the mood in a scene
  • High quality shadow filtering – no more hard edges on a shadow, see shadows the way you would in real life
  • Depth of field – use the power of the GPU to have more realistic transitions of focal points – imagine looking through a gun sight or a camera lens
  • High Definition Ambient occlusion – incredibly realistic lighting and shadow combinations

Link
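
To ground that in something concrete, here is a minimal sketch (plain C++ on the CPU; the names are mine, and this is not DirectCompute or GX2 code) of the kind of per-pixel pass a compute shader performs. On the GPU the two loops disappear: the dispatch launches one thread per pixel and they all run in parallel, which is the "massive parallel processing power" the quote refers to:

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Desaturate an RGBA8 image -- a stand-in for any post-processing pass.
    // Every iteration is independent of the others, so a compute shader maps
    // each one to its own GPU thread instead of running them serially here.
    void desaturate(std::vector<uint32_t>& rgba, int width, int height) {
        for (int y = 0; y < height; ++y) {       // ~ rows of threadgroups
            for (int x = 0; x < width; ++x) {    // ~ one GPU thread per pixel
                uint32_t p = rgba[y * width + x];
                uint32_t r = p & 0xFF, g = (p >> 8) & 0xFF, b = (p >> 16) & 0xFF;
                uint32_t l = (r * 77 + g * 150 + b * 29) >> 8; // Rec. 601 luma
                rgba[y * width + x] = (p & 0xFF000000u) | (l << 16) | (l << 8) | l;
            }
        }
    }

    int main() {
        std::vector<uint32_t> image(1280 * 720, 0xFF2040C0u); // flat test color
        desaturate(image, 1280, 720);
        std::printf("first pixel after the pass: 0x%08X\n", (unsigned)image[0]);
    }

A 720p frame is about 920,000 of these independent per-pixel jobs per effect per frame, which is why offloading them to hundreds of stream processors frees up so much CPU time.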

When you put all this information together, it becomes easy to understand why ports of 360/PS3 games will not benefit much in the graphics department on the Wii U unless developers substantially rewrite their code. Compute shading and tessellation, for example, are features you will not see in most (if any) 360/PS3 ports to the Wii U. On top of that, developers making those ports would follow the same rules as on the 360/PS3, making the CPU do work that could have been done by the Wii U's GPU and its newer technology. This will be the Wii U's version of a "lazy port", with its extra features not even being used.

The good news is that the GPU in the Wii U should be able to accomplish all the effects of the PS4 and next Xbox, with those systems most likely being able to do more of it, and faster. However, with all the next-gen systems sharing similar technology and features, it's doubtful the Wii U will get left behind in the graphics race: the difference should not be noticeable enough that anyone would buy another console solely because the Wii U cannot run a certain game, as happened with the original Wii. It's encouraging that we (I say we, but I mean the industry, since you could argue the Wii had the best games this gen) will finally get back to comparing who has the best games, not the best graphics, in the coming generation.

 

Source: http://www.nintengen.com/2012/07/wii-u-confirmed-specs-from-sdk-gpu-info.html

 

Edit: Not sure if this info has been posted as confirmed before, but according to this, if a game is built from the ground up for Wii U there will be a noticeable leap in graphics from the last generation.




Actually it says nothing about the number of stream processors, texture mapping units, or render output units, so I don't think we can judge how good games can look on the Wii U. Bear in mind that modern GPUs span the entire spectrum, from integrated motherboard graphics to one-kilowatt monsters. Anyway, extrapolating from miniaturization, power consumption, and the size of the Wii U's box, it could in theory be something like a slightly gimped 4830, if Nintendo isn't willing to stray too far from the Wii's power consumption. Say 480 stream processors at 500 MHz, and that's already twice as fast as the PS360 GPUs. The thing is, twice as fast isn't saying much for raw graphical improvement (what you see on screen) as opposed to more frames, anti-aliasing, etc.
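
For what it's worth, those numbers check out on the back of an envelope, assuming the usual 2 FLOPs (one multiply-add) per stream processor per clock and taking 240 GFLOPS as the commonly quoted figure for the 360's Xenos; the 480 SPs and 500 MHz are haxxiy's guesses, not confirmed specs:

    #include <cstdio>

    // Guessed inputs: 480 stream processors at 500 MHz, 2 FLOPs per SP per
    // clock (multiply-add), against the commonly quoted 240 GFLOPS for Xenos.
    int main() {
        const double sps = 480.0, mhz = 500.0, flops_per_clock = 2.0;
        const double gflops = sps * flops_per_clock * mhz / 1000.0;
        std::printf("hypothetical GPU7: %.0f GFLOPS (%.1fx Xenos' 240)\n",
                    gflops, gflops / 240.0);
    }

That yields 480 GFLOPS, exactly 2.0x Xenos, which is where the "twice as fast" figure comes from.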




A lot of that info is pretty generic; it could cover a massive range of graphics cards. We need to know processor speeds and RAM speeds.

32MB of VRAM suggests a unified memory system like the 360's, but with enough dedicated graphics memory for the final frame buffer.

Take DirectX11 out of your title though; it will 100% not use that, as MS will not have written it for their console. Plus, it clearly states that the graphics API it will be using is called GX2.



haxxiy said:
Actually it says nothing about the number of stream processors, texture mapping units, or render output units, so I don't think we can judge how good games can look on the Wii U. Bear in mind that modern GPUs span the entire spectrum, from integrated motherboard graphics to one-kilowatt monsters. Anyway, extrapolating from miniaturization, power consumption, and the size of the Wii U's box, it could in theory be something like a slightly gimped 4830, if Nintendo isn't willing to stray too far from the Wii's power consumption. Say 480 stream processors at 500 MHz, and that's already twice as fast as the PS360 GPUs. The thing is, twice as fast isn't saying much for raw graphical improvement (what you see on screen) as opposed to more frames, anti-aliasing, etc.


I thought all theories pointed to it being derived from the 4850 or 4870? It certainly couldn't be on the lower end if it can handle DirectX 11-level features.



Even though it still leaves lots of questions unanswered, I found this post very informative.



lilbroex said:
haxxiy said:
Actually it says nothing about the number of stream processors, texture mapping units, or render output units, so I don't think we can judge how good games can look on the Wii U. Bear in mind that modern GPUs span the entire spectrum, from integrated motherboard graphics to one-kilowatt monsters. Anyway, extrapolating from miniaturization, power consumption, and the size of the Wii U's box, it could in theory be something like a slightly gimped 4830, if Nintendo isn't willing to stray too far from the Wii's power consumption. Say 480 stream processors at 500 MHz, and that's already twice as fast as the PS360 GPUs. The thing is, twice as fast isn't saying much for raw graphical improvement (what you see on screen) as opposed to more frames, anti-aliasing, etc.


I thought all theories pointed to it being derived from the 4850 or 4870? It certainly couldn't be on the lower end if it can handle DirectX 11-level features.

The GPU of the Wii U has always been rumored to come from an R7xx chip, which means anything from the HD46xx to the HD48xx. It's just that people thought it would be more like the 48xx series.

OT: I remember these specs, but we still don't know the clocks of the CPU and GPU, nor the type and speed of the system memory.

And didn't AMD completely change the tessellation unit in the HD5xxx series because the older tessellator wasn't compatible with DX11?




JEMC said:
lilbroex said:
haxxiy said:
Actually it says nothing about the number of stream processors, texture mapping units, or render output units, so I don't think we can judge how good games can look on the Wii U. Bear in mind that modern GPUs span the entire spectrum, from integrated motherboard graphics to one-kilowatt monsters. Anyway, extrapolating from miniaturization, power consumption, and the size of the Wii U's box, it could in theory be something like a slightly gimped 4830, if Nintendo isn't willing to stray too far from the Wii's power consumption. Say 480 stream processors at 500 MHz, and that's already twice as fast as the PS360 GPUs. The thing is, twice as fast isn't saying much for raw graphical improvement (what you see on screen) as opposed to more frames, anti-aliasing, etc.


I thought all theories pointed to it being derived from the 4850 or 4870? It certainly couldn't be on the lower end if it can handle DirectX 11-level features.

The GPU of the Wii U has always been rumored to come from an R7xx chip, which means anything from the HD46xx to the HD48xx. It's just that people thought it would be more like the 48xx series.

OT: I remember these specs, but we still don't know the clocks of the CPU and GPU, nor the type and speed of the system memory.

And didn't AMD completely change the tessellation unit in the HD5xxx series because the older tessellator wasn't compatible with DX11?

They did; there is the Xenos/R600/R700 type of tessellation and the DX11 type, apparently.




haxxiy said:
JEMC said:

And didn't AMD completely change the tessellation unit in the HD5xxx series because the older tessellator wasn't compatible with DX11?

They did; there is the Xenos/R600/R700 type of tessellation and the DX11 type, apparently.

In that case, I hope one of the optimizations made to the chip was swapping the old tessellator for a new one, although I'm not sure whether being DX11-compatible makes a big difference for a console.




Xenostar said:
A lot of that info is pretty generic; it could cover a massive range of graphics cards. We need to know processor speeds and RAM speeds.

32MB of VRAM suggests a unified memory system like the 360's, but with enough dedicated graphics memory for the final frame buffer.

Take DirectX11 out of your title though; it will 100% not use that, as MS will not have written it for their console. Plus, it clearly states that the graphics API it will be using is called GX2.


Actually it's 32 MB of eDRAM, which is way bigger than the 360's 10 MB of eDRAM, and the processor has 3 MB of cache compared to the 360's 1 MB and the PS3's 512 KB.



kyu23 said:
Xenostar said:
A lot of that info is pretty generic; it could cover a massive range of graphics cards. We need to know processor speeds and RAM speeds.

32MB of VRAM suggests a unified memory system like the 360's, but with enough dedicated graphics memory for the final frame buffer.

Take DirectX11 out of your title though; it will 100% not use that, as MS will not have written it for their console. Plus, it clearly states that the graphics API it will be using is called GX2.


Actually it's 32 MB of eDRAM, which is way bigger than the 360's 10 MB of eDRAM, and the processor has 3 MB of cache compared to the 360's 1 MB and the PS3's 512 KB.

The PS3 also has 256 KB of local store on each SPE, bringing its total to 2304 KB (7 active SPEs × 256 KB plus the PPE's 512 KB of L2). As for the eDRAM, its bandwidth and speed remain to be seen, since the sheet only explicitly mentions "free" 4x AA @ 720p, which isn't too much better than what the X360 does (4x AA @ 640p, I believe? The reason Halo 3 wasn't exactly HD).