
Forums - Nintendo Discussion - WiiU Confirmed specs from SDK and GPU info: DirectX 11 features!

haxxiy said:
kyu23 said:
Xenostar said:
A lot of that info is pretty generic; it could cover a massive range of graphics cards. We need to know processor speeds and RAM speeds.

32 MB of VRAM suggests a unified memory system like the 360's, but with enough dedicated graphics memory for the final frame buffer.

Take DirectX 11 out of your title though; it will 100% not use that. MS will not have written it for their console, plus it clearly states the graphics API it will be using is called GX2.


Actually it's 32 MB of eDRAM, which is way bigger than the 360's 10 MB of eDRAM, and the processor has 3 MB of cache compared to the 360's 1 MB and the PS3's 512 KB.

The PS3 has 256 KB of L2 cache on each SPE too, bringing its total to 2304 KB. As for the eDRAM, its bandwidth and memory speed remain to be seen, since a "free" 4xAA @ 720p is explicitly mentioned, which isn't too much better than what the X360 does (4xAA @ 640p, I believe? The reason Halo 3 wasn't exactly HD).

Well, we don't know the speed of the Wii U's eDRAM, true. But in terms of cache, it's confirmed by IBM that the Wii U processor is based on POWER7 tech, which means it has 3 MB of cache per core, since on that architecture all cache levels are per core. And I think that cache is important, because it's like "pre"-processing of the upcoming stuff that will run in the console/game.
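For what it's worth, the cache totals being thrown around do add up, using the posters' figures (not official specs). A quick sketch:

```python
# Back-of-the-envelope cache totals in KB, using the figures quoted
# in the posts above (posters' numbers, not official specs).

# PS3 Cell: 512 KB PPE L2, plus 256 KB on each of the 7 active SPEs
ps3_total = 512 + 7 * 256          # = 2304 KB, matching the post

# Xbox 360 Xenon: 1 MB L2 shared across 3 cores
x360_total = 1024

# Wii U per the quoted SDK figure: 3 MB aggregate
wiiu_total = 3 * 1024

print(ps3_total, x360_total, wiiu_total)   # 2304 1024 3072
```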



kyu23 said:
Well, we don't know the speed of the Wii U's eDRAM, true. But in terms of cache, it's confirmed by IBM that the Wii U processor is based on POWER7 tech, which means it has 3 MB of cache per core, since on that architecture all cache levels are per core. And I think that cache is important, because it's like "pre"-processing of the upcoming stuff that will run in the console/game.


Yeah, very true, but the specs above mentioned 3 MB of aggregate cache and even gave the breakdown per core. Usually consoles use custom parts, like the RSX inside the PS3: where the GTX 7900 had 16 vertex processors, it had only 8. So I'm inclined to believe that's what we are going to get, instead of 9 MB total.
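One way to sanity-check the "free 4xAA @ 720p" claim from earlier in the thread is to see whether a 4xMSAA 720p framebuffer actually fits in 32 MB of eDRAM but not in the 360's 10 MB. A rough sketch, assuming 32-bit color plus 32-bit depth/stencil per sample (a common layout; the real format is not in the leaked specs):

```python
# Rough framebuffer sizing: does 4xMSAA at 720p fit in eDRAM?
# Assumes 32-bit color + 32-bit depth/stencil per sample.

width, height, samples = 1280, 720, 4
bytes_per_sample = 4 + 4                  # color + depth/stencil

framebuffer = width * height * samples * bytes_per_sample
mb = framebuffer / (1024 * 1024)

print(f"{mb:.3f} MB")                     # 28.125 MB
print(mb <= 32)                           # True: fits in 32 MB of eDRAM
print(mb <= 10)                           # False: too big for the 360's 10 MB
```

That's consistent with the 360 needing tiling (or a sub-720p resolution, as with Halo 3) for 4xAA, while 32 MB would hold the whole thing.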



JEMC said:
haxxiy said:
JEMC said:

And didn't AMD completely change the tessellation unit in the HD 5xxx series because the older tessellator wasn't compatible with DX11?

They did; there is the Xenos/R600/R700 type of tessellation and the DX11 type, apparently.

In that case, I hope that one of the optimizations made to the chip was to swap the old tessellator for a new one. Although I'm not sure if being DX11 compatible or not makes a big difference for a console.

The difference in a DX11 GPU is that the DX11 standard has much higher spec requirements. DX11 compute shaders offer a lot more flexibility, and the higher-level language support makes programming easier. The tessellation unit in DX11 also builds on top of the old AMD tessellation unit, which supported displacement mapping; DX11 adds smoothing, patches (letting devs apply tessellation to areas rather than per polygon), and increased performance via the inclusion of fixed-function tessellation.

From what I understood of the things I have read anyway.

There is nothing in the specs that actually suggests it's a DX11 GPU. As has been stated, tessellation was a feature of Xenos, though it was never really used as far as I know. And AMD (ATI at the time) has technically had compute shader capabilities since the R580 GPUs, though they called it Close to Metal (later renamed the Stream SDK); it was never really used either, as devs couldn't justify supporting something that only a small fraction of machines could run.

Given enough work, devs could probably create 90% of the DX11 effects, though they may run a bit slow.
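To illustrate why a fixed-function tessellator matters for performance: the tessellation factor amplifies geometry on the GPU, so the CPU never touches the generated triangles. A toy sketch of the numbers (illustrative only, not how GX2 or DX11 is actually programmed):

```python
# Toy illustration of geometry amplification by tessellation.
# A quad patch tessellated with factor n on each edge produces an
# (n+1) x (n+1) vertex grid, i.e. 2 * n * n triangles.

def triangles_per_quad_patch(factor: int) -> int:
    return 2 * factor * factor

for f in (1, 4, 16, 64):
    print(f, triangles_per_quad_patch(f))
# factor 64 turns a single patch into 8192 triangles generated in hardware
```

Doing that amplification in shader code instead (as you'd have to on older hardware) is exactly the "works, but runs a bit slow" scenario.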



@TheVoxelman on twitter

Check out my hype threads: Cyberpunk, and The Witcher 3!

So this confirms that the Wii U is clearly more powerful than the HD consoles.



zarx said:


There is nothing in the specs that actually suggests it's a DX11 GPU. As has been stated, tessellation was a feature of Xenos, though it was never really used as far as I know. And AMD (ATI at the time) has technically had compute shader capabilities since the R580 GPUs, though they called it Close to Metal (later renamed the Stream SDK); it was never really used either, as devs couldn't justify supporting something that only a small fraction of machines could run.

Given enough work, devs could probably create 90% of the DX11 effects, though they may run a bit slow.

So if the Wii U has the old tessellator unit, devs could do almost the same things as a DX11 tessellator, but it would be slower and harder to program.

Unless the Wii U sells as well as the Wii, Nintendo is screwed.



Please excuse my bad English.

Currently gaming on a PC with an i5-4670k@stock (for now), 16Gb RAM 1600 MHz and a GTX 1070

Steam / Live / NNID : jonxiquet    Add me if you want, but I'm a single player gamer.


Edit: Not sure if this info has been posted as confirmed before, but according to this, if a game is built from the ground up for the Wii U, there will be a noticeable leap in graphics from the last generation.


Of course.

The jump from SD to HD is huge.



Haters gonna hate...



"So if the WiiU has the old tessellator unit, devs could do almost the same things that a DX11 tessellator, but it would be slower and harder to program.

Unless WiiU sells as good as Wii, Nintendo is screwed."


WAIT WAIT
Are you saying Nintendo is screwed because of that? ...pfff... VGChartz never ceases to amaze me...



Those are actually pretty good specs. Well, better than I thought they were going to be.



These specs were posted last month, but are they really confirmed as the dev-kit specs?
Does having a poster confirm it in a forum make it official?



@Twitter | Switch | Steam

You say tomato, I say tomato 

"¡Viva la Ñ!"