
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

drake4 said:
fatslob-:O said:
drake4 said:
fatslob-:O said:
drake4 said:
Scoobes said:
drake4 said:
superchunk said:
drake4 said:
superchunk said:
RE the shaders...

Can someone post a source of actual devs saying it has 160 vs 320? I've had 320 for a very long time due to Chipworks and NeoGaf threads discussion combined with various tech sites.


How do you have 320 shaders when NeoGAF confirmed it was 160 shaders, and in the very same thread the Project CARS dev said it was 192 shaders? 320 shaders has already been ruled out by NeoGAF and other tech sites, unless you're not keeping up with the discussion and want to ignore the facts.

gaf's thread on the GPU has 320 in its OP.

http://www.neogaf.com/forum/showthread.php?t=710765 is the updated version of the Wii U specs thread; the other one was never updated because the OP never got to it, or stopped updating, since it was such a long process to figure out the specs.

Here is the direct quote from the NFSWU/Project CARS dev Martin Griffiths:

Not sure where they got that info from but the WiiU GPU has 192 Shader units, not 160. It also has 32MB of EDRAM, (the same amount as Xbox One) so comparing just the number of shader units against a PC card doesn't give a representative performance comparison. On the CPU side, WiiU also supports multi-threaded rendering that scales perfectly with the number of cores you throw at it, unlike PC DX11 deferred contexts which don't scale very well. The current WiiU build runs around 18-25fps with 5AI with all post (FXAA/Motion blur etc) enabled, which is fairly good given only the fairly cursory optimisation pass that it's had.

Nice find. That actually fits pretty well with a supposed dev we had on here that stated it was between 160-200 shaders.

Even with this confirmation, I bet the OP still won't update his chart so his console of choice looks better. 320 shaders was ruled out a long time ago; the only discussion still going on is 160 vs. 192 shaders.

Even if it had 192 shaders or 160, it really doesn't matter in the end, because the Wii U will end up being weak on all fronts regardless, judging by its meager 12.8 GB/s bandwidth. So even if the Wii U had 320 shaders, Nintendo or any developer for that matter wouldn't be able to utilize its power because of its limited bandwidth, just like how the X1 is starved like hell for it LOL.
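
(For context, the 12.8 GB/s figure follows directly from the Wii U's commonly reported 64-bit DDR3-1600 main-memory interface; the quick sanity check below treats the bus width and transfer rate as reported values rather than official specs.)

    # Peak main-memory bandwidth from bus width and transfer rate
    bus_bytes_per_transfer = 64 // 8      # 64-bit DDR3 interface -> 8 bytes per transfer
    transfers_per_second = 1600e6         # DDR3-1600: 1.6 GT/s
    bandwidth_gb_s = bus_bytes_per_transfer * transfers_per_second / 1e9
    print(bandwidth_gb_s)                 # 12.8 GB/s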

I heard developers can get past the bandwidth problem because of the 32 MB of eDRAM, which would be enough for a 320-shader GPU, but Microsoft has 768 shaders and is relying mainly on the 32 MB of eSRAM, which is just not enough for demanding games at 1080p.

The only reason console manufacturers even use embedded RAM in the first place is to reduce cost and save bandwidth when depth buffers and back buffers are in use, as well as when applying alpha blending operations on the scene. For the most part it is the ROPs that benefit the most from it on the GPU, seeing as how all of these buffers are heavily used by the ROPs. I say that 32 MB of eSRAM is better, considering that it can more elegantly implement software tiled rendering due to its higher bandwidth.
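
(To put the 32 MB figure in perspective, here is a rough render-target footprint estimate at 1080p, assuming an illustrative deferred setup of four 32-bit G-buffer targets plus a 32-bit depth/stencil buffer; actual layouts vary per engine.)

    # Rough render-target footprint at 1920x1080, 4 bytes per pixel per target
    width, height = 1920, 1080
    bytes_per_pixel = 4
    num_targets = 4 + 1                   # 4 G-buffer targets + depth/stencil (illustrative)
    footprint_mb = width * height * bytes_per_pixel * num_targets / (1024 ** 2)
    print(round(footprint_mb, 1))         # ~39.6 MB, i.e. over the 32 MB of embedded memory,
                                          # which is why tiling or spilling to main RAM comes up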

When it's all said and done, even though Nintendo made a weak console, it's more balanced than the Xbox One, which has a mid-range GPU with a huge bottleneck; I mean, devs are even struggling to run current-gen games at 1080p. Nintendo's console was never meant for 1080p gaming, but a 320-shader GPU would have been a nice step up from what we have on 360/PS3, even if the games were to run at 720p; at the very least we would get better framerates and AA solutions.

Actually the X1's solution is more balanced, seeing as how it doesn't have an anemic CPU or memory bandwidth compared to the Wii U. 4A Games cancelled Metro: Last Light on Wii U because they figured out that the CPU could only do 15 GFLOPS LOL.
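
(The ~15 GFLOPS number matches a straightforward peak estimate for the Wii U's Espresso CPU, assuming the commonly cited ~1.24 GHz clock, three cores, and paired-singles FMA at 4 FLOPs per core per cycle; this is a back-of-the-envelope sketch, not a confirmed figure.)

    # Peak FP32 estimate for the Wii U CPU (Espresso), using commonly reported numbers
    cores = 3
    clock_ghz = 1.24                      # reported clock, not an official figure
    flops_per_cycle = 4                   # paired singles (2-wide) * FMA (2 ops)
    peak_gflops = cores * clock_ghz * flops_per_cycle
    print(peak_gflops)                    # ~14.9 GFLOPS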




The most recent information has the PS4 CPU clocked at 1.8 GHz.

Also we know PS4 uses AMD TrueAudio DSP.

So basically there are no XBone advantages in hardware unless you include the HDMI in.



"These are the highest quality pixels that anybody has seen"

I think the RAM allocation on PS4 is wrong. It should read 4.5 GB conventional RAM available for games, plus 1 GB of OS-controlled flexible memory split in two: 512 MB physical and 512 MB paged. So in reality developers get 5 GB, like the Xbone.

http://www.eurogamer.net/articles/digitalfoundry-ps3-system-software-memory
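
(Per that Digital Foundry article, the breakdown works out as below; the guaranteed/flexible split is as reported there.)

    # PS4 game-available memory per the Digital Foundry report
    guaranteed_gb = 4.5                   # direct allocation guaranteed to games
    flex_physical_gb = 0.5                # OS-managed flexible memory, physical
    flex_paged_gb = 0.5                   # OS-managed flexible memory, paged
    print(guaranteed_gb + flex_physical_gb)                   # 5.0 GB physical
    print(guaranteed_gb + flex_physical_gb + flex_paged_gb)   # 5.5 GB total addressable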



drake4 said:

http://www.neogaf.com/forum/showthread.php?t=710765 is the updated version of the Wii U specs thread; the other one was never updated because the OP never got to it, or stopped updating, since it was such a long process to figure out the specs.

Here is the direct quote from the NFSWU/Project CARS dev Martin Griffiths:

Not sure where they got that info from but the WiiU GPU has 192 Shader units, not 160. It also has 32MB of EDRAM, (the same amount as Xbox One) so comparing just the number of shader units against a PC card doesn't give a representative performance comparison. On the CPU side, WiiU also supports multi-threaded rendering that scales perfectly with the number of cores you throw at it, unlike PC DX11 deferred contexts which don't scale very well. The current WiiU build runs around 18-25fps with 5AI with all post (FXAA/Motion blur etc) enabled, which is fairly good given only the fairly cursory optimisation pass that it's had.

OK, read the thread, plus the Shin'en one that sort of contradicts the idea that the hardware was decreased.

In the end, yeah it makes a lot of sense, especially when you know many of the peeps in that thread and their jobs.

It also lends support to the lower range of the FLOPS I have there too. So I'll adjust all of that.

Also,

1) cut the damn quote trees, so annoying.

2) please don't doubt my integrity. Regardless of my fan-favorite, I'd never purposefully lead someone astray. We have Rol for that.



Badassbab said:
I think the RAM allocation on PS4 is wrong. It should read 4.5 GB conventional RAM available for games, plus 1 GB of OS-controlled flexible memory split in two: 512 MB physical and 512 MB paged. So in reality developers get 5 GB, like the Xbone.

http://www.eurogamer.net/articles/digitalfoundry-ps3-system-software-memory

I actually had it correct in the OP but forgot to update the 2nd post to match. ty



Tabular said:

The most recent information has the PS4 CPU clocked at 1.8 GHz.

Also we know PS4 uses AMD TrueAudio DSP.

So basically there are no XBone advantages in hardware unless you include the HDMI in.

Link?



JoeTheBro said:
Tabular said:

The most recent information has the PS4 CPU clocked at 1.8 GHz.

Also we know PS4 uses AMD TrueAudio DSP.

So basically there are no XBone advantages in hardware unless you include the HDMI in.

Link?


http://www.neogaf.com/forum/showthread.php?t=715035



superchunk said:
drake4 said:

http://www.neogaf.com/forum/showthread.php?t=710765 is the updated version of the Wii U specs thread; the other one was never updated because the OP never got to it, or stopped updating, since it was such a long process to figure out the specs.

Here is the direct quote from the NFSWU/Project CARS dev Martin Griffiths:

Not sure where they got that info from but the WiiU GPU has 192 Shader units, not 160. It also has 32MB of EDRAM, (the same amount as Xbox One) so comparing just the number of shader units against a PC card doesn't give a representative performance comparison. On the CPU side, WiiU also supports multi-threaded rendering that scales perfectly with the number of cores you throw at it, unlike PC DX11 deferred contexts which don't scale very well. The current WiiU build runs around 18-25fps with 5AI with all post (FXAA/Motion blur etc) enabled, which is fairly good given only the fairly cursory optimisation pass that it's had.

OK, read the thread, plus the Shin'en one that sort of contradicts the idea that the hardware was decreased.

In the end, yeah it makes a lot of sense, especially when you know many of the peeps in that thread and their jobs.

It also lends support to the lower range of the FLOPS I have there too. So I'll adjust all of that.

Also,

1) cut the damn quote trees, so annoying.

2) please don't doubt my integrity. Regardless of my fan-favorite, I'd never purposefully lead someone astray. We have Rol for that.

That's good to know, 'cause I can't stand those kinds of posters.



drake4 said:
JoeTheBro said:
Tabular said:

The most recent information has the PS4 CPU clocked at 1.8 GHz.

Also we know PS4 uses AMD TrueAudio DSP.

So basically there are no XBone advantages in hardware unless you include the HDMI in.

Link?


http://www.neogaf.com/forum/showthread.php?t=715035


Thank you for the link.



@superchunk

Why is the Wii U's GPU listed in your chart as 176 GFLOPS? Are you denying the factual evidence of the die shot from Chipworks and believing individuals who force the lowest common denominator when it comes to Nintendo? The 160 SPU theory would be correct if it were based on Llano's GPU, yet we can clearly see it is based on Bobcat's GPU from the formation of the SRAM cells, and Chipworks themselves said that the Wii U's GPU is extremely customized, which explains why it has twice the amount of SRAM cells compared to the GPU found in Bobcat APUs.

I won't be surprised if there are individuals trying to deny it and fabricate information to support their agenda, and I can see people easily mistaking the Wii U's GPU for 160 SPUs because of the die shot, since they expect a regular GPU and sometimes compare it to a GPU that is not really relevant for comparison.

We can clearly see that the Wii U's GPU is based on the low-power GPUs found in Bobcat APUs...
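
(For reference, the 176 GFLOPS in the chart follows directly from the shader-count assumption combined with the Latte GPU's commonly reported ~550 MHz clock, at 2 FLOPs per shader per cycle; the same formula gives the figures for the other counts being debated. The clock value is an assumption from public reporting, not an official spec.)

    # GPU peak FP32 throughput: shaders * 2 FLOPs per cycle * clock
    clock_ghz = 0.55                      # commonly reported Latte clock, ~550 MHz
    for shaders in (160, 192, 320):
        print(shaders, "shaders ->", shaders * 2 * clock_ghz, "GFLOPS")
    # 160 -> 176, 192 -> 211.2, 320 -> 352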