
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

z101 said:
DanneSandin said:
Man, I don't understand jack shit of these last few days' conversations!! All I'm interested in is how much better the PS720 will be compared to the PS320 and Wii U :P


According to the known specs:

Wii U = 2-3 PS360.

PS4 = 3-4 PS360 (only if it gets a second GPU; with just the A10 APU, 1-2 PS360).

Xbox720 = 1-4 PS360 (the data is very vague)

What are you basing your numbers on?

I would say (also going by known and/or rumored specs):
Wii U = 1.5-3 PS360 (but this one is also still very vague, as we don't know anything about clock rates yet)
PS4 (A10 rumor without dedicated GPU using an unmodified(!) HD 7660D @ 800MHz*) = 2-2.5 PS360
PS4 (A10 rumor using an additional dedicated GPU at least as powerful as the HD 7660D) = 4-?? PS360
XBox720 = not enough known in my opinion to say anything

And all of that is ignoring that the new consoles are built on more modern hardware!

*the HD 7660D delivers ~600 GFLOPS, while the Xbox 360 GPU manages ~240 GFLOPS. And I still think the HD 7660D can be modified: it has been overclocked to ~1200MHz, which in theory makes the GPU around 50% faster if you provide enough bandwidth, and AMD can do more than just increase the clock rate. The CPU is roughly on par with the PS360 ones, but that strongly depends on how you use it (highly optimized Cell code would probably still crush this CPU, but that kind of optimization isn't really possible in games afaik; in real-world gaming the A10 may well be faster on average).
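
To put rough numbers on that footnote, here's a quick back-of-the-envelope sketch in Python (purely illustrative; the 384 shader ALUs for the HD 7660D and 240 for the Xbox 360's Xenos are the commonly listed counts, and real performance won't scale linearly with clock):

    # Theoretical single-precision throughput: shader ALUs * 2 ops per clock * clock in GHz
    def gflops(shader_alus, clock_ghz):
        return shader_alus * 2 * clock_ghz

    hd7660d_stock = gflops(384, 0.8)   # ~614 GFLOPS at the stock 800MHz
    hd7660d_oc    = gflops(384, 1.2)   # ~922 GFLOPS at the ~1200MHz overclock
    xenos         = gflops(240, 0.5)   # ~240 GFLOPS for the Xbox 360 GPU

    print(hd7660d_stock / xenos)       # ~2.6x the Xbox 360 GPU at stock
    print(hd7660d_oc / hd7660d_stock)  # ~1.5x, i.e. the "around 50% faster" figure above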



drkohler said:
HoloDust said:

I think A10-5700 + 7770 might be reasonable to expect, though they cannot be crossfired (at least according to official specs).

I'm still puzzled by this, because technically nothing speaks against hybrid-crossfiring the A10 with a GCN card. It doesn't work on PCs because there is only one AMD driver. In theory you'd need two drivers for an A10/GCN hybrid, so welcome to "writing-a-driver hell". It probably takes major thinking (The Portland Group (PGI) and AMD are working on the PGI Accelerator compilers, for example).

Yes, I find it strange too, considering that A10 GPUs (which are VLIW4) can be crossfired with VLIW5 hardware - so why not with GCN? I admit I don't have the required knowledge to fully understand all the intricacies of CrossFire, but I would assume they could make those two GPUs work together (now that I think of it, wasn't there a rumour quite some time ago about exactly that, using some different approach to pairing GPUs?)



Sirius87 said:

1. The point that the GPU in the APU is totally limited by the RAM in a desktop environment is obviously true (there are a lot of tests showing how these GPUs profit from higher RAM frequencies, but anything above DDR3-1866 is still too expensive for consumers, so the APUs are probably designed for a maximum of DDR3-1866).

3. Afaik all desktop DDR3 modules have a 64-bit interface, so using them in dual channel brings us to a combined 128 bits (I'm not sure you can really say it that way, but at least from a pure available-bandwidth POV you now have twice as much).

The DDR3 packages themselves usually seem to be capable of up to 32 bits (but I may be wrong here). So if the PS4/PS Orbis uses eight 4Gb (512MB) packages @ 32-bit, that brings us up to a maximum of 256 bits, twice as much as is possible in a typical desktop PC using DDR3 modules in dual channel.

You are mixing up too many things. Firstly, the A10 APU's DRAM clock is limited to 1866MHz. This has nothing to do with whether faster DRAMs are expensive or not; that is just the design limit (the higher the clock in a chip, the more engineering trouble, so there is essentially a point where more clock is just engineering suicide). Secondly, the A10 APU has two 64-bit DDR3 memory controllers (so you can't hook DDR4/GDDR5 DRAM up to it to make it faster). That gives you a 128-bit bus width, no more, no less.
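
As a rough illustration of what those bus widths mean for peak bandwidth (theoretical numbers only; "DDR3-1866" here means 1866 MT/s, and sustained bandwidth is always lower):

    # Peak DRAM bandwidth in GB/s = bus width in bytes * transfer rate in MT/s / 1000
    def peak_bw_gbs(bus_bits, mts):
        return (bus_bits / 8) * mts / 1000

    print(peak_bw_gbs(128, 1866))  # ~29.9 GB/s - the A10's two 64-bit DDR3-1866 channels
    print(peak_bw_gbs(256, 1866))  # ~59.7 GB/s - the speculated 8 x 32-bit package layout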



drkohler said:
Sirius87 said:

1. The point that the GPU in the APU is totally limited by the RAM in a desktop environment is obviously true (there are a lot of tests showing how these GPUs profit from higher RAM frequencies, but anything above DDR3-1866 is still too expensive for consumers, so the APUs are probably designed for a maximum of DDR3-1866).

3. Afaik all desktop DDR3 modules have a 64-bit interface, so using them in dual channel brings us to a combined 128 bits (I'm not sure you can really say it that way, but at least from a pure available-bandwidth POV you now have twice as much).

The DDR3 packages themselves usually seem to be capable of up to 32 bits (but I may be wrong here). So if the PS4/PS Orbis uses eight 4Gb (512MB) packages @ 32-bit, that brings us up to a maximum of 256 bits, twice as much as is possible in a typical desktop PC using DDR3 modules in dual channel.

You are mixing up too many things. Firstly, the A10 APU's DRAM clock is limited to 1866MHz. This has nothing to do with whether faster DRAMs are expensive or not; that is just the design limit (the higher the clock in a chip, the more engineering trouble, so there is essentially a point where more clock is just engineering suicide). Secondly, the A10 APU has two 64-bit DDR3 memory controllers (so you can't hook DDR4/GDDR5 DRAM up to it to make it faster). That gives you a 128-bit bus width, no more, no less.


But can't AMD modify the memory controller? Without doing that, of course, no higher bandwidth is possible. But since GPUs use all kinds of different bus widths, I thought changing some parts would also be possible in the APU.

And 1866 is only the maximum supported RAM frequency. Tests with 2133 have been done on some motherboards that allow it, and the FPS in games profited from it (but only by a small amount when the GPU in the APU wasn't also overclocked at the same time).
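
For what it's worth, the theoretical gain from that frequency bump is fairly small on the A10's fixed 128-bit bus, which fits the modest FPS differences in those tests (same peak-bandwidth estimate as above, illustrative only):

    # Peak bandwidth on a 128-bit bus at the two DDR3 speeds
    def peak_bw_gbs(bus_bits, mts):
        return (bus_bits / 8) * mts / 1000

    print(peak_bw_gbs(128, 1866))                           # ~29.9 GB/s at DDR3-1866
    print(peak_bw_gbs(128, 2133))                           # ~34.1 GB/s at DDR3-2133
    print(peak_bw_gbs(128, 2133) / peak_bw_gbs(128, 1866))  # ~1.14x, roughly 14% more peak bandwidth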



Sirius87 said:
z101 said:
DanneSandin said:
Man, I don't understand jack shit of these last few days' conversations!! All I'm interested in is how much better the PS720 will be compared to the PS320 and Wii U :P


According to the known specs:

Wii U = 2-3 PS360.

PS4 = 3-4 PS360 (only if it gets a second GPU; with just the A10 APU, 1-2 PS360).

Xbox720 = 1-4 PS360 (the data is very vague)

What are you basing your numbers on?

I would say (also going by known and/or rumored specs):
Wii U = 1.5-3 PS360 (but this one is also still very vague, as we don't know anything about clock rates yet)
PS4 (A10 rumor without dedicated GPU using an unmodified(!) HD 7660D @ 800MHz*) = 2-2.5 PS360
PS4 (A10 rumor using an additional dedicated GPU at least as powerful as the HD 7660D) = 4-?? PS360
XBox720 = not enough known in my opinion to say anything

You are leaving out a few key facts here. For example, the Wii U draws an average of 40W according to Nintendo. That translates into a CPU clock rate in the 2.2-2.4GHz region. There are other hints, like the fact that some (launch) software is NOT running at 1080p, indicating some CPU starvation. So CPU-wise, the performance of the Wii U would only be around 60-80% of a PS3/Xbox 360. However, the heavily modified GPU (I don't think there is any similarity left to the standard ATI 4xxx GPU) makes up for that, so Wii U performance (whichever way you define "performance") is around 130% of the PS3/Xbox 360, as I explained in a previous post. Also, my previous estimates for the PS4/XBoxNext are still reasonable based on what we know at this time (and there are enough hints for the XBoxNext). Simply forget that "4-? times more powerful" claim (again, the meaning of "power" should actually be defined).



drkohler said:
Sirius87 said:
z101 said:
DanneSandin said:
Man, I don't understand jack shit of these last few days' conversations!! All I'm interested in is how much better the PS720 will be compared to the PS320 and Wii U :P


According to the known specs:

Wii U = 2-3 PS360.

PS4 = 3-4 PS360 (only if it gets a second GPU; with just the A10 APU, 1-2 PS360).

Xbox720 = 1-4 PS360 (the data is very vague)

What are you basing your numbers on?

I would say (also going by known and/or rumored specs):
Wii U = 1.5-3 PS360 (but this one is also still very vague, as we don't know anything about clock rates yet)
PS4 (A10 rumor without dedicated GPU using an unmodified(!) HD 7660D @ 800MHz*) = 2-2.5 PS360
PS4 (A10 rumor using an additional dedicated GPU at least as powerful as the HD 7660D) = 4-?? PS360
XBox720 = not enough known in my opinion to say anything

You are leaving out a few key facts here. For example, the Wii U draws an average of 40W according to Nintendo. That translates into a CPU clock rate in the 2.2-2.4GHz region. There are other hints, like the fact that some (launch) software is NOT running at 1080p, indicating some CPU starvation. So CPU-wise, the performance of the Wii U would only be around 60-80% of a PS3/Xbox 360. However, the heavily modified GPU (I don't think there is any similarity left to the standard ATI 4xxx GPU) makes up for that, so Wii U performance (whichever way you define "performance") is around 130% of the PS3/Xbox 360, as I explained in a previous post. Also, my previous estimates for the PS4/XBoxNext are still reasonable based on what we know at this time (and there are enough hints for the XBoxNext). Simply forget that "4-? times more powerful" claim (again, the meaning of "power" should actually be defined).


I'm just using maximum FLOPS as the measurement of GPU power (ignoring texel/pixel fillrate) and for the most part ignored the CPUs. The only changes I made because of the CPUs: I set the lower boundary for the PS4 without a dedicated GPU to 2 instead of 2.5 (edit: and for the PS4 with a dedicated GPU to 4 instead of 5) and reduced the minimum for the Wii U to 1.5 (instead of the 2 I would have assigned otherwise, because I assume the Wii U CPU needs some GPGPU help to be on par with the PS360 CPUs).
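
In other words, those multipliers just come from dividing peak GFLOPS by the Xbox 360 GPU's ~240 GFLOPS; a tiny sketch (illustrative only, using the commonly cited ~614 GFLOPS for the HD 7660D):

    # "x PS360" multipliers from peak GPU GFLOPS alone
    xenos_gflops   = 240.0
    hd7660d_gflops = 614.4   # A10's integrated GPU at 800MHz

    print(hd7660d_gflops / xenos_gflops)      # ~2.6x - APU alone (rounded down to 2-2.5 for the weaker CPU)
    print(2 * hd7660d_gflops / xenos_gflops)  # ~5.1x - APU plus an equally strong dedicated GPU (the "4-??" case)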



Some comparisons (I left out memory bandwidth, too much unknown there):

Xbox360:
Pixel: 4 GP/s
Texel: 8 GT/s
GFLOPS: 240

WiiU (based on e6760 rumour):
Pixel: 4.8 GP/s
Texel: 14.4 GT/s
GFLOPS: 576

WiiU (RV740 speculation @400MHz and @600MHz (Mobility 4830 equivalents)):
Pixel: 6.4 GP/s ; 9.6 GP/s
Texel: 12.8 GT/s ; 19.2 GT/s
GFLOPS: 512 ; 768

A10-5700 (7660D@760MHz):
Pixel: 6.1 GP/s
Texel: 18.2 GT/s
GFLOPS: 583.7

A10+7770 (@1GHz):
Pixel: 22.1 GP/s
Texel: 58.2 GT/s
GFLOPS: 1863.7

Just as a reference - 7970 GHz Edition (with boost):
Pixel: 33.6 GP/s
Texel: 134.4 GT/s
GFLOPS: 4300
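
Those figures all follow from unit counts times clock; here's a small sketch that reproduces a few of the rows (the ROP/TMU/shader counts are the commonly listed ones for these parts, and the naive sum for the hybrid pairing is of course a best case):

    # Theoretical rates from unit counts and clock (GHz):
    #   pixel fillrate = ROPs * clock, texel fillrate = TMUs * clock, GFLOPS = shader ALUs * 2 * clock
    def rates(rops, tmus, shaders, clock_ghz):
        return rops * clock_ghz, tmus * clock_ghz, shaders * 2 * clock_ghz

    a10_7660d = rates(8, 24, 384, 0.76)   # ~6.1 GP/s, ~18.2 GT/s, ~584 GFLOPS
    hd7770    = rates(16, 40, 640, 1.0)   # 16 GP/s, 40 GT/s, 1280 GFLOPS

    # Naive sum for the A10 + 7770 row: ~22.1 GP/s, ~58.2 GT/s, ~1864 GFLOPS
    combined = tuple(a + b for a, b in zip(a10_7660d, hd7770))
    print(a10_7660d, hd7770, combined)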



Sirius87 said:


But can't AMD modify the memory controller? Without doing that of course there won't be a higher bandwidth possible. But as GPUs all the time use different bus width I thought that changing some parts will also be possible in the APU.

Of course AMD can change things. That makes the chips more complex and bigger (translation: a lot more expensive). The point, however, is not to design a console with world-record memory throughput; the point is to get to 1080p/60Hz as cheaply as possible.

There is a point, though, where your cache controller outfoxes your memory bandwidth (basically meaning all the data the CPU requires is already in the cache, no matter how widely your memory controller shuffles data). On the PC, that bus size is apparently 128-bit. Then you can play with more channels... but it gets more expensive. With an APU, things get even more complex, as the bus width is doubled inside the chip for the GPU part. (Still, an APU was not a good idea for a console that strives for high graphics performance.)



DanneSandin said:
Man, I don't understand jack shit of these last few days' conversations!! All I'm interested in is how much better the PS720 will be compared to the PS320 and Wii U :P

My prediction is 6-9x PS360

That is 2-3x Wii U



drkohler said:

You are leaving out a few key facts here. For example, the Wii U draws an average of 40W according to Nintendo. That translates into a CPU clock rate in the 2.2-2.4GHz region. There are other hints, like the fact that some (launch) software is NOT running at 1080p, indicating some CPU starvation. So CPU-wise, the performance of the Wii U would only be around 60-80% of a PS3/Xbox 360. However, the heavily modified GPU (I don't think there is any similarity left to the standard ATI 4xxx GPU) makes up for that, so Wii U performance (whichever way you define "performance") is around 130% of the PS3/Xbox 360, as I explained in a previous post. Also, my previous estimates for the PS4/XBoxNext are still reasonable based on what we know at this time (and there are enough hints for the XBoxNext). Simply forget that "4-? times more powerful" claim (again, the meaning of "power" should actually be defined).

Are you using just clock rates as the means to surmise performance against the Cell and Xenon?



The rEVOLution is not being televised