
Wii U vs PS4 vs Xbox One FULL SPECS (January 24, 2014)

Zappykins said:

Did anybody else notice that in the Architect Meeting, one of the Microsoft guys said that each CPU core will execute six instructions per clock cycle? Sony's paperwork shows one instruction per cycle per core.

Is this a misunderstanding? How are they doing that? And if so, doesn’t it make the Xbox One’s CPU significantly more powerful? 

Additional Info: "Some performance numbers were given for the CPU and GPU themselves but these cast more shadow than they do light. Microsoft claimed that each CPU core can perform six operations per cycle. The CPU is believed to be using AMD's Jaguar core, but typically this would only be described as able to handle four operations per cycle; two each of integer and floating point (though even here counting operations is complicated; the floating point operations could use vector instructions such as SSE2, in which case one operation would result in four actual computations, potentially giving eight per cycle for floating point alone)."

http://arstechnica.com/gaming/2013/05/microsoft-talks-about-xbox-ones-internals-while-disclosing-nothing/

I don't know, but each Jaguar core can perform 4 operations per cycle... that's how the PS4 CPU works.
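
For what it's worth, here's a rough back-of-the-envelope sketch (in Python) of how those "operations per cycle" figures in the Ars piece get counted; the pipe counts and the 128-bit SSE width are the commonly reported Jaguar numbers, not anything Microsoft or Sony have confirmed:

# Rough operation counting for one AMD Jaguar core, per clock cycle.
# Pipe counts and vector width are the commonly reported Jaguar figures
# (assumptions, not official Microsoft/Sony specs).

int_ops_per_cycle = 2      # two integer pipes
fp_ops_per_cycle = 2       # two floating-point pipes

ops_per_cycle = int_ops_per_cycle + fp_ops_per_cycle
print("scalar ops/cycle:", ops_per_cycle)  # 4 - the usual Jaguar figure

# With 128-bit SSE vectors, each FP "operation" works on four 32-bit floats,
# so counting individual computations instead of instructions:
sse_lanes = 4              # 128-bit register / 32-bit floats
fp_computations_per_cycle = fp_ops_per_cycle * sse_lanes
print("vector FP computations/cycle:", fp_computations_per_cycle)  # 8

So 4 is the usual per-core figure; where Microsoft's 6 comes from isn't clear from the Ars piece.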






ethomaz said:
Zappykins said:

Did anybody else notice that in the Architect Meeting, one of the Microsoft guys said that each CPU core will execute six instructions per clock cycle? Sony's paperwork shows one instruction per cycle per core.

Is this a misunderstanding? How are they doing that? And if so, doesn’t it make the Xbox One’s CPU significantly more powerful? 

Additional Info: "Some performance numbers were given for the CPU and GPU themselves but these cast more shadow than they do light. Microsoft claimed that each CPU core can perform six operations per cycle. The CPU is believed to be using AMD's Jaguar core, but typically this would only be described as able to handle four operations per cycle; two each of integer and floating point (though even here counting operations is complicated; the floating point operations could use vector instructions such as SSE2, in which case one operation would result in four actual computations, potentially giving eight per cycle for floating point alone)."

http://arstechnica.com/gaming/2013/05/microsoft-talks-about-xbox-ones-internals-while-disclosing-nothing/

I don't know, but each Jaguar core can perform 4 operations per cycle... that's how the PS4 CPU works.

Hmm, do you have a link for that? I was surprised when the original stuff said one per cycle per core. Even the good ol' Xbox 360 does two per cycle per core. And that's really going back to essentially 2004 tech.

That would still give the Xbox One a two-instruction-per-cycle-per-core advantage, more or less depending on the clock speed it runs at. So that kind of negates the GPU differences, or even tilts the power balance in the X1's favor.

Sigh, I should just make popcorn and watch the fans fight it out.



 

Really not sure I see any point in consoles over PCs since Kinect, Wii and other alternative ways to play have been abandoned. 

Top 50 'most fun' game list coming soon!

 

Tell me a funny joke!

Zappykins said:

Hmm, do you have a link for that? I was surprised when the original stuff said one per cycle per core. Even the good ol' Xbox 360 does two per cycle per core. And that's really going back to essentially 2004 tech.

That would still give the Xbox One a two-instruction-per-cycle-per-core advantage, more or less depending on the clock speed it runs at. So that kind of negates the GPU differences, or even tilts the power balance in the X1's favor.

Sigh, I should just make popcorn and watch the fans fight it out.

Rumors have always maintained one per cycle, which is why I've had that in the OP for a while now. Then Sony themselves confirmed it.

I'm thinking this MS guy made a mistake or wasn't referring to the same thing we're talking about.

I'm waiting for more clarification on that as well as the nm sizing before I change the OP.



Conegamer said:
So Wii U is about 1/4 to 1/5th the strength of the PS4?

I can live with that.


Unfortunately not - if the Wii U's GPU is indeed a 320-shader part, that makes it either a Redwood LE (5550) or an RV730 (4650) shrunk to 40nm, while the PS4 has a 7850+ equivalent... of course, neither the Wii U's GPU nor the PS4's is a completely off-the-shelf part, so this is just an approximation based on what we currently know.

Anyway, in both cases what the PS4 packs inside is at least some 8x what's in the Wii U (oh, and don't compare FLOPS when trying to figure out how different architectures perform - use real-world tests or test aggregates, or in this case the best approximations: a 7850 on the low end for the PS4 and a 4650 or 5550 DDR2 for the Wii U).
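
To put rough numbers on that FLOPS caveat, here's a quick sketch using the commonly rumoured shader counts and clocks (320 shaders at 550 MHz for the Wii U and 1152 shaders at 800 MHz for the PS4 are assumptions based on the figures floating around, not confirmed specs):

# Theoretical single-precision GFLOPS = shaders * 2 FLOPs (multiply-add) * clock in GHz.
# Shader counts and clocks below are rumoured figures, i.e. assumptions.

def gpu_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

wiiu = gpu_gflops(320, 0.55)    # ~352 GFLOPS, if the 320-shader rumour holds
ps4 = gpu_gflops(1152, 0.80)    # ~1843 GFLOPS (18 CUs x 64 shaders)

print(wiiu, ps4, round(ps4 / wiiu, 1))  # ratio is only ~5x on paper

On paper that's roughly a 5x gap, which is exactly why raw FLOPS understate things here - presumably bandwidth, ROPs and the newer architecture account for the rest of the real-world difference.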



Zappykins said:

Hmm, do you have a link for that? I was surprised when the original stuff said one per cycle per core. Even the good ol' Xbox 360 does two per cycle per core. And that's really going back to essentially 2004 tech.

That would still give the Xbox One a two-instruction-per-cycle-per-core advantage, more or less depending on the clock speed it runs at. So that kind of negates the GPU differences, or even tilts the power balance in the X1's favor.

Sigh, I should just make popcorn and watch the fans fight it out.

Edit - I made a mistake... the Jaguar core can decode 4 instructions every 2 clock cycles... so 2 instructions per clock cycle per core.

"The decoders can handle four instructions per clock cycle. Instructions that belong to different cores cannot be decoded in the same clock cycle. When both cores are active, the decoders serve each core every second clock cycle, so that the maximum decode rate is two instructions per clock cycle per core."



HoloDust said:
Conegamer said:
So Wii U is about 1/4 to 1/5th the strength of the PS4?

I can live with that.


Unfortunately not - if the Wii U's GPU is indeed a 320-shader part, that makes it either a Redwood LE (5550) or an RV730 (4650) shrunk to 40nm, while the PS4 has a 7850+ equivalent... of course, neither the Wii U's GPU nor the PS4's is a completely off-the-shelf part, so this is just an approximation based on what we currently know.

Anyway, in both cases what the PS4 packs inside is at least some 8x what's in the Wii U (oh, and don't compare FLOPS when trying to figure out how different architectures perform - use real-world tests or test aggregates, or in this case the best approximations: a 7850 on the low end for the PS4 and a 4650 or 5550 DDR2 for the Wii U).


The PS4's is relatively normal (off the shelf); the Wii U's is completely custom, and by completely, I mean 100%:

http://www.neogaf.com/forum/showthread.php?t=511628



Updated OP with little tidbits like:

XOne controller still uses batteries (AA, not built in)
XOne has 3 USB ports
XOne is HDMI only, the others have HDMI/composite (guessing for Sony)
XOne additional links, etc.



orniletter said:
HoloDust said:

Unfortunately not - if the Wii U's GPU is indeed a 320-shader part, that makes it either a Redwood LE (5550) or an RV730 (4650) shrunk to 40nm, while the PS4 has a 7850+ equivalent... of course, neither the Wii U's GPU nor the PS4's is a completely off-the-shelf part, so this is just an approximation based on what we currently know.

Anyway, in both cases what the PS4 packs inside is at least some 8x what's in the Wii U (oh, and don't compare FLOPS when trying to figure out how different architectures perform - use real-world tests or test aggregates, or in this case the best approximations: a 7850 on the low end for the PS4 and a 4650 or 5550 DDR2 for the Wii U).


The PS4's is relatively normal (off the shelf); the Wii U's is completely custom, and by completely, I mean 100%:

http://www.neogaf.com/forum/showthread.php?t=511628

The PS4's GPU is pretty much an enhanced version of what you can find off the shelf:

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=2



HoloDust said:
orniletter said:
HoloDust said:

Unfortunately not - if the Wii U's GPU is indeed a 320-shader part, that makes it either a Redwood LE (5550) or an RV730 (4650) shrunk to 40nm, while the PS4 has a 7850+ equivalent... of course, neither the Wii U's GPU nor the PS4's is a completely off-the-shelf part, so this is just an approximation based on what we currently know.

Anyway, in both cases what the PS4 packs inside is at least some 8x what's in the Wii U (oh, and don't compare FLOPS when trying to figure out how different architectures perform - use real-world tests or test aggregates, or in this case the best approximations: a 7850 on the low end for the PS4 and a 4650 or 5550 DDR2 for the Wii U).


The PS4's is relatively normal (off the shelf); the Wii U's is completely custom, and by completely, I mean 100%:

http://www.neogaf.com/forum/showthread.php?t=511628

The PS4's GPU is pretty much an enhanced version of what you can find off the shelf:

http://www.gamasutra.com/view/feature/191007/inside_the_playstation_4_with_mark_.php?page=2

You are right, but I wanted to emphasize how custom the Wii U's GPU is.