
Penello specifies why the X1 vs PS4 performance difference is overstated

Adinnieken said:
Ashadian said:

"The CPU contained within the PS4 is running stock speeds, 1.6ghz compared to Microsofts 1.75 after an upclock - 150mhz does not make up for the different in number of CUs, and (aptly ignored by Albert) the difference in ROP units - hence why even with the 150mhz upclock, the Xbox One is still way behind in terms of raw power - To see just how little difference 150mhz makes, go into your bios and bump your clock speed up to be 150mhz faster, you will see framerates climb barely 2fps at best - most modern CPUs clock to at least 400mhz higher than stock as an average, with some people bumping that to over 900mhz.

But seriously, before you try and tell me I'm wrong: he is comparing 1.6 to 1.75 and claiming it's 10% faster. Aside from being factually off (10% would be 1.76GHz), it also compares the CPUs on the most basic of levels, which is a stupid thing to do, even more so given that they're APUs. Compare a 2GHz Celeron to a 2GHz Pentium to see why.

Last but not least, adding up the DDR3 and eSRAM speeds to get a higher number is widely seen as a nonsense PR stunt by anyone with half a brain, because theoretical peaks for different RAM types DO NOT COMBINE. Just as putting two 3GHz Xeons in a server does NOT mean the server is now running at 6GHz (it's running at 3GHz with additional cores), adding the DDR3 and the tiny space allotted to the eSRAM does not work, even for "on paper" results. You cannot fill the 8GB of DDR3 from the eSRAM fast enough without the eSRAM being bottlenecked, and the eSRAM cannot be used for the large majority of game resources, whereas GDDR5 can be used for just about anything at the cost of slightly higher latency."

Actually, he doesn't ignore it.  He addressed it.  Each CU gets a 6% speed increase, not simply the GPU as a whole with the frequency increase.  So while there are fewer CUs, those CUs operate faster than the PS4's.  When 12 CUs are each operating at 6% greater speed than the PS4's, that equates to a 72% speed improvement.

Early in that thread (not in that post) he says near 10%.  He has also stated greater than 9% before.  If we want to get specific, it's 9.375%.  No, he also talks on a deeper level, about how the Xbox One's CPU has a 30GB/s transfer rate between the CPU and memory, a 50% improvement over the PS4.
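
As a sanity check on that figure, here's a minimal sketch of the arithmetic (the 1.6GHz PS4 CPU clock is the commonly assumed figure, not an officially confirmed one):

```python
# Hypothetical check of the CPU clock-speed gap (the 1.6 GHz PS4 clock is assumed, not confirmed).
ps4_cpu_ghz = 1.6
xb1_cpu_ghz = 1.75

gap = (xb1_cpu_ghz - ps4_cpu_ghz) / ps4_cpu_ghz
print(f"Xbox One CPU clock advantage: {gap:.3%}")  # -> 9.375%
```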

You missed the part where he also specified that the eSRAM's speed is 204GB/s.  That still happens to be about 16% faster than the GDDR5 memory in the PS4.  Don't assume my argument is that 32MB is the same as 8GB; rather, I wonder what memory speed is necessary for a modern GPU to adequately process a 1920x1080 image without bottlenecking or taking a performance hit.  The eSRAM is used both as a buffer for the DDR memory and as a cache to temporarily hold data that's needed quickly.
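
To put some rough numbers on the 1080p question, here's a back-of-the-envelope sketch; the render-target formats and counts are illustrative assumptions, not any particular engine's setup:

```python
# Rough sketch: how much of the 32 MB eSRAM a few common 1920x1080 render targets would use.
# Formats and target counts are illustrative assumptions, not any particular engine's setup.
WIDTH, HEIGHT = 1920, 1080
MB = 1024 * 1024

def target_mb(bytes_per_pixel):
    return WIDTH * HEIGHT * bytes_per_pixel / MB

colour  = target_mb(4)   # RGBA8 colour buffer           ~7.9 MB
depth   = target_mb(4)   # 32-bit depth/stencil          ~7.9 MB
gbuffer = target_mb(8)   # a fatter 64-bit G-buffer      ~15.8 MB

print(f"colour {colour:.1f} MB, depth {depth:.1f} MB, fat G-buffer {gbuffer:.1f} MB")
print(f"total {colour + depth + gbuffer:.1f} MB of the 32 MB eSRAM")
print(f"eSRAM vs GDDR5 peak: {204 / 176 - 1:.1%} faster")  # ~15.9%
```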

First of all, if he were talking about a 50% increase in CPU cores, then he would be correct. However, GPUs are a highly parallel environment: CPUs are generally regarded as the serial processing unit and GPUs as the parallel processing unit.

In the case of 12 CUs and 768 shaders vs 18 CUs and 1152 shaders on a GPU, more is always better. While I admit that in itself will not create a 50% power advantage, it is a considerable advantage.

He also neglects to mention another important point: the PS4 GPU has 32 ROPs and 72 texture units vs the Xbox One GPU's 16 ROPs and 48 texture units. This in itself is another considerable advantage, and one that will help the PS4 age much better.

The Xbox One GPU has an 853MHz vs 800MHz clock-speed advantage over the PS4 GPU. I'm going for the PS4, but any bump in tech specs on any system is always a plus for the industry. However, put 12 compute units @ 853MHz against 18 compute units @ 800MHz on a GPU and the latter still wins out easily.

MS/Xbox continue to add the bandwidth of separate pipes together in their PR statements to try and fool the uninformed, when in reality it does not work like that. It is NOT 272GB/s!

That 204GB/s peak theoretical read/write figure is limited to a 32MB chunk of low-latency eSRAM, while the main DDR3 2133MHz RAM is a unified pool limited to 68GB/s of peak theoretical bandwidth. Around 20GB/s of that is inherently available to the CPU. The Xbox advantage here is a 30GB/s CPU-to-GPU HSA link.

On the PS4 you have a single, simple unified pool of GDDR5 5500MHz RAM which acts as a unified address space with 176GB/s of peak theoretical bandwidth across the board. Again, around 20GB/s of that is inherently available to the CPU. The PS4 has a 20GB/s CPU-to-GPU HSA link.
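
For anyone who wants to check those peak figures, they fall straight out of data rate times bus width; a small sketch (the bus widths and data rates are the commonly reported specs, taken here as assumptions):

```python
# Peak theoretical bandwidth = effective data rate (MT/s) * bus width (bits) / 8, in bytes.
# Bus widths and data rates are the commonly reported figures, taken here as assumptions.
def peak_gb_s(mt_per_s, bus_bits):
    return mt_per_s * 1e6 * bus_bits / 8 / 1e9

ddr3  = peak_gb_s(2133, 256)   # Xbox One main memory -> ~68 GB/s
gddr5 = peak_gb_s(5500, 256)   # PS4 unified memory   -> ~176 GB/s

print(f"DDR3 pool:  {ddr3:.1f} GB/s")
print(f"GDDR5 pool: {gddr5:.1f} GB/s")
# The 204 GB/s eSRAM figure applies only to the 32 MB on-die block; adding it to the
# 68 GB/s DDR3 number does not describe any single pool the GPU can actually stream from.
```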

...and before anyone brings up latency advantages: in a traditional PC setting, DDR3 has a clear edge and GDDR5 suffers (though the gap isn't as pronounced in the first place as some would say). However, hUMA/HSA implement a special memory controller configuration which neutralises much of this issue, and even without that, the better choice and compromise in a GPU-centric gaming console would be GDDR5.

So... the Xbox advantages? A CPU with a marginally higher clock (though the PS4 CPU clock is yet to be confirmed) and a notable amount of additional bandwidth available directly to the GPU. It's likely the SHAPE audio chip is stronger than the audio chip in the PS4, so there will be a slight easing of CPU load vs the PS4. A GPU with a 53MHz clock advantage.

And... the PS4 advantages? A GPU with roughly 40% additional computational power, a 50% increase in compute units/shaders, a 50% increase in texture units, a 100% increase in fillrate and a 700% increase in compute queue granularity. A simpler, faster pool of GDDR5 RAM that acts as a unified address space.
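
A quick sketch of where those percentages come from (the 64 vs 8 compute-queue counts are the commonly cited figures, assumed here):

```python
# Sketch of the percentage deltas quoted above, from the published unit counts.
# The 64 vs 8 compute-queue figures are the commonly cited numbers, assumed here.
def pct_more(ps4, xb1):
    return (ps4 - xb1) / xb1

specs = {
    "compute (TFLOPS)": (18 * 800e6 * 128 / 1e12, 12 * 853e6 * 128 / 1e12),  # ~1.84 vs ~1.31
    "CUs / shaders":    (18, 12),
    "texture units":    (72, 48),
    "ROPs (fillrate)":  (32, 16),
    "compute queues":   (64, 8),
}
for name, (ps4, xb1) in specs.items():
    print(f"{name}: PS4 has {pct_more(ps4, xb1):.0%} more")
```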

Sony also have the stronger tools this time around and have had a better policy from the outset. They've put together a console with more raw power, likely with a chip die that is ironically smaller and subsequently cooler, in a smaller, more iconic, cheaper box with an internal PSU, and it is selling in more countries earlier.

 

Multiplatform games will, for the most part, be developed to the lowest common denominator, and the fact is that overall that is the Xbox One. So we're only likely to see slightly stronger image quality and performance on PS4 for most multiplats.

The first-party PS4 titles, though, will pull away considerably by the second and third generation of titles, and the lack of fillrate/texture units will age the Xbox One more by then.





Adinnieken said:
petalpusher said:


That's not how the math works.

 

(1 CU x 853 MHz x 64 x 2 ops per clock) x 12

is still the same as

12 CU x 853 MHz x 64 x 2 ops per clock

= 1,310,208 MFLOPS (about 1.31 TFLOPS)

 

A 72% increase would mean this GPU running at 1376 MHz (and it would probably burn out quickly).

No.  A GPU is parallel, therefore CUs can run independently of each other.  I can have one CU running a job each clock cycle, and another running one job over 853 clock cycles.  Because each of those CUs will reduce the time needed to run an operation, the effect of those savings becomes cumulative.  For example, @ 800MHz I wouldn't be able to run 1000 operations, only 943, whereas @ 853MHz I could.  This means it's capable of running 57 more operations.

PS4 1 CU @ 800MHz @ 943 operations per clock cycle
XB1 1 CU @ 853MHz @ 1000 operations per clock cycle

So, at the end of one minute.

PS4 1CU @ 800MHz @ 943 operations per clock cycle = 754,400 operations per minute. 
XB1 1CU @ 853MHz @ 1000 operations per clock cycle = 853,000 operations per minute.

98,600 operations more.  Or 105 more clock cycles to do the same amount of work.  That's a 12% difference in performance right there.  Not to mention, that's one CU.

 

Please, god, you calculate things per minute for a GPU? LOL

Do you even realize how many frames a GPU has to render in one minute? That would be 1,800 frames for a 30fps game; that's about 100 million operations for the GPU. It's not how you calculate ops anyway.

Let's go back to basics and simple things...

1 CU @ 853 MHz does 64 x 2 x 853 million floating-point operations per second. That's 109,184 MFLOP/s (about 109 GFLOP/s) for each CU. Multiply that by 12 CUs and you get your 1.31 TFLOPS, so this is the right calculation.
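
A minimal sketch of that same calculation for both consoles (64 lanes per CU and 2 FLOPs per lane per clock are the standard GCN figures):

```python
# Theoretical peak: CUs * clock (Hz) * 64 shader lanes * 2 FLOPs per lane per clock.
def tflops(cus, mhz):
    return cus * mhz * 1e6 * 64 * 2 / 1e12

print(f"Xbox One GPU: {tflops(12, 853):.2f} TFLOPS")  # ~1.31
print(f"PS4 GPU:      {tflops(18, 800):.2f} TFLOPS")  # ~1.84
```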

For fun, if you want it per minute, that's 6,551,040 MFLOPs per CU (yes, over 6.5 million). So your math is wrong to begin with.

And WE DON'T CARE if each CU is marginally faster by 6%; GPUs are highly parallel beasts, and that's why we put in more CUs to get more powerful cards.

Would you say an HD 7770 is as fast as or faster than an HD 7950 just because the 7770's CUs are clocked at 1000 MHz and the 7950's are clocked at 800 MHz on reference cards?

NO, an HD 7950 blows away an HD 7770. And guess what the main reason for that is? It has more CUs... it also has far more REAL bandwidth with its GDDR5 (like the PS4 does) and twice the number of ROPs (like the PS4 does).
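
Running the same arithmetic on those two cards makes the point (reference-card figures, treated here as assumptions):

```python
# Reference-card figures (assumed): HD 7770 = 10 CUs @ 1000 MHz, 72 GB/s;
# HD 7950 = 28 CUs @ 800 MHz, 240 GB/s.
def tflops(cus, mhz):
    return cus * mhz * 1e6 * 64 * 2 / 1e12

print(f"HD 7770: {tflops(10, 1000):.2f} TFLOPS, 72 GB/s")   # ~1.28 TFLOPS despite the higher clock
print(f"HD 7950: {tflops(28, 800):.2f} TFLOPS, 240 GB/s")   # ~2.87 TFLOPS
```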

 

See that chart? The HD 7770 is all the way down in performance, and its CUs on their own are WAY faster than the HD 7950's CUs.

 

It's not even a 53 MHz difference; it's a 200 MHz difference in the 7770's favor, and it still gets destroyed in real-world performance (yes, not in GFLOPS).



I've seen some outlandish claims from both sides of the fence, but where Penello seems to be receiving a relentless pile-on, the Sony camp gets a pass for claims such as "supercharged PC" and for throwing around paper numbers for what is a wheezy architecture even by today's standards.



Legend11 said:

It has a lot more features and they really are different beasts in that regard.  For example if you took a PS4 out of the box and hooked it up and then said "PS4 on" what would happen?  What about if you wanted to talk to someone on your friends list, would saying "PS4 skype Iluvyermom" to your PS4 open a window and contact your friend?  What about if you wanted to exercise with a game would a PS4 version be able to tell you which muscles are currently being stressed or tell what your heart rate currently is by looking at you?  There are many things the Xbox One can do that the PS4 simply can't so it's not as simple as saying one is better because neither one is better at everything.

"...skype Iluvyermom..."

You, sir, are hilarious.  I wonder if that is really a Skype name.



 

Really not sure I see any point of consoles over PCs since Kinect, Wii and other alternative ways to play have been abandoned. 

Top 50 'most fun' game list coming soon!

 

Tell me a funny joke!

Microsoft employee specifies why the fact that Microsoft's product is underpowered is irrelevant



Madword said:

I pulled the free country bit because I made my point, and you get all up in my face, upset about it... hey, just because I think MS have screwed up royally and continue to do so... just chill out.

I guess we will have to wait until the cloud has proven itself to see who is right, but as of yet even MS have been unable to prove this tech works in games (please don't tell me Drivatars are a good example of this tech, otherwise you will have lost all credibility). It may work for crunching account data on a server and sending it client-side to display (which doesn't need to happen in milliseconds), but games don't work like that. I look forward to the day when someone can say the cloud has improved the performance of the console.

It's taken 6-7 years for companies to reach the limits of the current-gen hardware (and they could still push it further if they had the patience). The gains they can make by improving their code locally on the console far outweigh whatever survives the internet bandwidth issues and the processing that happens on a server farm somewhere in the world. Games need to update many times a second; they are updating quicker than you can send a data packet to a server farm, process it, and send it back.

Ohhh, what about collating the data of 1 million users and presenting the result to a console? You know what, when you start to do crap like that, the data you get back is just crap anyway and will not present anything useful to 99% of games. This is my point: the cloud is a PR BS statement. It can do something with 300,000 servers, but nothing that will help 99% of the games that exist (isn't that what we're talking about, or are you saying that in some niche situation it may be better to do something in the cloud?).
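
The frame-budget arithmetic behind that point is simple enough; a rough sketch (the round-trip time used is an illustrative assumption, not a measurement):

```python
# Rough sketch: per-frame time budget vs. a round trip to a remote server.
# The 60 ms round-trip time is an illustrative assumption, not a measurement.
frame_budget_ms = 1000 / 30      # ~33.3 ms per frame at 30 fps
round_trip_ms   = 60             # assumed internet round trip to a data centre

frames_missed = round_trip_ms / frame_budget_ms
print(f"Frame budget: {frame_budget_ms:.1f} ms; one server round trip spans ~{frames_missed:.1f} frames")
# Anything that has to come back within the frame (physics, player-facing rendering)
# can't wait on that round trip; only work that tolerates seconds of delay can.
```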

Is it true that all these cloud servers will improve multiplayer game hosting and may cause fewer issues on day 1 of an MMO launching? Yep. Single-player games? Not one bit. You know what, dedicated servers like we've had for the last however many years would have the same hosting benefits, and better-managed MMO launches would solve the other.

Tell me how the Xbox One and PS4 are so different when they will both have 95% of the same features and games. Why is it that many people say "look at the games" when trying to compare the two consoles? It's because that, and the actual performance when playing those games, are effectively the two things that matter. Let's not start getting into the tit for tat ("but Kinect can see in the dark when the PS Eye cannot"); I'm talking about back-of-the-box core features. The two consoles are essentially the same now that the DRM has gone.

Lol, somehow I don't see giving away an internally made game such as Knack for free in Japan as anywhere near the same as having to pay out millions of dollars for a 3rd-party game. Seriously, are you really that confused that you cannot see what you are saying is just silly?

So you are saying Gaikai will not work?  Does that mean you think Sony is lying about the backwards compatibility?

Why would they do that, and do you have any proof?



 


petalpusher said:

Comparing an HD 7770 to an HD 7950 by looking only at engine clock speed doesn't really do it justice.  That's the only spec where the 7770 is even a tiny bit better.

While the HD 7770 does have a faster engine clock at 1000MHz versus the 7950's 850MHz (boosting up to 925MHz), it loses out on almost every other spec.

The 7950 has nearly three times the CUs (28 vs 10), faster memory at 5 Gbps vs 4.5 Gbps, and much higher bandwidth at 240GB/s vs only 72GB/s.

There is so much more that the 7950 does better, so it should have higher performance.  But even with all those advantages it only just over doubles the performance, despite having nearly three times the CUs.  It should be much higher.



 


Zappykins said:



The Radeon 7950/7970's real strength shows when you start gaming beyond that rather low 1920x1080 resolution; then it's an entirely different ball game and the 7770 folds like there's no tomorrow, being essentially useless. That's where the extra bandwidth *really* comes into play. Besides, the consoles don't have 7950 levels of performance anyway.

This would be a little more accurate: http://www.anandtech.com/bench/product/777?vs=778
The 7850 doubles the 7770 in Hitman, and it doesn't even have double the hardware.
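
A rough sketch of that comparison using the commonly listed reference specs (treated here as assumptions):

```python
# Commonly listed reference specs, treated here as assumptions:
# HD 7770: 10 CUs @ 1000 MHz, 72 GB/s; HD 7850: 16 CUs @ 860 MHz, 153.6 GB/s.
def tflops(cus, mhz):
    return cus * mhz * 1e6 * 64 * 2 / 1e12

print(f"Compute ratio (7850/7770):   {tflops(16, 860) / tflops(10, 1000):.2f}x")  # ~1.38x
print(f"Bandwidth ratio (7850/7770): {153.6 / 72:.2f}x")                          # ~2.13x
# The compute gap alone doesn't explain a 2x result in Hitman; the bandwidth gap does much of the work.
```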




www.youtube.com/@Pemalite

Pemalite said:

True, I wish the new consoles had that kind of GPU power in them. But it would take something close to that to make much of a visual difference, so I guess we should be happy they don't cost $1000 each.

Just curious, do you have a greater-than-1080p monitor, or multiple monitors?  I don't know many people who do, and nobody with 4K yet.



 


Legend11 said:
Xenostar said:
Legend11 said:
Xenostar said:
Subie_Greg said:
PS4 is more powerful, both on paper and in actual real-world performance. Yes, you can optimize on the Xbox One, but you can also do so on PS4 (just pointing that out because Xbots now keep bringing that up for some reason).

Multiplatform games will look the same as far as the eye test goes, so what difference will it make?

But the PS4 is more powerful, and you can twist and spin shit to mean whatever you like. Go ahead and add 8GB of RAM to the eSRAM and multiply that by the month the Xbox One comes out, and yada yada, so on and so forth. Whatever makes you feel better about the product you want to buy.

I do enjoy reading the threads where "Xbox One is almost as powerful" = "just as powerful" = "you should buy an Xbox One", haha.


Totally. Now, if they could say "it's almost as powerful" and follow up with "and it's a lot cheaper", that would make sense, but it's way more expensive. It's just a crazy place to start trying to win people over from.

That's like selling a car by saying "it's almost as fast as Car X, and we'll only charge you more money for it!"


It has a lot more features and they really are different beasts in that regard.  For example if you took a PS4 out of the box and hooked it up and then said "PS4 on" what would happen?  What about if you wanted to talk to someone on your friends list, would saying "PS4 skype Iluvyermom" to your PS4 open a window and contact your friend?  What about if you wanted to exercise with a game would a PS4 version be able to tell you which muscles are currently being stressed or tell what your heart rate currently is by looking at you?  There are many things the Xbox One can do that the PS4 simply can't so it's not as simple as saying one is better because neither one is better at everything.


I'll take the one that's better at games over the one that's better at voice controls :)


So you're saying that the PS4 is better at dance or fitness games, for example?  What about quirky indie games that take advantage of features the Xbox One has that the PS4 either can't do as well or can't do at all?  Apparently one of the best HD games shown so far is Titanfall, and while its graphics are good they aren't pushing the system by any means, yet it's completely owning the PS4 lineup in terms of awards and media attention.


Yep, better multiplats every time, over a motion-game fad that's long passed and holds no interest for most anymore, especially me.