
SONY to reveal The PS4 Processor November 13th at AMD APU13

So what are you guys expecting to see as the CPU clock speed? I think it will probably be the 1.6 GHz we've been hearing, but I'd love to see something like 2.0 GHz :P



Normando said:
So what are you guys expecting to see as the CPU clock speed? I think it will probably be the 1.6 GHz we've been hearing, but I'd love to see something like 2.0 GHz :P

3.7 GHz, 4.1 GHz turbo

True story.



Normando said:
So what are you guys expecting to see as the CPU clock speed? I think it will probably be the 1.6 GHz we've been hearing, but I'd love to see something like 2.0 GHz :P

Actually ... 

http://arstechnica.com/information-technology/2013/04/amds-heterogeneous-uniform-memory-access-coming-this-year-in-kaveri/ (Read the update. If this is true, indie developers could thrive because they would get good graphics with less effort, while it would also reduce some development costs for AAA game developers.)



fatslob-:O said:
Actually ... 

http://arstechnica.com/information-technology/2013/04/amds-heterogeneous-uniform-memory-access-coming-this-year-in-kaveri/ (Read the update. If this is true, indie developers could thrive because they would get good graphics with less effort, while it would also reduce some development costs for AAA game developers.)


Wait, if this is true, why hasn't anyone made a bigger deal of it? I'm not really a tech guy, but this sounds like it would make the PS4 more powerful than its specs would suggest, while also making it easier to develop for.



Normando said:
Wait, if this is true, why hasn't anyone made a bigger deal of it? I'm not really a tech guy, but this sounds like it would make the PS4 more powerful than its specs would suggest, while also making it easier to develop for.

The thing is, the PS4 is already more powerful on raw specs alone. (I'm not referring to shaders here, but to its bandwidth and ROPs.) The only advantage I think hUMA can bring to the table is better physics performance, seeing as the CPU can accelerate certain parts of the pipeline.



fatslob-:O said:

The thing is, the PS4 is already more powerful on raw specs alone. (I'm not referring to shaders here, but to its bandwidth and ROPs.) The only advantage I think hUMA can bring to the table is better physics performance, seeing as the CPU can accelerate certain parts of the pipeline.

I don't mean more powerful compared to the One. I mean more powerful compared to what we already know about its specs. Would a CPU+GPU with hUMA be more powerful than the same CPU+GPU without hUMA, or just easier to make games for?



Normando said:

I don't mean more powerful compared to the One. I mean more powerful compared to what we already know about its specs. Would a CPU+GPU with hUMA be more powerful than the same CPU+GPU without hUMA, or just easier to make games for?

It depends. If games were built to effectively exploit the fact that the CPU sits inside the graphics pipeline, it could be a pretty big advantage, seeing as both processors get much better simultaneous access to the same data.
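To make the contrast concrete, here's a toy Python model (not the real PS4 or AMD API): in a classic split-memory design every CPU-to-GPU handoff copies the buffer, while a hUMA-style design lets both sides work on one coherent buffer with zero copies. The pipeline steps and copy counts are illustrative only.

```python
# Toy model contrasting split memory (copy on every handoff) with a
# hUMA-style shared buffer (both processors touch the same memory).

def classic_pipeline(data):
    """CPU and GPU each work on their own copy; count the transfers."""
    copies = 0
    gpu_buffer = list(data)        # upload: CPU memory -> GPU memory
    copies += 1
    gpu_buffer = [x * 2 for x in gpu_buffer]  # "GPU" pass (e.g. shading)
    cpu_buffer = list(gpu_buffer)  # readback: GPU -> CPU for physics
    copies += 1
    cpu_buffer = [x + 1 for x in cpu_buffer]  # "CPU" pass (e.g. physics)
    return cpu_buffer, copies

def huma_pipeline(data):
    """One shared buffer; both passes mutate it in place, zero copies."""
    shared = data                  # same address space for CPU and GPU
    for i in range(len(shared)):
        shared[i] *= 2             # "GPU" pass
    for i in range(len(shared)):
        shared[i] += 1             # "CPU" pass
    return shared, 0

result_a, copies_a = classic_pipeline([1, 2, 3])
result_b, copies_b = huma_pipeline([1, 2, 3])
assert result_a == result_b == [3, 5, 7]
print(copies_a, copies_b)  # classic does 2 transfers, hUMA does 0
```

Same result either way; the shared-memory version just never pays for the transfers, which is the whole pitch.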



Kyuu said:
Pemalite said:
Kyuu said:
Crytek's president said the CPU isn't much improved over last gen, so I'm not expecting much. But hopefully it'll be clocked at over 1.6 GHz.


You have made my day. Thanks.


I'm sure I did, lol. You seem to enjoy seeing consoles underdeliver :))

I would've preferred an improved Cell 2.0 over this energy-friendly wimp called Jaguar. But I guess third-party developers would bitch and whine about its complexity.. :(

Jaguar is not only faster, but it's far more flexible too.

The Cell was incredible at some tasks, not all. E.g., it sucked at double-precision floating point and integer math.
Jaguar, on the other hand, is essentially good at everything in comparison. (It's still a crappy, slowest-of-the-slow CPU in the x86 space, however.)

Problem is, game engines use differing types of math for different tasks and problems, so when you run a game engine on the Cell, the CPU is never fully utilised in a constant manner.

To put it in layman's terms, think of Jaguar as a car that does 100 kilometres per hour constantly, regardless of the road conditions.
Think of the Cell as a car that does 50 kilometres per hour constantly, but when the road is straight and there are no bumps it can accelerate to 150 kilometres per hour.

Thus, because the Cell has such a varying top speed, over time Jaguar manages to get to the finish line first.
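Running the numbers on the car analogy (assuming, for illustration, that the Cell-like car covers half the distance in each mode):

```python
# Worked version of the car analogy: over a 300 km "race", the Cell-like
# car runs half the distance at 150 km/h and half at 50 km/h, while the
# Jaguar-like car holds a steady 100 km/h the whole way.
distance = 300.0  # km (arbitrary race length)

jaguar_time = distance / 100.0                              # 3.0 hours
cell_time = (distance / 2) / 150.0 + (distance / 2) / 50.0  # 1 + 3 = 4 hours

cell_average = distance / cell_time                         # 75 km/h
print(jaguar_time, cell_time, cell_average)
```

The slow stretches dominate: even with a 150 km/h top speed, the Cell-like car averages only 75 km/h, so the steady 100 km/h car finishes first.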

Still, comparatively, both CPUs are crap next to any decent Intel dual-core, which, thanks to its lower core count, is easier for developers to take advantage of.
However, the reason both Microsoft and Sony chose Jaguar is simple: it's cheap and energy efficient, allowing more TDP and die area to be spent on the GPU.
Microsoft, Sony and Nintendo have *never* taken CPU performance seriously; the CPU isn't what draws the pictures that sell games. In general, the Cell would use CPU time for framebuffer and post-processing effects, but any CPU can do that.
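A CPU post-processing pass really is just a per-pixel loop, which is why any CPU can do it. A minimal sketch (a made-up brightness filter over a toy grayscale framebuffer, not a real console effect):

```python
# Software post-processing sketch: a tiny "framebuffer" of grayscale
# pixel values gets a brightness pass on the CPU after the frame is
# rendered. Values are clamped to the usual 8-bit maximum of 255.
framebuffer = [
    [10, 20, 30],
    [40, 50, 60],
]

def brighten(fb, amount):
    """Return a new framebuffer with `amount` added to every pixel, clamped to 255."""
    return [[min(255, px + amount) for px in row] for row in fb]

out = brighten(framebuffer, 200)
print(out)  # [[210, 220, 230], [240, 250, 255]]
```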



--::{PC Gaming Master Race}::--

Kyuu said:


I thought by now it should be possible to speed up and improve the Cell to the point that it easily outperforms a 1.6 GHz Jaguar in most regards, because apparently even a 7-year-old Cell isn't too far behind the newer Jaguar architecture. Plus, the fastest supercomputer in the world was Cell-powered at one point.

I think Sony might've benefited from the Cell, given that it was developed in-house with IBM and Toshiba. Not sure how it all works economically for each company, but since they've already invested over a billion dollars in the project, why would they let it all go? Were they too hopeful? Did they miscalculate, or was the Cell planned as a temporary product from the get-go? I'd like to hear your take on the matter.

I really don't know much about this, but wasn't the Cell initially supposed to handle graphics alone, without a GPU? Does it have an edge in graphics processing over similarly priced/powerful traditional CPUs, or is this just gibberish?


The fastest supercomputer in the world currently isn't the fastest because of its CPU type, but because it's powered by nVidia's Tesla.
Besides, when you build a supercomputer you can take advantage of a particular CPU's strengths to extract maximum performance; the workload isn't as variable as a game engine's.

For sure, it would be relatively easy to improve the Cell to the point it would make Jaguar look like something from the 1980s, but that would cost billions in R&D, money Sony really doesn't have while it keeps reporting losses to shareholders. Not to mention the extra difficulty it creates for developers, which might have made the Xbox One the lead platform for all multiplatform games. (x86 is relatively easy to build games for.)
It would also have driven up the cost of the PlayStation 4, a situation Sony would not have benefited from. AMD has already spent the cash on R&D building Jaguar, remember.
Thus, in the end it really all comes down to costs, which actually benefited the consumer with a cheaper console.

As for the Cell doing graphics processing, that was mostly just advertising fluff.
If you remember, decades ago games would actually fall back to a form of "software rendering", where the CPU handled all the graphics effects in a game. That isn't something unique to the Cell; it was being done on the old 486 CPUs, which are half the speed of the original Pentiums.
Heck, even the Xbox 360 used its CPU in some games to improve the graphics, spending CPU compute time on Morphological Anti-Aliasing, which is merely a filter on the screen and very cheap to implement.
Half the problem with CPUs doing GPU work, though, is that rendering a game is stupidly parallel, while CPUs are very serial in the way they process information. (And core counts!)
The best example for this is a book.
A CPU will read each page in the book, one page after the other, until it gets to the end; a GPU will read every single page at the same time. The Cell isn't exactly GPU-like in the way it processes information, so even though it could potentially render an entire game (which even a CPU from 20 years ago could do), it wouldn't have been feasible from an image-quality or performance perspective.
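The book analogy, sketched in Python (a loose illustration of scheduling, not how GPUs are actually programmed; Python threads don't even execute bytecode in parallel, but the serial-vs-dispatch contrast still shows):

```python
# "Book reading" analogy: a CPU-style loop visits pages one after
# another, while a GPU-style dispatch hands every page to a worker at
# once. Both produce the same answer; only the schedule differs.
from concurrent.futures import ThreadPoolExecutor

pages = [f"page {i}" for i in range(8)]

def read_page(page):
    return len(page)  # stand-in for per-page work (e.g. shading one pixel)

# "CPU": one reader, strictly in order.
serial = [read_page(p) for p in pages]

# "GPU": many readers, one per page, all launched together.
with ThreadPoolExecutor(max_workers=len(pages)) as pool:
    parallel = list(pool.map(read_page, pages))

assert serial == parallel  # same result; parallelism changes speed, not answers
```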



--::{PC Gaming Master Race}::--

Please, no more "secret sauce"... the term was so overused these last few months that I don't want to hear it ever again.

So I wonder why Sony would announce this; it seems a little strange. I wonder if they've upped the processor from 1.6 GHz to 1.7 GHz, or whatever the Xbox One runs at, so they can match it. Otherwise, why bother talking about it... I suppose it's another bit of PR, but only if it's good PR...



Making an indie game : Dead of Day!