
NX Gamer PS5 Full Spec Analysis

EricHiggin said:

You know the jig is up when 12.1TF over 10.3TF isn't enough for some unimaginable reason, and you have to dig up an old leak that doesn't entirely match the official announcement to 'prove' the official PlayStation announcement is BS.

The really annoying part of this 12TF worshipping is the fact that the XSX will not run at 12TF most of the time. If you watch any modern GPU at work (there are plenty of DF videos available on YouTube), you'll see that they constantly fluctuate their clock rates. No modern GPU runs at a fixed maximum clock rate, and the GPU in the XSX won't either. So the whole "it's not 10.3TF but 12TF" argument is a waste of time.



Fei-Hung said:
Dunno who this is but he has a different take on it and the SSD. Worth a watch.

https://youtu.be/PW-7Y7GbsiY

I've been subscribed to Coreteks for a while now; he makes a compelling argument here, which is basically that AMD's solution was inspired by Nvidia's solution, and Nvidia's solution was to make room for two accelerators (compression/decompression) in the Turing architecture. AMD has implemented a similar solution in the custom I/O block of Sony's APU. I really liked the part where he detailed Nvidia's perspective: math is free, but memory and communication are expensive, meaning, to me, the bottlenecks.

I'm picturing the frame drops one constantly gets in multiplayer games. They start out very high (close to what the full potential of the GPU/CPU can give), but tank in very intensive fight sequences where there may be a lot of action or a lot of people in one session. How much of that frame drop comes from the CPU/GPU not keeping up, and how much from a huge bottleneck in memory and communication? Once those bottlenecks are improved, how much less will the CPU/GPU be stressed doing redundant work, and how much will be freed up to perform other computational tasks? (Efficiency.) After all, there is a saying that you are only as strong as your weakest link, and I truly believe that disc drives and hard drives (even some SSDs) are far too slow to keep up with the video games we have today.
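To put a toy number on that intuition, here's a sketch (my own illustration with invented figures, not Sony's or Nvidia's actual pipeline) of why spending "free" math on decompression raises effective disk throughput when raw I/O is the weak link:

```python
import time
import zlib

# Hypothetical, highly compressible stand-in for a game asset (~3.7 MB).
payload = (b"texture block " * 4096) * 64
compressed = zlib.compress(payload, 6)

start = time.perf_counter()
assert zlib.decompress(compressed) == payload   # the "free math" step
decompress_ms = (time.perf_counter() - start) * 1000
print(f"measured ratio: {len(payload) / len(compressed):.0f}x, "
      f"decompress cost: {decompress_ms:.2f} ms")

# Real game data compresses far less than this toy payload; assume ~2x,
# the ballpark usually quoted for hardware decompression blocks.
LINK_GBPS, TYPICAL_RATIO = 2.4, 2.0             # illustrative numbers only
print(f"a {LINK_GBPS} GB/s link moving {TYPICAL_RATIO:.0f}x-compressed data "
      f"behaves like {LINK_GBPS * TYPICAL_RATIO:.1f} GB/s")
```

The catch is that decompress cost, which is exactly why both vendors moved it into dedicated hardware instead of burning CPU cores on it.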

It will be a very interesting upcoming generation, that is for sure. I'm excited, and Coreteks' video is totally worth a view; thanks for posting it.



DraconianAC said:
Fei-Hung said:
Dunno who this is but he has a different take on it and the SSD. Worth a watch.

https://youtu.be/PW-7Y7GbsiY

I've been subscribed to Coreteks for a while now; he makes a compelling argument here, which is basically that AMD's solution was inspired by Nvidia's solution,

This is just one reason why you should NOT reference people like this guy, who does not know what he is talking about. You should listen to people who are a) hardware knowledgeable, or b) programmers who actually know how to handle GPUs.

This guy is c): a clickbaiter disguised as a technobabbler.

(Not my assessment, but several developers' opinions.)



drkohler said:
DraconianAC said:

I've been subscribed to Coreteks for a while now; he makes a compelling argument here, which is basically that AMD's solution was inspired by Nvidia's solution,

This is just one reason why you should NOT reference people like this guy, who does not know what he is talking about. You should listen to people who are a) hardware knowledgeable, or b) programmers who actually know how to handle GPUs.

This guy is c): a clickbaiter disguised as a technobabbler.

(Not my assessment, but several developers' opinions.)

Which developer opinions? Would you be kind enough to share your source(s)?

About clickbait: I've not seen anything from Coreteks that ever felt like clickbait. Sure, he talks about potential avenues that may occur, but he is clear in describing them as such. In fact, the guy has over 100k subscribers and most of his videos are highly liked, and I can tell what clickbait is; here is an example: CrapgamerReviews on YouTube (https://www.youtube.com/channel/UCFhc9fgFQBvZ9iH3BfLK9Ow). Coreteks has educated me about the RISC-V ISA, x86-64 processors, GPU/CPU company technologies, ARM processors, etc., and the topics he dwells on are not clickbait, so I respectfully disagree with your assessment.

About technobabble: I don't think having a civilized discourse on potential bottlenecks in x86-64 technology is technobabble. If you don't understand the content, then why would you bash it as technobabble? I would think most of us here are curious and want to learn more about the PS5 tech. I'm willing to entertain what Mark Cerny, Digital Foundry, Moore's Law Is Dead, Coreteks, NX Gamer, etc. are saying about the PS5 because I'm curious, and bored. (COVID-19.)



twintail said:
Can someone simplify the CPU/GPU thermal design?

My understanding is that because they have a set thermal limit, games will be able to nearly max out the CPU/GPU for extended periods of time without any concern.
Whereas in a more traditional design, the cooling and thermal solution has to guess at theoretical max loads because of fluctuations?

I don't know.

Basically, you set your aircon to 23 degrees because you know that regardless of the sunlight or activities in the room, you'll remain cool. You can party hard during the day because the aircon has been set to handle all that play.
As opposed to setting it to 25 and having to drop the temperature because the sunlight and in-room activities are warming up the room, thus making the aircon work harder?

Like I said, I'm trying to understand how it works on a more basic level.

Thanks.

-----------------------------

Basically, what Cerny tried to explain was...

a) Games are hardly ever both CPU and GPU bottlenecked. Almost always, one of them is maxed out while the other is partially idle. It's also really difficult to fully utilize an 8-core CPU, even when only 7 or 6 cores are available to developers; not all cores are maxed out 99% of the time.

b) When the CPU alone is maxed out, it is either because the work is not sufficiently parallel and only one or two cores are fully utilized, in which case the power budget is still underutilized and the extra power is directed to the GPU,

c) or because of a high level of parallelization, in which case the CPU frequency is lowered, but that is not much of an issue since parallelization achieves a much higher level of efficiency.

d) When the GPU is maxed out, the CPU is usually under-utilized.

In short, for cases b & d, the thermal constraint is not an issue 99% of the time. For case c, it is an issue about 5% of the time. Overall, the CPU and the GPU will meet the required frequencies 95-99% of the time. This translates to roughly a 5-10% deficit on the CPU side and a 15-20% deficit on the GPU side compared to the XSX, which is, in the whole scheme of things, negligible for developers. The 2.3x difference in SSD speed, however, is not negligible, and it is gonna be a game changer for the PS5.
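A crude way to picture that power sharing (a toy model of my own, with invented wattages; only the 3.5 GHz and 2.23 GHz caps are official) is a single fixed APU budget, where whatever the CPU doesn't draw is left over to hold the GPU at its cap:

```python
# Invented wattages for illustration; only the clock caps are official specs.
POWER_BUDGET_W = 200.0
CPU_MAX_W, GPU_MAX_W = 60.0, 180.0
GPU_MAX_GHZ = 2.23

def gpu_clock(cpu_draw_w: float) -> float:
    """GPU clock for a given CPU power draw under one shared, fixed budget.
    Crude assumption: clock scales linearly with available power."""
    available = min(GPU_MAX_W, POWER_BUDGET_W - cpu_draw_w)
    return GPU_MAX_GHZ * available / GPU_MAX_W

for cores_busy in (2, 4, 8):                  # lightly -> fully threaded game
    cpu_draw = CPU_MAX_W * cores_busy / 8     # draw grows with busy cores
    print(f"{cores_busy} cores busy (CPU ~{cpu_draw:.0f} W): "
          f"GPU at {gpu_clock(cpu_draw):.2f} GHz")
```

With these made-up numbers a fully threaded game costs the GPU about 20% of its clock; Cerny's claim is that the real-world penalty is a couple of percent at worst, so read the output as the shape of the trade-off, not the scale.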







KratosLives said:
Although the PS5 has a faster SSD, with nearly double the throughput, the Xbox has faster memory bandwidth and the 100 GB of data immediately available. Will the faster memory of the XSX offset the SSD advantage?

Correct me if I am wrong, but wouldn't we be able to upgrade the Xbox SSD at any time for a faster one, taking away the one small advantage the PS5 hardware has over the Xbox?
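For scale, here's the raw arithmetic from the published spec sheets (the 13.5 GB game-usable RAM figure is Microsoft's; I'm assuming the same for the PS5, which hasn't been confirmed):

```python
# Official raw SSD throughput; the usable-RAM figure is assumed equal for both.
consoles = {
    "PS5":           {"ssd_gb_per_s": 5.5, "usable_ram_gb": 13.5},
    "Xbox Series X": {"ssd_gb_per_s": 2.4, "usable_ram_gb": 13.5},
}

for name, spec in consoles.items():
    fill_s = spec["usable_ram_gb"] / spec["ssd_gb_per_s"]
    print(f"{name}: {fill_s:.1f} s to refill usable RAM from the SSD")

print(f"raw throughput ratio: {5.5 / 2.4:.2f}x")   # the ~2.3x gap cited above
```

Memory bandwidth (448 GB/s on PS5 vs 560 GB/s on the XSX's fast 10 GB pool) governs how quickly the GPU can chew on data already in RAM each frame, while SSD speed governs how quickly new data gets into RAM, so the two address different bottlenecks rather than offsetting each other.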



shikamaru317 said:

This is my understanding as well, based on what I've read on tech sites. Developers can choose either a GPU or a CPU overclock on PS5 based on whether their game is more CPU or GPU intensive, but thermally it can't handle full overclocks on both at once.

a) The image is cool, but totally false.

b) Neither the CPU nor the GPU in the PS5 is overclocked. The nominal frequencies are 3.5GHz and 2.23GHz.

c) Unless you live in Death Valley on a front porch, your PS5 will very likely run at nominal CPU and GPU speeds whenever required.

d) No console will ever run at full CPU and GPU speed for a prolonged amount of time (there are always bottlenecks that slow down parts of a GPU). This will be particularly apparent for the CPU, as developers will have a hard time maxing out all CPU cores for prolonged periods.

e) If the GPU runs at nominal speed (2.23GHz) for prolonged amounts of time, it does not have to clock down as long as the CPU is not continuously running at nominal speed, since "unused" wattage is redirected from the CPU to the GPU.

f) If you play Pong! on the PS5, both the CPU and GPU will be significantly downclocked from their nominal clock rates. The same is true for the XSX: your big rectangles in the picture become tiny rectangles.

Again, in case it's not clear: modern GPUs do not run at full clock rates when it's not required. There are enough DF videos on YouTube showing this effect with various games.
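Point f) is just ordinary demand-driven clock scaling. A minimal sketch of such a governor (generic DVFS logic of my own, not AMD's or Sony's actual algorithm): the clock drifts down when frames finish early and climbs back when they run long:

```python
TARGET_FRAME_MS = 16.7                   # 60 fps budget
MIN_GHZ, MAX_GHZ = 0.8, 2.23             # the floor is an invented placeholder
clock_ghz = MAX_GHZ

def step(work_ms_at_max_clock: float) -> None:
    """One frame: nudge the clock toward the frame-time budget.
    Assumes frame time is inversely proportional to clock speed."""
    global clock_ghz
    frame_ms = work_ms_at_max_clock * (MAX_GHZ / clock_ghz)
    if frame_ms < 0.8 * TARGET_FRAME_MS:         # lots of headroom -> downclock
        clock_ghz = max(MIN_GHZ, clock_ghz * 0.95)
    elif frame_ms > TARGET_FRAME_MS:             # over budget -> upclock
        clock_ghz = min(MAX_GHZ, clock_ghz * 1.05)

for _ in range(60):
    step(2.0)                                    # Pong-like load: 2 ms of work
print(f"light load settles at {clock_ghz:.2f} GHz")

for _ in range(60):
    step(16.0)                                   # heavy scene: 16 ms of work
print(f"heavy load climbs back to {clock_ghz:.2f} GHz")
```

Note the PS5 scheme runs the other way around (it caps on power draw rather than chasing a frame target), but the shared point stands: silicon with nothing to do does not sit at its maximum clock.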



shikamaru317 said:
drkohler said:

Again, in case it's not clear: modern GPUs do not run at full clock rates when it's not required. There are enough DF videos on YouTube showing this effect with various games.

One of the Digital Foundry guys seems to disagree:

Dictator and DF are totally biased.



shikamaru317 said:

One of the Digital Foundry guys seems to disagree:

Actually, he totally agrees. The Zen 2 CPU will not be fully used (no surprise here), so power is shifted to the GPU...