
Forums - Sony Discussion - PS5 GDC Reveal and PS5 specs/performance Digital Foundry video analysis: 3.5 GHz 8-core Zen 2 CPU, 10.3 TF RDNA 2 GPU with ray tracing, 16 GB GDDR6 RAM, and a super crazy fast 5.5 GB/s SSD

 

Poll: How do you feel?
- My brain become bigger su...: 21 votes (30.00%)
- I am wet: 6 votes (8.57%)
- What did he talked about???: 5 votes (7.14%)
- I want some more info: 9 votes (12.86%)
Total: 41
the-pi-guy said:
alexxonne said:

I don't need to refute something when you've already accepted you're in denial about it, and I don't intend to argue.

You have your opinion and I respect that. I have mine, and I expect the same.

But as far as hard facts go, even the Digital Foundry article says it, and they are the ones with the most expertise.

I'd much rather take Digital Foundry's words over that! 

https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-specs-and-tech-that-deliver-sonys-next-gen-vision

https://www.eurogamer.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive

Seriously, you're taking the word of an article based on a year-old leak over the word of articles written after the official specs were actually announced.

Man, the coronavirus is eating your brain. What are you talking about? The guy has been talking about this all these weeks, and just today he mentioned it again and essentially reinforced that concept: "...to the kind of overclocking, which is essentially what the PS5 is delivering...".

Other Links (2 weeks ago) - DF Direct: PlayStation 5 - The Official Specs, The Tech + Mark Cerny's Next-Gen Vision

Other Link 2 (One Month ago) - In Theory: Can a 4TF Navi GPU Deliver a Next-Gen Console Experience?

Obviously, we will have to wait until the PS5 launches so they can dissect and confirm it. But all the money and arrows point to that, and more importantly, the math backs up the teraflop estimates.

Last edited by alexxonne - on 03 April 2020

alexxonne said:

The worst part, and this is real speculation on my part, is that the base clock/performance could be lower than 2 GHz/9.2 TF, but thankfully this won't matter too much given the design's focus on boost and peak performance, and that gives Sony something.

Wow, another week and you will downgrade the PS5 to 8 TFLOPs. For an absolute newbie, you are sticking your neck out very far.

You are in constant denial of what Cerny tried to explain in the GDC talk and in the talk with Leadbetter (who obviously can't fully talk about his One-on-One with Cerny). Hopefully for Cerny's peace of mind he doesn't read all the nonsense people like you spread in forums.

Let me speculate on something (I'm not saying whether this is true or not): one of the early dev units ran selected, fully enabled Radeon 5700s at 2 GHz. That makes it 10.3 TFLOPs. GitHub then transmogrified this info into a "standard" Radeon 5700 (which has 4 CUs disabled as a standard setting for yield reasons), and presto, the 9.2 TFLOPs rumour appeared. Again, I'm not saying this is true, because you seem to absolutely love your fantasy that the PS5 is at most 9.2 TFLOPs. I'm not even going to try to explain to you how unlikely it is for the XSX to continuously run at 12 TFLOPs, because that would make you extremely unhappy (and you wouldn't understand it anyway).

So let's see your next fantasy that the PS5 is only 8 TFLOPs; bring it on, we all need a good laugh....
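
For reference, the teraflop numbers being thrown around here all fall out of the standard FP32 formula for these AMD GPUs (CUs x 64 shaders per CU x 2 operations per clock x clock speed). A minimal sketch of that arithmetic, plugging in the relevant CU counts and clocks:

```python
# FP32 throughput formula behind the TF figures quoted in this thread.
# CUs * 64 shaders/CU * 2 FP32 ops per shader per clock * clock in GHz -> TFLOPS.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(tflops(40, 2.00))   # fully enabled Navi 10 (Radeon 5700 XT class) at 2 GHz -> 10.24, i.e. "10.3"
print(tflops(36, 2.00))   # 4 CUs disabled, same 2 GHz clock                      -> 9.216, the "9.2" rumour
print(tflops(36, 2.23))   # PS5's announced 36 CUs at the 2.23 GHz cap            -> ~10.28, the official 10.3
print(tflops(52, 1.825))  # Xbox Series X's announced 52 CUs at 1.825 GHz         -> ~12.15, the official 12 TF
```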



drkohler said:
alexxonne said:

The worst part, and this is real speculation on my part, is that the base clock/performance could be lower than 2 GHz/9.2 TF, but thankfully this won't matter too much given the design's focus on boost and peak performance, and that gives Sony something.

Wow, another week and you will downgrade the PS5 to 8 TFLOPs. For an absolute newbie, you are sticking your neck out very far.

You are in constant denial of what Cerny tried to explain in the GDC talk and in the talk with Leadbetter (who obviously can't fully talk about his One-on-One with Cerny). Hopefully for Cerny's peace of mind he doesn't read all the nonsense people like you spread in forums.

Let me speculate on something (I'm not saying whether this is true or not): one of the early dev units ran selected, fully enabled Radeon 5700s at 2 GHz. That makes it 10.3 TFLOPs. GitHub then transmogrified this info into a "standard" Radeon 5700 (which has 4 CUs disabled as a standard setting for yield reasons), and presto, the 9.2 TFLOPs rumour appeared. Again, I'm not saying this is true, because you seem to absolutely love your fantasy that the PS5 is at most 9.2 TFLOPs. I'm not even going to try to explain to you how unlikely it is for the XSX to continuously run at 12 TFLOPs, because that would make you extremely unhappy (and you wouldn't understand it anyway).

So let's see your next fantasy that the PS5 is only 8 TFLOPs; bring it on, we all need a good laugh....

Perhaps you're the newbie, because you're arguing with no facts and without knowing the manufacturing process of a CPU or GPU.

I'm not spreading anything that top magazines, tech experts and Cerny haven't said.

Don't you know that whenever a new CPU/GPU comes out, the die is unstable and a lot of testing must be done to correct instability issues, and with that the base clock frequency can go up or down? All CPUs/GPUs have to go through this manufacturing process. That is how we were able to see PS5 leaks with GPU clocks as low as 1 GHz, and then 1.8 GHz and 2 GHz respectively. Jesus Christ, READ!! Even Mark Cerny's first presentation mentioned it.

"...Running the GPU at 2ghz was looking like an unreachable target with the old frequency strategy with the new paradigm we're able to run way over that, in fact we have to cap GPU freq. at 2.23ghz..."

And with the Df article, again,

" The Gonzalo leak back in April suggested that PlayStation 5 would feature a Zen 2-based CPU cluster running at 3.2GHz paired with a Navi graphics core running at 1.8GHz.'

If we go by your speculation, you need to do a lot of reading. If you actually READ the article, it wasn't a 5700 card; the specifications in the leak were the same as the revealed PS5 specs, with the sole exception of the clocks, which Cerny confirmed they managed to increase (CPU/GPU).

All these leaks come from testing different frequencies and stability targets before mass production. Even when chips are already made, some companies lower the base clock because they're faulty or not stable enough. That is how CPU spec codes come about. Sometimes improvements are made: a better clock, new instructions, etc. MAN, YOU NEED TO READ A LOT.

Hence my educated guess, based on Cerny saying that 2 GHz was looking unreachable at the time: a GPU with a lower clock (<2 GHz) would have resulted in a slower-performing card with less than 9.2 TF at some point.

So my point makes all the sense in the world if every tech magazine is talking about it, especially the top one. And those are FACTS, deal with them.

If you wanna marry a PS5 console, you have my blessing, but my love for the brand doesn't go that far.

So, I doubt you're laughing now. Next one...

Last edited by alexxonne - on 03 April 2020

alexxonne said:

"...Running the GPU at 2 GHz was looking like an unreachable target with the old frequency strategy. With the new paradigm we're able to run way over that; in fact, we have to cap the GPU frequency at 2.23 GHz..."

Look, I have worked alongside electronic engineers, chip designers, etc. in my life. I was in chip factories before you were even born. All I can say is that electronics engineers, chip designers, name your specialist profession, have a "proprietary language" that gets more irritating the deeper it goes into the territory it covers, until it is no longer understandable to the average Jane/Joe. When Cerny gave his GDC talk, he was talking in "chip design language". I understand that language (to a certain extent); you don't. The quote you chose to handpick above does NOT mean that Cerny tried to build and test chips at 1.6, 1.8, 2.0, 2.2 GHz (and all the bullshit rumours on random blogs that go along with them) before having his enlightenment. He is a very experienced chip designer (in the conceptual sense; AMD has to do the grunt work in the end), amongst other things. Are we really going into the "appeal to authority" minefield? I'm out, here and now.



drkohler said:
alexxonne said:

"...Running the GPU at 2 GHz was looking like an unreachable target with the old frequency strategy. With the new paradigm we're able to run way over that; in fact, we have to cap the GPU frequency at 2.23 GHz..."

Look, I have worked alongside electronic engineers, chip designers, etc. in my life. I was in chip factories before you were even born. All I can say is that electronics engineers, chip designers, name your specialist profession, have a "proprietary language" that gets more irritating the deeper it goes into the territory it covers, until it is no longer understandable to the average Jane/Joe. When Cerny gave his GDC talk, he was talking in "chip design language". I understand that language (to a certain extent); you don't. The quote you chose to handpick above does NOT mean that Cerny tried to build and test chips at 1.6, 1.8, 2.0, 2.2 GHz (and all the bullshit rumours on random blogs that go along with them) before having his enlightenment. He is a very experienced chip designer (in the conceptual sense; AMD has to do the grunt work in the end), amongst other things. Are we really going into the "appeal to authority" minefield? I'm out, here and now.

You see, you are not laughing anymore. I told ya. And now you're serious.

But you're wrong as hell; not my problem.

If you want to think Digital Foundry is a random blog, then you can continue living in your fantasy; sadly, it discredits the background you mentioned.

And pardon me, but I must correct you: Mark Cerny doesn't design CPU/GPU chips, he is the lead system architect. He leads the building process using the components others make. His main profession and capacity is video game programmer, which doesn't have anything to do with CPU/GPU chip design. Also, he designed the PS Vita... yes, the Vita. So he isn't a holy grail as a designer. But I do respect him; his vision for the PS4 was mostly proven right.

You are in love with Mark Cerny to the point of extremism. And that directly affects the constructive criticism you could offer, if you really love Sony.

Last edited by alexxonne - on 03 April 2020


Neither console will be running at 100% load on the hardware most of the time. The loads on the hardware will fluctuate on both constantly and not simply between 95% and 100% either. The hardware at times can be running at much lower loads even for demanding games, and can run at lower loads more often for less demanding games.

If PS5 is running at a 50% (GPU) load, then it's basically putting out 5.1 TF at that specific point in time you could say.
If XBSX is running at 50% (GPU) load, then it's basically putting out 6.0 TF at that specific point in time you could say.

So who's being more clear? SNY with their fluctuating performance, or MS with their 'locked' performance?
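
Taking the two peak figures at face value, the arithmetic behind those "at 50% load" numbers is just peak FP32 throughput scaled by load, which is of course a simplification. A quick sketch:

```python
# Illustration only: treating GPU "load" as a straight multiplier on peak FP32
# throughput, which is the simplification the post above is making.
PEAK_TF = {"PS5": 10.28, "XBSX": 12.15}

def effective_tf(console: str, load: float) -> float:
    return PEAK_TF[console] * load

print(effective_tf("PS5", 0.5))   # 5.14  -> the "5.1 TF" figure above
print(effective_tf("XBSX", 0.5))  # 6.075 -> the "6.0 TF" figure above
```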



EricHiggin said:
Neither console will be running at 100% load on the hardware most of the time. The loads on the hardware will fluctuate on both constantly and not simply between 95% and 100% either. The hardware at times can be running at much lower loads even for demanding games, and can run at lower loads more often for less demanding games.

If PS5 is running at a 50% (GPU) load, then it's basically putting out 5.1 TF at that specific point in time you could say.
If XBSX is running at 50% (GPU) load, then it's basically putting out 6.0 TF at that specific point in time you could say.

So who's being more clear? SNY with their fluctuating performance, or MS with their 'locked' performance?

Nice way to put it. But I suppose it's neither of them. It's up to the developers to actually squeeze out that peak performance and show off each system's attributes. But that is something that will not happen right away. Basically, though, what you say and propose is all true.

Digital Foundry mentioned that the PS5's variable frequency is more similar to Switch power profiles, meaning developers may choose what kind of computing power they need. But the Xbox Series X, while traditional in a way, still has some unknowns, so it's hard to make a proper guess. We haven't heard of any exotic feature or secret sauce yet, just raw brute-force performance. If DirectX 12.1 and the new APIs are any indication, Microsoft's approach to secret sauce will be purely in software. Technically they could probably create commands or levels of performance to simplify development of multi-platform titles; the problem is actually making it work without rebooting the console each time a game requires a different profile.

Last edited by alexxonne - on 03 April 2020
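
Purely to illustrate the "Switch-style power profile" idea mentioned above: a handful of discrete CPU/GPU clock pairs a developer could pick from. This is a hypothetical sketch; the profile names and numbers are invented and do not come from any real SDK.

```python
# Hypothetical illustration of developer-selectable power profiles
# (invented values, not from any real SDK or official documentation).
from dataclasses import dataclass

@dataclass(frozen=True)
class PowerProfile:
    name: str
    cpu_ghz: float
    gpu_ghz: float

PROFILES = {
    "cpu_heavy": PowerProfile("cpu_heavy", 3.50, 2.10),  # e.g. simulation/AI-bound game
    "balanced":  PowerProfile("balanced",  3.40, 2.18),
    "gpu_heavy": PowerProfile("gpu_heavy", 3.20, 2.23),  # e.g. resolution/effects-bound game
}

print(PROFILES["gpu_heavy"])
```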

alexxonne said:
EricHiggin said:
Neither console will be running at 100% load on the hardware most of the time. The loads on the hardware will fluctuate on both constantly and not simply between 95% and 100% either. The hardware at times can be running at much lower loads even for demanding games, and can run at lower loads more often for less demanding games.

If PS5 is running at a 50% (GPU) load, then it's basically putting out 5.1 TF at that specific point in time you could say.
If XBSX is running at 50% (GPU) load, then it's basically putting out 6.0 TF at that specific point in time you could say.

So who's being more clear? SNY with their fluctuating performance, or MS with their 'locked' performance?

Nice way to put it. But I suppose it's neither of them. It's up to the developers to actually squeeze out that peak performance and show off each system's attributes. But that is something that will not happen right away. Basically, though, what you say and propose is all true.

Digital Foundry mentioned that the PS5's variable frequency is more similar to Switch power profiles, meaning developers may choose what kind of computing power they need. But the Xbox Series X, while traditional in a way, still has some unknowns, so it's hard to make a proper guess. We haven't heard of any exotic feature or secret sauce yet, just raw brute-force performance. If DirectX 12.1 and the new APIs are any indication, Microsoft's approach to secret sauce will be purely in software. Technically they could probably create commands or levels of performance to simplify development of multi-platform titles; the problem is actually making it work without rebooting the console each time a game requires a different profile.

DF's most recent video explains things better, but it's still a little unclear, as they don't even know for sure either just yet. It sounds as though they can allow the PS5 to shift power dynamically on the fly, which also means a potential reduction in clocks for the CPU or GPU depending on the load/thermals. It also sounds as though they can potentially lock things in from the start and make sure the GPU or CPU is always getting max power and max frequency, while the other receives less power and reduces frequency. Some games are more CPU or GPU bound in general, so if you know your game is going to be more GPU bound, you may very well choose to always push the GPU, since the CPU won't be pushed that hard anyway. It certainly sounds more efficient than having beefier hardware always waiting for the occasion when it's put to its fullest use.

As it was originally explained, the PS5 sounded weaker than it would be, and the XBSX sounded stronger than it would be. That doesn't change the worthy specs of each, or the advantages each certainly holds regardless. As for what the APIs will allow and what kind of extra effort devs will put into optimizing each, that is something we can only hypothesize about without more details or actual gameplay (results). Software is MS's specialty, though, like hardware is for SNY, so it'll be interesting to see how things play out for each.
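
The mechanism being described is easier to see as a fixed total power budget shared between CPU and GPU: whichever side is prioritised keeps its maximum clock, and the other side gets whatever power is left and drops its clock accordingly. The sketch below is hypothetical; the wattages are invented and deliberately exaggerate the clock drops to make the trade-off visible (Sony has said the real reductions should be small), and clock is assumed to scale linearly with allotted power purely for illustration.

```python
# Hypothetical sketch of a shared CPU/GPU power budget (invented wattages; clock is
# assumed to scale linearly with allotted power purely for illustration).
TOTAL_BUDGET_W = 200.0                   # assumed SoC budget, not an official figure
GPU_MAX_GHZ, GPU_MAX_W = 2.23, 160.0     # worst-case GPU draw at its maximum clock (invented)
CPU_MAX_GHZ, CPU_MAX_W = 3.50, 60.0      # worst-case CPU draw at its maximum clock (invented)

def clocks(gpu_priority: bool) -> tuple:
    """Return (gpu_ghz, cpu_ghz) when both sides want worst-case power at once."""
    if gpu_priority:
        gpu_w = GPU_MAX_W
        cpu_w = TOTAL_BUDGET_W - gpu_w   # 40 W left for the CPU
    else:
        cpu_w = CPU_MAX_W
        gpu_w = TOTAL_BUDGET_W - cpu_w   # 140 W left for the GPU
    gpu_ghz = GPU_MAX_GHZ * min(1.0, gpu_w / GPU_MAX_W)
    cpu_ghz = CPU_MAX_GHZ * min(1.0, cpu_w / CPU_MAX_W)
    return round(gpu_ghz, 2), round(cpu_ghz, 2)

print(clocks(gpu_priority=True))   # (2.23, 2.33): GPU holds its cap, CPU clock drops
print(clocks(gpu_priority=False))  # (1.95, 3.5): CPU holds its cap, GPU clock dips
```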



You guys make it way too complicated.
We know the maximum of both machines, which they won't be running at 99% of the time.



 "I think people should define the word crap" - Kirby007

Join the Prediction League http://www.vgchartz.com/predictions

Instead of seeking to convince others, we can be open to changing our own minds, and seek out information that contradicts our own steadfast point of view. Maybe it’ll turn out that those who disagree with you actually have a solid grasp of the facts. There’s a slight possibility that, after all, you’re the one who’s wrong.

alexxonne said:
drkohler said:

Wow, another week and you will downgrade the PS5 to 8 TFLOPs. For an absolute newbie, you are sticking your neck out very far.

You are in constant denial of what Cerny tried to explain in the GDC talk and in the talk with Leadbetter (who obviously can't fully talk about his One-on-One with Cerny). Hopefully for Cerny's peace of mind he doesn't read all the nonsense people like you spread in forums.

Let me speculate on something (I'm not saying whether this is true or not): one of the early dev units ran selected, fully enabled Radeon 5700s at 2 GHz. That makes it 10.3 TFLOPs. GitHub then transmogrified this info into a "standard" Radeon 5700 (which has 4 CUs disabled as a standard setting for yield reasons), and presto, the 9.2 TFLOPs rumour appeared. Again, I'm not saying this is true, because you seem to absolutely love your fantasy that the PS5 is at most 9.2 TFLOPs. I'm not even going to try to explain to you how unlikely it is for the XSX to continuously run at 12 TFLOPs, because that would make you extremely unhappy (and you wouldn't understand it anyway).

So let's see your next fantasy that the PS5 is only 8 TFLOPs; bring it on, we all need a good laugh....

Perhaps you're the newbie, because you're arguing with no facts and without knowing the manufacturing process of a CPU or GPU.

I'm not spreading anything that top magazines, tech experts and Cerny haven't said.

Don't you know that whenever a new CPU/GPU comes out, the die is unstable and a lot of testing must be done to correct instability issues, and with that the base clock frequency can go up or down? All CPUs/GPUs have to go through this manufacturing process. That is how we were able to see PS5 leaks with GPU clocks as low as 1 GHz, and then 1.8 GHz and 2 GHz respectively. Jesus Christ, READ!! Even Mark Cerny's first presentation mentioned it.

"...Running the GPU at 2ghz was looking like an unreachable target with the old frequency strategy with the new paradigm we're able to run way over that, in fact we have to cap GPU freq. at 2.23ghz..."

And with the Df article, again,

" The Gonzalo leak back in April suggested that PlayStation 5 would feature a Zen 2-based CPU cluster running at 3.2GHz paired with a Navi graphics core running at 1.8GHz.'

If we go by your speculation, you need to do a lot of reading. If you actually READ the article, it wasn't a 5700 card; the specifications in the leak were the same as the revealed PS5 specs, with the sole exception of the clocks, which Cerny confirmed they managed to increase (CPU/GPU).

All these leaks come from testing different frequencies and stability targets before mass production. Even when chips are already made, some companies lower the base clock because they're faulty or not stable enough. That is how CPU spec codes come about. Sometimes improvements are made: a better clock, new instructions, etc. MAN, YOU NEED TO READ A LOT.

Hence my educated guess, based on Cerny saying that 2 GHz was looking unreachable at the time: a GPU with a lower clock (<2 GHz) would have resulted in a slower-performing card with less than 9.2 TF at some point.

So my point makes all the sense in the world if every tech magazine is talking about it, especially the top one. And those are FACTS, deal with them.

If you wanna marry a PS5 console, you have my blessing, but my love for the brand doesn't go that far.

So, I doubt you're laughing now. Next one...

For someone claiming others have reading comprehension problems, you seem to be the one who has them.

He said that with the previous paradigm of GPU development, 2 GHz was unreachable (not that the PS5 couldn't reach it, but that GPUs didn't get that far; just look at the Xbox for an example, or the current gen, or all other GPUs). But they were able to work around that paradigm and get up to 2.23 GHz (it could go further, but then it would be unstable). So you also got your answer about it being able to run at 2.23 GHz most of the time: they can't say whether that is 50, 80, or 99% of the time, because that will vary game by game, depending on the developer, not the hardware.



duduspace11 "Well, since we are estimating costs, Pokemon Red/Blue did cost Nintendo about $50m to make back in 1996"

http://gamrconnect.vgchartz.com/post.php?id=8808363

Mr Puggsly: "Hehe, I said good profit. You said big profit. Frankly, not losing money is what I meant by good. Don't get hung up on semantics"

http://gamrconnect.vgchartz.com/post.php?id=9008994

Azzanation: "PS5 wouldn't sold out at launch without scalpers."