chakkra said:
DonFerrari said:

"Digital Foundry didn't disagree on Mark Cerny."

They didn't outright agree with him either. They even said, "We will have to wait to see how this translates into real-world scenarios."
Besides, even if they had agreed with him, we have already seen these scenarios MULTIPLE times, in every single graphics card generation before this one. Did you take the time to look at the chart above?

And there was one particular comment they made that people either missed or chose to ignore: "It's a fascinating idea - and entirely at odds with Microsoft's design decisions for Xbox Series X - and what this likely means is that developers will need to be mindful of potential power consumption spikes that could impact clocks and lower performance."
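To make that point concrete, here's a toy sketch of why a fixed power budget pushes the sustained clock down when a workload spikes. This is NOT Sony's actual boost logic, just an illustration under the common rule of thumb that dynamic power scales roughly with the cube of frequency (voltage tracking frequency); the budget and activity numbers are made up:

```python
# Toy model (not Sony's actual algorithm): with a fixed power budget,
# a spike in workload "activity" forces the sustainable clock down.
# Assumes dynamic power ~ activity * f^3 (voltage scaling ~linearly with f).

F_MAX_GHZ = 2.23      # PS5's advertised GPU clock cap
POWER_BUDGET = 1.0    # normalised: power available at f_max with a "typical" workload

def sustained_clock(activity: float) -> float:
    """Clock (GHz) the toy power budget allows; activity = 1.0 is 'typical'."""
    if activity <= 0:
        return F_MAX_GHZ
    # Solve activity * (f / F_MAX_GHZ)^3 <= POWER_BUDGET for f, capped at f_max.
    return F_MAX_GHZ * min(1.0, (POWER_BUDGET / activity) ** (1 / 3))

for a in (0.8, 1.0, 1.2, 1.5):
    print(f"activity {a:.1f} -> ~{sustained_clock(a):.2f} GHz")
```

Light workloads stay pinned at the cap; a heavy spike (activity 1.5 in this made-up scale) drops the clock to roughly 1.95 GHz in the toy model. That's the "be mindful of power spikes" point DF was making.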

"Also what reason would Sony have to choose less CUs with higher frequency (and much higher than what could be expected, and one that makes dies harder to make and cooling also harder to achieve)? Just for the giggles?"

Errr... because at the moment they were designing the console they did not know how many CUs and what clock speed MS was going to use? You know they don't meet in the same room to design these things, right?

They did know that Microsoft was going to have more than 36. The guys at Sony aren't dumb. They can get a pretty good idea of what the other side can achieve under a certain budget and what's possible at the time, through discussions with engineers and so on, and have a few scenarios to work with. Cerny said they could have gone for "more" compute units, but chose the better SSD and I/O chip instead, based on their feedback from developers.
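And for context, the CU-versus-clock trade-off everyone keeps arguing about falls out of a very simple formula. A quick sketch using the publicly stated specs (36 CUs at up to 2.23 GHz for PS5, 52 CUs at 1.825 GHz for Series X) and the standard RDNA 2 assumptions of 64 shader ALUs per CU doing 2 FP32 ops per clock:

```python
# Back-of-the-envelope theoretical peak FP32 throughput from public specs.
# 64 ALUs per CU and 2 ops per clock are the usual RDNA 2 assumptions.

def tflops(cus: int, clock_ghz: float, alus_per_cu: int = 64, ops_per_clock: int = 2) -> float:
    """Theoretical peak FP32 TFLOPS = CUs * ALUs * ops/clock * clock."""
    return cus * alus_per_cu * ops_per_clock * clock_ghz / 1000

print(f"PS5:      {tflops(36, 2.23):.2f} TFLOPS")   # ~10.28
print(f"Series X: {tflops(52, 1.825):.2f} TFLOPS")  # ~12.15
```

Fewer CUs at a higher clock gets Sony most of the way there on paper, which is exactly why the rest of the budget could go into the SSD and I/O silicon.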