Machiavellian said: LOL, so let me get this straight. Richard pressed Mark on a rumor, Mark cleared it up, and that is bias? So what is the difference with the interview Richard had with the MS engineers, when he pressed them about the ROPs?

"Having attempted to comprehensively address questions about the ESRAM and system memory bandwidth of the architecture, the issue of the Xbox One's fill-rate and ROPs deficit compared to PlayStation 4 is now under the microscope. ROPs are the elements of the GPU that physically write the final image from pixel, vector and texel information: PlayStation 4's 32 ROPs are generally acknowledged as overkill for a 1080p resolution (the underlying architecture from AMD was never designed exclusively for full HD but for other resolutions such as 2560x1440/2560x1600 too), while Xbox One's 16 ROPs could theoretically be overwhelmed by developers."

"In our interview, Microsoft revealed research it had carried out that suggested that the 6.6 per cent increase to GPU clock speed was more beneficial to the system than two additional AMD Radeon Graphics Core Next compute units. Our question was straightforward enough - were the results of these tests skewed by the code saturating the ROPs?"

Let's talk about the conclusion he draws about the ROPs:

"Our take on the ROPs situation is that while these figures make perfect sense, there are many other scenarios that could be potentially challenging - depth-only passes, shadows, alpha test and Z pre-pass for example. But from a user perspective, the fact is that native 1080p isn't supported on key first-party titles like Ryse and Killer Instinct. Assuming this isn't a pixel fill-rate issue as Microsoft suggests, surely at the very least, this impacts the balanced system argument?"

That does not sound like he is in MS's pocket or drinking the MS Kool-Aid. Instead, what I find is that when people see something negative about the system of their choice, they blind themselves to everything else.
Wait... what?
You were asked for a source showing that the PS4 is not using the GPU for the OS, and you reply with that? Now I understand... you don't make sense at all... logic is not your strong point.
What does GPU usage in the OS have to do with Leadbetter's bias? Explain that to me, please.
PS: You are confusing the subjects again... please don't try to discuss two things at the same time lol
