DF talks about the PS4's advantage over the Xbox One! More to come soon.
"Factoring out other differences in the system, Sony's rendering tech has 50 per cent more raw computational power than the Xbox One equivalent. The question is, what is the impact in actual gameplay conditions?"
"The Xbox One and PS4 graphics chips have much in common with two existing PC designs from AMD - codenamed Bonaire and Pitcairn."
"The results pretty much confirm the theory that more compute cores doesn't result in a linear scaling of performance. Our PS4 target obviously has an advantage though - 24 per cent on average."
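The gap between the theoretical 50 per cent and the measured 24 per cent is easy to sanity-check. A minimal sketch of the raw numbers, assuming GCN's 64 shader ALUs per compute unit, two FLOPs per ALU per clock (fused multiply-add), and the widely reported pre-launch 800MHz GPU clock on both consoles (these clocks were not officially confirmed at the time):

```python
def gflops(compute_units, clock_mhz, alus_per_cu=64, flops_per_clock=2):
    """Peak single-precision throughput in GFLOPS for a GCN-style GPU."""
    return compute_units * alus_per_cu * flops_per_clock * clock_mhz / 1000

ps4 = gflops(18, 800)  # 18 CUs -> 1843.2 GFLOPS
xb1 = gflops(12, 800)  # 12 CUs -> 1228.8 GFLOPS

print(f"PS4 peak: {ps4:.1f} GFLOPS")
print(f"XB1 peak: {xb1:.1f} GFLOPS")
print(f"Raw compute advantage: {ps4 / xb1 - 1:.0%}")  # the on-paper 50 per cent
```

The 50 per cent figure is the peak-throughput ratio; the 24 per cent seen in testing reflects the fact that real frames are bound by bandwidth, fill-rate and CPU as well as ALU throughput.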
"Crysis 3 is the closest thing we have to a next-gen game and both our target platforms are capable of running it at 1080p on the high quality preset, with very high quality textures in place."
"A comparison of the Radeon HD 7790 with the HD 7850 is fascinating. Here we have two cards with the same compute power, but very different levels of bandwidth available - with profound results on frame-rates."
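The bandwidth gap between those two cards follows directly from bus width and memory data rate. A rough sketch using the public specs (the console figures are the commonly cited pre-launch numbers, and the Xbox One figure covers DDR3 only, before any ESRAM contribution):

```python
def bandwidth_gbs(bus_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s: bus width in bytes x effective data rate."""
    return bus_bits / 8 * data_rate_gbps

hd7790 = bandwidth_gbs(128, 6.0)    # 96.0 GB/s
hd7850 = bandwidth_gbs(256, 4.8)    # 153.6 GB/s
ps4    = bandwidth_gbs(256, 5.5)    # 176.0 GB/s (GDDR5)
xb1    = bandwidth_gbs(256, 2.133)  # ~68.3 GB/s (DDR3, excluding ESRAM)
```

Same compute power, roughly 60 per cent more bandwidth on the HD 7850 - which is why the frame-rate results diverge so sharply.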
Conclusion: the PS4 advantage and the Xbox One challenge
In summary, the PS4 enjoys two key strengths over the Xbox One in terms of its rendering prowess: raw GPU power and masses of bandwidth. On paper those advantages look decisive, but it seems clear that one of them - the 50 per cent increase in compute power - doesn't result in the stratospheric boost to performance you might imagine. Clearly the PS4 is more powerful, but the evidence suggests that quality tweaks and/or resolution changes could produce level frame-rates on both platforms running the same games. Bandwidth remains the big issue - the PS4's 256-bit bus is established technology, and easy to utilise. Xbox One's ESRAM is the big unknown, specifically in terms of how fast it actually is and how quickly developers will be able to make the most of it. In our benchmarks and game testing we gave the Xbox One target the benefit of the doubt by equalising bandwidth levels, but in reality this is in no way guaranteed.
Clearly, the next-gen battle is shaping up to be a fascinating contest. What we're looking at are two consoles designed from the same building blocks but with two entirely different approaches in mind. According to inside sources at Microsoft, the focus with Xbox One was to extract as much performance as possible from the graphics chip's ALUs. It may well be that 12 compute units was chosen as the most balanced set-up for the Jaguar CPU architecture. Our source says that the make-up of the Xbox One's bespoke audio and "data move engine" tech is derived from profiling the most advanced Xbox 360 games, with the designs implemented to address the most common bottlenecks. By contrast, despite its undoubted advantages - especially in terms of raw power - PlayStation 4 looks a little unbalanced. And perhaps that's why the Sony team led by Mark Cerny set about redesigning and beefing up the GPU compute pipeline - they would have seen the unused ALU resources and realised there was an opportunity for developers to do something different with the graphics hardware.
Cerny himself admits that utilisation of GPU compute isn't likely to come into its own until year three or four of the PS4's lifecycle. One development source working directly with the hardware told us that "GPU compute is the new SPU" - a reference both to the difficulty coders had in accessing the power of the PS3's Cell processor and to the potential of the hardware. There's a sense that this is uncharted territory, an aspect of the graphics tech that's going to give the system a long tail in terms of extracting its full potential. But equally, this isn't going to happen overnight, and almost certainly not in the launch period. That being the case, as unlikely as it may sound given the computational deficit in its graphics hardware, Xbox One multi-platform games theoretically have a pretty good shout at getting close to their PS4 equivalents, with only minor compromises. Further into the lifecycle, the question becomes whether PS4 GPU compute can become a significant factor in development when the additional effort would yield little for the Xbox One version of the same game.
To conclude, in terms of graphics tech at least, there's little doubt that the PlayStation 4 is the more capable of the two next-gen consoles. However, in the short term, provided Microsoft delivers the promised performance improvements to its graphics libraries and the ESRAM proves easy to utilise, there's every reason to believe that the stark on-paper compute deficit may not be as pronounced in actual gameplay as the spec suggests. Gamescom should be a fascinating experience, and a chance to judge progress after an E3 where games on both consoles felt somewhat unoptimised.
Away from the core comparisons, what we found quite heartening from this experience is that the target graphics hardware we created proved reasonably adept at handling some of the most demanding PC gaming benchmarks available, not to mention providing a highly playable Crysis 3 experience at the high preset with the best quality textures available. Bearing in mind that the absolute top-end settings challenge even flagship hardware like the GTX Titan, dropping down a single notch on the quality scale and still getting decent frame-rates at 1080p resolution isn't to be sniffed at.
Crytek's technological masterpiece aims to scale up to the very highest resolutions via its advanced effects work and ultra high-res art. At the 1080p next-gen console standard, that level of texture quality looks simply phenomenal and the overall experience is completely transformed from the compromised current-gen editions. Assuming the CPU power is there, the graphics hardware of both consoles should be able to sustain a 1080p30 performance level on this demanding game with just minor fluctuations, and that's before we factor in the benefits of closed platform APIs and console-specific optimisation. And as a starting point for performance in the next-gen era, that's not too bad at all.