| CrazyGPU said: Performance graphs are great, and they show that theoretically a PS4 GPU is in line with a Radeon HD 7850 or near an HD 7870. It's weird that even with that amount of power, it's not able to run games like Battlefield 4 at 1080p 30 fps. Same with Watch Dogs. Those games run at 900p and scale. Scaling makes textures more blurry. Now, could it be that the AMD Jaguar multicore CPU is holding back the performance of the GPU? I was thinking that with optimization, the "PS4 HD 7850 GPU equivalent" would get better graphics than the PC counterpart, but if the CPU is holding back performance, then that might be why the PS4 can't achieve 1080p in most games like the PC card does, and the XBONE is stuck near 720p because of its weaker GPU. Cerny says that GPGPU on the PS4 can make up for CPU weakness, but if the graphics card is doing compute, would it be able to cope with the same graphics quality? Nvidia cards already take a hit when they run PhysX. Also, this link from this forum says that memory can become a bottleneck too. Any PS4 dev here, or someone with deep knowledge, willing to comment on this?
|
I have no PS4 dev experience, but what I can say for sure is that any talk about what percentage of the CPU is being used is complete and utter bullsh*t.
Any dev, any team, and any game can trivially use 100% of the CPU at any time. The work and talent lie in using the CPU efficiently to achieve good results, and that can't be expressed as a percentage (a percentage of what?).
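To make that concrete, here's a trivial sketch (my own illustration, nothing to do with any console SDK): it pins every hardware thread at roughly 100% while accomplishing exactly nothing useful, which is why a utilization number on its own tells you nothing about how well a game uses the CPU.

```cpp
#include <atomic>
#include <chrono>
#include <thread>
#include <vector>

int main() {
    std::atomic<bool> stop{false};
    std::vector<std::thread> workers;

    // One busy-wait thread per hardware thread: "100% CPU usage", zero useful work.
    for (unsigned i = 0; i < std::thread::hardware_concurrency(); ++i) {
        workers.emplace_back([&stop] {
            volatile unsigned long long counter = 0;
            while (!stop.load(std::memory_order_relaxed)) {
                ++counter;  // burn cycles
            }
        });
    }

    std::this_thread::sleep_for(std::chrono::seconds(5));  // any monitor now reports ~100%
    stop = true;
    for (auto& t : workers) t.join();
}
```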
The discussion about the raw power of the GPU doesn't really work either. Comparing similar hardware from the same generation is fine, but across generations it just doesn't hold up, because "power" depends heavily on what the hardware can and can't do: whether it has programmable shaders, which generation they are, how many pipelines. Even Sony talks BS about it, like the PS4 being 54 times more powerful than the PS2; you can't count like that.
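For what a "raw power" comparison actually means within one generation, here's a rough back-of-the-envelope sketch (mine, not from the post): theoretical throughput quoted as shader cores x clock x 2 (one fused multiply-add per core per cycle), using the commonly cited specs for these GCN parts. A number like this is only meaningful between chips of the same architecture; it says nothing when one side is a fixed-function PS2-era design.

```cpp
#include <cstdio>

struct Gpu {
    const char* name;
    int shaders;       // stream processors / shader ALUs
    double clock_ghz;  // core clock in GHz
};

int main() {
    // Commonly cited specs for these GCN-generation parts.
    const Gpu gpus[] = {
        {"PS4 (GCN, 18 CUs)",     1152, 0.800},
        {"HD 7850 (GCN, 16 CUs)", 1024, 0.860},
        {"HD 7870 (GCN, 20 CUs)", 1280, 1.000},
    };
    for (const Gpu& g : gpus) {
        // FMA counts as 2 FLOPs per cycle per shader core.
        double tflops = g.shaders * g.clock_ghz * 2.0 / 1000.0;
        std::printf("%-24s ~%.2f TFLOPS (theoretical)\n", g.name, tflops);
    }
}
```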
