torok said:
Mind that these are generally small differences, like PS4 at a stable 30 fps and X1 dropping to 26 or so. And in a lot of these cases, companies are pushing for similar or equal resolution, which makes the X1 struggle more. 900p vs 1080p has been a sweet spot for multiplats because the difference in pixel count roughly matches the power difference between the GPUs. Now 30 vs 60 is a way, way bigger problem: you have to halve frame render time. It's not like we're seeing games running at 60 vs 30 or 30 vs 15 when comparing PS4 and X1.

As they are pushing for parity, it's reasonable to have a CPU that's only a bit better. Upgrading the GPU will allow higher resolutions and better effects without making the regular model look far behind. Remember, they won't stop selling the regular version; you can count on a price cut or even a slim model. Last gen, they suffered with extremely outdated consoles in the final years. It's impossible to make a console cheap and powerful at the same time, so now they'll have an affordable model that runs all the games, and a premium one to make power users happy.
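For what it's worth, the 900p/1080p point roughly checks out on paper. Using the commonly quoted raw GPU throughput figures (my numbers, not torok's), the pixel-count gap and the power gap are about the same size:

$$
\frac{1920 \times 1080}{1600 \times 900} = \frac{2{,}073{,}600}{1{,}440{,}000} \approx 1.44
\qquad \text{vs} \qquad
\frac{1.84\ \text{TFLOPS (PS4)}}{1.31\ \text{TFLOPS (X1)}} \approx 1.40
$$

Whereas going from 30 to 60 fps means cutting frame time from about 33.3 ms to about 16.7 ms, which a resolution tweak alone doesn't buy you.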
I thought the whole point of GPGPU compute was to let the GPU offload some labor-intensive work from the CPU. I did research on this quite a few months back, and to my understanding GPGPU compute is not generally practiced in game development but rather in things like video editing and so on. If that's the case, and since we know the original PS4's architecture is heavily unified, wouldn't it stand to reason that the GPU could help with framerates in games that used such a feature? (Most likely it would only show up in exclusives, but the first-party titles Sony has put out have very consistent framerates, even Bloodborne, which comes from a series known for its performance issues.)
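To make that idea concrete, here's a minimal sketch of what "GPGPU compute" means in practice: a trivially parallel per-frame job (a made-up particle update) moved off the CPU and onto the GPU. CUDA is used purely as an illustration because it fits in a few lines; the PS4's AMD GPU is driven through Sony's own APIs, and every name and number below is invented for this post.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Hypothetical per-frame job: integrate particle positions one timestep.
// Each GPU thread handles one element, so the CPU spends no frame time on it.
__global__ void updateParticles(float* pos, const float* vel, float dt, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        pos[i] += vel[i] * dt;
    }
}

int main() {
    const int n = 1 << 20;  // ~1M particles, an arbitrary illustrative count
    float *pos = nullptr, *vel = nullptr;

    // Managed (unified) memory: CPU and GPU see the same allocation, loosely
    // analogous to the single shared memory pool mentioned above.
    cudaMallocManaged((void**)&pos, n * sizeof(float));
    cudaMallocManaged((void**)&vel, n * sizeof(float));
    for (int i = 0; i < n; ++i) { pos[i] = 0.0f; vel[i] = 1.0f; }

    // One 60 fps timestep (1/60 s), launched across enough blocks to cover n.
    updateParticles<<<(n + 255) / 256, 256>>>(pos, vel, 1.0f / 60.0f, n);
    cudaDeviceSynchronize();

    printf("pos[0] = %f\n", pos[0]);  // expect roughly 0.0167 after one step

    cudaFree(pos);
    cudaFree(vel);
    return 0;
}
```

The point is just that work like this scales with GPU throughput instead of eating into the CPU's frame budget, and a unified memory pool lets the CPU and GPU share the results without an extra copy, which is presumably part of why it's attractive for the kind of framerate consistency you're describing.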