starcraft said:
I am familiar (in layman's terms, anyway) with the technical disparity. But what we've seen is that the real-world difference is minimal. The PS3 was more powerful than the Xbox 360 (arguably a greater disparity), and ultimately its best exclusives still looked only marginally better than Xbox 360's best exclusives.
Totally different story. The PS3 actually had the weaker GPU overall, as well as less free RAM for games.
The CPU was a factor. CELL was meant for graphics tasks from the beginning, but in the end only a few first-party games used it properly for shiny graphics. For multiplats that probably would have needed far too much extra work, and even on the Sony side basically only Guerrilla and Naughty Dog managed to get it right.
Now we have a 43% difference in shader power (which is what really matters like 95% of the time) and faster RAM (probably overestimated) on the PS4 side, versus a slightly faster CPU and, with Kinect unbundled, slightly less reserved CPU time on the Xbox One side. So on the hardware side the two consoles might not be twins, but they are at least brothers.
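As an aside, the exact percentage depends on which clock speeds you assume. A quick back-of-the-envelope sketch using the commonly reported launch specs (1152 vs. 768 shader cores, 800 MHz vs. 853 MHz; figures from public spec sheets, not from this post) lands in the same ballpark:

```python
# Rough shader-throughput comparison using commonly reported launch specs.
# Peak FLOPS = shader cores * clock * 2 (one fused multiply-add per cycle).

def peak_tflops(cores, clock_ghz):
    """Peak single-precision throughput in TFLOPS."""
    return cores * clock_ghz * 2 / 1000.0

ps4 = peak_tflops(1152, 0.800)  # reported PS4 GPU: 1152 cores @ 800 MHz
xb1 = peak_tflops(768, 0.853)   # reported Xbox One GPU: 768 cores @ 853 MHz

advantage = (ps4 / xb1 - 1) * 100
print(f"PS4: {ps4:.2f} TFLOPS, Xbox One: {xb1:.2f} TFLOPS")
print(f"PS4 shader advantage: ~{advantage:.0f}%")
```

With the shipped 853 MHz Xbox One clock this comes out to roughly 40%; against the original 800 MHz clock it would be 50%. Either way, raw FLOPS is only an upper bound, not what games actually achieve.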
The software side will be interesting for the rest of the gen. DirectX 12 will have far less overhead, especially for GPGPU. PSSL and GNM/GNMX seem to be better on some of those points right now.
Not to mention drivers. Yes, they can make some difference too.
So watching the software side improve will be the interesting part from now on. The thing is, Sony can basically optimize its software just as well as Microsoft can.
As for real differences: 900p vs. 1080p, some more grass here, some more frames there. That's basically it. Both are far too close together for there to be any meaningful real-world difference.
Just for the uninformed: hUMA is part of HSA, a feature every GCN GPU has and one that DirectX 12 can utilise.
So there will never be 100% parity. But there will never be Xbox/PS2-scale differences either.
If that tiny difference bothers you, go buy a good rig with a GTX 980, an R9 290X, or the upcoming 390X. Now that is a difference.