disolitude said:

You're probably talking about the low-level access to the CPU/GPU that consoles have. Yeah, consoles can utilize hardware better than PCs; this is no secret... But these advances in game development usually make it to PC as well and push games forward on both platforms.

PCs also have the ability to customize game performance. If you value high-res textures but don't mind 30 fps, there is a setting for that; lower-res textures/resolution but 60 fps, etc. On consoles you are stuck with what the developers think is best.


Yes, I'm talking about low-level access, but another important factor is knowing exactly which GPU and CPU are in use. That can allow a massive jump in performance. You can't optimize code for specific hardware with a generic approach, so you pay an overhead.
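A minimal sketch of that overhead, assuming GCC/Clang on x86 (the printed path names are just illustrative): a PC engine has to detect CPU features at runtime and dispatch to the safest path the machine supports, while a console title can be compiled directly for the one known chip and skip the check entirely.

[code]
#include <stdio.h>

int main(void) {
    /* On PC: probe the CPU at runtime and pick a code path.
       On a console the exact chip is known, so the single best
       path is compiled in and this dispatch cost disappears. */
    __builtin_cpu_init();
    if (__builtin_cpu_supports("avx"))
        puts("using AVX code path");
    else if (__builtin_cpu_supports("sse4.2"))
        puts("using SSE4.2 code path");
    else
        puts("using scalar fallback");
    return 0;
}
[/code]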

An example on the CPU side: suppose you know the exact model in use. How tasks are distributed between processor cores can impact performance considerably because of how the cache memory is laid out. Say you have an 8-core processor (like AMD's) with one big L3 cache and four L2 caches, each shared by 2 cores, and you have to run a task across 2 cores. Knowing the cache layout of that specific architecture, you would run the parallel task on two cores that share the same L2 cache, so both threads can share the same data in cache instead of both going out to RAM to find it. On the GPU side there are just as many things to watch.
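A hedged sketch of that cache-aware scheduling on Linux, using glibc's non-portable affinity API. It assumes cores 0 and 1 share an L2 (the layout of AMD's Bulldozer-style modules); on a PC you would have to discover the topology at runtime (e.g. from /sys/devices/system/cpu/cpu0/cache/), while on a console it is fixed and known at ship time:

[code]
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

/* Illustrative worker; imagine both threads chewing on the
   same shared data set, which stays hot in the shared L2. */
static void *worker(void *arg) {
    (void)arg;
    return NULL;
}

int main(void) {
    pthread_t t0, t1;
    pthread_attr_t a0, a1;
    cpu_set_t set;

    /* Pin thread 0 to core 0. */
    pthread_attr_init(&a0);
    CPU_ZERO(&set);
    CPU_SET(0, &set);
    pthread_attr_setaffinity_np(&a0, sizeof(set), &set);

    /* Pin thread 1 to core 1, which shares an L2 with core 0
       (an assumption here; fixed and known on a console). */
    pthread_attr_init(&a1);
    CPU_ZERO(&set);
    CPU_SET(1, &set);
    pthread_attr_setaffinity_np(&a1, sizeof(set), &set);

    pthread_create(&t0, &a0, worker, NULL);
    pthread_create(&t1, &a1, worker, NULL);
    pthread_join(t0, NULL);
    pthread_join(t1, NULL);
    return 0;
}
[/code]

If the two threads were instead scheduled onto cores in different modules, every shared cache line would bounce through L3 or RAM, which is exactly the penalty a console developer can design around and a generic PC engine usually can't.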

This is something consoles offer that can't be reproduced on a PC. The alternative is brute force, which is what PCs rely on these days. Of course, the new gen will improve things on the PC side too. PC games are not in a state I like right now; I'm seeing a lot of badly optimized games (Hitman: Absolution is a good example of that). With new consoles and better graphics, devs will have to rely less on brute force and more on the features of newer PC GPUs (leaving some legacy GPUs out of the games) to improve the results.

Edit: I've seen you and ethomaz talking about Intel vs. AMD in the next-gen consoles. Actually, the choice wasn't just about price. I've read an article about it (link below) showing that the choice was motivated by the fact that Sony and MS wanted an APU/SoC design.

They first looked at the possible architectures (ARM, x86, and MIPS; POWER was discarded because the architecture was evolving too slowly). MIPS was discarded because neither company wanted to create an APU from scratch and MIPS wasn't considered developer-friendly. ARM would mean NVIDIA and x86 would mean AMD (because Intel doesn't have APUs with the necessary GPU power). Tests revealed that even with the latest improvements ARM couldn't match x86 (they noted that this could change soon, but only after the launch of the next gen). So AMD was the choice because it was the only one that could offer that product.

http://www.forbes.com/sites/patrickmoorhead/2013/06/26/the-real-reasons-microsoft-and-sony-chose-amd-for-consoles/