torok said:
disolitude said:

You're probably talking about the low-level access to the CPU/GPU that consoles have. Yeah, consoles can utilize hardware better than PCs, this is no secret... But these advances in game development usually make it to PC as well and push games on both platforms.

PCs also have the ability to customize game performance. If you value high res textures but don't mind 30 fps, there's a setting for that. Lower res textures/resolution but 60 fps, etc. On consoles you're stuck with what the developers think is best.


Yes, I'm talking about low-level access, but another important factor is knowing exactly which GPU and CPU are in use. That can allow a massive jump in performance. You can't optimize code for a generic approach, so you pay an overhead.

An example on the CPU side. Suppose you know exactly which model is used. How tasks are distributed between processor cores can impact performance considerably because of how the cache memory is laid out. Say you have an 8-core processor (like AMD's) with one big L3 cache and four L2 caches, each shared by 2 cores, and you have to run a task across 2 cores. Knowing the cache layout of that specific architecture, you would run the parallel task on two cores that share the same cache, so they can share the same data without both having to go out to RAM instead of finding it right in the cache (rough sketch below). If we look at the GPU side, there are a lot of things to watch there too.
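To make that concrete, here is a minimal Linux-only sketch of the idea: pinning two cooperating threads onto a core pair so they stay near the same cache. It assumes cores 0 and 1 are one such L2-sharing pair, which is not guaranteed; on a real system you'd read the topology from /sys/devices/system/cpu/cpu*/cache or a library like hwloc. The names (shared_data, worker) are made up for illustration.

```cpp
// Sketch: pin two threads that touch the same data onto cores 0 and 1,
// which are ASSUMED here to share an L2 cache (verify on the real machine).
// Build with: g++ -pthread affinity_sketch.cpp
#include <pthread.h>
#include <sched.h>
#include <thread>
#include <vector>
#include <cstdio>

// Hypothetical working set that both threads read.
static std::vector<float> shared_data(1 << 16, 1.0f);

static void pin_current_thread(int core)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    // Restrict the calling thread to a single core so it stays by "its" cache.
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

static void worker(int core, float* out)
{
    pin_current_thread(core);
    float sum = 0.0f;
    for (float v : shared_data)   // both workers walk the same data, so a
        sum += v;                 // shared L2 saves trips to RAM
    *out = sum;
}

int main()
{
    float a = 0.0f, b = 0.0f;
    std::thread t0(worker, 0, &a);   // cores 0 and 1: assumed L2 partners
    std::thread t1(worker, 1, &b);
    t0.join();
    t1.join();
    std::printf("%f %f\n", a, b);
}
```

On a console the developer knows this pairing up front and can hard-code it; on PC the same trick requires querying the topology at runtime for every possible CPU, which is exactly the generic-approach overhead mentioned above.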

This is something consoles get that can't be fully reproduced on a PC. The workaround is brute force, which is what PCs are doing these days. Of course, the new gen will improve things on the PC side too. PC games are not in a state I like right now; I'm seeing a lot of badly optimized games (Hitman: Absolution is a good example). With new consoles and better graphics, devs will have to rely less on brute force and more on the features of newer PC GPUs (leaving some legacy GPUs out of the games) to improve the results.

The thing is that the poorly optimized PC games are 99% of the time console ports. They also usually get patched... Hell, Sonic Generations ran like shit when it first came out for no apparent reason, but now it's a smooth 60 fps nonstop. PC-first devs tend to have well-optimized games on PC, which then transfer to consoles very well too. Look at The Witcher 2: stellar looking on PC, pretty good looking on X360 too. It takes a significant amount of extra power to show the slightest increase in visual fidelity, which is why it's easy to downscale a game to consoles.

I'm glad you brought up multi-core allocation... I think the aspect of PC gaming that will improve once next-gen console development gets going is core usage and allocation. Console ports like Skyrim tend to use 2-3 cores (and run like crap on AMD hardware if you max them out). Now that we have 8-core AMD CPUs in consoles, we should see proper multicore utilization, which PCs will benefit from as well (rough sketch below).
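For what "proper multicore utilization" could look like, here's a rough illustration (not any engine's actual job system): instead of hard-coding two or three worker threads, the game asks how many cores the machine has and splits the per-frame work across all of them. simulate_chunk and num_entities are placeholders.

```cpp
// Sketch: spread per-frame work across every available core instead of
// assuming a fixed 2-3 threads.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <thread>
#include <vector>

static void simulate_chunk(std::size_t begin, std::size_t end)
{
    // Placeholder for per-entity game work (AI, physics, animation, ...).
    for (std::size_t i = begin; i < end; ++i) { /* ... */ }
}

int main()
{
    const std::size_t num_entities = 100000;
    // hardware_concurrency() may return 0 on some platforms, so clamp to 1.
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> pool;
    const std::size_t chunk = (num_entities + cores - 1) / cores;
    for (unsigned c = 0; c < cores; ++c) {
        const std::size_t begin = c * chunk;
        const std::size_t end   = std::min(num_entities, begin + chunk);
        if (begin < end)
            pool.emplace_back(simulate_chunk, begin, end);
    }
    for (auto& t : pool)
        t.join();

    std::printf("ran on %u worker threads\n", cores);
}
```

A game written this way scales from a 4-core PC to the 8-core console CPUs without caring which one it's on, which is the kind of port behavior I'd hope to see next gen.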