disolitude said:
The thing is that poorly optimized PC games are 99% of the time console ports. They also usually get patched... Hell, Sonic Generations ran like shit when it first came out for no apparent reason, but now it's a smooth 60 fps non-stop. PC-first devs tend to have well-optimized games on PC, which then transfer to consoles very well too. Look at The Witcher 2: stellar looking on PC, pretty good looking on X360 too. It takes a significant amount of extra power to show even the slightest increase in visual fidelity. That's why it's easy to downscale a game to consoles.

I'm glad you brought up multi-core allocation... I think the aspect that will improve PC gaming once next-gen console development starts is core usage and allocation. Console ports like Skyrim tend to use 2-3 cores (and run like crap on AMD hardware if you max them out). Now that we have 8-core AMD CPUs in consoles, we should see proper multi-core utilization, which PCs will benefit from as well.


Yes, Skyrim is a good example of something poorly optimized. I have a Phenom X4 955 + GTX 650 Ti, so it runs at max settings. But I have seen people with slightly weaker GPUs having to cut things. Sleeping Dogs achieves a much better graphical result without using many more resources (it's actually a beautiful game at close to max). With next-gen hardware we will have better console games and better (and more optimized) PC games. It's a win-win situation.

But AMD's dominance in consoles left me thinking. Could we see a situation where PC games (specifically console ports) run better on AMD GPUs (or even CPUs)? That could be bad news for Nvidia.