fatslob-:O said:
Landguy said:

Aren't they both basically using x86?

That's incorrect ... 

The Xbox 360's CPU is based on the Power architecture family. 

The funniest thing here is how quick people are to point to both consoles' CPUs being x86 as an excuse not to expect much improvement over time, when they should really be focusing on both consoles' GPUs and their ISAs to see the difference. After all, why pay attention to a component that does less than 10% of a game's workload?

The GPUs in both consoles offer so many more features compared to last generation that it's not even funny. You get programmable vertex pulling, which takes GPU-driven rendering to the next level and in turn dramatically reduces the CPU's involvement in rendering; more fixed-function operations being handled in shaders, which gives developers more freedom in how they program shaders; tessellation units; cache coherence for the L2 cache; and they're even capable of doing a true function call too! 
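To illustrate the vertex-pulling idea mentioned above: instead of fixed-function hardware fetching attributes and handing them to the shader, the shader itself computes an offset from the vertex ID and reads the attributes straight out of a raw buffer. This is a minimal CPU-side sketch of that concept in Python (the buffer layout, names, and functions here are illustrative assumptions, not any console's actual API):

```python
import struct

# Interleaved vertex buffer: position (3 floats) + uv (2 floats) per vertex,
# standing in for a raw GPU buffer (e.g. an SSBO).
STRIDE = 5 * 4  # bytes per vertex

def make_buffer(vertices):
    """Pack (position, uv) tuples into a flat byte buffer."""
    data = b""
    for pos, uv in vertices:
        data += struct.pack("5f", *pos, *uv)
    return data

def vertex_shader(vertex_id, buffer):
    """'Vertex pulling': the shader derives the byte offset from the
    vertex ID and fetches its own attributes -- no fixed-function
    input layout involved."""
    base = vertex_id * STRIDE
    x, y, z, u, v = struct.unpack_from("5f", buffer, base)
    return (x, y, z), (u, v)

buf = make_buffer([((0.0, 0.0, 0.0), (0.0, 0.0)),
                   ((1.0, 0.0, 0.0), (1.0, 0.0)),
                   ((0.0, 1.0, 0.0), (0.0, 1.0))])
print(vertex_shader(2, buf))  # ((0.0, 1.0, 0.0), (0.0, 1.0))
```

Because the fetch is just buffer arithmetic in the shader, the GPU can decide for itself which vertices to read and in what layout, which is what cuts the CPU out of much of the rendering loop.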


I get what you're saying, but the problem (a little bit) with the idea that the GPU is going to be better optimized over time is that these are not new GPUs either.  They are mostly off-the-shelf parts, using GPUs and driver sets from 2-3 years ago that have been modified for these systems.  It is not something new to program for and really learn how to use.  Can they improve some over time?  Yes, but the level of improvement we saw last gen from year 1 to year 3 was dramatic.  It would be extremely odd if developers simply couldn't figure out this gen's CPUs/GPUs faster.  With the scrutiny being applied to every release's resolution and framerate, developers are already maxing out what can be done.  By the end of 2016, I would be surprised if any game is taking "better advantage" of the systems than they do by March/June of this coming year.  



It is near the end of the end....