CGI-Quality said:

Regarding bottlenecking ~ let's take a look.

Is the CPU able to process data fast enough to keep up with the Xbox One X's GPU? Considering the relative weakness of the Jaguar CPU in the X, no, it often isn't, and the system is bottlenecked as a result (though rather than stutters, you'll mostly just see more games running @ 4K/30fps instead of 4K/60fps). The GPU, on the other hand, isn't the bottleneck, simply because it isn't the slower part in the relationship.

Ultimately, the X1X, like the Pro, is very much bottlenecked by its slower part. Not saying the device is 'bad', but it is held back by the slower CPU. Best example? I have a relatively fast dodeca-core processor running at a max [turbo] speed of 4.4GHz in my PC. The problem? It's nowhere near as fast as my dual-GPU setup, so the processor becomes the bottleneck. It's apparent in CPU-heavy games like Rise of the Tomb Raider (where the GPU ends up waiting on the CPU, causing stutters/hiccups/frame drops), and in PC benchmarks (this especially, in fact).

That's the science involved with the Xbox One X as well. It was a different story on the PS3, where the RSX 'Reality Synthesizer' (GPU) was the bottleneck when paired with the Cell processor, which was very fast for its time.

I mean, sure, the hardware balance definitely favors the GPU, but there are still plenty of instances where the Xbox One X is GPU bound, which is exactly when it can't maintain a locked resolution.
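
To put rough numbers on the "slower part sets the frame rate" idea, here's a toy Python sketch. All the millisecond figures are made up for illustration, and real engines overlap CPU and GPU work across frames, so max() is only a first approximation:

```python
# Toy frame-time model: every frame costs some CPU time and some GPU time,
# and (to a first approximation) the slower of the two sets the frame rate.
# All numbers below are invented for illustration.

def fps(cpu_ms: float, gpu_ms: float) -> float:
    """Frames per second when the slower stage dictates frame time."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# X1X-like case: weak CPU, strong GPU -> CPU bound, GPU headroom goes unused.
print(fps(cpu_ms=33.0, gpu_ms=16.0))   # ~30 fps, capped by the CPU

# PS3-like case: fast CPU, slower GPU -> GPU bound instead.
print(fps(cpu_ms=10.0, gpu_ms=25.0))   # ~40 fps, capped by the GPU

# Doubling GPU power does nothing for the CPU-bound frame...
print(fps(cpu_ms=33.0, gpu_ms=8.0))    # still ~30 fps
```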

It really depends on where developers invest their resources. The CPU has certainly been a sticking point all generation long, which is understandable: it's AMD's worst CPU, from a time when AMD's CPUs were absolutely terrible.
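
And on the "can't maintain a locked resolution" point: when a game goes GPU bound, engines usually drop the render resolution rather than the frame rate. A hypothetical sketch of that kind of dynamic-resolution controller (the budget, step size, and clamps are all invented for illustration):

```python
# Hypothetical dynamic-resolution controller: if the GPU portion of the frame
# blows past the budget, shed pixels instead of frames.

GPU_BUDGET_MS = 16.6   # target for 60 fps
STEP = 0.05            # how aggressively to change the scale
MIN_SCALE, MAX_SCALE = 0.7, 1.0

def update_render_scale(scale: float, gpu_ms: float) -> float:
    """Nudge the resolution scale down when GPU bound, back up when there's headroom."""
    if gpu_ms > GPU_BUDGET_MS:
        scale -= STEP          # GPU bound: lower resolution to stay on budget
    elif gpu_ms < GPU_BUDGET_MS * 0.9:
        scale += STEP          # headroom: creep back toward native resolution
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# Simulated run: a heavy scene pushes GPU time over budget, then it settles down.
scale = 1.0
for gpu_ms in [15.0, 18.5, 19.2, 17.0, 15.5, 14.0, 13.5]:
    scale = update_render_scale(scale, gpu_ms)
    print(f"gpu {gpu_ms:4.1f} ms -> render scale {scale:.2f}")
```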



--::{PC Gaming Master Race}::--