Chrkeller said:
curl-6 said:

The problem here is that you're treating one aspect of a system's technical makeup as if it were the only factor, while ignoring the numerous other components.

Many things other than bandwidth can bottleneck a system and limit performance, from CPU draw calls to pixel and texel fillrate to asset streaming.

Bandwidth is only one factor among many.

But look at what developers are doing across the three main areas of fidelity.

1) Resolution impacts bandwidth. Is the S2 rendering games at 360p so they look like The Witcher 3 on the S1? Nope. Rendering resolution, in many cases, is quite high.

2) Image quality impacts bandwidth. Is the S2 rendering games with rebuilt assets like Hogwarts on the S1? Nope; image quality (especially textures) is quite high.

3) FPS impacts bandwidth. Is the S2 running games at a reduced fps compared to current gen? YES.

Third party developers, thus far, are addressing the memory bandwidth bottleneck by dropping fps... this is literally happening; it is a fact.

There are things other than bandwidth that can limit framerate, from CPU load to pixel/polygon fillrate.

It will depend on the game of course; in the right circumstances (say, a game that pushes a ton of alpha transparencies at a high resolution) bandwidth could become the limiting factor, but I expect that in most ports of PS/Xbox games, it'll be more the CPU that necessitates 30fps.
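To put rough numbers on why fps and resolution are such big levers on bandwidth, here's a minimal back-of-envelope sketch in Python. The function name, the bytes-per-pixel figure, and the overdraw multipliers are all my own illustrative assumptions, not measurements of any real console; it only counts colour-buffer writes, not texture reads or anything else the GPU does.

```python
# Back-of-envelope framebuffer write bandwidth estimate.
# All figures below are illustrative assumptions, not measured numbers
# for any specific piece of hardware.

def framebuffer_bandwidth_gbs(width, height, fps, bytes_per_pixel=4, overdraw=3.0):
    """Rough GB/s spent writing (and re-writing) the colour buffer.

    'overdraw' approximates how many times each pixel gets touched per
    frame, e.g. by stacked alpha-blended transparency layers that
    read-modify-write the target instead of simply overwriting it.
    """
    bytes_per_frame = width * height * bytes_per_pixel * overdraw
    return bytes_per_frame * fps / 1e9

# 1080p at 60 fps vs 30 fps: halving the framerate halves this slice
# of the bandwidth budget outright.
print(framebuffer_bandwidth_gbs(1920, 1080, 60))  # ~1.5 GB/s with these assumptions
print(framebuffer_bandwidth_gbs(1920, 1080, 30))  # ~0.75 GB/s

# Heavy alpha transparency at a higher resolution pushes overdraw up,
# which is the scenario where bandwidth plausibly becomes the limiter.
print(framebuffer_bandwidth_gbs(2560, 1440, 60, overdraw=8.0))  # ~7 GB/s
```

The point of the toy maths is just that fps multiplies everything else, so dropping to 30fps is the cheapest single cut when bandwidth really is what's saturated; when it's the CPU instead, the same 30fps cap shows up for a completely different reason.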