Chrkeller said:
But look at what developers are doing across the three main sectors of fidelity. 1) Resolution impacts bandwidth. Is the S2 rendering games at 360p that look like The Witcher 3 did on the S1? Nope; rendering resolution, in many cases, is quite high. 2) Image quality impacts bandwidth. Is the S2 rendering games with rebuilt assets, like Hogwarts on the S1? Nope; image quality (especially textures) is quite high. 3) Fps impacts bandwidth. Is the S2 running games at a reduced fps compared to current gen? YES. Third party developers, thus far, are addressing the memory bandwidth bottleneck by dropping fps... this is literally happening, it is a fact.
There are things other than bandwidth that can limit framerate, from CPU load to pixel/polygon fillrate.
It will depend on the game, of course; in the right circumstances (say, a game that pushes a ton of alpha transparencies at a high resolution) bandwidth could become the limiting factor, but I expect that in most ports of PS/Xbox games it'll more often be the CPU that necessitates 30fps.
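To make the transparency point concrete, here's a minimal back-of-envelope sketch in Python. The function name and every number in it are mine and purely illustrative, not actual Switch 2 (or any hardware) figures; the underlying arithmetic is just the standard framebuffer-traffic estimate, where each extra blended layer forces a read-back plus a write of the render target:

```python
# Rough framebuffer bandwidth estimate. Illustrative only: ignores textures,
# depth traffic, and framebuffer compression, which all matter in practice.

def framebuffer_bandwidth_gbps(width, height, fps, bytes_per_pixel=4, overdraw=1.0):
    """GB/s spent just touching the color buffer for one render target."""
    # First layer is one opaque write; each additional blended (alpha) layer
    # must read the existing pixel back and write the blended result.
    accesses_per_pixel = 1 + 2 * (overdraw - 1)
    return width * height * bytes_per_pixel * accesses_per_pixel * fps / 1e9

# Same scene, different knobs:
print(framebuffer_bandwidth_gbps(1920, 1080, 60, overdraw=4))  # ~3.48 GB/s
print(framebuffer_bandwidth_gbps(1920, 1080, 30, overdraw=4))  # ~1.74 GB/s (halve fps)
print(framebuffer_bandwidth_gbps(960, 540, 60, overdraw=4))    # ~0.87 GB/s (quarter res)
```

Note how resolution, overdraw (a stand-in for effects like stacked transparencies), and fps all multiply into the same bandwidth figure, which is why a developer can hold resolution and assets steady and pay the bandwidth bill in fps instead, exactly the trade the quoted post describes.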