CGI-Quality said:
A 2GB difference is one thing (the GTX Titan was a 6GB card in 2013). The highest consumer-grade amount right now is 12GB. We won't be seeing a console with an additional 22GB on top of that.
If they stick with UMA, the GDDR has to serve as both graphics and system RAM, so it's not really accurate to compare that total against the amount on a discrete graphics card. A GDDR-based UMA obviously offers far greater graphics performance than the cheap DDR-based UMAs used in budget PCs, while keeping nearly the same advantage in design simplicity, and it has proven to be a great solution for a console. But if RAM prices don't drop quickly enough for the total amount they want next gen, they could fall back on the classic split-memory design, and in that case even 12GB of GDDR for graphics would be a very good amount, while for system RAM we could perhaps get 16-20GB of late DDR4 or early DDR5. That is, of course, only a possible scenario if they decide 16GB wouldn't be enough; we all agree that a 16GB GDDR UMA is definitely feasible and viable.

In the end, the reality is that the ridiculous RAM price situation forced devs to stop raising games' minimum RAM requirements, and since the quality of the best games kept improving anyway, it most likely means that engine-level developers made the latest versions of their game engines more CPU/GPU-intensive while keeping them not much more memory-hungry than the previous versions, at least at low and mid settings.