Pemalite said:
Egann said:

Besides, I'm not entirely convinced "more is better" with RAM. My PC rig has a paltry 6 GB, and Task Manager confirms I seldom use more than 4, often with stuff running in the background. I expect RAM--like other electronic components--hits a diminishing-returns wall somewhere between 2 and 8 GB. Less than two and you'll hit memory limits all the time; more than 8 and you're going to have developers making spaghetti code.


The PC doesn't equate to a console in terms of memory usage; the PC is doing *far* more in the background than a console ever has to manage, which makes sense, because the PC is the most flexible and powerful platform available.

Plus, the PC has multiple memory pools: you have your system RAM, which may be 6 GB, but on top of that you also have your video card's RAM, which today is usually around 2-6 GB in size.

On a console, the video and system memory are one and the same this time around. Thus, if you were to compare, say... the PS4 to a PC: a developer might use 4 GB of that RAM for GPU operations, leaving 3 GB for other tasks and 1 GB for the OS. Suddenly that 8 GB of RAM doesn't sound like a lot when you compare it to my 32 GB of system RAM and 9 GB of total video card memory. (3 GB is the max usable in games; with compute, however, I can use the whole 9 GB.)
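To make the comparison concrete, here's a minimal sketch of that budget arithmetic. The split (4 GB GPU, 1 GB OS out of a unified 8 GB pool) is just the illustrative figure from the post above, not an official allocation, and the helper function is purely hypothetical:

```python
def remaining_ram(total_gb, allocations_gb):
    """Return the RAM left in a pool after the listed allocations (all in GB)."""
    used = sum(allocations_gb.values())
    if used > total_gb:
        raise ValueError("allocations exceed the pool size")
    return total_gb - used

# Console-style unified pool: GPU work, OS, and game logic all share 8 GB.
console_free = remaining_ram(8, {"gpu": 4, "os": 1})   # 3 GB left for everything else

# PC-style split pools: system RAM and video RAM are separate budgets,
# so GPU usage never eats into the system pool.
pc_system_free = remaining_ram(32, {"os_and_background": 4})
pc_video_free = remaining_ram(9, {})

print(console_free, pc_system_free, pc_video_free)  # 3 28 9
```

The point of the sketch is that on a unified-memory console every gigabyte the GPU claims comes straight out of the same pool the game logic and OS use, whereas on a PC the two budgets are independent.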

You also can't forget that these machines are expected to last a decade without any changes to the hardware; what might seem "somewhat decent" today will seem paltry and pathetic in 5-10 years' time. (Hence my wish for better hardware in all the consoles.)

Oh, I'm sure developers will use all of whatever they're given, but I wonder how much of that will actually be constructive use. PCs are in an odd situation because most monitors can go higher than 1080p, or will during this generation, so they actually can use a lot of power. Consoles? I don't think any of them support 4K, so past a certain point, what difference does it make? More RAM will basically translate to running uncompressed textures that your HDTV screen can't display even if your system can churn them out. Games announced thus far, like MGS5 and The Witcher 3, make me think the new hardware can run open-world environments with visuals maxed out for the display--or close enough, anyway; my eye won't be able to tell the difference.

I just don't see RAM adding meaningfully to the equation when the limiting reagent is the 1080p HDTV screen.