Pemalite said:
Nyleveia said:
There is no benefit to adding more RAM to the PS4 or the Xbox One; there are only downsides: higher price, a longer wait before price drops, higher temperatures (more modules heating up), a higher chance of failure, developers getting lazy with their code because "there's spare space to fill", and longer load times while said space is filled with shit you might not even need.
|
Although the increase to 12GB of RAM won't happen for technical and cost reasons, the temperature excuse isn't that big of a deal. DDR3 typically runs at 1.5V and can be cooled passively (without a heatsink and fan) - modules typically use around 1-5 watts at most (i.e. competitive with the amount of energy a 120mm case fan would use). A bigger impact on heat would actually come from ambient temperature.
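For scale, here's a minimal sketch of that power math - the 1.5V / 1-5W figures are from the post above, while the module count and fan wattage are my own assumptions:

```python
per_module_watts = 3.0   # assumed mid-range of the 1-5 W claim above
fan_watts = 1.8          # assumed typical 120mm case fan (~0.15 A at 12 V)
modules = 8              # assumed: eight chips making up an 8GB pool

print(f"Per module: ~{per_module_watts} W "
      f"({per_module_watts / fan_watts:.1f}x a 120mm fan)")
print(f"Whole pool: ~{per_module_watts * modules:.0f} W, "
      f"spread across {modules} chips - fine for passive cooling")
```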
As for whether the amount would be useful... those who say it wouldn't, I want to borrow their crystal ball to see into the future. Remember how we would never need more than 640K of RAM? Look where we are today. Fact of the matter is, with 50% more RAM you can have substantially improved textures and other game assets, not just increased world sizes. - Seriously, the original Xbox ran Morrowind on 1/8th the RAM of the Xbox 360, and it was more than capable of presenting roughly the same game-world size as Oblivion without more RAM.
Sure, games today won't use that amount of memory, but what's to say that in a year or two they won't be memory constrained again?
|
There is a reason the memory would not be used well enough to be worth adding: the supporting hardware does not have the bandwidth to use the RAM effectively. The APU does not have the raw power required to drive the data bandwidth needed to make additional memory worthwhile, and when your target framebuffer is 1920x1080 or less, there is only so much texture data that can be displayed before adding more resolution to the textures becomes a waste of bandwidth (I'll put rough numbers on this at the end of the post). Sure, it's nice to have textures that are still pretty when your face is right up against a wall, but no developer is going to waste the system resources adding texture detail that you will not see 99% of the time, for two very good reasons:
1) Adding that level of texture quality is extremely costly on the creator's side.
2) The data in memory needs to get into memory to begin with. Assume they took 4GB of RAM for the system and left 8GB for video; even if only 2GB of that were used for textures, to support those textures you would also need to bump up the lightmaps, bump maps and shader quality, eating much of the available memory. "AHA!" I hear you scream, "SO IT DOES USE IT ALL!" Yes, indeed it would - but then you would need to wait for that data to be loaded from hard disk to memory, or from optical disc to memory, both of which take a long time (rough numbers below). And for the former, it would also mean games take up a substantially greater amount of hard disk space - given that the Xbox One has a fixed, non-swappable hard disk, that would be a stupid choice.
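To put rough numbers on point 2, here's a quick sketch - the drive throughputs are my assumptions for typical 2013-era hardware, not console specs:

```python
pool_mb = 8 * 1024     # hypothetical: fill the entire 8GB pool from disk
hdd_mbps = 100         # assumed sustained read of a 5400rpm laptop HDD
bluray_mbps = 27       # assumed sustained read of a ~6x BD-ROM drive

print(f"From hard disk: ~{pool_mb / hdd_mbps:.0f} seconds")     # ~82 s
print(f"From Blu-ray:   ~{pool_mb / bluray_mbps:.0f} seconds")  # ~303 s
```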
But I digress: if any of you can point to *ANY* game on PC, mods installed or otherwise, that uses more than 4GB of VRAM at 1080p, then I will hold up my hands and say "okay, I might be wrong" - but until then, you're just ignoring the reality of the situation.
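To put a number on the 1080p framebuffer point as promised: a 1920x1080 target can only ever display about two million pixels per frame, so past a certain point extra texture resolution is texels nobody sees. A rough sketch (the texture sizes are generic examples, not taken from any real game):

```python
screen_pixels = 1920 * 1080   # ~2.07 million visible pixels at 1080p

for size in (1024, 2048, 4096):
    ratio = (size * size) / screen_pixels
    print(f"A single {size}x{size} texture holds {ratio:.1f}x as many "
          f"texels as the whole screen can display at once")
```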
walsufnir said:
Nyleveia said:
Kyuu said: Mark my words, it's never gonna happen. 8 gigs (5 for games) of RAM is considered overkill by many; 12 gigs would be a complete waste. Microsoft would be stupid to do something like that. Think about it: would they raise the cost of an already expensive console for little to no reason? Come on.. |
My GTX Titan has 6GB of GDDR5, and yet there's virtually no game, regardless of settings, that can use all of that RAM at 1080p. Above that, RAM usage does increase a lot - hitting 4K just about maxes out 5GB of my Titan's memory in BF3 - and yet we have people fangasming over the prospect of a system potentially having 12GB when both consoles are primarily going to be aiming for (and sometimes missing) 1080p. It's just silly; the systems can't use the RAM they currently have, because the platforms as a whole aren't powerful enough to output a usable framerate for a decent 3D title above 1080p.
Of course, they may do a few 4K games, but these are going to be basic and hardly worth the resolution bump to begin with. Those expecting 4K from their latest AAA titles, though, are probably the same people who expected most PS3 and 360 games to be 1080p60.
In short, 5-6GB of RAM is easily more than enough for this entire generation. Adding more isn't going to bring much benefit beyond extremely expansive open worlds where data is streamed into the idle RAM in case you explore far enough - and the downside there is that you're basically wasting RAM to hold data the user may never need during a playthrough. That can be achieved through asset streaming, with little to no impact on gameplay, on both DDR3 and GDDR5 anyway.
There is no benefit to adding more RAM to the PS4 or the Xbox One; there are only downsides: higher price, a longer wait before price drops, higher temperatures (more modules heating up), a higher chance of failure, developers getting lazy with their code because "there's spare space to fill", and longer load times while said space is filled with shit you might not even need.
Rather than praying for stuff like this to be true, it would make much more sense to say "okay, if games ever do need more RAM, Microsoft and Sony should move the goalposts on the OS footprint and free some up once the OS has matured enough to know a large block won't be needed".
But that would make too much sense to raging fanboys.
|
You totally ignore that console memory is used for graphics *and* game data. You will also see a bump in PC requirements, as engines can now finally evolve because of the new console generation. Your downsides are also, to be honest, at least questionable, and some are even wrong.
|
System memory is used primarily to load meshes, sound and textures that are not time-critical; these are then passed along to the graphics unit to be processed, or placed in VRAM to be accessed faster when needed. This is an entirely arbitrary process, however - you can run a modern game on a PC with a decent graphics card and just 2GB of system RAM. Typically, system RAM is used to stream game data from the HDD to speed up load times and provide faster data-pool switching, but that is primarily a result of the inefficiencies of a modular system. A closed system like a games console relies much less on this process than you are suggesting, because data can be loaded from fixed media directly into whichever memory pool the program defines. The end result is the same, though: even if you split 8GB down the middle, a decent-spec PC - with all its inefficiencies as a platform - running 4GB of video RAM and 4GB of system RAM will still run games at 1080p perfectly fine, and even then it will still NOT FILL UP ALL 4GB OF VRAM.
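Here's a minimal sketch of that disk-to-RAM-to-VRAM staging flow, assuming hypothetical load_from_disk() and upload_to_vram() helpers (neither is a real engine API):

```python
import queue
import threading

def load_from_disk(name):
    """Hypothetical blocking disk read for one asset."""
    return b"\x00" * 1024           # placeholder bytes

def upload_to_vram(name, data):
    """Hypothetical hand-off of staged data to the GPU's memory pool."""
    pass

requests = queue.Queue()
system_ram = {}                     # the "system RAM" staging pool

def streamer():
    # Background thread: pull requested assets off disk into system RAM
    # ahead of time, then push them on to VRAM once staged.
    while True:
        name = requests.get()
        if name is None:            # sentinel: shut the worker down
            break
        system_ram[name] = load_from_disk(name)
        upload_to_vram(name, system_ram[name])

worker = threading.Thread(target=streamer)
worker.start()
requests.put("rock_diffuse")        # ask for assets before they're needed
requests.put("ambience_loop")
requests.put(None)
worker.join()
```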
Even if current engines magically "evolved", as you put it, there is a plateau with regard to render space and framebuffer bandwidth. A large part of why there is still such a push for faster, more powerful graphics cards isn't that engines demand it; it's that modern monitor resolutions are getting higher, and with them grow the power and resources needed to feed that video data requirement. People still using a 1080p monitor and an old graphics card can play modern games just fine.
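A rough sketch of how raw framebuffer traffic scales with resolution - 4 bytes per pixel and 60fps are simplifying assumptions, and real pipelines move far more data (G-buffers, overdraw), but the ratios are the point:

```python
bytes_per_pixel = 4   # assumed single 32-bit colour target
fps = 60

for label, w, h in (("1080p", 1920, 1080),
                    ("1440p", 2560, 1440),
                    ("4K",    3840, 2160)):
    gb_per_s = w * h * bytes_per_pixel * fps / 1e9
    print(f"{label}: ~{gb_per_s:.1f} GB/s just to write one colour buffer per frame")
```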
But I will say it again:
The PS4 and Xbox One do not have the CPU or GPU power to use more than 6GB of VRAM effectively at 1080p. If you started pushing more texture data through the framebuffer, you would just be diminishing performance. 12GB of RAM would not make the system better; 12GB of RAM would hurt framerates in exchange for an extremely minor bump in graphical fidelity.