Pemalite said:
Billjw said:
Can developers afford to utilize all the extra RAM? Is the progress of console hardware outpacing what developers can feasibly exploit, possibly leaving a lot of untapped horsepower? There have been a boatload of stories of developers downsizing in recent years after failing to reach target sales, targets the market may simply not sustain, because the outrageous cost of producing these games means several million units need to be sold just to turn a profit.


RAM is easy to waste and easy to use. However, just remember that RAM itself doesn't do any processing.
Think of it as a super-ultra-fast flash drive that holds information for the CPU, GPU and other I/O hardware to grab when they need it.

The thing with this current generation, though, is that comparatively, the Xbox 360 and PlayStation 3 had relatively high-end graphics processors next to what was available on the PC at the time.
This time around they're running with only a Radeon 7850, which is mid-range stuff. - Better than the Radeon 6670/7670 rumours that flew around at one stage, which I was dreading, but I would have loved for them to drop something even faster into the box.

As for the CPU that's going into the PS4, it's a slow heap of crap. - Something I would use in an HTPC under my television, or in a tablet or a netbook.
Then again, the Xbox 360's CPU and the PlayStation 3's CPU aren't exactly speed demons either, and there are simple reasons for that: one is power budgets, the second is cost, which is in direct relation to the fabrication process and transistor counts.

Just remember that whenever Sony or Microsoft or Nintendo "claim" something about what their consoles can do, take it with a pinch of salt. They're advertising to get you to buy the machines; these aren't supercomputers, or even equivalent to a high-end PC.
Heck, most developers will probably abuse the Unreal Engine again anyway, rather than push image quality.

Thanks for posting this. I stayed out of this thread because I didn't want to tell people *again* that they don't know what RAM actually is for but I totally agree with you :)

Again, in short: RAM exists to reduce the compute units' I/O wait. Simple as that. Everything else follows from that one statement.
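To make that concrete, here's a minimal sketch (my own illustration, not anything from the consoles themselves) of why holding data in RAM reduces I/O wait: the first load pays the storage round trip, and every access after that is served from memory. The file name and sizes are arbitrary, and note that the OS page cache will already soften the disk numbers, so the gap shown here understates the real cost of hitting raw storage.

```python
import os
import tempfile
import time

# Create a ~10 MB scratch file to stand in for a game asset on disk.
path = os.path.join(tempfile.gettempdir(), "asset_demo.bin")
with open(path, "wb") as f:
    f.write(os.urandom(10 * 1024 * 1024))

def load_from_disk():
    # Every call pays the full storage round trip (syscalls + copy).
    with open(path, "rb") as f:
        return f.read()

# Uncached: re-read from storage on every access.
t0 = time.perf_counter()
for _ in range(20):
    data = load_from_disk()
disk_time = time.perf_counter() - t0

# Cached: pay the storage cost once, then serve from RAM.
t0 = time.perf_counter()
cached = load_from_disk()
for _ in range(20):
    data = cached  # just a reference; the bytes already sit in RAM
ram_time = time.perf_counter() - t0

print(f"repeated disk reads: {disk_time:.4f}s, RAM-cached: {ram_time:.4f}s")
os.remove(path)
```

The RAM never computed anything here; it just kept the CPU from idling while waiting on slower storage, which is exactly the "reduce I/O wait" point.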