Also, the PS4 GPU is more like the 7970M (M for mobile) than the 7850, for the simple reason that it's built to save power. Not that this fact alone would change much about the RAM management, true. But it's also true that we have no frigging clue what's really inside that chip. We just know that the memory bandwidth is higher than both the 7850's and the 7970M's. There could be, and most likely is, the typical "secret sauce" we find in consoles.
Actually, it doesn't have more bandwidth dedicated to graphics than the 7850. Remember, it uses a unified memory architecture: the CPU, hard drive, optical drive, and all the I/O stuff share a chunk of that bandwidth. If you tax the entire system to 100% with a lot of bus transfers, the bandwidth available to graphics will tank, and the PS4 doesn't have eDRAM to make up for that hit.
That is the main problem with unified memory compared with the split memory architectures found in the PC and, to a lesser extent, the PS3.
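Rough back-of-the-envelope sketch of that contention effect. The 176 GB/s total is the PS4's widely reported GDDR5 figure; the CPU and I/O traffic numbers below are made up purely for illustration:

```python
# With unified memory, every byte the CPU and I/O move comes out of the
# same pool as graphics, so the GPU's share shrinks as the system is taxed.

TOTAL_BANDWIDTH_GBPS = 176.0  # PS4's reported GDDR5 bandwidth

def gpu_bandwidth(total, cpu_gbps, io_gbps):
    """Bandwidth left over for the GPU after other clients take their share."""
    return max(total - cpu_gbps - io_gbps, 0.0)

# Lightly loaded system (hypothetical traffic): the GPU gets almost everything.
print(gpu_bandwidth(TOTAL_BANDWIDTH_GBPS, 2.0, 0.5))    # 173.5 GB/s

# Heavily taxed system (hypothetical traffic): the graphics share drops.
print(gpu_bandwidth(TOTAL_BANDWIDTH_GBPS, 20.0, 10.0))  # 146.0 GB/s
```

A split architecture avoids this because the video card's memory bus serves the GPU alone.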
Teraflops-wise, it's almost the same as a desktop Radeon 7850, a mid-range card, not a high-end one.
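You can check that comparison with the standard GCN arithmetic (shaders × clock × 2 ops per cycle for fused multiply-add). The shader counts and clocks below are the commonly reported specs, not official confirmed numbers:

```python
# Single-precision throughput of a GCN GPU: shaders * clock * 2 (FMA = 2 ops).
def teraflops(shaders, clock_ghz):
    return shaders * clock_ghz * 2 / 1000.0

print(teraflops(1024, 0.86))  # Radeon HD 7850: ~1.76 TFLOPS
print(teraflops(1152, 0.80))  # PS4 GPU (reported 18 CUs): ~1.84 TFLOPS
```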
Still, after the debacle of the PS3's split RAM architecture, having 8GB of unified video RAM is a huge step forward. Even Skyrim, by the 6th or 7th patch fixing the memory leaks, could run just fine on the PS3 (I played it personally; not a single freeze, and the frame drops disappeared). Issues like that will never happen again. And more video RAM can be useful to pre-render and store some parts of the environment while the GPU isn't very busy. Sure, it must be optimized. That's why Naughty Dog games will look greater than ever, while the multiplatform ones, especially those ported from PC, will most likely run at a lower fps.
Skyrim was Bethesda's own fault. Skyrim's game engine is essentially an upgraded Gamebryo, the same engine found in Oblivion; hell, even a lot of the shader code in Oblivion, Fallout 3 and Skyrim is the same. It's a horrible engine, with some benefits like amazing modding capability.
Remember, Oblivion performed fairly horribly on the Xbox 360 and PlayStation 3 too, with lots of item pop-in, freezes and framerate issues, so it's not like they didn't have any experience trying to get this engine running on consoles.
It's just Bethesda has bad testers, period.
This is only academic anyway. The truth is that there are plenty of (PC) games that don't need to fully use the top GPUs. I can see a big issue only with games like Crysis 4, unless they improve their stupid engine. Beyond that, having, for example, Lara Croft with or without dynamic hair won't change my life nor my gaming leisure at all.
Actually, that's not entirely accurate.
It's pretty well known that any high-end card made in the last 4-5 years can max out pretty much any console port with relative ease at the 720p resolution most console games run at anyway.
PCs only need 2-4 of the fastest GPUs on the market running in tandem when rendering not at 720p or 1080p, but at 8-10x that resolution; that's what the extra performance has allowed the PC gaming master race to achieve.
And let me tell you, until you've gamed with Eyefinity/Surround Vision at that kind of resolution... you haven't really "gamed". :)
Heck, even 5760x1080 is a massive step up too; the immersion can be incredible in some titles.
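For anyone curious where those multipliers come from, the pixel counts are easy to work out (the triple-1440p setup is just one example of a Surround config, not something claimed above):

```python
# Pixel counts behind the resolution comparisons.
def pixels(width, height):
    return width * height

p720      = pixels(1280, 720)    # 921,600 pixels (typical console target)
eyefinity = pixels(5760, 1080)   # 6,220,800 pixels (triple 1080p)

print(eyefinity / p720)              # 6.75x the pixels of 720p
print(pixels(7680, 1440) / p720)     # 12.0x - a triple-1440p Surround setup
```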