| SvennoJ said:
If the PS3 had 2 x 512MB of split memory instead of 2 x 256MB, it would not have been a problem; therefore it was a hardware limitation. Arguing about architecture is beside the point, as the architecture is part of the hardware design.
And Skyrim became unplayable on the PS3 because it had to track too many world changes; the change log in memory grew too big, on top of an ancient engine with lousy garbage cleanup. The game only stayed playable after turning off all auto-save features and restarting it every 15 minutes. It was either a memory leak or broken garbage-cleanup routines, since loading the same save file from a fresh start made it work again (for a while). FS2020 had the same problem initially: the clean-up routines for the streaming cache lagged behind the fetch routines, so the game used more and more memory. I clocked it at 51GB of RAM before it finally crashed, already running horribly while maxing out the page file.
You can throw more RAM at it, or optimize the engine, but first you need enough RAM and storage to actually keep track of everything. With enough storage you can track everything in a huge seamless open world, whether in the cloud, in RAM, or on disk. I've worked on that myself in GPS navigation, with maps of all of Europe and the US supporting user changes and updates anywhere, shareable online: basically a dynamic map stored as a multi-level quadtree with bit flags to indicate where changes are, plus a hash table to look up the actual changes efficiently. You could build onto it yourself (though it was mostly used for temporary closures, direction changes, speed traps, traffic jams, etc.). And that ran on piss-poor hardware nearly 20 years ago with little RAM, streaming data back and forth to disk and fetching what's relevant. Surely a game can do that as well...
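A rough sketch of what that quadtree-plus-hash-table scheme could look like. All names, the tile granularity, and the payloads here are made up for illustration; the actual navigation code is not shown anywhere in this thread. The idea is that each quadtree node carries a "dirty" bit flag so clean regions of the map can be skipped entirely, while the edits themselves live in a hash table keyed by tile coordinates:

```python
class QuadNode:
    """One cell of a multi-level quadtree over the map (hypothetical layout)."""
    def __init__(self, x, y, size):
        self.x, self.y, self.size = x, y, size
        self.dirty = False      # bit flag: any change in this subtree?
        self.children = None    # subdivided lazily, only along dirty paths

class ChangeMap:
    MIN_TILE = 1                # assumed leaf tile size

    def __init__(self, world_size):
        self.root = QuadNode(0, 0, world_size)
        self.changes = {}       # hash table: (x, y) -> change payload

    def record_change(self, x, y, payload):
        """Store the edit and set dirty flags down the tree to the leaf."""
        self.changes[(x, y)] = payload
        node = self.root
        while node.size > self.MIN_TILE:
            node.dirty = True
            half = node.size // 2
            if node.children is None:
                node.children = {}
            key = (x >= node.x + half, y >= node.y + half)
            if key not in node.children:
                cx = node.x + (half if key[0] else 0)
                cy = node.y + (half if key[1] else 0)
                node.children[key] = QuadNode(cx, cy, half)
            node = node.children[key]
        node.dirty = True

    def changes_in(self, x0, y0, x1, y1):
        """Walk only dirty subtrees, then fetch payloads from the hash table."""
        out, stack = [], [self.root]
        while stack:
            n = stack.pop()
            if not n.dirty:
                continue        # clean subtree: skipped without descending
            if n.x + n.size <= x0 or n.x >= x1 or n.y + n.size <= y0 or n.y >= y1:
                continue        # outside the query window
            if n.children:
                stack.extend(n.children.values())
            elif (n.x, n.y) in self.changes:
                out.append(((n.x, n.y), self.changes[(n.x, n.y)]))
        return out
```

This is also why it runs fine on weak hardware: a query touches only the dirty branches, and everything else can stay on disk until a dirty flag says it's worth fetching.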
Dynamic machines, sure, they will only be active in a bubble around the player, just like any other moving parts such as NPCs; the enemies don't patrol on the other side of the map either. That's the whole point of storage: go out of range, freeze and store until back in range. Minecraft does that as well. I don't expect to make a harvester that keeps running on the other side of the map; that's left to MMO-style games that run in the cloud. (Although there are plenty of ways to simplify things in the case of harvesters or machines that appear to keep on going, such as logging the output. Games are deterministic by nature; everything can be turned back into an equation while the player is absent.)
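The freeze-and-catch-up idea can be sketched in a few lines. Everything here (the class, the bubble radius, the fixed production rate) is an assumed toy example, not any particular game's code; the point is just that a deterministic machine's time while frozen collapses into one closed-form equation on thaw:

```python
import math

BUBBLE_RADIUS = 100.0   # assumed activation range around the player

class Harvester:
    RATE = 2.5          # units produced per second (deterministic)

    def __init__(self, x, y):
        self.x, self.y = x, y
        self.output = 0.0
        self.frozen_at = None   # sim time when it left the bubble, else None

    def tick(self, dt):
        """Normal per-frame update while inside the bubble."""
        self.output += self.RATE * dt

    def freeze(self, now):
        """Leaving the bubble: store a timestamp, stop simulating."""
        self.frozen_at = now

    def thaw(self, now):
        """Re-entering the bubble: catch up in one step, rate * elapsed time."""
        if self.frozen_at is not None:
            self.output += self.RATE * (now - self.frozen_at)
            self.frozen_at = None

def update_world(player_pos, machines, now, dt):
    """Only machines inside the bubble get real per-frame ticks."""
    for m in machines:
        in_range = math.hypot(m.x - player_pos[0], m.y - player_pos[1]) <= BUBBLE_RADIUS
        if in_range:
            m.thaw(now)             # no-op if it was never frozen
            m.tick(dt)
        elif m.frozen_at is None:
            m.freeze(now)
```

A machine frozen for ten minutes costs one multiplication on thaw instead of ten minutes of simulation, which is what "logging the output" buys you.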
|
Except we know the hardware was fine, because the specs were the same as the 360's; the problem was the architecture, which only allowed half the memory to be utilised. It isn't anyone else's problem if you don't understand the relationship between architecture and hardware. Architecture is why a platform like the Switch can run games like The Witcher 3 and Doom with the specs it has, even though the numbers on paper don't look possible.
Having the hardware is only one part of the equation; architecture determines how each component functions in unison, and in the PS3's case the RAM was divided. The whole "throw more numbers at it" argument is flawed because you can never have enough memory to do what you ask: even the data that you say should freeze when you are out of range is taking up space, and it will only build up, which is why you're being unrealistic. What you're complaining about will always be there no matter the hardware and architecture, because dealing with it is the developers working smart.
To highlight the flaw in what you are saying: it is essentially like telling someone that if you leave a tap running, the bathtub will eventually overflow, so you need an overflow pipe installed as well as drainage. Your answer of more RAM is effectively saying "just get a bigger bathtub"; it doesn't remove the core issue or the need for measures to deal with it.
Last edited by Wyrdness - on 28 May 2023