curl-6 said:
sc94597 said:

My point was that simulations are usually updated at the precision they require, not necessarily once per rendered frame; at times their cost can stay roughly constant regardless of frame-rate, while at other times it can scale super-linearly with frame-rate.
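A minimal fixed-timestep sketch illustrates that decoupling (illustrative only, not any particular engine's code): the simulation advances at its own fixed rate, so rendering more frames adds render calls, not simulation steps.

```python
import time

SIM_DT = 1.0 / 30.0  # simulation steps at a fixed 30 Hz, regardless of render rate

def simulate(dt):
    """Physics/AI step at the precision the simulation needs (placeholder)."""

def render(alpha):
    """Draw interpolated state; runs as often as the display allows (placeholder)."""

def game_loop(run_seconds=1.0):
    accumulator = 0.0
    previous = start = time.perf_counter()
    while time.perf_counter() - start < run_seconds:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now

        # Consume elapsed time in fixed-size simulation steps. Rendering at
        # 60 fps instead of 30 adds render calls below, not simulation steps here.
        while accumulator >= SIM_DT:
            simulate(SIM_DT)
            accumulator -= SIM_DT

        # Render once per loop pass, interpolating toward the latest sim state.
        render(accumulator / SIM_DT)
```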

As to the second point,

Sure, these things scale. That's not in dispute. What has also scaled since Morrowind are memory capacity, memory bandwidth, storage space, CPU speed, etc. Even if the number of objects (and their storage size) scaled to 10,000 times Morrowind's (for a game-world 8 times larger than Morrowind's), all of that hardware has scaled by at least as much, if not more, and Bethesda has likely learned many more tricks since then (e.g. using more efficient data structures, putting a 30-day time limit on object permanence, etc.)

Object permanence is mostly a storage/memory-management problem. The CPU does have a role in it (writing from memory to disk and pulling from disk back into memory), but it isn't some special, magic workload.

Most developers don't implement it in their games because it is a system that is developmentally costly to pull off correctly, not necessarily runtime-costly (if it is pulled off correctly.)

I'm aware, but that doesn't change the fact that running at 60 gives you half as much frame time to work with. There may be technicalities with the update rate of different tasks and systems or how your load is spread across cores/threads/components, but ultimately 33.3ms is double 16.7ms.

And world scale doesn't tell you much; it's what's in it that determines how demanding it is. You can make a single room unable to hit 60 if you fill it with enough processing tasks.

As Digital Foundry points out, 30fps vs 60fps is a design choice. Developers have a vision and a limited amount of power with which to achieve it. There are some things current hardware simply can't do at 60.

Sure, it gives you twice as much time to finish a calculation; my point is that it doesn't necessarily mean double the available processing power.

Here is how item locations are implemented in Skyrim. Do you think Starfield will work fundamentally differently? I doubt it, and nothing shown so far tells us otherwise. I suppose the location space itself is fungible because of the procedural generation; that might be a difference that affects calculations once, when the world is procedurally generated.

https://en.uesp.net/wiki/Skyrim_Mod:Save_File_Format

globalDataTable1 | Global Data[fileLocationTable.globalDataTable1Count] | Types 0 to 8.
globalDataTable2 | Global Data[fileLocationTable.globalDataTable2Count] | Types 100 to 114.
changeForms      | Change Form[fileLocationTable.changeFormCount]       |
globalDataTable3 | Global Data[fileLocationTable.globalDataTable3Count] | Types 1000 to 1005.
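As a rough illustration of what a "change form" style record amounts to (a hypothetical sketch, not Skyrim's actual binary layout): the save only needs to store a delta per touched object, keyed by its form ID, so keeping objects where you left them is a table write plus serialization, not an ongoing simulation.

```python
from dataclasses import dataclass

@dataclass
class ChangeForm:
    # Hypothetical fields for illustration; the real format packs many more
    # flags and record types (see the UESP page above).
    form_id: int             # which base object this delta applies to
    position: tuple          # (x, y, z) where the player left it
    rotation: tuple          # (pitch, yaw, roll)
    last_touched_day: float  # in-game day, used later for cleanup

# The "permanence" table: form ID -> stored delta. On save it is written out;
# on load it is read back and applied over the static world data.
change_forms: dict[int, ChangeForm] = {}

def record_move(form_id, position, rotation, game_day):
    # Moving an object overwrites one entry; the cost is a table write,
    # not a per-frame simulation of the object while you are elsewhere.
    change_forms[form_id] = ChangeForm(form_id, position, rotation, game_day)
```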

You might also have longer object permanence (in Skyrim, if a cell isn't touched for 30 in-game days, the object is removed from the table). Or a greater variety of items might remain permanent.
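Lengthening that window is then mostly a tuning knob on a cleanup pass like this (again a hypothetical sketch, reusing the table fields from the snippet above):

```python
PERMANENCE_DAYS = 30.0  # assumed window; a longer window mainly grows the table and the save file

def prune_stale_entries(change_forms, current_game_day):
    # Entries untouched for longer than the window revert to default placement.
    stale = [fid for fid, cf in change_forms.items()
             if current_game_day - cf.last_touched_day > PERMANENCE_DAYS]
    for fid in stale:
        del change_forms[fid]
```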

But that will mostly affect the save-file size. Nothing shown so far hints, for example, that while you are on planet X, physics is being simulated on planet Y's objects such that their locations might change while you are away. "Permanence" in Elder Scrolls/Fallout/etc. is mostly permanence of an object's location and form, which are stored in a table and pulled when you load a save file.

This is quite different from, say, a game like Tears of the Kingdom, where interactions keep happening with objects you leave behind (which is why there is a 20-object limit and a maximum-distance limit, overall producing very little permanence in that game.)
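For contrast, a cap like the one described there might look like the sketch below (purely illustrative; the 20-object figure is taken from the post above and the distance value is an assumption, not anything from Nintendo):

```python
import math

MAX_TRACKED = 20      # per the post above: only a handful of player-affected objects persist
MAX_DISTANCE = 300.0  # assumed units; beyond this, objects are simply despawned

def prune_tracked_objects(tracked, player_pos):
    # Drop anything too far from the player...
    kept = [obj for obj in tracked
            if math.dist(obj["pos"], player_pos) <= MAX_DISTANCE]
    # ...then, if still over the cap, keep only the most recently created objects.
    kept.sort(key=lambda obj: obj["created_at"])
    return kept[-MAX_TRACKED:]
```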