Zkuq said:
Pemalite said:

It was only done on optical discs; mechanical hard disks didn't "duplicate data" to strategically position chunks of data for performance reasons.
Game developers don't control where data gets placed on a mechanical hard drive anyway; it's the Operating System that manages that through the file system... And not all file systems are competent at managing data placement, thus requiring defragmentation of mechanical drives on a semi-regular basis.

Really? Do games duplicate data because of optical media, even though pretty much everything has been installed to mass storage anyway for about a decade now...? I thought it had something to do with seek times, which, as far as I know, are an issue on HDDs as well.

Actually, a quick search online reinforces my understanding, but I didn't look into it that deeply.

On optical media, the game developer has the power to decide which data gets duplicated and where it is placed.

On a console's internal storage, game developers don't have any control over which sectors of a mechanical hard drive any part of a game gets installed on... Nor can they shift another game's install out of the way so that their own data lands on a particular part of the drive.

The performance of a mechanical hard drive is still an order of magnitude better than that of optical media.
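A quick back-of-the-envelope sketch of that gap. All the figures here are assumed ballpark numbers for illustration (roughly 120 ms seek / 15 MB/s for a console-era optical drive, 10 ms seek / 120 MB/s for a mechanical hard drive), not measurements:

```python
# Effective throughput when every read pays one full seek before
# transferring the data. Seek and bandwidth figures are illustrative
# assumptions, not benchmarks of any specific drive.

def effective_throughput(seek_s, sequential_mb_s, read_mb):
    """MB/s achieved when each read of `read_mb` MB costs one seek."""
    transfer_s = read_mb / sequential_mb_s
    return read_mb / (seek_s + transfer_s)

optical = effective_throughput(seek_s=0.120, sequential_mb_s=15, read_mb=1.0)
hdd     = effective_throughput(seek_s=0.010, sequential_mb_s=120, read_mb=1.0)

print(f"optical: {optical:.1f} MB/s, HDD: {hdd:.1f} MB/s")
```

For 1 MB reads the HDD comes out roughly ten times faster, and the gap widens further as reads get smaller and seek time dominates.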

Zippy6 said:

Asset duplication was absolutely a thing on HDDs. Hell, it's likely still done in some games now, as it's just how developers have structured their files for ages, but on HDDs it was more of a necessity for streaming assets.

Mark Cerny talked about it before.

"Without duplication, drive performance drops through the floor - a target 50MB/s to 100MB/s of data throughput collapsed to just 8MB/s in one game example Cerny looked at. Duplication massively increases throughput, but of course, it also means a lot of wasted space on the drive."

https://www.digitalfoundry.net/articles/digitalfoundry-2020-playstation-5-the-mark-cerny-tech-deep-dive
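The collapse Cerny describes falls straight out of the seek arithmetic. A rough model with assumed numbers (100 MB/s sequential HDD, ~8 ms average seek plus rotational latency, 64 KB scattered reads; these are not figures from the article):

```python
# Why many small scattered reads collapse HDD throughput, while reading
# duplicated, co-located data in big contiguous chunks recovers it.
# SEQ_MB_S and SEEK_S are illustrative assumptions.

SEQ_MB_S = 100.0   # assumed sustained sequential throughput
SEEK_S   = 0.008   # assumed average seek + rotational latency per read

def throughput(read_kb):
    read_mb = read_kb / 1024.0
    return read_mb / (SEEK_S + read_mb / SEQ_MB_S)

scattered  = throughput(64)        # small reads all over the platter
contiguous = throughput(8 * 1024)  # one big read of co-located data

print(f"64 KB scattered reads: {scattered:.1f} MB/s")
print(f"8 MB contiguous reads: {contiguous:.1f} MB/s")
```

With these assumptions, scattered 64 KB reads land near the single-digit MB/s figure Cerny quotes, while big contiguous reads stay close to the drive's sequential rate, which is exactly what duplicating assets next to each other buys you.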

Spider-Man, for example, stores all the assets needed for a city block together, so when that block is loaded as you move through the city, all the data sits together on the drive and can be read quickly. That led to them at one point having a trash-bag asset duplicated over 600 times, taking up about a gigabyte of space, before they switched to keeping it resident in RAM instead.

A lot of games will have big chunk files for levels that contain all the assets for that level, even if the asset has appeared in another level previously.
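The cost of that layout is easy to see in a toy manifest. This is a hypothetical sketch, with made-up asset names and sizes, of per-level chunk files that each carry a full copy of every asset they reference versus a deduplicated total:

```python
# Hypothetical per-level "chunk file" layout: each level's chunk embeds
# a copy of every asset it references, so shared assets are duplicated
# once per referencing chunk. All names and sizes are invented.

asset_sizes = {"trash_bag": 1.7, "hydrant": 2.0, "hero_mesh": 40.0}  # MB

level_manifests = {
    "block_01": ["trash_bag", "hydrant", "hero_mesh"],
    "block_02": ["trash_bag", "hero_mesh"],
    "block_03": ["trash_bag", "hydrant"],
}

# Total on-disk size with per-chunk duplication.
duplicated = sum(asset_sizes[a]
                 for refs in level_manifests.values() for a in refs)
# Size if each asset were stored once (e.g. shared assets kept in RAM).
unique = sum(asset_sizes[a]
             for a in {a for refs in level_manifests.values() for a in refs})

print(f"with per-chunk duplication: {duplicated:.1f} MB")
print(f"deduplicated:               {unique:.1f} MB")
```

Scale the same pattern up to hundreds of city blocks referencing one trash bag and you get the 600-plus copies Insomniac reported.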

A game developer doesn't get to choose which sectors their files will occupy in a file system... So they can never guarantee that blocks of data will sit next to each other, and they cannot demand that data be evicted from one area and moved to another for performance reasons.

The seek time from the innermost tracks to the outermost tracks of a hard drive is around 10 ms.
And if your hard drive has a latency of 250 ms... then something is wrong with your drive, which makes me question that article; 250 ms is optical-drive latency territory.

Now, consoles like the Xbox 360 leveraged XTAF in order to *try* to ensure data was laid out sequentially; the Xbox One moved over to exFAT, which relies on indexing to keep track of chunks of data spread across the entire drive... And because of that, fragmentation of data is a massive issue on Xbox One, which can result in performance degradation over time.
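A toy model of why that degradation shows up. The idea is that an indexed file system can legally scatter a file's clusters, so reading the file costs roughly one seek per fragment; the seek time, bandwidth, and fragment counts below are assumptions for illustration, not Xbox One measurements:

```python
# Toy fragmentation model: reading a file costs one seek per fragment
# (extent) plus the sequential transfer time. Figures are assumed.

SEEK_S  = 0.010   # assumed average seek time per fragment
BW_MB_S = 120.0   # assumed sequential throughput

def read_time_s(file_mb, fragments):
    """Seconds to read a file split into `fragments` extents."""
    return fragments * SEEK_S + file_mb / BW_MB_S

fresh      = read_time_s(256, fragments=1)    # contiguous after install
fragmented = read_time_s(256, fragments=400)  # after install/delete churn

print(f"contiguous: {fresh:.2f} s, fragmented: {fragmented:.2f} s")
```

With these numbers, a 256 MB level file nearly triples in load time once it's shattered into hundreds of extents, even though the drive hardware hasn't changed at all.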

Last edited by Pemalite - on 18 January 2026


www.youtube.com/@Pemalite