Currently my PC is 8 TB, using about 3 TB at the moment. All M.2 drives, so feeling pretty good.


Zkuq said:
Really? Do games duplicate data because of optical media, despite probably everything getting installed to mass storage anyway for something like ten years now? I thought it had something to do with seek times, which, as far as I know, are an issue with HDDs as well. Actually, a quick search online reinforces my understanding, but I didn't look into it that deeply. |
On optical media, the game developer has the power to determine what data is duplicated and where it is placed.
On a console's internal storage, game developers have no control over which sectors of a mechanical hard drive any part of a game gets installed to... They cannot shift another game's install out of the way to claim certain parts of the drive.
The performance of a mechanical hard drive is still far better than that of optical media, especially where seek times are concerned.
| Zippy6 said: Asset duplication was absolutely a thing on HDDs; hell, it's likely still done in some games now, as it's just how developers have structured their files for ages, but it was more of a necessity on HDDs for streaming assets. Mark Cerny talked about it before: "Without duplication, drive performance drops through the floor - a target 50MB/s to 100MB/s of data throughput collapsed to just 8MB/s in one game example Cerny looked at. Duplication massively increases throughput, but of course, it also means a lot of wasted space on the drive." Spider-Man, for example, stores all the assets needed for a city block together, so when that block is loaded as you move through the city, all the data is together on the drive to be read fast. This led to them at one point having a trash bag asset duplicated over 600 times and taking up a gig of data, before they kept it stored in RAM instead. A lot of games will have big chunk files for levels that contain all the assets for that level, even if an asset has appeared in another level previously. |
A game developer doesn't get to choose which sectors their files occupy in a file system, so they can never guarantee that blocks of data will sit next to each other, and they cannot demand that data be evicted from one area and moved to another for performance reasons.
The seek time from the innermost tracks to the outermost tracks of a hard drive is around 10 ms.
And if your hard drive has a latency of 250 ms... then something is wrong with your drive, which makes me question that article; those are optical drive latency figures.
Now, consoles like the Xbox 360 leveraged XTAF to *try* to ensure data is laid out sequentially. The Xbox One moved over to exFAT, which relies on its allocation table to keep track of chunks of data spread across the entire drive... and because of that, fragmentation is a massive issue on the Xbox One and can result in performance degradation over time.
Yeah, it's pretty crazy.
My gaming computer (4070 Super, 7800X3D, 32 GB RAM and 2 TB of SSD storage) cost me $1,400 when I bought it in 2024.
An almost equivalent computer (5070, 7800X3D, 32 GB RAM and 1 TB of SSD storage) cost $2,200 when I checked a few days ago, lol.

Pemalite said:
On optical media, the game developer has the power to determine what data is duplicated and where it is placed.
A game developer doesn't get to choose which sectors their files occupy in a file system, so they can never guarantee that blocks of data will sit next to each other, and they cannot demand that data be evicted from one area and moved to another for performance reasons. Now, consoles like the Xbox 360 leveraged XTAF to *try* to ensure data is laid out sequentially. The Xbox One moved over to exFAT, which relies on its allocation table to keep track of chunks of data spread across the entire drive... and because of that, fragmentation is a massive issue on the Xbox One and can result in performance degradation over time. |
If you put related data contiguously, I bet the chances are it'll be together on the disk as well. The file system is free to do whatever it wants, but most sensible file systems probably try not to scatter data around when they know it belongs together (e.g. if it's a single file, even if that single file consists of a large number of individual assets). That seems way different to me than e.g. storing each asset as an individual file, which the file system is much more likely to scatter around because it doesn't know they're related - especially if they're in different directories and could be used in a number of different situations. If you need, say, 2,000 assets for a scene (often a map), and it takes 10 ms to seek to each one, that's already 20 s in total just for seeking, let alone actually loading the assets. I don't know how many assets you need to load at once in a modern game, but I don't think 2,000 sounds way off for an AAA game. Regardless, even if it's just 200 assets, that's still 2 s just for seeking, then some time for loading, some for actually running code etc., and I don't think it's hard to see why loading times might be high if data is scattered all over the place.
Developers do all kinds of stuff that's not guaranteed to work, and I'm sure game developers do it just as much as, if not more than, others. I know that in low-level programming languages it's common to consider exactly how data is aligned in memory, how the CPU is likely to cache it, etc., even though there might not be any actual guarantees. If it works now and has seemingly little reason to ever change on the target platform, why worry about it not being guaranteed? Stuff like that makes me uneasy personally, because I like guarantees, but it's also something that happens and works, to the best of my knowledge.
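To put rough numbers on that seek math, here's a quick back-of-envelope in Python (the per-asset size and the drive figures are made-up placeholders, not measurements):

```python
# Rough back-of-envelope for seek overhead on a mechanical HDD.
# AVG_SEEK_S and HDD_THROUGHPUT_MBS are assumed figures for illustration.

AVG_SEEK_S = 0.010         # ~10 ms average seek on a mechanical HDD
HDD_THROUGHPUT_MBS = 100   # optimistic sustained read speed in MB/s

def load_time_seconds(num_assets: int, avg_asset_mb: float) -> float:
    """One seek per scattered asset, plus the raw transfer time."""
    seek = num_assets * AVG_SEEK_S
    transfer = (num_assets * avg_asset_mb) / HDD_THROUGHPUT_MBS
    return seek + transfer

for n in (200, 2000):
    total = load_time_seconds(n, avg_asset_mb=0.5)
    print(f"{n} scattered assets: ~{total:.0f} s total, "
          f"{n * AVG_SEEK_S:.0f} s of it pure seeking")
```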
I spent £215 on an 8 TB HDD today. It's an old, bulky 3.5-inch model, but that's fine.
I use the HDD to store games, and when I want to play them, I transfer them to my SSD.
| Signalstar said: My Switch issue is not so pressing but I would still like to upgrade my memory card, even before I eventually buy a Switch 2. |
If you are planning to upgrade to a Switch 2, it is better not to upgrade your Switch 1 memory card.
You need a microSD Express card for the Switch 2; "normal" microSD cards can only be used for screenshot storage.


| Zkuq said: If you put related data contiguously, I bet the chances are it'll be together on the disk as well. The file system is free to do whatever it wants, but most sensible file systems probably try not to scatter data around when they know it belongs together (e.g. if it's a single file, even if that single file consists of a large number of individual assets). That seems way different to me than e.g. storing each asset as an individual file, which the file system is much more likely to scatter around because it doesn't know they're related - especially if they're in different directories and could be used in a number of different situations. If you need, say, 2,000 assets for a scene (often a map), and it takes 10 ms to seek to each one, that's already 20 s in total just for seeking, let alone actually loading the assets. I don't know how many assets you need to load at once in a modern game, but I don't think 2,000 sounds way off for an AAA game. Regardless, even if it's just 200 assets, that's still 2 s just for seeking, then some time for loading, some for actually running code etc., and I don't think it's hard to see why loading times might be high if data is scattered all over the place. Developers do all kinds of stuff that's not guaranteed to work, and I'm sure game developers do it just as much as, if not more than, others. I know that in low-level programming languages it's common to consider exactly how data is aligned in memory, how the CPU is likely to cache it, etc., even though there might not be any actual guarantees. If it works now and has seemingly little reason to ever change on the target platform, why worry about it not being guaranteed? Stuff like that makes me uneasy personally, because I like guarantees, but it's also something that happens and works, to the best of my knowledge. |
exFAT is not that smart a file system.
Data can and does scatter over time; game installation content isn't always static... pieces of data can be created, deleted, or modified.
Say, for example, you install Game "A", which is roughly 10,000 megabytes (10 GB), then you install Game "B", which is also 10,000 megabytes, right after it.
Then you load up Game A, create a save file and start playing. That game may compile shaders into a container; that container isn't going to live inside the initial 10,000-megabyte block of installed data, it's going to sit on the drive after Game B.
Then when you load up Game B, it may decompress some data; that isn't going to live inside the Game B chunk, it's going to sit after the second Game A chunk.
Then say you uninstall Game B... suddenly you have two free spaces between two separate Game A chunks.
Let's say you then decide to install a third game... Game "C" just happens to be 20,000 megabytes; it will install 10,000 MB into that first free chunk, fill in the second free chunk, then install the rest after the second Game A chunk.
And this is how files and data become fragmented over time.
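If it helps, here's a toy first-fit allocator replaying that exact A/B/C scenario. It's a deliberately dumbed-down model (one block standing in for ~1 GB), not how exFAT actually allocates, but it shows where the holes end up:

```python
# Toy first-fit block allocator: None = free block, otherwise the owner's name.
disk = [None] * 64

def allocate(name: str, size: int) -> None:
    """First-fit: fill the earliest free blocks, wherever they happen to be."""
    placed = 0
    for i, owner in enumerate(disk):
        if owner is None:
            disk[i] = name
            placed += 1
            if placed == size:
                return
    raise RuntimeError("disk full")

def free(name: str) -> None:
    for i, owner in enumerate(disk):
        if owner == name:
            disk[i] = None

def extents(name: str) -> list[tuple[int, int]]:
    """Contiguous runs of blocks owned by `name`, as (start, length) pairs."""
    runs, start = [], None
    for i, owner in enumerate(disk + [None]):   # sentinel closes the last run
        if owner == name and start is None:
            start = i
        elif owner != name and start is not None:
            runs.append((start, i - start))
            start = None
    return runs

allocate("GameA", 10)                 # install Game A (10 "GB")
allocate("GameB", 10)                 # install Game B right after it
allocate("GameA.shaders", 2)          # Game A compiles shaders -> lands after Game B
allocate("GameB.cache", 2)            # Game B decompresses data -> lands after that
free("GameB"); free("GameB.cache")    # uninstall Game B, leaving two holes
allocate("GameC", 20)                 # Game C fills the holes, then spills past the end

for name in ("GameA", "GameA.shaders", "GameC"):
    print(name, "->", extents(name))  # GameC comes out in two separate extents
```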
Now imagine this with OS files and data constantly being created and deleted... and that is why Xbox One consoles with a mechanical hard drive tend to deteriorate in performance over time.
Again, game developers don't get to choose where game data is installed; Microsoft (and Sony and Nintendo) doesn't allow that kind of fine-grained access to storage, as it can open the system up to hacks/abuse.
Obviously this is no longer an issue on modern devices that use NAND memory.

Pemalite said:
exFAT is not that smart a file system. |
Hmm. Aren't those initial 10 GB installations, and how exactly they're spread across files, likely to be more impactful for performance than whatever gets created after the game is installed? Also, aren't those initial 10 GB likely to be split into several files by developers even on consoles? Not the whole 10 GB needs to be contiguous for efficient loading, only parts of it need to be contiguous with each other, right? For example, you can have a 1 GB file for each map, containing all the assets in the map (some of them probably duplicated from other maps), and the file system is likely to find a contiguous chunk for such a smaller file even when it couldn't fit the whole 10 GB contiguously. Sure, after enough time, the file system will look like Swiss cheese anyway, but it'll take a while before that becomes too problematic.
I have zero experience with developing console games, but personally I'd be really surprised if developers had zero control over how data gets stored. Files are a form of control anyway, so unless the console simply doesn't support game files at all, I don't see how files couldn't be used to organize data. I have no idea why the console manufacturers wouldn't allow developers to organize their data this way. Again, I'm fairly confident that each file gets stored contiguously whenever possible, and it's much easier to find contiguous space for relatively small files than for a whole install, so it's probably possible fairly often. I don't have time right now to look this up properly, but AI thinks even exFAT tries to do this, and it does seem like a fairly simple thing to do, so I don't see why it wouldn't.
I'm sure there are many complications (e.g. updates, especially changes to assets), but I'm sure there are more or less effective workarounds for them as well.
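For what it's worth, the per-map chunk file idea is simple enough to sketch. This is just an illustration of the concept; the layout, file names and asset contents are all made up, nothing console-specific:

```python
# Minimal "chunk file" sketch: pack all of a map's assets back-to-back in one
# file, behind a small index, so loading the map is one seek + one long read.
import json, struct

def write_chunk(path, assets):
    """assets: {name: bytes}. Layout: [4-byte index size][JSON index][asset blobs]."""
    index, blob, offset = {}, b"", 0
    for name, data in assets.items():
        index[name] = (offset, len(data))   # where each asset sits inside the blob
        blob += data
        offset += len(data)
    meta = json.dumps(index).encode()
    with open(path, "wb") as f:
        f.write(struct.pack("<I", len(meta)))
        f.write(meta)
        f.write(blob)

def read_asset(path, name):
    """One seek into the chunk, then a single contiguous read."""
    with open(path, "rb") as f:
        (meta_len,) = struct.unpack("<I", f.read(4))
        index = json.loads(f.read(meta_len))
        offset, size = index[name]
        f.seek(4 + meta_len + offset)
        return f.read(size)

# Two maps that both need "trash_bag": it simply gets stored twice, trading
# disk space for one contiguous read per map.
trash_bag = b"\x00" * 1024
write_chunk("map_downtown.chunk", {"trash_bag": trash_bag, "hydrant": b"\x01" * 512})
write_chunk("map_harbor.chunk", {"trash_bag": trash_bag, "crane": b"\x02" * 2048})
print(read_asset("map_harbor.chunk", "trash_bag") == trash_bag)   # True
```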


| Zkuq said: Hmm. Aren't those initial 10 GB installations, and how exactly they're spread across files, likely to be more impactful for performance than whatever gets created after the game is installed? Also, aren't those initial 10 GB likely to be split into several files by developers even on consoles? Not the whole 10 GB needs to be contiguous for efficient loading, only parts of it need to be contiguous with each other, right? For example, you can have a 1 GB file for each map, containing all the assets in the map (some of them probably duplicated from other maps), and the file system is likely to find a contiguous chunk for such a smaller file even when it couldn't fit the whole 10 GB contiguously. Sure, after enough time, the file system will look like Swiss cheese anyway, but it'll take a while before that becomes too problematic. I have zero experience with developing console games, but personally I'd be really surprised if developers had zero control over how data gets stored. Files are a form of control anyway, so unless the console simply doesn't support game files at all, I don't see how files couldn't be used to organize data. I have no idea why the console manufacturers wouldn't allow developers to organize their data this way. Again, I'm fairly confident that each file gets stored contiguously whenever possible, and it's much easier to find contiguous space for relatively small files than for a whole install, so it's probably possible fairly often. I don't have time right now to look this up properly, but AI thinks even exFAT tries to do this, and it does seem like a fairly simple thing to do, so I don't see why it wouldn't. I'm sure there are many complications (e.g. updates, especially changes to assets), but I'm sure there are more or less effective workarounds for them as well. |
Obviously I am dumbing things down to an extreme example here... because I haven't touched on sector sizes and how data is read/written to sectors, or how exFAT does searches. It doesn't rely on indexing; it does a sequential search through a directory until a match is found, using a hash record to skip entries quickly. Great for keeping CPU load down, not great for transfer rates... which is why it's such an ideal file system for NAND.
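Roughly along these lines; note the hash function below is a simplified stand-in for illustration, not exFAT's actual NameHash algorithm:

```python
def name_hash(name: str) -> int:
    """16-bit rolling hash of the up-cased name (a stand-in, not exFAT's real one)."""
    h = 0
    for ch in name.upper():
        h = ((h >> 1) | ((h << 15) & 0xFFFF)) & 0xFFFF   # rotate right by one bit
        h = (h + ord(ch)) & 0xFFFF
    return h

# A directory is just a flat list of (hash, name) entries, scanned in order.
directory = [(name_hash(n), n) for n in ("save001.dat", "shaders.bin", "level_03.pak")]

def lookup(target: str):
    th = name_hash(target)
    for h, name in directory:                 # sequential scan: no tree, no index
        if h != th:
            continue                          # cheap 16-bit compare skips most entries
        if name.upper() == target.upper():    # full compare only on a hash hit
            return name
    return None

print(lookup("SHADERS.BIN"))   # names are case-insensitive -> "shaders.bin"
```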
But yes, that 10 GB is going to be split into smaller files... and those files will be split up into smaller, sector-sized chunks.
In short, the file system is going to write the game's data to whatever free space it can find; that doesn't guarantee it will be sequential.
Developers don't get to choose where data is written, because they can't; another game may already occupy that space.
With an optical disc it's a non-issue: the developer has complete control over the disc's contents, and can optimize the placement of data or duplicate data to reduce seek times... It doesn't have to compete with other software.
Consequently, we just need to look at the performance deterioration of mechanical hard drives in the Xbox One and PlayStation 4. They do degrade over time, not because the drives have gotten slower, but because the data has become more fragmented... This has been an issue for mechanical hard drives for as long as they have existed.
PCs tried to get around this with defrag software... Later on, file systems tried to do it in the background when the computer wasn't being used, but exFAT doesn't have that kind of intelligence.
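Conceptually, a defrag pass is just compaction; here's a toy version over the same kind of block map as the earlier sketch (real defraggers move data sector by sector and update the allocation table, this just shows the idea):

```python
# "A" and "C" are blocks owned by two files; None is free space. The layout
# below is the kind of Swiss cheese left behind after installs and uninstalls.
disk = ["A"] * 4 + [None] * 4 + ["A"] * 2 + ["C"] * 3 + [None] * 2 + ["C"] * 2

def defrag(disk):
    """Rewrite every file's blocks contiguously at the front, free space at the end."""
    order = list(dict.fromkeys(b for b in disk if b is not None))  # first-seen file order
    compacted = [name for name in order for _ in range(disk.count(name))]
    return compacted + [None] * (len(disk) - len(compacted))

print(disk)           # fragmented: A A A A . . . . A A C C C . . C C
print(defrag(disk))   # compacted:  A A A A A A C C C C C . . . . . .
```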
The PlayStation 4's ext4 file system is a little more advanced than the Xbox One's exFAT and will try to optimize game installations as much as possible at install time, but it will also suffer from fragmentation and a reduction in performance over the long term, just like the Xbox One... It just happens sooner on Xbox, which sadly comes down to relying on an outdated PC file system.

AI is just going to fuck the gaming industry sideways.