Beloved series that trace back to the 8- or 16-bit era will always face one major problem: the evolution of technology.
Due to the technological limitations of the time, an 8-bit game (we’re talking about 1986) had to be “primitive”: simple graphics, simple stories (outside of RPGs, story wasn’t really a thing back then), no voice acting, and no real cutscenes (at best a few still images).
16-bit games improved on everything 8-bit games could do: some voice acting, some cutscenes, and generally more possibilities for storytelling.
With the PS1/Saturn/N64 generation came the 3D era. Its technological advancements meant that everything could become bigger—storytelling, voice acting, and cinematic cutscenes included.
The Dreamcast/PS2/Xbox/GameCube generation expanded on that and pushed everything even further.
The PS3/Xbox 360/Wii generation introduced HD graphics, making games more impressive and more cinematic. Many titles aimed to feel almost like movies.
The PS4/Xbox One/Wii U generation went even further in its cinematic direction.
The current generation—PS5, Xbox Series X/S, and the Switch and Switch 2—continues that evolution with further technological and graphical improvements. Games and cutscenes now look even more film-like.
I didn't mention the handheld generations, but you can slot the Game Boy, GBA, and DS into the above eras however you prefer.
What I want to say is this: as game technology evolves, game series evolve as well. If a series refuses to evolve, it becomes niche and ends up in the indie-download-only sector, where you can’t sell it for $60, $70, or even $80.
If game technology had been much more advanced when the first Metroid was developed, it would surely have included more storytelling, cutscenes, and voice acting from the start.
Fans often don’t acknowledge the evolution of their medium and want their beloved series to stay the same. But that’s neither realistic nor advisable.