Pemalite said:

In what way? If you are suggesting it somehow increases framerates...

Regarding the SSD in low-powered laptops: swap file and general browsing (the amount of temporary files that come with browsing is staggering), but I'm not sure what the context was for that. It also helps with frame stutter. Train Simulator, for example, is an old game with complex scenery updates on an 8-year-old engine; without an SSD, the stuttering while loading in the next section would be much worse.

Swap files are still used in 2020 on the latest and greatest PCs. They never stopped being used, even when you change your storage. - Currently my PC is using about 3 GB of swap.

The swap file is the first thing I disable and delete when I get a new laptop. I'd rather spend extra on RAM than sacrifice SSD disk space to a last-resort measure that Windows has never been able to manage well. Maybe it's better now; I've been disabling the swap file since Windows XP, when a blue screen of death was better than heavy swap file use. Sacrificing 6.8 GB for sleep mode (hiberfil.sys) is enough.

More power doesn't necessarily mean you need more memory; there are operations that are pretty memory-insensitive, all things considered. Things like texturing need more space, sure, but using the compute overhead to bolster an effect somewhere else may not.

Higher resolution does need more memory: native 4K means all screen buffers are 2.25x larger than at 1440p. More power is expected to deliver native 4K. Maybe not a lot more comparatively, but more than sticking to lower rendering resolutions.
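
Quick back-of-the-envelope in Python, for where the 2.25x comes from (the 4 bytes per pixel is just an assumption for one RGBA8 target; real render targets vary):

```python
# How much bigger 4K screen buffers are vs 1440p.
px_4k = 3840 * 2160        # 8,294,400 pixels
px_1440p = 2560 * 1440     # 3,686,400 pixels

print(px_4k / px_1440p)    # 2.25 -> every screen-sized buffer is 2.25x larger

bytes_per_pixel = 4        # assumption: a single RGBA8 buffer
mb = lambda px: px * bytes_per_pixel / 1024**2
print(f"{mb(px_1440p):.1f} MB vs {mb(px_4k):.1f} MB per buffer")  # ~14.1 MB vs ~31.6 MB
```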

The 7th and 8th gen drives were 60 MB/s tops on a good day, often 30-40 MB/s with random access.

Ugh, no wonder it takes 35 minutes to make a copy of GTS for patching (103 GB). That's about 50 MB/s on the PS4 Pro.
(That's read and write on the same HDD, though, so not that bad actually.)
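
The arithmetic, for what it's worth (using the 103 GB and 35 minutes quoted above; whether GB means 10^9 or 2^30 bytes barely moves the result):

```python
# Rough effective copy speed for duplicating 103 GB in ~35 minutes.
size_gb = 103
seconds = 35 * 60

effective = size_gb * 1000 / seconds          # ~49 MB/s seen by the copy
print(f"{effective:.0f} MB/s effective")

# The same HDD has to read AND write every byte, so the drive itself is
# actually moving roughly twice that much data.
print(f"~{2 * effective:.0f} MB/s of actual disk traffic")
```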

Skyrim didn't run into any trouble on the Xbox 360 despite it only using DVD+Mechanical HDD.

The 360 had 512 MB of unified RAM though, plus an extra 10 MB of eDRAM. The PS3 had 256 MB of system RAM, minus what the OS used. Skyrim, afaik, kept track of all the changes you make in the world (anything you displace gets flagged and remembered in a change file; you probably know better, having worked on it). So at first the game runs fine (fresh save), yet the further you get, the worse it gets. There must have been memory leaks as well, since I had to disable auto-save and restart the game every 15 minutes to be able to finish it on PS3. The frame rate would slowly degrade until it was unplayable.
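
A toy sketch of that kind of change tracking (entirely hypothetical names, not Bethesda's actual format, just to illustrate why the "change file" and the memory behind it only ever grow the longer you play):

```python
# Hypothetical sketch of Skyrim-style world change tracking: any object the
# player disturbs gets flagged once and its delta kept forever in the save.
world_deltas = {}  # object_id -> dict of changed fields

def record_change(object_id, field, value):
    """Flag an object as dirty and remember what changed."""
    world_deltas.setdefault(object_id, {})[field] = value

# A fresh save has an empty delta table and runs fine...
record_change("cheese_wheel_0042", "position", (120.5, 33.0, 9.1))
record_change("guard_whiterun_07", "alive", False)

# ...but after hundreds of hours every knocked-over fork is still in here,
# and on a 256 MB PS3 that table competes with everything else for RAM.
print(len(world_deltas), "tracked objects so far")
```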

Skyrim's issue was more or less a game engine issue... Despite Bethesda's claims that the Creation Engine is "ground up" brand-new, it's still based on Gamebryo, which in turn is based on NetImmerse... And one characteristic of that engine is that it is extremely powerful with scripting and cell loading, but it does mean it comes with the occasional hitching irrespective of the underlying hardware.

Would an SSD have helped? Sure. No doubt. But so would more RAM. - The 7th gen was held back by that 512 MB memory pool.

I did do some nitty-gritty work on Oblivion's engine at some point, working to get it to run on less-than-optimal hardware configurations... i.e. 128 MB of RAM, even.

The SSD could have hosted a swap file to keep track of the changes :)

There is a big jump in hardware feature sets and efficiency between Wii U and Switch. - The fact that the Switch can take every WiiU game and run it at higher resolutions and framerates with little effort is a testament to that.

Keep in mind that Breath of the Wild was the best-looking WiiU game... Whereas on the Switch, it still hasn't had its best-looking games yet.

In saying that... It's not the same kind of jump as PlayStation 3 > PlayStation 4 or Xbox 360 > Xbox One; it's a much smaller jump in overall capability, but it's there. (And I do own both the Switch and WiiU, so I can compare in real time.)

I have the WiiU as well, and played it a lot more than the Switch actually. In docked mode the Switch does look better, but indeed not really a generational leap. BotW was a bit blurry on my 1080p projector, but still great looking.

Considering that the consoles are low-end PC rigs... They tend to be the lowest common denominator.

Developers don't *need* to build their games to target low-end PCs; heck, Crysis didn't.

Next gen, target resolutions are going to be unimportant; heck, they almost are now.

Microsoft has DirectML and AMD has Radeon Image Sharpening, which should provide some impressive image reconstruction; it's going to make pixel counting redundant for a lot of games.
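
For flavour, the simplest form of the sharpening half of that idea, a textbook unsharp mask in NumPy/SciPy, not AMD's actual CAS/RIS algorithm or anything to do with DirectML:

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Generic unsharp mask: boost the difference between a pixel and its local
# average. Purely illustrative of "sharpening", not a real driver feature.
def unsharp(image: np.ndarray, amount: float = 0.5) -> np.ndarray:
    blurred = uniform_filter(image, size=3)
    return np.clip(image + amount * (image - blurred), 0.0, 1.0)

frame = np.random.rand(1080, 1920)   # stand-in for a luma channel
print(unsharp(frame).shape)          # (1080, 1920)
```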

A game like Call of Duty: Modern Warfare (2019) has a very filmic presentation due to the image reconstruction, anti-aliasing, and other post-process techniques being employed.

If a game is 1080P, I am okay with that, but it better be an impressive 1080P.

The Series X is also for 1080P owners; the super sampling of the Xbox One X on a 1080P display does bring with it a ton of advantages... Plus the higher-end console will have better texturing, shadowing, and lighting, which will pop far better on a 1080P display.
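
Roughly what that super-sampling amounts to, sketched with NumPy (a plain 2x2 box filter; the One X's actual scaler is more sophisticated than this):

```python
import numpy as np

# Toy 2x2 box-filter downsample: render at 4K, average each 2x2 block of
# pixels into one 1080p pixel. Four samples per output pixel is effectively
# 4x supersampling, which is where the extra crispness on a 1080p set comes from.
def downsample_2x(frame_4k: np.ndarray) -> np.ndarray:
    h, w, c = frame_4k.shape                      # e.g. (2160, 3840, 3)
    blocks = frame_4k.reshape(h // 2, 2, w // 2, 2, c)
    return blocks.mean(axis=(1, 3))

frame_4k = np.random.rand(2160, 3840, 3)          # stand-in for a rendered frame
frame_1080p = downsample_2x(frame_4k)
print(frame_1080p.shape)                          # (1080, 1920, 3)
```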

The Series S is more or less for those on a budget who don't really give a crap about specifications... They just wanna game and have fun.

Agreed, enough vgcharting, time for some TLoU2!