Barozi said:
KratosLives said:

No one will want to go back to 720p after getting used to 1080.

and why would I want to play this 1080p grand epic on Series X when I was just getting used to (near) 4k resolution with Xbox One X?

You just debunked your own point.

There is more to an output image than resolution.

We aren't even at the point where we can have photo-realistic images even at 720P, let alone 2160P.

Mr Puggsly said:
KratosLives said:

No one will want to go back to 720p after getting used to 1080.

720p with image-reconstruction tech isn't the same as the raw 720p of the 7th gen.

Also, dynamic resolutions help keep the average resolution higher.

Many 7th-gen titles weren't even 720P... Halo 3 and Call of Duty being prime examples.
Plus many games opted for cheap morphological anti-aliasing rather than MSAA or the like... So image quality was often terrible.

There is a big difference between 1080P and 2160P, but how visible it is depends on a myriad of factors: individual visual acuity, display size, display technology, display resolution, seating distance from the display, even the ambient lighting.
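One way to put numbers on those factors is pixels per degree of visual field. A minimal sketch below; the 55-inch 16:9 panel viewed from about 8 feet is an illustrative assumption, not a claim about any particular setup:

```python
import math

def pixels_per_degree(h_pixels, diag_inches, distance_inches, aspect=16/9):
    # screen width derived from the diagonal of a 16:9 panel
    width = diag_inches * aspect / math.hypot(aspect, 1)
    # horizontal angle the screen subtends at the viewer, in degrees
    fov = 2 * math.degrees(math.atan(width / (2 * distance_inches)))
    return h_pixels / fov

# assumed setup: 55" TV viewed from ~8 feet (96 inches)
ppd_1080 = pixels_per_degree(1920, 55, 96)
ppd_2160 = pixels_per_degree(3840, 55, 96)
```

At that distance 1080p already lands in the rough neighbourhood of normal visual acuity, which is why sitting closer (or using a bigger screen) makes the 2160P difference far more obvious.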

Conina said:

Graphics can much better scale up and down than non-graphic stuff.

The bottleneck of the Xbox One X is clearly the Jaguar CPU.

Why do you think that only a few Xbox One X games have a performance mode with 60 fps and lower resolution/eye candy instead of 30 fps 4K?

A console is the sum of its parts.
You can have the fastest CPU in the world, but if you are GPU- or memory-limited, you still are not going to achieve 60fps.

Besides, the Super Nintendo had 60fps games, and that console only had a single-core, 3.58MHz processor... versus the Xbox 360 with a 3-core, 3,200MHz hyperthreaded processor, or the Xbox One with an 8-core, 1,750MHz processor.
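As a back-of-envelope illustration of why clock speed alone doesn't dictate frame rate, here is the raw cycle budget each of those CPUs gets per 60fps frame. It deliberately ignores IPC, memory stalls, and GPU load, which is exactly the point: the budget differs by orders of magnitude, yet all three machines ran 60fps games.

```python
def cycles_per_frame(clock_hz, cores, fps):
    # raw CPU cycles available per frame; says nothing about how much
    # work each cycle does, or whether the GPU/memory is the bottleneck
    return clock_hz * cores // fps

snes  = cycles_per_frame(3_580_000, 1, 60)         # ~60 thousand cycles
xb360 = cycles_per_frame(3_200_000_000, 3, 60)     # 160 million cycles
xbone = cycles_per_frame(1_750_000_000, 8, 60)     # ~233 million cycles
```

Hitting 60fps is about fitting the frame's work into 16.7ms, whatever the hardware; developers on the SNES simply did vastly less work per frame.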

Xbox One X games are slim on 60fps pickings because that is what developers opted for. GPUs are also highly programmable these days and capable of offloading many tasks that used to be done on the CPU.

haxxiy said:

Surely not, but the point was that there is no way to even begin to compare the functionality of 12 GB of RAM fed by a 50 MB/s HDD versus 8 or 10 GB fed by a 3 GB/s SSD. One can fill the RAM in a few seconds. The other, I'm not even sure how developers accomplish the miracle of making games appear on screen after only half a minute or so to begin with.

Many games also didn't stream (especially simpler titles!); they stored everything in memory and simply had chunky load points... For those titles it makes no difference whether it's an optical disc with 5MB/s transfer rates or an SSD with 5,500MB/s.

An SSD is only a tiny fraction of the speed of RAM (and has much higher latency!), so it cannot replace RAM; otherwise we wouldn't bother with RAM at all. The performance cost of ignoring 400-500GB/s of RAM bandwidth for only 5.5GB/s of SSD bandwidth would be catastrophic.
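Putting numbers on both points above, using only the figures quoted in this thread, a minimal sketch:

```python
def seconds_to_fill(capacity_gb, bandwidth_gb_per_s):
    """Time to fill a memory pool at a given sustained transfer rate."""
    return capacity_gb / bandwidth_gb_per_s

# last gen: 12 GB of RAM fed by a ~50 MB/s HDD
hdd_fill = seconds_to_fill(12, 0.05)   # 240 seconds
# this gen: 10 GB fed by a ~3 GB/s SSD
ssd_fill = seconds_to_fill(10, 3)      # ~3.3 seconds

# but the SSD still isn't RAM: ~450 GB/s of RAM bandwidth vs 5.5 GB/s of SSD
bandwidth_gap = 450 / 5.5              # roughly an 80x gap
```

So the SSD collapses load/fill times by two orders of magnitude, while still being nowhere near a substitute for RAM bandwidth.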

--::{PC Gaming Master Race}::--