|Mr Puggsly said:
Some of you guys are asking technical questions, but ultimately a 4x pixel count just isn't the boost in clarity I anticipated. A common comparison: it's not the jump from 480p to HD. The disparity from 720p to 1080p is also more apparent. I actually fiddled with various resolutions on the console just to see if I could detect a difference, but the disparity just wasn't that significant unless I looked very closely.
Either way, I do like games running north of 1080p because supersampling looks great on a 1080p screen. But I'm sure 9th-gen games will still look great at 1080p with good AA. I feel 1080p is a sweet spot because it's inherently sharp to the human eye. However, 1080p with no AA can look bad due to jaggies.
We are asking for technical information for a good reason. - Variables.
And to be fair... The jump from 720P to 1080P was insignificant to the point of being worthless on my 32" panel in the shed. And on the old 24" television that the 32" replaced, there wasn't much of a difference between 480P and 720P either.
The jump from 1080P to 4k on my 75" display however is extremely pronounced.
It's not just either/or. - There are so many factors involved that help determine whether a jump in resolution is worth it or not... And that has always held true.
I think there will be increasingly diminishing returns with every new jump in pixel count. There's only so much that the human eye can appreciate. Full HD to UHD already comes across as a minimal improvement to most people, it seems.
The human eye doesn't see in terms of pixels.
However... You are right, there is a fixed upper limit... But we aren't anywhere near it yet.
Give this a read. - https://wolfcrow.com/notes-by-dr-optoglass-the-resolution-of-the-human-eye/
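As a rough back-of-the-envelope check (a sketch only, assuming the commonly cited ~1 arcminute visual acuity figure; the function name and example numbers are mine, not from the linked article), you can estimate whether a given panel's pixels are even resolvable at your viewing distance:

```python
import math

def pixels_resolvable(diagonal_in, res_w, res_h, distance_in,
                      acuity_arcmin=1.0):
    """Can a viewer with ~1 arcminute acuity resolve individual
    pixels on this panel at this distance? Rough estimate only."""
    # Physical panel width from its diagonal and aspect ratio.
    aspect = res_w / res_h
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_in = width_in / res_w                     # pixel pitch in inches
    # Angle one pixel subtends at the eye, in arcminutes.
    pixel_arcmin = math.degrees(math.atan2(pixel_in, distance_in)) * 60
    return pixel_arcmin > acuity_arcmin

# 32" 1080p panel viewed from ~6 feet: pixels sit below the acuity
# limit, so a 4K upgrade buys little there.
print(pixels_resolvable(32, 1920, 1080, 72))   # False
# 75" 1080p panel at the same distance: pixels are clearly resolvable,
# so the jump to 4K is visible.
print(pixels_resolvable(75, 1920, 1080, 72))   # True
```

By that yardstick a 32" 1080p panel at couch distance is already below the acuity limit while a 75" panel at the same distance is well above it, which lines up with the experiences described above.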
The TV I'm talking about is for me, and only me, to use. The curve definitely helps keep me centered to the screen, even more than my previous flat TV did, and that one was 10" smaller.
Curved panels do have caveats: they add distortion to the image... And off-center viewing can be a mixed bag.
Personally, I would rather not have it, not even for my computer monitor.
Need more information on panel sizes, panel type, distance from panel, connection method and so on to really make a proper assessment on whether 4k is going to be a decent jump for you.
Xbox One X likely does have the GPU power for a low-end next-gen console, it's the other aspects where it certainly falls short though.
Pretty much this.
I feel like next gen we will see a much smaller focus on better visuals than we have been seeing from gen to gen. I mean, people were already disappointed with the 7th-gen to 8th-gen jump, so the focus needs a shift to remain relevant.
I feel like games will focus more on special effects, AI, physics, and world immersion. This means a huge leap in CPU capability compared to the underpowered jumps we have received in the past, and the X1X is still severely lacking in this aspect, even for a low-end next-gen console. Like you said, while the GPU is able to get the job done, that is likely not what people are even concerned about.
It's simulation quality, I.E. The little things. - We want to see small insects going about their little lives in an open world.
We want to see weather deform the terrain.
We want those small details.
But that requires substantial CPU and Memory resources... Higher resolutions should also allow those smaller details to really pop as well.
They all work together to give an overall presentation... Resolution is important, but so much more goes into the graphics pipeline that it shouldn't be the only thing we are worried about.
The Xbox One X might have the GPU power to render at native 4K, yet not to actually render 4K games. What you get now is games made for 1080p rendered at a higher resolution. It will look a bit better, just like rendering last-gen 720p games on a 1080p screen. The assets, geometry, and textures are still last-gen. Same for games today rendered in 4K: there is no level-of-detail jump to go with the 1080p-to-4K switch like there is between last-gen and current-gen games.
Generally there are visual concessions made in order for the Xbox One X to hit 4K, or get near it anyway.
Comparing my Xbox One X to my PC with its RX 580 GPU, the differences become readily apparent... The Xbox One X will generally have fewer visual effects, but a higher resolution and, more often than not, lower framerates.
AMD's GPUs really aren't the most efficient hardware at hitting 4K to start with.
Personally I would rather more Xbox One X games target 1440P and drive up the effects than 4k with games looking fairly average.
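The supersampling benefit mentioned earlier works the same way whether the target is 1440P or 4K on a 1080p screen: the extra rendered samples get averaged down to display resolution, softening jaggies. A toy sketch of a 2x box-filter downsample (purely illustrative; no console's actual resolve pass is this simple):

```python
def downsample_2x(img):
    """Box-filter a 2x-supersampled image down to display resolution:
    each output pixel is the average of a 2x2 block of rendered samples."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) // 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

# A hard diagonal edge rendered at 2x resolution (0 = dark, 255 = bright).
hi_res = [
    [255, 255, 255,   0],
    [255, 255,   0,   0],
    [255,   0,   0,   0],
    [  0,   0,   0,   0],
]
print(downsample_2x(hi_res))   # [[255, 63], [63, 0]]
```

The hard 255/0 edge in the 2x render becomes a ramp through intermediate values in the display-sized output, which is exactly the edge smoothing you'd otherwise need AA for.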
|Mr Puggsly said:
So I got a good deal on a Vizio 4K TV, which...
...is your problem.
Get a Sony or a Samsung. If you're on a tight budget, select TCLs might do the trick.
The brand isn't actually important.
Many manufacturers like Vizio, TCL and so on... Use panels from Samsung and LG anyway.
Where things differ is in the other components that do the processing... For example, a lot of content isn't native 4K these days, so the scaling algorithms on the Vizio may be inferior to Samsung's, despite leveraging the same Samsung panel, resulting in reduced image quality.
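To illustrate why the scaler matters (a deliberately crude 1-D sketch; real TV scalers are far more sophisticated): upscaling the same source with nearest-neighbour repetition versus linear interpolation produces visibly different results, and it's the processing chain, not the panel, that decides which you get.

```python
def upscale_nearest(row, factor):
    """Nearest-neighbour: each source sample is simply repeated."""
    return [v for v in row for _ in range(factor)]

def upscale_linear(row, factor):
    """Linear interpolation: blend between neighbouring source samples
    (integer math so the output stays easy to read)."""
    out = []
    for i in range(len(row) - 1):
        a, b = row[i], row[i + 1]
        for step in range(factor):
            out.append(a + (b - a) * step // factor)
    out.extend([row[-1]] * factor)   # hold the last sample
    return out

row = [0, 100, 0]                    # one bright sample between two dark ones
print(upscale_nearest(row, 2))       # [0, 0, 100, 100, 0, 0] -> blocky
print(upscale_linear(row, 2))        # [0, 50, 100, 50, 0, 0] -> smooth ramp
```

Same panel, same input signal, two different-looking outputs; scale that idea up to a full 1080p-to-4K image and the quality gap between two TVs using the same panel becomes easy to believe.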