I am underwhelmed by my 4K TV and maybe X1X already has the GPU power for 9th gen.


Pemalite said:
Mr Puggsly said:
Some of you guys are asking technical questions, but ultimately a 4x pixel count just isn't the boost in clarity I anticipated. A common comparison is that it's not the jump from 480p to HD. The disparity from 720p to 1080p is also more apparent. I actually fiddled with various resolutions on the console just to see if I could detect a difference, but the disparity just wasn't that significant unless I was looking very close.

Either way, I do like games running north of 1080p because super sampling looks great on a 1080p screen. But I'm sure 9th gen games will still look great at 1080p with good AA. I feel 1080p is a sweet spot because it's inherently sharp to the human eye. However, 1080p with no AA can look bad due to jaggies.

We are asking for technical information for a good reason. - Variables.
And to be fair... the jump from 720P to 1080P was insignificant to the point of being worthless on my 32" panel in the shed, and on the old 24" television that the 32" replaced, there wasn't much of a difference between 480P and 720P.

The jump from 1080P to 4k on my 75" display however is extremely pronounced.

It's not just either/or. - There are so many factors involved that help determine whether a jump in resolution is worth it or not... And that has always held true.

Dante9 said:
I think there will be increasingly diminishing returns with every new jump in pixel count. There's only so much that the human eye can appreciate. Full HD to UHD already comes across as a minimal improvement to most people, it seems.

The human eye doesn't see in terms of pixels.
However... You are right, there is a fixed upper limit... But we aren't anywhere near it yet.

Give this a read. - https://wolfcrow.com/notes-by-dr-optoglass-the-resolution-of-the-human-eye/


Sixteenvolt420 said:

The TV that I'm talking about is for me and only me to use. The curve definitely helps keep me centered to the screen, even more than my previous flat TV did, and that one was 10" smaller.

Curved does have caveats; it adds distortion to the image... And off-center viewing can be a mixed bag.

Personally, I would rather not have it, not even for my computer monitor.

Shiken said:

Pretty much this.

I feel like next gen we will see a much smaller focus on better visuals than we have been seeing from gen to gen. I mean, people were already disappointed by the jump from 7th gen to 8th gen, so the focus needs to shift to remain relevant.

I feel like games will focus more on special effects, AI, physics, and world immersion. That means a huge leap in CPU power compared to the underpowered jumps we have received in the past, and the X1X is still too severely lacking in this aspect to pass for even a low-end next-gen console. Like you said, while the GPU is able to get the job done, that is likely not what people are even concerned about.

It's simulation quality, i.e. the little things. - We want to see small insects going about their little lives in an open world.
We want to see weather deform the terrain.
We want those small details.

But that requires substantial CPU and memory resources... Higher resolutions should also allow those smaller details to really pop.
They all work together to give an overall presentation... Resolution is important, but so much more goes into the graphics pipeline that it shouldn't be the only thing we are worried about.

SvennoJ said:
The X One X might have the GPU power to render at native 4K, yet not to actually render 4K games. What you get now is games made for 1080p rendered at a higher resolution. It will look a bit better, just like rendering last gen 720p games on a 1080p screen. The assets, geometry and textures are still last gen. Same for games today rendered in 4K. There is no level-of-detail jump to go with the 1080p to 4K switch as there is between last gen and current gen games.

Generally there are visual concessions made in order for the Xbox One X to hit 4k, or get near it anyway.

Comparing my Xbox One X to my PC with its RX 580 GPU, the differences become readily apparent... The Xbox One X will generally have fewer visual effects, but a higher resolution and, more often than not, lower framerates.
AMD's GPUs really aren't the most efficient hardware for hitting 4k to start with.

Personally, I would rather see more Xbox One X games target 1440P and drive up the effects than push 4k with games looking fairly average.
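For a rough sense of the headroom that buys, here's a quick pixel-count comparison (a Python sketch; these are just the standard 16:9 resolutions, not benchmark data):

# Quick pixel-count comparison for the 1440p vs 4K trade-off above.
# The percentage is simply the share of pixels shaded per frame at each
# resolution relative to native 4K, not a measured performance figure.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixel_count(name):
    w, h = RESOLUTIONS[name]
    return w * h

for name in RESOLUTIONS:
    share = pixel_count(name) / pixel_count("4K")
    print(f"{name}: {pixel_count(name):,} pixels ({share:.0%} of 4K)")

# 1080p: 2,073,600 pixels (25% of 4K)
# 1440p: 3,686,400 pixels (44% of 4K)
# 4K:    8,294,400 pixels (100% of 4K)

So 1440P shades well under half the pixels of native 4k, and that difference is exactly the budget you could spend on effects instead.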

LivingMetal said:

...is your problem.

Get a Sony or a Samsung.  If you're on a tight budget, select TCLs might do the trick.

The brand isn't actually important.
Many manufacturers like Vizio, TCL and so on use panels from Samsung and LG anyway.

Where things differ is in the other components that do the processing... For example, a lot of content isn't native 4k these days, so the scaling algorithms on the Vizio may be inferior to the Samsung's, despite leveraging the same Samsung panel, resulting in reduced image quality.

Not referring to the brand.  I'm referring to the TVs they make.  In fact, you just validated my statement since these tv manufacturers differ in how the image is processed.  Thank you.



Intrinsic said:
I have said this a million times... Resolution is not as important as most will have people believe.

Simple truth is that if you're sitting anywhere from 8ft to 10ft from a TV, it starts getting pretty hard to point out the difference "at a glance" between 4k and 1080p. Things like HDR are actually more obvious than "higher resolutions".

And I am with you on the next gen thing; I honestly would prefer some sort of 4kCB solution that allows devs to run most games at 60fps, or at 30fps with a lot more glitz and glamour. I am still in the camp that feels the resources spent on pushing higher resolutions are nothing but a waste. I mean, it's nice to have, but I feel there are things more important than the res of a game.

Lol, I sit 10 feet away from my 55 inch and a quality h264 rip, even at standard resolution, is good. Is it as good as 1080p? No, but a good video codec can make SD resolution watchable without you being pissed off.


LivingMetal said:

Not referring to the brand.  I'm referring to the TVs they make.  In fact, you just validated my statement since these tv manufacturers differ in how the image is processed.  Thank you.

I wasn't really disagreeing or agreeing, just expanding upon the issue of brands and panels.



--::{PC Gaming Master Race}::--

m0ney said:

Ehh, at 1080p some games look like everything is covered in oil; at 2k or 4k or 8k it will look the same. Until games are made with an internal resolution of 4k in mind, the differences will be negligible.

It will always be a trade-off between resolution and potential graphics/performance though. Halo 5 on X1X, for example, pushed for dynamic 4K, but I would prefer a lower resolution and a boost to other settings.

Temporal AA is becoming popular due to the low overhead but does make games look a bit soft, more so when below 1080p. That has been a bigger problem on the base X1.

I say 1080p with good AA is fine. Halo: MCC, for example, is 1080p with no AA and the jaggies make it look a bit rough, even if it's a vast improvement over the original.



Recently Completed
Crackdown 3
for X1S/X1X (4/5) - Infinity Blade III - for iPad 4 (3/5) - Infinity Blade II - for iPad 4 (4/5) - Infinity Blade - for iPad 4 (4/5) - Wolfenstein: The Old Blood for X1 (3/5) - Assassin's Creed: Origins for X1 (3/5) - Uncharted: Lost Legacy for PS4 (4/5) - EA UFC 3 for X1 (4/5) - Doom for X1 (4/5) - Titanfall 2 for X1 (4/5) - Super Mario 3D World for Wii U (4/5) - South Park: The Stick of Truth for X1 BC (4/5) - Call of Duty: WWII for X1 (4/5) -Wolfenstein II for X1 - (4/5) - Dead or Alive: Dimensions for 3DS (4/5) - Marvel vs Capcom: Infinite for X1 (3/5) - Halo Wars 2 for X1/PC (4/5) - Halo Wars: DE for X1 (4/5) - Tekken 7 for X1 (4/5) - Injustice 2 for X1 (4/5) - Yakuza 5 for PS3 (3/5) - Battlefield 1 (Campaign) for X1 (3/5) - Assassin's Creed: Syndicate for X1 (4/5) - Call of Duty: Infinite Warfare for X1 (4/5) - Call of Duty: MW Remastered for X1 (4/5) - Donkey Kong Country Returns for 3DS (4/5) - Forza Horizon 3 for X1 (5/5)

Pemalite said:

The human eye doesn't see in terms of pixels.
However... You are right, there is a fixed upper limit... But we aren't anywhere near it yet.

Give this a read. - https://wolfcrow.com/notes-by-dr-optoglass-the-resolution-of-the-human-eye/

That article overcomplicates things, but the gist of it is that 0.4 arc minutes is the absolute maximum resolution in the fovea.

0.4 arc minutes translates to 150 pixels per degree. So a 4K tv with 3840 horizontal pixels can take up 25.6 degrees of your field of view if you have perfect vision. It's overkill though, and you only get that 0.4 arc minute resolution at maximum contrast. 20/20 vision corresponds to 60 pixels per degree.

Diminishing returns in full effect

Test subjects were asked to tell which picture looked more real. As you can see, it turns into guesswork above 100 pixels per degree.
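If you want to sanity-check those figures against your own setup, here's a rough pixels-per-degree calculator (a Python sketch assuming a flat 16:9 screen and averaging over the full width; the 55" / 10 ft inputs are just an example, not anything from this thread):

import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_in, aspect=(16, 9)):
    """Horizontal pixels per degree of visual angle for a flat screen."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)        # physical screen width
    fov_deg = math.degrees(2 * math.atan(width_in / (2 * distance_in)))
    return horizontal_px / fov_deg

distance = 10 * 12  # 10 ft, in inches
for label, px in [("1080p", 1920), ("4K", 3840)]:
    ppd = pixels_per_degree(55, px, distance)
    print(f'55" {label} at 10 ft: ~{ppd:.0f} pixels per degree')

# 55" 1080p at 10 ft: ~85 pixels per degree  (already past the 60 ppd of 20/20 vision)
# 55" 4K at 10 ft:    ~170 pixels per degree (past even the ~150 ppd foveal ceiling)

Which lines up with what people are reporting in this thread: at typical couch distances even 1080p is already close to the point where extra pixels stop registering.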




Screenshot said:
You're playing console games with a one-size-fits-all mentality on a cheap 4k tv. Not surprising you won't see much difference.

Not sure what that means. My 1080p 40 inch was much cheaper, by the way, and a less notable brand. My Vizio is 50 inches and has quadruple the pixel count; I'm saying the difference is relatively minor unless you're looking closely for it.

But the objectively noticeable improvement is what the X1X does on any screen. The higher native resolution gets rid of jaggies, and the increased graphics and performance are also significant. Hence, I didn't need a TV upgrade per se, because the X1X was already fixing the flaws I could see on a 1080p screen. The extra sharpness I'm getting from the new TV is less significant.




I wonder if the difference is more noticeable when the assets themselves are 4K. Which games actually use 4K assets as well as 4K resolution? I would believe those games should show a much bigger improvement in image quality.



Machiavellian said:
I wonder if the difference is more noticeable when the assets themselves are 4K. Which games actually use 4K assets as well as 4K resolution? I would believe those games should show a much bigger improvement in image quality.

That's how I tend to see it. 4k isn't all that worthwhile to me, not until we start seeing 4k assets (not upressed) for everything. I've gone to 4k from 1080p and 1440p, and it doesn't make the 1080p and sub-1080p assets look any better.




Machiavellian said:
I wonder if the difference is more noticeable when the assets themselves are 4K. Which games actually use 4K assets as well as 4K resolution? I would believe those games should show a much bigger improvement in image quality.

It's simpler and more complicated at the same time. The simple part is that if you have 4K x 4K texture assets available (for every texture in the game), getting them on-screen only requires a patch to add those files to the local filesystem, whether that's PS4 / XB1 or PC. No changes to levels etc. need to be made.

The complicated part is that compromises need to be made to run at 4K output without blowing hardware budgets. Deferred rendering uses 3-5 (or 8 if you're Infamous Second Son, at a whopping 41 bytes / pixel) screen-sized render targets, often 4 bytes a piece, read multiple times per frame. Those get heavy quickly in terms of processing time and memory bandwidth. Blow your bandwidth budget and you don't make frame time, or you have to compromise on anti-aliasing. IMO, anti-aliasing is the next frontier in console graphics.

I've read about an alternative to deferred shading called the visibility buffer. It's still deferred, but designed for high-resolution displays. It uses 2 render targets (12 bytes total), quite a bit smaller than most deferred pipelines (16-20+ bytes / pixel), storing triangle references but no triangle attributes (color / normal / specular / roughness / etc.). MGSV uses 4 render targets including depth, but it has to compromise and cram normals into 8 bits / channel when you really want 10 or 11. You get artifacts from that which AA can't totally resolve. It sucks, and the visibility buffer approach has none of those problems. Even better, you get MSAA compatibility for free.

With 4K being a thing, I think (and hope) that the visibility buffer wins out next gen. We'd get better visuals at a smaller performance cost, which leaves more room for anti-aliasing and newer / better rendering techniques.

See this for the gritty details on the Visibility Buffer.

http://www.conffx.com/Visibility_Buffer_GDCE.pdf
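For a ballpark of what those bytes-per-pixel figures mean, here's a rough footprint sketch (Python; the 20 / 41 / 12 bytes-per-pixel numbers are the ones quoted above, and the "one read plus one write per frame" bandwidth assumption is a deliberate simplification; real pipelines touch these targets more often):

# Rough G-buffer footprint comparison at 1080p vs 4K, using the
# bytes-per-pixel figures from the post above. Bandwidth assumes a single
# write + read of every byte per frame at 60 fps, which is optimistic.

RES = {"1080p": 1920 * 1080, "4K": 3840 * 2160}

LAYOUTS = {
    "deferred (typical)":    20,  # bytes per pixel
    "deferred (Second Son)": 41,
    "visibility buffer":     12,
}

for res_name, px in RES.items():
    print(f"--- {res_name} ---")
    for layout, bpp in LAYOUTS.items():
        mib_per_frame = px * bpp / 2**20
        gib_per_sec = px * bpp * 2 * 60 / 2**30  # read + write, 60 fps
        print(f"{layout:24s} {mib_per_frame:6.1f} MiB/frame, ~{gib_per_sec:4.1f} GiB/s at 60 fps")

The point isn't the exact numbers, it's the scaling: going from 1080p to 4K quadruples every one of those targets, which is why the fatter deferred layouts start eating into a console's bandwidth budget long before the visibility buffer does.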



Currently (Re-)Playing: Starcraft 2: Legacy of the Void Multiplayer, The Legend of Zelda: A Link to the Past

Currently Watching: The Shield, Steins;Gate, Narcos

Mr Puggsly said:
Screenshot said:
You're playing console games with a one-size-fits-all mentality on a cheap 4k tv. Not surprising you won't see much difference.

Not sure what that means. My 1080p 40 inch was much cheaper, by the way, and a less notable brand. My Vizio is 50 inches and has quadruple the pixel count; I'm saying the difference is relatively minor unless you're looking closely for it.

But the objectively noticeable improvement is what the X1X does on any screen. The higher native resolution gets rid of jaggies, and the increased graphics and performance are also significant. Hence, I didn't need a TV upgrade per se, because the X1X was already fixing the flaws I could see on a 1080p screen. The extra sharpness I'm getting from the new TV is less significant.

Point is, neither the X1X nor the PRO is powerful enough to showcase 4k properly. We are still at least 3+ GPU generations away, so the current games don't showcase it either, especially draw distance in open world games. Comparing 1080p to 4k in console games is rather pointless due to the current hardware limitations. I game on PC and upgraded from a 37" 1080p to a 40" 4k TV 2 years ago. It is more noticeable on PC since you can crank up the graphics beyond default in many games. However, you should see the difference watching YouTube videos or movies in 4k, or in 8k downsampled to 4k, which looks even better if you can stream 8k.