I am underwhelmed by my 4K TV and maybe X1X already has the GPU power for 9th gen.


Depending on which model you purchased, your underwhelming experience is understandable. There are a lot of budget 4K TVs out there, and they aren't apples to apples with the higher-end panels. Vizio's P-Series Quantum is a pretty darn nice display, so if that's the one you got, it should look great. I didn't get that one because its processing for fast motion and for upscaling lower resolutions was poor, and I play a lot of older content. I went from a Samsung 50" 1080p LED DLP to a 65" Q7FN. It's very bright, with great native contrast, and gaming mode looks and feels absolutely awesome.

No disappointment here, and that's with only the TV's upscaling. Native 4K games will be sweet on this TV when I go that route.

I'm stoked for MicroLED TVs, which up the ante even further for amazing HDR viewing, once they become affordable of course.




For 4K to make a difference, you need the right combination of screen size, viewing distance, etc. So it will depend.
It still seems like a premium feature; it doesn't seem like everyone who gets a next-gen Xbox will have the exact setup (distance, screen size, etc.) to take maximum advantage of the TV's resolution.
Also, with less than half the processing power needed to natively render 4K, you can render a roughly 99% accurate 4K image from a 1440p image using modern supersampling/reconstruction techniques.
So I think going full native 4K is more a marketing move than the best engineering solution.
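As a back-of-the-envelope check on the "less than half the processing power" point, the raw pixel counts do work out that way (plain arithmetic, nothing assumed beyond the standard resolutions):

```python
# Raw pixel counts for QHD (1440p) vs native UHD (4K)
qhd = 2560 * 1440   # 3,686,400 pixels
uhd = 3840 * 2160   # 8,294,400 pixels

# 1440p pushes exactly 4/9 (~44.4%) of the pixels of native 4K,
# so a reconstruction pass only has to fill in the remainder.
print(f"1440p is {qhd / uhd:.1%} of 4K's pixel count")
```

Pixel count is only a proxy for rendering cost, of course, but it's why reconstruction from 1440p is such an attractive trade.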



The best thing for you to do is play on your 4K screen for the next month or two, then hook it back up to a 1080p screen. Then you will see the difference your new 4K TV makes and be like, "wtf is this 1080p blurriness I'm seeing?"



Native 4K and checkerboard rendering are at the point of diminishing returns: it's like comparing sand at two different beaches from 6 feet away. The original XB1 and PS4 were like comparing boulders and rocks at the beach. As resolutions go higher and higher, the human eye notices less of a difference between two resolutions like native and checkerboard, especially at 6 to 8 feet or more from the screen, where most people sit when playing console games.
Sure, extra textures and graphical effects are definitely a bonus, but are we really going to see them given the parity needed between the original Xbox One and the X? So far we've seen nothing that extraordinary. Sony has managed to show more with density in its graphics than with higher resolution.



Mr Puggsly said:

So I got a good deal on a Vizio 4K TV, which...

...is your problem.

Get a Sony or a Samsung.  If you're on a tight budget, select TCLs might do the trick.




As for the debate going on:

Anything with HDR >>> no HDR. It's that simple, guys. The evolution of displays will have to be something other than a simple increase in resolution, since diminishing returns are kicking in.



Switch Friend Code : 3905-6122-2909 

I didn't notice much difference, but that's because I played on the PS4 Pro before the X.



You're playing console games with a one-size-fits-all mentality on a cheap 4K TV. It's not surprising you don't see much difference.



WoodenPints said:
The best thing for you to do is play on your 4k screen for the next month or two then hook it back up to a 1080p screen and then you will see the difference your new 4k TV makes and be like wtf is this 1080p blurriness I'm seeing.

Not really. I have my old 52" 1080p TV sitting next to my 65" 4K HDR TV. The big difference is HDR. Native 1080p content actually looks sharper on the 1080p TV: no upscaling and a smaller screen size. My 1080p projector does look kind of blurry nowadays, yet it also has the worst contrast of the three. Colors are better on the new 4K set too, but resolution-wise 1080p still looks great. 1080p60 with HDR and the Rec. 2020 color space is what you really want. Much better than 4K30 with fake HDR (yes, you, RDR2).



Mr Puggsly said:
Some of you guys are asking some technical questions, but ultimately a 4x pixel count just isn't the boost in clarity I anticipated. A common comparison: it's not like the jump from 480p to HD. The disparity from 720p to 1080p is also more apparent. I actually fiddled with various resolutions on the console just to see if I could detect a difference, but the disparity just wasn't that significant unless I was looking very close.

Either way, I do like games running north of 1080p because supersampling looks great on a 1080p screen. But I'm sure 9th-gen games will still look great at 1080p with good AA. I feel 1080p is a sweet spot because it's inherently sharp to the human eye. However, 1080p with no AA can look bad due to jaggies.

We are asking for technical information for a good reason: variables.
And to be fair... the jump from 720p to 1080p was insignificant to the point of worthless on my 32" panel in the shed, and on the old 24" television that the 32" replaced, there was not much of a difference between 480p and 720p.

The jump from 1080p to 4K on my 75" display, however, is extremely pronounced.

It's not just either/or. There are so many factors involved in determining whether a jump in resolution is worth it or not... and that has always held true.

Dante9 said:
I think there will be increasingly diminishing returns with every new jump in pixel count. There's only so much that the human eye can appreciate. Full HD to UHD already comes across as a minimal improvement to most people, it seems.

The human eye doesn't see in terms of pixels.
However... You are right, there is a fixed upper limit... But we aren't anywhere near it yet.

Give this a read. - https://wolfcrow.com/notes-by-dr-optoglass-the-resolution-of-the-human-eye/
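To put rough numbers on the size/distance argument, here is a small sketch computing horizontal pixels per degree of visual angle for a given screen and seating distance, compared against the commonly cited ~60 pixels per degree for 20/20 acuity. The screen sizes, distance, and threshold here are illustrative assumptions, not a substitute for the article above:

```python
import math

def pixels_per_degree(diagonal_in, horizontal_px, distance_ft, aspect=16/9):
    """Horizontal pixels per degree of visual angle at a given viewing distance."""
    # Screen width derived from the diagonal and aspect ratio
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    distance_in = distance_ft * 12
    # Total horizontal visual angle subtended by the screen, in degrees
    angle_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horizontal_px / angle_deg

# 65" screen viewed from 8 feet, a typical living-room setup:
# 1080p lands right around the ~60 px/deg acuity figure,
# so 4K's extra density is largely beyond what the eye resolves there.
for name, px in [("1080p", 1920), ("4K", 3840)]:
    print(f"{name}: {pixels_per_degree(65, px, 8):.1f} px/deg")
```

Move the couch closer or go bigger and the 4K advantage becomes visible again, which is exactly the "so many factors" point above.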


Sixteenvolt420 said:

The TV that I'm talking about is for me, and only me, to use. The curve definitely helps me stay centered on the screen, even more than with my previous flat TV, which was 10" smaller.

Curved screens do have caveats: they add distortion to the image, and off-center viewing can be a mixed bag.

Personally, I would rather not have it, not even for my computer monitor.

Shiken said:
Pemalite said:
Need more information on panel sizes, panel type, distance from panel, connection method and so on to really make a proper assessment on whether 4k is going to be a decent jump for you.

Xbox One X likely does have the GPU power for a low-end next-gen console, it's the other aspects where it certainly falls short though.

Pretty much this.

I feel like next gen we will see a much smaller focus on better visuals than we have been seeing from gen to gen. I mean, people were already disappointed with the jump from 7th gen to 8th gen, so the focus needs to shift to remain relevant.

I feel like games will focus more on special effects, AI, physics, and world immersion. That means a huge leap in CPU power compared to the underpowered jumps we have received in the past, and the X1X is still too severely lacking in this aspect to be even a low-end next-gen console. Like you said, while the GPU is able to get the job done, that is likely not what people are even concerned about.

It's simulation quality, i.e. the little things. We want to see small insects going about their little lives in an open world.
We want to see weather deform the terrain.
We want those small details.

But that requires substantial CPU and Memory resources... Higher resolutions should also allow those smaller details to really pop as well.
They all work together to give an overall presentation... Resolution is important, but so much more goes into the graphics pipeline that it shouldn't be the only thing we are worried about.

SvennoJ said:
The Xbox One X might have the GPU power to render at native 4K, yet not to actually render 4K games. What you get now is games made for 1080p rendered at a higher resolution. It will look a bit better, just like rendering last-gen 720p games on a 1080p screen. The assets, geometry, and textures are still last-gen. Same for games today rendered in 4K: there is no level-of-detail jump to go with the 1080p-to-4K switch like there is between last-gen and current-gen games.

Generally there are visual concessions made in order for the Xbox One X to hit 4K, or get near it anyway.

Comparing my Xbox One X to my PC with its RX 580 GPU, the differences become readily apparent... the Xbox One X will generally have fewer visual effects, but a higher resolution and, more often than not, lower framerates.
AMD's GPUs really aren't the most efficient hardware for hitting 4K to start with.

Personally, I would rather more Xbox One X games target 1440p and drive up the effects than target 4K with the games looking fairly average.

LivingMetal said:
Mr Puggsly said:

So I got a good deal on a Vizio 4K TV, which...

...is your problem.

Get a Sony or a Samsung.  If you're on a tight budget, select TCLs might do the trick.

The brand isn't actually that important.
Many manufacturers like Vizio, TCL, and so on use panels from Samsung and LG anyway.

Where things differ is in the other components that do the processing. For example, a lot of content isn't native 4K these days, so the scaling algorithms on the Vizio may be inferior to Samsung's, despite leveraging the same Samsung panel, resulting in reduced image quality.



--::{PC Gaming Master Race}::--