
What res are you at now?

1080p                     301   61.30%
720p                       34    6.92%
sub HD                      8    1.63%
1440p                      26    5.30%
4k                         85   17.31%
5k                          3    0.61%
8k, somehow.               13    2.65%
Pixels, I require none.     5    1.02%
None.                       2    0.41%
Other.                     14    2.85%

Total: 491
vivster said:
caffeinade said:

Yeah, I was talking about consumer level.
I should have clarified though, my bad.

Consumers too because higher numbers = better. That's why there are 800Hz TVs.

Trust me, they will easily sell 8K TVs in the near future which will only run 4K content, because there isn't any 8K content yet. After that they'll sell 16K TVs that will maybe have a bit of 8K content.

Fair point, there is a reason that mobile games make so much money.
People sure do love their increasing numbers.



Lenny93 said:
vivster said:

Consumers too because higher numbers = better. That's why there are 800Hz TVs.

Trust me, they will easily sell 8K TVs in the near future which will only run 4K content, because there isn't any 8K content yet. After that they'll sell 16K TVs that will maybe have a bit of 8K content.

The human eye cannot perceive extra detail past a certain resolution (8K at a 90-degree field of view), so higher numbers do not always equal better.

It's not quite as clear cut as that. Double-blind tests have shown that humans can still tell which is the better display above 100 pixels per degree, but accuracy drops off rapidly after that, becoming guesswork between 150 and 200 pixels per degree. 20/20 vision is said to be comparable to 60 pixels per degree, but that's a lower limit at which you can pass reading tests and see all the jaggies.


Tests with 3 static images show the probability of picking the sharper one dropping to 50% at about 100 cycles, or 200 pixels per degree.
Moving images could betray jaggies more easily, though. But as you can see, at about 85 pixels per degree (8K at 90 degrees) 90% of test subjects can still tell which image looks clearer. For a 70-inch 8K screen you get 60 pixels per degree at a 2.3 ft viewing distance and 200 pixels per degree at 7.6 ft.
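Those viewing-distance figures can be checked with a bit of trigonometry. A rough sketch (the 16:9 aspect ratio and the function name are my assumptions, not from the post):

```python
import math

def pixels_per_degree(diag_in, h_pixels, distance_in, aspect=16/9):
    """Approximate horizontal pixels per degree of visual angle.

    diag_in: screen diagonal in inches
    h_pixels: horizontal resolution
    distance_in: viewing distance in inches
    """
    # Screen width derived from the diagonal and aspect ratio.
    width_in = diag_in * aspect / math.sqrt(1 + aspect**2)
    pixel_in = width_in / h_pixels
    # Visual angle subtended by one pixel, in degrees.
    deg_per_pixel = math.degrees(2 * math.atan(pixel_in / (2 * distance_in)))
    return 1 / deg_per_pixel

# 70-inch 8K (7680 horizontal pixels) screen:
print(pixels_per_degree(70, 7680, 2.3 * 12))  # roughly 60 ppd at 2.3 ft
print(pixels_per_degree(70, 7680, 7.6 * 12))  # roughly 200 ppd at 7.6 ft
```

Both quoted numbers come out of the same formula, which supports the post's figures.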

16K is only relevant for VR. Of course, the human eye can only resolve that much detail within a 2-degree field of view; outside 10 degrees you hardly see any detail. With proper eye tracking and foveated rendering, giving the illusion of a full 16K VR display won't be any more taxing than rendering 4K is currently. You do first need dual 16K panels small enough to fit in glasses, and the bandwidth to drive dual 16K at 120Hz. (Since only a small part of the image has actual 16K detail, smart compression will be key.) That way you can have 102 pixels per degree at a 150-degree field of view per eye (the human limit).
And sure, some will argue 32K glasses will be better, and in theory you should still be able to tell the difference.


Outside a 10-degree field of view your eyes resolve less than 20% of the resolution. So your 16K image can basically drop to a 2K render 15 degrees away from where you're focusing. Of course, current headsets are only 1K-ish per eye horizontally, but foveated rendering can already begin to help.
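A back-of-the-envelope pixel budget shows why foveated rendering makes a "16K" image tractable. All parameters below are illustrative assumptions taken from the figures in this thread (102 ppd acuity, 150-degree FOV, ~20% resolution in the periphery), not specs of any real headset:

```python
# Rough pixel-budget comparison: uniform full-resolution rendering vs
# foveated rendering (full resolution only in a small central window).
# All constants are illustrative assumptions, not real hardware specs.

FOV_DEG = 150        # per-eye field of view, degrees
PPD_FOVEA = 102      # full acuity inside the fovea, pixels per degree
PPD_PERIPHERY = 20   # assumed ~20% of full resolution outside the fovea
FOVEA_DEG = 20       # window rendered at full resolution (fovea + margin)

def pixels(fov_deg, ppd):
    # Treat the region as square: (degrees * ppd) pixels per side.
    side = fov_deg * ppd
    return side * side

full = pixels(FOV_DEG, PPD_FOVEA)
foveated = pixels(FOVEA_DEG, PPD_FOVEA) + pixels(FOV_DEG, PPD_PERIPHERY)

print(f"uniform render:  {full / 1e6:.0f} MP per eye")    # ~234 MP
print(f"foveated render: {foveated / 1e6:.0f} MP per eye") # ~13 MP
print(f"saving: {full / foveated:.0f}x fewer pixels")
```

Under these assumptions the foveated budget lands near 13 megapixels per eye, in the same ballpark as a 4K frame (~8 MP), which is roughly the claim made above.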



vivster said:
caffeinade said:

I do find it hard to believe that people will need 32k, and I strongly advocate for 8k.
Though that could just be ignorance.

Higher resolutions will always be strived for in science. Ask any astronomer and they'll tell you that 128k isn't even close to enough. We need gigapixels!

Not in screens. Cameras, yes, but you don't need a 128k screen. You can just zoom in (that way, the object you are looking at won't be so tiny that you can't see it).



Teeqoz said:
vivster said:

Higher resolutions will always be strived for in science. Ask any astronomer and they'll tell you that 128k isn't even close to enough. We need gigapixels!

Not in screens. Cameras, yes, but you don't need a 128k screen. You can just zoom in (that way, the object you are looking at won't be so tiny that you can't see it).

Not on my watch.



Lenny93 said:
vivster said:

And now tell me that you believe that the average customer cares.

Why would manufacturers invest in a resolution with no perceivable gain? And why would consumers purchase TVs with a resolution that leads to no perceivable gains? I think they will invest in ultra HDR (if it's a significant improvement) and better image quality. There's a reason we never had HD audio CDs: humans couldn't tell the difference between them and regular CD quality.

They'll invest in high pixel counts because the big number helps them sell new TVs to naïve people (the vast majority, alas) who would be perfectly fine with the 4K (or even 1080p) set they already own, and it's a relatively modest investment for them, as display tech still has a lot of miniaturisation headroom, unlike integrated circuits, which are getting close to silicon's physical limits. They'll also invest in HDR, but they'll always boast about pixel count.
The sad news, though, is that such tech will have little content on broadcast TV, mainly premium material, as bandwidth is a limited resource. TV networks will offer high resolution and HDR only on their main clear channels, while the others, like now, will keep receiving more limited bandwidth. So people will watch interpolated 4K on them, further upscaled to 8K by their TV set if they own an 8K one; and forget about HDR there, they'll find some cheap way to approximate it. (BTW, MPEG and other lossy formats already use very cheap colour encoding, with fewer bits for red than for green and even fewer for blue, and they encode colour info at a lower resolution than brightness.)
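The point about colour being stored at lower resolution than brightness can be made concrete with 4:2:0 chroma subsampling, the layout used by most MPEG-family codecs (luma plane at full resolution, each chroma plane at quarter resolution). A minimal sketch of the raw per-frame sizes, before any actual compression:

```python
# Bytes per uncompressed 8-bit frame: full RGB vs 4:2:0 (YUV420),
# where the Y (brightness) plane is full resolution and the two
# chroma planes are each half-width and half-height.

def frame_bytes_rgb(w, h):
    return w * h * 3                        # three full-resolution planes

def frame_bytes_yuv420(w, h):
    return w * h + 2 * (w // 2) * (h // 2)  # Y plane + subsampled U, V

w, h = 3840, 2160                           # a 4K frame
print(frame_bytes_rgb(w, h))                # 24883200 bytes
print(frame_bytes_yuv420(w, h))             # 12441600 bytes
```

Halving the raw data before the codec even starts compressing is exactly the kind of "cheap colour encoding" the post describes.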



Stwike him, Centuwion. Stwike him vewy wuffly! (Pontius Pilate, "Life of Brian")
A fart without stink is like a sky without stars.
TGS, Third Grade Shooter: brand new genre invented by Kevin Butler exclusively for Natal WiiToo Kinect. PEW! PEW-PEW-PEW!