People dumb enough to spend money on 4K are the only people convinced it's the future. It's not. It's not going to be the standard any time soon. Maybe in five years.

Are you excited for 4K?
| Option | Votes | % |
| --- | --- | --- |
| Yes | 58 | 29.15% |
| No | 123 | 61.81% |
| See Results | 16 | 8.04% |
| Total | 197 | |

There's f-all content for 4K, so upgrading now is pointless. TV broadcasts are still only 1080i, and 4K games are a ways off yet.
TV companies have moved on to 4K simply because they need to convince you that your old TV is shit and needs to be upgraded. Remember how hard they were pimping 3D a couple of years back?
| JazzB1987 said: There is simply nothing to watch in 4K. Old movies also have no information to fill a 4K movie file/disc, because no one thought we would ever need something like that. |
You're very wrong about that.
Most movies and TV shows in the pre-digital era were shot on 35mm, some on 65mm. A lot of TV shows are still shot on 35mm.
- 35mm film can resolve detail up to 3.2K, and since it's used anamorphically (the full frame is used to store a widescreen movie), that's close to 3200x2400 for any aspect ratio. That even exceeds the vertical resolution of 2160p.
- 35mm film has full color info. Every consumer-level video you see now is 4:2:0 chroma subsampled, which means only a quarter of the color info is stored. The luma is stored at 1920x1080, color only at 960x540. 4K video will probably use the same chroma subsampling scheme, but at least you'd get 1920x1080 of color information.
- 35mm film has better color definition than 8-bit Rec.709. 4K video will hopefully support Rec.2020 and 10- or even 12-bit deep color.
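To make the chroma numbers above concrete, here's a tiny Python sketch (the resolutions are just the 1080p and UHD figures from this post) showing why 4:2:0 stores only a quarter of the color samples:

```python
# 4:2:0 chroma subsampling: each chroma plane is stored at half the luma
# resolution in both axes, so it carries 1/4 of the samples.

def chroma_plane(width, height):
    """Return the per-plane chroma resolution for a 4:2:0 encoded frame."""
    return width // 2, height // 2

for name, (w, h) in {"1080p": (1920, 1080), "4K UHD": (3840, 2160)}.items():
    cw, ch = chroma_plane(w, h)
    ratio = (cw * ch) / (w * h)
    print(f"{name}: luma {w}x{h}, chroma {cw}x{ch} per plane "
          f"({ratio:.0%} of luma samples)")
```

Running it prints 960x540 chroma for 1080p and 1920x1080 chroma for 4K UHD, which is the "at least you'd get 1920x1080 color information" point above.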
You're right about 2000s movies, though. They were a step back from 35mm film. Movies from roughly 2000-2010 were all mastered in 2K. They won't gain any resolution benefit, but they will benefit from better color definition.
So yes, the Back to the Future movies and Star Wars Episodes I, IV, V and VI will benefit; Star Wars Episodes II and III and LotR, not so much.
A lot of current movies are shot in 5K and mastered in 4K. However, the CGI is often still rendered at 2K, so it's still a bit of a mixed bag. Of course, better compression and higher bandwidth will still help with picture stability, fewer artifacts, less smoothing-out of moving objects and overall better definition in scenes with lots of action and particle effects.
You can get most of those benefits on a 1080p display as well. A 1080p OLED display with a 4K video source will look astounding. Downsampling can yield great results: just look at Samsara, scanned from 65mm negatives in 8K, mastered in 4K, downsampled to 1080p on Blu-ray. It makes regular movies look soft and dull in comparison. That's one of the first movies I want to see in 4K.
Ironically, it's the damn 4K players that were absent at CES 2014. The BDA has promised to finalize the specs before the end of the year, but I had really hoped to see them this CES. Sony already has plenty of 4K masters ready, but for now you can only get them via streaming 40GB files. Sure, it's 4K, but it still uses h.264 for now, and the bandwidth isn't much better than Blu-ray. It looks great in slow scenes; action scenes still suffer.
Anyway, that doesn't mean a 4K TV is useless at the moment, as long as you get one with HDMI 2.0 and preferably h.265 (HEVC) streaming ready. You'll be able to enjoy 4K Netflix soon (if you have the bandwidth). Plus, 720p-to-4K upscaling looks much better than 720p-to-1080p. Upscaling Blu-ray yields good results too, as the chroma-subsampled signal can be interpolated more precisely. And 4K TVs are a lot cheaper than the other alternatives for improved picture quality.
Your choices for better picture quality:
$11K LG 55" 55EA9800 1080p OLED, THX certified, best picture quality for sure.
$5k Panasonic Viera 65" TCP65ZT60 600hz 1080p Plasma, THX certified, 2nd best in picture quality.
Now, OLED isn't going anywhere at the moment due to production difficulties, plasma is being discontinued, and meanwhile 4K is dropping to $999 for 50".
http://ca.ign.com/articles/2014/01/08/ces-vizio-4k-tvs-will-start-at-999-for-the-50-inch-model
At $2600 for the 70" version, it's a very good alternative to spending $5k or more on 1080p.
Or you can keep your eye on this
http://ces.cnet.com/8301-35303_1-57617001/panasonics-prototype-4k-led-boasts-plasma-like-picture/
That should be able to replace the ZT60 with the benefit of 4K. (No price announced yet though)
| TomaTito said: You are comparing apples with oranges. It is obviously beneficial for screens that are centimeters away from the viewer; but TVs are normally meters away, and only 55"+ screens would start to show benefits. |
That chart is based on SMPTE 30. It sets the line at 60 pixels per degree before recommending a higher resolution. Now, it's true that 20/20 vision corresponds to 30 cycles per degree, the resolution needed to read text. But human visual acuity easily goes up to 40 or 50 cycles per degree and beyond. The point is not the distance from which you can still make out individual pixels, but where you start to see a smooth, lifelike picture.
Here's another chart, based on the absolute limits of human vision, for pictures indistinguishable from looking out a window.


Those TVs are just for watching TV at that quality; I can't see current consoles displaying that, not by a long shot.
| That chart is based on SMPTE 30. It sets the line at 60 pixels per degree before recommending a higher resolution. Now, it's true that 20/20 vision corresponds to 30 cycles per degree, the resolution needed to read text. But human visual acuity easily goes up to 40 or 50 cycles per degree and beyond. The point is not the distance from which you can still make out individual pixels, but where you start to see a smooth, lifelike picture. |
I doubt those results. I have played at 720p and 1080p on my 40" TV from a distance of around 7 feet, and the difference is very minimal. Actually, a lot of people have mistaken 720p for 1080p and vice versa. I'm sure that 99% of people won't see any difference at all between 4K and 1080p on a sub-40" screen at viewing distances over 6 feet, when most can't even see the difference between 720p and 1080p on the same screen. Even for the other 1%, the difference is extremely minimal.
Anyway, even 1080p is overhyped, and 4K even more so. Sure, it might be the "future", but it will never be a must-have. 1080p will be just fine even 10 years from now.
I am not going 4K until the next new gen. Even now, games still get released below 1080p.

There's just no reason to buy 4K yet, because there's no content. The situation is just like 3DTV, only even worse:
- Current 4K TVs are still crap, because to properly get 4K content to your TV you need HDMI 2.0, which was introduced just three months ago. I don't know of any current 4K model that already uses the new standard; all the ones I checked were using HDMI 1.4, which is technically capable of carrying 4K content, but only at up to 30 frames per second. So people who buy a 4K TV right now will buy a model that will be technically outdated in a few months.
- Blu-ray was able to handle 3DTV, so people can simply walk into a store and buy a disc with 3D content. But Blu-ray cannot handle anything beyond 1080p, so there isn't even a disc standard yet that could be used to deliver 4K content. And even if such a standard were introduced today, it would take years to reach a somewhat reasonable adoption rate.
Video game consoles will not use 4K resolution anytime soon. It's just not worth it: they'd need four times the graphics processing power compared to 1080p, and yet most people would hardly see any difference in their personal TV setup.
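The "four times" figure follows directly from the pixel counts; this is a back-of-the-envelope check, since actual GPU cost depends on the workload:

```python
# 4K UHD has exactly four times the pixels of 1080p, so per-frame shading
# and fill-rate work scale by roughly that factor, all else being equal.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_4k = 3840 * 2160      # 8,294,400
print(pixels_4k / pixels_1080p)  # -> 4.0
```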
Remember that at a time when Full HD TVs were already available and affordable, Microsoft still decided to go for higher graphics quality at 720p rather than lower graphics quality at 1080p. And as much as I like to make fun of that, it was a wise decision.
In practice, it will be like 3DTV: in a few years, everyone will be buying TVs with 4K resolution, even those who aren't really interested in the technology, just like people nowadays are pretty much all buying 3D-capable TVs. Once the price difference to models without these features is negligible, there's just no reason for buyers to miss out on them.
| Untamoi said: Anyway, even 1080p is overhyped and 4K even more so. Sure, it might be the "future" but it will never be a must-have. 1080p is just fine even 10 years from now. |
720p at 40" from 7ft is below the 20/20 vision threshold, so the difference should be clear, especially if one source is upscaled to 1080p and the other is native 1080p.
The study was done with double-blind tests, asking people which of two pictures feels more real. It's not a matter of whether you can make out the pixels; it's the point at which aliasing disappears altogether. I can see a clear difference in sharpness between watching 1080p at 60 pixels per degree (92" 1080p projector screen at 12ft) and at 100 pixels per degree (52" 1080p at just over 11ft).
Anyway, 360p YouTube is also still fine; that doesn't mean an IMAX experience in your home isn't fun to have. It's all about field of view, with the resolution to back it up. THX recommends a 40-degree field of view for the most immersive cinematic experience, which works out to a viewing distance of 1.2x the screen diagonal. That equates to 4ft from a 40" screen. 4K looks a lot nicer at that distance; with 1080p you're only at 46 pixels per degree.
(So yes if you want to sit at 7ft you need a 70" tv to get the full THX experience, at 12ft a 120" projector screen)
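These pixels-per-degree figures can be checked with a small Python sketch. It assumes 16:9 panels and measures pixels per degree at the screen centre (one degree of visual angle projected onto the panel), which reproduces the numbers quoted in these posts:

```python
import math

# Pixels per degree at the screen centre, assuming a 16:9 panel.
# One degree of visual angle covers distance * tan(1 deg) inches of screen.

def pixels_per_degree(diagonal_in, h_pixels, distance_in):
    width_in = diagonal_in * 16 / math.hypot(16, 9)  # 16:9 screen width
    px_per_inch = h_pixels / width_in
    return distance_in * math.tan(math.radians(1)) * px_per_inch

print(round(pixels_per_degree(92, 1920, 12 * 12)))  # 92" 1080p at 12ft  -> 60
print(round(pixels_per_degree(52, 1920, 135)))      # 52" 1080p at ~11ft -> 100
print(round(pixels_per_degree(40, 1920, 4 * 12)))   # 40" 1080p at 4ft   -> 46
```

The 4ft/40" case is the THX 40-degree setup (1.2 x 40" = 48" = 4ft of viewing distance), and it lands at the 46 pixels per degree mentioned above.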
Just as with the Oculus Rift, you have to see it first to be convinced of the benefits.