
Do you prefer your TV over a cinema screen?

 

I prefer...

Cinema: 9 (30.00%)
Home TV: 17 (56.67%)
Home projector: 0 (0.00%)
Any screen, even my phone: 1 (3.33%)
Imax: 2 (6.67%)
4DX: 1 (3.33%)

Total: 30
LegitHyperbole said:
Pajderman said:

Yep, I know I'm not in the majority when it comes to this preference, hence why cinemas and movie makers generally don't increase the frame rates they present. Nor should they, since most prefer it that way. Because of this, I like watching movies at home more than at the cinema.

One of the most important aspects for me when buying a new TV is the frame insertion and motion handling tech it provides. I want a smooth, clear, stutter-free picture without washed-out details around moving objects or ugly artifacts. The best option would be if the source already had higher fps, but since so few releases go for it I have to put my trust in the TV manufacturer's ability.

Yep, they'd really need to film at 60fps. I'd hold out hope though; I reckon it will change as the younger generation grows up with smooth frame rates and never experiences the soap opera effect.

The issue is with the lighting and the lack of motion blur. Shorter exposure ruins the classic 'film look'. Movies have been playing at higher frame rates on TV all through the boomer and gen X generations: 3:2 pulldown to fit 24fps into 60Hz with interlacing. So it's not a dislike of higher frame rates; it's filming at higher frame rates that is the problem and has to mature.
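For reference, the cadence works roughly like this (a quick illustrative sketch in Python, not broadcast-exact):

```python
# Rough sketch of 3:2 pulldown: four 24fps film frames are spread over ten
# interlaced fields (60 fields per second), alternating 2 and 3 fields per frame.
def three_two_pulldown(frames):
    fields = []
    for i, frame in enumerate(frames):
        repeats = 2 if i % 2 == 0 else 3  # 2 fields, then 3 fields, repeating
        for _ in range(repeats):
            parity = "top" if len(fields) % 2 == 0 else "bottom"
            fields.append((frame, parity))
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# 4 frames -> 10 fields, which is how 24 frames/s becomes 60 fields/s
```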

Studio lights having to be brighter to compensate for the shorter exposure makes it look different, and less motion blur makes it look more like TV. So in your case, better frame insertion is the way to go to reduce stutter.

It would be the best of both worlds if the Blu-ray master (or streaming) offered two tracks: one at 24fps and one with AI-enhanced frame insertion at 60fps. Let a supercomputer analyze the whole movie (from the full-bitrate source, without compression artifacts) and create the intermediate frames, then re-render it all at 60Hz.
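As a toy illustration of the "create intermediate frames" idea (nothing like a real AI interpolator, just a naive linear blend of neighbouring frames, assuming NumPy image arrays):

```python
import numpy as np

def naive_frame_insertion(frames, inserts_per_gap=1):
    """Blend each pair of neighbouring frames to create in-between frames.

    frames: list of HxWx3 uint8 arrays. A real 24fps -> 60fps conversion needs
    a 2.5x cadence and motion-compensated (or AI) interpolation; a plain blend
    just ghosts moving objects, which is why TVs do something much smarter.
    """
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for k in range(1, inserts_per_gap + 1):
            t = k / (inserts_per_gap + 1)
            blend = (1 - t) * a.astype(np.float32) + t * b.astype(np.float32)
            out.append(blend.astype(np.uint8))
    out.append(frames[-1])
    return out
```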



SvennoJ said:
LegitHyperbole said:

Yep, they'd really need to film at 60fps. I'd hold out hope though; I reckon it will change as the younger generation grows up with smooth frame rates and never experiences the soap opera effect.

The issue is with the lighting and the lack of motion blur. Shorter exposure ruins the classic 'film look'. Movies have been playing at higher frame rates on TV all through the boomer and gen X generations: 3:2 pulldown to fit 24fps into 60Hz with interlacing. So it's not a dislike of higher frame rates; it's filming at higher frame rates that is the problem and has to mature.

Studio lights having to be brighter to compensate for the shorter exposure makes it look different, and less motion blur makes it look more like TV. So in your case, better frame insertion is the way to go to reduce stutter.

It would be the best of both worlds if the Blu-ray master (or streaming) offered two tracks: one at 24fps and one with AI-enhanced frame insertion at 60fps. Let a supercomputer analyze the whole movie (from the full-bitrate source, without compression artifacts) and create the intermediate frames, then re-render it all at 60Hz.

I'd reckon it'd be much easier to film at 60 and make it look like 24fps in post-processing than the reverse of this.
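Something in that direction could be as simple as averaging groups of 60fps frames down to 24fps, which also bakes the motion blur back in (a rough sketch, my own guess at the approach):

```python
import numpy as np

def downconvert_60_to_24(frames_60):
    """Average each ~2.5-frame window of 60fps footage into one 24fps frame.

    Summing several short exposures approximates the longer exposure (and the
    motion blur) of a native 24fps shot; real pipelines are more sophisticated.
    """
    step = 60 / 24  # 2.5 source frames per output frame
    out = []
    for i in range(int(len(frames_60) / step)):
        start, end = int(i * step), int((i + 1) * step)
        window = np.stack([f.astype(np.float32) for f in frames_60[start:end]])
        out.append(window.mean(axis=0).astype(np.uint8))
    return out
```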



SvennoJ said:

The issue is with the lighting and the lack of motion blur. Shorter exposure ruins the classic 'film look'. Movies have been playing at higher frame rates on TV all through the boomer and gen X generations: 3:2 pulldown to fit 24fps into 60Hz with interlacing. So it's not a dislike of higher frame rates; it's filming at higher frame rates that is the problem and has to mature.

Studio lights having to be brighter to compensate for the shorter exposure makes it look different, and less motion blur makes it look more like TV. So in your case, better frame insertion is the way to go to reduce stutter.

It would be the best of both worlds if the Blu-ray master (or streaming) offered two tracks: one at 24fps and one with AI-enhanced frame insertion at 60fps. Let a supercomputer analyze the whole movie (from the full-bitrate source, without compression artifacts) and create the intermediate frames, then re-render it all at 60Hz.

I think that type of frame insertion is on the way, even done live by analyzing a few seconds of the movie at a time. Samsung's latest high-end TVs have a neural engine, and I think with time that will be standard. But mastering Blu-rays with options for different frame rates would be great. Dunno how much of an audience there is, though. Picture purists - the customers who buy 4K media - generally dislike the high frame rate look and despise any frame insertion algorithm in use.

Good point regarding the exposure time for the picture. As someone with no real experience in photography, I tend to think of a picture as an instant that simply exists, but capturing that moment takes time to collect the light.



Pajderman said:
SvennoJ said:

The issue is with the lighting and the lack of motion blur. Shorter exposure ruins the classic 'film look'. Movies have been playing at higher frame rates on TV all through the boomer and gen X generations: 3:2 pulldown to fit 24fps into 60Hz with interlacing. So it's not a dislike of higher frame rates; it's filming at higher frame rates that is the problem and has to mature.

Studio lights having to be brighter to compensate for the shorter exposure makes it look different, and less motion blur makes it look more like TV. So in your case, better frame insertion is the way to go to reduce stutter.

It would be the best of both worlds if the Blu-ray master (or streaming) offered two tracks: one at 24fps and one with AI-enhanced frame insertion at 60fps. Let a supercomputer analyze the whole movie (from the full-bitrate source, without compression artifacts) and create the intermediate frames, then re-render it all at 60Hz.

I think that type of frame insertion is on the way, even done live by analyzing a few seconds of the movie at a time. Samsung's latest high-end TVs have a neural engine, and I think with time that will be standard. But mastering Blu-rays with options for different frame rates would be great. Dunno how much of an audience there is, though. Picture purists - the customers who buy 4K media - generally dislike the high frame rate look and despise any frame insertion algorithm in use.

Good point regarding the exposure time for the picture. As someone with no real experience in photography, I tend to think of a picture as an instant that simply exists, but capturing that moment takes time to collect the light.

You can see the difference in Saving Private Ryan. The scene where they storm the beach was filmed with a short exposure time to make it stand out: a more dramatic look due to the lack of motion blur, making it feel more like a documentary. That works fine in bright daylight, yet with artificial lighting, needing brighter lights is what made The Hobbit's HFR look 'weird'. Less light is easier to control, and 24fps filming has 100 years of experience behind it to make it look good.

Cameras getting better at capturing low light, plus more experience filming at higher frame rates, will also make HFR movies look better. Motion blur can always be added later for the classic film effect.

https://cinemashock.org/2012/07/30/45-degree-shutter-in-saving-private-ryan/

You can also see several explosions, and Janusz came up with the idea of shooting with the shutter open to 45 degrees or 90 degrees, which completely negated any blurring. Often, when you see an explosion with a 180-degree shutter it can be a thing of beauty, but a 45-degree shutter looks very frightening. [STEVEN SPIELBERG]

A 45-degree shutter means exposing for 1/8 of the frame time, i.e. about 5.2 ms (1/192 s) per frame, as opposed to the standard 180-degree shutter's 20.8 ms (1/48 s) per frame.

For HFR (48 fps) a 180-degree shutter would give you 10.4 ms per frame. I guess it's technically possible nowadays to go with a full 360-degree shutter (no shutter at all) and expose each frame for 20.8 ms, or close enough to allow time to reset the sensor for the next frame.

(The degrees come from mechanical rotary shutters: basically a rotating plate with a gap that passes in front of the lens. The bigger the gap, the longer the exposure.)
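Putting those numbers into a formula (exposure = shutter angle / 360 × frame time), a quick check:

```python
def exposure_ms(fps, shutter_angle_deg):
    """Exposure per frame in milliseconds for a given rotary shutter angle."""
    return (shutter_angle_deg / 360.0) * (1000.0 / fps)

print(exposure_ms(24, 180))  # ~20.8 ms, the standard film look
print(exposure_ms(24, 45))   # ~5.2 ms, the Saving Private Ryan beach scenes
print(exposure_ms(48, 180))  # ~10.4 ms, HFR with a 180-degree shutter
print(exposure_ms(48, 360))  # ~20.8 ms, a full 360-degree shutter at 48 fps
```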



I've noticed a few high-end TVs are supporting 144Hz, which is pretty cool. Not sure how necessary it is. I find the difference between 60fps and 90fps incredibly stark; 90 to 120fps isn't as big a jump. Having said that, gaming at 144fps would likely increase the accuracy of controls.



i7-13700k

Vengeance 32 gb

RTX 4090 Ventus 3x E OC

Switch OLED

Chrkeller said:

I've noticed a few high-end TVs are supporting 144Hz, which is pretty cool. Not sure how necessary it is. I find the difference between 60fps and 90fps incredibly stark; 90 to 120fps isn't as big a jump. Having said that, gaming at 144fps would likely increase the accuracy of controls.

Playing FH4 at 144Hz was very smooth! However, dropping down to 72fps for Halo Infinite still felt very responsive. That seems to be a sweet spot: better than 60, more stable than aiming higher. I had to drop to 48fps for the outdoor sections, and there the difference between 48 and 72 was noticeable. Still better than 30, but not quite as smooth.

Anyway, a 144Hz display gives you more options for capped frame rates.
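Quick check of which caps divide evenly into each refresh rate (my own throwaway snippet):

```python
def even_caps(refresh_hz, min_fps=24):
    """Frame-rate caps (>= min_fps) that divide evenly into the refresh rate."""
    return [refresh_hz // d for d in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % d == 0]

print(even_caps(144))  # [144, 72, 48, 36, 24] -> the 72 and 48 caps above
print(even_caps(60))   # [60, 30] -> far fewer clean options above 24fps
```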