
Forums - Gaming - Difference between 60hz and 120hz?

platformmaster918 said:
so you're absolutely sure that there won't be a difference between 60hz and 120hz for gaming because that's obviously the main reason I'm buying it.  

If you want a great monitor for PC gaming, I would not get a budget 120-240Hz 40-46 inch LCD/LED. Just get a 2560x1440 Catleap 27 inch IPS from eBay for $300. Better color accuracy, deeper blacks, higher resolution for crisper textures, and it has received positive feedback from many satisfied PC gamers. OTOH, if you want a great TV for movies and sports, forget LED/LCD and get a Plasma (Panasonic, or at least a Samsung). Better color accuracy, viewing angles, response times for fast-moving sports/games, deeper blacks, higher real-world contrast ratio, etc. than 97% of LED/LCDs. I'd rather get a 42 inch Panasonic Plasma and spend $300 on a Catleap than get a crappy Sony 120-240Hz LED/LCD that sucks at both gaming and videophile IQ for sports/movies. Sony and other low-end LCD/LED TVs have especially poor image quality. To get decent IQ on an LED/LCD, you have to step up to more expensive models. Essentially, to even come close to Plasma in IQ you are going to need to spend way more money on a like-performing LED, which means you can basically get a 27 inch Catleap and a Panny plasma for the same price. Also, most 120-240Hz TVs only accept a real 60Hz signal and just interpolate the refresh rate internally. You won't get an actual 120-240Hz refresh rate from a PC on a Sony 120-240Hz LED. It's not the same as getting something like an Asus VG278H monitor specifically made for 120Hz on the PC.



Hynad said:
slowmo said:
Hynad said:
slowmo said:
I think the difference between a 120Hz TV and 60Hz TVs is more to do with the beefed-up scaler they use than any actual big wow factor from the frequency, as most sources will never exceed 60Hz except for 3D content and PCs.


The main benefit I see with 120Hz sets is that unlike 60Hz sets, 120 is a multiple of 24.

So the movie industry's standard of 24 images per second fits comfortably in there, meaning the TV doesn't need to do 3:2 pulldown to compensate, which introduces judder that is especially noticeable during camera pans. 24 fps fits exactly 5 times into 120, so the TV plays each frame 5 times, making for a smoother experience than on 60Hz sets.

That's really only Blu-ray sources though; every other source would usually be 60Hz. I just don't see the point of creating extra virtual frames personally. My projector supports 24-120Hz, but that was bought mainly for 3D rather than any perception of smoother performance.


The TV creates virtual frames only when the interpolation option is on. Otherwise, the TV won't create frames; if it's a 120Hz set it will play the same frame 5 times, or 10 times if it's a 240Hz set. A 60Hz set, by contrast, will use 3:2 pulldown and judder will be introduced. In fact, for as long as we've been using 60Hz TVs (since before anyone here was born, really), we've been used to the imperfect motion of 24 fps for movies. That's why when you watch a movie like The Hobbit at 48 fps, it looks quite weird.
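The cadence arithmetic above can be sketched out in a few lines (a quick illustration of the math, not anything from an actual TV's firmware):

```python
# How many refresh cycles each 24 fps film frame occupies at a given
# refresh rate. When the refresh rate is an exact multiple of 24, every
# frame repeats the same number of times (smooth); otherwise the repeat
# counts alternate, which is exactly the pulldown judder being discussed.

def repeat_pattern(refresh_hz, source_fps=24, frames=4):
    """Return how many refreshes each of the first few source frames gets."""
    pattern = []
    shown = 0  # refresh cycles emitted so far
    for i in range(1, frames + 1):
        # frame i ends at time i / source_fps; count whole refresh
        # periods that have elapsed by then (truncating division)
        total = int(i * refresh_hz / source_fps)
        pattern.append(total - shown)
        shown = total
    return pattern

print(repeat_pattern(120))  # [5, 5, 5, 5] -> even cadence, smooth
print(repeat_pattern(60))   # [2, 3, 2, 3] -> uneven 3:2/2:3 cadence, judder
```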


I found The Hobbit looked jerky in panning shots to my eyes. The extra detail was nice, but I'm not sure they shouldn't have added some motion blur or gone for 60 fps. It took half the film to stop noticing the technology, which is annoying.



A smooth frame rate is most important, be it 60Hz, 120Hz, 240Hz, 480Hz, 600Hz, etc.

2:3 pulldown on 60Hz screens is okay for most people, since it has been the standard forever and is what people are used to.

Interpolated frames suck, period.
Watch out for any displays that use this technology (usually found on 120Hz, 240Hz, and 480Hz sets). The problem, besides visual artifacts, lag, the soap opera effect, etc., is that the artificial frames mixed in often give an old-school cheap-CGI look to whatever is in focus, which sucks... so yeah, avoid.

That said, I don't know which displays have neither 2:3 pulldown nor interpolation...



Hynad said:
The TV creates virtual frames only when the interpolation option is on. Otherwise, the TV won't create frames; if it's a 120Hz set it will play the same frame 5 times, or 10 times if it's a 240Hz set. A 60Hz set, by contrast, will use 3:2 pulldown and judder will be introduced. In fact, for as long as we've been using 60Hz TVs (since before anyone here was born, really), we've been used to the imperfect motion of 24 fps for movies. That's why when you watch a movie like The Hobbit at 48 fps, it looks quite weird.


That doesn't make any sense for LCD/LED TVs. They don't paint an image over and over like a CRT or DLP; the backlight is always on. Playing the same frame 5 times is identical to changing the LCD state only once. 60Hz TVs are perfectly capable of timing the LCD state updates every 1/24th of a second. 3:2 pulldown has not been a problem for a while.
3:2 pulldown doesn't introduce judder; judder is there because 24 fps is pretty low for pans. It is more noticeable in bright environments than in a dark cinema. The darker it is, the lower the frame rate you need to perceive smooth motion. If you want to lessen the impact of judder without affecting picture quality, turn off the lights and turn down the brightness.

3:2 pulldown is a messy technique that uses interlaced fields to convert 24 fps film to 30 interlaced frames (60 fields) per second; it doesn't even produce a true 60Hz progressive signal.

Glad to be rid of that mess :)
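The "messy" field cadence being described can be sketched as a toy (assuming the classic film-to-NTSC 3:2 pattern; the frame labels are just illustrative):

```python
# Classic 3:2 pulldown: 4 film frames (A, B, C, D) become 10 interlaced
# fields, i.e. 5 interlaced frames, so 24 film fps -> 30 frames / 60
# fields per second. Some output frames pair fields from two different
# film frames, which is what makes the result look messy.

def pulldown_32(frames):
    """Expand film frames into interlaced frames of 2 fields, 3:2 cadence.
    Use a multiple of 4 input frames for a clean, even field count."""
    fields = []
    for i, f in enumerate(frames):
        # alternate 3 fields, then 2 fields, per film frame
        count = 3 if i % 2 == 0 else 2
        fields.extend([f] * count)
    # group consecutive fields into interlaced frames of 2 fields each
    return [tuple(fields[j:j + 2]) for j in range(0, len(fields), 2)]

print(pulldown_32(["A", "B", "C", "D"]))
# [('A', 'A'), ('A', 'B'), ('B', 'C'), ('C', 'C'), ('D', 'D')]
# ('A', 'B') and ('B', 'C') are the "dirty" frames mixing two film frames.
```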

Fine Motion creates new frames: for 24 fps movies, instead of A-B-C you get A A2 A3 A4 A5 B B2 B3 B4 B5 C. Just as we finally got movies as they were made, we're going to change the source material even more radically.

For 30 fps games you get A B C -> A A2 A3 A4 B B2 B3 B4 C. Obviously this can only be done after the next frame has been received, adding a full 33ms before any signal processing can even start; in fact, some sets have been reported to add over 200ms of lag with enhanced Fine Motion turned on.
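The latency arithmetic is straightforward (a rough lower-bound illustration; the 200ms figure above comes from user reports, not from this calculation):

```python
# Minimum added delay from frame interpolation: the TV cannot compute an
# in-between frame until the *next* real frame has arrived, so it must
# buffer at least one full frame period before any processing even starts.

def min_interpolation_delay_ms(source_fps, buffered_frames=1):
    """Lower bound on added latency, excluding all processing time."""
    frame_period_ms = 1000.0 / source_fps
    return buffered_frames * frame_period_ms

print(round(min_interpolation_delay_ms(30), 1))  # 33.3 ms for a 30 fps game
print(round(min_interpolation_delay_ms(24), 1))  # 41.7 ms for a 24 fps film
```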

Here is an interesting article on why The Hobbit at 48 fps looks fake:
http://movieline.com/2012/12/14/hobbit-high-frame-rate-science-48-frames-per-second/

But scientists and researchers in the field of consciousness perception say that the human brain perceives reality at a rate somewhere between 24 fps and 48 fps — 40 conscious moments per second, to be more exact — and exceeding the limit of the brain’s speed of cognition beyond the sweet spot that connotes realism is where Jackson & Co. get into trouble.

They’re not taking into account what’s called The Uncanny Valley in psychology. The Uncanny Valley says that, statistically, if you map out a consumer’s reaction to something they’re seeing, if they’re seeing something artificial and it starts to approach something looking real, they begin to inherently psychologically reject it.

120Hz enhanced Fine Motion will solve the judder problem, at the cost of making everything look soap-opera fake.



CGI-Quality said:
brendude13 said:

I meant console gaming only; I included PC monitors as an advantage for 120Hz because PCs can actually output 120Hz. Get a 120Hz PC monitor for gaming and you're good to go. 120Hz PC monitors carry a hefty price premium, but looking at the rigs you've built, I guess that's no problem for you.

My Samsung was only $299.99 when I got it. I try not to overspend, believe it or not.

That's not bad at all, the cheapest I could find on eBuyer was £320.



BlueFalcon said:
[...]

it's strictly for console gaming and blurays.




Get Your Portable ID! Lord of Ratchet and Clank

Duke of Playstation Plus

Warden of Platformers

platformmaster918 said:

it's strictly for console gaming and blurays.

Same as me then. I would say avoid 120Hz completely; the money could go to something far better.



brendude13 said:
platformmaster918 said:

it's strictly for console gaming and blurays.

Same as me then. I would say avoid 120Hz completely; the money could go to something far better.

yeah that's basically the vibe I'm getting, and I can get a 60Hz 40 inch 1080p for under $400, so that'll do





platformmaster918 said:
it's strictly for console gaming and blurays.

You should get a Plasma then. It's better for both of those uses. Everyone I know who switched to Plasma for their PS360 consoles is never going back to LED/LCD. Every single one of them thought LED > Plasma until they stopped buying into the myths and did their research after I told them. The IQ is superior in nearly every regard on a Plasma for console games. It's pretty simple why: more accurate colors, deeper blacks, higher real-world contrast ratio, faster response time, and wider viewing angles. Blu-rays? There isn't even a contest there, because deep blacks are crucial for movies. The only caveat is if you are going to put the TV in a very brightly lit room, where LED would work better. Otherwise, Plasma > LED in IQ for games, sports, or movies.

A 120-240Hz LED is going to be absolutely worthless for console games, as PS360 games don't render beyond 60 fps, while most run at 20-30 fps. Even if you could get 75-80 fps, it would be screen-tearing galore without VSync. For console games, pixel response time matters way more, because your console can't push 120 fps like a PC can. Response time is the time taken for a pixel to change value and back again (effectively, in an LCD/LED display, how quickly the pixels shift to allow different amounts of light through). In simple terms, the lower this number the better, as it means the pixels shift quicker. Displaying fast-paced motion (like games) will therefore show fewer visual artefacts than on a display with a higher pixel response time (which basically means any plasma > any LED for pixel response time).

Since a plasma cell is an ionized gas, the pixel response time is nearly instantaneous (0.002ms), much closer to a CRT-style gaming experience. Most LED/LCDs have a response time of 6-8ms, with newer ones coming in at 2-3ms (though even these numbers are often just marketing by LED/LCD makers). This is where a plasma crushes an LED: no motion blur (like a CRT), no 'screen door'/pixel-moving effect, and no pixel bleeding/poor backlight uniformity that many LED/LCDs have. Plasma is actually ideal for console gaming. Bonus: plasmas even cost less at similar sizes.
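To put those response-time numbers in perspective, here's a toy comparison against the frame period (the millisecond figures are the ones quoted above, not measurements):

```python
# What fraction of each frame a pixel spends mid-transition, given a
# panel's quoted response time. At 60 fps a frame lasts ~16.7ms, so an
# 8ms panel spends almost half of every frame smearing between states.

def transition_fraction(response_ms, fps=60):
    frame_ms = 1000.0 / fps
    return min(response_ms / frame_ms, 1.0)

for name, resp in [("plasma", 0.002), ("fast LCD", 3.0), ("typical LCD", 8.0)]:
    pct = 100 * transition_fraction(resp)
    print(f"{name}: {pct:.1f}% of each 60 fps frame spent transitioning")
# plasma: 0.0%, fast LCD: 18.0%, typical LCD: 48.0%
```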

If your budget is only $400 for a 40 inch 1080p TV, try to stretch it to $600 and you can get a substantially higher-quality TV.

 

Panasonic TC-P50U50

 

http://reviews.cnet.com/flat-panel-tvs/panasonic-tc-p50u50/4505-6482_7-35149658.html



BlueFalcon said:
[...]

thanks for all the info, I'll check out plasmas and see if I can find one a bit smaller (hopefully cheaper and a better fit for my small apartment)



