Ssenkahdavic said:
You are confusing Hertz (cycles per second) with FPS (frames per second). While sometimes they are interchangeable, for the most part they are not. One is how many times your screen "redraws itself" per second (Hz), and the other is how many different frames are drawn per second. When they are equal (or FPS > Hz), a much smoother picture is born.
And for top/bottom (from page 7 of the 3D HDMI white paper): "For Top-and-Bottom, the original full left and right pictures are sub-sampled to half resolution on the vertical axis. Sub-sampled pictures are arranged in Top-and-Bottom layout. See Figure 8-6."
Ssenkahdavic, I mentioned the subsampling above: "They are averaging every 2 vertical lines in a 1920x1080 frame (subsampling) to end up with frames that are 1920x540." My point is: how can they call this a 1080p format when it's actually only 540p? Do they just double the vertical lines at the TV to make it 1080p, or are they somehow interlacing to get back up to 1080p? Just doubling the 540 to get back to 1080, then calling it 1080, is... lame.
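To make the complaint concrete, here is a minimal sketch (not anything from the HDMI spec, just an illustration) of what "averaging every 2 vertical lines" and then naively doubling them back would look like. A frame is modeled as a list of lines, each line a list of pixel values; the function names are made up for this example.

```python
def subsample_vertical(frame):
    """Average each pair of adjacent lines (e.g. 1080 lines -> 540),
    as described in the top-and-bottom subsampling step."""
    return [[(a + b) / 2 for a, b in zip(top, bottom)]
            for top, bottom in zip(frame[0::2], frame[1::2])]

def line_double(frame):
    """Naive reconstruction: repeat every line (540 -> 1080).
    The line count is back to "1080", but half the detail is gone."""
    out = []
    for line in frame:
        out.append(line)
        out.append(line)
    return out

# Tiny 4-line "frame" standing in for 1080 lines of one eye's picture:
frame = [[10], [30], [50], [70]]
half = subsample_vertical(frame)   # [[20.0], [60.0]]
back = line_double(half)           # [[20.0], [20.0], [60.0], [60.0]]
```

Note that `back` has the original number of lines but can never recover the distinct 10/30/50/70 values, which is exactly why calling the result "1080p" feels like a stretch.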
I wasn't confusing the two; they are the same spec applied to different things. 60Hz refresh means 60 frames drawn per second: if a monitor's refresh rate is 60Hz, its entire screen is redrawn 60 times a second. If a game runs at 320fps and you have it hooked up to a 60Hz monitor, you're still only going to see 60fps. I meant that if the maximum a TV could show was 1080p at 30Hz, which is 30 1080p frames per second, that would be more than enough for 95% of the games on consoles, which struggle to do even 720p at 30fps.
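The relationship being described boils down to a one-liner: the display can only show one new frame per refresh, so the visible frame rate is capped by the refresh rate. A trivial sketch (function name is mine, not from any spec):

```python
def visible_fps(render_fps, refresh_hz):
    """The screen shows at most one new frame per refresh cycle,
    so what you actually see is capped at the refresh rate."""
    return min(render_fps, refresh_hz)

visible_fps(320, 60)  # game renders 320fps, 60Hz monitor -> you see 60
visible_fps(30, 60)   # game renders 30fps on a 60Hz monitor -> you see 30
```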
Just read up on the 480Hz refresh backlight trickery of those new Vizios coming next month; I'll just have to wait until the reviews come in.