Ssenkahdavic said:
 

After working out the numbers, they can both support the exact same 1280x720 @ 60 Hz x 2 (one frame per eye) at all color depths (before doing the actual math I was almost sure that 32-bit would be too much bandwidth for HDMI 1.2, but it isn't... barely).

1280x720 (pixels) x 60 (fps) x 2 (both eyes) x 32-bit (color) = 3,538,944,000 bits/s, or about 3.54 Gbit/s (HDMI 1.2 can carry roughly 3.96 Gbit/s of video data)

Just to be complete: 

1920x1080 (pixels) x 24 (fps) x 2 (both eyes) x 32-bit (color) = 3,185,049,600 bits/s, or about 3.19 Gbit/s (HDMI 1.2 COULD support this... at least it has the bandwidth to do so)

1920x1080 (pixels) x 60 (fps) x 2 (both eyes) x 32-bit (color) = 7,962,624,000 bits/s, or about 7.96 Gbit/s (HDMI 1.2 could NOT support this... but neither do today's supposed HDMI 1.4a TVs, even though they do have the bandwidth)

1920x1080 (pixels) x 60 (fps) x 1 (regular 2D) x 32-bit (color) = 3,981,312,000 bits/s, or about 3.98 Gbit/s (HDMI 1.2 could NOT support this, which is probably why most 1080p games are 24-bit anyway)

1080p @ 60 Hz 32-bit 2D > 720p @ 60 Hz 32-bit 3D. Interesting, isn't it?
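
For anyone who wants to re-run those numbers, here's a quick Python sketch of the same multiplication (the 3.96 Gbit/s ceiling assumed for HDMI 1.2 is the usual 165 MHz x 24-bit video figure; the helper name is just for illustration):

# Sketch: raw video rate = width x height x fps x eyes x bits per pixel.
HDMI_12_VIDEO_LIMIT = 165_000_000 * 24  # assumed HDMI 1.2 video ceiling: 3,960,000,000 bits/s

def raw_bitrate(width, height, fps, eyes, bpp):
    """Uncompressed video bit rate in bits per second."""
    return width * height * fps * eyes * bpp

cases = [
    ("720p  @60 3D 32-bit", 1280, 720, 60, 2, 32),
    ("1080p @24 3D 32-bit", 1920, 1080, 24, 2, 32),
    ("1080p @60 3D 32-bit", 1920, 1080, 60, 2, 32),
    ("1080p @60 2D 32-bit", 1920, 1080, 60, 1, 32),
]

for label, w, h, fps, eyes, bpp in cases:
    rate = raw_bitrate(w, h, fps, eyes, bpp)
    verdict = "fits" if rate <= HDMI_12_VIDEO_LIMIT else "exceeds"
    print(f"{label}: {rate:,} bit/s ({rate / 1e9:.2f} Gbit/s) -- {verdict} HDMI 1.2")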


Damn, well put.

Yeah, I figured that 1080p @ 60 is greater than 2x 720p @ 60, since 720p isn't exactly half the pixel count of 1080p.

I think that if you took 1360x768 x 2, that should be about as close to 1x 1080p as you can get (while keeping the same aspect ratio). That's probably why a lot of HDTVs use that resolution instead of true 720p; rough numbers below.
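
A quick pixel-count comparison to back that up (just arithmetic, assuming 1360x768 panels as mentioned above):

pixels_1080p = 1920 * 1080   # 2,073,600
pixels_720p = 1280 * 720     #   921,600
pixels_1360 = 1360 * 768     # 1,044,480

print(f"2 x 720p     = {2 * pixels_720p:,} pixels")   # 1,843,200 -- short of one 1080p frame
print(f"1 x 1080p    = {pixels_1080p:,} pixels")      # 2,073,600
print(f"2 x 1360x768 = {2 * pixels_1360:,} pixels")   # 2,088,960 -- just past one 1080p frame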