I often get confused by MB/s (megabytes per second) and Mb/s (megabits per second) in practical scenarios. It's easy to mix up the little b and the big B in your own head, and you can't count on everyone else always typing it correctly. But there's a huge difference between the two: since 1 MB/s = 8 Mb/s, a mix-up can really matter.
My internet connection has a download speed of 60-100 Mb/s. Right now I'm downloading a game from Steam and it reports a fairly stable download speed of 12 MB/s. That means I'm downloading at close to my maximum capacity, right? Since 1 byte is 8 bits, 12 MB/s means 12 × 8 = 96 Mb/s, which is almost at my maximum capacity of 100 Mb/s.
Could somebody confirm I am correct so far?
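To double-check my own arithmetic, here's a quick Python sketch of the conversion (the 12 MB/s and 100 Mb/s figures are just my own numbers from above, not anything official):

```python
# Convert a download speed reported in megabytes per second (MB/s)
# into megabits per second (Mb/s): 1 byte = 8 bits.
def mbytes_to_mbits(mbytes_per_s: float) -> float:
    return mbytes_per_s * 8


steam_speed_mbytes = 12    # what Steam reports (MB/s), my number from above
line_capacity_mbits = 100  # my nominal maximum download speed (Mb/s)

used_mbits = mbytes_to_mbits(steam_speed_mbytes)
print(f"{steam_speed_mbytes} MB/s = {used_mbits} Mb/s")
print(f"That is {used_mbits / line_capacity_mbits:.0%} of a {line_capacity_mbits} Mb/s line")
```

Running it gives 96 Mb/s, i.e. about 96% of a 100 Mb/s line, which is what made me think I'm maxing out the connection.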
Now, what I'd really like to know is whether my connection can handle streaming 4K content in the future. 4K is the real deal! Everybody wants to watch movies in 4K.
On the Swedish Netflix page they recommend "an internet speed of 25 Mb/s" to be able to stream in "Ultra-HD quality". (Ultra-HD, or "UHD", is the same as 4K, i.e. a 3840x2160 resolution. There are other 4K standards, such as 4096x2160, but 3840x2160 is the most commonly used.)
Now, there is a lot more to it than just saying something is "Ultra-HD". Streams differ in quality depending on how heavily the original source (ultimately the movie footage itself) is compressed, so the Ultra-HD of Netflix might be quite different from the 4K of YouTube. And of course the frame rate of the stream matters too, whether it's, say, 24 fps or 60 fps.
YouTube in particular often suffers from poor quality. You can have a 1080p stream that is so heavily compressed that its bitrate and quality are much worse than a DVD at 720x576, and the 4K quality from YouTube is usually much worse than watching a typical 1080p Blu-ray movie.
Does somebody know the bitrate of 4K streaming from Netflix, and the bitrate of 4K on YouTube? And, just as important, is there some sort of standard under development among the big movie streaming services? I mean a standard for what counts as decent quality, or a minimum quality a customer can expect, something like: "most video streaming services stream their 4K content at a minimum of 20 Mbit/s, but we're increasingly seeing content at 40 Mbit/s". Is there any general assumption or agreement like that?
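To get a feel for what such bitrates would mean in practice, here's a rough Python sketch that converts a video bitrate into data used per hour. The bitrates in it are purely hypothetical placeholders taken from my example sentence above, not official figures from Netflix, YouTube or anyone else.

```python
# Rough data usage per hour for a given video bitrate.
# The bitrates below are hypothetical placeholders, NOT official
# figures from Netflix, YouTube or any other service.
def gigabytes_per_hour(bitrate_mbits: float) -> float:
    bits_per_hour = bitrate_mbits * 1_000_000 * 3600  # Mb/s -> bits per hour
    return bits_per_hour / 8 / 1_000_000_000          # bits -> bytes -> GB

for label, mbits in [("hypothetical 4K stream", 20),
                     ("hypothetical higher-end 4K stream", 40)]:
    print(f"{label} at {mbits} Mb/s ≈ {gigabytes_per_hour(mbits):.1f} GB per hour")
```

So a 20 Mb/s stream would be roughly 9 GB per hour and a 40 Mb/s stream roughly 18 GB per hour, which is why I'd like to know the real numbers.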
And as a comparison, what is the actual bitrate of an Ultra-HD Blu-ray running in HDR at 60 fps, which is obviously encoded/compressed at much higher quality than typical 4K streams from Netflix and YouTube? Wikipedia doesn't say.
HDR Ultra-HD Blu-ray is the ultimate viewing experience!
Also, HDR is extremely cool and everybody will want to watch their movies in HDR in the future. (If somebody doesn't know it yet: HDR stands for High Dynamic Range and means the ability to display a wider range of brightness and colors. In practice it also means a huge increase in a TV's maximum brightness; it can look almost like the sun is shining from inside your TV.) But how does HDR affect the required streaming capacity? Will support for HDR demand a higher bitrate (by my logic it will, by a lot)? Are current Netflix Ultra-HD streams with or without HDR?
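Here's the back-of-the-envelope reasoning behind "by my logic it will": HDR content is typically encoded at 10 bits per color channel instead of 8, so before any compression there's roughly 25% more raw data per pixel. How that carries over to the final compressed stream is exactly what I don't know, so the Python sketch below only compares raw, uncompressed data rates for illustration.

```python
# Back-of-the-envelope: raw (uncompressed) data rate of a 3840x2160 video
# at a given frame rate and bit depth, assuming 3 full color channels per
# pixel (ignoring chroma subsampling). Real streams are heavily compressed,
# so this only illustrates why HDR's 10-bit depth could push bitrates up.
def raw_gbits_per_s(width=3840, height=2160, fps=24, bits_per_channel=8, channels=3):
    bits_per_second = width * height * fps * bits_per_channel * channels
    return bits_per_second / 1e9

sdr = raw_gbits_per_s(bits_per_channel=8)   # standard dynamic range, 8-bit
hdr = raw_gbits_per_s(bits_per_channel=10)  # HDR, typically 10-bit
print(f"Raw 8-bit 4K at 24 fps:  {sdr:.2f} Gb/s")
print(f"Raw 10-bit 4K at 24 fps: {hdr:.2f} Gb/s ({hdr / sdr - 1:.0%} more)")
```

That 25% figure is only about the raw pixel data, of course; whether the streaming services actually bump the bitrate by that much for HDR is part of my question.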
My dream is to buy an LG OLED HDR TV and stream beautiful 4K HDR movie content to it.
Thoughts?