wtf? Why would the UK pick 50 Hz for broadcasting?
Because TV in Europe has always been 50 Hz (originally it was tied to the 50 Hz AC mains), so all existing TV programmes are in 50 Hz. Why would they convert all that content for such a small upgrade? Imo they should have switched to 100 Hz broadcasting for HDTV: it's compatible with 50 Hz and much better than your TV inserting frames.
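The "compatible with 50 Hz" point is just integer arithmetic: a display rate can show a source rate without uneven frame repetition only if it's an exact multiple. A quick sketch (the rate lists are illustrative, not exhaustive):

```python
# Which display refresh rates can show each common source rate by
# repeating every frame the same whole number of times?
source_rates = [24, 25, 30, 50, 60]
display_rates = [50, 60, 100, 120]

for disp in display_rates:
    clean = [src for src in source_rates if disp % src == 0]
    print(f"{disp} Hz cleanly shows: {clean}")
# 100 Hz covers 25 and 50 fps (all 50 Hz content), but 60 Hz
# content doesn't divide evenly into it -- and vice versa.
```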
You could also ask why the US sticks to 60 Hz for broadcasting, when the majority of the world is on 50 Hz.
But they don't want to upgrade to 100 Hz or even 60 Hz .. maybe because that could mean you'd need more bandwidth per channel.
The electricity frequency shouldn't have any effect on that anymore anyway: pretty much every TV released in the last ~30 years has a switched-mode power supply, so the display refresh isn't tied to the mains frequency.
True, but 90% of what's on TV is reruns, all made in or already converted to 50 Hz, and all the infrastructure would need to be adjusted and/or replaced.
It would also be more convenient if the whole world used 60 Hz 220 V. Higher voltage means less energy loss in transmission, after all (the higher frequency mainly lets transformers be smaller). Or how about we all drive on the same side of the road? It's simply too much work, and nobody wants to be the one who has to change.
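The voltage part of that claim is easy to sketch: line loss is P_loss = I²R, and for the same delivered power the current halves when the voltage doubles, so the loss drops by 4x. The load and wire resistance below are made-up illustrative numbers:

```python
# Line loss for the same delivered power over the same wiring
# at two supply voltages. P_loss = I^2 * R, with I = P / V.
def line_loss(power_w, volts, resistance_ohm):
    current = power_w / volts
    return current ** 2 * resistance_ohm

P, R = 2000.0, 0.5          # 2 kW load, 0.5 ohm of wiring (assumed values)
loss_110 = line_loss(P, 110, R)
loss_220 = line_loss(P, 220, R)
print(loss_110 / loss_220)  # doubling the voltage quarters the loss: 4.0
```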
Actually 50 Hz was better for watching movies until 1080p24 came along. Movies in Europe were simply sped up to 25 fps, and PAL allowed 576 lines of resolution instead of NTSC's 480. The 4% faster sound wasn't noticeable, and you got a perfectly smooth picture, unlike the 3:2 pulldown used for 60 Hz conversion. Check the runtime differences between PAL and NTSC DVDs: the PAL ones are all about 4% shorter.
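Both conversions above can be sketched in a few lines (the 120-minute runtime is just an assumed example): PAL plays the same 24 fps frames at 25 fps, while NTSC's 3:2 pulldown repeats frames in an uneven 3,2,3,2,... field pattern, which is where the judder comes from.

```python
# PAL speedup: a film's 24 fps frames played at 25 fps
film_minutes = 120.0                 # assumed 2-hour film
pal_minutes = film_minutes * 24 / 25
print(pal_minutes)                   # 115.2 -> 4% shorter runtime

# NTSC 3:2 pulldown: 4 film frames become 10 fields at ~60 fields/s,
# by showing frames for 3, 2, 3, 2 fields in turn
fields = []
for frame in range(4):
    repeats = 3 if frame % 2 == 0 else 2
    fields += [frame] * repeats
print(fields)                        # [0, 0, 0, 1, 1, 2, 2, 2, 3, 3]
```

The uneven repeat counts mean some frames linger longer on screen than others, which is exactly the judder that a uniform 2x repeat on a 100 Hz display avoids.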
Anyway, many people don't notice judder, or upscaled versus native resolutions, or screen tearing. The people who do probably don't want an extra signal processing step in the chain anyway; I don't. Native resolution, native framerate for me, please.