On 9/Jul/20 18:00, Christopher Munz-Michielin wrote:
I'd assume it's a question of available bandwidth and availability of decoders. From my observations, most HD satellite feeds sit between 3 and 5 Mbps, while a typical Ku-band transponder might have a capacity of around 20-25 Mbps. This means you can cram 5-8 HD feeds onto a single transponder. With 4K streams the bandwidth requirements double, meaning you can fit far fewer feeds in the same amount of transponder space, and satellite bandwidth is expensive!
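For a rough sense of the arithmetic, here's a quick Python sketch (the 22 Mbps transponder figure and the 4/8 Mbps per-feed rates are assumed midpoints of the ranges above, not measured values):

    # Toy capacity estimate: feeds per Ku-band transponder.
    # All figures are illustrative assumptions from the ranges quoted above.
    TRANSPONDER_MBPS = 22   # assumed usable transponder throughput
    FEED_RATES_MBPS = {
        "HD (H.264)": 4,    # assumed mid-range HD feed rate
        "4K (H.265)": 8,    # assumed rate if HD requirements double
    }

    for label, rate in FEED_RATES_MBPS.items():
        print(f"{label}: {TRANSPONDER_MBPS // rate} feeds per transponder")

    # HD (H.264): 5 feeds per transponder
    # 4K (H.265): 2 feeds per transponder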
The other issue is on the decoder side. Right now, the vast majority of satellite subscribers receive programming through dedicated decoders (set-top boxes). Most of these decoders only have hardware to decode MPEG-2 and H.264 video, while 4K content is almost exclusively H.265. That means it's not a simple matter of turning on 4K; you'd have to arrange to send new decoders to every subscriber wanting to receive it.
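As a toy illustration of that mismatch (the codec lists below are assumptions for the sketch, not a survey of real set-top boxes):

    # Toy model: which feeds can a legacy set-top box decode in hardware?
    LEGACY_STB_CODECS = {"mpeg2video", "h264"}  # assumed legacy decoder support

    FEEDS = {
        "HD feed": "h264",  # typical HD satellite feed, per the above
        "4K feed": "hevc",  # 4K is almost exclusively H.265/HEVC
    }

    for name, codec in FEEDS.items():
        status = "decodable" if codec in LEGACY_STB_CODECS else "needs a new decoder"
        print(f"{name} ({codec}): {status}")

    # HD feed (h264): decodable
    # 4K feed (hevc): needs a new decoder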
As time moves along, I'm sure we'll start to see more satellite feeds available in 4K, but like the transition to HD video, it'll be a slow process.
The above are all the reasons I've been positing as well. It's just that, with more and more content moving onto IP (not to mention good ol' IPTV), does it make sense for broadcasters to upgrade satellite infrastructure and decoders to support 1080p, 4K, 8K, 16K, etc., when all you need is an app and an Internet connection for the very same (if not better) quality? And that's before you consider eyeball time in the fight between linear TV and VoD.

Mark.