Just to clarify, Patrick is right here.

Assumptions: all movies are 120 minutes long, and each movie has an average bitrate of 50 Mbit/s (50 Mbit/s / 8 = 6.25 MB/s, * 7 200 s (2 hours) = 45 000 MB, / 1 000 = 45 GB per movie). That means the storage capacity needed for the movies is going to be: 10 000 000 * 45 GB / 1 000 (TB) / 1 000 (PB) = 450 PB of storage.

Some of you might want to raise your hand and say that this quality is too good. Ok, so we make it 10 times smaller, 5 Mbit/s on average: 450 PB / 10 = 45 PB, or 45 000 TB. If we are using 800 GB SSD drives: 45 000 TB / 0.8 TB = 56 250 SSD drives! (And we don't have any kind of backup of the content here; that needs even more SSD drives. And don't forget the power consumption.)

So over to the streaming part. 10 000 000 customers watching, each with a bandwidth of 5 Mbit/s, gives 50 000 000 Mbit/s / 1 000 = 50 000 Gbit/s. We only need 500 * 100 Gbit/s connections to handle this kind of demand. For each ISP around the world with 10 000 000 customers.

Will TLMC be able to handle the 100k users watching 10 different movies? Yes. Will TLMC be able to handle the other 10 million watching 10 million movies? No, since your network cannot handle this kind of load in the first place.
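For anyone who wants to replay the arithmetic, here is a rough back-of-envelope sketch in Python. The movie count, runtime, bitrates, drive size and user count are just the assumptions stated above, not real catalogue or traffic figures:

# back-of-envelope numbers from the post above (all figures are assumptions)
MOVIES       = 10_000_000      # library size
LENGTH_S     = 2 * 60 * 60     # 120-minute movie, in seconds
BITRATE_MBIT = 5               # reduced average bitrate, Mbit/s
SSD_TB       = 0.8             # 800 GB SSD
USERS        = 10_000_000      # concurrent viewers

movie_gb    = BITRATE_MBIT / 8 * LENGTH_S / 1000   # per-movie size in GB
library_pb  = MOVIES * movie_gb / 1000 / 1000      # total library size in PB
ssd_drives  = library_pb * 1000 / SSD_TB           # drives needed, no redundancy
egress_gbit = USERS * BITRATE_MBIT / 1000          # aggregate streaming bandwidth

print(f"{movie_gb:.1f} GB/movie, {library_pb:.0f} PB library, "
      f"{ssd_drives:,.0f} SSDs, {egress_gbit:,.0f} Gbit/s egress")
# -> 4.5 GB/movie, 45 PB library, 56,250 SSDs, 50,000 Gbit/s egress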
One of us has a different dictionary than everyone else.
Assume I have 10 million movies in my library, and 10 million active users. Further assume there are 10 movies being watched by 100K users each, and 9,999,990 movies which are being watched by 1 user each.
Which has more total demand, the 10 popular movies or the long tail?
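Under those hypothetical numbers the comparison is easy to check:

# hypothetical demand split from the example above
head_streams = 10 * 100_000    # 10 popular titles * 100K viewers each = 1,000,000
tail_streams = 9_999_990 * 1   # 9,999,990 long-tail titles * 1 viewer each
print(head_streams, tail_streams)   # the tail is roughly 10x the head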
This doesn't mean Netflix or Hulu or iTunes or whatever has the aforementioned demand curve. But it does mean my "definition" & yours do not match.
Either way, I challenge you to prove the long tail on one of the serious streaming services is a "tiny fraction" of total demand.
-- //fredan The Last Mile Cache - http://tlmc.fredan.se