My contention is simple. The content providers will not allow P2P video as a legal commercial service anytime in the near future. Furthermore, most ISPs are going to side with the content providers on this one. Therefore, discussing it at this point in time is purely academic, or more so, diversionary.
Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.

On Jan 8, 2007, at 9:49 PM, Thomas Leavitt wrote:

So, kind of back to the original question: what is going to be the reaction of your average service provider to the presence of an increasing number of people sucking down massive amounts of video and spitting it back out again... nothing? throttling all traffic of a certain type? shutting down customers who exceed certain thresholds? or just throttling their traffic? massive upgrades of internal network hardware?
Is it your contention that there's no economic model, given the architecture of current networks, which would generate enough revenue to offset the cost of traffic generated by P2P video?
Thomas

Gian Constantine wrote:

There may have been a disconnect on my part, or at least, a failure to disclose my position. I am looking at things from a provider standpoint, whether as an ISP or a strict video service provider.

I agree with you. From a consumer standpoint, a trickle or off-peak download model is the ideal low-impact solution to content delivery. And absolutely, a 500GB drive would almost be overkill on space for disposable content encoded in H.264. Excellent SD (480i) content can be achieved at ~1200 to 1500kbps, resulting in about a 1GB file for a 90 minute title. HD is almost out of the question for internet download, given good 720p at ~5500kbps, resulting in roughly a 3.7GB file (about 30 gigabits) for a 90 minute title.

Service providers wishing to provide this service to their customers may see some success where they control the access medium (copper loop, coax, FTTH). Offering such a service to customers outside of this scope would prove very expensive, and would likely never see a return on the investment without extensive peering arrangements. Even then, distribution rights would be very difficult to attain without very deep pockets and crippling revenue sharing. The studios really dislike the idea of transmission outside of a closed network.

Don't forget: even the titles you mentioned are still owned by very large companies interested in squeezing every possible dime from their assets. They would not be cheap to acquire. Further, torrent-like distribution is a long, long way away from sign-off by the content providers. They see torrents as the number one tool of content piracy. This is a major reason I see the discussion of tripping upstream usage limits through content distribution as moot.
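For reference, the arithmetic behind those file sizes, as a rough sketch (constant video bitrate assumed, audio and container overhead ignored):

    # Approximate download size for a constant-bitrate encode.
    def file_size_gb(video_kbps: int, minutes: int) -> float:
        bits = video_kbps * 1000 * minutes * 60   # total bits for the title
        return bits / 8 / 1e9                     # bits -> bytes -> gigabytes

    for label, kbps in [("SD 480i @ ~1500 kbps", 1500), ("HD 720p @ ~5500 kbps", 5500)]:
        print(f"{label}: {file_size_gb(kbps, 90):.1f} GB for a 90 minute title")

    # SD 480i @ ~1500 kbps: 1.0 GB for a 90 minute title
    # HD 720p @ ~5500 kbps: 3.7 GB for a 90 minute title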
I am with you on the vision of massive content libraries at the fingertips of all, but I see many roadblocks in the way. And almost none of them are technical in nature.

Gian Anthony Constantine
Senior Network Design Engineer
Earthlink, Inc.
Office: 404-748-6207
Cell: 404-808-4651
Internal Ext: x22007
constantinegi@corp.earthlink.net
On Jan 8, 2007, at 7:51 PM, Bora Akyol wrote:
Please see my comments inline:

-----Original Message-----
From: Gian Constantine [mailto:constantinegi@corp.earthlink.net]
Sent: Monday, January 08, 2007 4:27 PM
To: Bora Akyol
Cc: nanog@merit.edu
Subject: Re: Network end users to pull down 2 gigabytes a day, continuously?

<snip>

I would also argue storage and distribution costs are not asymptotically zero with scale. Well designed SANs are not cheap. Well designed distribution systems are not cheap. While price does decrease when scaled upwards, the cost of such an operation remains hefty, and increases with additions to the offered content library and a swelling of demand for this content. I believe the graph becomes neither asymptotic, nor anywhere near zero.
To the end user, there is no cost to downloading videos when they are sleeping. I would argue that other than sports (and some news) events, there is pretty much no content that needs to be real time. What the downloading (possibly 24x7) does is stress the ISP network to its max, since the assumptions of statistical multiplexing go out the window. Think of a Tivo that downloads content off the Internet 24x7. The user is still paying only what they pay each month, and this is "network neutrality 2.0" all over again.
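To put rough numbers on the statistical-multiplexing point, a sketch follows; the 2 GB/day figure comes from the thread subject, while the access tier and oversubscription ratio below are hypothetical values chosen only to illustrate:

    GB = 1e9

    def sustained_kbps(gb_per_day: float) -> float:
        # Average rate of a host that moves gb_per_day spread over 24 hours.
        return gb_per_day * GB * 8 / (24 * 3600) / 1000

    tier_kbps = 8000            # hypothetical 8 Mbps downstream tier
    oversub = 20                # hypothetical 20:1 oversubscription at aggregation
    budget_kbps = tier_kbps / oversub

    print(f"per-subscriber budget at 20:1 oversubscription: {budget_kbps:.0f} kbps")
    print(f"sustained rate of a 2 GB/day always-on downloader: {sustained_kbps(2.0):.0f} kbps")

    # ~400 kbps of statistical allowance vs. ~185 kbps of sustained load:
    # one always-on box consumes roughly half the budget before the
    # household does anything else.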
You are correct on the long tail nature of music. But music is not consumed in a similar manner as TV and movies. Television and movies involve a little more commitment and attention. Music is more for the moment and the mood. There is an immediacy with music consumption. Movies and television require a slight degree more patience from the consumer. The freshness (debatable :-) ) of new release movies and TV can often command the required patience from the consumer. Older content rarely has the same pull.

I would argue against your distinction between visual and auditory content. There is a lot of content out there that a lot of people watch, and the content is 20-40+ years old. Think Brady Bunch, Bonanza, or archived games from NFL, MLB, etc. What about Smurfs (for those of us with kids)? This is only the beginning. If I can get a 500GB box and download MP4 content, that's a lot of essentially free storage.

Coming back to NANOG content, I think video (not streamed, but multi-path distributed video) is going to bring the networks down not by sheer bandwidth alone but by challenging the assumptions behind the engineering of the network. I don't think you need huge SANs per se to store the content either, since it is multi-source/multi-sink; the reliability is built-in.
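As a rough gauge of what that 500GB box holds at the encode rates discussed earlier (my arithmetic, constant bitrate assumed):

    def hours_of_content(disk_gb: float, video_kbps: int) -> float:
        # Hours of constant-bitrate video that fit on disk_gb gigabytes.
        seconds = disk_gb * 1e9 * 8 / (video_kbps * 1000)
        return seconds / 3600

    print(f"500 GB at 1500 kbps (SD):   {hours_of_content(500, 1500):.0f} hours")
    print(f"500 GB at 5500 kbps (720p): {hours_of_content(500, 5500):.0f} hours")

    # Roughly 740 hours of SD or 200 hours of 720p: months of viewing
    # cached locally, which is why trickle/off-peak delivery looks attractive.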
Gian wrote: "From a big picture standpoint, I would say P2P distribution is a non-starter, too many reluctant parties to appease. From a detail standpoint, I would say P2P distribution faces too many hurdles in existing network infrastructure to be justified. Simply reference the discussion of upstream bandwidth caps and you will have a wonderful example of those hurdles."

Speaking about upstream hurdles, just out of curiosity (since this is merely a diversionary discussion at this point ;) ... wouldn't peer-to-peer be the LEAST desirable approach for an SP that is launching WiFi nets as its primary first mile platform? I note that Earthlink is launching a number of cityscale WiFi nets as we speak, which is why I'm asking. Has this in any way, even subliminally, been influential in the shaping of your opinions about P2P for content distribution? I know that it would affect my views, a whole lot, since the prospects for WiFi's shared upstream capabilities to improve are slim to none in the short to intermediate terms. Whereas CM and FTTx are known to raise their down and up offerings periodically, gated only by their usual game of chicken where each watches to see who'll be first.

Frank

On Mon Jan 8 22:26, Gian Constantine sent:

Personally, I am not one for throttling high use subscribers. Outside of the fine print, which no one reads, they were sold a service of X kbps down and Y kbps up. I could not care less how, when, or how often they use it. If you paid for it, burn it up.

I have questions as to whether or not P2P video is really a smart distribution method for a service provider who controls the access medium. Outside of being a service provider, I think the economic model is weak, when there can be little expectation of a large scale take rate.

Ultimately, my answer is: we're not there yet. The infrastructure isn't there. The content providers aren't there. The market isn't there. The product needs a motivator. This discussion has been putting the cart before the horse. A lot of big picture pieces are completely overlooked. We fail to question whether or not P2P sharing is a good method of delivering the product. There are a lot of factors which play into this. Unfortunately, more interest has been paid to the details of this delivery method than has been paid to whether or not the method is even worthwhile.

From a big picture standpoint, I would say P2P distribution is a non-starter, too many reluctant parties to appease. From a detail standpoint, I would say P2P distribution faces too many hurdles in existing network infrastructure to be justified. Simply reference the discussion of upstream bandwidth caps and you will have a wonderful example of those hurdles.
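To make the upstream concern concrete, a quick sketch; only the ~1GB SD title size comes from earlier in the thread, and all of the uplink speeds and the monthly cap below are hypothetical numbers chosen purely for illustration:

    TITLE_GB = 1.0   # one SD title, per the earlier ~1GB estimate
    uplinks_kbps = {"muni WiFi": 256, "ADSL": 384, "cable": 768, "FTTx": 2000}

    for name, up_kbps in uplinks_kbps.items():
        hours = TITLE_GB * 1e9 * 8 / (up_kbps * 1000) / 3600
        print(f"seeding one title 1:1 over {name}: {hours:.1f} hours of saturated uplink")

    monthly_cap_gb = 40   # hypothetical upstream usage cap
    print(f"a {monthly_cap_gb} GB monthly upstream cap allows ~{monthly_cap_gb / TITLE_GB:.0f} titles seeded at a 1:1 ratio")

The asymmetry is the point: on a shared WiFi uplink a seeding peer ties up the medium for the whole neighborhood, which is why P2P distribution looks least attractive there.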
The SPs like Verizon & AT&T moving fiber to the home hoping to get in on the "value add" action are in for an awakening IMHO.

Regards,
Bora

ps. I apologize for the tone of my previous email. That sounded grumpier than I usually am.
--
Thomas Leavitt - thomas@thomasleavitt.org - 831-295-3917 (cell)
*** Independent Systems and Network Consultant, Santa Cruz, CA ***