Re: [Nanog] ATT VP: Internet to hit capacity by 2010
"Scott Weeks" <surfer@mauigateway.com> wrote:
I looked around for text or video from Mr. Cicconi at the "Westminster eForum" but can't find anything.
www.westminsterforumprojects.co.uk/eforum/default.aspx
For what it's worth, I agree with Ryan Paul's summary of the issues here: http://arstechnica.com/news.ars/post/20080420-analysis-att-fear-mongering-on-net-capacity-mostly-fud.html ...but take it at face value.

$.02,

- ferg

--
"Fergie", a.k.a. Paul Ferguson
Engineering Architecture for the Internet
fergdawg(at)netzero.net
ferg's tech blog: http://fergdawg.blogspot.com/

_______________________________________________
NANOG mailing list
NANOG@nanog.org
http://mailman.nanog.org/mailman/listinfo/nanog
On Mon, 21 Apr 2008, Paul Ferguson wrote:
I looked around for text or video from Mr. Cicconi at the "Westminster eForum" but can't find anything.
www.westminsterforumprojects.co.uk/eforum/default.aspx
For what it's worth, I agree with Ryan Paul's summary of the issues here:
The rest of the story?

http://www.usatoday.com/tech/products/services/2008-04-20-internet-broadband...

By 2010, the average household will be using 1.1 terabytes (roughly equal to 1,000 copies of the Encyclopedia Britannica) of bandwidth a month, according to an estimate by the Internet Innovation Alliance in Washington, D.C. At that level, it says, 20 homes would generate more traffic than the entire Internet did in 1995.

How many folks remember InternetMCI's lack of capacity in the 1990's, when it actually needed to stop installing new Internet connections for several months because InternetMCI didn't have any more capacity?
On Mon, 21 Apr 2008, Sean Donelan wrote:
The rest of the story?
http://www.usatoday.com/tech/products/services/2008-04-20-internet-broadband...
By 2010, the average household will be using 1.1 terabytes (roughly equal to 1,000 copies of the Encyclopedia Britannica) of bandwidth a month, according to an estimate by the Internet Innovation Alliance in Washington, D.C. At that level, it says, 20 homes would generate more traffic than the entire Internet did in 1995.
How many folks remember InternetMCI's lack of capacity in the 1990's when it actually needed to stop installing new Internet connections because InternetMCI didn't have any more capacity for several months.
I've been on the side arguing that there's going to be enough growth to cause interesting issues (which is very different than arguing for any specific remedy that the telcos think will be in their benefit), but the numbers quoted above strike me as an overstatement. Let's look at the numbers.

iTunes video, which looks perfectly acceptable on my old NTSC TV, is 0.75 gigabytes per viewable hour. I think HDTV is somewhere around 8 megabits per second (if I'm remembering correctly; I may be wrong about that), which would translate to one megabyte per second, or 3.6 gigabytes per hour.

For iTunes video, 1.1 terabytes would be 1,100 gigabytes, or 1,100 / 0.75 = 1,467 hours. 1,467 / 30 = 48.9 hours of video per day. Even assuming we divide that among three or four people in a household, that's staggering.

For HDTV, 1,100 gigabytes would be 1,100 / 3.6 = 306 hours per month. 306 / 30 = 10.2 hours per day.

Maybe I just don't spend enough time around the "leave the TV on all day" demographic. Is that a realistic number? Is there something bigger than HDTV video that AT&T expects people to start downloading?

-Steve
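Steve's back-of-the-envelope conversion is easy to check mechanically. A small sketch (Python); the per-hour sizes are the figures quoted in this thread, not authoritative measurements:

```python
# Hours of video per day that fit in a monthly transfer budget,
# using the per-hour stream sizes quoted in this thread:
# iTunes SD video ~0.75 GB/hour; ~8 Mbps HDTV ~= 3.6 GB/hour.

def gb_per_hour(mbps):
    """Stream size in GB/hour for a constant bitrate in megabits/s."""
    return mbps / 8 * 3600 / 1000  # Mb/s -> MB/s -> MB/hour -> GB/hour

def hours_per_day(monthly_gb, gb_per_hr, days=30):
    """Daily viewing hours that a monthly byte budget supports."""
    return monthly_gb / gb_per_hr / days

print(round(hours_per_day(1100, 0.75), 1))            # iTunes-quality -> 48.9
print(round(hours_per_day(1100, gb_per_hour(8)), 1))  # ~8 Mbps HD -> 10.2
```

Both numbers match the figures in the message above, so the arithmetic itself isn't where the estimate goes wrong.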
Steve Gibbard wrote:
Maybe I just don't spend enough time around the "leave the TV on all day" demographic. Is that a realistic number? Is there something bigger than HDTV video that ATT expects people to start downloading?
I would not be surprised if many households watch more than 10 hours of TV per day. My trusty old Series 2 TiVo often records 5-8 hours of TV per day, even if I don't watch any of it.

Right now I can get 80 or so channels of basic cable, and who knows how many channels of digital cable/satellite, for as many TVs as I can fit in my house, without the Internet buckling under the pressure. I assume AT&T is just saying, "We use this pipe for TV and Internet, hence all TV is now considered Internet traffic"? How many people are REALLY going to be pulling 10 hours of HD or even SD TV across their Internet connection, rather than just taking what is multicast from a satellite base station by their TV service provider? Is there something significant about AT&T's model (other than the VDSL over twisted pair, rather than coax/fiber to the prem) that makes them more afraid than Comcast, Charter or Cox?

Maybe I'm just totally missing something; wouldn't be the first time. Why would TV of any sort even touch the 'Internet'? And, no, YouTube is not "TV" as far as I'm concerned.
Why would TV of any sort even touch the 'Internet'. And, no, YouTube is not "TV" as far as I'm concerned.
FWIW: http://www.worldmulticast.com/marketsummary.html
Steve Gibbard wrote:
Maybe I just don't spend enough time around the "leave the TV on all day" demographic. Is that a realistic number? Is there something bigger than HDTV video that ATT expects people to start downloading?
I would not be surprised if many households watch more than 10hrs of TV per day. My trusty old series 2 TiVo often records 5-8hrs of TV per day, even if I don't watch any of it.
Right now I can get 80 or so channels of basic cable, and who knows how many of Digital Cable/Satellite for as many TVs as I can fit in my house without the Internet buckling under the pressure. I assume AT&T is just saying "We use this pipe for TV and Internet, hence all TV is now considered Internet traffic"? How many people are REALLY going to be pulling 10hrs of HD or even SD TV across their Internet connection, rather than just taking what is Multicasted from a Satellite base station by their TV service provider? Is there something significant about AT&T's model (other than the VDSL over twisted pair, rather than coax/fiber to the prem) that makes them more afraid than Comcast, Charter or Cox?
Maybe I'm just totally missing something - Wouldn't be the first time. Why would TV of any sort even touch the 'Internet'. And, no, YouTube is not "TV" as far as I'm concerned.
The real problem is that this technology is just in its infancy. Right now, our TiVos may pull in many hours a day of TV to watch. In my case, it's from satellite. In yours, maybe from a cable company. That's fine, that's manageable, and the technology used to move the signal from the broad/multicast point to your settop box is only vaguely relevant. It is not unicast.

There is, however, an opportunity here for a fundamental change in the distribution model of video, and this should terrify any network operator: an evolution towards unicast, particularly off-net unicast. I posted a message on Oct 10 of last year suggesting one potential model for evolution of video services. We're seeing the market target narrower segments of the viewing public, and if this continues, we may well see some "channel" partner with TiVo to provide on-demand access to remote content over the Internet. That could well lead to a model where your TiVo speculatively preloads content, potentially vast amounts of it.

Or, worse yet, the popularity of YouTube suggests that at some point we may end up with a new "local webserver service" on the next-generation Microsoft Whoopta OS, capable of publishing video from the local PC, maybe vaguely similar to BitTorrent under the hood, allowing for a much higher-bandwidth podcast-like service where your TiVo (and everyone else's) is downloading video slowly from lots of different sources.

... JG
--
Joe Greco - sol.net Network Services - Milwaukee, WI - http://www.sol.net
"We call it the 'one bite at the apple' rule. Give me one chance [and] then I won't contact you again." - Direct Marketing Ass'n position on e-mail spam(CNN)
With 24 million small businesses in the US alone, that's way too many apples.
Once upon a time, Steve Gibbard <scg@gibbard.org> said:
iTunes video, which looks perfectly acceptable on my old NTSC TV, is .75 gigabytes per viewable hour. I think HDTV is somewhere around 8 megabits per second (if I'm remembering correctly; I may be wrong about that), which would translate to one megabyte per second, or 3.6 gigabytes per hour.
You're a little low. ATSC (the over-the-air digital broadcast format) is 19 megabits per second, or 8.55 gigabytes per hour. My TiVo probably records 12-20 hours per day (I don't watch all of that, of course), often using two tuners (so up to 38 megabits per second). That's not all HD today, of course, but the percentage that is HD is going up.

1.1 terabytes of ATSC-level HD would be a little over 4 hours a day. If you have a family with multiple TVs, that's easy to hit. That also assumes that we get 40-60 megabit connections (2-3 ATSC-format channels) that can sustain that level of traffic to the household, with widespread deployment in 2 years, and that the "average" household hooks it up to their TVs.

--
Chris Adams <cmadams@hiwaay.net>
Systems and Network Administrator - HiWAAY Internet Services
I don't speak for anybody but myself - that's enough trouble.
On Mon Apr 21, 2008 at 02:43:14PM -0500, Chris Adams wrote:
You're a little low. ATSC (the over-the-air digital broadcast format) is 19 megabits per second or 8.55 gigabytes per hour.
I think you're too high there! MPEG2 SD is around 4-6Mbps, MPEG4 SD is around 2-4Mbps, and MPEG4 HD is anywhere from 8 to 20Mbps, depending on how much wow factor the broadcaster is trying to give.

A typical satellite TV multiplex is 20-30Mbps for 4-8 channels, depending on how much the broadcaster pays for higher bitrate, and thus higher quality.

Simon
--
Simon Lockhart | * Sun Server Colocation * ADSL * Domain Registration *
Director      | * Domain & Web Hosting * Internet Consultancy *
Bogons Ltd    | * http://www.bogons.net/ * Email: info@bogons.net *
Once upon a time, Simon Lockhart <simon@slimey.org> said:
On Mon Apr 21, 2008 at 02:43:14PM -0500, Chris Adams wrote:
You're a little low. ATSC (the over-the-air digital broadcast format) is 19 megabits per second or 8.55 gigabytes per hour.
I think you're too high there! MPEG2 SD is around 4-6Mbps, MPEG4 SD is around 2-4Mbps, MPEG4 HD is anywhere from 8 to 20Mbps, depending on how much wow factor the broadcaster is trying to give.
Nope, ATSC is 19 (more accurately, 19.28) megabits per second. That can carry multiple sub-channels, or it can be used for a single channel. Standard-definition DVDs can be up to 10 megabits per second. Both only use MPEG2; MPEG4 can get similar quality at around half the rate. The base Blu-Ray data rate is 36 megabits per second (to allow for high-quality MPEG2 at up to 1080p60 resolution).
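The sub-channel point is just a bit-budget question: everything sharing one ATSC multiplex has to fit inside the fixed transport rate. A tiny sketch (Python); the example allocations are illustrative, not any broadcaster's actual lineup:

```python
# An ATSC multiplex is a fixed ~19.28 Mb/s pipe; sub-channels are just
# bitrate allocations that must sum to no more than the pipe.
ATSC_MBPS = 19.28

def fits(allocations_mbps):
    """True if the proposed sub-channel allocations fit in one multiplex."""
    return sum(allocations_mbps) <= ATSC_MBPS

print(fits([12, 4, 3]))   # one HD + two SD sub-channels: 19 Mb/s -> True
print(fits([12, 12]))     # two full HD streams: 24 Mb/s -> False
```

This is why "19 megabits per second" and "multiple sub-channels" aren't in tension: more sub-channels just means each one gets a thinner slice.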
On Mon, 21 Apr 2008, Chris Adams wrote:
Nope, ATSC is 19 (more accurately 19.28) megabits per second. That can carry multiple sub-channels, or it can be used for a single channel. Standard definition DVDs can be up to 10 megabits per second. Both only use MPEG2; MPEG4 can be around half that for similar quality. The base Blu-Ray data rate is 36 megabits per second (to allow for high quality MPEG2 at up to 1080p60 resolution).
From Wikipedia (see: Appeal to authority :-):

The different resolutions can operate in progressive scan or interlaced mode, although the highest 1080-line system cannot display progressive images at the rate of 59.94 or 60 frames per second. (Such technology was seen as too advanced at the time, plus the image quality was deemed to be too poor considering the amount of data that can be transmitted.) A terrestrial (over-the-air) transmission carries 19.39 megabits of data per second, compared to a maximum possible bitrate of 10.08 Mbit/s allowed in the DVD standard.

Ric
My DirecTiVo records wads of stuff every day, but they are the same bits that rain down on gazillions of other potential recorders and viewers. Incremental cost to serve one more household: pretty much zero.

There are definitely narrowcast applications that don't make sense to broadcast down from a bird, but it also makes no sense at all to claim, for capacity-planning purposes, that every household will need a unicast IP stream of all its TV viewing capacity...

-dorn

On Mon, Apr 21, 2008 at 4:25 PM, Ric Messier <kilroy@washere.com> wrote:
On Mon, 21 Apr 2008, Chris Adams wrote:
Nope, ATSC is 19 (more accurately 19.28) megabits per second. That can carry multiple sub-channels, or it can be used for a single channel. Standard definition DVDs can be up to 10 megabits per second. Both only use MPEG2; MPEG4 can be around half that for similar quality. The base Blu-Ray data rate is 36 megabits per second (to allow for high quality MPEG2 at up to 1080p60 resolution).
From wikipedia (see: Appeal to authority :-): The different resolutions can operate in progressive scan or interlaced mode, although the highest 1080-line system cannot display progressive images at the rate of 59.94 or 60 frames per second. (Such technology was seen as too advanced at the time, plus the image quality was deemed to be too poor considering the amount of data that can be transmitted.) A terrestrial (over-the-air) transmission carries 19.39 megabits of data per second, compared to a maximum possible bitrate of 10.08 Mbit/s allowed in the DVD standard.
Ric
I think you're too high there! MPEG2 SD is around 4-6Mbps, MPEG4 SD is around 2-4Mbps, MPEG4 HD is anywhere from 8 to 20Mbps, depending on how much wow factor the broadcaster is trying to give.
Nope, ATSC is 19 (more accurately 19.28) megabits per second.
So why would anyone plug an ATSC feed directly into the Internet? Are there any devices that can play it other than a TV set? Why wouldn't a video services company transcode it to MPEG4 and transmit that?

I can see that some cable/DSL companies might transmit ATSC to subscribers, but they would also operate local receivers so that the traffic never touches their core, rather like what a cable company does today with TV receivers in their head ends.

All this talk of exafloods seems to ignore the basic economics of IP networks. No ISP is going to allow subscribers to pull in 8 gigs per day of video stream. And no broadcaster is going to pay for the bandwidth needed to pump out all those ATSC streams. And nobody is going to stick IP multicast (and multicast peering) in the core just to deal with video streams to people who leave their TV on all day whether they are at home or not. At best you will see IP multicast on a city-wide basis in a single ISP's network.

Also note that IP multicast only works for live broadcast TV, and in today's world there isn't much of that except for news. Everything else is prerecorded, and thus it COULD be transmitted at any time. IP multicast does not help you when you have 1000 subscribers all pulling in 1000 unique streams. In the 1960's it was reasonable to think that you could deliver the same video to all consumers because everybody was the same in one big melting pot, but that day is long gone.

On the other hand, P2P software could be leveraged to download video files during off-peak hours on the network. All it takes is some cooperation between P2P software developers and ISPs, so that you have P2P clients which can be told to lay off during peak hours, or when they want something from the other side of a congested peering circuit. Better yet, the ISP's P2P manager could arrange for one full copy of that file to get across the congested peering circuit during the time period most favorable for that single circuit, then distribute elsewhere.

As far as I am concerned, the killer application for IP multicast is *NOT* video; it's market data feeds from NYSE, NASDAQ, CBOT, etc.

--Michael Dillon
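The cooperative-P2P idea described above is simple to express: a client consults a policy before opening bulk transfers. A minimal sketch (Python); the peak window and rate caps are illustrative assumptions, not any real ISP's published policy:

```python
from datetime import time

# Illustrative ISP policy: throttle bulk P2P during the evening peak.
PEAK_START, PEAK_END = time(18, 0), time(23, 0)  # assumed 6pm-11pm peak window
PEAK_KBPS, OFF_PEAK_KBPS = 250, 5000             # assumed rate caps, KB/s

def allowed_rate_kbps(now):
    """Rate cap a cooperating P2P client applies at wall-clock time `now`."""
    return PEAK_KBPS if PEAK_START <= now < PEAK_END else OFF_PEAK_KBPS

print(allowed_rate_kbps(time(19, 30)))  # during peak -> 250
print(allowed_rate_kbps(time(3, 0)))    # overnight   -> 5000
```

A real deployment would want the ISP to publish the policy (and congested-path hints) in some machine-readable form rather than hard-coding it, but the client-side logic is no more complicated than this.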
It's certainly not reasonable to assume the same video goes to all consumers, but on the other hand, there *is* plenty of video that goes to a *lot* of consumers. I don't really need my own personal unicast copy of the bits that make up an episode of BSG or whatever.

I would hope that the future has even more TiVo-like devices at the consumer edge that can take advantage of the right (desired) bits whenever they are available. A single "box" that can take bits off the bird or cable TV when what it wants is found there, or request over IP when it needs to, doesn't seem like rocket science...

-dorn

On Tue, Apr 22, 2008 at 6:33 AM, <michael.dillon@bt.com> wrote:
I think you're too high there! MPEG2 SD is around 4-6Mbps, MPEG4 SD is around 2-4Mbps, MPEG4 HD is anywhere from 8 to 20Mbps, depending on how much wow factor the broadcaster is trying to give.
Nope, ATSC is 19 (more accurately 19.28) megabits per second.
So why would anyone plug an ATSC feed directly into the Internet? Are there any devices that can play it other than a TV set? Why wouldn't a video services company transcode it to MPEG4 and transmit that?
I can see that some cable/DSL companies might transmit ATSC to subscribers but they would also operate local receivers so that the traffic never touches their core. Rather like what a cable company does today with TV receivers in their head ends.
All this talk of exafloods seems to ignore the basic economics of IP networks. No ISP is going to allow subscribers to pull in 8gigs per day of video stream. And no broadcaster is going to pay for the bandwidth needed to pump out all those ATSC streams. And nobody is going to stick IP multicast (and multicast peering) in the core just to deal with video streams to people who leave their TV on all day whether they are at home or not.
At best you will see IP multicast on a city-wide basis in a single ISP's network. Also note that IP multicast only works for live broadcast TV. In today's world there isn't much of that except for news. Everything else is prerecorded and thus it COULD be transmitted at any time. IP multicast does not help you when you have 1000 subscribers all pulling in 1000 unique streams. In the 1960's it was reasonable to think that you could deliver the same video to all consumers because everybody was the same in one big melting pot. But that day is long gone.
On the other hand, P2P software could be leveraged to download video files during off-peak hours on the network. All it takes is some cooperation between P2P software developers and ISPs so that you have P2P clients which can be told to lay off during peak hours, or when they want something from the other side of a congested peering circuit. Better yet, the ISP's P2P manager could arrange for one full copy of that file to get across the congested peering circuit during the time period most favorable for that single circuit, then distribute elsewhere.
--Michael Dillon
As far as I am concerned the killer application for IP multicast is *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.
"IP multicast does not help you when you have 1000 subscribers all pulling in 1000 unique streams. In the 1960's it was reasonable to think that you could deliver the same video to all consumers because everybody was the same in one big melting pot. But that day is long gone." ... well multicast could be used - one stream for each of the "500 channels" or whatever, and the time-shifting could be done on the recipients' sides ... just like broadcast TV + DVR today ... as long as we aren't talking about adding place-shifting (a la SlingBox) also! The market (or, atleast in the short-mid term - the provider :) ) would decide on that. /TJ
-----Original Message----- From: michael.dillon@bt.com [mailto:michael.dillon@bt.com] Sent: Tuesday, April 22, 2008 6:34 AM To: nanog@nanog.org Subject: Re: [Nanog] ATT VP: Internet to hit capacity by 2010
I think you're too high there! MPEG2 SD is around 4-6Mbps, MPEG4 SD is around 2-4Mbps, MPEG4 HD is anywhere from 8 to 20Mbps, depending on how much wow factor the broadcaster is trying to give.
Nope, ATSC is 19 (more accurately 19.28) megabits per second.
So why would anyone plug an ATSC feed directly into the Internet? Are there any devices that can play it other than a TV set? Why wouldn't a video services company transcode it to MPEG4 and transmit that?
I can see that some cable/DSL companies might transmit ATSC to subscribers but they would also operate local receivers so that the traffic never touches their core. Rather like what a cable company does today with TV receivers in their head ends.
All this talk of exafloods seems to ignore the basic economics of IP networks. No ISP is going to allow subscribers to pull in 8gigs per day of video stream. And no broadcaster is going to pay for the bandwidth needed to pump out all those ATSC streams. And nobody is going to stick IP multicast (and multicast peering) in the core just to deal with video streams to people who leave their TV on all day whether they are at home or not.
At best you will see IP multicast on a city-wide basis in a single ISP's network. Also note that IP multicast only works for live broadcast TV. In today's world there isn't much of that except for news. Everything else is prerecorded and thus it COULD be transmitted at any time. IP multicast does not help you when you have 1000 subscribers all pulling in 1000 unique streams. In the 1960's it was reasonable to think that you could deliver the same video to all consumers because everybody was the same in one big melting pot. But that day is long gone.
On the other hand, P2P software could be leveraged to download video files during off-peak hours on the network. All it takes is some cooperation between P2P software developers and ISPs so that you have P2P clients which can be told to lay off during peak hours, or when they want something from the other side of a congested peering circuit. Better yet, the ISP's P2P manager could arrange for one full copy of that file to get across the congested peering circuit during the time period most favorable for that single circuit, then distribute elsewhere.
--Michael Dillon
As far as I am concerned the killer application for IP multicast is *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.
All this talk of exafloods seems to ignore the basic economics of IP networks. No ISP is going to allow subscribers to pull in 8gigs per day of video stream. And no broadcaster is going to pay for the bandwidth needed to pump out all those ATSC streams. And nobody is going to stick IP multicast (and multicast peering) in the core just to deal with video streams to people who leave their TV on all day whether they are at home or not.
The floor is littered with the discarded husks of policies about what ISPs are going to allow or disallow: "no servers", "no connection sharing", "web browsing only", "no VoIP", etc. These typically last only as long as the errant assumptions upon which they're based remain somewhat viable. For example, when NAT gateways and Internet Connection Sharing became widely available, trying to prohibit connection sharing went by the wayside.

8GB/day is less than a single megabit per second sustained, and with ISPs selling ultra-high-speed connections (we're now able to get 7 or 15Mbps), an ISP might find it difficult to defend why they're selling a premium 15Mbps service on which a user can't get 1/15th of that.
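That "less than a single megabit per second" figure is easy to verify; a quick sketch of the arithmetic (Python):

```python
# Sustained rate implied by a daily transfer volume.
def sustained_mbps(gb_per_day):
    """Mb/s needed to move gb_per_day gigabytes spread evenly over 24h."""
    return gb_per_day * 1e9 * 8 / 86400 / 1e6  # GB -> bits, per second, -> Mb/s

rate = sustained_mbps(8)
print(f"8 GB/day = {rate:.2f} Mb/s sustained")          # ~0.74 Mb/s
print(f"fraction of a 15 Mb/s link: {rate / 15:.1%}")   # ~4.9%
```

So a subscriber pulling 8GB/day is using roughly a twentieth of a 15Mbps connection, averaged over the day.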
At best you will see IP multicast on a city-wide basis in a single ISP's network. Also note that IP multicast only works for live broadcast TV. In today's world there isn't much of that except for news.
Huh? Why does IP multicast only work for that?
Everything else is prerecorded and thus it COULD be transmitted at any time. IP multicast does not help you when you have 1000 subscribers all pulling in 1000 unique streams.
Yes, that's potentially a problem. That doesn't mean that multicast cannot be leveraged to handle prerecorded material, but it does suggest that you could really use a TiVo-like device to make best use of it: a fundamental change away from "live broadcast" streaming of a show in 1:1 realtime, to a model where everything is spooled onto the local TiVo and then watched at the user's convenience.

We don't have the capacity at the moment to really deal with 1000 subs all pulling in 1000 unique streams, but the likelihood is that we're not going to see that for some time, if ever. What seems more likely is that we'll see an evolution of more specialized offerings, possibly supplementing or even eventually replacing the tiered channel packages of your typical cable company, since it's pretty clear that a-la-carte channel selection isn't likely to happen soon. That may allow some "less popular" channels to come into being. I happen to like holding up SciFi as an example, because their current operations are significantly different than originally conceived, and they're now producing significant quantities of their own original material. It's possible that we could see a much larger number of these sorts of ventures (which would terrify legacy television networks even further).

The biggest challenge that I would expect from a network point of view is the potential for vast amounts of decentralization. For example, there's low-key stuff such as the "Star Trek: Hidden Frontier" series of fanfic-based video projects. There are almost certainly enough fans out there that you'd see a small surge in viewership if the material were more readily accessible (read that as: automatically downloaded to your TiVo). That could encourage others to do the same in more quantity. These are all low-volume data sources, and yet, taken as a whole, they could represent a fairly difficult problem were everyone to be doing it.
It is not just tech geeks that are going to be able to produce video. As the stuff becomes more accessible (see: YouTube), we may see things like mini soap operas, home & garden shows, local sporting events, local politics, etc. I'm envisioning a scenario where we may find that there are a few tens of thousands of PTA meetings, each being uploaded routinely onto the home PCs of whoever recorded the local meeting, and then made available to the small number N of interested parties who might then watch (0 < N < 20).

If that kind of thing happens, then we're going to find that there's a large range of projects with potential viewership landing anywhere between this example and that of the specialty broadcast cable channels, and the question that is relevant to network operators is whether there's a way to guide this sort of thing towards models which are less harmful to the network. I don't pretend to have the answers to this, but I do feel reasonably certain that the success of YouTube is not a fluke, and that we're going to see more, not less, of this sort of thing.
As far as I am concerned the killer application for IP multicast is *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.
You can go compare the relative successes of Yahoo! Finance and YouTube. While it might be nice to multicast that sort of data, it's a relative trickle, and I'll bet that the majority of users have not only not visited a market data site this week, but have actually never done so.

... JG
On Tue, Apr 22, 2008 at 2:02 PM, Joe Greco <jgreco@ns.sol.net> wrote:
As far as I am concerned the killer application for IP multicast is *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.
You can go compare the relative successes of Yahoo! Finance and YouTube.
While it might be nice to multicast that sort of data, it's a relative trickle of data, and I'll bet that the majority of users have not only not visited a market data site this week, but have actually never done so.
As if most financial (and other mega-dataset) data was on consumer Web sites. Think pricing feeds off stock exchange back-office systems.
On Tue, Apr 22, 2008 at 2:02 PM, Joe Greco <jgreco@ns.sol.net> wrote:
As far as I am concerned the killer application for IP multicast is *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.
You can go compare the relative successes of Yahoo! Finance and YouTube.
While it might be nice to multicast that sort of data, it's a relative trickle of data, and I'll bet that the majority of users have not only not visited a market data site this week, but have actually never done so.
As if most financial (and other mega-dataset) data was on consumer Web sites. Think pricing feeds off stock exchange back-office systems.
Oh, you got my point. Good. :-)

This isn't a killer application for IP multicast, at least not on the public Internet. High-volume bits that are not busily traversing a hundred thousand last-mile residential connections are probably not the bits that are going to pose a serious challenge for network operators. Or at least, that's my take on things.

... JG
IP multicast does not help you when you have 1000 subscribers all pulling in 1000 unique streams.
Yes, that's potentially a problem. That doesn't mean that multicast can not be leveraged to handle prerecorded material, but it does suggest that you could really use a TiVo-like device to make best use.
You mean a computer? Like the one that runs file-sharing clients? Or Squid? Or an NNTP server? Is video so different from other content? Considering the volume of video that currently traverses P2P networks I really don't see that there is any need for an IP multicast solution except for news feeds and video conferencing.
What seems more likely is that we'll see an evolution of more specialized offerings,
Yes. The overall trend has been to increasingly split the market into smaller slivers, with additional choices being added and older ones still available. During the shift to digital broadcasting in the UK, we retained the free-to-air services with more channels than we had on analog. Satellite continued to grow in diversity and now there is even a Freesat service coming online. Cable TV is still there, although now it is usually bundled with broadband Internet as well as telephone service. You can access the Internet over your mobile phone using GPRS or 3G, and wifi is spreading slowly but surely. But one thing that does not change is the number of hours in the day. Every service competes for scarce attention spans, and a more-or-less fixed portion of people's disposable income. Based on this, I don't expect to see any really huge changes.
That may allow some "less popular" channels to come into being.
YouTube et al.
I happen to like holding up SciFi as an example, because their current operations are significantly different than originally conceived, and they're now producing significant quantities of their own original material.
The cost to film and to edit video content has dropped dramatically over the past decade. The SciFi channel is the tip of a very big iceberg. And what about immigrants? Even 50 years ago, immigrants to the USA joined a bigger melting pot culture and integrated slowly but surely. Nowadays, they have cheap phone calls back home, the same Internet content as the folks back home, and P2P to get the TV shows and movies that people are watching back home. How is any US channel-based system going to handle that diversity and variety?
There are almost certainly enough fans out there that you'd see a small surge in viewership if the material was more readily accessible (read that as: automatically downloaded to your TiVo).
Is that so different from P2P video? In any case, the TiVo model is limited to the small amount of content, all commercial, that they can classify so that TiVo downloads the right stuff. P2P allows you to do the classification, but it is still automatically downloaded while you sleep.
I'm envisioning a scenario where we may find that there are a few tens of thousands of PTA meetings each being uploaded routinely onto the home PC's of whoever recorded the local meeting, and then made available to the small number of interested parties who might then watch, where (0<N<20).
Any reason why YouTube can't do this today? Remember the human element. People don't necessarily study the field of possibilities and then make the optimal choice. Usually, they just pick what is familiar as long as it is good enough. Click onto a YouTube video, then click the pause button, then go cook supper. After you eat, go back and press the play button. To the end user, this is much the same experience as P2P, or programming a PVR to record an interesting program that broadcasts at an awkward time.
I don't pretend to have the answers to this, but I do feel reasonably certain that the success of YouTube is not a fluke, and that we're going to see more, not less, of this sort of thing.
Agreed.
As far as I am concerned the killer application for IP multicast is *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.
You can go compare the relative successes of Yahoo! Finance and YouTube.
Actually, Yahoo! Finance is only one single subscriber to these market data feeds. My company happens to run an IP network supporting global multicast, which delivers the above market data feeds, and many others, to over 10,000 customers in over 50 countries. Market data feeds are not a mass market consumer product but they are a realtime firehose of data that people want to receive right now and not a microsecond later. It is not unusual for our sales team to receive RFPs that specify latency times that are faster than the speed of light. The point is that IP multicast is probably the only way to deliver this data because we cannot afford the additional latency to send packets into a server and back again. I.e. a CDN type of solution won't work. It's not only nice to multicast this data, it is mission critical. People are risking millions of dollars every hour based on the data in these feeds. The way it usually works (pioneered by NYSE I believe) is that they send two copies of every packet through two separate multicast trees. If there is too much time differential between the arrival of the two packets then the service puts up a warning flag so that the traders know their data is stale. Add a few more milliseconds and it shuts down entirely because the data is now entirely useless. When latency is this important, those copies going to multiple subscribers have to be copied in the packet-forwarding device, i.e. router supporting IP multicast. Of course consumer video doesn't have the same strict latency requirements, therefore my opinion that IP multicast is unneeded complexity. Use the best tool for the job. --Michael Dillon
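[Editor's note: the dual-tree "A/B feed" arbitration described in the message above can be sketched roughly as follows. This is an illustrative Python sketch only; the class name, the warn/fail thresholds, and the state labels are all invented for the example and do not reflect any exchange's actual wire protocol.]

```python
# Illustrative thresholds only -- real feeds define their own limits.
WARN_DIFF = 0.002   # seconds of A/B skew before flagging data as stale
FAIL_DIFF = 0.005   # seconds of A/B skew before declaring the feed unusable

class ABFeedArbiter:
    """Compares arrival times of duplicate packets sent down two
    separate multicast trees (the 'A' and 'B' feeds)."""

    def __init__(self):
        self.pending = {}   # sequence number -> arrival time of first copy
        self.state = "ok"

    def on_packet(self, seq, arrival):
        """Record one packet arrival; return the current feed health."""
        first = self.pending.pop(seq, None)
        if first is None:
            # First copy of this sequence number; wait for its twin.
            self.pending[seq] = arrival
            return self.state
        diff = abs(arrival - first)
        if diff > FAIL_DIFF:
            self.state = "down"    # data now considered useless
        elif diff > WARN_DIFF:
            self.state = "stale"   # warn traders the data may be late
        else:
            self.state = "ok"
        return self.state
```

The key point from the message survives even in this toy version: the health decision hinges entirely on the differential between the two multicast deliveries, which is why the replication has to happen in the routers rather than in an intermediate server.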
IP multicast does not help you when you have 1000 subscribers all pulling in 1000 unique streams.
Yes, that's potentially a problem. That doesn't mean that multicast can not be leveraged to handle prerecorded material, but it does suggest that you could really use a TiVo-like device to make best use.
You mean a computer? Like the one that runs file-sharing clients?
Like the one that nobody really wants to watch large quantities of television on? Especially now that it's pretty common to have large, flat screen TV's, and watching TV even on a 24" monitor feels like a throwback to the '80's? How about the one that's shaped like a TiVo and has a built-in remote control, sane operating software, can be readily purchased and set up by a non-techie, and is known to work well? I remember all the fuss about how people would be making phone calls using VoIP and their computers. Yet most of the time, I see VoIP consumers transforming VoIP to legacy POTS, or VoIP hardphones, or stuff like that. I'm going to make a guess and take a stab and say that people are going to prefer to keep their TV's somewhat more TV-like.
Or Squid? Or an NNTP server?
Speaking as someone who's run the largest Squid and news server deployments in this region, I think I can safely say - no. It's certainly fine to note that both Squid and NNTP have elements that deal with transferring large amounts of data, and that fundamentally similar elements could play a role in the distribution model, but I see no serious role for those at the set-top level.
Is video so different from other content? Considering the volume of video that currently traverses P2P networks I really don't see that there is any need for an IP multicast solution except for news feeds and video conferencing.
Wow. Okay. I'll just say, then, that such a position seems a bit naive, and I suspect that broadband networks are going to be crying about the sheer stresses on their networks, when moderate numbers of people begin to upload videos into their TiVo, which then share them with other TiVo's owned by their friends around town, or across an ocean, while also downloading a variety of shows from a dozen off-net sources, etc. I really see the any-to-any situation as being somewhat hard on networks, but if you believe that not to be the case, um, I'm listening, I guess.
What seems more likely is that we'll see an evolution of more specialized offerings,
Yes. The overall trend has been to increasingly split the market into smaller slivers with additional choices being added and older ones still available.
Yes, but that's still a broadcast model. We're talking about an evolution (potentially _r_evolution) of technology where the broadcast model itself is altered.
During the shift to digital broadcasting in the UK, we retained the free-to-air services with more channels than we had on analog. Satellite continued to grow in diversity and now there is even a Freesat service coming online. Cable TV is still there, although now it is usually bundled with broadband Internet as well as telephone service. You can access the Internet over your mobile phone using GPRS or 3G, and wifi is spreading slowly but surely.
Yes.
But one thing that does not change is the number of hours in the day. Every service competes for scarce attention spans,
Yes. However, some things that do change:
1) Broadband speeds continue to increase, making it possible for more content to be transferred
2) Hard drives continue to grow, and the ability to store more, combined with higher bit rates (HD, less artifact, whatever), means that more bits can be transferred to fill the same amount of time
3) Devices such as TiVo are capable of downloading large amounts of material on a speculative basis, even on days where #hrs-tv-watched == 0.
I suspect that this effect may be a bit worse as more diversity appears, because instead of hitting stop during a 30-second YouTube clip, you're now hitting delete 15 seconds into a 30-minute InterneTiVo'd show. I bet I can clear out a few hours worth of not-that-great programming in 5 minutes...
and a more-or-less fixed portion of people's disposable income. Based on this, I don't expect to see any really huge changes.
That's fair enough. That's optimistic (from a network operator's point of view.) I'm afraid that such changes will happen, however.
That may allow some "less popular" channels to come into being.
YouTube et al.
The problem with that is that there's money to be had, and if you let YouTube host your video, it's YouTube getting the juicy ad money. An essential quality of the Internet is the ability to eliminate the middleman, so even if YouTube has invented itself as a new middleman, that's primarily because it is kind of a new thing, and we do not yet have ways for the average user to easily serve video clips a different way. That will almost certainly change.
I happen to like holding up SciFi as an example, because their current operations are significantly different than originally conceived, and they're now producing significant quantities of their own original material.
The cost to film and to edit video content has dropped dramatically over the past decade. The SciFi channel is the tip of a very big iceberg. And what about immigrants? Even 50 years ago, immigrants to the USA joined a bigger melting pot culture and integrated slowly but surely. Nowadays, they have cheap phone calls back home, the same Internet content as the folks back home, and P2P to get the TV shows and movies that people are watching back home. How is any US channel-based system going to handle that diversity and variety?
Well, that's the point I'm making. It isn't, and we're going to see SOMEONE look at this wonderful Internet thingy and see in it a way to "solve" this problem, which is going to turn into an operational nightmare as traffic loads increase, and a larger percentage of users start to either try to use the bandwidth they're being "sold," or actually demand it.
There are almost certainly enough fans out there that you'd see a small surge in viewership if the material was more readily accessible (read that as: automatically downloaded to your TiVo).
Is that so different from P2P video? In any case, the TiVo model is limited to the small amount of content, all commercial, that they can classify so that TiVo downloads the right stuff. P2P allows you to do the classification, but it is still automatically downloaded while you sleep.
I guess I'm saying that I would not expect this to remain this way indefinitely. To be clear, I don't necessarily mean the current TiVo device or company, I'm referring to a TiVo-like device that is your personal video assistant. I'd like to think that the folks over at TiVo would be the ones to leverage this sort of thing, but that's about it. This could come from anywhere. Slingbox comes to mind as one possibility.
I'm envisioning a scenario where we may find that there are a few tens of thousands of PTA meetings each being uploaded routinely onto the home PC's of whoever recorded the local meeting, and then made available to the small number of interested parties who might then watch, where (0<N<20).
Any reason why YouTube can't do this today?
Primarily because I'm looking towards the future, and there are many situations where YouTube isn't going to be the answer. For example, consider the PTA meeting: I'm not sure if YouTube is going to want to be dealing with maybe 10,000 videos that are each an hour or two long which are watched by maybe a handful of people, at however frequently your local PTA meetings get held. Because there's a lot of PTA's. And the meetings can be long. Further, it's a perfect situation where you're likely to be able to keep a portion of the traffic on-net through geolocality effects. Of course, I'm assuming some technology exists, possibly in the upcoming fictional Microsoft Whoopta OS, that makes local publication and serving of video easy to do. If there's a demand, we will probably see it.
Remember the human element. People don't necessarily study the field of possibilities and then make the optimal choice.
That's the argument to discuss this now rather than later.
Usually, they just pick what is familiar as long as it is good enough. Click onto a YouTube video, then click the pause button, then go cook supper. After you eat, go back and press the play button. To the end user, this is much the same experience as P2P, or programming a PVR to record an interesting program that broadcasts at an awkward time.
I would say that it is very much NOT the same experience as programming a PVR. I watch exceedingly little video on the computer, for example. I simply prefer the TV. And if more than one person's going to watch, it *has* to be on the TV (at least here).
I don't pretend to have the answers to this, but I do feel reasonably certain that the success of YouTube is not a fluke, and that we're going to see more, not less, of this sort of thing.
Agreed.
As far as I am concerned the killer application for IP multicast is *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.
You can go compare the relative successes of Yahoo! Finance and YouTube.
Actually, Yahoo! Finance is only one single subscriber to these market data feeds. My company happens to run an IP network supporting global multicast, which delivers the above market data feeds, and many others, to over 10,000 customers in over 50 countries. Market data feeds are not a mass market consumer product but they are a realtime firehose of data that people want to receive right now and not a microsecond later. It is not unusual for our sales team to receive RFPs that specify latency times that are faster than the speed of light. The point is that IP multicast is probably the only way to deliver this data because we cannot afford the additional latency to send packets into a server and back again. I.e. a CDN type of solution won't work.
It's not only nice to multicast this data, it is mission critical. People are risking millions of dollars every hour based on the data in these feeds. The way it usually works (pioneered by NYSE I believe) is that they send two copies of every packet through two separate multicast trees. If there is too much time differential between the arrival of the two packets then the service puts up a warning flag so that the traders know their data is stale. Add a few more milliseconds and it shuts down entirely because the data is now entirely useless. When latency is this important, those copies going to multiple subscribers have to be copied in the packet-forwarding device, i.e. router supporting IP multicast.
Of course consumer video doesn't have the same strict latency requirements, therefore my opinion that IP multicast is unneeded complexity. Use the best tool for the job.
There are lots of things that multicast can be used for, and there's no question that financial data could be useful that way. However, what I'm saying is that this isn't particularly relevant on the public Internet in a general way. The thing that's going to kill networks isn't the presence or absence of the data you're talking about, because as a rule anybody who needs data in the sort of fashion you're talking about is capable of buying sufficient guaranteed network capacity to deal with it. I could just as easily say that the killer application for IP multicast is routing protocols such as OSPF, because that's probably just as relevant (in a different way) as what you're talking about. But both are distractions. What I'm concerned about are things that are going to cause major networks to have difficulties. Given this discussion, this almost certainly requires you to involve circuits where oversubscription is a key component in the product strategy. That probably means residential broadband connections, which are responsible for a huge share of the global Internet's traffic. My uninformed guess would be that there are more of those broadband connections than there are attachments to your global multicast network. Maybe even by an order of magnitude. :-) ;-) Multicast may or may not be the solution to the problem at hand, but from a distribution point of view, multicast and intelligent caching share some qualities that are desirable. To write off multicast as being at least a potential part of the solution, just because the application is less critical than your financial transactions, may be premature. I see a lot of value in having content only arrive on-net once, and multicast could be a way to help that happen. The real problem is that neither your financial transactions nor any meaningful amount of video are able to transit multicast across random parts of the public Internet, which is a bit of a sticking point. ...
JG
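[Editor's note: for readers following the multicast sub-thread, the socket-level mechanics of joining a group are straightforward. The sketch below is illustrative Python; the group address and port are invented for the example.]

```python
import socket
import struct

GROUP = "239.1.2.3"   # arbitrary administratively-scoped example group
PORT = 5004           # arbitrary example port

def make_sender(ttl=1):
    """A multicast sender is just a UDP socket; the TTL bounds how far
    into the network the group traffic may propagate."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, ttl)
    return sock

def make_receiver():
    """Receivers bind the shared port and ask the kernel (via IGMP) to
    join the group; the network then replicates packets toward members,
    so the sender transmits each packet exactly once."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    # struct ip_mreq: multicast group address + local interface (any)
    mreq = struct.pack("4s4s", socket.inet_aton(GROUP),
                       socket.inet_aton("0.0.0.0"))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Usage: make_sender().sendto(b"tick", (GROUP, PORT))
```

This is exactly the property argued over above: the sender's cost is independent of the number of receivers, which is why it fits a firehose like market data, and why it does little for 1000 subscribers pulling 1000 unique streams.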
On 22 Apr 2008, at 12:47, Joe Greco wrote:
You mean a computer? Like the one that runs file-sharing clients?
Like the one that nobody really wants to watch large quantities of television on?
Perhaps more like the mac mini that's plugged into the big plasma screen in the living room? Or one of the many stereo-component-styled "media" PCs sold for the same purpose, perhaps even running Windows MCE, a commercial operating system sold precisely because people want to hook their computers up to televisions? Or the old-school hacked XBox running XBMC, pulling video over SMB from the PC in the other room? Or the XBox 360 which can play media from the home-user NAS in the back room? The one with the bittorrent client on it? :-) Joe
On 4/22/08, Joe Abley <jabley@ca.afilias.info> wrote:
On 22 Apr 2008, at 12:47, Joe Greco wrote:
You mean a computer? Like the one that runs file-sharing clients?
Like the one that nobody really wants to watch large quantities of television on?
Perhaps more like the mac mini that's plugged into the big plasma screen in the living room? Or one of the many stereo-component-styled "media" PCs sold for the same purpose, perhaps even running Windows MCE, a commercial operating system sold precisely because people want to hook their computers up to televisions?
Or the old-school hacked XBox running XBMC, pulling video over SMB from the PC in the other room?
Or the XBox 360 which can play media from the home-user NAS in the back room? The one with the bittorrent client on it? :-)
Don't forget the laptop or thin desktop hooked up to the 24-60 inch monitor in the bedroom/living room to watch Netflix Watch It Now content (on which there is no limit to how much a customer can view). -brandon
The OSCAR is the first H.264 encoder appliance designed by HaiVision specifically for QuickTime environments. It natively supports the RTSP streaming media protocol. The OSCAR can stream directly to QuickTime supporting up to full D1 resolution (full standard definition resolution or 720 x 480 NTSC / 576 PAL) at video bit rates up to 1.5 Mbps. The OSCAR supports either multicast or unicast RTSP sessions. With either, up to 10 separate destination streams can be generated by a single OSCAR encoder (more at lower bit rates). So, on a college campus for example, this simple, compact, rugged appliance can be placed virtually anywhere and with a simple network connection can stream video to any QuickTime client on the local network or over the WAN. If more than 10 QuickTime clients need to view or access the video, the OSCAR can be directed to a QuickTime Streaming Server which can typically host well over 1000 clients.
-----Original Message----- From: Brandon Galbraith [mailto:brandon.galbraith@gmail.com] Sent: Tuesday, April 22, 2008 1:51 PM To: Joe Abley Cc: nanog@nanog.org; Joe Greco Subject: Re: [Nanog] ATT VP: Internet to hit capacity by 2010
On 4/22/08, Joe Abley <jabley@ca.afilias.info> wrote:
On 22 Apr 2008, at 12:47, Joe Greco wrote:
You mean a computer? Like the one that runs file-sharing clients?
Like the one that nobody really wants to watch large quantities of television on?
Perhaps more like the mac mini that's plugged into the big plasma screen in the living room? Or one of the many stereo-component-styled "media" PCs sold for the same purpose, perhaps even running Windows MCE, a commercial operating system sold precisely because people want to hook their computers up to televisions?
Or the old-school hacked XBox running XBMC, pulling video over SMB from the PC in the other room?
Or the XBox 360 which can play media from the home-user NAS in the back room? The one with the bittorrent client on it? :-)
Don't forget the laptop or thin desktop hooked up to the 24-60 inch monitor in the bedroom/living room to watch Netflix Watch It Now content (on which there is no limit to how much a customer can view).
-brandon
.......is the first H.264 encoder ...... designed by .... specifically for ....... environments. It natively supports the RTSP streaming media protocol. ........ can stream directly to .....
hi marc so your "oskar" can RTSP multicast stream over IPv6 and QuickTime can not, or was this just an ad? cheers Marc -- Les enfants teribbles - research and deployment Marc Manthey - Hildeboldplatz 1a D - 50672 Köln - Germany Tel.:0049-221-3558032 Mobil:0049-1577-3329231 jabber :marc@kgraff.net blog : http://www.let.de ipv6 http://www.ipsix.org Klarmachen zum Ändern! http://www.piratenpartei-koeln.de/
Just an ad used to illustrate the low cost and ease of use. The fact that it's quicktime also made me realize it's also ipods, iphones/wifi, and that Apple has web libraries ready for web site development on their darwin boxes. Also, I would imagine this device could easily be cross connected and multicasted into each access router so that the only bandwidth used is that bandwidth being paid for by customer or QoS unicast streams feeding an MCU. Rambling now, but happy to answer your question.
-----Original Message----- From: Marc Manthey [mailto:marc@let.de] Sent: Tuesday, April 22, 2008 9:07 PM To: nanog@nanog.org Subject: Re: [Nanog] ATT VP: Internet to hit capacity by 2010
.......is the first H.264 encoder ...... designed by .... specifically for ....... environments. It natively supports the RTSP streaming media protocol. ........ can stream directly to .....
hi marc so your " oskar" can rtsp multicast stream over ipv6 and quicktime not , or was this just an ad ?
cheers
Marc
Here is a spec sheet : <http://goamt.radicalwebs.com/images/products/ds_OSCAR_1106.pdf> Regards Marshall On Apr 23, 2008, at 10:08 AM, Williams, Marc wrote:
Just an ad used to illustrate the low cost and ease of use. The fact that it's quicktime also made me realize it's also ipods, iphones/wifi, and that Apple has web libraries ready for web site development on their darwin boxes. Also, I would imagine this device could easily be cross connected and multicasted into each access router so that the only bandwidth used is that bandwidth being paid for by customer or QoS unicast streams feeding an MCU. Rambling now, but happy to answer your question.
-----Original Message----- From: Marc Manthey [mailto:marc@let.de] Sent: Tuesday, April 22, 2008 9:07 PM To: nanog@nanog.org Subject: Re: [Nanog] ATT VP: Internet to hit capacity by 2010
.......is the first H.264 encoder ...... designed by .... specifically for ....... environments. It natively supports the RTSP streaming media protocol. ........ can stream directly to .....
hi marc so your " oskar" can rtsp multicast stream over ipv6 and quicktime not , or was this just an ad ?
cheers
Marc
On 23.04.2008 at 16:08, Williams, Marc wrote:
Just an ad
hi marc.... cool. so i have 3 computers that do not do the job and i have not much money, can you send me one of those ;) ? Or cheapest_beta_tester_non_commercial_offer you can make? I accept offlist conversation. thanks and sorry for my ramblings greetings from germany Marc
-----Original Message----- From: Marc Manthey [mailto:marc@let.de] Sent: Tuesday, April 22, 2008 9:07 PM To: nanog@nanog.org Subject: Re: [Nanog] ATT VP: Internet to hit capacity by 2010
.......is the first H.264 encoder ...... designed by .... specifically for ....... environments. It natively supports the RTSP streaming media protocol. ........ can stream directly to .....
hi marc so your " oskar" can rtsp multicast stream over ipv6 and quicktime not , or was this just an ad ?
cheers
Marc
On 22 Apr 2008, at 12:47, Joe Greco wrote:
You mean a computer? Like the one that runs file-sharing clients?
Like the one that nobody really wants to watch large quantities of television on?
Perhaps more like the mac mini that's plugged into the big plasma screen in the living room? Or one of the many stereo-component-styled "media" PCs sold for the same purpose, perhaps even running Windows MCE, a commercial operating system sold precisely because people want to hook their computers up to televisions?
Or the old-school hacked XBox running XBMC, pulling video over SMB from the PC in the other room?
Or the XBox 360 which can play media from the home-user NAS in the back room? The one with the bittorrent client on it? :-)
Pretty much. People have a fairly clear bias against watching anything on your conventional PC. This probably has something to do with the way the display ergonomics work; my best guess is that most people have their PC's set up in a corner with a chair and a screen suitable for work at a distance of a few feet. As a result, there's usually a clear delineation between devices that are used as general purpose computers, and devices that are used as specialized media display devices. The "Mac Mini" may be an example of a device that can be used either way, but do you know of many people that use it as a computer (and do all their normal computing tasks) while it's hooked up to a large TV? Even Apple acknowledged the legitimacy of this market by releasing AppleTV. People generally do not want to hook their _computer_ up to televisions, but rather they want to hook _a_ computer up to television so that they're able to do things with their TV that an off-the-shelf product won't do for them. That's an important distinction, and all of the examples you've provided seem to be examples of the latter, rather than the former, which is what I was talking about originally. If you want to discuss the latter, then we've got to include a large field of other devices, ironically including the TiVo, which are actually programmable computers that have been designed for specific media tasks, and are theoretically reprogrammable to support a wide variety of interesting possibilities, and there we have the entry into the avalanche of troubling operational issues that could result from someone releasing software that distributes large amounts of content over the Internet, and ... oh, my bad, that brings us back to what we were talking about. ... JG
You mean a computer? Like the one that runs file-sharing clients?
Like the one that nobody really wants to watch large quantities of television on? Especially now that it's pretty common to have large, flat-screen TVs, and watching TV even on a 24" monitor feels like a throwback to the '80s?
How about the one that's shaped like a TiVo and has a built-in remote control, sane operating software, can be readily purchased and set up by a non-techie, and is known to work well?
Maybe I have a warped sense of how normal people set up their home networks, but I do notice all kinds of network storage for sale in local computer shops, and various multimedia player devices that connect to a TV screen, network, etc.

I can understand why a TiVo collects content over the air, because it has TV receivers built into it. My PVR does much the same thing. But when it comes to collecting content from the Internet, it seems easier to just let the file server do that job. Or run the nice easy software on your home PC that allows you to search the web for torrents and just click on the ones you want to download.

Let's face it, TiVo may have a lot of mindshare, in that people constantly talk about the thing as if it were some kind of magic, but it hardly has the same kind of market share as the iPod. The functions that the TiVo carries out are software, and software is rather malleable. The functions of the various devices can be mixed and matched in various ways. We can't predict which combos will prevail, but we can make a pretty close guess as to the functionality of the whole system.
I remember all the fuss about how people would be making phone calls using VoIP and their computers. Yet most of the time, I see VoIP consumers transforming VoIP to legacy POTS, or VoIP hardphones, or stuff like that.
Cisco sells computers that look like a telephone set but have an Ethernet jack out the back. Whether you use the Gizmo Project software on a PC or one of these Cisco devices, you are still making VoIP calls on a computer. The appearance of a telephone is not terribly relevant.

My mobile phone is a computer with Python installed on it to run a Russian-English dictionary application, but it also includes a two-way radio transceiver that is programmed to talk to a local cell transceiver and behave like a telephone. But it is still a computer at heart.

Anyone remember when a switch was a switch and a router was a router? Now both of them are backplanes with computers and port interfaces attached.
Wow. Okay. I'll just say, then, that such a position seems a bit naive, and I suspect that broadband networks are going to be crying about the sheer stresses on their networks when moderate numbers of people begin to upload videos into their TiVo, which then shares them with other TiVos owned by their friends around town, or across an ocean, while also downloading a variety of shows from a dozen off-net sources, etc.
Where have you been!? You have just described the P2P traffic that ISPs and other network operators have been complaining about since the dawn of this century. TiVo is just one of a thousand brand names for "home computer".
Yes. The overall trend has been to increasingly split the market into smaller slivers with additional choices being added and older ones still available.
Yes, but that's still a broadcast model. We're talking about an evolution (potentially _r_evolution) of technology where the broadcast model itself is altered.
I would say that splitting the market for content into many small slivers (a forest of shards) is pretty much a revolution. Whatever technology is used to deliver this forest of shards is irrelevant because the revolution is in the creation of this information superhighway with thousands of channels. And even though the concept predated the exponential growth of the Internet let's not forget that the web has been there and done that.
2) Hard drives continue to grow, and the ability to store more, combined with higher bit rates (HD, less artifact, whatever) means that more bits can be transferred to fill the same amount of time
This is key. Any scenario that does not expect the end user to amass a huge library of content for later viewing, is missing an important component. And if that content library is encrypted or locked in some way so that it is married to one brand name device, or pay-per-view systems, then the majority of the market will pass it by.
and a more-or-less fixed portion of people's disposable income. Based on this, I don't expect to see any really huge changes.
That's fair enough. That's optimistic (from a network operator's point of view.) I'm afraid that such changes will happen, however.
Bottom line is that our networks must be paid for. If consumers want to use more of our financial investment (capital and opex) then we will be forced to raise prices up to a level where it limits demand to what we can actually deliver. Most networks can live with a step up in consumption if it levels off, because although they may lose money at first, if consumption dips and levels then they can make it back over time. If the content senders do not want this dipping and levelling off, then they will have to foot the bill for the network capacity. And if they want to recover that cost from the end users, then they will also run into that limit in the amount of money people are able to spend on entertainment per month.

Broadcast models were built based on a delivery system that scaled up as big as you want with only capex. But an IP network requires a lot of opex to maintain any level of capex investment. There ain't no free lunch.
The problem with that is that there's money to be had, and if you let YouTube host your video, it's YouTube getting the juicy ad money.
The only difference from 1965 network TV is that in 1965, the networks had limited sources capable of producing content at a reasonable cost. But today, content production is cheap, and competition has driven the cost of content down to zero. Only the middleman selling ads has a business model any more. Network operators could fill that middleman role but most of them are still stuck in the telco/ISP mindset.
Well, that's the point I'm making. It isn't, and we're going to see SOMEONE look at this wonderful Internet thingy and see in it a way to "solve" this problem, which is going to turn into an operational nightmare as traffic loads increase, and a larger percentage of users start to either try to use the bandwidth they're being "sold," or actually demand it.
If this really happens, then some companies will fix their marketing and sales contracts, others will go into Chapter 11. But at the end of the day, as with the telecom collapse, the networks keep rolling on even if the management changes.
For example, consider the PTA meeting: I'm not sure YouTube is going to want to be dealing with maybe 10,000 videos that are each an hour or two long, which are watched by maybe a handful of people, at however frequently your local PTA meetings get held. Because there are a lot of PTAs. And the meetings can be long. Further, it's a perfect situation where you're likely to be able to keep a portion of the traffic on-net through geolocality effects.
You're right. People are already building YouTube clones or adding YouTube-like video libraries to their websites. This software, combined with lots of small distributed data centers like Amazon EC2, is likely where local content will go. Again, one wonders why Google and Amazon and Yahoo are inventing this stuff rather than ISPs. Probably because, after the wave of acquisitions by telcos, the telcos neglected the data center half of the ISP equation. In other words, there are historical reasons based on ignorance, but no fundamental barrier to large carriers offering something like Hadoop, EC2, or AppEngine.
I would say that it is very much NOT the same experience as programming a PVR. I watch exceedingly little video on the computer, for example. I simply prefer the TV.
Maybe PVR doesn't mean the same stateside as here in the UK. My PVR is a box with two digital TV receivers and 180 gig hard drive that connects to a TV screen. All interaction is through the remote and the TV. The difference between this and P2P video is only the software and the screen we watch it on. By the way, my 17-month old loves YouTube videos. There may be a generational thing coming down the road similar to the way young people have ditched email in favour of IM.
There are lots of things that multicast can be used for, and there's no question that financial data could be useful that way. However, what I'm saying is that this isn't particularly relevant on the public Internet in a general way.
If it were not for these market data feeds, I doubt that IP multicast would be as widely supported by routers.
The real problem is that neither your financial transactions nor any meaningful amount of video are able to transit multicast across random parts of the public Internet, which is a bit of a sticking point.
Then there is P2MP (Point to Multi-Point) MPLS...

--Michael Dillon
You mean a computer? Like the one that runs file-sharing clients?
Like the one that nobody really wants to watch large quantities of television on? Especially now that it's pretty common to have large, flat-screen TVs, and watching TV even on a 24" monitor feels like a throwback to the '80s?
How about the one that's shaped like a TiVo and has a built-in remote control, sane operating software, can be readily purchased and set up by a non-techie, and is known to work well?
Maybe I have a warped sense of how normal people set up their home networks but I do notice all kinds of network storage for sale in local computer shops, and various multi-media player devices that connect to a TV screen, network, etc.
Yes, but there's no real standard. It's mostly hodgepodge-based solutions that allow techie types to cobble together some random collection of hardware to solve some particular subset of problems. What the public wants, though, is for someone to solve this problem and build it for them. As an example, consider that it's a lot more popular for home users to source their DVR from their cable company than it is for them to get a CableCARD receiver card for their PC and try to roll a MythTV box for themselves.
I can understand why a TiVo collects content over the air, because it has TV receivers built into it. My PVR does much the same thing. But when it comes to collecting content from the Internet, it seems easier to just let the file server do that job. Or run the nice easy software on your home PC that allows you to search the web for torrents and just click on the ones you want to download.
Let's face it, TiVo may have a lot of mindshare, in that people constantly talk about the thing as if it were some kind of magic, but it hardly has the same kind of market share as the iPod. The functions that the TiVo carries out are software, and software is rather malleable. The functions of the various devices can be mixed and matched in various ways. We can't predict which combos will prevail, but we can make a pretty close guess as to the functionality of the whole system.
The magic of TiVo isn't that it records video. The magic bit is more abstract: someone made a device that actually does what the average consumer _wants_, rather than simply acting as a generic DVR. You actually said it yourself above, "it just seems easier" - but then you got sidetracked by the loveliness of your PC. The magic of a TiVo-like device is that end users perceive it as easier. The solution that doesn't require them to learn what torrents are, or what file sharing is, or how to hook a computer up to a TV, because some TiVo-like device took all of that and internalized it and SOLVED the problem, and solved it not only for them but for a million other TV viewers at the same time - that's the solution that's going to be truly successful. Not your homegrown DVR.
I remember all the fuss about how people would be making phone calls using VoIP and their computers. Yet most of the time, I see VoIP consumers transforming VoIP to legacy POTS, or VoIP hardphones, or stuff like that.
Cisco sells computers that look like a telephone set but have an Ethernet jack out the back. Whether you use the Gizmo Project software on a PC or one of these Cisco devices, you are still making VoIP calls on a computer. The appearance of a telephone is not terribly relevant.

My mobile phone is a computer with Python installed on it to run a Russian-English dictionary application, but it also includes a two-way radio transceiver that is programmed to talk to a local cell transceiver and behave like a telephone. But it is still a computer at heart.
The hell it is. It's still fundamentally a phone. That you can reprogram it to do other things is technologically interesting to a small number of geeks, but were you to ask the average person "what is this," they'd still see it as a phone, and see its primary job as making phone calls. Further, that does not even begin to argue against what I was saying, which is that most people are NOT making phone calls using VoIP from their computers.
Anyone remember when a switch was a switch and a router was a router? Now both of them are backplanes with computers and port interfaces attached.
Yes. There's a certain amount of sense to that, at least once you needed to be able to process things at wirespeed.
Wow. Okay. I'll just say, then, that such a position seems a bit naive, and I suspect that broadband networks are going to be crying about the sheer stresses on their networks when moderate numbers of people begin to upload videos into their TiVo, which then shares them with other TiVos owned by their friends around town, or across an ocean, while also downloading a variety of shows from a dozen off-net sources, etc.
Where have you been!?
I've been right here, serving high bandwidth content for many years.
You have just described the P2P traffic that ISPs and other network operators have been complaining about since the dawn of this century.
No. I've just described something much worse, because there is the potential for so much more volume. TiVo implies that the device can do speculative fetch, not just the on-demand sort of things most current P2P networks do.
TiVo is just one of a thousand brand names for "home computer".
If you want to define "home computer" that way. Personally, while my light switches contain microprocessors, and may be reprogrammable, that does not mean that I view them as computers. I don't think I can run X11 on my light switch (even though it's got several LEDs). I don't think that it's a good idea to try to run FreeBSD on my security system. I don't think that I'll be able to run OpenOffice on my Cisco 7960Gs. I'm pretty sure that my thermostat isn't good for running Mahjongg. And the TiVo probably isn't going to run Internet Explorer anytime soon.

There are microprocessors all over the place. Possessing a microprocessor, and even being able to affect the programming that runs on a uP, doesn't make every such device a home computer.

One of these days, we're going to wake up and discover that someone (and I guess it's got to be someone more persuasive than Apple with their AppleTV doodad) has created some device that is compelling to users. I do not care that it has a microprocessor inside, or even that it may be programmable. The thing is likely to be a variation on a set-top box, is likely to have TiVo-like capabilities, and I'm worried about what's going to happen to IP networks.
Yes. The overall trend has been to increasingly split the market into smaller slivers with additional choices being added and older ones still available.
Yes, but that's still a broadcast model. We're talking about an evolution (potentially _r_evolution) of technology where the broadcast model itself is altered.
I would say that splitting the market for content into many small slivers (a forest of shards) is pretty much a revolution.
Agreed :-) I'm not sure it'll happen all at once, though.
Whatever technology is used to deliver this forest of shards is irrelevant because the revolution is in the creation of this information superhighway with thousands of channels. And even though the concept predated the exponential growth of the Internet let's not forget that the web has been there and done that.
Ok, I'll accept that. Except I'd like to note that the technology that I have seen that could enable this is probably the Internet; most other methods of transmission are substantially more restricted (i.e. it's pretty difficult for me to go and get a satellite uplink, but pretty much even the most lowly DSL customer probably has a 384k upstream).
2) Hard drives continue to grow, and the ability to store more, combined with higher bit rates (HD, less artifact, whatever) means that more bits can be transferred to fill the same amount of time
This is key. Any scenario that does not expect the end user to amass a huge library of content for later viewing, is missing an important component. And if that content library is encrypted or locked in some way so that it is married to one brand name device, or pay-per-view systems, then the majority of the market will pass it by.
I ABSOLUTELY AGREE... that I wish the world worked that way. ( :-) )
and a more-or-less fixed portion of people's disposable income. Based on this, I don't expect to see any really huge changes.
That's fair enough. That's optimistic (from a network operator's point of view.) I'm afraid that such changes will happen, however.
Bottom line is that our networks must be paid for. If consumers want to use more of our financial investment (capital and opex) then we will be forced to raise prices up to a level where it limits demand to what we can actually deliver. Most networks can live with a step up in consumption if it levels off because although they may lose money at first, if consumption dips and levels then they can make it back over time. If the content senders do not want this dipping and levelling off, then they will have to foot the bill for the network capacity.
That's kind of the funniest thing I've seen today; it sounds so much like Ed Whitacre. I've somewhat deliberately avoided the model of having some large-channel-like "content senders" enter this discussion, because I am guessing that there will be a large number of people who may simply use their existing - paid for - broadband connections. That's the PTA example, and probably the "Star Trek: Hidden Frontier" example, and then for good measure, throw in everyone who will be self-publishing the content that (looking back on today) used to get served on YouTube. Then Ed learns that the people he'd like to charge for the privilege of using "his" pipes are already paying for pipes.
And if they want to recover that cost from the end users then they will also run into that limit in the amount of money people are able to spend on entertainment per month.
Broadcast models were built based on a delivery system that scaled up as big as you want with only capex. But an IP network requires a lot of opex to maintain any level of capex investment. There ain't no free lunch.
I certainly agree, that's why this discussion is relevant.
The problem with that is that there's money to be had, and if you let YouTube host your video, it's YouTube getting the juicy ad money.
The only difference from 1965 network TV is that in 1965, the networks had limited sources capable of producing content at a reasonable cost. But today, content production is cheap, and competition has driven the cost of content down to zero.
Right, that's a "problem" I'm seeing too.
Only the middleman selling ads has a business model any more. Network operators could fill that middleman role but most of them are still stuck in the telco/ISP mindset.
So, consider what would happen if that were to be something that you could self-manage, outsourcing the hard work to an advertising provider. Call it maybe Google AdVideos. :-) Host the video on your TiVo, or your PC, and take advantage of your existing bandwidth. (There are obvious non-self-hosted models already available, I'm not focusing on them, but they would work too)
Well, that's the point I'm making. It isn't, and we're going to see SOMEONE look at this wonderful Internet thingy and see in it a way to "solve" this problem, which is going to turn into an operational nightmare as traffic loads increase, and a larger percentage of users start to either try to use the bandwidth they're being "sold," or actually demand it.
If this really happens, then some companies will fix their marketing and sales contracts, others will go into Chapter 11. But at the end of the day, as with the telecom collapse, the networks keep rolling on even if the management changes.
I would think that has some operational aspects that are worth talking about.
For example, consider the PTA meeting: I'm not sure YouTube is going to want to be dealing with maybe 10,000 videos that are each an hour or two long, which are watched by maybe a handful of people, at however frequently your local PTA meetings get held. Because there are a lot of PTAs. And the meetings can be long. Further, it's a perfect situation where you're likely to be able to keep a portion of the traffic on-net through geolocality effects.
You're right. People are already building YouTube clones or adding YouTube-like video libraries to their websites. This software, combined with lots of small distributed data centers like Amazon EC2, is likely where local content will go. Again, one wonders why Google and Amazon and Yahoo are inventing this stuff rather than ISPs. Probably because, after the wave of acquisitions by telcos, the telcos neglected the data center half of the ISP equation. In other words, there are historical reasons based on ignorance, but no fundamental barrier to large carriers offering something like Hadoop, EC2, or AppEngine.
That's true, but it's also quite possible that we'll see it decentralize further. Why should I pay someone to host content if I could just share it from my PC... I'm not saying that I _want_ Microsoft to wake up and realize that it has a path to strike at some portions of Google, et al., by changing the very nature of Internet content distribution, but it's a significant possibility. That P2P networks work as well as they do says gobs about the potential.
I would say that it is very much NOT the same experience as programming a PVR. I watch exceedingly little video on the computer, for example. I simply prefer the TV.
Maybe PVR doesn't mean the same stateside as here in the UK. My PVR is a box with two digital TV receivers and 180 gig hard drive that connects to a TV screen. All interaction is through the remote and the TV.
Then it's part of your TV system, not really a personal computer.
The difference between this and P2P video is only the software and the screen we watch it on. By the way, my 17-month old loves YouTube videos. There may be a generational thing coming down the road similar to the way young people have ditched email in favour of IM.
That's possible, but there are still some display ergonomics issues with watching things on a computer. AppleTV is perfectly capable of downloading YouTube and displaying it on a TV; this is not at issue. iPhones are _also_ capable of it, but that does not mean that you are going to want to watch hour-long TV shows on your iPhone with the rest of your family... that's where having a large TV set, surrounded by some furniture that people can relax on, comes in.

In any case, the point is still that I think there will be a serious problem if and when someone comes up with a TiVo-like device that implements what I like to refer to as InterneTiVo. That all the necessary technology to implement this is available TODAY is completely irrelevant; it is going to take someone taking all the technical bits, figuring out how to glue it all together in a usable way, packaging it up to hide the gory details, and then selling it as a set-top box for $cheap in the same way that TiVo did.

When TiVo did that, not only did they make "DVR" a practical reality for the average consumer, but they also actually managed to succeed at a more abstract level - the device they designed wasn't just capable of recording Channel 22 from 8:00PM to 9:00PM every Wednesday night, but was actually capable of analyzing the broadcast schedule, picking up shows at whatever time they were available, rescheduling around conflicts, and even looking for things that were similar, that a user might like. A TiVo isn't a "DVR" (in the sense of the relatively poor capabilities of most of the devices that bear that tag) so much as it is a personal video assistant.

So what I'm thinking of is a device that is doing the equivalent of being a "personal video assistant" on the Internet. And I believe it is coming. Something that's capable of searching out and speculatively downloading the things it thinks you might be interested in. Not some techie's cobbled together PC with BitTorrent and HDMI outputs.
An actual set-top box that the average user can use.
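The "personal video assistant" behavior described above can be sketched in a few lines: score candidate programs against a viewer's interests and speculatively fetch the best matches that fit a storage budget. The scoring scheme, the sample catalog, and the field names here are all invented for illustration, not a description of any real device:

```python
# Sketch of speculative prefetch for a hypothetical "personal video
# assistant": rank programs by a naive interest score, then fetch the
# best-scoring ones that fit within the local storage budget.

def score(program, interests):
    """Naive interest score: how many of the viewer's tags match."""
    return len(set(program["tags"]) & interests)

def pick_prefetch(candidates, interests, budget_gb):
    """Pick programs to prefetch, best-scoring first, within budget."""
    chosen, used = [], 0.0
    ranked = sorted(candidates, key=lambda p: score(p, interests), reverse=True)
    for p in ranked:
        if score(p, interests) > 0 and used + p["size_gb"] <= budget_gb:
            chosen.append(p["title"])
            used += p["size_gb"]
    return chosen

catalog = [
    {"title": "PTA Meeting 4/22", "tags": ["local", "education"], "size_gb": 1.5},
    {"title": "Sci-Fi Fan Series", "tags": ["scifi", "drama"], "size_gb": 2.0},
    {"title": "Cooking Hour", "tags": ["food"], "size_gb": 1.0},
]
print(pick_prefetch(catalog, {"scifi", "local"}, budget_gb=4.0))
```

Every prefetched title in that list is a download that happens whether or not anyone ever watches it, which is exactly the traffic pattern the rest of this thread worries about.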
There are lots of things that multicast can be used for, and there's no question that financial data could be useful that way. However, what I'm saying is that this isn't particularly relevant on the public Internet in a general way.
If it were not for these market data feeds, I doubt that IP multicast would be as widely supported by routers.
If it weren't for the Internet, I doubt that IP would be as widely supported by routers. :-) Something always drives technology.

The hardware specifics of this are getting a bit off-topic, at least for this list. Do we agree that there's a potential model in the future where video may be speculatively fetched off the Internet and then stored for possible viewing, and if so, can we refocus a bit on that?

... JG
If the content senders do not want this dipping and levelling off, then they will have to foot the bill for the network capacity.
That's kind of the funniest thing I've seen today, it sounds so much like an Ed Whitacre.
Then Ed learns that the people he'd like to charge for the privilege of using "his" pipes are already paying for pipes.
If they really were paying for pipes, there would be no issue. The reason there is an issue is because network operators have been assuming that consumers, and content senders, would not use 100% of the access link capacity through the ISP's core network. When you assume any kind of overbooking, you are taking the risk that you have underpriced the service. The ideas people are talking about, relating to pumping lots of video to every end user, are fundamentally at odds with this overbooking model. The risk level has changed from one in 10,000 to one in ten or one in five.
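The change in those odds is easy to put into numbers. A back-of-the-envelope sketch (all figures below are invented for illustration, not any real operator's numbers):

```python
# Illustrative only: how an overbooking (contention) assumption breaks
# down when the fraction of simultaneously active subscribers rises.

def core_load_mbps(subscribers, access_mbps, active_fraction):
    """Expected concurrent demand on the core, in Mbit/s."""
    return subscribers * access_mbps * active_fraction

subs = 10_000   # subscribers behind one aggregation point (assumed)
access = 8.0    # Mbit/s of access capacity sold to each subscriber (assumed)

# Classic overbooking: roughly 1 in 10,000 subscribers busy at once.
light = core_load_mbps(subs, access, 1 / 10_000)

# Always-on video pushes that toward 1 in 5.
heavy = core_load_mbps(subs, access, 1 / 5)

print(f"light usage: {light:.0f} Mbit/s of core capacity needed")
print(f"video-heavy: {heavy:.0f} Mbit/s of core capacity needed")
```

Same subscribers, same tariffs, a couple of thousand times more core capacity; that is the underpricing risk in one line.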
But today, content production is cheap, and competition has driven the cost of content down to zero.
Right, that's a "problem" I'm seeing too.
Unfortunately, the content owners still think that content is king and that they are sitting on a gold mine. They fail to see that they are only raking in revenues because they spend an awful lot of money on marketing their content. And the market is now so diverse (YouTube, indie bands, immigrant communities) that nobody can get anywhere close to 100% share. The long tail seems to be getting a bigger share of the overall market.
Host the video on your TiVo, or your PC, and take advantage of your existing bandwidth. (There are obvious non-self-hosted models already available, I'm not focusing on them, but they would work too)
Not a bad idea if the asymmetry in ADSL is not too great. But this all goes away if we really do get the kind of distributed data centers that I envision, where most business premises convert their machine rooms into generic compute/storage arrays.

I should point out that the enterprise world is moving this way, not just Google/Amazon/Yahoo. For instance, many companies are moving applications onto virtual machines that are hosted on relatively generic compute arrays, with storage all in SANs. VMware has a big chunk of this market, but Xen-based solutions, with their ability to migrate running virtual machines, are also in use. And since a lot of enterprise software is built with Java, clustering software like Terracotta makes it possible to build a compute array with several JVMs per core and scale applications with a lot less fuss than traditional cluster operating systems.

Since most ISPs are now owned by telcos, and since most telcos have lots of strategically located buildings with empty space caused by physical shrinkage of switching equipment, you would think that everybody on this list would be thinking about how to integrate all these data center pods into their networks.
So what I'm thinking of is a device that is doing the equivalent of being a "personal video assistant" on the Internet. And I believe it is coming. Something that's capable of searching out and speculatively downloading the things it thinks you might be interested in. Not some techie's cobbled together PC with BitTorrent and HDMI outputs.
Speculative downloading is the key here, and I believe that cobbled together boxes will end up doing the same thing. However, this means that any given content file will be going to a much larger number of endpoints, which is something that P2P handles quite well. P2P software is a form of multicast, as is a CDN (Content Delivery Network) like Akamai. Just because IP multicast is built into the routers does not make it the best way to multicast content. Given that widespread IP multicast will *NOT* happen without ISP investment, and that it potentially impacts every router in the network, I think it is at a disadvantage compared with P2P, or with systems which rely on a few strategically placed middleboxes, such as caching proxies.
The hardware specifics of this are getting a bit off-topic, at least for this list. Do we agree that there's a potential model in the future where video may be speculatively fetched off the Internet and then stored for possible viewing, and if so, can we refocus a bit on that?
I can only see this speculative fetching happening if it is properly implemented to minimize its impact on the network. The idea of millions of unicast streams or FTP downloads in one big exaflood will kill speculative fetching. If the content senders create an exaflood, then the audience will not get the kind of experience that they expect, and will go elsewhere.

We had this experience recently in the UK when they opened a new terminal at Heathrow airport and British Airways moved operations to T5 overnight. The exaflood of luggage was too much for the system, and it has taken weeks to get to a level of service that people still consider "bad service" but bearable. They had so much misplaced luggage that they sent many truckloads of it to Italy to be sorted and returned to the owners. One of my colleagues claims that the only reason the terminal is now halfway functional is that many travellers are afraid to take any luggage at all except for carry-on. So far two executives of the airline have been sacked, and the government is being lobbied to break the airport operator monopoly so that at least one of London's two major airports is run by a different company.

The point is that only the most stupid, braindead content provider executive would unleash something like that upon their company by creating an exaflood. Personally, I think the optimal solution is a form of P2P that is based on published standards, with open source implementations, and relies on a topology guru inside each ISP's network to inject traffic policy information into the system.

--Michael Dillon
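The "topology guru" idea above can be sketched as an ISP-published preference map that a P2P client uses to rank candidate peers. The policy format and the sample prefixes here are invented for illustration; real-world efforts in this space included P4P and, later, the IETF's ALTO work:

```python
# Sketch of ISP-guided peer selection: the ISP publishes per-prefix
# costs, and the client ranks candidate peers so on-net peers are
# tried first. Policy format and prefixes are hypothetical.

import ipaddress

# Hypothetical ISP-injected policy: prefix -> cost (lower = preferred)
ISP_POLICY = {
    "192.0.2.0/24": 0,      # same POP
    "198.51.100.0/24": 10,  # on-net, different region
}
OFF_NET_COST = 100          # anything not covered by the policy

def peer_cost(peer_ip: str) -> int:
    addr = ipaddress.ip_address(peer_ip)
    costs = [cost for prefix, cost in ISP_POLICY.items()
             if addr in ipaddress.ip_network(prefix)]
    return min(costs) if costs else OFF_NET_COST

def rank_peers(peers):
    """Sort candidate peers so the cheapest (most local) come first."""
    return sorted(peers, key=peer_cost)

print(rank_peers(["203.0.113.7", "192.0.2.42", "198.51.100.9"]))
```

The point of the sketch is that the policy lives with the ISP and the mechanism lives with the client; neither side has to trust the other with anything more than a cost map.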
P2P isn't the only way to deliver content overnight; content could also be delivered via multicast overnight. http://www.intercast.com/Eng/Index.asp http://kazam.com/Eng/About/About.jsp On Apr 22, 2008, at 5:33 AM, <michael.dillon@bt.com> wrote:
I think you're too high there! MPEG2 SD is around 4-6 Mbps, MPEG4 SD is around 2-4 Mbps, and MPEG4 HD is anywhere from 8 to 20 Mbps, depending on how much wow factor the broadcaster is trying to give.
Nope, ATSC is 19 (more accurately 19.28) megabits per second.
So why would anyone plug an ATSC feed directly into the Internet? Are there any devices that can play it other than a TV set? Why wouldn't a video services company transcode it to MPEG4 and transmit that?
I can see that some cable/DSL companies might transmit ATSC to subscribers but they would also operate local receivers so that the traffic never touches their core. Rather like what a cable company does today with TV receivers in their head ends.
All this talk of exafloods seems to ignore the basic economics of IP networks. No ISP is going to allow subscribers to pull in 8 gigs per day of video stream. And no broadcaster is going to pay for the bandwidth needed to pump out all those ATSC streams. And nobody is going to stick IP multicast (and multicast peering) in the core just to deal with video streams to people who leave their TV on all day whether they are at home or not.
At best you will see IP multicast on a city-wide basis in a single ISP's network. Also note that IP multicast only works for live broadcast TV. In today's world there isn't much of that except for news. Everything else is prerecorded and thus it COULD be transmitted at any time. IP multicast does not help you when you have 1000 subscribers all pulling in 1000 unique streams. In the 1960's it was reasonable to think that you could deliver the same video to all consumers because everybody was the same in one big melting pot. But that day is long gone.
On the other hand, P2P software could be leveraged to download video files during off-peak hours on the network. All it takes is some cooperation between P2P software developers and ISPs so that you have P2P clients which can be told to lay off during peak hours, or when they want something from the other side of a congested peering circuit. Better yet, the ISP's P2P manager could arrange for one full copy of that file to get across the congested peering circuit during the time period most favorable for that single circuit, then distribute elsewhere.
--Michael Dillon
As far as I am concerned the killer application for IP multicast is *NOT* video, it's market data feeds from NYSE, NASDAQ, CBOT, etc.
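Michael's idea of cooperating P2P clients that back off during peak hours can be sketched in a few lines. This is purely illustrative: the policy names, rate values, and window are hypothetical, and no real P2P client or ISP publishes this API.

```python
from datetime import time

# Hypothetical ISP-published traffic policy (illustrative values only):
# a peak window during which cooperating clients lay off entirely.
PEAK_START = time(17, 0)    # 5 pm local
PEAK_END = time(23, 0)      # 11 pm local
PEAK_RATE_KBPS = 0          # back off completely during peak
OFFPEAK_RATE_KBPS = 10_000  # pull freely overnight

def allowed_rate_kbps(now: time) -> int:
    """Rate a cooperating P2P client should use at local time `now`."""
    in_peak = PEAK_START <= now < PEAK_END
    return PEAK_RATE_KBPS if in_peak else OFFPEAK_RATE_KBPS
```

A client would poll this before scheduling each chunk transfer; a real deployment would presumably fetch the window and rates from the ISP's "topology guru" rather than hard-coding them.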
--- Bruce Curtis bruce.curtis@ndsu.edu Certified NetAnalyst II 701-231-8527 North Dakota State University
On 22.04.2008, at 16:05, Bruce Curtis wrote:
P2P isn't the only way to deliver content overnight; content could also be delivered via multicast overnight.
Hmm, sorry, I did not get it. IMHO multicast is useless for VOD, correct? Marc
On Apr 22, 2008, at 9:15 AM, Marc Manthey wrote:
On 22.04.2008, at 16:05, Bruce Curtis wrote:
P2P isn't the only way to deliver content overnight; content could also be delivered via multicast overnight.
Hmm, sorry, I did not get it. IMHO multicast is useless for VOD, correct?
marc
Michael said the same thing ("Also note that IP multicast only works for live broadcast TV.") and then mentioned that P2P could be used to download content during off-peak hours. Kazam is a beta test that uses Intercast's technology to download content overnight to a user's PC via multicast. My point was that P2P isn't the only way to deliver content overnight; multicast could also be used to do that, and in fact at least one company is exploring that option. The example seemed to fit in well with the other examples in the thread that mentioned TiVo-type devices recording content for later viewing on demand. I agree that multicast can be used for live TV; others have mentioned the multicasting of the BBC, and www.ostn.tv is another example of live multicasting. However, since TiVo-type devices today record broadcast content for later viewing on demand, there could certainly be devices that record multicast content for later viewing on demand. --- Bruce Curtis bruce.curtis@ndsu.edu Certified NetAnalyst II 701-231-8527 North Dakota State University
On Tue, Apr 22, 2008, Marc Manthey wrote:
Hmm, sorry, I did not get it. IMHO multicast is useless for VOD, correct?
As a delivery mechanism to end-users? Sure. As a way of feeding content to edge boxes which then serve VOD? Maybe not so useless. But then, it's been years since I toyed with IP over satellite to feed ${STUFF}... :) Adrian
Chris Adams wrote:
Once upon a time, Steve Gibbard <scg@gibbard.org> said:
iTunes video, which looks perfectly acceptable on my old NTSC TV, is .75 gigabytes per viewable hour. I think HDTV is somewhere around 8 megabits per second (if I'm remembering correctly; I may be wrong about that), which would translate to one megabyte per second, or 3.6 gigabytes per hour.
You're a little low. ATSC (the over-the-air digital broadcast format) is 19 megabits per second or 8.55 gigabytes per hour. My TiVo probably records 12-20 hours per day (I don't watch all that of course), often using two tuners (so up to 38 megabits per second). That's not all HD today of course, but the percentage that is HD is going up.
1.1 terabytes of ATSC-level HD would be a little over 4 hours a day. If you have a family with multiple TVs, that's easy to hit.
That also assumes that we get 40-60 megabit connections (2-3 ATSC format channels) that can sustain that level of traffic to the household with widespread deployment in 2 years and that the "average" household hooks it up to their TVs.
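The ATSC figures quoted above are easy to sanity-check; a quick sketch, assuming decimal units (1 GB = 10^9 bytes) as is conventional for network capacity:

```python
# Convert a constant bit rate in Mbps to gigabytes per hour of viewing.
def gb_per_hour(mbps: float) -> float:
    return mbps * 1e6 * 3600 / 8 / 1e9  # bits/s -> bytes/hour -> GB

atsc = gb_per_hour(19)  # nominal ATSC rate, ~8.55 GB per viewable hour
print(round(atsc, 2))

# 1.1 TB (1,100 GB) per month at ATSC rates works out to a little
# over 4 hours of HD per day, matching the figure in the thread.
print(round(1100 / atsc / 30, 1))
```

Using the more precise 19.28 Mbps payload rate nudges the numbers slightly (about 8.68 GB/hour) but does not change the conclusion.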
I'm going to have to say that that's much higher than we're actually going to see. You have to remember that there's not a ton of compression going on in that. We're looking to start pushing HD video online, and our initial tests show that 1.5 Mbps is plenty to push HD resolutions of video online. We won't necessarily be doing 60 fps or full-quality audio, but "HD" doesn't actually define exactly what it's going to be. Look at the HD offerings online today and I think you'll find that they're mostly 1-1.5 Mbps. TV will stay much higher quality than that, but if people are watching from their PCs, I think you'll see much more compression going on, given that the hardware processing it has a lot more horsepower. -- Alex Thurlow Technical Director Blastro Networks
I've found it interesting that those who do Internet TV (re)define HD in a way that no one would consider HD anymore except the provider. =) In the news recently have been some complaints about Comcast's HD TV. Comcast has been (selectively) fitting 3 MPEG-2 HD streams into a 6 MHz carrier (38 Mbps ÷ 3 ≈ 12.6 Mbps per stream), and customers aren't happy with that. I'm not sure how the average consumer will see 1.5 Mbps for HD video as sufficient unless it's QVGA. Frank -----Original Message----- From: Alex Thurlow [mailto:alex@blastro.com] Sent: Monday, April 21, 2008 4:26 PM To: nanog@nanog.org Subject: Re: [Nanog] ATT VP: Internet to hit capacity by 2010 <snip>
On Apr 21, 2008, at 9:35 PM, Frank Bulk - iNAME wrote:
I've found it interesting that those who do Internet TV (re)define HD in a way that no one would consider HD anymore except the provider. =)
The FCC did not appear to set a bit rate specification for HD Television. The ATSC standard (A-53 part 4) specifies aspect ratios and pixel formats and frame rates, but not bit rates. So AFAICT, no redefinition is necessary. If you are doing (say) 720 x 1280 at 30 fps, you can call it HD, regardless of your bit rate. If you can find somewhere where the standard says otherwise, I would like to know about it.
In the news recently have been some complaints about Comcast's HD TV. Comcast has been (selectively) fitting 3 MPEG-2 HD streams into a 6 MHz carrier (38 Mbps ÷ 3 ≈ 12.6 Mbps per stream), and customers aren't happy with that. I'm not sure how the average consumer will see 1.5 Mbps for HD video as sufficient unless it's QVGA.
Well, not with a 15+ year old standard like MPEG-2. (And, of course, HD is a set of pixel formats that specifically does not include QVGA.) I have had video professionals go "wow" at H.264 dual-pass 720p encodings at 2 Mbps, so it can be done. The real question is: how often do you see artifacts? And how much does the user care? Modern encodings at these bit rates tend to provide very good encodings of static scenes. As the on-screen action increases, so does the likelihood of artifacts, so selection of bit rate depends, I think, on user expectations and the typical content being shown. (As an aside, I see lots of artifacts on my at-home cable HD, but I don't know their bandwidth allocation.) Regards Marshall
Frank
-----Original Message----- From: Alex Thurlow [mailto:alex@blastro.com] Sent: Monday, April 21, 2008 4:26 PM To: nanog@nanog.org Subject: Re: [Nanog] ATT VP: Internet to hit capacity by 2010
<snip>
Here in Comcast land, HDTV is actually averaging around 12 megabits a second. Still adds up to staggering numbers. :) Steve Gibbard wrote:
On Mon, 21 Apr 2008, Sean Donelan wrote:
The rest of the story?
http://www.usatoday.com/tech/products/services/2008-04-20-internet-broadband...
By 2010, the average household will be using 1.1 terabytes (roughly equal to 1,000 copies of the Encyclopedia Britannica) of bandwidth a month, according to an estimate by the Internet Innovation Alliance in Washington, D.C. At that level, it says, 20 homes would generate more traffic than the entire Internet did in 1995.
How many folks remember InternetMCI's lack of capacity in the 1990's when it actually needed to stop installing new Internet connections because InternetMCI didn't have any more capacity for several months.
I've been on the side arguing that there's going to be enough growth to cause interesting issues (which is very different than arguing for any specific remedy that the telcos think will be in their benefit), but the numbers quoted above strike me as an overstatement.
Let's look at the numbers:
iTunes video, which looks perfectly acceptable on my old NTSC TV, is .75 gigabytes per viewable hour. I think HDTV is somewhere around 8 megabits per second (if I'm remembering correctly; I may be wrong about that), which would translate to one megabyte per second, or 3.6 gigabytes per hour.
For iTunes video, 1.1 terabytes would be 1,100 gigabytes, or 1,100 / .75 = 1,467 hours. 1,467 / 30 = 48.9 hours of video per day. Even assuming we divide that among three or four people in a household, that's staggering.
For HDTV, 1,100 gigabytes would be 1,100 / 3.6 = 306 hours per month. 306 / 30 = 10.2 hours per day.
Maybe I just don't spend enough time around the "leave the TV on all day" demographic. Is that a realistic number? Is there something bigger than HDTV video that ATT expects people to start downloading?
-Steve
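Steve's per-household arithmetic can be reproduced in a few lines (decimal units, 1 GB = 10^9 bytes; the 0.75 GB/hour and 8 Mbps figures are his estimates from the post above):

```python
MONTHLY_GB = 1100  # 1.1 TB/month, the Internet Innovation Alliance estimate
DAYS = 30

# iTunes-quality video at 0.75 GB per viewable hour:
itunes_gb_per_hour = 0.75
print(round(MONTHLY_GB / itunes_gb_per_hour / DAYS, 1))  # ~48.9 hours/day

# HDTV at an assumed 8 Mbps, i.e. 3.6 GB per hour:
hdtv_gb_per_hour = 8e6 * 3600 / 8 / 1e9
print(round(MONTHLY_GB / hdtv_gb_per_hour / DAYS, 1))    # ~10.2 hours/day
```

Both results match the figures in the message, so the incredulity is about the 1.1 TB/month estimate itself, not the arithmetic.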
-- Registered Microsoft Partner My "Foundation" verse: Isa 54:17
participants (22)
-
Adrian Chadd
-
Alex Thurlow
-
Alexander Harrowell
-
Brandon Galbraith
-
Bruce Curtis
-
Chris Adams
-
David Coulson
-
Dorn Hetzel
-
Frank Bulk - iNAME
-
Joe Abley
-
Joe Greco
-
Marc Manthey
-
Marshall Eubanks
-
michael.dillon@bt.com
-
Paul Ferguson
-
Ric Messier
-
Sean Donelan
-
Simon Lockhart
-
Steve Gibbard
-
TJ
-
William Warren
-
Williams, Marc