Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.

Several new projects have started around the world to achieve those goals.

ITU anti-botnet initiative
http://www.itu.int/ITU-D/cyb/cybersecurity/projects/botnet.html

France anti-piracy initiative
http://www.culture.gouv.fr/culture/actualites/index-olivennes231107.htm
On Tue, Nov 27, 2007 at 09:38:40AM -0500, Sean Donelan wrote:
Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.
Several new projects have started around the world to achieve those goals.
ITU anti-botnet initiative
http://www.itu.int/ITU-D/cyb/cybersecurity/projects/botnet.html
I'm not sure how to reconcile two things: 1) the end-to-end principle -- if someone starts doing some new proto 66 thing, how do you make sure it's accessible? 2) protection from unwanted garbage. I don't really want all these 404-byte udp/1434 packets anymore, but the networks that originate them don't seem to care or notice that they're still infected. One person's unsolicited traffic is another's debugging/research project.

I was at a Thanksgiving party and made the following postulation: within the next two major software releases, the Microsoft OS is going to require signed binaries by default. This will be the only viable solution to the malware threat. Other operating systems may follow. (This was a WAG, based on gut feeling.)

This has some interesting implications and would require Microsoft to be a bit more small-app friendly, and there'd be a knob to twiddle if you're a developer and don't want to check signatures, but it's one of the few ways to resolve the issues IMHO, and cut down on the infections. So what if I own you via your browser -- unless the malware I push to your host is signed, it's not gonna run. Game [closer to] over.

- Jared

-- Jared Mauch | pgp key available via finger from jared@puck.nether.net clue++; | http://puck.nether.net/~jared/ My statements are only mine.
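To make the postulation concrete, here is a minimal sketch of the kind of check an OS loader would have to perform before executing anything. It assumes a detached RSA/SHA-256 signature and a publisher key the platform already trusts, with hypothetical file names; it is not Microsoft's actual mechanism (Authenticode embeds the signature and certificate chain in the executable itself).

    # Minimal sketch of a "refuse to run unsigned binaries" check.
    # Assumptions: a detached RSA/SHA-256 signature and a PEM publisher key
    # the platform already trusts.  File names below are hypothetical.
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def binary_is_trusted(binary_path, sig_path, pubkey_path):
        with open(pubkey_path, "rb") as f:
            pubkey = serialization.load_pem_public_key(f.read())
        with open(binary_path, "rb") as f:
            binary = f.read()
        with open(sig_path, "rb") as f:
            signature = f.read()
        try:
            pubkey.verify(signature, binary, padding.PKCS1v15(), hashes.SHA256())
            return True      # signature verifies: allow execution
        except InvalidSignature:
            return False     # unsigned or tampered: refuse to run

    # A loader enforcing the policy would refuse to exec anything that fails:
    # if not binary_is_trusted("installer.exe", "installer.sig", "publisher.pem"):
    #     raise PermissionError("unsigned binary blocked")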
On Tue, 27 Nov 2007 10:03:55 EST, Jared Mauch said:
Within the next 2 major software releases (Microsoft OS) they're going to by default require signed binaries. This will be the only viable solution to the malware threat. Other operating systems may follow. (This was a WAG, based on gut feeling).
This has some interesting implications and would require Microsoft to be a bit more small-app friendly, and there'd be a knob to twiddle if you're a developer and don't want to check signatures, but it's one of the few ways to resolve the issues IMHO, and cut down on the infections. So what if I own you via your browser, unless the malware i push to your host is signed, it's not gonna run. Game [closer to] over.
The problem with "active content" is that an exploit will quite happily run in the security context of the browser - and way too many sites insist on either or both of Flash and Javascript. Ever notice that there have been far fewer pure Java-based problems? That's because it started off with a semi-sane security model. Flash and Javascript didn't.

And you can't allow the browser to create executables, obviously. Unfortunately, that *also* means that you can't allow the user to use the browser to download patches, updates, and new software.... (Well - it's at least theoretically *doable* in the right Trusted Computing type of scenario, but I doubt we're going to get users to buy into it...)
On Nov 27, 2007, at 7:03 AM, Jared Mauch wrote:
Other operating systems may follow. (This was a WAG, based on gut feeling).
Nokia by default requires apps installed on its phones to be signed, though one can disable this functionality (and in fact must, in order to run many of the desirable applications). It's been stated in the press that Apple is doing this with the iPhone SDK, too. ----------------------------------------------------------------------- Roland Dobbins <rdobbins@cisco.com> // 408.527.6376 voice Culture eats strategy for breakfast. -- Ford Motor Company
Roland Dobbins wrote:
On Nov 27, 2007, at 7:03 AM, Jared Mauch wrote:
Other operating systems may follow. (This was a WAG, based on gut feeling).
Nokia by default requires apps installed on its phones to be signed, though one can disable this functionality (and in fact must, in order to run many of the desirable applications). It's been stated in the press that Apple is doing this with the iPhone SDK, too.
It is a nearly ubiquitous solution for mobile phones, though many of the actual implementations have been subverted at one time or another, and users actually updating the firmware of their mobile devices is a rather infrequent event. So while they utilize this approach, it is not a panacea and they have a ways to go themselves.
* Jared Mauch:
Within the next 2 major software releases (Microsoft OS) they're going to by default require signed binaries. This will be the only viable solution to the malware threat. Other operating systems may follow. (This was a WAG, based on gut feeling).
The code signing CAs have never been subject to serious attack, so it's unlikely that they are sufficiently robust for this scheme to work on a large scale. There's also the issue that you can't reliably tell data (which, presumably, does not need to be signed) from code.
On Nov 27, 2007, at 4:04 PM, Florian Weimer wrote:
* Jared Mauch:
Within the next 2 major software releases (Microsoft OS) they're going to by default require signed binaries. This will be the only viable solution to the malware threat. Other operating systems may follow. (This was a WAG, based on gut feeling).
The code signing CAs have never been subject to serious attack, so it's unlikely that they are sufficiently robust for this scheme to work on a large scale.
One would hope that the CA's wouldn't be connected to an attack path... The revocation stuff should be distributable if it's not already.
You're not familiar with the incident where VeriSign issued two "Microsoft Corporation" certificates to hackers? http://www.news.com/2100-1001-254586.html Fred Reimer, CISSP, CCNP, CQS-VPN, CQS-ISS Senior Network Engineer Coleman Technologies, Inc. 954-298-1697
-----Original Message----- From: owner-nanog@merit.edu [mailto:owner-nanog@merit.edu] On Behalf Of John Payne Sent: Tuesday, November 27, 2007 4:32 PM To: Florian Weimer Cc: Jared Mauch; Sean Donelan; nanog@merit.edu Subject: Re: Creating a crystal clear and pure Internet
On Nov 27, 2007, at 4:04 PM, Florian Weimer wrote:
One would hope that the CA's wouldn't be connected to an attack path...
The revocation stuff should be distributable if it's not already.
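On the revocation point John raises (and the VeriSign incident Fred cites, which was painful precisely because revocation checking was weak at the time), here is a minimal sketch of looking a publisher certificate up in a published CRL before trusting anything it signed. It assumes the Python "cryptography" library, PEM-encoded files, and hypothetical file names; a real client would also verify the CRL's own signature and freshness.

    # Minimal sketch of a CRL lookup: is this publisher certificate revoked?
    # Assumptions: the Python "cryptography" library, PEM-encoded inputs, and
    # hypothetical file names.  A real client would also verify the CRL's own
    # signature and next_update against the issuing CA.
    from cryptography import x509

    def is_revoked(cert_pem_path, crl_pem_path):
        with open(cert_pem_path, "rb") as f:
            cert = x509.load_pem_x509_certificate(f.read())
        with open(crl_pem_path, "rb") as f:
            crl = x509.load_pem_x509_crl(f.read())
        return crl.get_revoked_certificate_by_serial_number(cert.serial_number) is not None

    # if is_revoked("publisher-cert.pem", "ca-crl.pem"):
    #     print("certificate revoked; do not trust binaries signed with it")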
On Tue, 27 Nov 2007 22:04:23 +0100, Florian Weimer said:
There's also the issue that you can't reliably tell data (which, presumably, does not need to be signed) from code.
And "active content" is what happens when you *intentionally* blur the data/ code distinction. Unfortunately, it's (a) wildly popular with users and (b) usually horribly done from a security standpoint. Unfortunately, "Web 2.0" with its "glue stuff together" approach looks like it's just going to make things even worse, as clueless developers wedge stuff together with dangerous interactions and synergies....
No offense, but I think this is an overly political topic, and we just saw that politics are not supposed to be discussed. There is a huge political debate on what ISPs should and should not be doing to traffic that flows through their systems. There are other groups, like NNsquad, where these types of conversations are welcome, but even there on the forums, not the mailing list. But, if it's not viewed as political, then...

Your analogy is flawed, because the Internet is not a pipe system and ISPs are not your local water utility. And there are many different ways that water utilities are handled in different parts of the world. In the US, most if not all water utilities are run by the government, usually the county government where I'm from. ISPs are not government run, and can't be compared to a water utility for that simple reason. They don't have the same legal (again, an issue that is not supposed to be discussed, according to the AUP) requirements nor the legal protections available to governments (you can't sue most governments).

My personal opinion is that ISPs should not do anything to the traffic that passes through their network as far as filtering. The only discriminatory behavior that should be allowed is QoS: treating specific types of traffic in a different manner to give preferential treatment to specific classifications of traffic. My definition of QoS for the purposes of this discussion, if it is allowed to continue, would not include shaping or policing. If an ISP says you have a 5Mb downstream and a 512K upstream, you should actually be allowed to send 512K upstream all the time. However, that's not to say that an ISP should not be able to classify traffic as scavenger over a particular threshold, and preferentially drop that traffic at their oversubscribed uplink if that is a bottleneck. The end user should also be allowed to specify their own QoS markings, and they should be honored as long as they don't go over specific thresholds as imposed, and documented, by the ISP. For example, the customer should be able to self-classify certain traffic as high priority (VoIP) and certain traffic as low priority (P2P), but if the customer classified all traffic as high priority, the ISP would be free to remark anything over a set threshold (say 128K) as a lower priority, but NOT police it.

If you want to use an analogy, ISPs are more like >private< road systems and owners that have been given a right to use >public< lands for >private< profits, with specific restrictions. Some restrictions may be that you can't discriminate on the payload (any kind of identifying category for passengers, such as race, ethnicity, gender, etc., which in the network world would map to type of protocol or payload content, such as P2P traffic or email), but that you can create an HOV lane for high-occupancy vehicles (QoS). Of course, ISPs are allowed to make sure the vehicles are in proper working condition (checking that various layer headers are in compliance). Much like with the self-marking of traffic with QoS tags, the customer should also be able to make their own decision and pack two other people in the car in order to get into that HOV lane. However, if the users of the road try to pack everything into the HOV lane, they can be reclassified (buses may have to pay a higher fee to use the road). However, in this world of religious warfare (another banned topic, I'm sure!) it is recognized that a certain level of profiling is acceptable.

So, it may be OK for ISPs to profile and deny traffic depending on the payload only for specific types of traffic that have been shown to cause issues, and/or that are only present for nefarious reasons. Examples may be known signatures for virus attacks, worms, or Trojans. Other examples may be identifying characteristics of SPAM (I'm reluctant to say "excessive email traffic" because I don't believe that is a proper identifying characteristic; I should be able to run my own SMTP server and send out as much legitimate email as I want).

I realize that my views probably won't be shared by the vast majority of ISPs, and hence are overly political for this group. That's why I think any discussion is not necessarily on-topic.

Thanks,

Fred Reimer, CISSP, CCNP, CQS-VPN, CQS-ISS Senior Network Engineer Coleman Technologies, Inc. 954-298-1697
-----Original Message----- From: owner-nanog@merit.edu [mailto:owner-nanog@merit.edu] On Behalf Of Sean Donelan Sent: Tuesday, November 27, 2007 9:39 AM To: nanog@merit.edu Subject: Creating a crystal clear and pure Internet
Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.
Several new projects have started around the world to achieve those goals.
ITU anti-botnet initiative
http://www.itu.int/ITU-D/cyb/cybersecurity/projects/botnet.html
France anti-piracy initiative
http://www.culture.gouv.fr/culture/actualites/index-olivennes231107.htm
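As an illustration of the remark-rather-than-police behavior Fred describes above (honor the customer's marking up to a committed rate, then re-mark the excess to a scavenger class instead of dropping it), here is a minimal token-bucket sketch. The rates and DSCP code points are illustrative, not any vendor's implementation.

    # Sketch of "remark, don't police": a token bucket meters the customer's
    # high-priority traffic; packets over the committed rate are re-marked to
    # a scavenger DSCP instead of being dropped.  Rates and code points are
    # illustrative only.
    import time

    DSCP_EF = 46         # customer-marked high priority (e.g. VoIP)
    DSCP_SCAVENGER = 8   # CS1, commonly used as less-than-best-effort

    class RemarkingMeter:
        def __init__(self, committed_bps, burst_bytes):
            self.rate = committed_bps / 8.0   # bytes per second
            self.burst = burst_bytes
            self.tokens = burst_bytes
            self.last = time.monotonic()

        def classify(self, size_bytes, requested_dscp):
            """Return the DSCP the ISP actually forwards the packet with."""
            now = time.monotonic()
            self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= size_bytes:
                self.tokens -= size_bytes
                return requested_dscp     # within contract: honor the marking
            return DSCP_SCAVENGER         # over the threshold: remark, never drop

    # Example: a 128 kbps committed rate for customer-marked EF traffic.
    meter = RemarkingMeter(committed_bps=128000, burst_bytes=16000)
    forwarded_dscp = meter.classify(size_bytes=1400, requested_dscp=DSCP_EF)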
On Nov 27, 2007 7:18 AM, Fred Reimer <freimer@ctiusa.com> wrote:
The only discriminatory behavior that should be allowed is for QoS, to treat specific types of traffic in a different manner to give preferential treatment to specific classifications of traffic.
I myself, and I'm sure most others, prefer net neutrality to the horrid alternative you're suggesting. -- Best Regards, John Musbach
Horrid? Strong words. What's horrid about allowing an ISP to give its BGP traffic a higher priority than end-user traffic, so that the whole net doesn't fail when pipes are oversubscribed, or there is a virus/worm on the net? What's horrid about allowing an end user to decide which of its traffic should be dropped first, if by definition some traffic HAS to be dropped due to oversubscription? If you think it's horrid, then I'd like some examples, because I suspect that given certain specific scenarios you'd probably agree with what should happen (as neutral as can possibly be managed, and transparent).

Thanks,

Fred Reimer, CISSP, CCNP, CQS-VPN, CQS-ISS Senior Network Engineer Coleman Technologies, Inc. 954-298-1697
-----Original Message----- From: John Musbach [mailto:johnmusbach@gmail.com] Sent: Tuesday, November 27, 2007 10:31 AM To: Fred Reimer; nanog@nanog.org Subject: Re: Creating a crystal clear and pure Internet
On Nov 27, 2007 7:18 AM, Fred Reimer <freimer@ctiusa.com> wrote:
The only discriminatory behavior that should be allowed is for QoS, to treat specific types of traffic in a different manner to give preferential treatment to specific classifications of traffic.
I myself and I'm sure most others prefer net neutrality to the horrid alternative you're suggesting
-- Best Regards,
John Musbach
On Tue, 27 Nov 2007 07:31:08 PST, John Musbach said:
I myself and I'm sure most others prefer net neutrality to the horrid alternative you're suggesting
I dunno. I've often wished I *could* QoS some of my packets up/down so the Linux distro ISO I'm downloading doesn't make my SSH get piggy, and I'd certainly at least *consider* a provider that offered "NNN gig/month of priority traffic, and unlimited scavenger-class" or similar ideas I've seen proposed. I'd even be OK with the provider QoS'ing the packets because the *other* end of the connection did it (hey, you host a distro mirror, you want to save those bandwidth charges, I can understand and will show solidarity by playing along).

It's only when my packets get QoS'ed downward because some *third party* paid the provider that it gets ugly and evil. (And yes, I know there are nasty corner cases where I'm sharing a pipe with my next-door neighbor who paid for a bigger slice of pipe. If it was *easy*, it would already be done rather than being a big hairy policy issue.. ;)

Of course, any *sane* provider will totally ignore what I and the other 2% lunatic fringe want, and market the plan that extracts maximum profit from the 98% Joe Sixpack customers out there. :)
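The self-marking Valdis wishes for is something an end host can already attempt: set a scavenger DSCP on the bulk-download socket and leave SSH at its default marking. A minimal sketch follows; it uses the Linux-specific IP_TOS socket option, the mirror host and path are hypothetical, and it only helps if devices along the path actually honor the marking.

    # Sketch of end-host self-marking: put the bulk download in scavenger
    # class so it loses to interactive SSH when the link is full.
    # Linux-specific (IP_TOS); host and path are hypothetical.
    import socket

    DSCP_SCAVENGER = 8                  # CS1
    TOS_VALUE = DSCP_SCAVENGER << 2     # DSCP sits in the top 6 bits of the TOS byte

    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_VALUE)
    sock.connect(("mirror.example.net", 80))
    sock.sendall(b"GET /distro.iso HTTP/1.0\r\nHost: mirror.example.net\r\n\r\n")
    # ...read the response; the SSH session keeps its default (best-effort) marking.
    sock.close()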
On Tue, Nov 27, 2007 at 10:18:47AM -0500, Fred Reimer wrote:
No offense, but I think this is an overly political topic, and we just saw that politics are not supposed to be discussed. There is a huge political debate on what ISP's should and should not be doing to traffic that flows through their systems. There are other groups, like NNsquad, where these types of conversations are welcome, but even there on the forums, not the mailing list.
But, if it's not viewed as political then...
[SNIP!]
And my personal opinion is that ISPs should not do anything to the traffic that passes through their network as far as filtering. The only discriminatory behavior that should be allowed is for QoS, to treat specific types of traffic in a different manner to give preferential treatment to specific classifications of traffic. My definition of QoS for the [SNIP!]
Welcome to the non-regulated world. I think this is a general call to engage in these activities. The last thing I think most of us want to have happen is to wake up and be regulated like the Chemical sector became, e.g.: http://www.dhs.gov/xprevprot/laws/gc_1166796969417.shtm

There is an operational part of this whole internet thing that does matter, and I have to say, we can't just ignore the activities at the recent Rio meeting, the ITU, or other venues. Without clued engagement, will the policy wonks make the right choices/decisions? This does impact network operations. Take for example the FCC stuff on the emergency alert system. (Excerpt from the Federal Register follows.)

-- register excerpt -- Contra Costa states that just as the Internet Protocols enable various kinds of computers to work together, CAP can provide the basis for a secure ``warning internet'' that can leverage all our warning assets to achieve more than any single system can alone. -- register excerpt --

Perhaps you don't care about this stuff, but maybe you'll soon be required to have your EAS testpoint connected to the local PSAP for them to do reverse-911 or other activities for users with naked DSL, etc. If you think this doesn't impact your operational network, or have the potential to, you're sorely mistaken. If you're not engaged, you may be blindsided by costs that you're unable to recover, and your network may close due to bankruptcy.

I could be insane in thinking this, but I think we're in that time of the lifecycle where we need to be on guard.

- Jared

-- Jared Mauch | pgp key available via finger from jared@puck.nether.net clue++; | http://puck.nether.net/~jared/ My statements are only mine.
On Tue, 27 Nov 2007, Jared Mauch wrote:
There is an operational part of this whole internet thing that does matter, and I have to say, we can't just ignore the activities at the recent Rio, ITU, or other things. Without clued engagement will the policy wonks make the right choices/decisions? This does impact network operations.
On a more practical/technical level, I'm interested in how the French ISPs that worked on the plan will implement it on their networks: Orange France, Free, Neuf Cegetel, AFA, GESTE, SELL, SIMAVELEC, DAILYMOTION, Numericable/Noos, YouTube/Google, AFORST.

What technical methods do they believe are going to work, as outlined in the working document Annex 1?
http://www.culture.gouv.fr/culture/actualites/conferen/albanel/rapportoliven...

- Filtering URL or IP address (le filtrage d'URL ou d'adresse IP)
- Filtering ports (le filtrage de ports)
- Filtering protocols (le filtrage de protocoles)
- Filtering content (le filtrage de contenus)
- Filtering services (les outils de filtrage par les hébergeurs ou les éditeurs de services)
- External monitoring (le repérage des flux illicites par l'observation externe)

Unfortunately, Babelfish isn't the best way to read such a document. There are probably some nuances which don't translate easily.
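As a rough illustration of how the first four Annex 1 categories could translate into a per-flow decision, here is a toy sketch. Every blocklist in it is an invented placeholder, and it says nothing about whether such filtering is feasible at scale or desirable.

    # Toy sketch of how the first four Annex 1 categories could translate
    # into a per-flow decision.  Every blocklist here is an invented
    # placeholder; this says nothing about feasibility at scale.
    from typing import Optional

    BLOCKED_IPS = {"192.0.2.10"}                  # filtrage d'adresse IP
    BLOCKED_URLS = {"example.invalid/pirated"}    # filtrage d'URL
    BLOCKED_PORTS = {6881}                        # filtrage de ports
    BLOCKED_PROTOCOLS = {"bittorrent"}            # filtrage de protocoles

    def filter_decision(dst_ip: str, url: Optional[str], dst_port: int, protocol: str) -> str:
        if dst_ip in BLOCKED_IPS:
            return "block: ip"
        if url is not None and url in BLOCKED_URLS:
            return "block: url"
        if dst_port in BLOCKED_PORTS:
            return "block: port"
        if protocol in BLOCKED_PROTOCOLS:
            return "block: protocol"
        return "allow"

    # Content filtering ("filtrage de contenus") would require payload
    # inspection or watermark detection, which this sketch leaves out.
    print(filter_decision("203.0.113.5", None, 6881, "bittorrent"))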
On a more practical/technical level, I'm interested in how the French ISPs that worked on the plan will implement it on their networks?
http://www.culture.gouv.fr/culture/actualites/conferen/albanel/rapportolivennes231107.pdf

I couldn't get a good copy from that URL but I did manage to get one from <http://www.fluctuat.net/articles/IMG/pdf/rapport-olivennes.pdf>

First, they start out by saying on page 12 that they are persuaded that stronger disincentives for pirating content have to be organised in a realistic and pragmatic way. There is no unique solution whose success is assured. It is an illusion to consider that all forms of piracy on the Internet can be stopped. Nevertheless it is necessary to communicate to the younger members of the public that illegal costlessness (free as in beer) has a cost. (Note that French has a word meaning costlessness that is usually translated as "free".)

They go on to talk about a variety of technical measures which they recognize pirates can circumvent. They want to make it harder to accidentally pirate stuff. They talk about going after the uploader of content, not the downloader. Elsewhere in the document they get agreement from the industry to change its behavior as well, such as watermarking content, which presumably makes it easier to filter pirated content while leaving the Linux ISO torrents alone.

Under filtering of sites and protocols they mention that this can be legally possible in certain specific circumstances. Under filtering of files they talk about servers, where ISPs already will delete or block downloads of pirated files. The appendices (annexes) go into more technical detail.

I didn't read all 44 pages of this report, but it is fairly balanced and they clearly talked to ISPs as well as content owners. I get the sense that a lot of this is already best practice, but probably not well documented, in the sense that there isn't a "Piracy Prevention Best Practices" document that most ISPs try to adhere to. If someone wanted to produce such a document, extracting the technical bits from this report would make a reasonable working draft.

Given that operationally there are no magic bullets for most things, we have to make do with a set of best practices, each of which deals with one aspect of the problem. I wonder why we don't see more support for documenting these best practices through something like this wiki http://bestpractices.wikia.com/wiki/Main_Page which was set up by a NANOG member a while back.

--Michael Dillon
But, if it's not viewed as political then...
Your analogy is flawed, because the Internet is not a pipe system and ISP's are not your local water utility.
And the internet is not a big truck! It's....It's a series of tubes! Sorry, I couldn't resist... with all these things clogging all the tubes. :-) -Jerry
On Tue, 27 Nov 2007, Jerry Pasker wrote:
But, if it's not viewed as political then...
Your analogy is flawed, because the Internet is not a pipe system and ISP's are not your local water utility.
And the internet is not a big truck! It's....It's a series of tubes!
Sorry, I couldn't resist... with all these things clogging all the tubes. :-)
I'd like to draw attention to the nanog AUP, particularly #6: Postings of political, philosophical, and legal nature are prohibited.

While the "regulation of the internet by filtering bad traffic" is clearly political and/or legal, I do think the *technical* implications of it are very much on-topic. After all, once this happens, we as network operators will be responsible for the filtering.

Given that, I'd like to ask everyone to refrain from off-hand comments about tubes and dump trucks - we all hear this joke every day. Discussion of the morality of such filtering is also off-topic. Discussion of the implementation of such filtering and its effect on network operations at large is clearly on-topic. Discussion of separating traffic (by network operators) into "bad" and "good" is also on-topic.

The list is about technology and operations. This is not the ITU. This is not C-SPAN. This is not a 'general banter among network operators' list either. Before you post to the list, think - would you want to make a presentation at a NANOG conference based on your post? If it doesn't feel appropriate, the list post is similarly inappropriate.

Also, this is another reminder that the MLC *will* be giving formal warnings (which will eventually lead to removal from the list) to those who continue to post off-topic messages.

As usual, should you wish to discuss this post, please do so on nanog-futures (reply-to has been set accordingly). Thanks!

-alex [mlc chair]
<personal opinion> the position that politics, culture, and society have no place in internet operations is beyond even an ostrich. they bloody *drive* the car. while we're at it, why not eliminate finances too? sheesh! randy
On Tuesday 27 November 2007, Alex Pilosov wrote:
I'd like to draw attention to nanog AUP, particularly #6: Postings of political, philosophical, and legal nature are prohibited.
While the "regulation of internet by filtering bad traffic" is clearly political and/or legal, I do think the *technical* implication of it are very much on-topic. After all, once this happens, we as network operators will be responsible for the filtering.
With all due respect, Alex and the rest of the MLC, I believe AUP #6 (aka, the Ostrich Clause) should be amended. Operational folks must be involved in the nontechnical processes that impact their daily jobs, or one day you will find that operations will be impossible due to bone-headed regulations that might have been stopped had knowledgeable operational people spoken up soon enough. And while I realize this belongs more on nanog-futures, this thought should at least see the light of day on the main nanog list first. So I guess I'll subscribe to nanog-futures so that I can participate in what may become a lively discussion over there on the Ostrich Clause. I encourage everyone else who might be interested in joining this discussion to do so as well. I do not intend to post further on this subject on this list; no warning necessary. -- Lamar Owen Chief Information Officer Pisgah Astronomical Research Institute 1 PARI Drive Rosman, NC 28772 (828)862-5554 www.pari.edu
On Tue, 27 Nov 2007 09:38:40 EST, Sean Donelan said:
Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.
What's the networking equivalent of "remember to build your water intake *upstream* of your sewage plant"? Or, more accurately - "how do you get all those people with private wells^Wcomputers to *not* insist on building their leach fields uphill of their wells?".

There's a limit to what an ISP can do to make it "crystal clear and pure" without an incredibly intrusive presence. The technically easy way is what many corporations do - Borg the boxes into an Active Directory domain, and impose fascist controls via Group Policy (for all of my anti-MS ranting, I'll grant the AD/GP stuff *is* a pretty slick idea for corporate PC lockdown).

But how do you sell that idea to the consumer user?
Valdis.Kletnieks@vt.edu wrote:
On Tue, 27 Nov 2007 09:38:40 EST, Sean Donelan said:
Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.
What's the networking equivalent of "remember to build your water intake *upstream* of your sewage plant"?
Or, more accurately - "how do you get all those people with private wells^Wcomputers to *not* insist on building their leach fields uphill of their wells?".
There's a limit to what an ISP can do to make it "crystal clear and pure" without an incredibly intrusive presence. The technically easy way is what many corporations do - Borg the boxes into an Active Directory domain, and impose fascist controls via Group Policy (for all of my anti-MS ranting, I'll grant the AD/GP stuff *is* a pretty slick idea for corporate PC lockdown).
But how do you sell that idea to the consumer user?
I think we'd have to standardize on what our networking equivalent of "water" is. Are we talking about just port 80 traffic? Just email? A water utility is not an open network, nor is it bidirectional. If I want to start transporting Kool-Aid, I need to take the water from the utility and create my own distribution network outside of the water utility; I can't lease capacity from the water utility.

If all we were carrying were unidirectional traffic, "crystal clean" would be very easy. If all we were carrying was port 80 traffic, "pure" would be easy. A better analogy would be to make sure all the roads and the vehicles on them were crystal clean and "pure"/efficient -- hell, I'd settle for "insured".

IMO, an intrusive Internet (ignoring the political talk about MSFT or not) would not be the Internet, but just a proprietary network [pick your flavor]. But rather than debate technology, I think regulators/operators/etc. would need to settle on an unambiguous definition of "what" is carried and "what" isn't to be carried, at what quality level for a given level of service.

Deepak Jain AiNET
On Nov 27, 2007 6:38 AM, Sean Donelan <sean@donelan.com> wrote:
France anti-piracy initiative
http://www.culture.gouv.fr/culture/actualites/index-olivennes231107.htm
I don't understand, how in the world do they plan to differentiate normal legal traffic from illegal pirating??? -- Best Regards, John Musbach
On Nov 27, 2007 3:28 PM, John Musbach <johnmusbach@gmail.com> wrote:
On Nov 27, 2007 6:38 AM, Sean Donelan <sean@donelan.com> wrote:
France anti-piracy initiative
http://www.culture.gouv.fr/culture/actualites/index-olivennes231107.htm
I don't understand, how in the world do they plan to differentiate normal legal traffic from illegal pirating???
Especially as they also want to ban DRM; it's like they gave half the report to Cory Doctorow to write and half to the MPAA.
On November 27, 2007 at 07:28 johnmusbach@gmail.com (John Musbach) wrote:
On Nov 27, 2007 6:38 AM, Sean Donelan <sean@donelan.com> wrote:
France anti-piracy initiative
http://www.culture.gouv.fr/culture/actualites/index-olivennes231107.htm
I don't understand, how in the world do they plan to differentiate normal legal traffic from illegal pirating???
It doesn't matter. That's the beauty of legal processes: they're not bound by physical laws or other external realities, only by what a small group of people with appropriate jurisdiction and police powers can be convinced to agree upon.

So if, given the authority, they decided you shall not deliver content they deem inappropriate OR ELSE, then that will be your choice. If they decided you should fly around the room and spit wooden nickels OR ELSE, oh, do jail time, then you'll end up in jail.

But, of course, it'll be a lot easier to get a committee with appropriate jurisdiction to agree that any content they find objectionable oughta be filtered, OR ELSE, and leave the details to others to work out.

To say this is not operational would be like claiming that filtering out rogue BGP announcements is not operational.

-- -Barry Shein The World | bzs@TheWorld.com | http://www.TheWorld.com Purveyors to the Trade | Voice: 800-THE-WRLD | Login: Nationwide Software Tool & Die | Public Access Internet | SINCE 1989 *oo*
Rather than go after distilled water via reverse osmosis, I think a carbon filter would be a good place to start.

Frank

-----Original Message----- From: owner-nanog@merit.edu [mailto:owner-nanog@merit.edu] On Behalf Of Sean Donelan Sent: Tuesday, November 27, 2007 8:39 AM To: nanog@merit.edu Subject: Creating a crystal clear and pure Internet

Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.

Several new projects have started around the world to achieve those goals.

ITU anti-botnet initiative
http://www.itu.int/ITU-D/cyb/cybersecurity/projects/botnet.html

France anti-piracy initiative
http://www.culture.gouv.fr/culture/actualites/index-olivennes231107.htm
On Tue, Nov 27, 2007 at 09:38:40AM -0500, Sean Donelan wrote:
Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.
Quoting Wu Ming: Take everybody's ideas of clear and pure and overlay them and pretty soon the only things allowed to be sent over the Internet will be Shakespeare and the Bible, and much of that's a grey area anyway.
On Tue, Nov 27, 2007 at 09:38:40AM -0500, Sean Donelan wrote:
Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.
Yes -- well, not "unwanted" IMHO, but "abusive". (Much traffic that's unwanted is not abusive. For example, in the view of some readers of this mailing list, some of the longer/more caustic/repetitive debates might very well be unwanted. But that traffic is clearly not abusive.)
Several new projects have started around the world to achieve those goals.
ITU anti-botnet initiative [snip] France anti-piracy initiative
Only the first one has anything to do with keeping the Internet clean; the second is a political cave-in to the copyright cartel.

I see a (mostly) clear line between "things that are abusive of the Internet, systems connected to it, and users of those systems" and "content that's unwanted, offensive, or claimed to be covered under someone's interpretation of IP law".

The first category contains things like spam, phishing, spyware, spam/phishing/spyware support services (dns, web hosting, maildrops), DoS attacks, hijacked networks, etc. The second category contains things like porn, religion, politics, music, and movies via whatever means are used to convey them (mail, web, p2p, etc.), all of which are certain to irritate someone, somewhere, and much of which could probably be construed (by a sufficiently creative legal practitioner) to infringe on somebody's IP.

In my view, it's the responsibility of everyone on the net to do whatever they can to squelch the first. But they have no obligations at all when it comes to the second -- that way lies the slippery slope of content policing and censorship.

---Rsk
On Tue, 27 Nov 2007, Rich Kulawiec wrote:
In my view, it's the responsibility of everyone on the net to do whatever they can to squelch the first. But they have no obligations at all when it comes to the second -- that way lies the slippery slope of content policing and censorship.
The technical tools and techniques available to ISPs are essentially the same regardless of the type of use being targeted. So, reviewing sets of documents from various groups for technical capabilities: how can ISPs implement them? Are there any technical capabilities available which haven't been included? Are there technical capabilities included which aren't really feasible?

1. Traceability and impersonation of identifiers
2. Accountability and dynamic changes of identifiers
3. Availability and interference with other communications
4. Confidentiality and privacy of communications
5. Integrity and changes to communications
6. Alertability and status of communications
7. Acceptability and choosing which communications
On November 27, 2007 at 09:38 sean@donelan.com (Sean Donelan) wrote:
Some people have compared unwanted Internet traffic to water pollution, and proposed that ISPs should be required to be like water utilities and be responsible for keeping the Internet water crystal clear and pure.
Several new projects have started around the world to achieve those goals.
On a related note:

FCC Could Extend Reach To Cable TV
Vote Scheduled for Today May Open Door to Regulation
http://www.washingtonpost.com/wp-dyn/content/story/2007/11/26/ST200711260220...

Basically the FCC is being broken out of their cage of "broadcast spectrum and telephone monopolies only" and being given the power to regulate cable TV content. No doubt internet content can't be far behind; the boundaries have just disappeared and all that's left is "whatever seems to us to be in the interest of the public". The FCC is being turned into The Ministry of Censorship before your eyes.

The pretext is consumer pricing (unbundling etc), but go look at sites like http://www.parentstv.org ("Parents Television Council"): they're already gunning for the FCC's new power over cable content to advance their own agenda.

If anyone doesn't think this is operational they're missing the point. Making the net as "clean and wholesome" as prime time TV is going to fall in the laps of operations. And that's where this is going, fast.
ITU anti-botnet initiative
http://www.itu.int/ITU-D/cyb/cybersecurity/projects/botnet.html
France anti-piracy initiative
http://www.culture.gouv.fr/culture/actualites/index-olivennes231107.htm
-- -Barry Shein The World | bzs@TheWorld.com | http://www.TheWorld.com Purveyors to the Trade | Voice: 800-THE-WRLD | Login: Nationwide Software Tool & Die | Public Access Internet | SINCE 1989 *oo*
On Nov 27, 2007 8:08 PM, Sean Donelan <sean@donelan.com> wrote:
Several new projects have started around the world to achieve those goals. ITU anti-botnet initiative http://www.itu.int/ITU-D/cyb/cybersecurity/projects/botnet.html
I wrote this one. And there are a few things in there that nanogers would probably agree with me are best practice. At least in this case, you have the ITU putting money and resources into putting these BCPs into practice at a national level. In 1Q08 there'll be a pilot project implementing these ideas in Malaysia.

http://www.itu.int/ITU-D/cyb/cybersecurity/docs/itu-botnet-mitigation-toolki...

It quotes your 40-40-20 rule somewhere, btw. I forgot to attribute it to you; that's coming up in version #2 of the draft.

srs
participants (22)

- Alex Pilosov
- Alexander Harrowell
- Barry Shein
- Deepak Jain
- Florian Weimer
- Frank Bulk
- Fred Reimer
- Jared Mauch
- Jerry Pasker
- Joel Jaeggli
- John Musbach
- John Payne
- Lamar Owen
- Michael Balasko
- michael.dillon@bt.com
- Randy Bush
- Rich Kulawiec
- Roland Dobbins
- Sean Donelan
- Suresh Ramasubramanian
- Valdis.Kletnieks@vt.edu
- ww