[NANOG WARNING: This document contains neither directions on contacting the operational support group for any entity, nor the configuration code for any network hardware]

Against my better judgement, I've decided to add my tiny bit of content to this thread. If there is anyone to thank for getting me into it, Mr. Porter has done so with his excellent review of the thread. Without it, I might not have seen the humor in it.

However, there is something that I just can't understand. It's amazing to me that, with the number of examples in history, people are able to make grand statements like many have in this thread. Please allow me to point out a few historical statements that I feel are very similar to some of the grand statements made in this thread:

1. The world is flat and rests on the back of a giant turtle.
2. Man will never be able to fly faster than the speed of sound.
3. The Dow Jones Industrial Average will never break 4000

Grandiose statements are generally risky, and almost always, the grander the statement, the more likely it is to be wrong to some degree. So if you think some technology is just a short-term hack, doomed to be forgotten, sure, tell us, and then tell us why you believe it. The grand statement alone is worthless.

On to the topic. As far as I can tell, there are three major subjects to this thread: the ethical nature of using a transparent cache; the technical and operational problems and solutions associated with implementing a transparent cache; and finally, the customer support and management process involved with implementing a transparent cache. I'll address these in reverse order.

Customer support and process management seem to be the Achilles' heel of the Internet industry, which has built up a terrible reputation for poor customer support. Coming from the nuclear industry, I am truly jaded when it comes to process management, but the Internet industry seems to have gone as far as possible in the other direction. The lack of process management in many (not all) of the organizations involved in this industry amazes me. "Procedures" seems to be a dirty word, and imposing procedures on an organization tends to result in an exodus of technical talent. In many cases, you can find people in the industry jokingly referring to their facilities as really neat toys.

Compared to the multiple interactions in the Internet, a nuclear power plant is simple. And yet there are very few people, if any, who could single-handedly determine the effect one particular action would have on the operation of the plant as a whole. Obviously, the consequences of something bad happening are far worse, but in many cases "bad" means the plant is shut down for some period and the company loses cash at an astonishing rate. Meanwhile, the use of the Internet as a tool for many companies' revenue streams continues to grow, and the problems generated by poor process management are beginning to have a much wider financial effect.

It's apparent that we on the operational side are truly more at fault than anyone else. It's our duty to provide this service to our customers as best we can. If there is any area in which a customer could claim negligence or seek damages, this is one place they may find a good chance. I'm not pointing fingers here; I've been in organizations that were party to this myself. I'm merely making a general observation, and many organizations have made good headway here.
Events such as the infamous 7007 event, the widespread connectivity problems concurrent with filter implementations, down to local effects such as those resulting in this thread, will continue to occur as long as we continue to take a lax view of the results of our actions. Several organizations have apparently been formed recently that claim to be attempting to address this problem. However, I have not seen an industry-wide effect from the operations of these organizations.

As far as the technical and operational problems associated with implementing current transparent caching technology, it appears to me that the people on this list with the most background in cache technology have come to an agreement that technological answers to the problems experienced in this event exist. I can't, and won't, feign any significant technical knowledge in the caching field; I have a general knowledge level at best. From a general level, this industry's technological growth rate is phenomenal, and the resulting technological implementation problems can be significant. However, there are organizations that are going to continue to develop and implement these new technologies. The successful ones will be financially rewarded in our capitalistic society, and the unsuccessful ones will fare less well. Sooner or later, those implementing the new technology will have a significant advantage over those who don't. Avoiding the challenge of new technology can be extremely damaging to the viability of a company in the Internet industry. Yes, I admit the possibility that any current implementation of a new technology such as caching may end up being a temporary fix. However, I would be amazed to find that the effort put into developing and operationally testing such a technology did not result in some advancement in that field. From caching, for example, we may find interesting advancements in distributed technology.

Finally, the ethical question comes around. Is it, or is it not, ethical for someone to use a transparent cache? IMHO, the arguments on this portion of the thread have been nothing but sensationalist. It reeks of paranoia. Perhaps the X-Files can do (or have they already done?) a show on this. THEY are stealing your packets without your knowledge! THEY are monitoring your every transmission and receipt! THEY know your innermost secrets! Bah!

Does transparent caching (or any caching technology, for that matter) give someone additional tools to pry into the data flow? Yes. Does the technology specifically do this and provide this information? No. It's a tool that could be mismanaged and used for something other than its original purpose, perhaps even for intrusive actions. But that requires someone who wants to do such a thing, and tools already exist to do similar things if someone so chooses. Attempting to limit the growth of a technology because it could be used wrongly is pure folly.

In the case of transparent caching, it's my belief that as long as an end user receives content as provided by the content provider, in the format specified by the content provider (yes, this includes dynamic information), both the content provider and the end user couldn't care less what physical, electromagnetic, or optical transformations take place from end to end. If the end-to-end transit is truly transparent, then the goal is accomplished.
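(To make "truly transparent" concrete, here is a minimal sketch, in Python, of a cache revalidating its stored copy with the origin server via an HTTP conditional GET before serving it. The header names are standard HTTP; the dict-based store and the function name are my own assumptions for illustration, not any particular vendor's product.)

    # Sketch: revalidate a cached copy with the origin before serving it,
    # so the user gets exactly what the provider currently publishes.
    import http.client

    def serve_through_cache(store, host, path):
        entry = store.get((host, path))            # previously cached copy
        headers = {}
        if entry and entry.get("last_modified"):
            # Conditional GET: ask the origin only whether our copy is stale.
            headers["If-Modified-Since"] = entry["last_modified"]
        conn = http.client.HTTPConnection(host)
        conn.request("GET", path, headers=headers)
        resp = conn.getresponse()
        if resp.status == 304 and entry:           # 304 Not Modified
            resp.read()                            # drain the empty body
            return entry["body"]                   # cached copy still current
        body = resp.read()                         # origin sent new content
        store[(host, path)] = {
            "body": body,
            "last_modified": resp.getheader("Last-Modified", ""),
        }
        return body

If every hop behaves this way, the bytes handed to the user are the bytes the origin would have served, the end-to-end transit stays transparent, and the goal is met.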
However, the expectation that this goal could be accomplished while content providers, transit providers, and client/server software providers work independently of each other is ludicrous. These entities must work together to achieve it. Pointing fingers and blaming the other parties does nothing more than delay the maturation of this industry. This is a complex and extremely interactive system, and it must be dealt with as a whole for an optimal solution to arise.

Perhaps this is the place for a Grand Statement concerning the likelihood of that occurring. :)

-Against my better judgement, Chris
On Tue, Jun 30, 1998 at 02:24:43PM -0700, Chris A. Icide wrote:
Against my better judgement, I've decided to add my tiny bit of content to this thread. If there is anyone to thank for getting me into it, Mr. Porter has done so with his excellent review of the thread. Without it, I might not have seen the humor in it.
Me too; there were one or two points that I'd like to speak to in your excellent summary, Chris.
3. The Dow Jones Industrial Average will never break 4000
Um, I think you misspelled "10,000". :-)
Events such as the infamous 7007 event,
Forgive me revealing my ignorance, but what was that?
Finally, the ethical question comes around. Is it, or is it not, ethical for someone to use a transparent cache? IMHO, the arguments on this portion of the thread have been nothing but sensationalist. It reeks of paranoia. Perhaps the X-Files can do (or have they already done?) a show on this. THEY are stealing your packets without your knowledge! THEY are monitoring your every transmission and receipt! THEY know your innermost secrets! Bah!
Indeed. But that wasn't really the argument. You touch on it next.
In the case of transparent caching, it's my belief that as long as an end user receives content as provided by the content provider, in the format specified by the content provider (yes, this includes dynamic information), both the content provider and the end user couldn't care less what physical, electromagnetic, or optical transformations take place from end to end. If the end-to-end transit is truly transparent, then the goal is accomplished.
This is an example of something I see happen all too often lately. Any given system has obvious, expected, intended outcomes, and non-obvious but still desired ones. When designing a system to replace an earlier system, the analysis which creates the requirements document -- the list of things the system must do -- must be thorough enough to include _all_ the effects of the current system, including side effects that people have come to take advantage of, not just the items that were part of the original design specification.

For example: caching. Over the course of the evolution of the web, the lifetime of the contents of a page has declined, and the changes in these pages are not always merely trivial; someone's earlier point about stock quote pages is pertinent here. The assumptions about caching which were valid when the technique was first introduced are becoming less so, and the limitations of the technique are becoming more important to the people at each end. That this is true is the reason questions of ethics are being raised on this topic.

I concur with the people who say that "truly transparent" caching is acceptable... but it's impossible to build a _truly_ transparent cache without a mathematically precise definition -- at the systems level -- of the protocol involved... which we do not currently have, nor are we likely to have any time soon.

OTOH, I also concur with the people who say (this is the majority opinion I've seen advanced) that the crime Digex is committing is introducing the proxy (i.e., changing the semantics of the protocol) without notifying any of its customers that it was, in effect, _breaking_ the protocol for some of their possible uses. Since I've had bad experiences with Digex's new parent, the ten-years-and-still-no-profit Intermedia Communications, this policy doesn't surprise me overmuch...
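(As a footnote to the freshness point: a minimal sketch, in Python, of the kind of freshness decision a cache has to make under HTTP/1.1-style headers. The function name and the simplifications -- no Vary handling, no heuristic expiry -- are mine, not any particular implementation's. Note that a stock-quote page sending "Cache-Control: no-cache" or "max-age=0" is never served without revalidation, which is exactly the shift in assumptions described above.)

    # Sketch of an HTTP/1.1-style freshness check. Simplified: ignores
    # Vary, upstream Age headers, and heuristic freshness.
    import time
    from email.utils import parsedate_to_datetime

    def is_fresh(resp_headers, stored_at, now=None):
        """May this cached response be served without revalidating?"""
        now = time.time() if now is None else now
        age = now - stored_at                      # seconds since we cached it
        cc = resp_headers.get("Cache-Control", "")
        for directive in (d.strip() for d in cc.split(",") if d.strip()):
            if directive.startswith("max-age="):
                return age < int(directive.split("=", 1)[1])
            if directive in ("no-cache", "no-store"):
                return False                       # always go back to origin
        expires = resp_headers.get("Expires")
        if expires:
            try:
                return now < parsedate_to_datetime(expires).timestamp()
            except (TypeError, ValueError):
                return False                       # malformed: treat as stale
        return False                               # no freshness info given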
However, the expectation that this goal could be accomplished while content providers, transit providers, and client/server software providers work independently of each other is ludicrous. These entities must work together to achieve it. Pointing fingers and blaming the other parties does nothing more than delay the maturation of this industry. This is a complex and extremely interactive system, and it must be dealt with as a whole for an optimal solution to arise.
Precisely. Obviously, if there are half a dozen "transparent proxy" boxen, and only one that people seem to think is "truly" transparent, then we still have a _long_ way to go.
Perhaps this is the place for a Grand Statement concerning the likelihood of that occurring. :)
"Always and never are two words you should always remember never to say." Cheers, -- jra -- Jay R. Ashworth jra@baylink.com Member of the Technical Staff Unsolicited Commercial Emailers Sued The Suncoast Freenet "Two words: Darth Doogie." -- Jason Colby, Tampa Bay, Florida on alt.fan.heinlein +1 813 790 7592 Managing Editor, Top Of The Key sports e-zine ------------ http://www.totk.com