On Fri, Feb 08, 2002 at 10:09:12PM -0800, Stephen Stuart wrote:
The topic being discussed is to try to reduce USENET bandwidth. One way to do that is to pass pointers around instead of complete articles. If the USENET distribution system passed pointers to
Just to chime in: the idea of passing pointers around has come up in several different forms over the years, and combined with pre-fetching/caching it seems like a very good idea. It's a pity that no one has really tried it, given that many of the tools already exist (disclaimer: I've been dabbling with writing something using existing protocols, on and off, for some time).

So far, inspired by the loss of a ClariNet newsfeed, I've written a silly little personal NNTP server based on a SQL back-end that collects RSS pointers to website "articles". The server stores the "pointers" (URLs) and retrieves each article directly from the remote web server (or a local HTTP cache) on demand (currently it spits out HTML, ugh). The header info (overview) is composed from the RSS data (unfortunately quite sparse in most cases), and the "headline" fetcher script builds a "score" for each article based on my predefined criteria. A rough sketch of that core is below.

I've also found at least one mailing list archive that produces RSS files for each list, which can be used the same way to populate a "newsgroup"; so mailing lists provide a simple "publishing" method other than a website. In all these cases, "replying" to a message is a bit involved, but not impossible.

So, as far as I can tell, all the tools exist to implement a new set of newsgroups based on existing protocols that avoid the massive bit movement of Usenet while preserving the highly efficient reading mechanisms of news readers. Does anyone have a sample, open-source, robust NNTP server implemented in Java or Perl that just needs a SQL backend and an article population mechanism?

Adi
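For concreteness, here is a minimal sketch (in Java, one of the two languages I'm asking about) of that pointer-store core: RSS items become rows in a SQL table, and an article body is fetched from the origin web server only when a reader actually asks for it. The table layout, feed handling, SQLite JDBC driver, and JDK HTTP client are illustrative assumptions, not my actual hack:

// Sketch of the "pointer store": RSS items become rows in a SQL table,
// and article bodies are fetched from the origin web server on demand.
// Assumes a SQLite JDBC driver (e.g. sqlite-jdbc) on the classpath; the
// feed URL, table layout, and numbering scheme are illustrative only.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class PointerStore {
    private final Connection db;
    private final HttpClient http = HttpClient.newHttpClient();

    public PointerStore(String jdbcUrl) throws Exception {
        db = DriverManager.getConnection(jdbcUrl);
        db.createStatement().execute(
            "CREATE TABLE IF NOT EXISTS article (" +
            " newsgroup TEXT, number INTEGER, subject TEXT, url TEXT UNIQUE)");
    }

    /** Poll one RSS feed and record each item as a pointer (URL), not a body. */
    public void pollFeed(String newsgroup, String feedUrl) throws Exception {
        Document rss = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder().parse(feedUrl);
        NodeList items = rss.getElementsByTagName("item");
        PreparedStatement insert = db.prepareStatement(
            "INSERT OR IGNORE INTO article (newsgroup, number, subject, url)" +
            " VALUES (?, (SELECT IFNULL(MAX(number),0)+1 FROM article" +
            "             WHERE newsgroup = ?), ?, ?)");
        for (int i = 0; i < items.getLength(); i++) {
            Element item = (Element) items.item(i);
            insert.setString(1, newsgroup);
            insert.setString(2, newsgroup);
            insert.setString(3, text(item, "title"));   // becomes Subject/overview
            insert.setString(4, text(item, "link"));    // the stored "pointer"
            insert.executeUpdate();
        }
    }

    /** What an ARTICLE command would do: look up the pointer, fetch on demand. */
    public String fetchArticle(String newsgroup, int number) throws Exception {
        PreparedStatement q = db.prepareStatement(
            "SELECT url FROM article WHERE newsgroup = ? AND number = ?");
        q.setString(1, newsgroup);
        q.setInt(2, number);
        ResultSet rs = q.executeQuery();
        if (!rs.next()) return null;                    // NNTP 423: no such article
        HttpRequest req = HttpRequest.newBuilder(URI.create(rs.getString(1))).build();
        return http.send(req, HttpResponse.BodyHandlers.ofString()).body();  // still HTML, ugh
    }

    private static String text(Element parent, String tag) {
        NodeList n = parent.getElementsByTagName(tag);
        return n.getLength() > 0 ? n.item(0).getTextContent().trim() : "";
    }
}

A real server would wrap this in the usual NNTP command loop (GROUP, ARTICLE, XOVER), build the overview lines from the stored RSS fields, and run the scoring pass at poll time; the SQLite-specific SQL and the JDK HTTP client are just stand-ins for whatever back-end and fetcher one actually uses.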