On 26/09/2010, at 6:43 AM, Matthew Walster <matthew@walster.org> wrote:
On 25 September 2010 21:16, Rodrick Brown <rodrick.brown@gmail.com> wrote:
I think most people are aware that the Blizzard "World of Warcraft" patcher distributes files through BitTorrent,
<snip>
I once read an article about making BitTorrent scalable by using anycasted caching services at the ISP POP closest to the end user. Given sufficient traffic on a specified torrent, the caching device would assemble the file, then serve it directly to subscribers in the form of an additional (preferred) peer. Similar to a CDN or Usenet, except that content is cached on demand rather than deliberately pushed out from a locus.
Was anything ever standardised in that field? I imagine, with much of P2P traffic being (how shall I put this...) less than legal, that ISPs would not want to be held liable for the content cached there?
M
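The trigger you describe (start caching once a torrent sees enough local demand) needs very little machinery. A rough sketch in Python follows; the threshold, the class and method names, and the idea of keying on the torrent's infohash are all my own assumptions, not anything standardised:

from collections import defaultdict

# Hypothetical POP-local cache admission policy: once a torrent has been
# requested by enough distinct subscribers behind this POP, the cache
# joins the swarm, assembles the file, and offers itself as a preferred peer.
SWARM_THRESHOLD = 50  # assumed tuning knob: distinct local requesters

class PopTorrentCache:
    def __init__(self, threshold=SWARM_THRESHOLD):
        self.threshold = threshold
        self.requesters = defaultdict(set)  # infohash -> subscriber IPs seen
        self.cached = set()                 # infohashes we now seed locally

    def observe_request(self, infohash, subscriber_ip):
        """Record a subscriber asking for a torrent; start caching on demand."""
        self.requesters[infohash].add(subscriber_ip)
        if (infohash not in self.cached
                and len(self.requesters[infohash]) >= self.threshold):
            self.cached.add(infohash)
            self.start_seeding(infohash)

    def start_seeding(self, infohash):
        # Placeholder: a real cache would join the swarm, fetch the pieces,
        # and then advertise itself to local subscribers as a peer.
        print("caching and seeding %s at this POP" % infohash)

Everything hard is hidden in start_seeding(), of course; the admission policy itself is trivial.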
IMHO, sooner or later our community will catch on and begin to deploy such technology. P2P is a really elegant 'tool' for distributing large files, as we all know. I expect even the biggest last-mile providers to lose the arms race they currently wage against it, and to start participating in and steering the flow of data instead. Throwing millions into technologies such as DPI to thwart this 'tool' only detracts from a last-mile provider's ability to offer service; I believe this is one reason the USA lags the rest of the world in broadband deployment.

Ultimately, I believe it will make sense to design last-mile networks to benefit from P2P (e.g. let end stations communicate locally rather than forcing traffic that could stay local through a session-based router at a central office), and then take advantage by deploying a scenario such as the one you've outlined to keep swarms local. Before we do that, though, we need to cut the paranoia around this particular tool (paranoia manufactured by the RIAA and others), and we need to see a few more execs with vision.

jy
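P.S. To make "keep swarms local" concrete: if a tracker or client knew the prefixes a POP serves, biasing peer selection is only a few lines. A hypothetical Python sketch; the prefixes, function names, and the cache-peer parameter are invented for illustration:

import ipaddress

# Assumed: the provider publishes the prefixes served by this POP.
LOCAL_PREFIXES = [ipaddress.ip_network(p)
                  for p in ("192.0.2.0/24", "198.51.100.0/24")]

def is_local(peer_ip):
    addr = ipaddress.ip_address(peer_ip)
    return any(addr in net for net in LOCAL_PREFIXES)

def prefer_local(peers, cache_peer=None):
    """Order a swarm's peer list so on-net peers (and the POP cache,
    if one exists) come first, so traffic that can stay local does."""
    ordered = sorted(peers, key=lambda ip: not is_local(ip))
    if cache_peer:
        ordered.insert(0, cache_peer)  # the anycasted cache as preferred peer
    return ordered

# Example: the off-net peer sinks to the back, the cache leads the list.
print(prefer_local(["203.0.113.9", "192.0.2.77"], cache_peer="192.0.2.1"))

Running it returns the cache first, the on-net peer second, and the off-net peer last; that ordering is the whole trick.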