On Thu, 31 Aug 2000, Jason Slagle wrote:
> The problem is that SCP is several orders of magnitude slower than FTP. I use scp, rsync (on top of ssh), NFS, and several other methods of moving files around, and FTP blows them all away.
> You also need to build an FTP-like structure on top of it, i.e., I pick the files I want instead of having to know the filenames.
> Until this happens, I can see no viable alternative to FTP.
HTTP, perchance? The only things missing are a machine-parsable file indexing method (which would be easy enough to standardize on if someone felt the need to do so; think a "text/directory" MIME type, which would benefit more than just HTTP, or a multipart list of URLs) and server-to-server transfers coordinated from your client, which most people have disabled anyway for security reasons.

But you get the added benefits of MIME typing, human-readable markup, caching if you have a nearby cache, inherent firewall friendliness (no data-connection foolishness), and simple negotiation of encrypted transfers (SSL). And for command-line people like myself, there's lynx, w3m, and wget.

FTP is disturbingly behind on features, some of which (decent certificate authentication, full-transaction encryption, data-type labelling, and cache usefulness) are becoming more important today. Either the FTP RFC needs a near-complete overhaul, or the HTTP and other RFCs need to be updated to include the missing functionality.

--
Edward S. Marshall <emarshal@logic.net>
http://www.nyx.net/~emarshal/
-------------------------------------------------------------------------------
[ Felix qui potuit rerum cognoscere causas. ]
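P.S. To make the indexing idea concrete, here's a rough sketch of the client side, assuming a hypothetical "text/directory" format where each line of the listing is a tab-separated name, size, and MIME type. The format, the server URL, and the helper names are all invented for illustration; nothing like this is standardized anywhere yet.

# Sketch: FTP-style "pick the files you want" over plain HTTP,
# against a hypothetical "text/directory" listing format where
# each line is "name<TAB>size<TAB>mime-type". The format and the
# server URL below are assumptions, not any existing standard.
import urllib.request

BASE = "http://ftp.example.org/pub/"  # hypothetical archive

def read_listing(url):
    """Fetch a listing and return a list of (name, size, mimetype)."""
    with urllib.request.urlopen(url) as resp:
        if resp.headers.get_content_type() != "text/directory":
            raise ValueError("server did not send a machine-parsable listing")
        entries = []
        for line in resp.read().decode("utf-8").splitlines():
            name, size, mimetype = line.split("\t")
            entries.append((name, int(size), mimetype))
        return entries

def fetch(name):
    """Retrieve a single file by name, like an FTP 'get'."""
    with urllib.request.urlopen(BASE + name) as resp, open(name, "wb") as out:
        out.write(resp.read())

if __name__ == "__main__":
    # Browse the index first, then pick files, instead of having
    # to know the filenames in advance.
    for name, size, mimetype in read_listing(BASE):
        print("%-30s %10d  %s" % (name, size, mimetype))
    fetch("README")

Note that the listing and the files travel over the same GET machinery, which is exactly why caching and SSL come along for free.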