Unfortunately, just because we know how difficult it would be to provide a solution to this problem does not mean that everyone subscribes to that view. One should not discount the argument purely based on its source, especially since a few very "interesting" articles have shown up recently in a number of publications, including the current issue of Forbes. The author, whose name escapes me at the moment, is under the mistaken belief that since internet traffic flows through hubs, it would be possible to intercept it and store it on computers located at those hubs. A white paper describing the issues arising from any attempt to intercept and store that much data would likely do better than an argument about the unreliability of the source.
Alex
It's obvious that many people spreading this information (no matter how credible the source) have little knowledge of how much data flows through such hubs. If I remember correctly, AOL-TW, for example, does over 100 terabits of traffic every day. No storage system in the world (that I know of) can write at 10 GB/sec, and keep in mind that at OC-192 speeds (roughly 10 Gbit/s) we would be writing 36 terabits, or about 4.5 terabytes, of data per hour on every link. Not even the most prestigious government agencies have the ability to sort through petabytes of data per day.
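
To put rough numbers on that, here is a quick back-of-envelope sketch in Python. The only hard figure is the standard OC-192 line rate of ~10 Gbit/s; the 100-link hub at the end is a hypothetical of my own for illustration, not anything from the article.

# Back-of-envelope: what it would take to store intercepted traffic at a hub.
# Assumption: OC-192 line rate of ~10 Gbit/s. The 100-link hub below is a
# hypothetical figure for illustration, not a number from the article.

OC192_BITS_PER_SEC = 10e9                        # ~10 Gbit/s per link

bytes_per_sec = OC192_BITS_PER_SEC / 8           # ~1.25 GB/s
tb_per_hour = bytes_per_sec * 3600 / 1e12        # ~4.5 TB/hour per link
tb_per_day = tb_per_hour * 24                    # ~108 TB/day per link

print(f"one OC-192 link: {bytes_per_sec / 1e9:.2f} GB/s, "
      f"{tb_per_hour:.1f} TB/hour, {tb_per_day:.0f} TB/day")

# A hub terminating 100 such links would have to sustain ~125 GB/s of
# writes and absorb ~10.8 petabytes per day, before even indexing any of it.
links = 100
print(f"{links} links: {links * bytes_per_sec / 1e9:.0f} GB/s aggregate, "
      f"{links * tb_per_day / 1e3:.1f} PB/day")

Even granting generous hardware, the storage and indexing problem dwarfs the interception problem, which is exactly the point the white paper should make.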