That is true, but if no one uses it, is it really gone?
There's an underlying assumption that keeps coming up in this discussion, I think: that people won't actually use additional access speed/bandwidth.
I don't think that assumption is accurate, and I don't think it ever really has been.
There are plenty of examples in this thread of why 'more than X' is a good thing for the end-user, and why average usage over time is a bad metric to use in this discussion. At the very least, the headroom to get past serialization delays and absorb microbursts benefits the end-user, as the rough sketch below illustrates.
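To make the serialization-delay point concrete, here's a minimal back-of-the-envelope sketch (my own illustrative numbers, not figures from the thread): the time a single full-size frame occupies the wire drops by an order of magnitude going from a 100 Mbps link to a gigabit one, and that's exactly the headroom that helps with microbursts even when average utilization looks low.

```python
# Rough illustration only: serialization delay for one 1500-byte Ethernet frame
# at a few example access speeds. The link rates are assumptions for the sketch.
FRAME_BITS = 1500 * 8  # one full-size frame, in bits

for label, mbps in [("25 Mbps", 25), ("100 Mbps", 100), ("1 Gbps", 1000)]:
    delay_us = FRAME_BITS / (mbps * 1e6) * 1e6  # seconds converted to microseconds
    print(f"{label:>8}: {delay_us:7.1f} µs per frame on the wire")
```

A burst of a few dozen frames queued behind that adds up to single-digit milliseconds at 100 Mbps and tens of milliseconds at lower rates, which is where end-users actually feel it, regardless of what their monthly average usage says.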
Maybe the question that's not asked (but should be) is:
"Why is 100/100 seen as problematic to the industry players?"