Mike, I know you have a lot of experience in this. I have built several networks and owned ISPs, too. How is it really all that more expensive to offer higher Internet speeds? The cost of the Internet bits per subscriber ceased being a major consideration in most budgets about 10 years ago. Staffing, trucks, billing, and rents cost far more. I'm way more upset about having to hire another FTE or buy a truck than about having to add another transit or peering port. Oh, wait, if you are still using last-century technology to deliver the last mile, I can see the problem. You cannot get enough bits to the subscriber without a large investment to replace the last mile.

Most customers WILL notice a difference between a 10 Mbit and a 1 Gig connection day to day. Your performance assumptions seem to be based on there only ever being a single traffic flow over that connection, from a single endpoint. Typical subscriber usage isn't anything remotely like that anymore. It is several to dozens of devices, and usually every resident using bandwidth simultaneously when they are home. Plus all the background downloads of smartphone updates, massive content updates on all the game consoles, operating system updates, all those cloud backups, plus IoT devices like cameras with cloud DVRs. You may not like all these devices and we can debate their usefulness, but the fact is, consumers are buying them and using them, and when things don't work well, the belief is "my ISP sucks," even if that isn't entirely true.

My strongly held opinion is that fiber optic cable to the premises is the best and only long-term viable technology for "broadband," with a projected life span of 30 years or more. Everyone who has grid-tied electrical service should get to have fiber if they want it. I also believe that ISPs need to manage the customer's WiFi most of the time, because it is a huge part of the end-user's quality of experience. WiFi 6E will go a long way towards reducing interference and channel congestion and making "auto channel" actually work, but it will still be another 2-3 years before it is really common.

Fiber optic networks operated in a competent way are always going to win compared to any other technology. It is just a matter of time.

On Tue, Jun 1, 2021 at 1:34 PM Mike Hammett <nanog@ics-il.net> wrote:
"Why is 100/100 seen as problematic to the industry players?"
In rural settings, it's low density, so you're spending a bunch of money with a low probability of getting any return. Also, a low probability that the customer cares.
"There's an underlying, I think, assumption that people won't use access speed/bandwidth that keeps coming up."
On a 95th-percentile basis, no, they don't use it.
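(For anyone not billing this way: a few lines of Python as a minimal sketch of the usual 95th-percentile calculation, assuming 5-minute rate samples; the sample values here are made up and a real month has ~8,640 of them.)

# Hypothetical 5-minute average rates (Mbps) for one subscriber --
# only 20 samples here to keep the example short.
samples_mbps = [0.2, 1.5, 0.8, 40.0, 3.1, 0.4, 0.9, 2.2, 0.6, 0.3,
                1.1, 0.7, 2.8, 0.5, 1.9, 0.2, 3.4, 0.8, 1.2, 0.6]

# Common billing convention: sort descending, discard the top 5% of
# samples, and the highest remaining sample is the 95th-percentile rate.
samples_mbps.sort(reverse=True)
discard = int(len(samples_mbps) * 0.05)   # 1 sample in this toy set
p95 = samples_mbps[discard]

print(f"95th percentile: {p95} Mbps")     # 3.4 -- the brief 40 Mbps burst never shows up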
On shorter time spans, sure. Does it really matter, though? If I can put a 100 meg file into Dropbox in under a second versus 10 seconds, does that really matter? If Netflix gets my form submission in 0.01 seconds instead of 0.1 seconds, does it matter?
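(Back-of-the-envelope only, ignoring TCP ramp-up and protocol overhead, here is that transfer-time arithmetic in Python:)

# Rough transfer time for a 100 MB file at various access speeds.
FILE_BITS = 100 * 8 * 1_000_000           # 100 megabytes expressed in bits

for label, rate_bps in [("10 Mbps", 10e6), ("100 Mbps", 100e6), ("1 Gbps", 1e9)]:
    print(f"{label:>8}: {FILE_BITS / rate_bps:6.1f} seconds")

#  10 Mbps:   80.0 seconds
# 100 Mbps:    8.0 seconds
#   1 Gbps:    0.8 seconds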
I think you'll find few to argue against "faster is better." The argument is at what price? At what perceived benefit?
Show me an average end-user that can tell the difference between a 10 meg upload and a 1 gig upload, aside from media-heavy professionals or the one-time full backup of a phone, PC, etc. Okay, show me two of them, ten of them...
99% of the end-users I know can't tell the difference between any speeds above 5 megs. At that point it just either works or doesn't work.
----- Mike Hammett Intelligent Computing Solutions http://www.ics-il.com
Midwest-IX http://www.midwest-ix.com
------------------------------
From: "Christopher Morrow" <morrowc.lists@gmail.com>
To: "Mike Hammett" <nanog@ics-il.net>
Cc: aaron1@gvtc.com, "nanog list" <nanog@nanog.org>
Sent: Tuesday, June 1, 2021 12:14:43 PM
Subject: Re: New minimum speed for US broadband connections
On Tue, Jun 1, 2021 at 12:44 PM Mike Hammett <nanog@ics-il.net> wrote:
That is true, but if no one uses it, is it really gone?
There's an underlying, I think, assumption that people won't use access speed/bandwidth that keeps coming up. I don't think this is an accurate assumption. I don't think it's really ever been accurate.
There are a bunch of examples in this thread of reasons why 'more than X' is a good thing for the end-user, and that average usage over time is a bad metric to use in the discussion. At the very least the ability to get around/out-of serialization delays and microburst behavior is beneficial to the end-user.
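(To put a number on the serialization point, a quick Python sketch, assuming full-size 1500-byte frames and nothing else in the queue:)

# Time to clock a single 1500-byte frame onto the wire at various line rates.
FRAME_BITS = 1500 * 8

for label, rate_bps in [("10 Mbps", 10e6), ("100 Mbps", 100e6), ("1 Gbps", 1e9)]:
    delay_us = FRAME_BITS / rate_bps * 1e6
    print(f"{label:>8}: {delay_us:8.1f} microseconds per frame")

# A microburst of 100 packets queued ahead of you costs ~120 ms at 10 Mbps
# but only ~1.2 ms at 1 Gbps -- invisible in an average, very visible to an
# interactive flow.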
Maybe the question that's not asked (but should be) is: "Why is 100/100 seen as problematic to the industry players?"
-- Jim Troutman, jamesltroutman@gmail.com Pronouns: he/him/his 207-514-5676 (cell)