Enough people have that bandwidth that availability isn't an impediment to new development. Some people had broadband, then came Napster, then broadband exploded. Obviously it isn't that clear-cut.
From: "Blake Hudson" <blake@ispn.net>
To: nanog@nanog.org
Sent: Wednesday, June 2, 2021 9:29:53 AM
Subject: Re: New minimum speed for US broadband connections
On 6/1/2021 10:50 PM, Haudy Kazemi via NANOG wrote:
On bandwidth: perhaps some kind of 80/20 or
90/10 rule could be applied that uses broadly available
national peak service speeds as the basis for a formula. An
example might be...the basic service tier speed available to
80% of the population is the definition of broadband. When 80%
of the population has access to 100/100 Mbps home service,
then 100/100 becomes the benchmark. When 80% of the population
has access to 1/1 Gbps home service, then 1/1 becomes the
benchmark. Areas that don't have service that meets the
benchmark would be eligible for future-proof build-out
incentives, with incentives exponentially increasing as the area falls
further and further behind the benchmark. With 100/100 Mbps
as the benchmark, areas that currently are stuck with
unreliable 1.5 Mbps/384k DSL should be receiving upgrade
priority. And even higher priority if the benchmark has
shifted to 1 Gbps.
I love this idea! I think this may be the most useful nugget in the
thread.
There is a bit of chicken vs egg situation where applications don't
use X+1 bandwidth because folks only have X bandwidth. New
applications could be developed, or new ways of using the bandwidth
could be possible, if only the bandwidth existed. On the other side
of the coin, ISPs don't invest in faster speeds and folks don't
purchase more than X bandwidth because no applications that exist
today require more than X. The latter is where our current
conversation seems to have landed. However, we all know that the
trend is towards increasing performance, just at a steady pace, with
some folks getting a performance bump before others. When the masses
gained access to consistent 10M download speeds, applications that
were previously niche suddenly became ubiquitous (streaming HD video
is a good example of this). When the masses gained access to 3M
upload, applications like video conferencing suddenly became
commonplace. Unfortunately, the folks who were late in receiving
access to these performance thresholds became the digital "have-nots"
once these applications became available (they were doing just fine
before because everyone around them was doing things differently).
I tried to think back towards a goal of ensuring that everyone has
"good internet" access (or that as few people are left behind as
possible), and wondered if a yearly "cost of living" type adjustment
was required. However, I think that might land us in an endless game
of catch-up that ultimately may be unproductive. Your sliding scale
based on the performance of the most common internet access (an 80%
threshold) makes great sense, as applications will target the
performance level of the mass market. An occasional
audit of the state of the internet and adjustment to our thresholds
for what is considered the norm would be a great way to define where
the low end is and lift these folks out of the "poor internet" group
and help get them into the "good internet" group. I am now really
curious where that threshold would land today. Would we be above or
below the current definition of broadband?
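
Just to make the sliding-scale idea concrete, here is a rough Python
sketch of how the 80% benchmark and an exponentially growing build-out
incentive weight could be computed. The coverage numbers, tier list,
and the "doublings behind" lag measure are placeholder assumptions of
mine for illustration, not anything taken from FCC data or from the
original proposal.

import math

# Fraction of the population with access to each (download, upload) tier
# in Mbps. These figures are made up for the sake of the example.
coverage = {
    (25, 3): 0.93,
    (100, 20): 0.88,
    (100, 100): 0.62,
    (1000, 1000): 0.35,
}

THRESHOLD = 0.80  # the "available to 80% of the population" rule


def benchmark(coverage, threshold=THRESHOLD):
    """Return the fastest tier available to at least `threshold` of the population."""
    eligible = [tier for tier, frac in coverage.items() if frac >= threshold]
    return max(eligible) if eligible else None


def incentive_weight(area_tier, bench, base=2.0):
    """Incentive grows exponentially with how many doublings of download
    speed an area lags behind the benchmark; areas at or above the
    benchmark get no weight."""
    if area_tier[0] >= bench[0]:
        return 0.0
    doublings_behind = math.log2(bench[0] / area_tier[0])
    return base ** doublings_behind


if __name__ == "__main__":
    bench = benchmark(coverage)
    print("benchmark tier:", bench)  # (100, 20) with the numbers above
    print("1.5/0.384 DSL weight:", incentive_weight((1.5, 0.384), bench))

With those placeholder numbers the benchmark lands at 100/20, and an
area stuck on 1.5 Mbps/384k DSL picks up a much larger incentive weight
than one sitting at 25/3, which is roughly the prioritization Haudy
described.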