On Tue, 15 Jul 2014 13:08:58 -0600, Brett Glass said:
> Estimates of the maximum bandwidths of all the human senses, combined,
> range between the capacity of a T1 line (at the low end) and about
> 4 Mbps (at the high end). A human being simply is not wired to accept
> more input. (Yes, machines could digest more... which means that
> additional bandwidth to and from the home might be useful for the
> purpose of spying on us.) What does this imply about the FCC's proposal
> to redefine "broadband" as a symmetrical 10 Mbps?
Actually, vision is higher bandwidth than that - most VR people estimate that approaching human vision requires a gigapixel/second (at 24 bits or more per pixel) - and even that needs to play lots of eye-tracking games to concentrate the rendering where the eye is focused. Consider how fast even high-end NVidia cards can pump out pixels, and you can *still* tell it's CGI.

Well-shot 4K video of real objects displayed on a good monitor is *just* reaching the "it actually looks real" level - and that's a hell of a lot more than 4 Mbps.

And remember that bits are consumed by more than just one human per dwelling - you can have multiple people watching different things, plus silicon-based consumers burning lots of bandwidth on behalf of their carbon-based masters. There's about a half-zillion ways a gaming console can burn bandwidth, for example. Heck, the Raspberry Pi under my TV can soak up more than 4 Mbits/sec just doing a software update.

/me makes popcorn, waits for 4K displays to drop under US$1K, and watches the network providers completely lose their shit....
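For anyone who wants to sanity-check those numbers, here's a quick back-of-the-envelope script. All the constants are assumptions on my part - the gigapixel/second VR figure at 24 bits/pixel from above, 3840x2160 at 60 fps for 4K, a ~25 Mbit/s compressed UHD stream (roughly what Netflix recommends for its UHD tier), and an invented household mix - so treat it as a sketch, not a measurement:

#!/usr/bin/env python3
# Back-of-the-envelope bandwidth arithmetic. Constants are assumptions,
# not measurements.

MBIT = 1e6
GBIT = 1e9

# Raw rate for the "approaching human vision" VR estimate
vr_pixels_per_sec = 1e9          # one gigapixel per second
bits_per_pixel = 24
vr_raw = vr_pixels_per_sec * bits_per_pixel
print(f"VR estimate, raw: {vr_raw / GBIT:.0f} Gbit/s")        # 24 Gbit/s

# Raw (uncompressed) 4K video at 60 fps
w, h, fps = 3840, 2160, 60
uhd_raw = w * h * fps * bits_per_pixel
print(f"4K60 raw: {uhd_raw / GBIT:.1f} Gbit/s")               # ~11.9 Gbit/s

# Compressed 4K stream vs. the proposed "broadband" floor
uhd_stream = 25 * MBIT                                        # assumed rate
print(f"4K streams per 10 Mbit/s line: {10 * MBIT / uhd_stream:.2f}")  # 0.40

# One plausible household: two ~5 Mbit/s HD streams, one 4K stream,
# and a console/software update pulling ~8 Mbit/s (all assumed)
household = 2 * 5 * MBIT + uhd_stream + 8 * MBIT
print(f"Household aggregate: {household / MBIT:.0f} Mbit/s")  # 43 Mbit/s

Even the compressed figures land well above the proposed 10 Mbit/s floor once there's more than one screen in the house, and the raw rates are three orders of magnitude beyond it.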