I think the big deal here is the "100%" thing. If Speedtest is one of many tests, then I don't particularly see the problem. It shouldn't be any more difficult to convince politicians that any system (testing or otherwise) can have problems than it is to convince them of any other hard fact. (IOW: Nearly impossible, but you have to try. :)

-- TTFN, patrick

On Jan 29, 2011, at 1:29 PM, Jeff Richmond wrote:
Mike, nothing is perfect, so let's just start with that. What the FCC has done to measure this is to partner with SamKnows and then have friendly DSL subscribers at the participating telcos run modified CPE firmware that tests against SamKnows' servers. We have been collecting data for this for the past couple of months, actually. More can be found here:
http://www.samknows.com/broadband/fcc_and_samknows
While I have issues even with that approach, it is certainly better than hitting a web-based speedtest site, where anything at all problematic on the customer LAN side of the CPE can cause erroneous results.
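To make the distinction concrete, here is a rough sketch in Python of what a CPE-anchored download test boils down to: fetch a fixed-size object from a measurement server the tester controls, time it, and repeat, with the test running on the device sitting right at the WAN edge so nothing on the customer LAN gets in the way. The URL, object size, and run count are placeholders, not the actual SamKnows protocol.

# Rough sketch of a CPE-anchored download test: fetch a fixed-size object
# from a measurement server you control, time it, and repeat so one bad
# sample doesn't dominate. The URL and object size are placeholders, not
# the actual SamKnows test protocol.
import time
import urllib.request

TEST_URL = "http://measurement.example.net/10MB.bin"  # hypothetical server
RUNS = 5

def one_run(url):
    start = time.time()
    with urllib.request.urlopen(url) as resp:
        nbytes = len(resp.read())
    return (nbytes * 8) / (time.time() - start) / 1e6  # Mbit/s

samples = sorted(one_run(TEST_URL) for _ in range(RUNS))
print("median download: %.2f Mbit/s" % samples[len(samples) // 2])

Running several passes and reporting the median is the point: any single run, from any vantage point, can be thrown off by a transient problem anywhere on the path.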
Good luck, -Jeff
On Jan 29, 2011, at 10:00 AM, Mike wrote:
Hello,
My company is a small CLEC / broadband provider serving rural communities in northern California, and we are the recipient of a small grant from the state through our public utilities commission. We went out to the middle of nowhere and deployed ADSL2+ (chalk one up for the good guys!), and now that we're done, our state PUC wants to gather performance data to evaluate the results of our project and ensure we delivered what we said we would. Bigger picture, our state is actively attempting to map broadband availability and the service levels on offer, and this data will factor into that overall picture, to be used for future grant/loan programs and other support mechanisms, so this really is going to touch every provider who serves end users in the state.
The rub is that they want to legislate that the web-based 'speedtest.com' is the ONLY and MOST AUTHORITATIVE metric, trumping all other considerations, and that the provider is 100% at fault and responsible for making fraudulent claims if speedtest.com doesn't agree. No discussion is permitted about sync rates, packet loss, internet congestion, provider route diversity, end-user computer performance problems, far-end congestion issues, far-end server issues or CPU loading, latency/RTT, or the like. They are going to decide that the quality of any provider's service rests solely and exclusively on the numbers returned from 'speedtest.com' alone, period.
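To put rough numbers on just one of those factors: a single TCP stream is bounded by RTT and packet loss over the whole path, not by the access link alone. The sketch below uses the well-known Mathis et al. approximation (BW ~ MSS/RTT * 1.22/sqrt(p)); the figures plugged in are invented for illustration, not measurements from our network.

# Why one web-based test can't be the sole authority: a lone TCP stream is
# capped by RTT and loss on the whole path, not just by the DSL sync rate.
# Uses the Mathis et al. approximation; the inputs are made-up examples.
import math

def mathis_throughput_mbps(mss_bytes, rtt_s, loss_rate):
    """Approximate single-stream TCP throughput ceiling in Mbit/s."""
    return (mss_bytes * 8 / rtt_s) * (1.22 / math.sqrt(loss_rate)) / 1e6

# 1460-byte MSS, 80 ms RTT to a distant test server, 0.5% loss somewhere on
# the path: the stream tops out around 2.5 Mbit/s even if the ADSL2+ line
# itself is synced at 20 Mbit/s.
print("%.1f Mbit/s" % mathis_throughput_mbps(1460, 0.080, 0.005))

In other words, a distant or lossy test server can make a perfectly healthy line look "fraudulent" under the proposed rule.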
All of you in this audience, I think, probably immediately understand the various problems with such an assertion. It's one of those situations where, to the uninitiated, it SEEMS LIKE this is the right way to do it, and it SEEMS LIKE there's some validity to what's going on, but in practice we engineering types know it's a far different animal and should not be used for real live benchmarking of any kind where there is a demand for statistical validity.
My feeling is that if the state needs to do benchmarking, then it ought to use statistically sound methodologies, along the same lines as any other benchmark or test done by government agencies and national standards bodies, so that the results are reproducible and dependable. The questions are: given how much of a hot-button issue this is, how do we go about getting 'the message' across, how do we engineer something that could be considered statistically valid, and most importantly, how do we get it accepted by non-technical legislators and regulators? A sketch of the kind of summary I have in mind follows.
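As a very rough sketch of what a defensible methodology might report: many scheduled samples per line spread across the day, summarized as a median plus a low percentile, instead of one web test treated as ground truth. The sample values below are invented purely for illustration.

# Rough sketch of summarizing repeated automated tests for one line,
# rather than judging it on a single web-based measurement. The sample
# values are invented purely for illustration.
import statistics

def summarize(samples_mbps):
    s = sorted(samples_mbps)
    idx10 = max(0, int(round(0.10 * (len(s) - 1))))
    return {
        "samples": len(s),
        "median_mbps": round(statistics.median(s), 1),
        "p10_mbps": round(s[idx10], 1),  # "slow" end of the distribution
    }

# e.g. a week of scheduled tests on one ADSL2+ line, in Mbit/s
week = [18.2, 17.9, 16.4, 18.0, 12.1, 17.5, 18.3, 15.8, 17.7, 16.9]
print(summarize(week))

Something like that, collected from inside the CPE over a defined period, would at least be reproducible and could be argued about on the merits.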
Mike-