Last Mile ISP Quality Measurements
What are other last-mile ISPs doing to measure the quality of their connections? We all know about pinging various destinations, and we also all know that pinging a destination doesn't necessarily tell you the whole quality story.

I currently have Smokeping pulling HTTPS from about 20 to 25 of the "top" websites, per the old Alexa rankings. I feel as though I could be doing more. I want to more closely emulate the end-user experience in a repeatable, quantifiable fashion. I'd also like to do A/B comparisons: when I make change X, how do the measurements change?

If I'm already on the low-hanging-fruit path, then so be it.

-----
Mike Hammett
Intelligent Computing Solutions
Midwest Internet Exchange
The Brothers WISP
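One way to make "emulate the end-user experience" repeatable and quantifiable is to time each phase of an HTTPS fetch (DNS, TCP connect, TLS handshake, time to first byte) rather than just the total. Below is a minimal Python sketch of that idea; the target list is a placeholder rather than a recommendation, and a production probe would add IPv6 handling, retries, and export into whatever time-series store is already in use.

```python
#!/usr/bin/env python3
"""Per-phase HTTPS timing sketch: DNS, TCP connect, TLS handshake, TTFB.
TARGETS is a placeholder; swap in whatever 'top sites' list you already probe."""
import socket
import ssl
import time

TARGETS = ["www.wikipedia.org", "www.cloudflare.com", "www.google.com"]  # placeholder list

def time_https(host: str, timeout: float = 5.0) -> dict:
    t0 = time.monotonic()
    # DNS resolution (first returned address only, for simplicity)
    ip = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)[0][4][0]
    t_dns = time.monotonic()
    # TCP three-way handshake
    sock = socket.create_connection((ip, 443), timeout=timeout)
    t_tcp = time.monotonic()
    # TLS handshake
    tls = ssl.create_default_context().wrap_socket(sock, server_hostname=host)
    t_tls = time.monotonic()
    # Minimal GET, then wait for the first response byte
    tls.sendall(f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n".encode())
    tls.recv(1)
    t_ttfb = time.monotonic()
    tls.close()
    return {
        "dns_ms": round((t_dns - t0) * 1000, 1),
        "tcp_ms": round((t_tcp - t_dns) * 1000, 1),
        "tls_ms": round((t_tls - t_tcp) * 1000, 1),
        "ttfb_ms": round((t_ttfb - t_tls) * 1000, 1),
    }

if __name__ == "__main__":
    for host in TARGETS:
        try:
            print(host, time_https(host))
        except OSError as exc:
            print(host, "error:", exc)
```

Run from the same vantage point before and after change X, the per-phase numbers give directly comparable figures for A/B purposes.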
We are doing something similar with NetPath in SolarWinds, but mainly using the stream URLs of some popular streaming services that we commonly see in use (FuboTV, etc.). It came in handy recently in tracking down customer complaints that ended up being a peering capacity issue further upstream.

Tim
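For the streaming-URL approach, one low-effort way to turn it into a number is to periodically pull a manifest or media segment and log fetch time and effective throughput; a sustained evening-hours drop toward one provider is a reasonable hint of a congested peering or transit path. A rough sketch, with the URL as an obvious placeholder rather than any real service endpoint:

```python
#!/usr/bin/env python3
"""Rough throughput probe for a streaming-style URL.
STREAM_URL is a placeholder; point it at a manifest or segment you are allowed
to fetch repeatedly."""
import time
import urllib.request

STREAM_URL = "https://example.com/path/to/segment.ts"  # placeholder

def fetch_once(url: str, timeout: float = 10.0) -> dict:
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    elapsed = time.monotonic() - start
    return {
        "bytes": len(body),
        "seconds": round(elapsed, 3),
        "mbit_per_s": round(len(body) * 8 / elapsed / 1e6, 2) if elapsed else None,
    }

if __name__ == "__main__":
    print(fetch_once(STREAM_URL))
```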
I would personally have IT-friendly end users mimic and test their existing systems on the new ISP, especially cloud services. Having read a book on cloud computing by some PhD administrators, they redefined "cloud" as "cloudy" in the sense that you don't know what route your clients' packets will take to reach the server.
Have you considered hosting a RIPE Anchor? https://atlas.ripe.net/anchors/about/

Minimal cost, a good-of-the-Internet project, good insights, and it answers the use case you describe.

Steve P
I am reluctant to respond because it might end up sounding like an ad for libreqos.io. Leaving aside the TCP RTT tracking, the CAKE shaping, and the mark and drop statistics in that product, the (mostly wireless) ISPs we work with typically have:

- a dashboard of long-term SNMP statistics for key parameters like signal strength (RSSI),
- a heatmap of RTTs to those routers and the upstreams (Smokeping cannot handle this kind of density),
- a bandwidth tracker (usually on a 5-minute interval),
- a few Raspberry Pis or equivalents at the towers doing active measurements on demand, and
- a set of actionable items derived from all of that, which the support techs work off of.

Then there is a per-customer screen that captures as much useful information as possible about the customer and every hop along the way. There are a lot of pictures of dashboards like this on the web; see Preseem, Paraqum, and of course LibreQoS for examples. People use a variety of backend products for it (Grafana, Redis, and InfluxDB are popular). A minimal sketch of the RTT-collection idea appears after this message.

Hope this helps.
--
Podcast: https://www.youtube.com/watch?v=bxmoBr4cBKg
Dave Täht
CSO, LibreQos
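To give a flavor of the "heatmap of RTTs to routers and upstreams" idea without assuming any particular vendor product: the sketch below samples TCP-connect time to a list of device addresses (a serviceable RTT proxy when ICMP needs root or gets deprioritized) and emits InfluxDB-style line protocol on stdout, which common time-series and heatmap frontends can ingest. The device list and measurement name are placeholders.

```python
#!/usr/bin/env python3
"""TCP-connect RTT sampler emitting InfluxDB line protocol on stdout.
DEVICES and the 'rtt' measurement name are placeholders for illustration."""
import socket
import time

DEVICES = [("10.0.0.1", 22), ("10.0.0.2", 22), ("192.0.2.1", 443)]  # placeholder (ip, port) pairs

def connect_rtt(ip: str, port: int, timeout: float = 2.0) -> float | None:
    start = time.monotonic()
    try:
        with socket.create_connection((ip, port), timeout=timeout):
            return (time.monotonic() - start) * 1000.0  # milliseconds
    except OSError:
        return None  # unreachable, filtered, or timed out

if __name__ == "__main__":
    while True:
        now_ns = time.time_ns()
        for ip, port in DEVICES:
            rtt = connect_rtt(ip, port)
            if rtt is not None:
                # line protocol: measurement,tag=value field=value timestamp
                print(f"rtt,device={ip} connect_ms={rtt:.2f} {now_ns}", flush=True)
        time.sleep(60)  # one sample per device per minute
```

Fed into whatever collector is already in place, a minute-resolution series like this is dense enough to render as a per-device RTT heatmap.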
Would a QoE product be able to show me that my connectivity to Slack sucks right now? I've followed Preseem for a long time, and LibreQoS for a bit less than that. I took them as ISP-subscriber focused, not greater-Internet focused.

-----
Mike Hammett
Intelligent Computing Solutions
http://www.ics-il.com
Midwest-IX
http://www.midwest-ix.com
I have a probe, but not an anchor. That would just help with simple reachability issues for probes that test against it, wouldn't it? It wouldn't necessarily be able to monitor popular Internet destinations or paths across different peers?

-----
Mike Hammett
Intelligent Computing Solutions
http://www.ics-il.com
Midwest-IX
http://www.midwest-ix.com
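An anchor is mostly a well-known target, but the Atlas platform also lets you run user-defined measurements from probes you select (for example, probes inside your own ASN) toward arbitrary destinations, which is closer to what is being asked for here. Below is a rough sketch of summarizing results for an existing ping measurement via what I understand to be the Atlas v2 REST results endpoint; the measurement ID is a placeholder, the endpoint and result field names should be checked against the current RIPE Atlas API documentation, and creating measurements additionally requires an API key.

```python
#!/usr/bin/env python3
"""Fetch results of an existing RIPE Atlas ping measurement and summarize RTTs.
MEASUREMENT_ID is a placeholder; endpoint and field names are assumptions to be
verified against the RIPE Atlas API documentation."""
import json
import urllib.request

MEASUREMENT_ID = 12345678  # placeholder: your user-defined measurement
URL = f"https://atlas.ripe.net/api/v2/measurements/{MEASUREMENT_ID}/results/?format=json"

def summarize(url: str) -> None:
    with urllib.request.urlopen(url, timeout=30) as resp:
        results = json.load(resp)
    rtts = []
    for entry in results:
        # Ping results are assumed to carry a list of per-packet samples under "result"
        for sample in entry.get("result", []):
            rtt = sample.get("rtt")
            if isinstance(rtt, (int, float)):
                rtts.append(rtt)
    if rtts:
        print(f"samples={len(rtts)} min={min(rtts):.1f}ms "
              f"avg={sum(rtts)/len(rtts):.1f}ms max={max(rtts):.1f}ms")
    else:
        print("no RTT samples found (check measurement type/ID)")

if __name__ == "__main__":
    summarize(URL)
```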
> measure the quality of their connections
Really depends on what you are trying to measure. Some metrics are going to be great at telling you the quality and performance of the network at L3, but thanks to the Stupid Content Provider Tricks that we use, they won't tell you anything about the L4/L7 experience that your customers may have.

There are tons of different things you could measure; it's putting them together in such a way that all parties in the conversation have the same context and understanding that can be exceptionally tricky.
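One way to make that L3-versus-L7 gap visible in the data is to record, against the same hostname, both a bare network-level RTT (approximated here by TCP connect time) and a full application-level object fetch time; when the first stays flat while the second climbs, the problem sits above L3 (CDN node selection, TLS, server load) rather than in the path. A small sketch, with the target host as a placeholder:

```python
#!/usr/bin/env python3
"""Compare network-level RTT (TCP connect) with application-level fetch time
for the same host. HOST is a placeholder target."""
import socket
import time
import urllib.request

HOST = "www.example.com"  # placeholder

def tcp_connect_ms(host: str, port: int = 443, timeout: float = 5.0) -> float:
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=timeout):
        return (time.monotonic() - start) * 1000.0

def https_fetch_ms(host: str, timeout: float = 10.0) -> float:
    start = time.monotonic()
    with urllib.request.urlopen(f"https://{host}/", timeout=timeout) as resp:
        resp.read()
    return (time.monotonic() - start) * 1000.0

if __name__ == "__main__":
    l3 = tcp_connect_ms(HOST)
    l7 = https_fetch_ms(HOST)
    print(f"{HOST}: tcp_connect={l3:.1f}ms full_fetch={l7:.1f}ms ratio={l7 / l3:.1f}x")
```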
participants (6)
- Dave Taht
- Mike Hammett
- Steve Pointer
- Tim Burke
- Tom Beecher
- touseef.rehman1@btinternet.com