They don't take a one-second sample every five minutes; they take the five-minute average rate measured by their router.
Unless they're insane, or their routers don't support that. I dunno who makes routers that don't support that, though.
Sorry, perhaps I didn't make the extreme example sufficiently clear. In the extreme case I cited (full rate for five minutes, idle for five minutes, repeated), the five-minute average rate oscillates between zero and full line rate. The period of oscillation is 10 minutes (i.e. five minutes for the five-minute rate to decay from line rate to zero and five minutes to build back up to line rate).

Now if you sample every five minutes, and the sample point is synchronized to the peak and trough of the five-minute rate, you will get successive readings of 'line rate', zero, 'line rate', zero, etc. The 95% sample value will be 'line rate'. If you change _nothing_ except shift the sample point two and a half minutes forward in time, the sample points will consistently produce readings of 'half line rate', 'half line rate', ..., and the 95% point is one half of line rate.

Same algorithm, same raw data, different 95% answers, both valid, yet one is twice as large as the other. Great outcome for a billing system, isn't it?

(The comment in my earlier note about getting a zero reading requires using something other than a five-minute average data rate. The point I'm trying to make in this posting is that even if you do the 'right' thing and collect interface counter readings every five minutes and compute the first-order differences yourself to get the five-minute data rates, the 95% 'answer' is still variable.)
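If you want to convince yourself, here's a minimal sketch (my own, not anything the billing systems actually run) that simulates exactly this square-wave load and computes the 95th percentile of the five-minute average rates for two sample phases. The line rate, the one-month duration, and the "discard the top 5% of samples" percentile method are all assumptions made for illustration:

```python
LINE_RATE = 100.0           # Mbit/s, arbitrary "full line rate"
PERIOD    = 600             # seconds: 5 minutes on + 5 minutes off
WINDOW    = 300             # 5-minute averaging window, in seconds
DURATION  = 30 * 24 * 3600  # roughly one month of seconds

def instantaneous_rate(t):
    """Square-wave traffic: full rate for the first half of each period, idle for the rest."""
    return LINE_RATE if (t % PERIOD) < PERIOD // 2 else 0.0

def five_minute_average(t_end):
    """Mean rate over the 5-minute window ending at t_end, at 1-second resolution."""
    return sum(instantaneous_rate(t) for t in range(t_end - WINDOW, t_end)) / WINDOW

def percentile_95(samples):
    """Conventional 95th-percentile billing: sort and discard the top 5% of samples."""
    ordered = sorted(samples)
    return ordered[int(0.95 * (len(ordered) - 1))]

for offset in (0, 150):  # sample phase: aligned with the peaks vs. shifted 2.5 minutes
    samples = [five_minute_average(t)
               for t in range(WINDOW + offset, DURATION, WINDOW)]
    print(f"offset {offset:3d}s -> 95th percentile = {percentile_95(samples):.1f} Mbit/s")
```

With a zero offset the samples alternate between 100 and 0 and the 95th percentile comes out at the full 100 Mbit/s; shift the sample points by 150 seconds and every sample reads 50, so the 95th percentile is 50 Mbit/s. Same traffic, same algorithm, a factor of two in the bill.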