
3 Jan 2002, 1:55 p.m.
I'm looking for some real user input from network operators: to what degree would you want to be able to measure error rates on device interfaces? Many tools currently report only whole percentages of 1% and above (so a 0.9% error rate is displayed as 0%), but it is quite possible users may want to measure fractional percentages as well. Does anyone have any opinions or preferences based on your current experience? Does it matter what type of interface you are managing (Ethernet, serial, etc.)?

Thanks - Dan Holmes
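
For concreteness, here is a minimal sketch of the kind of calculation in question, assuming standard MIB-II interface counters polled over SNMP (ifInErrors, ifInUcastPkts, ifInNUcastPkts from RFC 1213); the polling scheme and the choice to keep fractional precision are illustrative assumptions, not a description of any particular tool:

    def error_rate_percent(prev, curr):
        """Fractional input error rate over one polling interval.

        prev/curr are dicts of MIB-II counters sampled at the start and
        end of the interval: ifInErrors, ifInUcastPkts, ifInNUcastPkts.
        Returns a float percentage (e.g. 0.9), not a rounded integer.
        """
        errors = curr["ifInErrors"] - prev["ifInErrors"]
        packets = (curr["ifInUcastPkts"] - prev["ifInUcastPkts"]
                   + curr["ifInNUcastPkts"] - prev["ifInNUcastPkts"]
                   + errors)
        if packets <= 0:
            return 0.0
        return 100.0 * errors / packets

    # With integer rounding, 9 errors in 1000 packets shows up as 0%;
    # kept as a float it is reported as 0.9%.
    sample_prev = {"ifInErrors": 0, "ifInUcastPkts": 0, "ifInNUcastPkts": 0}
    sample_curr = {"ifInErrors": 9, "ifInUcastPkts": 991, "ifInNUcastPkts": 0}
    print(round(error_rate_percent(sample_prev, sample_curr), 2))  # 0.9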