I started to graph "Interface - Errors/Discards" on a Cisco 6509 chassis switch. My graphed values do not match the values in the RRD file or the results returned via SNMP.
My SNMP values do change, but only by very small amounts, if at all, with each poll. I expect Cacti to take the delta from period 1 to period 2 and divide by 300 seconds to get a per-second average, and to store that average in the RRD. The values in my RRD files support this expectation: they are around 0.003, which works out to about 1/300.
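The expected behavior can be sketched in a few lines. This is a simplified illustration of how an RRDtool COUNTER data source turns two SNMP polls into a stored per-second rate (the function name and the lack of counter-wrap handling are simplifications, not Cacti's actual code):

```python
def counter_rate(prev, curr, step=300):
    """Per-second rate between two counter samples taken `step` seconds apart.

    step=300 matches Cacti's default 5-minute polling interval.
    """
    delta = curr - prev
    if delta < 0:
        # A real RRDtool handles 32/64-bit counter wraps; this sketch does not.
        raise ValueError("counter wrap not handled in this sketch")
    return delta / step

# One errored packet in a 5-minute window stores roughly 1/300:
print(counter_rate(100, 101))  # about 0.0033
```

This is why values on the order of 0.003 in the RRD file are exactly what a slowly incrementing error counter should produce.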
What I get instead is a graph that appears to show values in the millions alongside correct values in the 0.001 to 0.01 range.
This is the rrdtool dump for the time frame in the screen captures:
Rookie mistake. At first glance I assumed the graph was showing values in the millions, when in fact the unit of measure was milli: rrdtool's graph legend uses SI prefixes, so a lowercase "m" means milli (10^-3) while an uppercase "M" means mega (10^6). Lesson learned: M is not the same as m.
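For anyone else who trips over this, here is a minimal sketch of the SI-prefix scaling rrdtool applies in its graph legends (the function name and prefix table are illustrative; rrdtool's real formatter covers more prefixes and is configurable via --units-exponent):

```python
def si_format(value):
    """Format a value with an SI prefix, as an rrdtool graph legend would."""
    prefixes = [(1e6, "M"), (1e3, "k"), (1.0, ""), (1e-3, "m")]
    for factor, prefix in prefixes:
        if abs(value) >= factor:
            return f"{value / factor:.2f} {prefix}"
    return f"{value:.2f} "

print(si_format(3.3e-3))  # "3.30 m" -- milli, easily misread as millions
print(si_format(3.3e6))   # "3.30 M" -- mega, actual millions
```

So a legend reading "3.30 m" is 0.0033, three orders of magnitude below 1, not three orders of magnitude above a thousand.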