Code:
02/09/2011 10:25:05 PM - CMDPHP: Poller[0] Host[11] DS[52] CMD: perl /opt/cacti-0.8.7g/scripts/ping-latency.pl 192.168.3.1, output: min:685.425 avg:827.105 max:1004.806 dev:109.234 loss:25
02/09/2011 10:25:05 PM - POLLER: Poller[0] Parsed MULTI output field 'min:685.425' [map min->min]
02/09/2011 10:25:05 PM - POLLER: Poller[0] Parsed MULTI output field 'avg:827.105' [map avg->avg]
02/09/2011 10:25:05 PM - POLLER: Poller[0] Parsed MULTI output field 'max:1004.806' [map max->max]
02/09/2011 10:25:05 PM - POLLER: Poller[0] Parsed MULTI output field 'dev:109.234' [map dev->dev]
02/09/2011 10:25:05 PM - POLLER: Poller[0] Parsed MULTI output field 'loss:25' [map loss->loss]
02/09/2011 10:25:05 PM - POLLER: Poller[0] CACTI2RRD: /usr/bin/rrdtool update /opt/cacti-0.8.7g/rra/tsp-idu_-_cantera_cecilio_min_52.rrd --template min:avg:max:dev:loss 1297301103:685.425:827.105:1004.806:109.234:25
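For anyone trying to reproduce this: the script can be run by hand to confirm it prints the single line of space-separated field:value pairs that the poller's MULTI parsing expects (a sketch; "cactiuser" is a placeholder for whatever account your poller runs under):

Code:
# run the data input script the same way the poller does
sudo -u cactiuser perl /opt/cacti-0.8.7g/scripts/ping-latency.pl 192.168.3.1
# expected output, matching the CMDPHP line in the log above:
# min:685.425 avg:827.105 max:1004.806 dev:109.234 loss:25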
Now I can't figure out why the graph is still showing NaNs.
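Since the update command in the log looks correct, my next step is to check the RRD file itself: whether the last update actually landed, and whether the per-DS min/max limits could be the problem (a sketch using the RRD path from the log; rrdtool stores any value outside a DS's min/max as unknown):

Code:
# show step, heartbeat and per-DS min/max limits
rrdtool info /opt/cacti-0.8.7g/rra/tsp-idu_-_cantera_cecilio_min_52.rrd | grep -E 'step|minimal_heartbeat|\]\.(min|max) '
# show the values from the most recent update
rrdtool lastupdate /opt/cacti-0.8.7g/rra/tsp-idu_-_cantera_cecilio_min_52.rrd
# dump the last hour of consolidated data to see whether the NaNs are in the file itself
rrdtool fetch /opt/cacti-0.8.7g/rra/tsp-idu_-_cantera_cecilio_min_52.rrd AVERAGE -s -1h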
I don't know whether this problem is related to this bug: http://bugs.cacti.net/view.php?id=1658
Is anyone else experiencing the same issue?