Ok.. wrote a little shell script to snmpget uptimes, then awk'd the output to return just the days. Right now one of our boxes is reporting a 142-day uptime.
I've got my data source set to GAUGE, and with the graph set to LAST I get no data. If I set it to AVERAGE I get a reading of 42, and the same happens with MAX.
Where am I off here? I've compared it against other numeric counters (how many people on a box, how many processes; those are set to AVERAGE).
Been wracking my brain but can't clean up my own mess; any tips would be appreciated.
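For reference, a hypothetical sketch of the kind of script described above (the host, community string, and function name are assumptions, not the actual script):

```shell
#!/bin/sh
# Sketch only: poll sysUpTime via SNMP and strip out just the day count.
# snmpget -v1 typically returns a line like:
#   DISMAN-EVENT-MIB::sysUpTimeInstance = Timeticks: (1234567890) 142 days, 21:31:18.90
# so we scan for the field right before "day"/"days" and print it.
days_from_uptime() {
  awk '{ for (i = 1; i < NF; i++) if ($(i+1) ~ /^days?/) { print $i; exit } }'
}

# Only poll if snmpget is actually installed; target/community are placeholders.
if command -v snmpget >/dev/null 2>&1; then
  snmpget -v1 -c public target.example.com sysUpTimeInstance | days_from_uptime
fi
```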
-C9-
A few things.
You will definitely want to create the data source as a GAUGE, as you did. Also note that when making data source changes, you must either 1) delete the .rrd file, or 2) check the "Update .rrd" checkbox.
You will want to select all of the items for "Associated RRA's".
You may also want to check the log file and see what kind of data your script is feeding cacti to make sure that it is correct.
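To go with the last point, one quick sanity check (a sketch; the helper name is made up) is to run the script by hand and confirm it prints a bare number, since a GAUGE data source expects a plain numeric value and anything with extra text (e.g. "142 days,") will show up as bad data:

```shell
#!/bin/sh
# Sketch: verify a script's output is a plain integer before Cacti sees it.
is_plain_number() {
  case "$1" in
    ''|*[!0-9]*) return 1 ;;  # empty, or contains a non-digit character
    *)           return 0 ;;
  esac
}

is_plain_number "142"       && echo "ok: 142"
is_plain_number "142 days," || echo "rejected: '142 days,'"
```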
-Ian