Counter wrap and data source "maximum value"
Hi folks,
I have an SNMP stat that reports "seconds of use" as a counter. This stat internally wraps at 43,000 seconds. When it wraps, the graph shows a huge spike, since 43,000 is nowhere near the 32- and 64-bit counter wrap limits that RRDtool can automatically correct for.
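To make the spike concrete, here is a rough back-of-the-envelope calculation with made-up counter values: RRDtool only knows how to correct for wraps at 2^32 and 2^64, so when a counter rolls over at 43,000 it assumes a 32-bit wrap and computes an enormous rate.
Code:
# made-up example: counter goes from 42,950 back around to 50 over a 300 s poll;
# rrdtool assumes a wrap at 2^32, not at 43,000, so the computed rate explodes
echo "scale=2; (2^32 - 42950 + 50) / 300" | bc
# -> 14316414.65 "seconds of use" per second, i.e. the spike on the graph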
Researching the problem, it looks like a possible solution is to set the data source's "maximum value". What I've read also says that the "maximum value" should be set to the maximum value *per second* for my stat. Well, the maximum "seconds of use per second" can only be 1, so I am tempted to use that value.
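For what it's worth, that cap corresponds to the max field of the RRD data source definition. A minimal sketch of what it looks like at creation time (the filename, DS name, heartbeat and RRA below are only illustrative, not what Cacti actually generates):
Code:
# 300 s step; COUNTER data source with min 0 and max 1 (the per-second rate cap);
# any computed rate above the max is stored as unknown rather than graphed as a spike
rrdtool create seconds_of_use.rrd --step 300 \
    DS:seconds_of_use:COUNTER:600:0:1 \
    RRA:AVERAGE:0.5:1:600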
However, I hesitate because of the issue of imperfect sampling intervals: the poller cannot sample at exactly 300-second intervals, and RRDtool interpolates to compensate. I am concerned that if I set the maximum to 1 and an interpolated value comes out as 1.0000001, I will end up with holes in my graph.
So is it wise to set the "maximum value" to a slightly higher number, like 1.5, to allow for some fudge in the system?
Or should I be solving this problem a different way?
Thanks,
jlg
For the benefit of others, I will answer my own question here: ***YES***, you should allow for some fudge factor due to interpolation.
In my case, there is no way to get more than 1.000000 seconds of use out of a time period of 1.000000 seconds, so I had set my data source maximum to 1. Most of the time this worked just fine. However, reviewing the data after 3 days, I found one unknown (NaN) in my RRD, which I assume is due to the rate going over 1:
Code:
<!-- 2007-01-09 19:20:00 PST / 1168399200 --> <row><v> 1.0000000000e+00 </v></row>
<!-- 2007-01-09 19:25:00 PST / 1168399500 --> <row><v> 1.0000000000e+00 </v></row>
<!-- 2007-01-09 19:30:00 PST / 1168399800 --> <row><v> 9.9667777778e-01 </v></row>
<!-- 2007-01-09 19:35:00 PST / 1168400100 --> <row><v> 9.9998888889e-01 </v></row>
<!-- 2007-01-09 19:40:00 PST / 1168400400 --> <row><v> 9.9668881506e-01 </v></row>
<!-- 2007-01-09 19:45:00 PST / 1168400700 --> <row><v> 9.9667774086e-01 </v></row>
<!-- 2007-01-09 19:50:00 PST / 1168401000 --> <row><v> NaN </v></row>
<!-- 2007-01-09 19:55:00 PST / 1168401300 --> <row><v> 1.0000000000e+00 </v></row>
<!-- 2007-01-09 20:00:00 PST / 1168401600 --> <row><v> 9.9668881506e-01 </v></row>
<!-- 2007-01-09 20:05:00 PST / 1168401900 --> <row><v> 9.9997785161e-01 </v></row>
<!-- 2007-01-09 20:10:00 PST / 1168402200 --> <row><v> 1.0000000000e+00 </v></row>
<!-- 2007-01-09 20:15:00 PST / 1168402500 --> <row><v> 1.0000000000e+00 </v></row>
<!-- 2007-01-09 20:20:00 PST / 1168402800 --> <row><v> 1.0000000000e+00 </v></row>
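For anyone else reviewing their data the same way, a quick sketch of how to spot unknown rows without reading the whole dump (the filename is a placeholder for whatever Cacti actually created):
Code:
# dump the RRD to XML and count/show the rows stored as unknown
rrdtool dump seconds_of_use.rrd | grep -c 'NaN </v>'
rrdtool dump seconds_of_use.rrd | grep 'NaN </v>' | head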
More updates:
1. Cacti complains if I try to set the data source "maximum value" to any number with a decimal point, at least in the version I'm running (Debian Sarge's package, which I understand is probably 1 or 2 years old). So I had to set it to 2.
By the way, it would probably be helpful to document somewhere that "maximum value" is the maximum *rate per second* when using COUNTER. Otherwise, I would naturally assume I should put in 43,000 for my case.
2. Note to #1 above: I can use "rrdtool tune <file.rrd> --maximum <ds>:1.1" to set the value to 1.1, so the limitation is in Cacti (see the sketch after this list).
3. The other possibility for the NaN in my data is a wrap, since my data source wraps at 43,000 instead of at 2^32 (~4G). However, I have been snapshotting the raw data and found no wrap at the timestamp shown in my data above, though there is an inconsistency I need to resolve before I can be certain of that conclusion. I'll update here when I know more.
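For reference, the workaround from item 2 spelled out, plus a way to confirm it took effect (the filename and DS name are placeholders; check "rrdtool info" for the real DS name if unsure):
Code:
# set the per-second maximum on the data source to 1.1, bypassing the cacti form
rrdtool tune seconds_of_use.rrd --maximum seconds_of_use:1.1

# verify: the ds[...].max line should now show 1.1
rrdtool info seconds_of_use.rrd | grep max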
jlg
gandalf (Developer):
jlg wrote: 2. Note to #1 above: I can use "rrdtool tune <file.rrd> --maximum <ds>:1.1" to set the value to 1.1, so the limitation is in Cacti.

Please create a bug report at http://bugs.cacti.net for that issue.

Reinhard
gandalf wrote: Please create a bug report at http://bugs.cacti.net for that issue.

Will do...
jlg
Update:
After having set the maximum to 1.1, I now see values larger than 1.00000:
Code:
<!-- 2007-01-11 13:00:00 PST / 1168549200 --> <row><v> 1.0000000000e+00 </v></row>
<!-- 2007-01-11 13:05:00 PST / 1168549500 --> <row><v> 9.9337763012e-01 </v></row>
<!-- 2007-01-11 13:10:00 PST / 1168549800 --> <row><v> 1.0032778883e+00 </v></row>
<!-- 2007-01-11 13:15:00 PST / 1168550100 --> <row><v> 1.0000111483e+00 </v></row>
<!-- 2007-01-11 13:20:00 PST / 1168550400 --> <row><v> 1.0033222222e+00 </v></row>
<!-- 2007-01-11 13:25:00 PST / 1168550700 --> <row><v> 1.0000111111e+00 </v></row>
<!-- 2007-01-11 13:30:00 PST / 1168551000 --> <row><v> 1.0000000000e+00 </v></row>
<!-- 2007-01-11 13:35:00 PST / 1168551300 --> <row><v> 1.0000000000e+00 </v></row>
<!-- 2007-01-11 13:40:00 PST / 1168551600 --> <row><v> 9.9337763012e-01 </v></row>
<!-- 2007-01-11 13:45:00 PST / 1168551900 --> <row><v> 1.0032778883e+00 </v></row>
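My rough understanding of where a rate slightly above 1 comes from, as a back-of-the-envelope sketch with made-up numbers: the counter only ever gains 1 per real second, but if one SNMP reply is delayed by about a second, the counter delta can cover roughly 301 seconds of use while the recorded poll interval is only about 300 seconds.
Code:
# made-up example: counter delta of 301 recorded over a 300 s poll interval
echo "scale=10; 301 / 300" | bc
# -> 1.0033333333, in the same ballpark as the 1.0033xxx values above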
jlg