Thanks to Andreas for pointing me to the basic setup for reportit.
I've successfully created a report for traffic usage, and golly! it looks like the right stuff! Whoopie!
I (of course) have a couple of questions:
(1)
As it is, the report only takes a few seconds using the (slow) Cacti connection, so speed is not an immense problem. However, it appears that I need to do some work to get the report to run as a cron job. I looked at the PHP module, but I don't think I have the right tooling on my Cacti server to build it. It appears that I need to download the PHP source in order to add the ReportIt PHP module to PHP? Is this correct?
(2)
Is it possible to add the in data and out data in the report? Or would it be simpler (and perhaps more general) to do this later in Excel by exporting the report as a CSV, and manipulating the results?
Many thanks again to Andreas for a _MOST_ useful plugin!
Whew, got reportit to run, now have a couple more questions
---------
The Glue Guy
- browniebraun
- Developer
- Posts: 791
- Joined: Tue Jun 13, 2006 1:17 am
- Location: Cologne, Germany
Hi GlueGuy!
For scheduled reporting it isn't necessary to use the PHP module for RRDtool - you can also use the default connection type.
IMO the module is only recommended for larger reports (~2000 data objects), where the time difference between the default type and the PHP bindings increases significantly. If you want to use these bindings, you have to decide whether to include them as a dynamic or a static PHP module.
The static version requires recompiling your PHP, but it's the alternative workaround for using this fast connection type if the dynamic one doesn't work. (I had issues with PHP 5.2 on SUSE Linux 9 running the PHP bindings under the CLI.) The dynamic library is easier to set up.
Take a look at the following links to see how scheduled reporting has to be set up:
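As a rough sketch, a daily scheduled run could be wired up with a crontab entry like the one below. The script name and path are assumptions for illustration - check your own Cacti plugin directory and the ReportIt documentation for the actual CLI runner:

```shell
# Hypothetical example: run ReportIt's scheduled reports every night at 02:00.
# Both the PHP binary location and the runner script path are assumptions --
# adjust them to match your Cacti installation.
0 2 * * *  /usr/bin/php /var/www/cacti/plugins/reportit/runtime.php > /dev/null 2>&1
```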
If by "in and out data" you mean all measured values, then I have to say that it isn't possible.
Theoretically it would be, but I don't know where I would store that amount of data.
Best regards
-Andreas-
If the little flower has a kink, the butterfly was too fat!
reportit v0.7.5a
SNMPAgent v0.2.3
Download ReportIt | Download SNMPAgent | ReportIt SVN | ReportIt Templates | Wish list
Thanks Andreas, you're a gem!
Since we only have ~300 data objects, the runtime does not seem to be an issue at this point.
I have put in a cron job that seems to be working; I just have it set to run once per day.
Once I got the field delimiters and commas vs decimals sorted out, the data is very useful.
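For anyone else sorting out the same export quirks: here is a minimal sketch of normalizing a CSV that uses semicolon field delimiters and comma decimal marks before loading it elsewhere. The column names and sample data are made up for illustration - ReportIt's actual export layout may differ:

```python
import csv
import io

# Made-up sample resembling a European-locale CSV export:
# ';' as the field delimiter, ',' as the decimal mark.
raw = "interface;avg_in;avg_out\nfa0/1;1234,5;678,9\n"

rows = []
for row in csv.DictReader(io.StringIO(raw), delimiter=";"):
    # Swap the decimal comma for a point, then parse the numeric columns.
    rows.append({
        "interface": row["interface"],
        "avg_in": float(row["avg_in"].replace(",", ".")),
        "avg_out": float(row["avg_out"].replace(",", ".")),
    })

print(rows[0]["avg_in"])   # 1234.5
```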
Is there an explanation of how to use the "nan" variable when setting up measurands? I think it could help normalize data from several customers who turn off their equipment when they're not using it. As it is, their usage levels go off the charts, because the NaNs don't count in the daily average.
To calculate daily aggregate byte usage, I take the 24-hour average (in bits/s) and multiply it by 60*60*24/8 to get the total bytes transferred over a 24-hour period. When the average doesn't cover a full set of data points, this aggregation comes out wrong.
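To make the skew concrete, here is a small sketch of the problem. The polling step and sample values are invented for illustration; the point is that an average computed over only the non-NaN samples, multiplied by a full day, overstates the traffic of a device that was off part of the day:

```python
import math

SECONDS_PER_DAY = 60 * 60 * 24  # 86400

def daily_bytes_naive(avg_bps):
    """The formula from the post: 24h average (bits/s) * 86400 / 8 -> bytes/day."""
    return avg_bps * SECONDS_PER_DAY / 8

def daily_bytes_nan_aware(samples_bps, step_seconds=300):
    """Sum only the non-NaN samples, each covering one polling interval."""
    valid = [s for s in samples_bps if not math.isnan(s)]
    return sum(valid) * step_seconds / 8

# A hypothetical device running at 8000 bit/s that is off (NaN) half the day,
# polled every 5 minutes (288 samples per day):
samples = [8000.0] * 144 + [float("nan")] * 144

# An average that ignores NaNs reports 8000 bit/s for the whole day...
print(daily_bytes_naive(8000.0))       # 86,400,000 bytes -- twice the real traffic
# ...while summing the actual samples gives the traffic that really occurred.
print(daily_bytes_nan_aware(samples))  # 43,200,000 bytes
```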
---------
The Glue Guy