0.8.6b cactid upgrade - graphs not updated
Just upgraded to 0.8.6b from 0.8.5a. Everything appeared to run perfectly. Unfortunately, none of my graphs are updating with cactid. Are there any gotchas I may have missed?
Same thing here. I did an upgrade on a Debian system yesterday from 0.8.5b to 0.8.6b, and none of the graphs were being updated. I changed the crontab to use poller.php and tried both polling methods, with no luck. When I turned logging up to DEBUG it would start polling data and then end up saying:
Waiting for 1/1 pollers
Waiting for 1/1 pollers
.....
repeating this forever. I tried increasing the number of processes and threads, but it didn't help. I had to downgrade to 0.8.5b to get it working again. Anyone have suggestions?
System is Debian Linux (testing distro) with
mysql 4.0.21
php 4.3.9
rrdtool 1.0.48
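For anyone else stuck in the endless "Waiting for 1/1 pollers" loop: in 0.8.6 the poller generally decides it is finished by checking the poller_time table, and collected values sit in poller_output until poller.php writes them to the RRDs. A rough check from the shell, assuming the default database name cacti and database user cactiuser:
# mysql -u cactiuser -p cacti
mysql> SELECT COUNT(*) FROM poller_output;
mysql> SELECT * FROM poller_time;
Rows piling up in poller_output mean data is being collected but never flushed to the RRD files; a poller_time row whose end_time never gets filled in suggests the cmd.php/cactid run is dying before it reports back.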
Hey Folks,
Instead of starting a new thread I'm going to update this one as well.
I just noticed the same problem. I upgraded last Thursday. No graph updates, although cactid claims 77 hosts are processed. So I assumed it was a problem with cactid and went ahead and changed over to poller.php.
Same problem. I've checked permissions as well and deleted the poller cache. Even running as root from the command line, only some of my graphs are updated!
(and poller does see my hosts:
10/12/2004 09:12:21 AM - SYSTEM STATS: Time: 8.6298 s, Method: cmd.php, Processes: 1, Threads: N/A, Hosts: 77, Hosts/Process: 77)
Check this:
# ls -lt
-rw-r--r-- 1 cactiuser root 141484 Oct 12 09:12 mismon_load_1min_12.rrd
-rw-r--r-- 1 cactiuser root 47836 Oct 12 09:12 mismon_proc_14.rrd
-rw-r--r-- 1 cactiuser root 47836 Oct 12 09:12 mismon_users_13.rrd
-rw-r--r-- 1 cactiuser root 47836 Oct 12 09:12 mismon_mem_buffers_10.rrd
-rw-r--r-- 1 cactiuser root 47836 Oct 12 09:12 mismon_mem_swap_11.rrd
-rw-r--r-- 1 cactiuser root 94660 Oct 7 12:25 server25_ds0_302.rrd
-rw-r--r-- 1 cactiuser root 94660 Oct 7 12:25 server25_ds0_301.rrd
-rw-r--r-- 1 cactiuser root 94660 Oct 7 12:25 server25_ds0_300.rrd
-rw-r--r-- 1 cactiuser root 94660 Oct 7 12:25 access_system_ds0_294.rrd
-rw-r--r-- 1 cactiuser root 94660 Oct 7 12:25 access_system_ds0_295.rrd
-rw-r--r-- 1 cactiuser root 94660 Oct 7 12:25 access_system_ds0_296.rrd
As an example, I just ran this by hand, as root, from the command line.
I'm really interested in what's going on.
For me, I'm going to have to revert to my old version. This is far too important to wait on. I wish I'd caught it sooner!
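A quick way to confirm exactly when a given RRD last accepted an update, rather than eyeballing ls timestamps, is rrdtool's last command (present in the 1.0.x series as well). The rra path below is an assumption for a default install:
# cd /var/www/cacti/rra
# rrdtool last mismon_load_1min_12.rrd
# rrdtool last server25_ds0_302.rrd
rrdtool last prints the UNIX timestamp of the most recent update, so comparing a file that is still being written with one from the listing above that stopped on Oct 7 shows straight away which data sources the poller is no longer touching.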
Same problems here. I posted a new thread about my problems. Basically, RRDs stopped updating in the middle of the morning last Sunday while everyone was in bed.
In my searches I came across this topic, and since I had lowered my logging level I decided to raise it.
That has helped me see _some_ diagnostic information, but of course it hasn't helped get my RRDs updated.
I upgraded to the latest release with no joy...
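Once the logging level is raised, it is usually quicker to pull the interesting lines straight out of the log than to scroll the whole file; something along these lines, with the log path being an assumption for a default install under /var/www/cacti:
# grep -iE 'error|warn|fail' /var/www/cacti/log/cacti.log | tail -n 50
Anything rrdtool complains about during an update (bad permissions on the rra directory, unknown data source names, timestamps older than the last update) should surface there.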
- TheWitness
- Developer
Aspirax,
Your problem in the attached was that you were not using poller.php. Poller.php is required in Cacti 0.8.6x. You select cmd.php or cactid from the user interface under Settings->Poller.
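In practice that means cron should run poller.php and nothing else; poller.php then launches cmd.php or cactid for you based on that setting. A typical entry looks roughly like this (the install path and user are assumptions, adjust to your layout), e.g. in /etc/cron.d/cacti:
*/5 * * * * cactiuser php /var/www/cacti/poller.php > /dev/null 2>&1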
TheWitness
True understanding begins only when we realize how little we truly understand...
Life is an adventure, let yours begin with Cacti!
Author of dozens of Cacti plugins and customizations. Advocate of LAMP, MariaDB, IBM Spectrum LSF and the world of batch. Creator of IBM Spectrum RTM, author of quite a bit of unpublished work and most of Cacti's bugs.
_________________
Official Cacti Documentation
GitHub Repository with Supported Plugins
Percona Device Packages (no support)
Interesting Device Packages
For those wondering, I'm still here, but lost in the shadows. Yearning for fewer bugs. Who wants a Cacti 1.3/2.0? Streams, anyone?
Here is my log file...
- Attachments
-
- cacti_log.zip
- no rrds created/updated
- (33.98 KiB) Downloaded 263 times
I am having these problems as well... anyone have a fix yet?
jgefaell wrote: Same problems here. I posted a new thread about my problems. Basically, RRDs stopped updating in the middle of the morning last Sunday while everyone was in bed.
In my searches I came across this topic, and since I had lowered my logging level I decided to raise it.
That has helped me see _some_ diagnostic information, but of course it hasn't helped get my RRDs updated.
I upgraded to the latest release with no joy...