High load and graph gaps after yum update

Post support questions that directly relate to Linux/Unix operating systems.

Moderators: Developers, Moderators

PrimeEvil
Posts: 3
Joined: Thu Mar 24, 2016 7:45 am

High load and graph gaps after yum update

Post by PrimeEvil »

My Cacti installation is borked after running yum updates on CentOS 7. I'm getting gaps in all my graphs (including the localhost graphs) and the overall load is significantly higher. Browsing pages and loading graphs are painfully slow. Any help would be appreciated.

Here is my yum log showing what got updated:

Code:

Mar 22 11:07:55 Updated: glusterfs-libs-3.7.1-16.0.1.el7.centos.x86_64
Mar 22 11:08:02 Updated: samba-libs-4.2.3-12.el7_2.x86_64
Mar 22 11:08:12 Updated: samba-common-tools-4.2.3-12.el7_2.x86_64
Mar 22 11:08:28 Updated: samba-common-4.2.3-12.el7_2.noarch
Mar 22 11:08:31 Updated: libwbclient-4.2.3-12.el7_2.x86_64
Mar 22 11:08:40 Updated: samba-client-libs-4.2.3-12.el7_2.x86_64
Mar 22 11:08:42 Updated: samba-common-libs-4.2.3-12.el7_2.x86_64
Mar 22 11:08:55 Updated: openssh-6.6.1p1-25.el7_2.x86_64
Mar 22 11:08:58 Updated: 32:bind-license-9.9.4-29.el7_2.3.noarch
Mar 22 11:09:02 Updated: 32:bind-libs-9.9.4-29.el7_2.3.x86_64
Mar 22 11:09:08 Updated: glusterfs-client-xlators-3.7.1-16.0.1.el7.centos.x86_64
Mar 22 11:09:56 Updated: glusterfs-3.7.1-16.0.1.el7.centos.x86_64
Mar 22 11:10:39 Updated: ntopng-data-2.2.160315-697.noarch
Mar 22 11:10:47 Updated: nss-util-3.19.1-9.el7_2.x86_64
Mar 22 11:13:15 Updated: pfring-dkms-6.2.0-487.noarch
Mar 22 11:14:50 Updated: pfring-6.2.0-487.x86_64
Mar 22 11:17:10 Updated: ntopng-2.2.160315-697.x86_64
Mar 22 11:19:29 Updated: firefox-38.7.0-1.el7.centos.x86_64
Mar 22 11:19:33 Updated: glusterfs-api-3.7.1-16.0.1.el7.centos.x86_64
Mar 22 11:19:36 Updated: 32:bind-utils-9.9.4-29.el7_2.3.x86_64
Mar 22 11:19:40 Updated: 32:bind-libs-lite-9.9.4-29.el7_2.3.x86_64
Mar 22 11:19:44 Updated: openssh-server-6.6.1p1-25.el7_2.x86_64
Mar 22 11:19:58 Updated: openssh-clients-6.6.1p1-25.el7_2.x86_64
Mar 22 11:20:07 Updated: samba-4.2.3-12.el7_2.x86_64
Mar 22 11:20:12 Updated: libsmbclient-4.2.3-12.el7_2.x86_64
Mar 22 11:21:29 Updated: tzdata-2016b-1.el7.noarch
Mar 22 11:21:38 Updated: tzdata-java-2016b-1.el7.noarch
Mar 22 11:21:47 Updated: libssh2-1.4.3-10.el7_2.1.x86_64

Cacti Version: 0.8.8b
Cacti OS: unix
SNMP Version: NET-SNMP 5.7.2
RRDTool Version: RRDTool 1.4.x
Hosts: 320
Graphs: 1242
Data Sources:
  Script/Command: 11
  SNMP: 1157
  SNMP Query: 450
  Script Query: 1
  Script - Script Server (PHP): 1
  Total: 1620
Interval: 60
Type: SPINE 0.8.8b Copyright 2002-2013 by The Cacti Group
Items:
  Action[0]: 2063
  Action[1]: 13
  Action[2]: 1
  Total: 2077
Concurrent Processes: 3
Max Threads: 10
PHP Servers: 1
Script Timeout: 30
Max OID: 10
Last Run Statistics: Time:30.4427 Method:spine Processes:3 Threads:10 Hosts:320 HostsPerProcess:107 DataSources:2074 RRDsProcessed:1502
micke2k
Cacti User
Posts: 261
Joined: Wed Feb 03, 2016 3:38 pm

Re: High load and graph gaps after yum update

Post by micke2k »

Have you tried restarting the webserver?

Maybe increase the script timeout for testing as well?

Do you see anything in "top"?

I can try updating my CentOS and see if I can replicate it.
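Something like this covers those quick checks on CentOS 7 (assuming Apache/httpd is serving the Cacti pages; substitute your own webserver or php-fpm service if it differs):

Code:

# restart the webserver serving Cacti
systemctl restart httpd
# one-shot snapshot of what the cacti user is running
top -b -n 1 -u cacti | head -20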
PrimeEvil
Posts: 3
Joined: Thu Mar 24, 2016 7:45 am

Re: High load and graph gaps after yum update

Post by PrimeEvil »

I've rebooted the system a few times now and even tried running the previous kernel. No luck.

Top shows an awful lot of php processes from cacti. I even reduced spine to 1 process and the PHP script servers to 1. That has made my graphs a little better, but the number of processes still seems to be double what it was before, performance is awful, and there are still gaps in the graphs.

Code:

top - 13:55:02 up 1 day, 22:47,  1 user,  load average: 36.28, 37.58, 35.89
Tasks: 240 total,   2 running, 238 sleeping,   0 stopped,   0 zombie
%Cpu(s):  6.9 us,  5.7 sy,  0.0 ni,  0.0 id, 87.0 wa,  0.0 hi,  0.3 si,  0.0 st
KiB Mem :  3882824 total,   169744 free,   608008 used,  3105072 buff/cache
KiB Swap:  4210680 total,  3846808 free,   363872 used.  3037796 avail Mem

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
 2273 cacti     20   0  112648    936    772 D   4.0  0.0   0:03.91 grep
 3110 cacti     20   0  112648    936    772 R   4.0  0.0   0:03.32 grep
 2184 mysql     20   0 1674648  47608   4000 S   2.7  1.2 399:18.68 mysqld
   33 root      20   0       0      0      0 D   2.0  0.0  18:55.86 kswapd0
21283 cacti     20   0  391984  23244   9052 D   1.0  0.6   0:13.89 php
22688 cacti     20   0  391724  23000   9052 D   1.0  0.6   0:13.21 php
26379 cacti     20   0  392024  22928   9052 D   1.0  0.6   0:16.31 php
28636 cacti     20   0  391724  22968   9052 D   1.0  0.6   0:11.93 php
30104 cacti     20   0  391984  23252   9052 D   1.0  0.6   0:14.12 php
  662 cacti     20   0  391784  22912   9052 D   0.7  0.6   0:10.33 php
 1538 cacti     20   0  112648    932    772 D   0.7  0.0   0:04.53 grep
18460 cacti     20   0  391724  11512   9052 D   0.7  0.3   0:20.69 php
19164 cacti     20   0  392504  11776   9052 D   0.7  0.3   0:18.55 php
19876 cacti     20   0  391784  22256   9052 D   0.7  0.6   0:18.92 php
23443 cacti     20   0  391984  23268   9052 D   0.7  0.6   0:15.31 php
27050 cacti     20   0  392244  23396   9052 D   0.7  0.6   0:08.23 php
27875 cacti     20   0  391984  23076   9052 D   0.7  0.6   0:08.82 php
32183 cacti     20   0  391768  22940   9052 D   0.7  0.6   0:08.55 php
 1371 nobody    20   0 1275380  45188   8120 S   0.3  1.2  27:57.05 ntopng
 1543 cacti     20   0  392024  22924   9052 D   0.3  0.6   0:07.77 php
 3121 cacti     20   0  229840   3208   2376 D   0.3  0.1   0:00.37 rrdtool
 3804 root      20   0  157812   2304   1488 R   0.3  0.1   0:00.07 top
24357 cacti     20   0  391816  22924   9052 D   0.3  0.6   0:15.78 php
25147 cacti     20   0  392024  23020   9052 D   0.3  0.6   0:20.66 php
29356 cacti     20   0  391816  22936   9052 D   0.3  0.6   0:12.98 php
    1 root      20   0  131000   7408   2592 S   0.0  0.2   2:45.17 systemd
    2 root      20   0       0      0      0 S   0.0  0.0   0:00.25 kthreadd
    3 root      20   0       0      0      0 S   0.0  0.0   0:10.83 ksoftirqd/0
    5 root       0 -20       0      0      0 S   0.0  0.0   0:00.00 kworker/0:+
    7 root      rt   0       0      0      0 S   0.0  0.0   0:00.87 migration/0
    8 root      20   0       0      0      0 S   0.0  0.0   0:00.00 rcu_bh
    9 root      20   0       0      0      0 S   0.0  0.0   0:00.00 rcuob/0
   10 root      20   0       0      0      0 S   0.0  0.0   0:00.00 rcuob/1
   11 root      20   0       0      0      0 S   0.0  0.0   4:04.54 rcu_sched
   12 root      20   0       0      0      0 S   0.0  0.0   1:59.90 rcuos/0
   13 root      20   0       0      0      0 S   0.0  0.0   2:12.41 rcuos/1
   14 root      rt   0       0      0      0 S   0.0  0.0   0:00.73 watchdog/0
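A quick way to put a number on the pile-up is to count the poller.php processes owned by the cacti user (just a convenience check, matching the full command line):

Code:

# how many poller.php processes are still alive for the cacti user
pgrep -u cacti -f -c poller.php
# and when the oldest ones started (the [p] keeps grep from matching itself)
ps -u cacti -o pid,lstart,cmd | grep '[p]oller.php'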
Here is the cacti process list:

Code:

# ps -u cacti -f
UID        PID  PPID  C STIME TTY          TIME CMD
cacti      660   622  0 13:51 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti      662   660  2 13:51 ?        00:00:11 php /var/www/html/poller.php
cacti     1537  1525  0 13:52 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti     1543  1537  2 13:52 ?        00:00:09 php /var/www/html/poller.php
cacti     2270  2267  0 13:53 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti     2272  2270  3 13:53 ?        00:00:12 php /var/www/html/poller.php
cacti     3105  3097  0 13:54 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti     3113  3105  2 13:54 ?        00:00:05 php /var/www/html/poller.php
cacti     3810  3807  0 13:55 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti     3811  3806  0 13:55 ?        00:00:00 /bin/sh -c grep "SYSTEM STATS: Time:" /var/www/html/log/cacti.log | tail -1
cacti     3812  3811  1 13:55 ?        00:00:03 grep SYSTEM STATS: Time: /var/www/html/log/cacti.log
cacti     3813  3811  0 13:55 ?        00:00:00 tail -1
cacti     3814  3811  0 13:55 ?        00:00:00 awk { print $7 }
cacti     3815  3810  3 13:55 ?        00:00:06 php /var/www/html/poller.php
cacti     4573  4569  0 13:56 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti     4574  4568  0 13:56 ?        00:00:00 /bin/sh -c grep "SYSTEM STATS: Time:" /var/www/html/log/cacti.log | tail -1
cacti     4575  4573  7 13:56 ?        00:00:10 php /var/www/html/poller.php
cacti     4576  4574  2 13:56 ?        00:00:03 grep SYSTEM STATS: Time: /var/www/html/log/cacti.log
cacti     4577  4574  0 13:56 ?        00:00:00 tail -1
cacti     4578  4574  0 13:56 ?        00:00:00 awk { print $7 }
cacti     5374  5369  0 13:57 ?        00:00:00 /bin/sh -c grep "SYSTEM STATS: Time:" /var/www/html/log/cacti.log | tail -1
cacti     5375  5370  0 13:57 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti     5376  5374  3 13:57 ?        00:00:02 grep SYSTEM STATS: Time: /var/www/html/log/cacti.log
cacti     5377  5374  0 13:57 ?        00:00:00 tail -1
cacti     5378  5374  0 13:57 ?        00:00:00 awk { print $7 }
cacti     5379  5375 12 13:57 ?        00:00:10 php /var/www/html/poller.php
cacti     6138  6134  0 13:58 ?        00:00:00 /bin/sh -c grep "SYSTEM STATS: Time:" /var/www/html/log/cacti.log | tail -1
cacti     6139  6135  0 13:58 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti     6140  6138 11 13:58 ?        00:00:02 grep SYSTEM STATS: Time: /var/www/html/log/cacti.log
cacti     6141  6138  1 13:58 ?        00:00:00 tail -1
cacti     6142  6139 25 13:58 ?        00:00:05 php /var/www/html/poller.php
cacti     6143  6138  0 13:58 ?        00:00:00 awk { print $7 }
cacti     6147     1 10 13:58 ?        00:00:02 /usr/local/spine/bin/spine 0 544
cacti     6151  6142  0 13:58 ?        00:00:00 /usr/bin/rrdtool -
cacti     6152  6147  0 13:58 ?        00:00:00 /usr/bin/php -q /var/www/html/script_server.php spine 0
cacti    20582 20578  0 13:34 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    20586 20582  1 13:34 ?        00:00:17 php /var/www/html/poller.php
cacti    21281 21277  0 13:35 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    21283 21281  1 13:35 ?        00:00:15 php /var/www/html/poller.php
cacti    21981 21977  0 13:36 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    21982 21981  1 13:36 ?        00:00:14 php /var/www/html/poller.php
cacti    22684 22681  0 13:37 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    22688 22684  1 13:37 ?        00:00:14 php /var/www/html/poller.php
cacti    23439 23435  0 13:38 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    23443 23439  1 13:38 ?        00:00:16 php /var/www/html/poller.php
cacti    24354 24337  0 13:39 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    24357 24354  1 13:39 ?        00:00:17 php /var/www/html/poller.php
cacti    25143 25136  0 13:40 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    25147 25143  1 13:40 ?        00:00:22 php /var/www/html/poller.php
cacti    25724 25681  0 13:41 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    25751 25724  2 13:41 ?        00:00:21 php /var/www/html/poller.php
cacti    26376 26258  0 13:42 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    26379 26376  1 13:42 ?        00:00:17 php /var/www/html/poller.php
cacti    27049 27042  0 13:43 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    27050 27049  1 13:43 ?        00:00:09 php /var/www/html/poller.php
cacti    27871 27867  0 13:44 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    27875 27871  1 13:44 ?        00:00:10 php /var/www/html/poller.php
cacti    28635 28631  0 13:45 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    28636 28635  1 13:45 ?        00:00:13 php /var/www/html/poller.php
cacti    29351 29348  0 13:46 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    29356 29351  1 13:46 ?        00:00:14 php /var/www/html/poller.php
cacti    30100 30090  0 13:47 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    30104 30100  2 13:47 ?        00:00:15 php /var/www/html/poller.php
cacti    30834 30819  0 13:48 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    30835 30834  1 13:48 ?        00:00:12 php /var/www/html/poller.php
cacti    31614 31607  0 13:49 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    31615 31614  2 13:49 ?        00:00:15 php /var/www/html/poller.php
cacti    32175 32151  0 13:50 ?        00:00:00 /bin/sh -c php /var/www/html/poller.php >&/dev/null
cacti    32183 32175  1 13:50 ?        00:00:09 php /var/www/html/poller.php
micke2k
Cacti User
Posts: 261
Joined: Wed Feb 03, 2016 3:38 pm

Re: High load and graph gaps after yum update

Post by micke2k »

Have you rebuilt the poller cache?

You have several poller processes running at the same time.

Try disabling Cacti polling, wait for all the processes to stop, then start it again.

You should see a lot of error messages in your cacti.log. Any hosts causing problems? Any scripts not running?

EDIT:

I've updated to the latest CentOS 7 now, no issues. Any new packages since you updated?

SELinux disabled?

Why are 4 greps running on the cacti log?

Check for hosts not responding. You might have to increase your poller interval; are you running 5 min?
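Roughly, those checks boil down to something like this (the log path is taken from your ps output; adjust if your install differs):

Code:

# SELinux mode (Enforcing / Permissive / Disabled)
getenforce
# how big the log those greps keep scanning has grown, and any recent errors in it
ls -lh /var/www/html/log/cacti.log
tail -n 100 /var/www/html/log/cacti.log | grep -i -E 'error|warn'
# after disabling polling, clear any stacked-up pollers before re-enabling it
pkill -u cacti -f poller.php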
PrimeEvil
Posts: 3
Joined: Thu Mar 24, 2016 7:45 am

Re: High load and graph gaps after yum update

Post by PrimeEvil »

Looks like I've finally figured it out. My cacti.log file had somehow reached 3.7 GB! The grep against the log file was taking forever to complete. It turns out my logs were not being rotated: my logrotate config for Cacti was missing. I created the necessary config file and ran a rotate, and everything dropped back to normal after a few minutes. For the life of me I can't figure out why this waited until I ran yum updates, but at least it's fixed now! :D
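For anyone hitting the same thing, the config I dropped in looks roughly like this (path and rotation settings are just what I picked for my install; adjust to yours). copytruncate keeps the poller writing to the same open file handle instead of needing a restart after each rotation:

Code:

# /etc/logrotate.d/cacti
/var/www/html/log/cacti.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
    copytruncate
}

Then a forced run rotates the oversized log immediately:

Code:

logrotate -f /etc/logrotate.d/cacti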
micke2k
Cacti User
Posts: 261
Joined: Wed Feb 03, 2016 3:38 pm

Re: High load and graph gaps after yum update

Post by micke2k »

PrimeEvil wrote: Looks like I've finally figured it out. My cacti.log file had somehow reached 3.7 GB! The grep against the log file was taking forever to complete. It turns out my logs were not being rotated: my logrotate config for Cacti was missing. I created the necessary config file and ran a rotate, and everything dropped back to normal after a few minutes. For the life of me I can't figure out why this waited until I ran yum updates, but at least it's fixed now! :D
:lol: Good job!
