Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.8.7g
I am running a Linux system with the current stable versions (patches applied) of all the components listed below, but there seems to be an issue with the graph PNGs not rendering consistently. I have a fairly large environment (~800 hosts and ~36,000 data sources) that I am trying to monitor, and with Boost the polls and RRD updates are happening in a timely manner. It's just the rendering of the images that doesn't always work. Sometimes all graphs will be rendered; other times only some will, and the graphs that don't seem to hang forever.
Firebug reports that graph_image.php is the request that sits there, but neither the web server log nor the Cacti log shows any indication of an error.
Is anyone else still experiencing this? Where else should I be looking to determine the cause? Thanks in advance.
Here are the details and I am willing to provide more if needed:
Hardware:
2 x Xeon(TM) CPU 2.40GHz
4 GB RAM
2 x older SCSI disks (10,000 RPM, I think)
Software:
OS: CentOS 5.5 - Linux cacti 2.6.18-194.el5PAE #1 SMP Fri Apr 2 15:37:44 EDT 2010 i686 i686 i386 GNU/Linux
Apache/2.2.3 (fairly vanilla config, some unused modules disabled)
PHP 5.2.10 (cli) (built: Nov 13 2009 11:24:03)
MySQL 5.0.84
Cacti 0.8.7.g
Boost 4.2 (memory type tables)
PIA 2.9
No other plugins installed (I saw a similar issue in a post where this was recommended for Boost)
General Stats:
From Cacti Logs:
Log File [Total Lines: 6 - Non-Matching Items Hidden]
12/09/2010 08:52:56 AM - SYSTEM BOOST STATS: Time:76.0200 RRDUpdates:106616
12/09/2010 08:51:39 AM - SYSTEM STATS: Time:98.2937 Method:spine Processes:2 Threads:20 Hosts:812 HostsPerProcess:406 DataSources:35951 RRDsProcessed:0
12/09/2010 08:46:40 AM - SYSTEM STATS: Time:97.7493 Method:spine Processes:2 Threads:20 Hosts:812 HostsPerProcess:406 DataSources:35951 RRDsProcessed:0
12/09/2010 08:41:37 AM - SYSTEM STATS: Time:96.0806 Method:spine Processes:2 Threads:20 Hosts:812 HostsPerProcess:406 DataSources:35951 RRDsProcessed:0
12/09/2010 08:38:08 AM - SYSTEM BOOST STATS: Time:89.8800 RRDUpdates:106486
12/09/2010 08:36:38 AM - SYSTEM STATS: Time:96.5354 Method:spine Processes:2 Threads:20 Hosts:812 HostsPerProcess:406 DataSources:35951 RRDsProcessed:0
Boost:
Boost On Demand Updating: Idle
Total Data Sources: 35951
Total Boost Records: 1205
Boost Storage Statistics
Database Engine: MEMORY
Current Boost Table Size: 70 MBytes
Avg Bytes/Record: 545 Bytes
Max Record Length: 13 Bytes
Max Allowed Boost Table Size: 217 MBytes
Estimated Maximum Records: 418123 Records
Runtime Statistics
Last Start Time: 2010-12-09 8:51:40
Last Run Duration: 1 minutes 16 seconds (4% of update frequency)
RRD Updates: 106616
Peak Poller Memory: 10.41 MBytes
Detailed Runtime Timers: RRDUpdates:106616 TotalTime:76.0214 range_local_data_id:2.06 rcaston_add:1.34 get_records:4.05 results_cycle:66.86 rrd_path:13.61 rrd_template:31.59 rrd_lastupdate:4.84 rrd_field_names:3.52 rrdupdate:5.07 delete:1.25
Max Poller Memory Allowed: 1024 MBytes
Run Time Configuration
Update Frequency: 30 Minutes
Next Start Time: 2010-12-09 9:21:40
Maximum Records: 100000 Records
Maximum Allowed Runtime: 20 Minutes
Boost Server Details
Server Config Status: Enabled
Multiprocess Server: Multiple Process
Update Timeout: 10 Seconds
Server/Port: localhost@9050
Authorized Update Web Servers: 127.0.0.1
RRDtool Binary Used: /usr/bin/rrdupdate
Image Caching
Image Caching Status: Disabled
Cache Directory: /var/cacti_safe/cacti_hd/img_cache
Cached Files: 75 Files
Cached Files Size: 1 MBytes
So here is the dump of the settings table. I figured this might be a good place to start:
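(If anyone wants to pull the same dump from their own install, something like the sketch below should work; the host, credentials, and database name are placeholders for whatever is in your config.php.)
Code:
<?php
// Minimal sketch: dump Cacti's settings table in the same "| name | value |"
// layout as shown below.  Connection details are placeholders; use the values
// from your include/config.php.
$db  = new mysqli('localhost', 'cactiuser', 'cactipass', 'cacti');
$res = $db->query("SELECT name, value FROM settings");

while ($row = $res->fetch_assoc()) {
    printf("| %s | %s |\n", $row['name'], $row['value']);
}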
| name | value |
| path_rrdtool | /usr/bin/rrdtool |
| path_php_binary | /usr/bin/php |
| path_snmpwalk | /usr/bin/snmpwalk |
| path_snmpget | /usr/bin/snmpget |
| path_snmpbulkwalk | /usr/bin/snmpbulkwalk |
| path_snmpgetnext | /usr/bin/snmpgetnext |
| path_cactilog | /var/log/cacti/cacti.log |
| snmp_version | net-snmp |
| rrdtool_version | rrd-1.4.x |
| poller_enabled | on |
| poller_type | 2 |
| poller_interval | 300 |
| cron_interval | 300 |
| concurrent_processes | 2 |
| process_leveling | on |
| max_threads | 20 |
| php_servers | 10 |
| script_timeout | 60 |
| max_get_size | 100 |
| availability_method | 2 |
| ping_method | 2 |
| ping_port | 23 |
| ping_timeout | 400 |
| ping_retries | 1 |
| ping_failure_count | 2 |
| ping_recovery_count | 3 |
| boost_rrd_update_enable | on |
| boost_rrd_update_system_enable | on |
| boost_rrd_update_interval | 30 |
| boost_rrd_update_max_records | 100000 |
| boost_rrd_update_max_records_per_select | 2000 |
| boost_mysql_string_length | 64000 |
| boost_rrd_update_string_length | 2000 |
| boost_poller_mem_limit | 1024 |
| boost_rrd_update_max_runtime | 1200 |
| boost_redirect | on |
| boost_server_enable | on |
| boost_server_effective_user | root |
| boost_server_multiprocess | 1 |
| boost_path_rrdupdate | /usr/bin/rrdupdate |
| boost_server_hostname | localhost |
| boost_server_listen_port | 9050 |
| boost_server_timeout | 10 |
| boost_server_clients | 127.0.0.1 |
| boost_png_cache_enable | |
| boost_png_cache_directory | /var/cacti_safe/cacti_hd/img_cache |
| boost_rrd_update_lockfile | /var/lock/subsys/boost_server.php |
| path_boost_log | /var/log/cacti/boost.log |
| boost_max_output_length | 1291862380:60 |
| log_destination | 1 |
| log_snmp | |
| log_graph | |
| log_export | |
| log_verbosity | 2 |
| log_pstats | |
| log_pwarn | |
| log_perror | |
| snmp_ver | 2 |
| snmp_community | <sanitized> |
| snmp_username | |
| snmp_password | |
| snmp_auth_protocol | MD5 |
| snmp_priv_passphrase | |
| snmp_priv_protocol | DES |
| snmp_timeout | 300 |
| snmp_port | 161 |
| snmp_retries | 3 |
| reindex_method | 1 |
| deletion_verification | on |
| poller_lastrun | 1291913102 |
| path_webroot | /var/cacti_safe/cacti_hd |
| path_rrdtool_default_font | |
| path_spine | /usr/local/spine/bin/spine |
| extended_paths | |
| date | 2010-12-09 08:46:39 |
| stats_poller | Time:97.7493 Method:spine Processes:2 Threads:20 Hosts:812 HostsPerProcess:406 DataSources:35951 RRDsProcessed:0 |
| stats_recache | RecacheTime:0.0 HostsRecached:0 |
| boost_last_run_time | 2010-12-09 8:36:38 |
| boost_next_run_time | 2010-12-09 9:06:38 |
| num_rows_graph | 30 |
| max_title_graph | 80 |
| max_data_query_field_length | 15 |
| default_graphs_new_dropdown | -2 |
| num_rows_data_query | 100 |
| num_rows_data_source | 30 |
| max_title_data_source | 45 |
| num_rows_device | 30 |
| num_rows_log | 200 |
| log_refresh_interval | 60 |
| title_size | 10 |
| title_font | |
| legend_size | 8 |
| legend_font | |
| axis_size | 6 |
| axis_font | |
| unit_size | 6 |
| unit_font | |
| plugin_discovery_version | 1.1 |
| discovery_subnet | |
| discovery_dns | |
| discovery_protocol | 0 |
| discovery_readstrings | |
| discovery_collection_timing | 1440 |
| discovery_base_time | 13:35 |
| discovery_query_rerun | |
| discovery_interface_up_only | |
| discovery_last_poll | 1291845301 |
| discovery_prev_base_time | 13:35 |
| discovery_last_run_time | 1291844100 |
| boost_poller_status | complete - end time:2010-12-09 8:38:08 |
| boost_peak_memory | 10914424 |
| stats_boost | Time:89.8800 RRDUpdates:106486 |
| stats_detail_boost | RRDUpdates:106486 TotalTime:89.8821 range_local_data_id:2.06 rcaston_add:1.36 get_records:3.98 results_cycle:80.78 rrd_path:16.41 rrd_template:39.97 rrd_lastupdate:5.85 rrd_field_names:3.48 rrdupdate:7.01 delete:1.26 |
| auth_method | 3 |
| guest_user | <sanitized> |
| user_template | <sanitized> |
| ldap_server | <sanitized> |
| ldap_port | <sanitized> |
| ldap_port_ssl | <sanitized> |
| ldap_version | 3 |
| ldap_encryption | 0 |
| ldap_referrals | 0 |
| ldap_mode | 0 |
| ldap_dn | <sanitized> |
| ldap_group_require | |
| ldap_group_dn | |
| ldap_group_attrib | |
| ldap_group_member_type | 1 |
| ldap_search_base | |
| ldap_search_filter | |
| ldap_specific_dn | |
| ldap_specific_password | |
Re: Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.
I have similar issues with Boost. Much of the time the images do not show up until I refresh the page, and then they all appear. Boost is working great for me in terms of polling times, because like you I have a large environment and a short (1 min) polling interval. The other problem I have with Boost is periodic trouble with graphs missing data, etc. I have tuned and re-tuned it, but I fear my problem (as many times as I have read the docs) is that I don't understand it fully. Maybe Larry could do a webinar.
Re: Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.
As much as I hate to see a problem, I am glad it's not only me. That lends more credence to the possibility of a bug rather than a config problem (unless we both did it wrong in the same way).
Re: Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.
Additional testing shows that sometimes, if you let the browser try for a ridiculous amount of time (5-10 minutes or more), the images may finally render. This has not been consistent.
Re: Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.
I am starting to think the issue is related to the database. Watching the threads in MySQL, I am seeing entries like this when the images fail to render:
SELECT GET_LOCK('boost.single_ds.26420', 1)
These stay active for a very long time in some cases. Can anyone explain what part of the process is producing these threads?
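My working theory, judging by the lock name, is that it is a per-data-source advisory lock taken with MySQL's GET_LOCK() while graph_image.php flushes any pending boost records for that data source into its RRD file before rendering. A minimal sketch of that pattern (illustrative only, not the actual boost plugin code; the data source id and credentials are placeholders):
Code:
<?php
// Sketch of the GET_LOCK pattern.  NOT the actual plugins/boost code.
// A named MySQL advisory lock serializes on-demand RRD flushes per data
// source, so two requests cannot update the same RRD file at once.
$local_data_id = 26420;                                // placeholder id
$lock_name     = "boost.single_ds.$local_data_id";

$db = new mysqli('localhost', 'cactiuser', 'cactipass', 'cacti');

// GET_LOCK(name, timeout) returns 1 on success, 0 on timeout, NULL on error.
$res = $db->query("SELECT GET_LOCK('$lock_name', 1)");
$row = $res->fetch_row();

if ($row[0] == 1) {
    // ... flush this data source's pending boost rows into its RRD file ...
    $db->query("SELECT RELEASE_LOCK('$lock_name')");
} else {
    // Another connection already holds the lock.  If that holder never
    // releases it, later requests for the same graph keep queuing here.
}
If that is roughly what is happening, a connection that acquires the lock and then never releases it (or never finishes its flush) would explain graph_image.php sitting there while the rest of the page renders fine.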
Re: Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.
One more thing: for a given page that is not displaying correctly, there is a lock like this for each graph that should be showing up but isn't. So if a page should have 5 graphs and one is properly displayed, there will be 4 statements in the MySQL process list like:
SELECT GET_LOCK('boost.single_ds.24601', 1)
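MySQL's IS_USED_LOCK() can report which connection currently owns one of these named locks, which should narrow down whether the holder is the boost poller, another web request, or an idle connection that never released it. A rough sketch (placeholder id and credentials):
Code:
<?php
// Sketch: find the connection holding one of the stuck named locks.
// IS_USED_LOCK() returns the connection id of the holder, or NULL if free.
$lock_name = 'boost.single_ds.24601';                  // placeholder id

$db  = new mysqli('localhost', 'cactiuser', 'cactipass', 'cacti');
$res = $db->query("SELECT IS_USED_LOCK('$lock_name')");
$row = $res->fetch_row();

if ($row[0] === null) {
    echo "$lock_name is free\n";
} else {
    echo "$lock_name is held by connection id $row[0]\n";
    // Look that id up in SHOW FULL PROCESSLIST to see what it is doing.
    // KILL <id> would forcibly release the lock, but only as a last resort.
}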
Re: Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.
I have the same issue with Boost 4.3 / PIA 2.8 / 0.8.7g.
Every time I view a graph, a gap appears, and I noticed that the gaps appear only in the graphs I have viewed; the graphs look like the image below.
This is the boost status, followed by my MySQL setup:
Code:
Current Boost Status
Boost On Demand Updating: Idle
Total Data Sources: 28232
Total Boost Records: 169359
Boost Storage Statistics
Database Engine: MEMORY
Current Boost Table Size: 118 MBytes
Avg Bytes/Record: 545 Bytes
Max Record Length: 109 Bytes
Max Allowed Boost Table Size: 464 MBytes
Estimated Maximum Records: 891812 Records
Runtime Statistics
Last Start Time: 2010-12-16 18:00:11
Last Run Duration: 4 minutes 23 seconds (15% of update frequency)
RRD Updates: 169380
Peak Poller Memory: 17.41 MBytes
Detailed Runtime Timers: RRDUpdates:169380 TotalTime:263.5504 range_local_data_id:0.89 rcaston_add:0.77 get_records:2.31 results_cycle:255.59 rrd_path:3.37 rrd_template:5.72 rrd_lastupdate:53.34 rrd_field_names:1.84 rrdupdate:187.85 delete:1.14
Max Poller Memory Allowed: 512 MBytes
Run Time Configuration
Update Frequency: 30 Minutes
Next Start Time: 2010-12-16 18:30:11
Maximum Records: 1000000 Records
Maximum Allowed Runtime: 20 Minutes
Boost Server Details
Server Config Status: Disabled
Multiprocess Server: Multiple Process
Update Timeout: 2 Seconds
Server/Port: localhost@9050
Authorized Update Web Servers: 127.0.0.1
RRDtool Binary Used: /usr/bin/rrdtool
Image Caching
Image Caching Status: Enabled
Cache Directory: /tmp/cacti
Cached Files: 11 Files
Cached Files Size: 330 KBytes
Code:
skip-locking
key_buffer = 512M
query_cache_size = 128M
max_allowed_packet = 16M
table_cache = 512
sort_buffer_size = 128M
net_buffer_length = 8K
read_buffer_size = 1M
read_rnd_buffer_size = 32M
myisam_sort_buffer_size = 8M
max_heap_table_size = 512M
tmp_table_size = 1G
log_slow_queries
long_query_time = 2
log_long_format
innodb_buffer_pool_size = 256M
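Worth noting: the boost table uses the MEMORY engine, so it cannot grow past max_heap_table_size, and the "Max Allowed Boost Table Size" figure above presumably comes from that setting. A quick sketch to compare the cap with the table's current footprint (the table name poller_output_boost and the credentials are assumptions here):
Code:
<?php
// Sketch: compare the MEMORY-engine cap with the boost table's current size.
// If a MEMORY table reaches max_heap_table_size, further inserts fail with
// "table is full" errors.
$db = new mysqli('localhost', 'cactiuser', 'cactipass', 'cacti');

$res = $db->query("SHOW VARIABLES LIKE 'max_heap_table_size'");
$cap = $res->fetch_row();
printf("max_heap_table_size:  %d MB\n", $cap[1] / 1048576);

$res = $db->query("SHOW TABLE STATUS LIKE 'poller_output_boost'");
$tbl = $res->fetch_assoc();
printf("poller_output_boost:  engine=%s rows=%d data=%d MB\n",
       $tbl['Engine'], $tbl['Rows'], $tbl['Data_length'] / 1048576);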
[attachments: graph image, boost setup]
Re: Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.
Has anyone found a workaround? I am also getting errors in my apache2/error.log:
PHP Notice: Undefined offset: 8192 in /var/www/cacti/plugins/boost/setup.php on line 556, referer: http://server/cacti/graph_view.php?acti ... af_id=2080
I get this for pretty much every graph. I found the line (556), but it is just the bit that puts together the error message to write to the Cacti log. Also, there are Cacti log entries, so I figure it has to be working some of the time. The "offset" it is using, though, I don't understand how it ever works, but I haven't had enough time to pore through all the code. Anyway, I would really appreciate a push in the right direction on this one; Cacti is getting a bit difficult to use (missing graphs and such). I would just uninstall Boost, but it is the only way I am able to keep my polling cycles under 1 minute. HELP!!!
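For what it's worth, "Undefined offset: 8192" just means the code read an array index (8192) that was never assigned; a notice does not abort the request on its own, but it can flood the Apache error log when every graph request triggers it. The usual defensive guard looks something like this (illustrative only, not the actual setup.php code):
Code:
<?php
// Illustrative only.  NOT the actual plugins/boost/setup.php code.
// The notice means an array was indexed with a key (8192) that was never
// assigned.  Guarding the access avoids the notice and gives a sane default.
$chunks = array();          // imagine this gets filled elsewhere
$offset = 8192;             // the offset reported in the notice

if (isset($chunks[$offset])) {
    $value = $chunks[$offset];
} else {
    $value = '';            // fall back instead of raising a notice
}
Whether the notice is related to the hung or missing graphs, or is just noise, is a separate question, but silencing it would at least make the error log readable again.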
Re: Boost 4.2 PNG rendering intermittent with PIA 2.9 and 0.
Well, I don't have any errors, just gaps when I view a graph.