Web Page Load Time
This is a monitor that tracks the time it takes a specific web page to load. It's useful for trending how long various pages take to load, and for tuning and tracking down system problems: for example, if you see your CPU spiking at set times, running this will tell you whether that's affecting your website's response times.
This should run on Linux / Unix, but the page_load_time.pl script will need to be modified to point to the proper locations of time and wget.
For servers with multiple websites I've just been creating new "devices" (www.mysite.com, www.myothersite.com) and attaching this monitor to them.
- Attachment: page_load_time.tar.gz — Page Load Time (2.7 KiB)
Last edited by mmealman on Mon May 21, 2007 9:14 am, edited 1 time in total.
Hello mmealman, I tried your script, but it doesn't work.
If I run it, I get the following error:
[root@nms:scripts]# perl page_load_time.pl www.retecool.com index.php
avg:Error[root@nms:scripts]#
Wget, time, and perl are installed on the system; wget, however, was installed in /usr/bin, so I changed the location in the page_load_time.pl file.
Any idea what the problem is?
Thank you danathane,
Good point, I guess I wasn't paying attention when posting. I initially tried it with the dash, but when posting and trying again I forgot it.
I altered the Perl script and tried to echo the $time variable; I got an "uninitialized value in string" error. When I echo the $results variable I get: "0.00user 0.02system 0:02.01elapsed 1%CPU (0avgtext+0avgdata 0maxresident)k
0inputs+0outputs (0major+400minor)pagefaults 0swaps"
I'm not sure what value is expected. Would you try to run the following command: "/usr/bin/time /usr/bin/wget -T 30 -o /tmp/page_checks/tmp.log -p --no-cache -nd -P /tmp/page_checks/ http://www.retecool.com/index.php"
So I can check what answer I should get.
Try this page_load.pl script:
I wrote it originally on FreeBSD, this one should be more portable.
Code:
#!/usr/bin/perl -w
$WGET = "/usr/bin/wget";
$TIME = "/usr/bin/time";
$host = $ARGV[0];
$url  = $ARGV[1];
$check = "$TIME -p $WGET -T 30 -o /tmp/page_checks/tmp.log -p --no-cache -nd -P /tmp/page_checks/ http://" . $host . $url . " 2>&1";
$results = `$check`;
if($results =~ m/\s*real\s+(\d+\.\d+)/) {
    $time = $1;
} else {
    $time = "Error";
}
print "avg:$time";
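A quick way to see what the regex in the script above is matching: `time -p` (POSIX mode) prints `real`/`user`/`sys` lines, and the script pulls the number off the `real` line. A minimal sketch, using `sleep 1` as a stand-in for the wget call:

```shell
# "time -p" prints the POSIX-format timing lines the Perl regex expects.
out=$( { time -p sleep 1; } 2>&1 )
printf '%s\n' "$out"

# Extract the number from the "real" line, as m/\s*real\s+(\d+\.\d+)/ does:
real=$(printf '%s\n' "$out" | awk '/^real/ { print $2 }')
echo "avg:$real"
```

Without `-p`, GNU time emits the verbose `0.00user 0.02system ... elapsed` format seen earlier in this thread, which the regex will not match.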
Great, it works now!
I already figured the -p switch out, but I was stuck with the regular expression. I'm new to regular expressions, but as I progress on my journey from NT to Linux it seems to be basic, handy knowledge for a *nix sysadmin to have; I'm going to buy a book or something on the topic.
Thanks to both of you for your help.
By the way, for others: if you don't want to store the downloaded documents (in my case they were 1 MB per wget), don't change -P to /dev/null, since that will turn your /dev/null into a normal file. Instead, replace -P with --output-document=/dev/null.
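A safe way to check the distinction behind this tip: plain output redirection (which is effectively what wget's `-O /dev/null` does) discards data and leaves the device intact, whereas `-P` is a directory prefix for saved files and should never point at /dev/null. A harmless sketch:

```shell
# Writing to /dev/null discards the data and leaves the device as-is,
# unlike saving a file over it, which would replace it with a regular file.
echo "page contents discarded" > /dev/null
test -c /dev/null && echo "/dev/null is still a character device"
```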
Thank you for the script mmealman.
I thought it would be useful, at least for me, to have it check whether the page is actually loading at all. At the moment it returns a $time value even if the connection to the host fails.
I'm not really a programmer, but below is my modified version of the original script. Before loading the page with wget, I check whether I can fetch it with head. If that fails, I set $time to a high value so I can set thold to alert me when the load time is above the threshold, whether because of a high load time or because it failed altogether.
My 2 cents...
Code:
#!/usr/bin/perl -w
use LWP::Simple;

$WGET = "/usr/bin/wget";
$TIME = "/usr/bin/time";
$host = $ARGV[0];
$url  = $ARGV[1];
$uri  = "http://" . $host . $url;
$time = 0;
$check = "$TIME -p $WGET -T 30 -o tmp/tmp.log -p --no-cache -nd -O /dev/null " . $uri . " 2>&1";
if ( head($uri) ) {
    $results = `$check`;
    if($results =~ m/\s*real\s+(\d+\.\d+)/) {
        $time = $1;
    } else {
        $time = "Error";
    }
} else {
    $time = 90;
}
print "avg: $time";
threshold/graphs
OK, so I added this web page load time into my graphs. All seems to work until I set up thresholds: if load time goes over 1000 ms, it alerts.
So now, every once in a while, I get "Response Time [avg] went above threshold of 1000 with 1253.38", yet the graph Cacti attaches to the email shows nothing over a hundred. I am lost; if I check the graphs in Cacti, it never hits 1000.
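One possible explanation, purely a guess from the numbers: the script prints seconds (from `time -p`), while the threshold and alert text may be interpreted in milliseconds. A 1.25-second load then reads as 1253.38 against a threshold of 1000, while a graph plotted in seconds never gets anywhere near 1000. A quick sanity check of the conversion:

```shell
# Hypothetical unit check: 1.25338 s expressed in milliseconds.
secs=1.25338
ms=$(awk -v s="$secs" 'BEGIN { printf "%.2f", s * 1000 }')
echo "$ms"
```

If that is the cause, making the data source and the threshold agree on one unit (seconds or milliseconds) should line the graph and the alerts up.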
If all else fails, rm -rf /
Thanks for this most useful template.
Here is a PHP version if anyone is interested. Note that I've changed the wget options a little from the original (personal preference), and also the script cleans up the contents of /tmp/page_checks/ afterwards.
Code:
#!/usr/local/bin/php -q
<?php
$wget = '/usr/bin/wget';

if(isset($argv[1]) && isset($argv[2]))
{
    $host = $argv[1];
    $url  = $argv[2];
}
else
{
    exit;
}

$check = $wget . ' -T 30 -o /dev/null -nd -P /tmp/page_checks/ ' . $host . $url;
$start = microtime(true);
shell_exec($check);
$end = microtime(true);
exec('rm /tmp/page_checks/*');
echo 'avg:' . ($end - $start);
?>
Re: Web Page Load Time
Getting:
the RRD does not contain an RRA matching the chosen CF
Any ideas chaps?
- gandalf
- Developer
Re: Web Page Load Time
Either change the data template to define the additional CF, or change the graph template so it does not refer to the missing CF. Sometimes, e.g., the "last" GPRINT refers to CF "LAST", which seems to be valid but isn't; in that case, replace it with AVERAGE.
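For reference, a sketch of where the CFs come from (the RRA values here are hypothetical): the RRAs defined when the RRD is created determine which consolidation functions a graph can query.

```
# Hypothetical rrdtool create call: this RRD only has AVERAGE and MAX
# RRAs, so a GPRINT asking for CF "LAST" would raise the error above.
rrdtool create page_load.rrd --step 300 \
    DS:avg:GAUGE:600:0:U \
    RRA:AVERAGE:0.5:1:600 \
    RRA:MAX:0.5:1:600
```

You can check which RRAs an existing file has with `rrdtool info page_load.rrd`.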
R.