Aug 31, 2009

I’ve configured awstats to provide statistics on visitors to the websites on this server. I’ve also configured xymon to notify me when a website is not accessible. xymon uses a bot that checks every 5 minutes to make sure each site is still up. As you can imagine, this generates a lot of hits that aren’t really from people.

So, since I don’t want to disable monitoring, I thought I should at least configure my stats so that they distinguish real people from this bot.

awstats has an excellent robot-detection feature, but it does not yet detect xymon. So, to make the change, you need to dive into the robots.pm file, normally found in /usr/local/awstats/wwwroot/cgi-bin/lib/
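
Before making the edits, it helps to know how the two structures in robots.pm relate. Here is a rough sketch, with a couple of illustrative entries standing in for the long lists that actually ship with awstats:

    # Simplified sketch of the two structures in robots.pm. The googlebot and
    # msnbot entries are illustrative stand-ins for the full lists.
    # awstats tries each ID in order as a case-insensitive regex against the
    # User-Agent string; the first match is then looked up in %RobotsHashIDLib
    # to get the name shown in the Robots/Spiders visitors report.
    @RobotsSearchIDOrder_list1 = (
        'googlebot',
        'msnbot'
        # ... many more entries ...
    );

    %RobotsHashIDLib = (
        'googlebot', 'Googlebot',
        'msnbot',    'MSNBot'
        # ... many more entries ...
    );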

You will need to make two changes for it to detect this new robot (the comments at the top of the file explain this). So, at the end of the @RobotsSearchIDOrder_list1 = ( array, I added:

  • 'Hobbit[_+ ]bbtest-net/4.2.3'

And then, at the top of the "Other robots reported by users" section of %RobotsHashIDLib = (, I added (both changes are shown together in context just after this list):

  • 'Hobbit[_+ ]bbtest-net/4.2.3','xymon',
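
Putting the two changes together, the relevant parts of robots.pm end up looking roughly like this (existing entries elided):

    # End of the first ID list, with the xymon/Hobbit pattern appended.
    # The [_+ ] class matches a space, underscore, or plus at that position
    # of the User-Agent string.
    @RobotsSearchIDOrder_list1 = (
        # ... existing entries ...
        'Hobbit[_+ ]bbtest-net/4.2.3'
    );

    # The label awstats displays for that pattern, added under the
    # "Other robots reported by users" comment inside %RobotsHashIDLib.
    %RobotsHashIDLib = (
        # ... existing entries ...
        'Hobbit[_+ ]bbtest-net/4.2.3','xymon',
        # ... existing entries ...
    );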

Once I deleted the files in /var/lib/awstats/ and reprocessed my logs, the statistics better reflected my visitors. Beware that if you delete files from /var/lib/awstats/ and don’t have the original log files around, you will lose the history for that period of time. Also, this change applies to every site running on this server that uses awstats.
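
For reference, rebuilding a month amounts to removing that month’s data file and re-running the awstats update against the saved access log. A minimal sketch, assuming a config named example.com and the default data-file naming (the actual file name depends on your config name):

    # Remove only the data file for the month being rebuilt (August 2009 here),
    # then reprocess the saved access log for that period.
    rm /var/lib/awstats/awstats082009.example.com.txt
    perl /usr/local/awstats/wwwroot/cgi-bin/awstats.pl -config=example.com -update

The -update run reads whatever LogFile points to in the awstats config, so if the log has already been rotated you need to point it at the archived copy first.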

My webserver only had logs for August, so I only deleted the file for August and left the file for July with the missed detections, since I cannot rebuild it. I’ve since changed the configuration of the log rotator.

Later I’ll be posting a more in-depth tutorial on installing awstats, and also on modifying the robots.pm file, since I couldn’t find much about it online.

UPDATE: As promised, here is a more detailed how-to on updating the robots file: http://wiki.cornempire.net/doku.php?id=awstats:awstatsrobots