Sun Feb 23 15:59:26 NZDT 2003
> they were important to me when I set it up, since it was hosted on adsl,
> and I was worried how much bandwidth it was pulling, but less about how
> much it was human. http://www.wlug.org.nz/archive/querys-recent.txt is
> autogenerated every night with what everyone is searching for from
> google. it appears to be about 150 hits from google a day at the
> moment. I touched the config up a bit last night before I went to bed.
> What I really want to do is get all the search engines grouped into one
> line, but I've not had success with getting webalizer to do that for me.
You should be able to do that with the Group and Hide config directives:
GroupSite *.googlebot.com Search Engine
GroupSite *.fastsearch.net Search Engine
I also do this with ISPs, grouping all of an ISP's dial-in IPs under one name.
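As a sketch, pairing Group with Hide in webalizer.conf keeps only the group totals in the report (reusing the two crawler patterns above; adjust the hostname and referrer patterns to what actually shows up in your logs):

```
# Collapse crawler hits into one "Search Engine" line, and hide the
# individual hostnames so only the group total appears.
GroupSite   *.googlebot.com     Search Engine
HideSite    *.googlebot.com
GroupSite   *.fastsearch.net    Search Engine
HideSite    *.fastsearch.net

# The same Group/Hide pairing works for referrers.
GroupReferrer   http://www.google.com/   Google
HideReferrer    http://www.google.com/
```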
> All in all I've been rather disappointed with log analysers in general,
> to me webalizer is the best of a particularly bad bunch.
Indeed. One could write one's own analyser fairly easily with the help of
the HTTPD::Log::Filter Perl module. It'd be slow though. Webalizer is one
of the fastest, and it's written in C.
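For illustration, a minimal sketch of that sort of home-grown analyser in Python rather than Perl (so it doesn't assume anything about the HTTPD::Log::Filter API), assuming Apache combined log format and Google's q= query parameter -- it pulls search phrases out of the referrer field, much like the querys-recent.txt report does:

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Combined Log Format:
# host ident user [time] "request" status bytes "referrer" "agent"
LOG_RE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[[^\]]+\] "(?P<request>[^"]*)" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"$'
)

def search_queries(lines):
    """Count Google search phrases found in referrer URLs."""
    queries = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        ref = urlparse(m.group("referrer"))
        if "google." in ref.netloc:
            # parse_qs decodes '+' to spaces in the query string
            q = parse_qs(ref.query).get("q", [])
            if q:
                queries[q[0]] += 1
    return queries

# Two made-up log lines for demonstration
sample = [
    '1.2.3.4 - - [23/Feb/2003:15:59:26 +1300] "GET /WlugPage HTTP/1.0" '
    '200 1234 "http://www.google.com/search?q=linux+firewall" "Mozilla/4.0"',
    '5.6.7.8 - - [23/Feb/2003:16:01:00 +1300] "GET /Other HTTP/1.0" '
    '200 99 "-" "Mozilla/4.0"',
]
print(search_queries(sample).most_common())
```

Nothing clever, but it shows why a pure-interpreted analyser is slower than Webalizer: one regex match plus a URL parse per log line adds up over a big access_log.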
> > Also your referrer records are useless. You've not filtered out local
> > referrers. Eg, you should be hiding referrers from www.wlug.org.nz.
> I played with that last night, apparently my changes didn't work,
> perhaps because of the history file. Oh well, I've updated it some,
Yes. You would have to delete the history and recreate the output for
the changes to take effect.
Oliver Jones > Senior Software Engineer > Deeper Design Limited.
oliver@d... > www.deeperdesign.com > +64 (21) 41-2238
NOTICE: This is an archive of a public mailing list. The University of Waikato is not responsible for its contents.