[wlug] Dansguardian / Squid

Matt Brown matt@m...
Thu Mar 20 10:13:38 NZST 2003


Hi,

I'm pretty sure I'm going to add support for this to the log analyser
that I'm currently writing. We are in a similar position with our
schools that are using Dansguardian and Squid.

The Dansguardian FAQ suggests that a simple perl script could be used to
convert the log file into a format similar to the one squid uses.
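
Something along these lines would probably do it -- only a rough sketch
(in Python here rather than perl), and the dansguardian log layout it
assumes (date, time, user, client IP, URL, then the action text) should
be checked against a real log line before trusting it:

#!/usr/bin/env python3
# Rough sketch: convert dansguardian's own access log into squid-style
# "native" access.log lines so that squid-oriented analysers can read it.
# The field layout assumed here is a guess at the default dansguardian
# format; check it against your log and adjust before relying on it.

import sys
import time

for line in sys.stdin:
    fields = line.split()
    if len(fields) < 5:
        continue                                  # not a line we understand

    date, clock, user, client, url = fields[:5]
    denied = "*DENIED*" in line                   # how dansguardian marks blocks

    # dansguardian stamps "YYYY.M.D HH:MM:SS"; squid logs epoch seconds
    stamp = time.mktime(time.strptime(date + " " + clock,
                                      "%Y.%m.%d %H:%M:%S"))

    code = "TCP_DENIED/403" if denied else "TCP_MISS/200"

    # squid native format:
    #   time elapsed client code/status bytes method URL ident peer/host type
    # elapsed time, byte count and method aren't available here, so fake them
    print("%.3f %6d %s %s %d %s %s %s %s %s"
          % (stamp, 0, client, code, 0, "GET", url, user, "DIRECT/-", "-"))

You'd run it as something like 'convert.py < dansguardian-access.log >
fake-access.log' and point sarg at the output.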

Incidentally, Dansguardian does support adding the X-Forwarded-For header.
Unfortunately, squid needs a patch before it will parse that header.
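
On the dansguardian side it's a single option in dansguardian.conf (the
squid side still needs the patch before the header is honoured):

# dansguardian.conf: pass the real client address upstream in an
# X-Forwarded-For header
forwardedfor = on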


On Thu, 20 Mar 2003, Daniel Lawson wrote:

>
> I currently have a Dansguardian / Squid proxy set up.
>
> It's set up in the 'standard' way, with Dansguardian sitting between the
> network and squid.
>
> This is fine, except that it affects the logging of requests. In squid's
> logs, all requests come from localhost (where dansguardian is running).
> If I want to trace where a request actually originated, I then have to
> find it in the dansguardian log files.
>
> This is OK if it's just me doing it every now and then, but the person
> in charge of this site would like better access to this information.
> I'm currently running sarg (http://web.onda.com.br/orso/sarg.html), and
> aside from any other issues it may have, because it parses the squid log
> file I can only see bad hits as coming from localhost, and any hits that
> are filtered by dansguardian I won't see at all.
>
> DG does log everything, but as far as I can tell it's not in a form that
> SARG groks.
>
> Does anyone have any other thoughts on how to do this? Suggestions for
> tools other than DansGuardian and SARG are acceptable.
>
>
> One way I have thought of is to run two instances of squid, sandwiching
> the DG process. The first runs in proxy-only mode and performs
> authentication. Its logfiles are the ones SARG reads. It then forwards
> requests to DG, which performs content and URL filtering and passes the
> requests on to the second, caching, squid. This has the benefit of
> giving me full access to squid's logs, and showing me any accesses that
> were denied by DG.
>
>
> I don't want to just place a caching squid in front of DG, because if
> something is cached it may be served out of the squid cache without
> checking DG for 'permission'.
>
> Any thoughts?
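
For what it's worth, the sandwich you describe would be wired up roughly
like this -- the port numbers are arbitrary and the directive names should
be checked against the squid and dansguardian versions you're running:

# squid.conf for the front (proxy-only) squid that clients talk to.
# It authenticates and logs real client addresses, never caches, and
# hands everything to dansguardian.
http_port 3128
no_cache deny all
cache_peer 127.0.0.1 parent 8080 0 no-query default
never_direct allow all

# dansguardian.conf: listen where the front squid points, filter, and
# forward to the caching squid behind it.
filterport = 8080
proxyip = 127.0.0.1
proxyport = 3129

# squid.conf for the back (caching) squid: a normal caching config,
# just on a different port.
http_port 3129

SARG would then read the front squid's access.log, which records the real
client address for every request, including the ones dansguardian goes on
to block.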

-- 
Matt Brown
matt@m...



