> -----Original Message-----
> ----------------------------------------
> > Subject: [squid-users] Looking for web usage reporting solution
> >
> > I am looking for a web usage reporting solution that can run via
> sniffing or from a mirror port on a switch. I envision this solution
> would simply log each URL request it sees and allow reports to be
> generated on web sites that internal users have gone to. I've searched
> high and low, but cannot find a "ready-made" solution, so I'm looking
> to put it together myself.
> >
> > Most people/posts suggest using Squid/SquidGuard/DansGuardian, but
> it appears to me that is an inline-only solution, and I would prefer a
> sniffing solution for safety (if the machine crashes, it doesn't take
> down the Internet). In that sense, it would work a lot like Websense,
> but without the blocking, only the reporting.
> >
> > From a high-level pseudo-code standpoint, it would simply sniff all
> traffic, and when it sees a packet requesting a webpage, it parses it
> and dumps these results into a database:
> >
> > -Date
> > -Time
> > -Source IP
> > -Dest IP
> > -URL requested
> > -FQDN portion of web request - i.e., if the request was for
> > http://www.microsoft.com/windows/server/2003, it records only
> > www.microsoft.com here
> > -Domain portion of web request - only microsoft.com in the above example
> >
> > Using this data, I can then produce reports for the client on who
> went where, and when. Personally, I thought this would be a great
> open-source project, but I can't find anything like it already out
> there! It seems like a mix of Squid, NTOP, and Snort...
> >
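The parsing step described above can be sketched in Python. This is only the parse-and-extract stage (the capture itself would come from a mirror port via libpcap or similar, which is out of scope here); the function name and field layout are my own illustration, not an existing tool:

```python
from datetime import datetime, timezone

def parse_http_request(payload, src_ip, dst_ip, when=None):
    """Parse a raw HTTP/1.x request payload into the fields to log.

    Returns a dict with date, time, IPs, URL, FQDN, and a naive
    registrable domain, or None if the payload is not an HTTP request.
    """
    lines = payload.split("\r\n")
    parts = lines[0].split(" ")
    if len(parts) != 3 or not parts[2].startswith("HTTP/"):
        return None  # not an HTTP request line
    method, path, _version = parts
    host = ""
    for line in lines[1:]:
        if line.lower().startswith("host:"):
            host = line.split(":", 1)[1].strip()
            break
    fqdn = host.split(":")[0]          # drop any :port suffix
    labels = fqdn.split(".")
    domain = ".".join(labels[-2:])     # naive: wrong for e.g. .co.uk
    when = when or datetime.now(timezone.utc)
    return {
        "date": when.strftime("%Y-%m-%d"),
        "time": when.strftime("%H:%M:%S"),
        "src_ip": src_ip,
        "dst_ip": dst_ip,
        "url": "http://" + fqdn + path,
        "fqdn": fqdn,
        "domain": domain,
    }
```

Each resulting dict maps directly onto one row of the database table described above; the two-label domain heuristic would need a public-suffix list for real use.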
>
> What's wrong with running a bash script on the squid logs?
>
I assume absolutely nothing is wrong with it, and the simplicity would be grand! I'm just having a tough time wrapping my head around how to get those logs in the first place. Can squid be set up to log only the traffic it sees passing on port 80, so it could run in a sniffing scenario rather than inline? If so, I'll definitely start playing with that, because that would be a simple solution.
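For the log-parsing half of that idea, here is a minimal sketch (in Python rather than bash) that pulls the same fields out of squid's default "native" access.log format, where each line is roughly `time elapsed client action/code size method URL ident hierarchy/from type`. The sample line in the test is made up for illustration:

```python
from datetime import datetime, timezone
from urllib.parse import urlparse

def parse_squid_line(line):
    """Extract date, time, client IP, URL, FQDN, and domain
    from one squid native-format access.log line."""
    fields = line.split()
    if len(fields) < 7:
        return None  # malformed or truncated line
    ts = datetime.fromtimestamp(float(fields[0]), tz=timezone.utc)
    url = fields[6]
    fqdn = urlparse(url).hostname or ""
    domain = ".".join(fqdn.split(".")[-2:])  # naive two-label heuristic
    return {
        "date": ts.strftime("%Y-%m-%d"),
        "time": ts.strftime("%H:%M:%S"),
        "src_ip": fields[2],
        "url": url,
        "fqdn": fqdn,
        "domain": domain,
    }
```

Feeding each parsed dict into a database (or even just sorting and counting them) would give the who-went-where-when reports described earlier.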
Received on Fri Nov 13 2009 - 16:54:57 MST
This archive was generated by hypermail 2.2.0 : Fri Nov 13 2009 - 12:00:04 MST