For anyone who is interested:
Here at Massey University we've used Squid 1.1.x with redirectors to
implement authorisation of web use. The redirectors use RPC calls to our
database daemons to decide if the use of the cache is to be allowed, based
on source host (and who is logged in on it, also in the database),
destination host and time of day. Because RPC calls can be slow, the
redirectors also maintain a shared memory cache of recent answers; sharing
that cache keeps the individual redirector processes small and cuts down
the number of RPC calls being made.
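To give the flavour of it, here is a rough sketch in Python (not what we
actually run - our redirectors are not Python, rpc_check() is a stand-in
for our real RPC client, and a plain in-process dictionary stands in for
the shared memory segment), just to show the cache-in-front-of-the-RPC
idea:

    import time

    CACHE_TTL = 300     # seconds to trust a cached answer (made-up value)
    _cache = {}         # (src_host, dest_host, hour) -> (expiry, decision)

    def rpc_check(src_host, dest_host, hour):
        """Stand-in for the RPC call to the authorisation database daemon."""
        raise NotImplementedError("site specific")

    def allowed(src_host, dest_host, hour):
        key = (src_host, dest_host, hour)
        now = time.time()
        hit = _cache.get(key)
        if hit is not None and hit[0] > now:
            return hit[1]               # cached decision, no RPC needed
        decision = rpc_check(src_host, dest_host, hour)
        _cache[key] = (now + CACHE_TTL, decision)
        return decision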
If the access is allowed the redirector returns "" (an empty line), which
lets Squid proceed with the original URL; otherwise it returns an error URL
like host/error.cgi?url=x&error=y, which tells the user that they have been
denied access and why.
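The redirector loop itself is tiny. As I recall, Squid 1.1 hands the
redirector one request per line on stdin ("URL client-address/fqdn ident
method" - check your release) and reads one line back on stdout.
Continuing the sketch above (again illustrative only, using the allowed()
function from before; the error.cgi name is just our local convention):

    import sys
    import time

    ERROR_CGI = "http://host/error.cgi"     # local error page

    def main():
        for line in sys.stdin:
            parts = line.split()
            if len(parts) < 2:
                print()                      # malformed line: let it through
                sys.stdout.flush()
                continue
            url = parts[0]
            src_host = parts[1].split("/")[0]    # "ip/fqdn" -> ip
            dest_host = url.split("/")[2] if url.count("/") >= 2 else url
            hour = time.localtime().tm_hour
            if allowed(src_host, dest_host, hour):
                print()                      # empty line: no rewrite, proceed
            else:
                # the original URL should really be URL-escaped here
                print("%s?url=%s&error=denied" % (ERROR_CGI, url))
            sys.stdout.flush()

    if __name__ == "__main__":
        main()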
Using this we can do things like allowing students in labs access only to
web sites in New Zealand and a few select places around the world, or
allowing staff access to www.penthousemag.com only outside working hours,
and so on.
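Just to make the flavour of the rules concrete, a toy, hard-coded version
of the sort of policy the database expresses (in reality the user classes,
site lists and hours all come out of the database, nothing is hard-coded,
and the whitelist entries below are made up):

    WORK_HOURS = range(8, 18)       # 8am to 6pm, say

    OVERSEAS_OK = {"www.w3.org", "sunsite.unc.edu"}     # made-up whitelist

    def policy(user_class, dest_host, hour):
        if user_class == "student-lab":
            # lab students: New Zealand sites plus a short whitelist
            return dest_host.endswith(".nz") or dest_host in OVERSEAS_OK
        if user_class == "staff" and dest_host.endswith("penthousemag.com"):
            return hour not in WORK_HOURS   # only outside working hours
        return True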
Of course, to make this completely effective we need to firewall the web
ports so that everyone is forced to go through the cache.
Does the way we've done it seem reasonable?
Brent Foster
Systems Programmer, Massey University, New Zealand