squidpurge works, but it's hardly ideal, especially on Squids with big
disks... in my testing on a box with 3x1TB cache_dirs, it took 15
minutes to run and thrashed the disks pretty hard the whole time,
hurting response times for production traffic.
The reason for this is that squid indexes each object internally by a
hash of its URL rather than by the URL itself, so to search for regex
matches there's no alternative to opening every file in every
cache_dir and checking its recorded URL against the regex.
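To make the cost asymmetry concrete, here's a toy model of a
hash-keyed store (illustration only: Squid's real key is an MD5 over
more than just the URL text, and the scan happens on disk, not in
memory). Exact-URL purge is a single key lookup; regex purge has to
touch everything:

```python
import hashlib
import re

store = {}  # key -> (url, body): the URL survives only inside each object's metadata

def cache_key(url):
    # Illustrative key derivation -- a digest destroys any URL structure.
    return hashlib.md5(url.encode()).hexdigest()

def store_object(url, body):
    store[cache_key(url)] = (url, body)

def purge_exact(url):
    # O(1): recompute the key and delete a single entry.
    return store.pop(cache_key(url), None) is not None

def purge_regex(pattern):
    # O(n): the keys tell you nothing about the URLs, so every object's
    # metadata must be read -- the on-disk analogue is walking every
    # file in every cache_dir, which is exactly what squidpurge does.
    rx = re.compile(pattern)
    victims = [k for k, (url, _) in store.items() if rx.search(url)]
    for k in victims:
        del store[k]
    return len(victims)
```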
An easily searchable URL datastore would help immensely here. As a mad
experiment a while back, a former colleague hacked a SQL update into
the store and release functions, but it's unlikely anything like that
would work well in production without serious work to guarantee
squid/DB data integrity.
Some sort of internal b-tree holding all currently-cached URLs might
be a solution... or even an embedded SQLite database? Has anyone else
ever proposed such a solution?
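For the SQLite idea, a minimal sketch of what such a side index might
look like (everything here is hypothetical: the table and function
names are made up, and nothing like this ships with Squid). The store
and release hooks keep a key-to-URL table current, and pattern purges
become a query instead of a disk walk:

```python
import re
import sqlite3

# Hypothetical side index mapping store keys to URLs.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE url_index (store_key TEXT PRIMARY KEY, url TEXT NOT NULL)")
db.execute("CREATE INDEX idx_url ON url_index(url)")

# SQLite has no built-in REGEXP; registering a function named
# "regexp" is what makes the `url REGEXP ?` operator work.
db.create_function("regexp", 2, lambda pat, s: re.search(pat, s) is not None)

def record_store(store_key, url):
    # Would be called from Squid's store path.
    db.execute("INSERT OR REPLACE INTO url_index VALUES (?, ?)", (store_key, url))

def record_release(store_key):
    # Would be called from Squid's release path.
    db.execute("DELETE FROM url_index WHERE store_key = ?", (store_key,))

def keys_matching(pattern):
    # The matching store keys could then be purged one at a time via
    # Squid's normal exact-URL PURGE mechanism -- no cache_dir scan.
    rows = db.execute("SELECT store_key FROM url_index WHERE url REGEXP ?", (pattern,))
    return [r[0] for r in rows]
```

The hard part, as noted above, isn't the queries: it's keeping the
index and the cache transactionally in step across crashes.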
-C
On Aug 25, 2008, at 11:55 PM, Amos Jeffries wrote:
>> On Sun, 24 Aug 2008 08:59:07 +1200
>> Amos Jeffries <squid3_at_treenet.co.nz> wrote:
>>
>>> Paras Fadte wrote:
>>>> Hi,
>>>>
>>>> Is there any utility for purging cached web objects in squid with
>>>> wildcard support ?
>>>
>>> Not that we know of.
>>
>> You presumably know about squidpurge. Has it broken or something?
>>
>> http://www.wa.apana.org.au/~dean/squidpurge/
>>
>
> Ah. no. I didn't. I see the name fly by every now and again, but
> haven't
> really noticed it.
>
> Amos
>
>
Received on Mon Sep 01 2008 - 15:17:30 MDT
This archive was generated by hypermail 2.2.0 : Mon Sep 01 2008 - 12:00:04 MDT