OK, that's another problem.
There is a tool for Squid named "purge" (I forgot the address)
which will remove invalid objects from your cache.
Another way is to remove the swap.state file and restart Squid,
which will rebuild the file.
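For what it's worth, a minimal sketch of that procedure, assuming your
cache directory is /var/spool/squid (substitute your own cache_dir path)
and that the squid binary is in your PATH:

    # ask the running squid to shut down cleanly
    squid -k shutdown
    # remove the (possibly corrupt) swap index; cached objects stay on disk
    rm /var/spool/squid/swap.state
    # start squid again; it scans the cache_dir and rebuilds swap.state
    squid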
If your network traffic is higher than 256 Kbit/s, reduce your memory
setting to 48 MB.
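The directive in question is cache_mem in squid.conf; something along
these lines (48 MB as suggested above, adjust to taste):

    # squid.conf -- memory used for in-transit and hot objects
    cache_mem 48 MB

Keep in mind cache_mem is not a cap on the total process size; the squid
process will still use more memory than this overall.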
Best Regards
Mehrdad Fatemi
R&D Director
< AFRANET Co. ---------------------------- R&D Dept. >
-----Original Message-----
From: Andrew Blanche <andrew@foxall.com.au>
To: Mehrdad Fatemi <fatemi@afranet.com>
Date: Tuesday, September 05, 2000 2:47 AM
Subject: Re: [SQU] Squid Failure
>Thanks for the reply!
>
>I was a little off in the fault description; it actually said:
>"/etc/squid #swap_duplicate: entry 000036something
>non-existent swap file
>Swap_Duplicate entry 000036
>vm: killing process squid
>trying to free swap"
>
>I had the mem setting at 64 MB on a 384 MB system. How low do I go, how
>low do I go?
>
>Andrew Blanche
>System Administrator
>Fox All Services Pty Ltd
>Ph - 61 3 9739 5262
>http://www.foxall.com.au
>----- Original Message -----
>From: "Mehrdad Fatemi" <fatemi@afranet.com>
>To: "Andrew Blanche" <andrew@foxall.com.au>
>Cc: "squid" <squid-users@ircache.net>
>Sent: Tuesday, September 05, 2000 6:50 AM
>Subject: Re: [SQU] Squid Failure
>
>
>> Decrease the Squid memory setting to about 1/4 or 1/5 of the total RAM
>> in your system.
>>
>> Best Regards
>>
>> Mehrdad Fatemi
>> R&D Director
>>
>> < AFRANET Co. ---------------------------- R&D Dept. >
>>
>> -----Original Message-----
>> From: Andrew Blanche <andrew@foxall.com.au>
>> To: squid-users@ircache.net <squid-users@ircache.net>
>> Date: Monday, September 04, 2000 3:32 PM
>> Subject: [SQU] Squid Failure
>>
>>
>> Hello to all
>>
>> I have recently developed a rather huge problem with our Squid cache.
>>
>> It keeps failing at least once a day during peak load times!
>> On failure, the squid child process has stopped.
>> The only message I have been able to get on the failure is
>> "Duplicate page error" followed by some kind of memory address.
>> This message is repeated a couple of times with different addresses,
>> then the message "vm: stopping process squid", then a message about
>> deleting duplicate pages.
>>
>> We had been running Squid for about two years without any problems. I
>> recently added 30 modems to our dial-in pool, making a total of 140,
>> and the problems started.
>>
>> We were running Squid 2 on Red Hat 6.0.
>> The machine was a Pentium 233 with 256 MB RAM and 2x 9.1 GB SCSI
>> drives (40 MB/s).
>>
>> I then built up a new box:
>> Pentium III 733 MHz
>> 384 MB RAM (133 MHz)
>> 1x 9.1 GB LVD SCSI (160 MB/s)
>> Red Hat 6.2
>> Squid 2
>>
>> I am still getting the same failure during peak load (about 5
>> accesses/sec).
>>
>> If anybody has any ideas, I would love to hear them ASAP.
>> I am getting really frazzled and don't know what to do!
>>
>> Regards, Andrew Blanche
>>
>>
>
--
To unsubscribe, see http://www.squid-cache.org/mailing-lists.html