Marcio Augusto Stocco wrote:
> Testing Squid/SquidGuard with thousands of users, the cache.log shows
> the following messages:
>
> 2008/04/01 15:19:16| WARNING: All url_rewriter processes are busy.
> 2008/04/01 15:19:16| WARNING: up to 2730 pending requests queued
> 2008/04/01 15:19:16| Consider increasing the number of url_rewriter
> processes to at least 3552 in your config file.
> 2008/04/01 15:19:34| WARNING! Your cache is running out of filedescriptors
> 2008/04/01 15:19:50| WARNING! Your cache is running out of filedescriptors
> 2008/04/01 15:19:56| comm_open: socket failure: (24) Too many open files
> 2008/04/01 15:19:56| comm_open: socket failure: (24) Too many open files
> 2008/04/01 15:19:56| comm_open: socket failure: (24) Too many open files
>
> The server is an HP DL360 G5 (2x Xeon Dual 1.6 GHz, 8 GB RAM, HP Smart
> Array - RAID 1).
>
> Is there any way to increase SQUID_MAXFD from 8192 to 65536, so I can
> try using the suggested number of url_rewriter processes?
Squid 2.6: --with-maxfd=65536
Squid 3.x: --with-filedescriptors=65536

Be sure your OS can handle a single process with that many FDs, though.
Using these options overrides the automatic build detection, AFAIK.
You can also raise the limit with ulimit while compiling (I don't know
the details).
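As a rough sketch of the build-time steps being described (the limit
value and configure flags follow the options above; whether ulimit
needs to be raised before running configure depends on your OS, and
persisting the limit system-wide is OS-specific):

```shell
# Raise the per-process file-descriptor limit for this shell first
# (may require root; make it persistent via /etc/security/limits.conf
# or your OS's equivalent):
ulimit -HSn 65536

# Squid 2.6 -- compile in the higher FD ceiling:
./configure --with-maxfd=65536

# Squid 3.x -- same idea, different option name:
./configure --with-filedescriptors=65536
```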
>
> With SQUID_MAXFD=8192 I got lots of "comm_open: socket failure: (24)
> Too many open files" if url_rewriter is set higher than 200 (roughly).
>
> Thanks for any help,
> Marcio.
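Once the FD ceiling is raised, the helper count itself is set in
squid.conf. A minimal sketch (the number is illustrative, not a tuned
value; url_rewrite_concurrency only helps if your rewriter actually
speaks the concurrent helper protocol, which stock squidGuard of this
era does not):

```
# squid.conf sketch -- numbers are illustrative.
# Number of url_rewriter helper processes to start:
url_rewrite_children 1500

# If (and only if) the rewriter supports concurrent requests, one
# process can service many lookups at once, cutting the count needed:
# url_rewrite_concurrency 100
```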
For our info: you say you are handling thousands of users --
what release of Squid is it, and
what requests/sec load is your Squid maxing out at?
Amos
--
Please use Squid 2.6STABLE19 or 3.0STABLE4

Received on Tue Apr 01 2008 - 22:18:39 MDT