Hello,
I have just installed Squid and run a few tests.
When I execute the following command several times:
$> /usr/sbin/squidclient http://en.wikipedia.org/wiki/Squid
I can see in the log file that the URL always seems to be forwarded to the
origin website:
$> cat /var/log/squid/access.log
1251219737.080 170 127.0.0.1 TCP_MISS/200 73186 GET
http://en.wikipedia.org/wiki/Squid - DIRECT/91.198.174.2 text/html
1251219740.240 132 127.0.0.1 TCP_MISS/200 73186 GET
http://en.wikipedia.org/wiki/Squid - DIRECT/91.198.174.2 text/html
1251219742.014 118 127.0.0.1 TCP_MISS/200 73184 GET
http://en.wikipedia.org/wiki/Squid - DIRECT/91.198.174.2 text/html
Is there a way to tell Squid to always cache a specific website?
I would also like to make sure that URLs with query-string parameters
are cached as well (for URLs like
http://en.wikipedia.org/wiki/Squid?foo=bar).
I have looked through the full list of config options
(http://www.squid-cache.org/Doc/config/) but cannot find anything that
would help me.
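For context, here is the sort of squid.conf sketch I was hoping for, based on
my reading of the refresh_pattern docs (the regex and the timing values are
just guesses on my part; I am not sure these options actually force caching
the way I want):

```
# Default squid.conf refuses to cache URLs containing "?" via lines like
# these; my understanding is they would need to be removed or relaxed:
#   hierarchy_stoplist cgi-bin ?
#   acl QUERY urlpath_regex cgi-bin \?
#   cache deny QUERY

# Guessed rule: treat en.wikipedia.org pages as fresh for at least 1 day
# (1440 min) and at most 1 week (10080 min), overriding the site's own
# Expires headers and ignoring client no-cache reloads.
refresh_pattern -i ^http://en\.wikipedia\.org/ 1440 100% 10080 override-expire ignore-reload
```

Would something along those lines be the right approach, or is there a more
direct "always cache this site" directive that I missed?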
Thanks in advance
Best Regards
--
View this message in context: http://www.nabble.com/Systematic-caching-tp25138412p25138412.html
Sent from the Squid - Users mailing list archive at Nabble.com.
Received on Tue Aug 25 2009 - 17:08:47 MDT