Hi Squid users,
I'm running Squid 1.0.12 on Solaris 2.5.
Every time I visit the Microsoft home page
at http://www.microsoft.com
my server (Squid) always fetches the documents
(HTML, .gif, .jpg) from the origin server.
This is due to the Expires header.
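My understanding of the freshness check (a rough sketch, not Squid's actual code) is that the cache compares the Expires date against the current time, so a page served with an Expires date in the past is always refetched:

```python
from email.utils import parsedate_to_datetime
from datetime import datetime, timezone

def is_fresh(expires_header: str, now: datetime) -> bool:
    """Return True if the Expires date lies in the future relative to `now`.
    A cached object whose Expires date has passed must be revalidated/refetched."""
    expires = parsedate_to_datetime(expires_header)
    return expires > now

# Example dates are illustrative only.
now = datetime(1996, 9, 30, tzinfo=timezone.utc)
print(is_fresh("Mon, 30 Sep 1996 00:00:00 GMT", now))  # already expired -> False
print(is_fresh("Tue, 01 Oct 1996 00:00:00 GMT", now))  # still fresh -> True
```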
1. Is it possible to cache a document (text or image)
from http://www.microsoft.com ?
2. How should I configure squid.conf so that the
request is served DIRECTLY from the object cache
(not passed to a neighbor or parent)?
In my squid.conf I added:
hierarchy_stoplist http://www.microsoft.com
Is this right?
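For what it's worth, my reading of the documentation is that hierarchy_stoplist takes a list of words matched anywhere in the URL (not a full URL prefix), and a match only forces Squid to go DIRECT instead of asking a neighbor or parent; it does not force the reply to come from the cache. So something like:

hierarchy_stoplist microsoft.com

would bypass the hierarchy, but an object with an already-expired Expires header would still be refetched from the origin.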
3. My DNS returns 10 IP addresses for www.microsoft.com,
and I started 5 dns_children, but after a
DNS lookup failure Squid still retried the same
IP address out of the 10 addresses found.
Also, although Squid has more than 5 dns_children,
it seems to use only one of them. Does the dnsserver
choose the IP address randomly?
Thanks, in advance.
Benarson.
--
------------------------------------------------------
- Email : Benarson.Behajaina@swh.sk
- Tel   : +42 (7) 378 3560
- Fax   : +42 (7) 377 433
- URL   : http://www.drp.fmph.uniba.sk/~benarson
------------------------------------------------------
Received on Mon Sep 30 1996 - 06:53:30 MDT
This archive was generated by hypermail pre-2.1.9 : Tue Dec 09 2003 - 16:33:06 MST