ONLY Cache certain Websites.


ONLY Cache certain Websites.

nuhll
Hello,
I want to use my Debian home server for caching. We don't have fast internet here, and it takes a long time to download Windows updates, Kaspersky updates, and Apple updates 8 times over, every time an update comes out.

Now I want Squid to cache ONLY certain websites and/or certain files.

I know this post:

http://wiki.squid-cache.org/SquidFaq/WindowsUpdate

which explains what I need to do to get Windows Update caching, plus

http://lkrms.org/caching-ios-updates-on-a-squid-proxy-server/

for Apple updates, plus antivirus (AntiVir, Kaspersky) caching.

1.) The easiest way would be to point DNS at the Squid server, but that doesn't seem to work. Any ideas?

2.) At the moment I use a PAC file which gets distributed via DHCP. The downside is that ALL requests go to Squid. This is my squid.conf:
http_port 192.168.0.1:3128 transparent

acl localnet src 192.168.0.0
acl localhost src 127.0.0.1
acl all src 0.0.0.0
range_offset_limit 5000 MB
maximum_object_size 5000 MB
quick_abort_min -1

# Add one of these lines for each of the websites you want to cache.

refresh_pattern -i microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

# DONT MODIFY THESE LINES
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsupdate dstdomain sls.microsoft.com
acl windowsupdate dstdomain productactivation.one.microsoft.com
acl windowsupdate dstdomain ntservicepack.microsoft.com

acl CONNECT method CONNECT
acl wuCONNECT dstdomain www.update.microsoft.com
acl wuCONNECT dstdomain sls.microsoft.com

http_access allow CONNECT wuCONNECT localnet
http_access allow windowsupdate localnet

http_access allow CONNECT wuCONNECT localnet
http_access allow CONNECT wuCONNECT localhost
http_access allow windowsupdate localnet
http_access allow windowsupdate localhost
#======================
#Then I'd add this to ONLY cache the windows updates:
#======================

acl mywindowsupdates dstdomain .my.windowsupdate.website.com .windowsupdate.com .microsoft.com
cache allow mywindowsupdates
always_direct allow localnet


But it doesn't work; Windows just reports "problem with Windows Updates, blah blah".

Also, many websites only work the second time I connect to them. It's crazy.

It must be possible to cache ONLY some websites and/or files and send ALL OTHER traffic straight to its original destination, right?
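As a hedged aside on the PAC approach from point 2: a PAC file can itself return DIRECT for everything except the domains worth proxying, so only those requests ever reach Squid at all. A sketch follows; the proxy address matches the http_port above, but the domain list and the plain suffix matching are assumptions (real PAC helpers such as dnsDomainIs could be used instead):

```javascript
// Sketch of a PAC file: proxy only the update domains, go direct otherwise.
// The domain list here is illustrative, not taken from the thread.
function FindProxyForURL(url, host) {
  var cached = [".windowsupdate.com", ".microsoft.com", ".kaspersky.com"];
  for (var i = 0; i < cached.length; i++) {
    var d = cached[i];
    // Suffix match: ".microsoft.com" matches www.microsoft.com, etc.
    if (host === d.slice(1) || host.slice(-d.length) === d) {
      return "PROXY 192.168.0.1:3128"; // send through Squid for caching
    }
  }
  return "DIRECT"; // everything else bypasses the proxy entirely
}
```

The Squid side would still need its http_access and refresh_pattern rules; the PAC file only decides which requests reach the proxy in the first place.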


Thank you all for your help.

Re: ONLY Cache certain Websites.

nuhll
I have now changed it back to the original Windows Update cache config:

acl localnet src 192.168.0.0
acl all src 0.0.0.0

http_port 192.168.0.1:3128 transparent

range_offset_limit 100 MB windowsupdate
maximum_object_size 6000 MB
quick_abort_min -1


# Add one of these lines for each of the websites you want to cache.

refresh_pattern -i microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims


# DONT MODIFY THESE LINES
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsupdate dstdomain sls.microsoft.com
acl windowsupdate dstdomain productactivation.one.microsoft.com
acl windowsupdate dstdomain ntservicepack.microsoft.com

acl CONNECT method CONNECT
acl wuCONNECT dstdomain www.update.microsoft.com
acl wuCONNECT dstdomain sls.microsoft.com

http_access allow CONNECT wuCONNECT localnet
http_access allow windowsupdate localnet

Re: ONLY Cache certain Websites.

nuhll
I'm not able to fix it.

Normal websites work, but I can't get it to cache (or even allow access to) Windows Update or Kaspersky.

What am I doing wrong?

2014/08/02 17:05:35| The request GET http://dnl-16.geo.kaspersky.com/updaters/updater.xml is DENIED, because it matched 'localhost'
2014/08/02 17:05:35| The reply for GET http://dnl-16.geo.kaspersky.com/updaters/updater.xml is ALLOWED, because it matched 'localhost'


2014/08/02 17:06:32| The request CONNECT 62.128.100.41:443 is DENIED, because it matched 'localhost'
2014/08/02 17:06:32| The reply for CONNECT 62.128.100.41:443 is ALLOWED, because it matched 'localhost'


2014/08/02 17:07:07| The request CONNECT sls.update.microsoft.com:443 is DENIED, because it matched 'localhost'
2014/08/02 17:07:07| The reply for CONNECT sls.update.microsoft.com:443 is ALLOWED, because it matched 'localhost'



My config at the moment:
debug_options ALL,1 33,2
acl localnet src 192.168.0.0
acl all src 0.0.0.0
acl localhost src 127.0.0.1

access_log daemon:/var/log/squid/access.test.log squid

http_port 192.168.0.1:3128 transparent

cache_dir ufs /daten/squid 100000 16 256

range_offset_limit 100 MB windowsupdate
maximum_object_size 6000 MB
quick_abort_min -1


# Add one of these lines for each of the websites you want to cache.

refresh_pattern -i microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i geo.kaspersky.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

# DONT MODIFY THESE LINES
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

acl kaspersky dstdomain .kaspersky.com
acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsupdate dstdomain sls.microsoft.com
acl windowsupdate dstdomain productactivation.one.microsoft.com
acl windowsupdate dstdomain ntservicepack.microsoft.com

acl CONNECT method CONNECT
acl wuCONNECT dstdomain www.update.microsoft.com
acl wuCONNECT dstdomain sls.microsoft.com

http_access allow kaspersky localnet
http_access allow CONNECT wuCONNECT localnet
http_access allow windowsupdate localnet

http_access allow localnet
http_access allow localhost


Re: ONLY Cache certain Websites.

Amos Jeffries
Administrator
On 3/08/2014 3:07 a.m., nuhll wrote:

> im not able to fix it.
>
> Normal websites work. But i cant get it to cache (or even allow access to
> Windows Update or Kaspersky).
>
> Whats i am doin wrong?
>
> 2014/08/02 17:05:35| The request GET
> http://dnl-16.geo.kaspersky.com/updaters/updater.xml is DENIED, because it
> matched 'localhost'
> 2014/08/02 17:05:35| The reply for GET
> http://dnl-16.geo.kaspersky.com/updaters/updater.xml is ALLOWED, because it
> matched 'localhost'
>
>
> 2014/08/02 17:06:32| The request CONNECT 62.128.100.41:443 is DENIED,
> because it matched 'localhost'
> 2014/08/02 17:06:32| The reply for CONNECT 62.128.100.41:443 is ALLOWED,
> because it matched 'localhost'
>
>
> 2014/08/02 17:07:07| The request CONNECT sls.update.microsoft.com:443 is
> DENIED, because it matched 'localhost'
> 2014/08/02 17:07:07| The reply for CONNECT sls.update.microsoft.com:443 is
> ALLOWED, because it matched 'localhost'
>

So which access.log lines match these transactions?

>
> my config atm:
> debug_options ALL,1 33,2
> acl localnet src 192.168.0.0
> acl all src 0.0.0.0

1) you are defining the entire Internet to be a single IP address
"0.0.0.0" ... which is invalid.

This should be:
   acl all src all
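A minimal sketch of corrected ACL definitions, for reference (the /24 mask on localnet is an assumption here; a bare 192.168.0.0, as in the config above, matches only that single address):

```
acl all src all                  # the built-in "match any address" set
acl localnet src 192.168.0.0/24  # assumed /24; a bare IP matches one host only
acl localhost src 127.0.0.1/32
```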

> acl localhost src 127.0.0.1
>
> access_log daemon:/var/log/squid/access.test.log squid
>
> http_port 192.168.0.1:3128 transparent
>
> cache_dir ufs /daten/squid 100000 16 256
>
> range_offset_limit 100 MB windowsupdate
> maximum_object_size 6000 MB
> quick_abort_min -1
>
>
> # Add one of these lines for each of the websites you want to cache.
>
> refresh_pattern -i
> microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
> reload-into-ims
>
> refresh_pattern -i
> windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
> 432000 reload-into-ims
>
> refresh_pattern -i
> windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
> reload-into-ims
>
> refresh_pattern -i
> geo.kaspersky.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
> 432000 reload-into-ims
>
> # DONT MODIFY THESE LINES
> refresh_pattern ^ftp:           1440    20%     10080
> refresh_pattern ^gopher:        1440    0%      1440
> refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
> refresh_pattern .               0       20%     4320
>
> acl kaspersky dstdomain .kaspersky.com
> acl windowsupdate dstdomain windowsupdate.microsoft.com
> acl windowsupdate dstdomain .update.microsoft.com
> acl windowsupdate dstdomain download.windowsupdate.com
> acl windowsupdate dstdomain redir.metaservices.microsoft.com
> acl windowsupdate dstdomain images.metaservices.microsoft.com
> acl windowsupdate dstdomain c.microsoft.com
> acl windowsupdate dstdomain www.download.windowsupdate.com
> acl windowsupdate dstdomain wustat.windows.com
> acl windowsupdate dstdomain crl.microsoft.com
> acl windowsupdate dstdomain sls.microsoft.com
> acl windowsupdate dstdomain productactivation.one.microsoft.com
> acl windowsupdate dstdomain ntservicepack.microsoft.com
>
> acl CONNECT method CONNECT
> acl wuCONNECT dstdomain www.update.microsoft.com
> acl wuCONNECT dstdomain sls.microsoft.com
>
> http_access allow kaspersky localnet
> http_access allow CONNECT wuCONNECT localnet
> http_access allow windowsupdate localnet
>
> http_access allow localnet
> http_access allow localhost
>

The above rule set is equivalent to:
 http_access allow localhost
 http_access deny !localnet
 http_access allow all
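For comparison, the conventional way to spell that policy out explicitly ends with a final deny (a sketch following the stock squid.conf pattern, not taken from the thread):

```
http_access allow localhost
http_access allow localnet
http_access deny all     # refuse anything the rules above did not allow
```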

Amos


Re: ONLY Cache certain Websites.

nuhll
Seems like "acl all src all" fixed it. Thanks!

One problem is left: is it possible to cache only certain websites, while the rest is just redirected straight through? I just want to add a refresh_pattern per website that should be cached.

"The above rule set is equivalent to:
 http_access allow localhost
 http_access deny !localnet
 http_access allow all "

Is this wrong? If yes, how do I correct it?

Re: ONLY Cache certain Websites.

Amos Jeffries
Administrator
On 3/08/2014 9:25 p.m., nuhll wrote:
> Seems like "acl all src all" fixed it. Thanks!
>
> One problem is left. Is it possible to cache only certain websites, while the
> rest is just redirected?

The "cache" directive is used to tell Squid which transactions are to be
denied storage (deny matches). The rest (allow matches) are cached (or
not) as per the HTTP specification. http://www.squid-cache.org/Doc/config/cache/

Redirect is done with a url_rewrite_program helper, or with a deny_info ACL
producing a 30x status and an alternative URL for the client to be
redirected to. Although I guess you used the word "redirected" to mean
something other than HTTP redirection, so this may not be what you want
to do.

Amos


Re: ONLY Cache certain Websites.

nuhll
Hello,
You are right: I don't mean a redirect like a 301.

I mean that Squid should not touch the website or the connection and should just send it straight to the website, except for some websites which I want to cache.

How do I achieve this?

Re: ONLY Cache certain Websites.

Igor Novgorodov
always_direct directive

On 04.08.2014 22:15, nuhll wrote:

> Hello,
> you are right: I don't mean a redirect like a 301.
>
> I mean that Squid should not touch the website or the connection and should
> just send it straight to the website, except for some websites which I want to cache.
>
> How do I achieve this?
>
>
>
> --
> View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/ONLY-Cache-certain-Websites-tp4667121p4667134.html
> Sent from the Squid - Users mailing list archive at Nabble.com.


Re: ONLY Cache certain Websites.

nuhll
always_direct allow all
and then my other config after it, or do I need to add it before?

Re: ONLY Cache certain Websites.

Igor Novgorodov
You should create an access list with the sites that you don't want to cache,
like:

always_direct allow acl_direct_sites

always_direct allow all will make ALL requests go direct, bypassing the cache.
Also see the cache deny directive.


On 04.08.2014 22:25, nuhll wrote:
> always_direct allow all
> and then my other code, or i need to add it before?
>
>
>
> --
> View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/ONLY-Cache-certain-Websites-tp4667121p4667136.html
> Sent from the Squid - Users mailing list archive at Nabble.com.


Re: ONLY Cache certain Websites.

nuhll
Thanks, but it's not possible to make a list of all the websites I might visit but don't want to cache xD.

Is there no way to send ALL websites direct EXCEPT just a few?

Re: ONLY Cache certain Websites.

Igor Novgorodov
Piece of cake:

always_direct deny acl_not_direct
always_direct allow all

On 05.08.2014 23:19, nuhll wrote:

> Thanks, but its not possible to make a list of all possible websites which i
> could visit but i dont want to cache xD.
>
> Is there no way to direct ALL websites direct EXCEPT only some websites?
>
>
>
> --
> View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/ONLY-Cache-certain-Websites-tp4667121p4667140.html
> Sent from the Squid - Users mailing list archive at Nabble.com.


Re: ONLY Cache certain Websites.

nuhll
Thanks for your answer.

I'll try to get it working, but I'm not sure how. I don't understand this "acl" system. I know there are a lot of tutorials out there, but none in my mother tongue, so I'm not able to fully understand such expert material.

Could you maybe show me at least one example of how to get it working? Also, maybe there are things I can remove?

This is non-profit, but I need it for my small network; we don't have fast internet, and when 10 computers download updates we can't use the telephone or anything... :-/


Here's my current config:

acl localnet src 192.168.0.0
acl all src all
acl localhost src 127.0.0.1

#access_log daemon:/var/log/squid/access.test.log squid

http_port 192.168.0.1:3128 transparent

cache_dir ufs /daten/squid 100000 16 256

range_offset_limit 100 MB windowsupdate
maximum_object_size 6000 MB
quick_abort_min -1


# Add one of these lines for each of the websites you want to cache.

refresh_pattern -i microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

#kaspersky update
refresh_pattern -i geo.kaspersky.com/.*\.(cab|dif|pack|q6v|2fv|49j|tvi|ez5|1nj|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

#nvidia updates
refresh_pattern -i download.nvidia.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

#java updates
refresh_pattern -i sdlc-esd.sun.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

# DONT MODIFY THESE LINES
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

#kaspersky update
acl kaspersky dstdomain geo.kaspersky.com

acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsupdate dstdomain sls.microsoft.com
acl windowsupdate dstdomain productactivation.one.microsoft.com
acl windowsupdate dstdomain ntservicepack.microsoft.com
acl CONNECT method CONNECT
acl wuCONNECT dstdomain www.update.microsoft.com
acl wuCONNECT dstdomain sls.microsoft.com

http_access allow kaspersky localnet
http_access allow CONNECT wuCONNECT localnet
http_access allow windowsupdate localnet

#test
http_access allow localnet
http_access allow all
http_access allow localhost
                                             

Re: ONLY Cache certain Websites.

Igor Novgorodov
Well, English isn't my native language either, but that doesn't hurt much :)

1. Define an access list (a text file with the domains you want to cache, one
domain per line):
acl domains_cache dstdomain "/etc/squid/lists/domains_cache.txt"

2. Define a rule that allows caching for these domains while denying all
others:
cache allow domains_cache
cache deny all

That's all; that wasn't so difficult :)

P.S.
The always_direct directive is for something a little different; it is used
with parent proxies, so just use "cache".
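Putting the two steps together, a minimal hedged sketch of the selective-caching part of squid.conf (the file path and the example domains are assumptions; a leading dot makes dstdomain match subdomains as well):

```
# Contents of /etc/squid/lists/domains_cache.txt (one domain per line), e.g.:
#   .windowsupdate.com
#   .microsoft.com
#   .kaspersky.com
acl domains_cache dstdomain "/etc/squid/lists/domains_cache.txt"
cache allow domains_cache   # objects from the listed domains may be stored
cache deny all              # everything else is fetched but never cached
```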


On 06.08.2014 21:33, nuhll wrote:

> Thanks for your answer.
>
> Ill try to get it working but im not sure how. I dont understand this "acl"
> system. I know there are alot of tutorials out there, but not in my mother
> language so im not able to fully understand such expert things.
>
> Could you maybe show me atleast at one exampel how to get it work? Also
> maybe there are things i can remove?
>
> Heres my actual list:
>
> acl localnet src 192.168.0.0
> acl all src all
> acl localhost src 127.0.0.1
>
> #access_log daemon:/var/log/squid/access.test.log squid
>
> http_port 192.168.0.1:3128 transparent
>
> cache_dir ufs /daten/squid 100000 16 256
>
> range_offset_limit 100 MB windowsupdate
> maximum_object_size 6000 MB
> quick_abort_min -1
>
>
> # Add one of these lines for each of the websites you want to cache.
>
> refresh_pattern -i
> microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
> reload-into-ims
>
> refresh_pattern -i
> windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
> 432000 reload-into-ims
>
> refresh_pattern -i
> windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
> reload-into-ims
>
> #kaspersky update
> refresh_pattern -i
> geo.kaspersky.com/.*\.(cab|dif|pack|q6v|2fv|49j|tvi|ez5|1nj|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip)
> 4320 80% 432000 reload-into-ims
>
> #nvidia updates
> refresh_pattern -i
> download.nvidia.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
> 432000 reload-into-ims
>
> #java updates
> refresh_pattern -i
> sdlc-esd.sun.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
> 432000 reload-into-ims
>
> # DONT MODIFY THESE LINES
> refresh_pattern ^ftp:           1440    20%     10080
> refresh_pattern ^gopher:        1440    0%      1440
> refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
> refresh_pattern .               0       20%     4320
>
> #kaspersky update
> acl kaspersky dstdomain geo.kaspersky.com
>
> acl windowsupdate dstdomain windowsupdate.microsoft.com
> acl windowsupdate dstdomain .update.microsoft.com
> acl windowsupdate dstdomain download.windowsupdate.com
> acl windowsupdate dstdomain redir.metaservices.microsoft.com
> acl windowsupdate dstdomain images.metaservices.microsoft.com
> acl windowsupdate dstdomain c.microsoft.com
> acl windowsupdate dstdomain www.download.windowsupdate.com
> acl windowsupdate dstdomain wustat.windows.com
> acl windowsupdate dstdomain crl.microsoft.com
> acl windowsupdate dstdomain sls.microsoft.com
> acl windowsupdate dstdomain productactivation.one.microsoft.com
> acl windowsupdate dstdomain ntservicepack.microsoft.com
> acl CONNECT method CONNECT
> acl wuCONNECT dstdomain www.update.microsoft.com
> acl wuCONNECT dstdomain sls.microsoft.com
>
> http_access allow kaspersky localnet
> http_access allow CONNECT wuCONNECT localnet
> http_access allow windowsupdate localnet
>
> #test
> http_access allow localnet
> http_access allow all
> http_access allow localhost
>                                              
>
>
>
> --
> View this message in context: http://squid-web-proxy-cache.1019090.n4.nabble.com/ONLY-Cache-certain-Websites-tp4667121p4667157.html
> Sent from the Squid - Users mailing list archive at Nabble.com.


Re: ONLY Cache certain Websites.

nuhll
Thanks for your help.

But I'm going crazy. =)

The internet is slow as hell. [fixed: wrong DNS server] I don't see any errors in the logs, and some services (Battle.net) are not working.

/etc/squid3/squid.conf
debug_options ALL,1 33,2
acl domains_cache dstdomain "/etc/squid/lists/domains_cache"
cache allow domains_cache
acl localnet src 192.168.0.0
acl all src all
acl localhost src 127.0.0.1
cache deny all

#access_log daemon:/var/log/squid/access.test.log squid

http_port 192.168.0.1:3128 transparent

cache_dir ufs /daten/squid 100000 16 256

range_offset_limit 100 MB windowsupdate
maximum_object_size 6000 MB
quick_abort_min -1


# Add one of these lines for each of the websites you want to cache.

refresh_pattern -i microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

refresh_pattern -i windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

#kaspersky update
refresh_pattern -i geo.kaspersky.com/.*\.(cab|dif|pack|q6v|2fv|49j|tvi|ez5|1nj|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

#nvidia updates
refresh_pattern -i download.nvidia.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

#java updates
refresh_pattern -i sdlc-esd.sun.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000 reload-into-ims

# DONT MODIFY THESE LINES
refresh_pattern ^ftp:           1440    20%     10080
refresh_pattern ^gopher:        1440    0%      1440
refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
refresh_pattern .               0       20%     4320

#kaspersky update
acl kaspersky dstdomain geo.kaspersky.com

acl windowsupdate dstdomain windowsupdate.microsoft.com
acl windowsupdate dstdomain .update.microsoft.com
acl windowsupdate dstdomain download.windowsupdate.com
acl windowsupdate dstdomain redir.metaservices.microsoft.com
acl windowsupdate dstdomain images.metaservices.microsoft.com
acl windowsupdate dstdomain c.microsoft.com
acl windowsupdate dstdomain www.download.windowsupdate.com
acl windowsupdate dstdomain wustat.windows.com
acl windowsupdate dstdomain crl.microsoft.com
acl windowsupdate dstdomain sls.microsoft.com
acl windowsupdate dstdomain productactivation.one.microsoft.com
acl windowsupdate dstdomain ntservicepack.microsoft.com

acl CONNECT method CONNECT
acl wuCONNECT dstdomain www.update.microsoft.com
acl wuCONNECT dstdomain sls.microsoft.com

http_access allow kaspersky localnet
http_access allow CONNECT wuCONNECT localnet
http_access allow windowsupdate localnet

#test
http_access allow localnet
http_access allow all
http_access allow localhost


/etc/squid/lists/domains_cache
microsoft.com
windowsupdate.com
windows.com
#nvidia updates
download.nvidia.com

#java updates
sdlc-esd.sun.com
#kaspersky
geo.kaspersky.com

/var/log/squid3/access.log
1407786051.567  17909 192.168.0.125 TCP_MISS/000 0 GET http://dist.blizzard.com.edgesuite.net/hs-pod/beta/EU/4944.direct/base-Win-deDE.MPQ - DIRECT/dist.blizzard.com.edgesuite.net -
1407786051.567  17909 192.168.0.125 TCP_MISS/000 0 GET http://llnw.blizzard.com/hs-pod/beta/EU/4944.direct/base-Win.MPQ - DIRECT/llnw.blizzard.com -
1407786054.161    132 192.168.0.125 TCP_MISS/200 247 GET http://heartbeat.dm.origin.com/pulse? - DIRECT/54.225.219.232 text/plain
1407786054.852  11891 192.168.0.125 TCP_MISS/200 440 POST http://www.netvibes.com/api/my/messagebar/ - DIRECT/193.189.143.34 application/json
1407786055.785    125 192.168.0.125 TCP_MISS/304 432 GET http://wiki.squid-cache.org/wiki/squidtheme/js/niftyCorners.css - DIRECT/77.93.254.178 -
1407786055.786    124 192.168.0.125 TCP_MISS/304 433 GET http://wiki.squid-cache.org/wiki/squidtheme/js/niftycube.js - DIRECT/77.93.254.178 -
1407786055.787    124 192.168.0.125 TCP_MISS/304 431 GET http://wiki.squid-cache.org/wiki/squidtheme/js/kutils.js - DIRECT/77.93.254.178 -
1407786055.788    122 192.168.0.125 TCP_MISS/304 433 GET http://wiki.squid-cache.org/wiki/squidtheme/css/common.css - DIRECT/77.93.254.178 -
1407786055.788    123 192.168.0.125 TCP_MISS/304 433 GET http://wiki.squid-cache.org/wiki/squidtheme/css/screen.css - DIRECT/77.93.254.178 -
1407786055.843     56 192.168.0.125 TCP_MISS/304 433 GET http://wiki.squid-cache.org/wiki/common/js/common.js - DIRECT/77.93.254.178 -
1407786055.844     54 192.168.0.125 TCP_MISS/304 433 GET http://wiki.squid-cache.org/wiki/squidtheme/img/squid-bubbles.png - DIRECT/77.93.254.178 -
1407786055.845     53 192.168.0.125 TCP_MISS/304 431 GET http://wiki.squid-cache.org/wiki/squidtheme/img/icon-info.png - DIRECT/77.93.254.178 -
1407786055.865  12623 192.168.0.125 TCP_MISS/200 76761 GET http://wiki.squid-cache.org/SquidFaq/BinaryPackages - DIRECT/77.93.254.178 text/html
1407786055.866     59 192.168.0.125 TCP_MISS/304 431 GET http://wiki.squid-cache.org/wiki/squidtheme/img/alert.png - DIRECT/77.93.254.178 -
1407786055.898     50 192.168.0.125 TCP_MISS/304 432 GET http://wiki.squid-cache.org/wiki/squidtheme/css/print.css - DIRECT/77.93.254.178 -
1407786055.901     53 192.168.0.125 TCP_MISS/304 432 GET http://wiki.squid-cache.org/wiki/squidtheme/css/projection.css - DIRECT/77.93.254.178 -
1407786056.050     52 192.168.0.125 TCP_MISS/200 1962 GET http://wiki.squid-cache.org/wiki/squid-favicon.ico - DIRECT/77.93.254.178 text/plain
1407786065.810  11174 192.168.0.125 TCP_MISS/200 743 POST http://eu.patch.battle.net:1119/patch - DIRECT/213.248.127.133 application/xml
1407786066.114  12043 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786066.116  11394 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786066.506    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786066.510    380 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786066.897    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786066.902    380 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786067.288    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786067.294    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786067.679    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786067.690    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786068.070    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786068.082    381 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786068.456    378 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786068.516    380 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786068.658     67 192.168.0.125 TCP_MISS/200 554 POST http://www.google-analytics.com/collect - DIRECT/173.194.32.200 image/gif
1407786068.840    380 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786069.084    380 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786069.227    380 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786069.476    380 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786069.628    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786069.867    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786070.030    381 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786070.258    380 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786070.422    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786070.648    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786070.814    379 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786071.208    381 192.168.0.125 TCP_MISS/302 497 POST http://iir.blizzard.com:3724/submit/BNET_APP - DIRECT/12.129.242.24 text/html
1407786078.971  36006 192.168.0.125 TCP_MISS/200 263 CONNECT 81.19.104.93:443 - DIRECT/81.19.104.93 -
1407786079.753  23506 192.168.0.125 TCP_MISS/200 4690 CONNECT nydus.battle.net:443 - DIRECT/12.129.242.30 -
1407786079.754  13935 192.168.0.125 TCP_MISS/200 4690 CONNECT nydus.battle.net:443 - DIRECT/12.129.242.30 -
1407786079.757  23509 192.168.0.125 TCP_MISS/200 4674 CONNECT nydus.battle.net:443 - DIRECT/12.129.242.30 -
1407786079.759  23506 192.168.0.125 TCP_MISS/200 4674 CONNECT nydus.battle.net:443 - DIRECT/12.129.242.30 -
1407786081.834  12077 192.168.0.125 TCP_MISS/200 395 GET http://eu.launcher.battle.net/service/wow/alert/de-de - DIRECT/80.239.186.21 text/plain
1407786081.836  12078 192.168.0.125 TCP_MISS/200 394 GET http://eu.launcher.battle.net/service/Hero/alert/de-de - DIRECT/80.239.186.21 text/plain
1407786082.093    148 192.168.0.125 TCP_MISS/200 672 POST http://www.netvibes.com/api/streams - DIRECT/193.189.143.34 application/json
1407786082.112  12351 192.168.0.125 TCP_MISS/200 395 GET http://us.launcher.battle.net/service/hs/alert/de-de - DIRECT/12.129.242.21 text/plain
1407786082.113  12350 192.168.0.125 TCP_MISS/200 394 GET http://us.launcher.battle.net/service/s2/alert/de-de - DIRECT/12.129.242.21 text/plain
1407786082.918     49 192.168.0.125 TCP_MISS/200 395 GET http://eu.launcher.battle.net/service/s2/alert/de-de - DIRECT/80.239.186.21 text/plain
1407786082.920     50 192.168.0.125 TCP_MISS/200 395 GET http://eu.launcher.battle.net/service/d3/alert/de-de - DIRECT/80.239.186.21 text/plain
1407786082.927   1074 192.168.0.125 TCP_MISS/200 4637 CONNECT nydus.battle.net:443 - DIRECT/12.129.242.30 -
1407786082.929   1076 192.168.0.125 TCP_MISS/200 4637 CONNECT nydus.battle.net:443 - DIRECT/12.129.242.30 -
1407786083.195     49 192.168.0.125 TCP_MISS/200 395 GET http://eu.launcher.battle.net/service/hs/alert/de-de - DIRECT/80.239.186.21 text/plain

/var/log/squid3/cache.log
2014/08/11 21:51:29| Squid Cache (Version 3.1.20): Exiting normally.
2014/08/11 21:53:04| Starting Squid Cache version 3.1.20 for x86_64-pc-linux-gnu...
2014/08/11 21:53:04| Process ID 32739
2014/08/11 21:53:04| With 65535 file descriptors available
2014/08/11 21:53:04| Initializing IP Cache...
2014/08/11 21:53:04| DNS Socket created at [::], FD 7
2014/08/11 21:53:04| DNS Socket created at 0.0.0.0, FD 8
2014/08/11 21:53:04| Adding nameserver 8.8.8.8 from squid.conf
2014/08/11 21:53:04| Adding nameserver 8.8.4.4 from squid.conf
2014/08/11 21:53:05| Unlinkd pipe opened on FD 13
2014/08/11 21:53:05| Local cache digest enabled; rebuild/rewrite every 3600/3600 sec
2014/08/11 21:53:05| Store logging disabled
2014/08/11 21:53:05| Swap maxSize 102400000 + 262144 KB, estimated 7897088 objects
2014/08/11 21:53:05| Target number of buckets: 394854
2014/08/11 21:53:05| Using 524288 Store buckets
2014/08/11 21:53:05| Max Mem  size: 262144 KB
2014/08/11 21:53:05| Max Swap size: 102400000 KB
2014/08/11 21:53:05| Version 1 of swap file with LFS support detected...
2014/08/11 21:53:05| Rebuilding storage in /daten/squid (CLEAN)
2014/08/11 21:53:05| Using Least Load store dir selection
2014/08/11 21:53:05| Current Directory is /
2014/08/11 21:53:05| Loaded Icons.
2014/08/11 21:53:05| Accepting  intercepted HTTP connections at 192.168.0.1:3128, FD 16.
2014/08/11 21:53:05| HTCP Disabled.
2014/08/11 21:53:05| Squid plugin modules loaded: 0
2014/08/11 21:53:05| Adaptation support is off.
2014/08/11 21:53:05| Ready to serve requests.
2014/08/11 21:53:05| Store rebuilding is 61.06% complete
2014/08/11 21:53:05| Done reading /daten/squid swaplog (6707 entries)
2014/08/11 21:53:05| Finished rebuilding storage from disk.
2014/08/11 21:53:05|      6707 Entries scanned
2014/08/11 21:53:05|         0 Invalid entries.
2014/08/11 21:53:05|         0 With invalid flags.
2014/08/11 21:53:05|      6707 Objects loaded.
2014/08/11 21:53:05|         0 Objects expired.
2014/08/11 21:53:05|         0 Objects cancelled.
2014/08/11 21:53:05|         0 Duplicate URLs purged.
2014/08/11 21:53:05|         0 Swapfile clashes avoided.
2014/08/11 21:53:05|   Took 0.03 seconds (256041.23 objects/sec).
2014/08/11 21:53:05| Beginning Validation Procedure
2014/08/11 21:53:05|   Completed Validation Procedure
2014/08/11 21:53:05|   Validated 13439 Entries
2014/08/11 21:53:05|   store_swap_size = 4682512
2014/08/11 21:53:05.796| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:06| storeLateRelease: released 0 objects
2014/08/11 21:53:12.147| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:12.148| ConnStateData::swanSong: FD 21
2014/08/11 21:53:12.152| ConnStateData::swanSong: FD 18
2014/08/11 21:53:12.152| ConnStateData::swanSong: FD 17
2014/08/11 21:53:14.260| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:14.263| ConnStateData::swanSong: FD 17
2014/08/11 21:53:14.640| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:14.644| ConnStateData::swanSong: FD 18
2014/08/11 21:53:14.684| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:14.685| ConnStateData::swanSong: FD 17
2014/08/11 21:53:14.954| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:14.955| ConnStateData::swanSong: FD 26
2014/08/11 21:53:15.067| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:15.068| ConnStateData::swanSong: FD 17
2014/08/11 21:53:15.339| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:15.344| ConnStateData::swanSong: FD 21
2014/08/11 21:53:15.459| ZPH: Preserving TOS on miss, TOS=0
...
2014/08/11 21:53:16.639| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:16.642| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:16.643| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:16.648| ConnStateData::swanSong: FD 21
2014/08/11 21:53:16.649| ConnStateData::swanSong: FD 28
2014/08/11 21:53:16.652| ConnStateData::swanSong: FD 38
2014/08/11 21:53:16.652| ConnStateData::swanSong: FD 23
2014/08/11 21:53:16.653| ConnStateData::swanSong: FD 36
2014/08/11 21:53:16.653| ConnStateData::swanSong: FD 29
2014/08/11 21:53:16.936| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:16.945| ConnStateData::swanSong: FD 32
2014/08/11 21:53:16.945| ConnStateData::swanSong: FD 40
2014/08/11 21:53:16.970| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:16.974| ConnStateData::swanSong: FD 18
2014/08/11 21:53:17.019| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.021| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.028| ConnStateData::swanSong: FD 33
2014/08/11 21:53:17.029| ConnStateData::swanSong: FD 23
2014/08/11 21:53:17.030| ConnStateData::swanSong: FD 34
2014/08/11 21:53:17.031| ConnStateData::swanSong: FD 28
2014/08/11 21:53:17.032| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.038| ConnStateData::swanSong: FD 17
2014/08/11 21:53:17.164| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.167| ConnStateData::swanSong: FD 26
2014/08/11 21:53:17.168| ConnStateData::swanSong: FD 18
2014/08/11 21:53:17.309| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.314| ConnStateData::swanSong: FD 18
2014/08/11 21:53:17.314| ConnStateData::swanSong: FD 31
2014/08/11 21:53:17.361| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.363| ConnStateData::swanSong: FD 36
2014/08/11 21:53:17.371| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.376| ConnStateData::swanSong: FD 21
2014/08/11 21:53:17.376| ConnStateData::swanSong: FD 28
2014/08/11 21:53:17.434| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.436| ConnStateData::swanSong: FD 30
2014/08/11 21:53:17.437| ZPH: Preserving TOS on miss, TOS=0
2014/08/11 21:53:17.438| ConnStateData::swanSong: FD 26
2014/08/11 21:53:17.505| ZPH: Preserving TOS on miss, TOS=0
...

I also upgraded to squid3. Now I get the following warnings at startup:
[....] Restarting Squid HTTP Proxy 3.x: squid32014/08/11 21:53:04| WARNING: (B) '::/0' is a subnetwork of (A) '::/0'
2014/08/11 21:53:04| WARNING: because of this '::/0' is ignored to keep splay tree searching predictable
2014/08/11 21:53:04| WARNING: You should probably remove '::/0' from the ACL named 'all'


Re: ONLY Cache certain Websites.

Amos Jeffries
Administrator
On 12/08/2014 7:57 a.m., nuhll wrote:

> Thanks for your help.
>
> But this is driving me crazy. =)
>
> The internet is painfully slow. I don't see any errors in the logs, and some
> services (Battle.net) are not working.
>
> /etc/squid3/squid.conf
> debug_options ALL,1 33,2
> acl domains_cache dstdomain "/etc/squid/lists/domains_cache"
> cache allow domains_cache
> acl localnet src 192.168.0.0
> acl all src all
> acl localhost src 127.0.0.1
> cache deny all
>
> #access_log daemon:/var/log/squid/access.test.log squid
>
> http_port 192.168.0.1:3128 transparent
>
> cache_dir ufs /daten/squid 100000 16 256
>
> range_offset_limit 100 MB windowsupdate
> maximum_object_size 6000 MB
> quick_abort_min -1
>
>
> # Add one of these lines for each of the websites you want to cache.
>
> refresh_pattern -i
> microsoft.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
> reload-into-ims
>
> refresh_pattern -i
> windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
> 432000 reload-into-ims
>
> refresh_pattern -i
> windows.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80% 432000
> reload-into-ims
>
> #kaspersky update
> refresh_pattern -i
> geo.kaspersky.com/.*\.(cab|dif|pack|q6v|2fv|49j|tvi|ez5|1nj|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip)
> 4320 80% 432000 reload-into-ims
>
> #nvidia updates
> refresh_pattern -i
> download.nvidia.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
> 432000 reload-into-ims
>
> #java updates
> refresh_pattern -i
> sdlc-esd.sun.com/.*\.(cab|exe|ms[i|u|f]|[ap]sf|wm[v|a]|dat|zip) 4320 80%
> 432000 reload-into-ims
>
> # DONT MODIFY THESE LINES
> refresh_pattern \^ftp:           1440    20%     10080
> refresh_pattern \^gopher:        1440    0%      1440
> refresh_pattern -i (/cgi-bin/|\?) 0     0%      0
> refresh_pattern .               0       20%     4320
>
> #kaspersky update
> acl kaspersky dstdomain geo.kaspersky.com
>
> acl windowsupdate dstdomain windowsupdate.microsoft.com
> acl windowsupdate dstdomain .update.microsoft.com
> acl windowsupdate dstdomain download.windowsupdate.com
> acl windowsupdate dstdomain redir.metaservices.microsoft.com
> acl windowsupdate dstdomain images.metaservices.microsoft.com
> acl windowsupdate dstdomain c.microsoft.com
> acl windowsupdate dstdomain www.download.windowsupdate.com
> acl windowsupdate dstdomain wustat.windows.com
> acl windowsupdate dstdomain crl.microsoft.com
> acl windowsupdate dstdomain sls.microsoft.com
> acl windowsupdate dstdomain productactivation.one.microsoft.com
> acl windowsupdate dstdomain ntservicepack.microsoft.com
>
> acl CONNECT method CONNECT
> acl wuCONNECT dstdomain www.update.microsoft.com
> acl wuCONNECT dstdomain sls.microsoft.com
>
> http_access allow kaspersky localnet
> http_access allow CONNECT wuCONNECT localnet
> http_access allow windowsupdate localnet
>
> #test
> http_access allow localnet
> http_access allow all
> http_access allow localhost
>
>
> /etc/squid/lists/domains_cache
> microsoft.com
> windowsupdate.com
> windows.com
> #nvidia updates
> download.nvidia.com
>
> #java updates
> sdlc-esd.sun.com
> #kaspersky
> geo.kaspersky.com
>
> /var/log/squid3/access.log
> 1407786051.567  17909 192.168.0.125 TCP_MISS/000 0 GET
> http://dist.blizzard.com.edgesuite.net/hs-pod/beta/EU/4944.direct/base-Win-deDE.MPQ
> - DIRECT/dist.blizzard.com.edgesuite.net -
> 1407786051.567  17909 192.168.0.125 TCP_MISS/000 0 GET
> http://llnw.blizzard.com/hs-pod/beta/EU/4944.direct/base-Win.MPQ -
> DIRECT/llnw.blizzard.com -

The blizzard.com servers did not produce a response for these requests.
Squid waited almost 18 seconds and nothing came back.

TCP window scaling, ECN, Path-MTU discovery and ICMP blocking are things to
look for here. Any one of them could stop the connection from
transmitting or receiving properly.

The rest of the log shows working traffic, even for battle.net. I
suspect battle.net uses non-80 ports, right? I doubt those are being
intercepted in your setup.
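For reference, a typical interception setup on the gateway only redirects port 80 into Squid, so anything on another port (for example the battle.net traffic to port 3724 visible in the access.log above) bypasses the proxy entirely. A minimal sketch of such a rule, assuming iptables NAT on the Debian box; the interface name and subnet mask are assumptions, not taken from the thread:

```
# Redirect only plain HTTP from the LAN to Squid's intercept port.
# Traffic on any other port (e.g. 3724 or 443) is left untouched.
iptables -t nat -A PREROUTING -i eth0 -s 192.168.0.0/24 \
    -p tcp --dport 80 -j REDIRECT --to-port 3128
```

With a rule like this in place, non-HTTP protocols never reach Squid, which is why they neither appear in its logs nor can be broken by it.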

> /var/log/squid3/cache.log
> 2014/08/11 21:51:29| Squid Cache (Version 3.1.20): Exiting normally.
> 2014/08/11 21:53:04| Starting Squid Cache version 3.1.20 for
> x86_64-pc-linux-gnu...

Hmm. Which version of Debian (or derived OS) are you using? And can you
update it to the latest stable? The squid3 package has been at 3.3.8 for
most of a year now.

> 2014/08/11 21:53:04| Process ID 32739
> 2014/08/11 21:53:04| With 65535 file descriptors available
> 2014/08/11 21:53:04| Initializing IP Cache...
> 2014/08/11 21:53:04| DNS Socket created at [::], FD 7
> 2014/08/11 21:53:04| DNS Socket created at 0.0.0.0, FD 8
> 2014/08/11 21:53:04| Adding nameserver 8.8.8.8 from squid.conf
> 2014/08/11 21:53:04| Adding nameserver 8.8.4.4 from squid.conf
> 2014/08/11 21:53:05| Unlinkd pipe opened on FD 13
> 2014/08/11 21:53:05| Local cache digest enabled; rebuild/rewrite every
> 3600/3600 sec
> 2014/08/11 21:53:05| Store logging disabled
> 2014/08/11 21:53:05| Swap maxSize 102400000 + 262144 KB, estimated 7897088
> objects
> 2014/08/11 21:53:05| Target number of buckets: 394854
> 2014/08/11 21:53:05| Using 524288 Store buckets
> 2014/08/11 21:53:05| Max Mem  size: 262144 KB
> 2014/08/11 21:53:05| Max Swap size: 102400000 KB
> 2014/08/11 21:53:05| Version 1 of swap file with LFS support detected...
> 2014/08/11 21:53:05| Rebuilding storage in /daten/squid (CLEAN)
> 2014/08/11 21:53:05| Using Least Load store dir selection
> 2014/08/11 21:53:05| Current Directory is /
> 2014/08/11 21:53:05| Loaded Icons.
> 2014/08/11 21:53:05| Accepting  intercepted HTTP connections at
> 192.168.0.1:3128, FD 16.
> 2014/08/11 21:53:05| HTCP Disabled.
> 2014/08/11 21:53:05| Squid plugin modules loaded: 0
> 2014/08/11 21:53:05| Adaptation support is off.
> 2014/08/11 21:53:05| Ready to serve requests.
> 2014/08/11 21:53:05| Store rebuilding is 61.06% complete
> 2014/08/11 21:53:05| Done reading /daten/squid swaplog (6707 entries)
<snip>

Okay, the storage rebuild completed. That is normally the first thing to
check when Squid is super-slow right after startup. But it seems fine
for now, since the cache is almost empty.

>
> I also upgraded to squid3. Now i get following infos at start:
> [....] Restarting Squid HTTP Proxy 3.x: squid32014/08/11 21:53:04| WARNING:
> (B) '::/0' is a subnetwork of (A) '::/0'
> 2014/08/11 21:53:04| WARNING: because of this '::/0' is ignored to keep
> splay tree searching predictable
> 2014/08/11 21:53:04| WARNING: You should probably remove '::/0' from the ACL
> named 'all'

Remove "acl all src all" from your config file when using squid3. It is
pre-defined.
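A trimmed ACL section along those lines might look like this (a sketch only; the /24 mask is an assumption, since the config above used the incomplete "192.168.0.0"):

```
# squid3 pre-defines 'all', 'localhost' and 'CONNECT' -- do not redeclare them.
acl localnet src 192.168.0.0/24

http_access allow localnet
http_access allow localhost
http_access deny all
```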

Amos
Re: ONLY Cache certain Websites.

nuhll
Hello,
thanks for your help.

I fixed the slow issue myself: I forgot to add nameservers, so Squid was using the local DNS, which of course fakes some IPs... I added the dns_nameservers directive and it is fast again.


root@debian-server:~# cat /proc/version
Linux version 3.2.0-4-amd64 (debian-kernel@lists.debian.org) (gcc version 4.6.3 (Debian 4.6.3-14) ) #1 SMP Debian 3.2.60-1+deb7u3

If I look at http://wiki.squid-cache.org/SquidFaq/BinaryPackages#Debian, 3.1 is the newest? Am I wrong?

You tell me that Squid can't connect to some servers. How? It's just connected to a normal fritz.box, nothing special, nothing that could block, or am I missing something?

iptables -L
Chain INPUT (policy ACCEPT)
target     prot opt source               destination

Chain FORWARD (policy ACCEPT)
target     prot opt source               destination

Chain OUTPUT (policy ACCEPT)
target     prot opt source               destination

I don't know which ports Battle.net uses. What happens when Battle.net doesn't use port 80? Does Squid not "redirect" it?

I wouldn't have these problems if it were just possible to ONLY redirect SOME traffic (only Windows Updates, Kaspersky, Nvidia, etc.) to the Squid proxy. But I haven't found a way...
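One way to do exactly that with the existing DHCP-distributed PAC file is to return the proxy only for the update domains and DIRECT for everything else. A sketch follows; the domain list and proxy address are examples to adapt, and `dnsDomainIs` (normally supplied by the browser's PAC engine) is re-implemented here so the file can be exercised outside a browser:

```javascript
// Minimal proxy.pac sketch: only update-related domains go through Squid;
// everything else goes DIRECT and never touches the proxy.

// dnsDomainIs() is normally provided by the PAC runtime; redefined here
// so this file is self-contained and testable standalone.
function dnsDomainIs(host, domain) {
  return host.length >= domain.length &&
         host.substring(host.length - domain.length) === domain;
}

var PROXY = "PROXY 192.168.0.1:3128";   // assumed Squid address
var CACHED_DOMAINS = [                   // example list -- adjust to taste
  ".windowsupdate.com",
  ".update.microsoft.com",
  ".microsoft.com",
  ".kaspersky.com",
  ".nvidia.com",
  ".apple.com"
];

function FindProxyForURL(url, host) {
  for (var i = 0; i < CACHED_DOMAINS.length; i++) {
    if (dnsDomainIs(host, CACHED_DOMAINS[i])) {
      return PROXY;   // cacheable update traffic -> Squid
    }
  }
  return "DIRECT";    // everything else bypasses Squid
}
```

Served via DHCP as before, this would stop the whole LAN's traffic from funnelling through Squid while still catching the update hosts; HTTPS update traffic would of course still be uncacheable.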

BTW: I removed "acl all src all" and now there are no errors.
Re: ONLY Cache certain Websites.

nuhll
Someone sent me his config file. I got nearly everything working, except Battle.net. This problem seems to be known, but I don't know how to fix it.

http://stackoverflow.com/questions/24933962/squid-proxy-blocks-battle-net
https://forum.pfsense.org/index.php?topic=72271.0

It's probably because of this error (it appears again):

2014/08/11 21:53:17.363| ConnStateData::swanSong: FD 36
2014/08/11 21:53:17.371| ZPH: Preserving TOS on miss, TOS=0

That's my current config:
#
#Recommended minimum configuration:
#
always_direct allow all
debug_options ALL,1 33,2

# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 192.168.0.0/16 # RFC 1918 possible internal network
acl localnet src fc00::/7       # RFC 4193 local private network range
acl localnet src fe80::/10      # RFC 4291 link-local (directly plugged) machines
acl Safe_ports port 1-65535     # effectively treats every port as safe
acl CONNECT method GET POST HEAD CONNECT PUT DELETE
#acl block-fnes urlpath_regex -i .*/fnes/echo
acl noscan dstdomain .symantecliveupdate.com liveupdate.symantec.com psi3.secunia.com update.immunet.com
acl video urlpath_regex -i \.(m2a|avi|mov|mp(e?g|a|e|1|2|3|4)|m1s|mp2v|m2v|m2s|wmx|rm|rmvb|3pg|3gpp|omg|ogm|asf|asx|wmv|m3u8|flv|ts)

#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost

no_cache deny noscan
always_direct allow noscan
always_direct allow video

# Deny requests to certain unsafe ports

# Deny CONNECT to other than secure SSL ports

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on .localhost. is a local user
#http_access deny to_localhost

#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
#cache_peer 192.168.1.1 parent 8080 0 default no-query no-digest
#no-netdb-exchange
#never_direct allow all

# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed

http_access allow all

# allow localhost always proxy functionality

# And finally deny all other access to this proxy

# Squid normally listens to port 3128
http_port 192.168.0.1:3128 intercept

# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?

# Uncomment and adjust the following to add a disk cache directory.
maximum_object_size 5000 MB
#store_dir_select_algorithm round-robin
cache_dir aufs /daten/squid 100000 16 256


# Leave coredumps in the first cache dir
coredump_dir /daten/squid

# Add any of your own refresh_pattern entries above these.
# General Rules
refresh_pattern -i \.(jpg|gif|png|webp|jpeg|ico|bmp|tiff|bif|ver|pict|pixel|bs)$ 220000 90% 300000 override-expire ignore-no-store ignore-private ignore-auth refresh-ims
refresh_pattern -i \.(js|css|class|swf|wav|dat|zsci|do|ver|advcs|woff|eps|ttf|svg|svgz|ps|acsm|wma)$ 220000 90% 300000 override-expire ignore-no-store ignore-private ignore-auth refresh-ims
refresh_pattern -i \.(html|htm|crl)$ 220000 90% 259200 override-expire ignore-no-store ignore-private ignore-auth refresh-ims
refresh_pattern -i \.(xml|flow)$ 0 90% 100000
refresh_pattern -i \.(json)$ 1440 90% 5760
refresh_pattern -i \.(bin|deb|rpm|drpm|exe|zip|tar|tgz|bz2|ipa|bz|ram|rar|bin|uxx|gz|crl|msi|dll|hz|cab|psf|vidt|apk|wtex|hz|ipsw)$ 262974 90% 262974 override-expire ignore-no-store ignore-private ignore-auth refresh-ims
refresh_pattern -i \.(ppt|pptx|doc|docx|pdf|xls|xlsx|csv|txt)$ 202974 90% 262974 override-expire ignore-no-store ignore-private ignore-auth refresh-ims
refresh_pattern -i ^ftp: 22974 90% 262974
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern -i . 0 90% 259200

#windows update
refresh_pattern -i microsoft.com/.*\.(cab|exe|ms[i|u|f]|asf|wma|dat|zip)$ 202974 80% 262974
refresh_pattern -i windowsupdate.com/.*\.(cab|exe|ms[i|u|f]|asf|wma|dat|zip)$ 202974 80% 262974
refresh_pattern -i windows.com/.*\.(cab|exe|ms[i|u|f]|asf|wma|dat|zip)$ 202974 80% 262974

#antivir
refresh_pattern -i ^http:\/\/liveupdate\.symantecliveupdate\.com.*\.(zip)$ 0 0% 262974
refresh_pattern -i .*dnl.*\.geo\.kaspersky\.com/.*(zip|avc|kdc) 202974 100% 262974 ignore-no-cache ignore-reload  reload-into-ims
refresh_pattern -i .*\.avg.com/.*\.(bin) 2160 100% 262974 ignore-no-cache reload-into-ims
refresh_pattern -i .*\.avast.com/.*\.(vpu|vpaa) 2160 100% 262974 ignore-no-cache reload-into-ims
refresh_pattern -i .*\.kaspersky-labs.com/.*\.(cab|zip|exe|msi|msp) 4320 100% 262974 ignore-no-cache reload-into-ims
refresh_pattern -i .*\.kaspersky.com/.*\.(cab|zip|exe|msi|msp|avc) 2160 100% 262974 ignore-no-cache reload-into-ims
refresh_pattern -i .*\.nai.com/.*\.(gem|zip|mcs) 2160 100% 262974 ignore-no-cache reload-into-ims

#nvidia updates
refresh_pattern -i de.download.nvidia.com/.*\.(cab|exe|ms[i|u|f]|asf|wma|dat|zip)$ 220000 80% 262974

#apple updates
refresh_pattern -i appldnld\.apple\.com 202974 100% 262974 ignore-reload ignore-no-store override-expire override-lastmod ignore-must-revalidate


log_icp_queries off
icp_port 0
htcp_port 0
snmp_port 3401
acl snmppublic snmp_community public
snmp_access allow snmppublic all
minimum_object_size 0 KB
buffered_logs on
cache_effective_user proxy
#header_replace User-Agent Mozilla/5.0 (X11; U;) Gecko/20080221 Firefox/2.0.0.9
vary_ignore_expire on
cache_swap_low 90
cache_swap_high 95
visible_hostname shadow
unique_hostname shadow-DHS
shutdown_lifetime 0 second
request_header_max_size 256 KB
half_closed_clients off
max_filedesc 65535
connect_timeout 10 second
cache_effective_group proxy
#access_log /var/log/squid/access.log squid
#access_log daemon:/var/log/squid3/access.test.log squid
client_db off
dns_nameservers 192.168.0.10
ipcache_size 1024
fqdncache_size 1024
positive_dns_ttl 24 hours
negative_dns_ttl 5 minutes
#itcp_outgoing_address 192.168.2.2
dns_v4_first on
check_hostnames off
forwarded_for delete
via off
#pinger_enable off
#memory_replacement_policy heap LFUDA
#cache_replacement_policy heap LFUDA
cache_mem 2048 MB
maximum_object_size_in_memory 512 KB
#memory_cache_mode disk
cache_store_log none
read_ahead_gap 50 MB
pipeline_prefetch on
reload_into_ims on
quick_abort_min -1 KB


I found some info about this error: https://www.google.de/webhp?sourceid=chrome-instant&ion=1&espv=2&es_th=1&ie=UTF-8#q=ZPH%3A%20Preserving%20TOS%20on%20miss%2C%20TOS%3D0&safe=off

But I don't know how to fix it, and I also don't know whether it would help. Maybe something different is wrong.
Re: ONLY Cache certain Websites.

Amos Jeffries
Administrator
On 16/08/2014 8:02 a.m., nuhll wrote:
> I got nearly all working. Except Battle.net. This problem seems to known, but
> i dont know how to fix.
>
> http://stackoverflow.com/questions/24933962/squid-proxy-blocks-battle-net

That post displays a perfectly working proxy transaction. No sign of an
error anywhere.


> https://forum.pfsense.org/index.php?topic=72271.0
>

It contains three solutions, all of which essentially amount to turning
on PNP at the router.

Amos
Re: ONLY Cache certain Websites.

nuhll
What is PNP? Do you mean UPnP? It's enabled. I don't understand RU. If I were able to read and understand it, why do you think I posted it here? Just so that you could tell me that's the answer?! BTW, my router is a FRITZ!Box 7390.

If I google "7390 pnp" I don't find any useful information.