ssl bump and url_rewrite_program (like squidguard)


ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué
Hi community,

I've followed
http://wiki.squid-cache.org/ConfigExamples/Intercept/SslBumpExplicit to
set up my server. It looks really interesting and it's said to be the most
common configuration.
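
For context, the ssl_bump part of that wiki example is roughly this (the
certificate path and cache size are the wiki's placeholders, not
necessarily my exact values):

   http_port 3128 ssl-bump \
     cert=/etc/squid/ssl_cert/myCA.pem \
     generate-host-certificates=on dynamic_cert_mem_cache_size=4MB
   acl step1 at_step SslBump1
   ssl_bump peek step1
   ssl_bump bump all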

I often observe this (example here with www.youtube.com):
***************************
The following error was encountered while trying to retrieve the URL:
https://http/*

     *Unable to determine IP address from host name "http"*

The DNS server returned:

     Name Error: The domain name does not exist.
****************************

This happens while the browser (Firefox) is trying to get a frame at
https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386?

Those are ads, so I don't mind that one too much...

But this leads me to the fact that I get this behavior every time a site is
banned by squidGuard.

Is there something I can do to avoid this behavior? I mean, squidGuard
should send:

*********************************
   Access denied

Supplementary info :
Client address = 192.168.XXX.XXX
Client name = 192.168.XXX.XXX
User ident =
Client group = XXXXXXX
URL = https://ad.doubleclick.net/
Target class = ads

If this is wrong, contact your administrator
**********************************

squidGuard is a url_rewrite_program that appears to respect Squid's
requirements. The redirect looks like this:
http://proxyweb.myserver.mydomain/cgi-bin/squidGuard-simple.cgi?clientaddr=...
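
For reference, the squidGuard side is just a standard redirect directive,
something like this (the destination class name and list file are only an
example; the CGI parameters are squidGuard's usual macros):

   dest ads {
     domainlist ads/domains
     redirect http://proxyweb.myserver.mydomain/cgi-bin/squidGuard-simple.cgi?clientaddr=%a&clientname=%n&clientuser=%i&clientgroup=%s&targetgroup=%t&url=%u
   }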

I've played around with changing the redirect URL, and it leads me to
the idea that ssl_bump tries to analyse the part up to the ":". Is there a
way to avoid this? Is it just a configuration matter?

Could adding an ssl_bump rule saying "every server whose name matches
"http" or "https" should be spliced" solve the problem?

Regards, EG



Re: ssl bump and url_rewrite_program (like squidguard)

Marcus Kool
I suspect that the problem is that you redirect a HTTPS-based URL to an HTTP URL and Squid does not like that.

Marcus




Re: ssl bump and url_rewrite_program (like squidguard)

Amos Jeffries
On 4/11/2015 12:48 p.m., Marcus Kool wrote:
> I suspect that the problem is that you redirect a HTTPS-based URL to an
> HTTP URL and Squid does not like that.
>
> Marcus
>

No, it is apparently the fact that the domain name being redirected to is
"http".

As in: "http://http/something"


Which brings up the question of why you are using SG to block adverts?

squid.conf:
 acl ads dstdomain .doubleclick.net
 http_access deny ads
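
And if you want the client to see a block page instead of Squid's default
error page, deny_info can point that ACL at a URL. A sketch only (the block
page location is just an example, not something from your setup):

 deny_info 302:http://proxyweb.myserver.mydomain/blocked.html ads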

Amos


Re: ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué
On 04/11/2015 11:00, Amos Jeffries wrote:
> On 4/11/2015 12:48 p.m., Marcus Kool wrote:
>> I suspect that the problem is that you redirect a HTTPS-based URL to an
>> HTTP URL and Squid does not like that.
>>
>> Marcus
To give it a try in that direction, I now redirect to an HTTPS server. And I get:

The following error was encountered while trying to retrieve the URL: https://https/*

Unable to determine IP address from host name https

The DNS server returned:

Name Error: The domain name does not exist.

Moreover, this would sometimes lead to redirecting an HTTP-based URL to an HTTPS URL, and I don't know how much Squid likes that either.

> No, it is apparently the fact that the domain name being redirected to is
> "http".
>
> As in: "http://http/something"

I can assure you my rewrite URL looks like "https://proxyweb.xxxxx.xxxxx/var1=xxxx&...".

And this confirms that ssl_bump parses this result and takes the left part before the ":". To play with it, I also redirected to "proxyweb.xxxxx.xxxxx:443/var1=xxxx&..." (i.e. I removed the "https://" and added ":443") to force the parsing. Then I don't get this message anymore, but Firefox goes crazy, expecting the ad.doubleclick.net certificate and getting the proxyweb.xxxxx.xxxxx one. And of course it breaks my SG configuration, so it can't be a production solution.
> Which brings up the question of why you are using SG to block adverts?
>
> squid.conf:
>  acl ads dstdomain .doubleclick.net
>  http_access deny ads
>
> Amos


I don't use SG specifically to block adverts, I use it to block 90% of the web. Here it's just an example with ads, but it could be so many other things...

I just want to try to make SG and ssl_bump live together.

Is it possible to have a rule like "if it has been rewritten then don't try to ssl_bump"?

Regards, EG


Re: ssl bump and url_rewrite_program (like squidguard)

Marcus Kool
You need to know what squidGuard actually sends to Squid.
squidGuard does not have a debug option for this, so you have to set
    debug_options ALL,1 61,9
in squid.conf to see what Squid receives.
I bet that what Squid receives is what it complains about:
the URL starts with 'https://http'

Marcus


Re: ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué
Hi Marcus,

Well, it's just a URL rewriter program. You can test it from the
command line:
echo "URL" | /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf

Before I understood it was possible to specify the redirect code, I got this:
#> echo
"https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386?
- - GET"|/usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf
#> OK
rewrite-url="https://proxyweb.XXXXX.XXXXX/cgi-bin/squidGuard-simple.cgi?clientaddr=-pipo&clientname=&clientuser=&clientgroup=default&targetgroup=unknown&url=https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386?"

After a little change in squidGuard.conf, I get:
#> OK status=302
url="https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=-pipo&clientname=&clientuser=&clientgroup=default&targetgroup=unknown&url=https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386?"

It's not handled much better by my browser, which shows a "can't connect to
https://ad.doubleclick.net" message. But I don't get the Squid error
about http/https anymore.
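
As far as I understand the Squid 3.5 helper protocol, the difference
between the two outputs above is this (my reading of the documentation,
so take it as a sketch):

   OK rewrite-url="https://proxyweb..."     -> Squid silently fetches the
                                               new URL itself
   OK status=302 url="https://proxyweb..."  -> Squid returns a 302 redirect
                                               to the client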

It may be that url_rewrite_program comes after the peek-and-splice stuff,
leading Squid into an unpredictable situation. Is there a way to influence
the order in which things happen in Squid?

Regards, EG



Re: ssl bump and url_rewrite_program (like squidguard)

Amos Jeffries
On 5/11/2015 11:55 a.m., Edouard Gaulué wrote:

> [...]
>
> It's not handled much better by my browser, which shows a "can't connect to
> https://ad.doubleclick.net" message. But I don't get the Squid error
> about http/https anymore.


What Squid version?
 There was a bug about the wrong SNI being sent to servers on bumped
traffic that got re-written. That got fixed in Squid-3.5.7 and
re-writers should have been fully working since then.

Note that CONNECT requests should not be re-written though. We don't
prevent it automatically because it is sometimes actually useful, but SG
cannot handle them correctly.
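
For example, something like this in squid.conf keeps CONNECT requests away
from the rewriter (a sketch only; the CONNECT acl is the standard one from
the default config):

 acl CONNECT method CONNECT
 url_rewrite_access deny CONNECT
 url_rewrite_access allow all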

>
> It may be that rewrite_rule_program come after peek and splice stuff
> leading squid to an unpredictable situation. Is there a way to play on
> order things happen in squid?

Use debug_options to raise the amount of output each part of Squid produces
in cache.log.

A list of the sections can be found at
<http://wiki.squid-cache.org/KnowledgeBase/DebugSections> - slightly
outdated, but not much changes with these. Or see the latest list in
doc/debug-sections.txt of the Squid sources.
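
For the redirector interaction specifically, something along these lines
is a reasonable starting point (61 is the redirector, 85 and 88 the
client-side request/reply handling, 11 the client side; the exact levels
are a guess):

 debug_options ALL,1 61,9 85,3 88,3 11,2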

Amos


Re: ssl bump and url_rewrite_program (like squidguard)

Marcus Kool


On 11/04/2015 08:55 PM, Edouard Gaulué wrote:

> [...]
> After a little change in the squidguard.conf, I get:
> #> OK status=302
> url="https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=-pipo&clientname=&clientuser=&clientgroup=default&targetgroup=unknown&url=https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386?"

This looks fine, so now you need to look at Squid and set the debug options to find out what it is doing.

Note that squidGuard does not percent-escape the URL parameter as it should (see RFC 3986).
This is, however, most likely not the cause of the issue that you are seeing.
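
For illustration only, a percent-escaped url= parameter would start roughly
like this:

 ...&url=https%3A%2F%2Fad.doubleclick.net%2FN4061%2Fadi%2Fcom.ythome%2F_default%3Bsz%3D970x250...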

Marcus


Re: ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué
Hi Marcus, Amos and maybe others,

Here is where I am. I've looked at the logs. Let me describe what I observe.
It may be linked to some other posts I've read.

Imagine I try to connect to http://ad.doubleclick.net/ad.jpg. I observe
the request in Wireshark. It goes to the Squid process: there is no SSL
involved, so no bump. SquidGuard sends its redirect to Squid and Squid
sends this to the browser:
HTTP/1.1 302 Found
Server: squid/3.5.10
Date: Wed, 11 Nov 2015 22:49:44 GMT
Content-Length: 0
Location:
https://proxyweb.xxx.xxx/cgi-bin/squidGuard-simple.cgi?clientaddr=xxx&clientgroup=low-ip&targetgroup=adv&url=http://ad.doubleclick.net/ad.jpg
X-Cache: MISS from squid
X-Cache-Lookup: MISS from squid:3128
Via: 1.1 squid (squid/3.5.10)
Connection: keep-alive

The browser next sends a request to proxyweb: everything is fine.


Now let's add some SSL stuff and connect to
https://ad.doubleclick.net/ad.jpg. I observe the request in Wireshark.
It goes to the Squid process with SSL. SquidGuard sends its redirect to
Squid and Squid sends this to the browser:
HTTP/1.1 302 Found
Server: squid/3.5.10
Date: Wed, 11 Nov 2015 22:49:44 GMT
Content-Length: 0
Location:
https://proxyweb.xxx.xxx/cgi-bin/squidGuard-simple.cgi?clientaddr=xxx&clientgroup=low-ip&targetgroup=adv&url=http://ad.doubleclick.net/ad.jpg
X-Cache: MISS from squid

The browser next tries to send a direct request to
https://ad.doubleclick.net, which never completes as this machine isn't
allowed to go to the Internet directly.

Why is the REDIRECT not the same with and without SSL? It looks like it
disturbs the browser (at least Firefox and IE).

I can also provide Squid logs, but tell me which ones, because I've got a lot...

Regards, EG



Re: ssl bump and url_rewrite_program (like squidguard)

Marcus Kool


On 11/12/2015 07:03 AM, Edouard Gaulué wrote:

> [...]
>
> The browser next tries to send a direct request to https://ad.doubleclick.net, which never completes as this machine isn't allowed to go to the Internet directly.

This is the strange part! Why would the browser connect to ad.doubleclick.net if SG redirected the request to proxyweb.xxx.xxx?
I think you need to increase Squid's debug levels and show the fragments of cache.log where Squid receives the redirection from SG and what happens next.

> Why is the REDIRECT not the same with and without SSL? It looks like it disturbs the browser (at least Firefox and IE).

SSL is a protocol designed not to be tampered with. The SSL protocol has no support for redirection, so any redirection by Squid or another proxy is an attempt to break the SSL protocol.
Redirection with HTTP is simple because the HTTP protocol has a built-in mechanism for redirection that proxies can use.
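
Schematically, all the browser expects back from an explicit proxy for an
HTTPS URL is a tunnel; browsers generally treat anything other than a 2xx
reply to the CONNECT as a failure rather than following it as a redirect:

 CONNECT ad.doubleclick.net:443 HTTP/1.1
 Host: ad.doubleclick.net:443

 HTTP/1.1 200 Connection established
 (the TLS handshake and the real GET then happen inside the tunnel)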

Marcus


Re: ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué
Hi again,

Just forget what I said about the REDIRECT answers: they are the same with
or without SSL (it was a side effect of "-C5" in my grep on the logs).

But why are browsers handling it in a different way?

Without SSL, it's all right. With SSL, the browser comes to the conclusion
that it should try to connect directly...

Regards, EG


Re: ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué
Hi Marcus and all,

I have debug_options ALL,2 61,9.

The logs don't tell me a lot; the squidGuard answer is exactly the same with
or without SSL.

=======================

2015/11/12 11:51:13.320 kid1| 11,2| client_side.cc(2345)
parseHttpRequest: HTTP Client local=192.168.0.233:3128
remote=192.168.0.74:52719 FD 32 flags=1
2015/11/12 11:51:13.320 kid1| 11,2| client_side.cc(2346)
parseHttpRequest: HTTP Client REQUEST:
---------
GET
http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
HTTP/1.1
Host: ad.doubleclick.net
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0)
Gecko/20100101 Firefox/42.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: fr,fr-FR;q=0.8,en-US;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Cookie:
id=22444c07d901000f||t=1399896339|et=730|cs=002213fd48651016fb03856b79;
IDE=AHWqTUlZo9sH_j9svI23Ge8QFYiXp8lJDU2dwdeEJthW3WouVnYC__mRag
Connection: keep-alive


----------
2015/11/12 11:51:13.361 kid1| 85,2| client_side_request.cc(741)
clientAccessCheckDone: The request GET
http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
is ALLOWED; last ACL checked: localnet
2015/11/12 11:51:13.362 kid1| 23,2| url.cc(393) urlParse: urlParse: URI
has whitespace: {icap://127.0.0.1:1344/squidclamav ICAP/1.0
}
2015/11/12 11:51:13.363 kid1| 61,5| redirect.cc(292) redirectStart:
redirectStart:
'http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?'
2015/11/12 11:51:13.363 kid1| 61,6| redirect.cc(281)
constructHelperQuery: sending
'http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
192.168.0.74/192.168.0.74 - GET myip=192.168.0.233 myport=3128
' to the redirector helper
2015/11/12 11:51:13.363 kid1| 61,5| redirect.cc(82) redirectHandleReply:
reply={result=OK, notes={status: 302; url:
https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv; 
}}
2015/11/12 11:51:13.363 kid1| 85,2| client_side_request.cc(717)
clientAccessCheck2: No adapted_http_access configuration. default: ALLOW
2015/11/12 11:51:13.363 kid1| 85,2| client_side_request.cc(741)
clientAccessCheckDone: The request GET
http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
is ALLOWED; last ACL checked: all
2015/11/12 11:51:13.363 kid1| 20,2| store.cc(936) checkCachable:
StoreEntry::checkCachable: NO: not cachable
2015/11/12 11:51:13.363 kid1| 20,2| store.cc(936) checkCachable:
StoreEntry::checkCachable: NO: not cachable
2015/11/12 11:51:13.363 kid1| 88,2| client_side_reply.cc(2001)
processReplyAccessResult: The reply for GET
http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
is ALLOWED, because it matched all
2015/11/12 11:51:13.363 kid1| 11,2| client_side.cc(1391)
sendStartOfMessage: HTTP Client local=192.168.0.233:3128
remote=192.168.0.74:52719 FD 32 flags=1
2015/11/12 11:51:13.363 kid1| 11,2| client_side.cc(1392)
sendStartOfMessage: HTTP Client REPLY:
---------
HTTP/1.1 302 Found
Server: squid/3.5.10
Date: Thu, 12 Nov 2015 10:51:13 GMT
Content-Length: 0
Location:
https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
X-Cache: MISS from squid
X-Cache-Lookup: MISS from squid:3128
Via: 1.1 squid (squid/3.5.10)
Connection: keep-alive


----------
2015/11/12 11:51:13.363 kid1| 20,2| store.cc(936) checkCachable:
StoreEntry::checkCachable: NO: not cachable
2015/11/12 11:51:13.363 kid1| 20,2| store.cc(936) checkCachable:
StoreEntry::checkCachable: NO: not cachable
2015/11/12 11:51:14.849 kid1| 5,2| TcpAcceptor.cc(222) doAccept: New
connection on FD 46
2015/11/12 11:51:14.849 kid1| 5,2| TcpAcceptor.cc(297) acceptNext:
connection on local=[::]:3128 remote=[::] FD 46 flags=9
2015/11/12 11:51:14.849 kid1| 11,2| client_side.cc(2345)
parseHttpRequest: HTTP Client local=192.168.0.233:3128
remote=192.168.0.74:52721 FD 48 flags=1
2015/11/12 11:51:14.849 kid1| 11,2| client_side.cc(2346)
parseHttpRequest: HTTP Client REQUEST:
---------
CONNECT ad.doubleclick.net:443 HTTP/1.1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0)
Gecko/20100101 Firefox/42.0
Proxy-Connection: keep-alive
Connection: keep-alive
Host: ad.doubleclick.net:443


----------
2015/11/12 11:51:14.850 kid1| 85,2| client_side_request.cc(741)
clientAccessCheckDone: The request CONNECT ad.doubleclick.net:443 is
ALLOWED; last ACL checked: localnet
2015/11/12 11:51:14.850 kid1| 23,2| url.cc(393) urlParse: urlParse: URI
has whitespace: {icap://127.0.0.1:1344/squidclamav ICAP/1.0
}
2015/11/12 11:51:14.851 kid1| 61,5| redirect.cc(292) redirectStart:
redirectStart: 'ad.doubleclick.net:443'
2015/11/12 11:51:14.851 kid1| 61,6| redirect.cc(281)
constructHelperQuery: sending 'ad.doubleclick.net:443
192.168.0.74/192.168.0.74 - CONNECT myip=192.168.0.233 myport=3128
' to the redirector helper
2015/11/12 11:51:14.851 kid1| 61,5| redirect.cc(82) redirectHandleReply:
reply={result=OK, notes={status: 302; url:
https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv; 
}}
2015/11/12 11:51:14.851 kid1| 85,2| client_side_request.cc(717)
clientAccessCheck2: No adapted_http_access configuration. default: ALLOW
2015/11/12 11:51:14.851 kid1| 85,2| client_side_request.cc(741)
clientAccessCheckDone: The request CONNECT ad.doubleclick.net:443 is
ALLOWED; last ACL checked: all
2015/11/12 11:51:14.851 kid1| 20,2| store.cc(936) checkCachable:
StoreEntry::checkCachable: NO: not cachable
2015/11/12 11:51:14.851 kid1| 20,2| store.cc(936) checkCachable:
StoreEntry::checkCachable: NO: not cachable
2015/11/12 11:51:14.851 kid1| 88,2| client_side_reply.cc(2001)
processReplyAccessResult: The reply for CONNECT ad.doubleclick.net:443
is ALLOWED, because it matched all
2015/11/12 11:51:14.851 kid1| 11,2| client_side.cc(1391)
sendStartOfMessage: HTTP Client local=192.168.0.233:3128
remote=192.168.0.74:52721 FD 48 flags=1
2015/11/12 11:51:14.851 kid1| 11,2| client_side.cc(1392)
sendStartOfMessage: HTTP Client REPLY:
---------
HTTP/1.1 302 Found
Server: squid/3.5.10
Date: Thu, 12 Nov 2015 10:51:14 GMT
Content-Length: 0
Location:
https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
X-Cache: MISS from squid
X-Cache-Lookup: MISS from squid:3128
Via: 1.1 squid (squid/3.5.10)
Connection: keep-alive


----------
2015/11/12 11:51:14.851 kid1| 20,2| store.cc(936) checkCachable:
StoreEntry::checkCachable: NO: not cachable
2015/11/12 11:51:14.851 kid1| 20,2| store.cc(936) checkCachable:
StoreEntry::checkCachable: NO: not cachable
2015/11/12 11:51:14.851 kid1| abandoning local=192.168.0.233:3128
remote=192.168.0.74:52721 FD 48 flags=1

========================

On the Wireshark side:

In the http case I observe 2 streams:
* One with the proxy
GET
http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
HTTP/1.1
Host: ad.doubleclick.net
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0)
Gecko/20100101 Firefox/42.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: fr,fr-FR;q=0.8,en-US;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
Cookie:
id=22444c07d901000f||t=1399896339|et=730|cs=002213fd48651016fb03856b79;
IDE=AHWqTUlZo9sH_j9svI23Ge8QFYiXp8lJDU2dwdeEJthW3WouVnYC__mRag
Connection: keep-alive

HTTP/1.1 302 Found
Server: squid/3.5.10
Date: Thu, 12 Nov 2015 10:35:50 GMT
Content-Length: 0
Location:
https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
X-Cache: MISS from squid
X-Cache-Lookup: MISS from squid:3128
Via: 1.1 squid (squid/3.5.10)
Connection: keep-alive

* Then one with proxyweb, SSL encrypted

That sounds logical to me.


In the https case I observe just 1 stream:
CONNECT ad.doubleclick.net:443 HTTP/1.1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0)
Gecko/20100101 Firefox/42.0
Proxy-Connection: keep-alive
Connection: keep-alive
Host: ad.doubleclick.net:443

HTTP/1.1 302 Found
Server: squid/3.5.10
Date: Thu, 12 Nov 2015 10:35:57 GMT
Content-Length: 0
Location:
https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
X-Cache: MISS from squid
X-Cache-Lookup: MISS from squid:3128
Via: 1.1 squid (squid/3.5.10)
Connection: keep-alive

CONNECT ad.doubleclick.net:443 HTTP/1.1
User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0)
Gecko/20100101 Firefox/42.0
Proxy-Connection: keep-alive
Connection: keep-alive
Host: ad.doubleclick.net:443


All this is between my client and proxy server.

Why is the browser not taking the redirect into account?
Why is it repeating the same CONNECT?
Why is there no trace at all in the proxy logs of this second CONNECT?

Regards, EG
_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: ssl bump and url_rewrite_program (like squidguard)

Marcus Kool
I cannot make much of the logs and suspect that information is missing.
But using just logic, it seems that Squid has a problem with the redirect to a CONNECT.
I suggest setting debug all,9 and looking closely at what happens with the redirection.
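
In squid.conf terms that suggestion corresponds to the debug_options
directive. A minimal sketch, using the cache.log sections that appear in
the traces above (61 is the redirector, 11 and 85 the client-side request
handling); the exact selection is only an example:

# very noisy: full trace of everything
debug_options ALL,9

# narrower alternative: level 2 everywhere, full detail for the
# redirector and the client-side request path
debug_options ALL,2 61,9 11,9 85,9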

Marcus


On 11/12/2015 10:02 AM, Edouard Gaulué wrote:

> Hi Marcus and all,
>
> I have option_debug ALL,2 61,9.
>
> Logs don't tell me a lot, the squidguard answer is exactly the same with or without ssl.
>
> =======================
>
> 2015/11/12 11:51:13.320 kid1| 11,2| client_side.cc(2345) parseHttpRequest: HTTP Client local=192.168.0.233:3128 remote=192.168.0.74:52719 FD 32 flags=1
> 2015/11/12 11:51:13.320 kid1| 11,2| client_side.cc(2346) parseHttpRequest: HTTP Client REQUEST:
> ---------
> GET
> http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
> HTTP/1.1
> Host: ad.doubleclick.net
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0) Gecko/20100101 Firefox/42.0
> Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
> Accept-Language: fr,fr-FR;q=0.8,en-US;q=0.5,en;q=0.3
> Accept-Encoding: gzip, deflate
> Cookie: id=22444c07d901000f||t=1399896339|et=730|cs=002213fd48651016fb03856b79; IDE=AHWqTUlZo9sH_j9svI23Ge8QFYiXp8lJDU2dwdeEJthW3WouVnYC__mRag
> Connection: keep-alive
>
>
> ----------
> 2015/11/12 11:51:13.361 kid1| 85,2| client_side_request.cc(741) clientAccessCheckDone: The request GET
> http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
> is ALLOWED; last ACL checked: localnet
> 2015/11/12 11:51:13.362 kid1| 23,2| url.cc(393) urlParse: urlParse: URI has whitespace: {icap://127.0.0.1:1344/squidclamav ICAP/1.0
> }
> 2015/11/12 11:51:13.363 kid1| 61,5| redirect.cc(292) redirectStart: redirectStart:
> 'http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?'
>
> 2015/11/12 11:51:13.363 kid1| 61,6| redirect.cc(281) constructHelperQuery: sending
> 'http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
> 192.168.0.74/192.168.0.74 - GET myip=192.168.0.233 myport=3128
> ' to the redirector helper
> 2015/11/12 11:51:13.363 kid1| 61,5| redirect.cc(82) redirectHandleReply: reply={result=OK, notes={status: 302; url:
> https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv; }}
> 2015/11/12 11:51:13.363 kid1| 85,2| client_side_request.cc(717) clientAccessCheck2: No adapted_http_access configuration. default: ALLOW
> 2015/11/12 11:51:13.363 kid1| 85,2| client_side_request.cc(741) clientAccessCheckDone: The request GET
> http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
> is ALLOWED; last ACL checked: all
> 2015/11/12 11:51:13.363 kid1| 20,2| store.cc(936) checkCachable: StoreEntry::checkCachable: NO: not cachable
> 2015/11/12 11:51:13.363 kid1| 20,2| store.cc(936) checkCachable: StoreEntry::checkCachable: NO: not cachable
> 2015/11/12 11:51:13.363 kid1| 88,2| client_side_reply.cc(2001) processReplyAccessResult: The reply for GET
> http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
> is ALLOWED, because it matched all
> 2015/11/12 11:51:13.363 kid1| 11,2| client_side.cc(1391) sendStartOfMessage: HTTP Client local=192.168.0.233:3128 remote=192.168.0.74:52719 FD 32 flags=1
> 2015/11/12 11:51:13.363 kid1| 11,2| client_side.cc(1392) sendStartOfMessage: HTTP Client REPLY:
> ---------
> HTTP/1.1 302 Found
> Server: squid/3.5.10
> Date: Thu, 12 Nov 2015 10:51:13 GMT
> Content-Length: 0
> Location: https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
> X-Cache: MISS from squid
> X-Cache-Lookup: MISS from squid:3128
> Via: 1.1 squid (squid/3.5.10)
> Connection: keep-alive
>
>
> ----------
> 2015/11/12 11:51:13.363 kid1| 20,2| store.cc(936) checkCachable: StoreEntry::checkCachable: NO: not cachable
> 2015/11/12 11:51:13.363 kid1| 20,2| store.cc(936) checkCachable: StoreEntry::checkCachable: NO: not cachable
> 2015/11/12 11:51:14.849 kid1| 5,2| TcpAcceptor.cc(222) doAccept: New connection on FD 46
> 2015/11/12 11:51:14.849 kid1| 5,2| TcpAcceptor.cc(297) acceptNext: connection on local=[::]:3128 remote=[::] FD 46 flags=9
> 2015/11/12 11:51:14.849 kid1| 11,2| client_side.cc(2345) parseHttpRequest: HTTP Client local=192.168.0.233:3128 remote=192.168.0.74:52721 FD 48 flags=1
> 2015/11/12 11:51:14.849 kid1| 11,2| client_side.cc(2346) parseHttpRequest: HTTP Client REQUEST:
> ---------
> CONNECT ad.doubleclick.net:443 HTTP/1.1
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0) Gecko/20100101 Firefox/42.0
> Proxy-Connection: keep-alive
> Connection: keep-alive
> Host: ad.doubleclick.net:443
>
>
> ----------
> 2015/11/12 11:51:14.850 kid1| 85,2| client_side_request.cc(741) clientAccessCheckDone: The request CONNECT ad.doubleclick.net:443 is ALLOWED; last ACL checked: localnet
> 2015/11/12 11:51:14.850 kid1| 23,2| url.cc(393) urlParse: urlParse: URI has whitespace: {icap://127.0.0.1:1344/squidclamav ICAP/1.0
> }
> 2015/11/12 11:51:14.851 kid1| 61,5| redirect.cc(292) redirectStart: redirectStart: 'ad.doubleclick.net:443'
> 2015/11/12 11:51:14.851 kid1| 61,6| redirect.cc(281) constructHelperQuery: sending 'ad.doubleclick.net:443 192.168.0.74/192.168.0.74 - CONNECT myip=192.168.0.233 myport=3128
> ' to the redirector helper
> 2015/11/12 11:51:14.851 kid1| 61,5| redirect.cc(82) redirectHandleReply: reply={result=OK, notes={status: 302; url:
> https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv; }}
> 2015/11/12 11:51:14.851 kid1| 85,2| client_side_request.cc(717) clientAccessCheck2: No adapted_http_access configuration. default: ALLOW
> 2015/11/12 11:51:14.851 kid1| 85,2| client_side_request.cc(741) clientAccessCheckDone: The request CONNECT ad.doubleclick.net:443 is ALLOWED; last ACL checked: all
> 2015/11/12 11:51:14.851 kid1| 20,2| store.cc(936) checkCachable: StoreEntry::checkCachable: NO: not cachable
> 2015/11/12 11:51:14.851 kid1| 20,2| store.cc(936) checkCachable: StoreEntry::checkCachable: NO: not cachable
> 2015/11/12 11:51:14.851 kid1| 88,2| client_side_reply.cc(2001) processReplyAccessResult: The reply for CONNECT ad.doubleclick.net:443 is ALLOWED, because it matched all
> 2015/11/12 11:51:14.851 kid1| 11,2| client_side.cc(1391) sendStartOfMessage: HTTP Client local=192.168.0.233:3128 remote=192.168.0.74:52721 FD 48 flags=1
> 2015/11/12 11:51:14.851 kid1| 11,2| client_side.cc(1392) sendStartOfMessage: HTTP Client REPLY:
> ---------
> HTTP/1.1 302 Found
> Server: squid/3.5.10
> Date: Thu, 12 Nov 2015 10:51:14 GMT
> Content-Length: 0
> Location: https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
> X-Cache: MISS from squid
> X-Cache-Lookup: MISS from squid:3128
> Via: 1.1 squid (squid/3.5.10)
> Connection: keep-alive
>
>
> ----------
> 2015/11/12 11:51:14.851 kid1| 20,2| store.cc(936) checkCachable: StoreEntry::checkCachable: NO: not cachable
> 2015/11/12 11:51:14.851 kid1| 20,2| store.cc(936) checkCachable: StoreEntry::checkCachable: NO: not cachable
> 2015/11/12 11:51:14.851 kid1| abandoning local=192.168.0.233:3128 remote=192.168.0.74:52721 FD 48 flags=1
>
> ========================
>
> On the wireshark side:
>
> In the http case I observe 2 streams:
> * One with the proxy
> GET
> http://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151111;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9405824,9415555,9416484,9416674,9417703,9418199,9419444,9420772,9421341,9421522,9421931,9421945,9422479,9423231,9423294,9423347,9423510,9423789;ord=5269238259430125?
> HTTP/1.1
> Host: ad.doubleclick.net
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0) Gecko/20100101 Firefox/42.0
> Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
> Accept-Language: fr,fr-FR;q=0.8,en-US;q=0.5,en;q=0.3
> Accept-Encoding: gzip, deflate
> Cookie: id=22444c07d901000f||t=1399896339|et=730|cs=002213fd48651016fb03856b79; IDE=AHWqTUlZo9sH_j9svI23Ge8QFYiXp8lJDU2dwdeEJthW3WouVnYC__mRag
> Connection: keep-alive
>
> HTTP/1.1 302 Found
> Server: squid/3.5.10
> Date: Thu, 12 Nov 2015 10:35:50 GMT
> Content-Length: 0
> Location: https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
> X-Cache: MISS from squid
> X-Cache-Lookup: MISS from squid:3128
> Via: 1.1 squid (squid/3.5.10)
> Connection: keep-alive
>
> * Then one with proxyweb SSL encoded
>
> That sounds logical to me.
>
>
> In the https case I observe just 1 stream:
> CONNECT ad.doubleclick.net:443 HTTP/1.1
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0) Gecko/20100101 Firefox/42.0
> Proxy-Connection: keep-alive
> Connection: keep-alive
> Host: ad.doubleclick.net:443
>
> HTTP/1.1 302 Found
> Server: squid/3.5.10
> Date: Thu, 12 Nov 2015 10:35:57 GMT
> Content-Length: 0
> Location: https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
> X-Cache: MISS from squid
> X-Cache-Lookup: MISS from squid:3128
> Via: 1.1 squid (squid/3.5.10)
> Connection: keep-alive
>
> CONNECT ad.doubleclick.net:443 HTTP/1.1
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0) Gecko/20100101 Firefox/42.0
> Proxy-Connection: keep-alive
> Connection: keep-alive
> Host: ad.doubleclick.net:443
>
>
> All this is between my client and proxy server.
>
> Why is the browser not taking account of the redirect?
> Why is it redoing the same connect?
> Why is there no trace at all in the proxy logs of this second CONNECT?
>
> Regards, EG
> _______________________________________________
> squid-users mailing list
> [hidden email]
> http://lists.squid-cache.org/listinfo/squid-users
_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué
On 12/11/2015 13:28, Marcus Kool wrote:
> I cannot make much of the logs and expect that information is missing.
> But using just logic, it seems that Squid has a problem with the
> redirect to a CONNECT.
> I suggest to set debug all,9 and to look closely at what happens with
> the redirection.
>
> Marcus

I've got a log file but it's rather big (800 KB, or 80 KB gzipped). It
contains one request over HTTPS and the same request over HTTP. I can send
it to anyone interested. On my side I didn't see anything special, but I'm
not an expert.

I've built a little test server to see how an SSL redirect is handled:

https://site-a/index.html   == 302 ==>  https://site-b/index.html

It works fine whether it goes through Squid or not.

So the trouble is really with redirects in the SSL case with Squid.

I can run more tests or capture more logs if requested, but I'm sorry to
say my knowledge doesn't let me go any further.

Regards, EG
_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: ssl bump and url_rewrite_program (like squidguard)

Walter H.
In reply to this post by Amos Jeffries
On 05.11.2015 04:26, Amos Jeffries wrote:
> There was a bug about the wrong SNI being sent to servers on bumped
> traffic that got re-written. That got fixed in Squid-3.5.7 and
> re-writers should have been fully working since then.
This seems to be a bug in 3.5.x only;
with 3.4.10 this works fine ...

I just tried the following url-rewrite program (Perl):

<url-rewrite-program.pl>
#!/usr/bin/perl -wl
$| = 1;  # don't buffer the output
while ( <> )
{
     # helper input line: URL client_ip/fqdn ident method ...
     unless( m,(\S+) (\S+)/(\S+) (\S+) (\S+), )
     {
         $uri = ''; next;
     }
     $uri = $1;
     # ... (other rewrite rules omitted)
     $uri = "301:https://rsa-md5.ssl.hboeck.de/"
         if ( $uri =~ m/^https:\/\/ssl\.hboeck\.de\/(\S*)/ );
}
continue
{
     print "$uri";
}
exit;
</url-rewrite-program.pl>
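
For context, a rewriter like this is wired into squid.conf roughly as
follows; the path and the child-count numbers here are only illustrative:

url_rewrite_program /usr/local/bin/url-rewrite-program.pl
url_rewrite_children 10 startup=2 idle=1 concurrency=0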


_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users


Re: ssl bump and url_rewrite_program (like squidguard)

Amos Jeffries
In reply to this post by Edouard Gaulué
On 13/11/2015 1:02 a.m., Edouard Gaulué wrote:

>
> In the https case I observe just 1 stream:
> CONNECT ad.doubleclick.net:443 HTTP/1.1
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0)
> Gecko/20100101 Firefox/42.0
> Proxy-Connection: keep-alive
> Connection: keep-alive
> Host: ad.doubleclick.net:443
>
> HTTP/1.1 302 Found
> Server: squid/3.5.10
> Date: Thu, 12 Nov 2015 10:35:57 GMT
> Content-Length: 0
> Location:
> https://proxyweb.echoppe.lan/cgi-bin/squidGuard-simple.cgi?clientaddr=192.168.0.74pipo&clientname=192.168.0.74&clientuser=&clientgroup=marine&targetgroup=adv
>
> X-Cache: MISS from squid
> X-Cache-Lookup: MISS from squid:3128
> Via: 1.1 squid (squid/3.5.10)
> Connection: keep-alive
>
> CONNECT ad.doubleclick.net:443 HTTP/1.1
> User-Agent: Mozilla/5.0 (Macintosh; Intel Mac OS X 10.10; rv:42.0)
> Gecko/20100101 Firefox/42.0
> Proxy-Connection: keep-alive
> Connection: keep-alive
> Host: ad.doubleclick.net:443
>
>
> All this is between my client and proxy server.
>
> Why is the browser not taking account of the redirect?

Think about *exactly* what is being redirected.

CONNECT is a request to set up a blind packet relaying tunnel.


> Why is it redoing the same connect?

Because it's a browser. They do some really weird things when confused.

It was told a TCP relay tunnel existed at
"https://proxyweb.echoppe.lan/cgi-bin/...". Thats a pretty weird place
for a network socket to exist.


> Why is there no trace at all in the proxy logs of this second CONNECT?
>

Only if it was handled would it be logged. It seems it may have been
read in (or maybe not) but definitely not processed for some reason.

Amos

_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué
Hi Amos and all,

While learning about HTTP CONNECT, I came across this:
http://serverfault.com/questions/727262/how-to-redirect-https-connect-request-with-squid-explicit-proxy

I read on http://wiki.squid-cache.org/Features/MimicSslServerCert in the
"Delayed error responses" chapter:
"When Squid fails to negotiate a secure connection with the origin
server and bump-ssl-server-first is enabled, Squid remembers the error
page and serves it after establishing the secure connection with the
client and receiving the first encrypted client request. The error is
served securely. The same approach is used for Squid redirect messages
configured via deny_info. This error delay is implemented because (a)
browsers like FireFox and Chromium do not display CONNECT errors
correctly and (b) intercepted SSL connections must wait for the first
request to serve an error."

My ideas/questions:
1/ Is there a way to get the same behaviour with the new peek-and-splice
feature?
2/ Is there a way to tell the url_rewrite_program not to act on CONNECT
requests? That way the CONNECT itself is not redirected, and the next
request the browser sends after Squid has bumped it should be a GET/POST
that the url_rewrite_program can redirect.
3/ Would it work if squidGuard were run as an ICAP service?

EG


On 13/11/2015 01:31, Amos Jeffries wrote:

> On 13/11/2015 1:02 a.m., Edouard Gaulué wrote:
>>
>>
>> Why is the browser not taking account of the redirect?
> Think about *exactly* what is being redirected.
>
> CONNECT is a request to setup a blind packet relaying tunnel.
>
>
>> Why is it redoing the same connect?
> Because its a browser. They do some really weird things when confused.
>
> It was told a TCP relay tunnel existed at
> "https://proxyweb.echoppe.lan/cgi-bin/...". Thats a pretty weird place
> for a network socket to exist.
>
>
>> Why is there no trace at all in the proxy logs of this second CONNECT?
>>
> Only if it was handled would it be logged. It seems it may have been
> read in (or maybe not) but definitely not processed for some reason.
>
> Amos
>
> _______________________________________________
> squid-users mailing list
> [hidden email]
> http://lists.squid-cache.org/listinfo/squid-users

_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: ssl bump and url_rewrite_program (like squidguard)

Amos Jeffries
On 13/11/2015 10:16 p.m., Edouard Gaulué wrote:

> Hi Amos and all,
>
> Learning on HTTP CONNECT, I got
> there:http://serverfault.com/questions/727262/how-to-redirect-https-connect-request-with-squid-explicit-proxy
>
>
> I read on http://wiki.squid-cache.org/Features/MimicSslServerCert in the
> "Delayed error responses" chapter:
> "When Squid fails to negotiate a secure connection with the origin
> server and bump-ssl-server-first is enabled, Squid remembers the error
> page and serves it after establishing the secure connection with the
> client and receiving the first encrypted client request. The error is
> served securely. The same approach is used for Squid redirect messages
> configured via deny_info. This error delay is implemented because (a)
> browsers like FireFox and Chromium do not display CONNECT errors
> correctly and (b) intercepted SSL connections must wait for the first
> request to serve an error."
>
> My ideas/questions:
> 1/ Is there a way to have the same with new peek and splice feature?

Not really, because CONNECT is not part of TLS. It is an HTTP message.

> 2/ Is there a way to say url_rewrite_program not to work on CONNECT
> request?

http://www.squid-cache.org/Doc/config/url_rewrite_access/
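
A minimal illustration of that directive, assuming the goal is simply to
keep CONNECT requests away from the rewriter (CONNECT is the method ACL
from the default squid.conf):

acl CONNECT method CONNECT
url_rewrite_access deny CONNECT
url_rewrite_access allow all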



> This way the CONNECT is not redirected, next request the
> browser sends after squid has bumped it  should be a kind of  GET/POST
> one that will be redirected by url_rewrite_program.
> 3/ Would it works if squidguard were i-cap'ed?

All SquidGuard does is apply some basic ACL rules to the details it is
given by Squid.

You would be far better off simply converting the SG ruleset into
http_access ACLs.
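
A rough, untested sketch of that conversion; the blacklist path and the
block-page URL are placeholders, not taken from this thread:

# one domain per line in the file, e.g. ".doubleclick.net"
acl ads_sites dstdomain "/etc/squid/blacklists/ads/domains"
http_access deny ads_sites
# send the client to a block page instead of the generic error
deny_info http://proxyweb.example.lan/blocked.html ads_sites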

Amos
_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: ssl bump and url_rewrite_program (like squidguard)

Alex Rousskov
In reply to this post by Edouard Gaulué
On 11/13/2015 02:16 AM, Edouard Gaulué wrote:

> I read on http://wiki.squid-cache.org/Features/MimicSslServerCert in the
> "Delayed error responses" chapter:
> "When Squid fails to negotiate a secure connection with the origin
> server and bump-ssl-server-first is enabled, Squid remembers the error
> page and serves it after establishing the secure connection with the
> client and receiving the first encrypted client request. The error is
> served securely. The same approach is used for Squid redirect messages
> configured via deny_info."
>
> My ideas/questions:
> 1/ Is there a way to have the same with new peek and splice feature?

Yes, SslBump failures should result in delayed errors securely served to
SSL clients where possible. This essential SslBump feature is not
specific to the old server-first bumping method. If the latest Squid
does not do this, it is essentially a bug.

Alex.

_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: ssl bump and url_rewrite_program (like squidguard)

Edouard Gaulué-2
In reply to this post by Edouard Gaulué
Hi community,

Any news about this?

I've tried 3.5.25 but still observe this behaviour.

I understand it well since I read:
https://serverfault.com/questions/727262/how-to-redirect-https-connect-request-with-squid-explicit-proxy

But how can I let the CONNECT request succeed and then block/redirect the
next HTTP request coming through the established tunnel?
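
One commonly suggested shape for this, as an untested sketch building on
the earlier answers: bump the tunnel first and keep the rewriter away from
the CONNECT itself, so squidGuard only ever sees the decrypted requests
inside it.

# look at the TLS client hello at step 1, then bump everything else
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all

# CONNECT is the method ACL from the default squid.conf; with it excluded,
# the rewriter only receives the decrypted GET/POST URLs
url_rewrite_access deny CONNECT
url_rewrite_access allow all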

Best Regards,

On 03/11/2015 23:48, Edouard Gaulué wrote:

> Hi community,
>
> I've followed
> http://wiki.squid-cache.org/ConfigExamples/Intercept/SslBumpExplicit  to
> set my server. It looks really interesting and it's said to be the more
> common configuration.
>
> I often observe (example here withwww.youtube.com) :
> ***************************
> The following error was encountered while trying to retrieve the URL:
> https://http/*
>
>     *Unable to determine IP address from host name "http"*
>
> The DNS server returned:
>
>     Name Error: The domain name does not exist.
> ****************************
>
> This happens while the navigator (Mozilla) is trying to get a frame at
> https://ad.doubleclick.net/N4061/adi/com.ythome/_default;sz=970x250;tile=1;ssl=1;dc_yt=1;kbsg=HPFR151103;kga=-1;kgg=-1;klg=fr;kmyd=ad_creative_1;ytexp=9406852,9408210,9408502,9417689,9419444,9419802,9420440,9420473,9421645,9421711,9422141,9422865,9423510,9423563,9423789;ord=968558538238386?
>
>
> That's ads so I'm not so fond of it...
>
> But this leads me to the fact I get this behavior each time the site is
> banned by squidguard.
>
> Is there something to do to avoid this behavior? I mean, squidguard
> should send :
>
> *********************************
>   Access denied
>
> Supplementary info     :
> Client address     =     192.168.XXX.XXX
> Client name     =     192.168.XXX.XXX
> User ident     =
> Client group     =     XXXXXXX
> URL     =     https://ad.doubleclick.net/
> Target class     =     ads
>
> If this is wrong, contact your administrator
> **********************************
>
> squidguard is an url_rewrite_program that looks to respect squid
> requirements. Redirect looks like this :
> http://proxyweb.myserver.mydomain/cgi-bin/squidGuard-simple.cgi?clientaddr=...
>
>
> I've played arround trying to change the redirect URL and it leads me to
> the idea ssl_bump tries to analyse the part until the ":". Is there a way
> to avoid this? Is this just a configuration matter?
>
> Could putting a ssl_bump rule saying "every server that name match
> "http" or
> "https" should splice" solve the problem?
>
> Regards, EG
>
>
> _______________________________________________
> squid-users mailing list
> [hidden email]
> http://lists.squid-cache.org/listinfo/squid-users


_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users