Squid and url modifying


Squid and url modifying

Egoitz Aurrekoetxea

Good afternoon,


Is it possible for Squid to do something like:


- Receive request: https://oooeeee.eeee.ttt.thesquidserver.org/u?ii=99&j=88


and


to really perform the request as: https://oooeeee.eeee.ttt/u?ii=99&j=88


I mean, not redirecting users with a URL redirection: Squid would just act as a proxy, knowing internally how to build the proper request to the destination. And, without redirecting the URL, would it be possible to return, for instance, a 403 error to the client browser so the site cannot be accessed when certain conditions are met?


If that configuration is not possible... perhaps I would need to redirect forcibly? I have read that URL redirectors can be used for that purpose, so I assume the concept is:


- Receive request: https://oooeeee.eeee.ttt.thesquidserver.org/u?ii=99&j=88


and


to really perform the request as: https://oooeeee.eeee.ttt/u?ii=99&j=88


If all conditions for allowing the content are met, return the browser a 301 redirect with the https://oooeeee.eeee.ttt/u?ii=99&j=88 URL. Otherwise, return a 403 or redirect to a Forbidden page... I think this could be implemented with URL redirectors... but which kinds of conditions or environment data can a URL redirector actually use for validation?



Thanks a lot for your time :)


Cheers!




--
sarenet
Egoitz Aurrekoetxea
Dpto. de sistemas
944 209 470
Parque Tecnológico. Edificio 103
48170 Zamudio (Bizkaia)

Before printing this email, please consider whether it is necessary.

_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: Squid and url modifying

Amos Jeffries
Administrator
On 2/03/19 1:59 am, Egoitz Aurrekoetxea wrote:

> Good afternoon,
>
>
> Is it possible for Squid to do something like :
>
>
> - Receive request : https://oooeeee.eeee.ttt.thesquidserver.org/u?ii=99&j=88
>
>
> and
>
>
> to really perform a request as : https://oooeeee.eeee.ttt/u?ii=99&j=88
> <https://oooeeee.eeee.ttt.thesquidserver.org/u?ii=99&j=88>
>

Possible? Yes.

Good Idea? No.

This is a typical URL-rewrite. Except that in order to perform the
SSL-Bump to see that URL in the first place Squid has probably had to
contact the server and now has TLS bound to that particular server. So
you better hope the server oooeeee.eeee.ttt.thesquidserver.org knows how
to answer requests for the https://oooeeee.eeee.ttt/* URLs.

Avoiding that problem is only possible with bumping at step2 (aka.
client-first bumping). Which opens your proxy to a large number of
possible attacks and nasty side effects from incompatible TLS features
being negotiated on client and server connections.
 YMMV on whether that is even usable for your situation.
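For reference, bumping at step2 in the way Amos describes is configured along these lines. This is a hedged, untested sketch using the standard ssl_bump directives; the port and certificate path are placeholders, and Amos's warning about the risks of this mode still applies:

```
# Peek at the TLS client hello (step1), then bump at step2 without
# first establishing what the origin server will offer.
https_port 3129 intercept ssl-bump \
    tls-cert=/etc/squid/bump-ca.pem \
    generate-host-certificates=on
acl step1 at_step SslBump1
ssl_bump peek step1
ssl_bump bump all
```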


>
> I mean not to redirect users with url redirection. Just act as a proxy
> but having Squid the proper acknoledge internally for being able to make
> the proper request to the destination?. Is it possible without
> redirecting url, to return for instance a 403 error to the source web
> browser in order to not be able to access to the site if some kind of
> circumstances are given?

Of course.

The issue people tend to have is that Browsers do not show proxy error
messages in a lot of circumstances. They still show *an* error though.
So if your goal is just to block, yes that much is relatively easy.
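A plain block of that kind is only a couple of lines in squid.conf; a minimal sketch, where the ACL name and domain are placeholder assumptions:

```
# Deny access to matching destinations; the browser gets Squid's 403 page.
acl blocked_sites dstdomain .example-blocked.org
http_access deny blocked_sites
```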


>
> If all conditions for allowing to see the content are OK, return the web
> browser a 301 redirect answer with the
> https://oooeeee.eeee.ttt/u?ii=99&j=88
> <https://oooeeee.eeee.ttt.thesquidserver.org/u?ii=99&j=88> URL. Else,
> just return a 403 or redirect you to a Forbidden page... I think this
> could be implemented with URL redirectors...but... the fact is... which
> kind of conditions or env situations can you use for validating the
> content inside the url redirector?.

302 is better since this is not a permanent state. Your policy may
change any time.

Anyhow, 30x redirection is the *best* way to do it.

The helper is contacted by Squid after the TLS has been decrypted and
the HTTP(S) messages are arriving. Before that point there is no URL to
rewrite or redirect.

Helpers can do anything they like. Squid can pass them details of the
client TCP connection, the TLS state, or the HTTP request the client has
delivered. And some other details (eg rDNS, IDENT, ASN or such) that
Squid can lookup. Anything else the helper can lookup by itself can also
be used.
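As an illustration of such a helper (not code from this thread), a minimal url_rewrite_program sketch could look like the following. The wildcard suffix ".thesquidserver.org" is an assumption taken from the example URLs, the helper is assumed to run with concurrency enabled (so each request line starts with a channel-ID), and URL ports are ignored for brevity:

```python
#!/usr/bin/env python3
"""Sketch of a Squid url_rewrite_program helper (illustration only)."""
import sys
from urllib.parse import urlsplit, urlunsplit

PROXY_SUFFIX = ".thesquidserver.org"  # hypothetical wildcard proxy domain

def rewrite(url):
    """Return the real URL if the host carries the proxy suffix, else None."""
    parts = urlsplit(url)
    host = parts.hostname or ""
    if not host.endswith(PROXY_SUFFIX):
        return None
    real_host = host[:-len(PROXY_SUFFIX)]
    return urlunsplit((parts.scheme, real_host, parts.path,
                       parts.query, parts.fragment))

def main():
    # With url_rewrite_children concurrency enabled, Squid sends
    # "<channel-ID> <URL> <extras...>" and expects one reply line per
    # request. "OK status=302 url=..." asks Squid to send the client a
    # redirect instead of silently rewriting the request.
    for line in sys.stdin:
        fields = line.split()
        if len(fields) < 2:
            continue
        channel, url = fields[0], fields[1]
        target = rewrite(url)
        if target:
            print(f"{channel} OK status=302 url={target}", flush=True)
        else:
            print(f"{channel} ERR", flush=True)

# In deployment, call main() from a __main__ guard.
```

The access checks Amos mentions (rDNS, ASN, an ICAP verdict cache, or anything else the helper can look up) would slot into rewrite() before deciding between the redirect and an ERR reply.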

That extreme flexibility is the point of the helper. If you just need a
somewhat static mapping deny_info can perform redirection faster.
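For the static case, a deny_info redirect might look like this; a hedged sketch where the ACL name and target URL are placeholders, not values from the thread:

```
# Answer matching requests with a 302 instead of an error page.
acl legacy_hosts dstdomain .old.example.org
deny_info 302:https://www.example.org/moved legacy_hosts
http_access deny legacy_hosts
```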


Amos

Re: Squid and url modifying

Alex Rousskov
In reply to this post by Egoitz Aurrekoetxea
On 3/1/19 5:59 AM, Egoitz Aurrekoetxea wrote:

> Is it possible for Squid to do something like :

> - Receive request : https://oooeeee.eeee.ttt.thesquidserver.org/u?ii=99&j=88

> and

> to really perform a request as : https://oooeeee.eeee.ttt/u?ii=99&j=88

How does your Squid receive the former request? Amos' answer probably
assumes that your Squid is _not_ oooeeee.eeee.ttt.thesquidserver.org,
but the name you have chosen for your example may imply that it is.

* If your Squid is _intercepting_ traffic destined for the real
oooeeee.eeee.ttt.thesquidserver.org, then see Amos' answer.

* If your Squid is representing oooeeee.eeee.ttt.thesquidserver.org,
then your Squid is a reverse proxy that ought to have the certificate
key for that domain, and none of the SslBump problems that Amos
mentioned apply.

Please clarify what your use case is.

Alex.




Re: Squid and url modifying

Egoitz Aurrekoetxea

Good morning,


Thanks a lot for your reading time, especially to Alex and Amos, who both answered this mail.


My idea is simple. I want specific URLs to be filtered through the proxy. How can I make sure those URLs are checked by the proxy? I assumed I could modify the real, original content where the URLs appear, for instance:


- The real URL being: https://oooeeee.eeee.ttt/u?ii=99&j=88

- I would rewrite the URL inside the content itself, so the new URL becomes: https://oooeeee.eeee.ttt.thesquidserver.org/u?ii=99&j=88

The domain thesquidserver.org will be used with wildcards: *.thesquidserver.org, *.*.thesquidserver.org, etc. will all resolve to the IP of the Squid server. But I don't want every URL requested as whatever.thesquidserver.org to be checked, just the ones I have listed somewhere...


So I was trying to write a content-managing script which checks whether a given URL needs to be inspected and, if so, checks it against an ICAP service. If the ICAP service reports everything is fine, it redirects you to the real site (for instance, just removing the thesquidserver.org part from the URL). If the URL contains malware, it gives you an error page.


That is all I am trying to do... Some time ago I used Squid with DansGuardian for this kind of purpose, but now I want to do something slightly different: pass a request (if it should be passed) to an ICAP service and, depending on the result of that ICAP service (which I don't really know how to check from a script), either redirect to the real site or return an error page.


Is this perhaps the reason URL redirector programs exist? I'm trying to see the whole puzzle :)


Any help would be very appreciated :)


Thank you so much,


Re: Squid and url modifying

Alex Rousskov
On 3/4/19 12:53 AM, Egoitz Aurrekoetxea wrote:

> My idea is simple. I wanted specific url, to be filtered through the
> proxy. How can I manage this URL to be checked by the proxy?.

To answer your questions correctly, we need to translate the vague
description above into one of the many Squid configurations that may
match it. In the hope of doing that, I am asking these two basic
questions:

1. Do clients/browsers request
https://oooeeee.eeee.ttt.thesquidserver.org/... URLs? Or do they request
https://oooeeee.eeee.ttt/... URLs?

For the purpose of the next question, let's assume that the answer to the
above question is: "Clients request https://publicDomain/... URLs"
(where "publicDomain" is one of the two domains mentioned in that
question). Let's further assume that when clients do a DNS lookup for
publicDomain they get a publicIp IP address back.

2. Does your Squid listen on port 443 of publicIp?

Alex.




Re: Squid and url modifying

Egoitz Aurrekoetxea

Hi Alex,


I'm so sorry... I have tried to explain it the best I could... sorry...

Clients, will ask :

https://oooeeee.eeee.ttt.thesquidserver.org/

but if the site is virus-free (checked with an ICAP daemon) the redirector should return a 302 to https://oooeeee.eeee.ttt/


For the second question: I have DNAT rules in place to redirect tcp/80 and tcp/443 silently to Squid's port. So I assume the answer should be yes.


I'll try to restate what I'm trying to do.

I want to set up a proxy machine able to receive URLs like:

- www.iou.net.theproxy.com/hj.php?ui=9

If this site returns clean content (scanned by the ICAP server), the URL redirector should return:

- www.iou.net/hj.php?ui=9 (the real URL).


I'm using this config https://pastebin.com/raw/mP73fame and this redirector code https://pastebin.com/p6Usmq75


So I would say my questions are:

- Is it possible to achieve my goal with Squid, a redirector, and an ICAP daemon that performs virus scanning?

- For plain HTTP the config and the URL seem to be working, BUT viruses are not being scanned. Could the config be adjusted for that?


Cheers!



Re: Squid and url modifying

Alex Rousskov
On 3/4/19 11:20 AM, Egoitz Aurrekoetxea wrote:

> Clients, will ask :
>
> https://oooeeee.eeee.ttt.thesquidserver.org/

> So the answer [to the second question] I assume should be yes.

If I am interpreting your answers correctly, then your setup looks like
a reverse proxy to me. In that case, you do not need SslBump and
interception. You do need a web server certificate for the
oooeeee.eeee.ttt.thesquidserver.org domain, issued by a well-trusted CA.
Do you already have that?


> I have DNAT rules, for being able to
> redirect tcp/80 and tcp/443 to squid's port silently.

Please note that your current Squid configuration is not a reverse proxy
configuration. It is an interception configuration. It also lacks
https_port for handling port 443 traffic. There are probably some
documents on Squid wiki (and/or elsewhere) explaining how to configure
Squid to become a reverse proxy. Follow them.
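As a starting point, a reverse-proxy (accelerator) setup is usually declared along these lines. This is a hedged sketch, not a tested config; the certificate paths, the origin host, and the ACL/peer names are placeholders:

```
# Accept TLS on 443 as an accelerator for the wildcard proxy domain.
https_port 443 accel vhost defaultsite=oooeeee.eeee.ttt.thesquidserver.org \
    tls-cert=/etc/squid/wildcard.crt tls-key=/etc/squid/wildcard.key

# Forward accelerated traffic to the origin server over TLS.
cache_peer origin.example.net parent 443 0 no-query originserver tls name=origin

acl accel_sites dstdomain .thesquidserver.org
cache_peer_access origin allow accel_sites
http_access allow accel_sites
```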


> I wanted to setup a proxy machine which I wanted to be able to receive
> url like :
>
> - www.iou.net.theproxy.com/hj.php?ui=9
>
> If this site returns clean content (scanned by Icap server) the url
> redirector should return :
>
> - www.iou.net/hj.php?ui=9 <http://www.iou.net/hj.php?ui=9> (the real
> url) as URL.

OK.


> - Is it possible with Squid to achieve my goal?. With Squid, a
> redirector, and a Icap daemon which performs virus scanning...

A redirector seems out of scope here -- it works on requests while you
want to rewrite (scanned by ICAP) responses.

It is probably possible to use deny_info to respond with a redirect
message. To trigger a deny_info action, you would have to configure your
Squid to block virus-free responses, which is rather strange!


> - For plain http the config and the URL seem to be working BUT the virus
> are not being scanned. Could the config be adjusted for that?.


I would start by removing the redirector, "intercept", SslBump, and
disabling ICAP. Configure your Squid as a reverse proxy without any
virus scanning. Then add ICAP. Get the virus scanning working without
any URL manipulation. Once that is done, you can adjust Squid to block
virus-free responses (via http_reply_access) and trigger a deny_info
response containing an HTTP redirect.
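Put together, the sequence described above might look roughly like this in squid.conf. This is a hedged, untested sketch: the ICAP service URI, ACL names, and redirect target are placeholders, and deny_info alone cannot compute the per-request "real" URL, so that part would still need a helper or external ACL:

```
# 1. Scan responses through the ICAP service.
icap_enable on
icap_service av_scan respmod_precache icap://127.0.0.1:1344/avscan bypass=off
adaptation_access av_scan allow all

# 2. "Deny" clean replies for the proxied domain, triggering deny_info.
acl proxied_sites dstdomain .thesquidserver.org
http_reply_access deny proxied_sites

# 3. Answer the denial with a 302 instead of an error page.
deny_info 302:https://www.example.org/ proxied_sites
```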


Please note that once the browser gets a redirect to another site, that
browser is not going to revisit your reverse proxy for any content
related to that other site -- all requests for that other site will go
from the browser to that other site. Your proxy will not be in the loop
anymore. If that is not what you want, then you cannot use redirects at
all -- you would have to accelerate that other site for all requests
instead and make sure that other site does not contain absolute URLs
pointing the browser away from your reverse proxy.


Disclaimer: I have not tested the above ideas and, again, I may be
misinterpreting what you really want to achieve.

Alex.

Re: Squid and url modifying

Egoitz Aurrekoetxea

Good morning Alex,


Thank you so much for your time. Your interpretation is almost exact. I say almost because I want to be a reverse proxy for multiple sites, not just the sites I host or similar... And yes, if all is OK I want to "block" the request by returning a 301 pointing directly at the site. I know the traffic won't traverse the proxy any more after that, BUT the client will only go direct if the content is clean. If it is not, it will receive an error response from the ICAP service, so all is fine then...


I'll look carefully at your comments and will report back here :)


Thank you so much!

---
sarenet
Egoitz Aurrekoetxea
Dpto. de sistemas
944 209 470
Parque Tecnológico. Edificio 103
48170 Zamudio (Bizkaia)

Antes de imprimir este correo electrónico piense si es necesario hacerlo.


El 2019-03-05 08:13, Alex Rousskov escribió:

On 3/4/19 11:20 AM, Egoitz Aurrekoetxea wrote:

Clients, will ask :

https://oooeeee.eeee.ttt.thesquidserver.org/

So the answer [to the second question] I assume should be yes.

If I am interpreting your answers correctly, then your setup looks like
a reverse proxy to me. In that case, you do not need SslBump and
interception. You do need an web server certificate for the
oooeeee.eeee.ttt.thesquidserver.org domain, issued by a well-trusted CA.
Do you already have that?


I have DNAT rules, for being able to
redirect tcp/80 and tcp/443 to squid's port silently.

Please note that your current Squid configuration is not a reverse proxy
configuration. It is an interception configuration. It also lacks
https_port for handling port 443 traffic. There are probably some
documents on Squid wiki (and/or elsewhere) explaining how to configure
Squid to become a reverse proxy. Follow them.


I wanted to setup a proxy machine which I wanted to be able to receive
url like :

- www.iou.net.theproxy.com/hj.php?ui=9

If this site returns clean content (scanned by Icap server) the url
redirector should return :

- www.iou.net/hj.php?ui=9 <http://www.iou.net/hj.php?ui=9> (the real
url) as URL.

OK.


- Is it possible with Squid to achieve my goal?. With Squid, a
redirector, and a Icap daemon which performs virus scanning...

A redirector seems out of scope here -- it works on requests while you
want to rewrite (scanned by ICAP) responses.

It is probably possible to use deny_info to respond with a redirect
message. To trigger a deny_info action, you would have to configure your
Squid to block virus-free responses, which is rather strange!


- For plain HTTP the config and the URL seem to be working, BUT the
viruses are not being scanned. Could the config be adjusted for that?


I would start by removing the redirector, "intercept", SslBump, and
disabling ICAP. Configure your Squid as a reverse proxy without any
virus scanning. Then add ICAP. Get the virus scanning working without
any URL manipulation. Once that is done, you can adjust Squid to block
virus-free responses (via http_reply_access) and trigger a deny_info
response containing an HTTP redirect.
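As an untested sketch, the staged setup described above might look roughly like this in squid.conf (Squid 4 syntax). The certificate path, peer details, ICAP service URL, and the X-Virus-Status marker header are all assumptions; check what your ICAP service actually provides:

```
# Stage 1: plain reverse proxy (accel), no interception or SslBump
https_port 443 accel tls-cert=/etc/squid/proxy.pem defaultsite=oooeeee.eeee.ttt.thesquidserver.org
cache_peer oooeeee.eeee.ttt parent 443 0 no-query originserver tls name=origin
cache_peer_access origin allow all

# Stage 2: add ICAP response scanning once stage 1 works
icap_enable on
icap_service av_scan respmod_precache icap://127.0.0.1:1344/avscan
adaptation_access av_scan allow all

# Stage 3: deny responses the ICAP service marked clean, and turn the
# denial into an HTTP redirect via deny_info (%s expands to the URL)
acl clean rep_header X-Virus-Status ^clean$
http_reply_access deny clean
deny_info http://172.16.8.61/redirigir.php?url=%s clean
```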


Please note that once the browser gets a redirect to another site, that
browser is not going to revisit your reverse proxy for any content
related to that other site -- all requests for that other site will go
from the browser to that other site. Your proxy will not be in the loop
anymore. If that is not what you want, then you cannot use redirects at
all -- you would have to accelerate that other site for all requests
instead and make sure that other site does not contain absolute URLs
pointing the browser away from your reverse proxy.


Disclaimer: I have not tested the above ideas and, again, I may be
misinterpreting what you really want to achieve.

Alex.
_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: Squid and url modifying

Egoitz Aurrekoetxea
In reply to this post by Alex Rousskov

Hi Alex,


What you said about http_reply_access could work for me, but I have a problem...

Can http_reply_access, combined with some form of url_regex, dstdom_regex or similar, trigger a redirect based on the matched content?


I mean:

https://a.b.c.cloud.aaa.bbb

should be redirected to:

https://a.b.c

say, by matching everything except cloud.aaa.bbb?
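The host rewrite I am after can be expressed with a single regular expression. A small Python sketch of the idea (the cloud.aaa.bbb suffix is just the example value from above):

```python
import re

# Proxy suffix appended to the real hostname (example value from above)
SUFFIX = ".cloud.aaa.bbb"

def strip_proxy_suffix(url: str) -> str:
    """Rewrite https://a.b.c.cloud.aaa.bbb/path -> https://a.b.c/path."""
    # Lazily match scheme plus host up to the suffix, then drop the suffix;
    # the lookahead keeps a following path, port, or end-of-string intact.
    pattern = r"^(https?://[^/]*?)" + re.escape(SUFFIX) + r"(?=[/:]|$)"
    return re.sub(pattern, r"\1", url)

print(strip_proxy_suffix("https://a.b.c.cloud.aaa.bbb/u?ii=99&j=88"))
# -> https://a.b.c/u?ii=99&j=88
```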


Cheers!!




Re: Squid and url modifying

Egoitz Aurrekoetxea
In reply to this post by Alex Rousskov

Hi!,


I have Squid configured with the virus-scanning software via ICAP, and it is working. But when I do:

acl matchear_todo url_regex [-i] ^.*$
http_reply_access deny matchear_todo
deny_info   http://172.16.8.61/redirigir.php?url=%s matchear_todo

it always redirects me without ever passing through the ICAP service. I wanted the redirection to happen only when the content is clean, but this does it always. Have I missed something?


Cheers!





Re: Squid and url modifying

Alex Rousskov
On 3/5/19 1:57 AM, Egoitz Aurrekoetxea wrote:

> I have Squid configured with the virus scanning software using ICAP and
> working. But, when I do :
>
> acl matchear_todo url_regex [-i] ^.*$

FYI: "[-i]" is documentation syntax that means an optional flag called
"-i". If you want to use that "-i" flag, then type

  acl matchear_todo url_regex -i ^.*$

... but keep in mind that "-i" makes no sense when your regular
expression does not contain any letters. Adding "-i"
would not change which URLs such a regular expression matches.
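For example (illustrative patterns only), the flag matters only when the expression contains letters:

```
# Without -i: matches /Admin/ but not /admin/
acl admin_caps url_regex ^https?://[^/]+/Admin/
# With -i: matches /Admin/, /admin/, /ADMIN/, ...
acl admin_any url_regex -i ^https?://[^/]+/admin/
```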


> http_reply_access deny matchear_todo
> deny_info   http://172.16.8.61/redirigir.php?url=%s matchear_todo

Why are you blocking based on URL instead of blocking based on the ICAP
scan result? In your earlier specifications, you wanted to
block/redirect only those transactions that were certified virus-free by
your ICAP client. The above matchear_todo ACL does not do that.


> it's always redirecting me without passing the own ICAP system...

Looking at the Squid code, what you describe overall seems impossible --
Squid checks http_reply_access _after_ the RESPMOD transaction, not
before it. Adding http_reply_access cannot disable ICAP scans AFAICT!
Are you sure it has that effect in your use case?


> I
> wanted the redirection to be done only when content is clean... this is
> doing it always... have I missed something?

Your ACL says nothing about "clean". It says "always". How does your
ICAP service mark "clean" (or "dirty") HTTP responses? Your ACL needs to
match that marking (or the absence of that marking).

Alex.



Re: Squid and url modifying

Egoitz Aurrekoetxea

Hi Alex!!


My answers are inline below!! Many, many thanks in advance...



On 2019-03-05 17:45, Alex Rousskov wrote:

On 3/5/19 1:57 AM, Egoitz Aurrekoetxea wrote:

I have Squid configured with the virus scanning software using ICAP and
working. But, when I do :

acl matchear_todo url_regex [-i] ^.*$

FYI: "[-i]" is documentation syntax that means an optional flag called
"-i". If you want to use that "-i" flag, then type

  acl matchear_todo url_regex -i ^.*$

... but keep in mind that "-i" makes no sense when your regular
expression does not contain any letters. Adding "-i"
would not change which URLs such a regular expression matches.

I see... I thought it was for matching case insensitively, some sort of
a /.../i modifier.



http_reply_access deny matchear_todo
deny_info   http://172.16.8.61/redirigir.php?url=%s matchear_todo

Why are you blocking based on URL instead of blocking based on the ICAP
scan result? In your earlier specifications, you wanted to
block/redirect only those transactions that were certified virus-free by
your ICAP client. The above matchear_todo ACL does not do that.
 
 
 
That was an attempt at achieving my goal: redirect requests to a PHP
script which makes the request through a "next Squid" and then returns
one thing or another...

Sorry, that was wrong. I have done tons of tests... At present I don't
really know how to do it. I would be very thankful if you could guide me
on how to do it... Can it be done from the Squid side, or does the ICAP
implementation itself directly return a 3xx answer?
 



it's always redirecting me without passing the own ICAP system...

Looking at the Squid code, what you describe overall seems impossible --
Squid checks http_reply_access _after_ the RESPMOD transaction, not
before it. Adding http_reply_access cannot disable ICAP scans AFAICT!
Are you sure it has that effect in your use case?
 
 
 
 
 
 
It seemed to, yes... I'll try it again.
 
 
 
 



I
wanted the redirection to be done only when content is clean... this is
doing it always... have I missed something?

Your ACL says nothing about "clean". It says "always". How does your
ICAP service mark "clean" (or "dirty") HTTP responses? Your ACL needs to
match that marking (or the absence of that marking).
 
 
 
 
 
 
 
 
 
 
 
 
Could you give me a clue about how I could do that?
 
 
 


Alex.
 
 
Thanks Alex!!!!



Re: Squid and url modifying

Alex Rousskov
On 3/5/19 9:59 AM, Egoitz Aurrekoetxea wrote:

> El 2019-03-05 17:45, Alex Rousskov escribió:
>> On 3/5/19 1:57 AM, Egoitz Aurrekoetxea wrote:
>>
>>> I have Squid configured with the virus scanning software using ICAP and
>>> working. But, when I do :
>>>
>>> acl matchear_todo url_regex [-i] ^.*$

>> FYI: "[-i]" is documentation syntax that means an optional flag called
>> "-i". If you want to use that "-i" flag, then type
>>
>>   acl matchear_todo url_regex -i ^.*$
>>
>> ... but keep in mind that "-i" makes no sense when your regular
>> expression does not contain any letters. Adding "-i"
>> would not change which URLs such a regular expression matches.
  
> I see... I though it was for matching case insensitively...

You thought correctly. The -i flag enables case insensitive matches
indeed, but you are specifying that flag incorrectly (extra square
brackets), and it makes no sense to specify it at all for your specific
regular expression!


>>> http_reply_access deny matchear_todo
>>> deny_info   http://172.16.8.61/redirigir.php?url=%s matchear_todo

>> Why are you blocking based on URL instead of blocking based on the ICAP
>> scan result? In your earlier specifications, you wanted to
>> block/redirect only those transactions that were certified virus-free by
>> your ICAP client. The above matchear_todo ACL does not do that.
  
>> *That was an attempt of achieving my goal. Redirect requests to a php
>> which does the request to a "next Squid" and then return one thing or
>> another....*

Sounds like you are asking about one thing and then testing/discussing
another. Doing so makes helping you more difficult. Focus on making the
simplest use case working first.


>> Is it possible to be done from Squid side?

Probably (as long as your ICAP service can signal clean/dirty status in
a way Squid ACLs can detect). Since you appear to change the
problem/goal, I am not sure what the answer to this question is.


> Or does the own ICAP implementation directly return a 3xx answer?

That works as well. In that case, you do not need deny_info tricks.
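Schematically, an ICAP RESPMOD reply that replaces a clean origin response with a redirect could look like the following. This is an illustrative fragment only, not a complete ICAP exchange; the ISTag value and the Encapsulated byte offsets are placeholders:

```
ICAP/1.0 200 OK
ISTag: "av-scan-1"
Encapsulated: res-hdr=0, null-body=92

HTTP/1.1 301 Moved Permanently
Location: http://www.iou.net/hj.php?ui=9
Content-Length: 0
```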

>> Your ACL says nothing about "clean". It says "always". How does your
>> ICAP service mark "clean" (or "dirty") HTTP responses? Your ACL needs to
>> match that marking (or the absence of that marking).
  
> Could you give me a clue of how could I do it?

I cannot because I do not know what your ICAP service is capable of (and
do not have the time to research that). For example, if your ICAP
service can add an HTTP header to dirty HTTP responses, then you can use
the corresponding Squid ACL to detect the presence of that header in the
adapted response.
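For instance, if the ICAP service added a hypothetical X-Infection-Found header to infected responses (the header name is an assumption; check your ICAP service's documentation), the marking could be matched like this:

```
# Matches adapted responses carrying the (hypothetical) infection marker
acl infected rep_header X-Infection-Found .+
# Block infected content outright; everything else flows through
http_reply_access deny infected
```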

Alex.
  


Re: Squid and url modifying

Egoitz Aurrekoetxea

Hi!,


Thank you so much for all your effort. We finally got it done using a mixed solution: a script plus Squid's currently configured mode :)


I really want to thank you for all your time, because it has been like gold for me :) :)


Bye mates!


El 2019-03-05 18:48, Alex Rousskov escribió:

On 3/5/19 9:59 AM, Egoitz Aurrekoetxea wrote:

El 2019-03-05 17:45, Alex Rousskov escribió:
On 3/5/19 1:57 AM, Egoitz Aurrekoetxea wrote:

I have Squid configured with the virus scanning software using ICAP and
working. But, when I do :

acl matchear_todo url_regex [-i] ^.*$

FYI: "[-i]" is documentation syntax that means an optional flag called
"-i". If you want to use that "-i" flag, then type

  acl matchear_todo url_regex -i ^.*$

... but keep in mind that "-i" makes no sense when you regular
expression does not contain small or capital characters. Adding "-i"
would not change what URLs such a regular expression would match.
  
I see... I though it was for matching case insensitively...

You thought correctly. The -i flag enables case insensitive matches
indeed, but you are specifying that flag incorrectly (extra square
brackets), and it makes no sense to specify it at all for your specific
regular expression!


http_reply_access deny matchear_todo
deny_info   <a href="http://172.16.8.61/redirigir.php?url=%s" target="_blank" rel="noopener noreferrer">http://172.16.8.61/redirigir.php?url=%s matchear_todo

Why are you blocking based on URL instead of blocking based on the ICAP
scan result? In your earlier specifications, you wanted to
block/redirect only those transactions that were certified virus-free by
your ICAP client. The above matchear_todo ACL does not do that.
  
*That was an attempt of achieving my goal. Redirect requests to a php
which does the request to a "next Squid" and then return one thing or
another....*

Sounds like you are asking about one thing and then testing/discussing
another. Doing so makes helping you more difficult. Focus on making the
simplest use case working first.


Is it possible to be done from Squid side?

Probably (as long as your ICAP service can signal clean/dirty status in
a way Squid ACLs can detect). Since you appear to change the
problem/goal, I am not sure what the answer to this question is.


Or does the own ICAP implementation directly return a 3xx answer?

That works as well. In that case, you do not need deny_info tricks.

Your ACL says nothing about "clean". It says "always". How does your
ICAP service mark "clean" (or "dirty") HTTP responses? Your ACL needs to
match that marking (or the absence of that marking).
  
Could you give me a clue of how could I do it?

I cannot because I do not know what your ICAP service is capable of (and
do not have the time to research that). For example, if your ICAP
service can add an HTTP header to dirty HTTP responses, then you can use
the corresponding Squid ACL to detect the presence of that header in the
adapted response.

Alex.
  

El 2019-03-05 08:13, Alex Rousskov escribió:

On 3/4/19 11:20 AM, Egoitz Aurrekoetxea wrote:

Clients, will ask :

https://oooeeee.eeee.ttt.thesquidserver.org/

So the answer [to the second question] I assume should be yes.

If I am interpreting your answers correctly, then your setup looks like
a reverse proxy to me. In that case, you do not need SslBump and
interception. You do need an web server certificate for the
oooeeee.eeee.ttt.thesquidserver.org domain, issued by a well-trusted CA.
Do you already have that?


I have DNAT rules, for being able to
redirect tcp/80 and tcp/443 to squid's port silently.

Please note that your current Squid configuration is not a reverse proxy
configuration. It is an interception configuration. It also lacks
https_port for handling port 443 traffic. There are probably some
documents on the Squid wiki (and/or elsewhere) explaining how to configure
Squid to become a reverse proxy. Follow them.
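
A minimal reverse-proxy (accelerator) sketch, assuming a hypothetical
certificate path and treating oooeeee.eeee.ttt as the origin server
(option spellings are for Squid 4+, where cert= became tls-cert=):

    # Accept TLS for the accelerated domain (cert path is hypothetical).
    https_port 443 accel tls-cert=/etc/squid/thesquidserver.org.pem \
        defaultsite=oooeeee.eeee.ttt.thesquidserver.org

    # Forward accelerated requests to the origin server over TLS.
    cache_peer oooeeee.eeee.ttt parent 443 0 no-query originserver tls name=origin
    cache_peer_access origin allow all

Note the accel flag replaces intercept here: the browser connects to the
proxy knowingly (by hostname), so no DNAT or SslBump is involved.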


I wanted to setup a proxy machine which I wanted to be able to receive
url like :

- www.iou.net.theproxy.com/hj.php?ui=9

If this site returns clean content (scanned by the ICAP server) the URL
redirector should return :

- www.iou.net/hj.php?ui=9 (the real URL)

OK.


- Is it possible with Squid to achieve my goal? With Squid, a
redirector, and an ICAP daemon which performs virus scanning...

A redirector seems out of scope here -- it works on requests while you
want to rewrite (scanned by ICAP) responses.

It is probably possible to use deny_info to respond with a redirect
message. To trigger a deny_info action, you would have to configure your
Squid to block virus-free responses, which is rather strange!


- For plain HTTP the config and the URL seem to be working BUT the
viruses are not being scanned. Could the config be adjusted for that?


I would start by removing the redirector, "intercept", and SslBump, and
disabling ICAP. Configure your Squid as a reverse proxy without any
virus scanning. Then add ICAP. Get the virus scanning working without
any URL manipulation. Once that is done, you can adjust Squid to block
virus-free responses (via http_reply_access) and trigger a deny_info
response containing an HTTP redirect.
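
Putting the later steps together, a hedged sketch (the ICAP service URI
and the "clean" marker header are assumptions about your ICAP service,
not fixed names) could look like:

    # Add response scanning via ICAP (service URI is hypothetical).
    icap_enable on
    icap_service avscan respmod_precache icap://127.0.0.1:1344/avscan
    adaptation_access avscan allow all

    # Block virus-free replies and turn the denial into a redirect.
    # Assumes the ICAP service marks clean replies with
    # "X-Virus-Status: clean" (an assumption).
    acl clean_reply rep_header X-Virus-Status -i clean
    deny_info 301:https://oooeeee.eeee.ttt/ clean_reply
    http_reply_access deny clean_reply

The deny_info "status:URL" form makes Squid answer the denied request
with that redirect status instead of an error page.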


Please note that once the browser gets a redirect to another site, that
browser is not going to revisit your reverse proxy for any content
related to that other site -- all requests for that other site will go
from the browser to that other site. Your proxy will not be in the loop
anymore. If that is not what you want, then you cannot use redirects at
all -- you would have to accelerate that other site for all requests
instead and make sure that other site does not contain absolute URLs
pointing the browser away from your reverse proxy.


Disclaimer: I have not tested the above ideas and, again, I may be
misinterpreting what you really want to achieve.

Alex.
_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users
