TCP_MISS_ABORTED/503 - -Squid-Error: ERR_DNS_FAIL 0


L A Walsh
Pulled this out of my log.  When downloading lots of files through squid,
the download aborts after about 3k files.  This is the first time I've seen
an associated ERR_DNS_FAIL -- is that a message from squid's internal
resolver?

1566304848.234      1 192.168.3.1 TCP_MISS_ABORTED/503 4185 GET
http://download.opensuse.org/tumbleweed/repo/src-oss/src/gdouros-aegean-fonts-9.78-1.6.src.rpm
- HIER_NONE/- text/html [Referer:
http://download.opensuse.org/tumbleweed/repo/src-oss/src/\r\nUser-Agent:
openSuSE_Client\r\nAccept: */*\r\nAccept-Encoding:
identity\r\nConnection: Keep-Alive\r\nProxy-Connection:
Keep-Alive\r\nHost: download.opensuse.org\r\n] [HTTP/1.1 503 Service
Unavailable\r\nServer: squid/4.0.25\r\nMime-Version: 1.0\r\nDate: Tue,
20 Aug 2019 12:40:48 GMT\r\nContent-Type:
text/html;charset=utf-8\r\nContent-Length: 4163\r\nX-Squid-Error:
ERR_DNS_FAIL 0\r\nContent-Language: en\r\n\r]

One thing -- all the files were being downloaded by a single copy of wget,
so it was a long-lived connection.

If I don't go through the proxy, it does download, but it's hard to see
why DNS would have a problem: only 4 hosts are accessed in the download:


   1027 http://download.opensuse.org
      2 http://ftp1.nluug.nl
   2030 http://mirror.sfo12.us.leaseweb.net
     14 http://mirrors.acm.wpi.edu

3073...

    Looks like it's exactly 3k lookups and death on 3k+1
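(For reference, per-host tallies like the ones above can be produced from a
list of requested URLs with a short script -- a sketch, assuming the URLs
are available one per line; the sample URLs below are illustrative, not
entries from the actual run:)

```python
# Sketch: tally requests per scheme://host from a list of fetched URLs.
from collections import Counter
from urllib.parse import urlsplit

def count_hosts(urls):
    """Count requests per scheme://host, like the tallies above."""
    counts = Counter()
    for url in urls:
        parts = urlsplit(url)
        counts[parts.scheme + "://" + parts.hostname] += 1
    return counts

# Illustrative sample, not real log data:
urls = [
    "http://download.opensuse.org/tumbleweed/repo/oss/x86_64/a.rpm",
    "http://download.opensuse.org/tumbleweed/repo/oss/x86_64/b.rpm",
    "http://mirror.sfo12.us.leaseweb.net/opensuse/a.rpm",
]
for host, n in count_hosts(urls).most_common():
    print("%7d %s" % (n, host))
```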

ring any bells?

Thanks




_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users
Re: TCP_MISS_ABORTED/503 - -Squid-Error: ERR_DNS_FAIL 0

Eliza
Hi

on 2019/8/21 11:51, L A Walsh wrote:
> Pulled this out of my log.  Downloading lots of files through squid has
> the download aborting after about 3k files.  This is the first I've seen
> that there's also an associated ERR_DNS_FAIL -- is that a message from
> squid's internal resolver?

Squid may have cached stale DNS entries. Can you restart squid and give
it another try?

regards.

Re: TCP_MISS_ABORTED/503 - -Squid-Error: ERR_DNS_FAIL 0

L A Walsh
On 2019/08/20 20:56, Eliza wrote:

> Hi
>
> on 2019/8/21 11:51, L A Walsh wrote:
>  
>> Pulled this out of my log.  Downloading lots of files through squid has
>> the download aborting after about 3k files.  This is the first I've seen
>> that there's also an associated ERR_DNS_FAIL -- is that a message from
>> squid's internal resolver?
>>    
>
> Squid may cache the wrong DNS hosts. can you restart squid to give
> another try?
>  
This wasn't just 1 time; it seems to happen each time I try to download
a distro update through squid -- and squid has been restarted multiple
times over that period.  It seems as though I have set something somewhere
to a 3k size.

I had 1k set for the FQDN cache in the squid.conf dns section, but that
value seems reasonable.
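(The relevant DNS-cache directives in squid.conf are these -- the values
shown are illustrative examples, not my actual settings; check
squid.conf.documented for your version's defaults:)

```
# Illustrative squid.conf DNS-cache settings (example values only)
ipcache_size 2048          # max entries in the name -> IP cache
fqdncache_size 2048        # max entries in the IP -> name cache
positive_dns_ttl 6 hours   # upper bound on caching successful lookups
negative_dns_ttl 1 minute  # how long a failed lookup is remembered
dns_timeout 30 seconds     # overall time before a lookup is declared failed
```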

This may not be a problem in squid at all -- it could be DNS, or my config
(which has been the same for months).  Squid is just where I'm seeing the
evidence of an error; the cause could easily be someplace else.


Was sorta hoping that someone had seen this symptom and had a clue, but
otherwise, I'll try other things.



Re: TCP_MISS_ABORTED/503 - -Squid-Error: ERR_DNS_FAIL 0

Amos Jeffries
On 21/08/19 3:51 pm, L A Walsh wrote:
> Pulled this out of my log.  Downloading lots of files through squid has
> the download aborting after about 3k files.  This is the first I've seen
> that there's also an associated ERR_DNS_FAIL -- is that a message from
> squid's internal resolver?

Indirectly. It will be the result of a DNS failure on the domain in that
particular transaction's URL.

You mentioned the ABORTED was repeatable. If this DNS_FAIL was not
always present, what was?


>
> 1566304848.234      1 192.168.3.1 TCP_MISS_ABORTED/503 4185 GET
> http://download.
> opensuse.org/tumbleweed/repo/src-oss/src/gdouros-aegean-fonts-9.78-1.6.src.rpm
> - HIER_NONE/- text/html [Referer:
> http://download.opensuse.org/tumbleweed/repo/src-oss/src/\r\nUser-Agent:
> openSuSE_Client\r\nAccept: */*\r\nAccept-Encoding:
> identity\r\nConnection: Keep-Alive\r\nProxy-Connection:
> Keep-Alive\r\nHost: download.opensuse.org\r\n] [HTTP/1.1 503 Service
> Unavailable\r\nServer: squid/4.0.25\r\nMime-Version: 1.0\r\nDate: Tue,

Please upgrade your Squid. That is a beta release. Squid-4 has been in
stable/production releases for over a year now.


> 20 Aug 2019 12:40:48 GMT\r\nContent-Type:
> text/html;charset=utf-8\r\nContent-Length: 4163\r\nX-Squid-Error:
> ERR_DNS_FAIL 0\r\nContent-Language: en\r\n\r]
>
> One thing -- all the files were being downloaded in 1 copy of wget, so
> it was
> a long connection.

This makes me suspect the timing may be important. Check whether these 3k
transactions all terminated the same amount of time after their TCP
connection was opened by the client.  If the timings are identical, or
very close to it, look for something that may be timing out (it could be
Squid, the TCP stack, or any router along the way).
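A sketch of that check, assuming squid's default native access.log field
order (epoch timestamp, elapsed ms, client IP, ...); grouping by client IP
and the sample lines below are assumptions for illustration:

```python
# Sketch: per client IP, how many seconds after its first logged request
# did its last one occur?  Assumes squid's native access.log field order:
# timestamp elapsed client code/status bytes method URL ...
def elapsed_since_first(lines):
    first = {}
    last = {}
    for line in lines:
        fields = line.split()
        ts, client = float(fields[0]), fields[2]
        first.setdefault(client, ts)
        last[client] = ts
    return {c: last[c] - first[c] for c in first}

# Two illustrative log lines (not real entries):
sample = [
    "1566300000.000 1 192.168.3.1 TCP_MISS/200 500 GET http://example.test/a - HIER_DIRECT/- -",
    "1566304848.234 1 192.168.3.1 TCP_MISS_ABORTED/503 4185 GET http://example.test/b - HIER_NONE/- text/html",
]
print(elapsed_since_first(sample))  # seconds from first to last request
```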


>
> If I don't go through the proxy, it does download, but its hard to see
> why DNS
> would ahve a problem only 4 hosts are accessed in the download:
>

The hostname needs to be looked up on every request (HTTP being
stateless). The Squid internal resolver does cache names for the
DNS-provided TTL; when that expires the name may need to be re-resolved,
and it can hit a failure at that point.
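The caching behaviour described above can be sketched like this (a
simplified illustration of TTL-based name caching, not Squid's actual
code):

```python
import time

class TtlCache:
    """Toy TTL cache: entries are served until their TTL expires, after
    which the next lookup must go back to DNS and may fail there."""

    def __init__(self, clock=time.monotonic):
        self._clock = clock
        self._entries = {}          # name -> (expiry, addresses)

    def get(self, name):
        entry = self._entries.get(name)
        if entry and entry[0] > self._clock():
            return entry[1]         # still fresh, served from cache
        return None                 # expired or absent: must re-resolve

    def put(self, name, addresses, ttl):
        self._entries[name] = (self._clock() + ttl, addresses)
```

With a fake clock you can see the expiry behaviour: a name stored with a
300-second TTL is returned until the clock passes 300, then `get` returns
None and a fresh (possibly failing) lookup would be needed.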


>
>    1027 http://download.opensuse.org
>       2 http://ftp1.nluug.nl
>    2030 http://mirror.sfo12.us.leaseweb.net
>      14 http://mirrors.acm.wpi.edu
>
> 3073...
>
>     Looks like it's exactly 3k lookups and death on 3k+1
>
> ring any bells?

There is nothing in Squid that is tied to a particular number of
transactions. An infinite number of HTTP/1.x requests may be processed
on a TCP connection.

It is more likely to be timing or bandwidth related -- some network-level
thing. The error message "page content" Squid produces on that last
request each time would be useful info.

Amos

Re: TCP_MISS_ABORTED/503 - -Squid-Error: ERR_DNS_FAIL 0

L A Walsh
On 2019/08/21 04:41, Amos Jeffries wrote:

> On 21/08/19 3:51 pm, L A Walsh wrote:
>  
>> Pulled this out of my log.  Downloading lots of files through squid has
>> the download aborting after about 3k files.  This is the first I've seen
>> that there's also an associated ERR_DNS_FAIL -- is that a message from
>> squid's internal resolver?
>>    
>
> Indirectly. It will be the result of DNS failure on the domain in that
> particular transactions URL.
>
> You mentioned the ABORTED was repeatable. If this DNS_FAIL was not
> always present, what was?
>  
----
    Well, not several months ago, but over the past 1-2 it seems like it
has been.


>
>  
>> 1566304848.234      1 192.168.3.1 TCP_MISS_ABORTED/503 4185 GET
>> http://download.
>> opensuse.org/tumbleweed/repo/src-oss/src/gdouros-aegean-fonts-9.78-1.6.src.rpm
>> - HIER_NONE/- text/html [Referer:
>> http://download.opensuse.org/tumbleweed/repo/src-oss/src/\r\nUser-Agent:
>> openSuSE_Client\r\nAccept: */*\r\nAccept-Encoding:
>> identity\r\nConnection: Keep-Alive\r\nProxy-Connection:
>> Keep-Alive\r\nHost: download.opensuse.org\r\n] [HTTP/1.1 503 Service
>> Unavailable\r\nServer: squid/4.0.25\r\nMime-Version: 1.0\r\nDate: Tue,
>>    
>
> Please upgrade your Squid. That is a beta release. Squid-4 has been in
> stable/production releases for over a year now.
>  
----
    Yeah, just put together 4.8 and need to install it.  Somehow I'm
finding it hard to believe the problem is in squid; it's just the piece
that is bearing the "bad news" -- it's probably pointing at something else.



>
>  
>> 20 Aug 2019 12:40:48 GMT\r\nContent-Type:
>> text/html;charset=utf-8\r\nContent-Length: 4163\r\nX-Squid-Error:
>> ERR_DNS_FAIL 0\r\nContent-Language: en\r\n\r]
>>
>> One thing -- all the files were being downloaded in 1 copy of wget, so
>> it was
>> a long connection.
>>    
>
> This makes me suspect the timing may be important. Check if these 3k
> transactions all terminated the same amount of time after their TCP
> connection was opened by the client.
>  If they are or very close to similar timing before the end. Look for
> somewhere that may be timing out (could be Squid, or the TCP stack or
> any routers along the way.
>  
----
    Don't see anything in the named logs either, but it may be that it is
not recording these types of events, and my nsswitch.conf for hosts does a
fallback:

hosts:  files  dns wins

but wins isn't recording very much detail.


>
>  
>> If I don't go through the proxy, it does download, but its hard to see
>> why DNS
>> would ahve a problem only 4 hosts are accessed in the download:
>>
>>    
>
> The hostname needs to be looked up on every request (HTTP being
> stateless). The Squid internal resolver does cache names for the DNS
> provided TTL, when that expires they may need to be re-resolved and hit
> a failure then.
>
>
>  
>>    1027 http://download.opensuse.org
>>       2 http://ftp1.nluug.nl
>>    2030 http://mirror.sfo12.us.leaseweb.net
>>      14 http://mirrors.acm.wpi.edu
>>
>> 3073...
>>
>>     Looks like it's exactly 3k lookups and death on 3k+1
>>
>> ring any bells?
>>    
>
> There is nothing in Squid that is tied to a particular number of
> transactions. An infinite number of HTTP/1.x requests may be processed
> on a TCP connection.
>
> It is more likely to be timing or bandwidth related. Some network level
> thing. The error message "page content" Squid produces on that last
> request each time would be useful info.
>  

---
    Everything was being fetched through wget, so no error page exactly,
but here is the boundary in the log between the last working request and
the first non-working one:
--2019-08-20 05:40:31--
http://download.opensuse.org/tumbleweed/repo/oss/x86_64/cmus-plugins-all-2.8.0~20190219.ge27e813-1.5.x86_64.rpm
Reusing existing connection to ishtar.sc.tlinx.org:8080.
Proxy request sent, awaiting response... 302 Found
Location:
http://mirror.sfo12.us.leaseweb.net/opensuse/tumbleweed/repo/oss/x86_64/cmus-plugins-all-2.8.0~20190219.ge27e813-1.5.x86_64.rpm
[following]
--2019-08-20 05:40:34--
http://mirror.sfo12.us.leaseweb.net/opensuse/tumbleweed/repo/oss/x86_64/cmus-plugins-all-2.8.0~20190219.ge27e813-1.5.x86_64.rpm
Reusing existing connection to ishtar.sc.tlinx.org:8080.
Proxy request sent, awaiting response... 304 Not Modified
File
‘tumbleweed/repo/oss/x86_64/cmus-plugins-all-2.8.0~20190219.ge27e813-1.5.x86_64.rpm’
not modified on server. Omitting download.

--2019-08-20 05:40:35--
http://download.opensuse.org/tumbleweed/repo/oss/x86_64/cni-0.7.1-1.1.x86_64.rpm
Reusing existing connection to ishtar.sc.tlinx.org:8080.
Proxy request sent, awaiting response... 503 Service Unavailable
2019-08-20 05:40:45 ERROR 503: Service Unavailable.
------------------------------------------------------

So there were 3072 requests where my connection to the proxy was being
"reused", and most of the requests were not modified from a previous run;
I saw a few that actually downloaded, but they were in the minority.

Then a 503 -- and that answer fills the rest of the log.  It's an
alphabetical download, only into the 'c's.


Well, I need to try the new squid binary and see if it does anything
different.

If I find anything out, will post back.
Thanks all for responses.




