Collecting squid logs to DB


Collecting squid logs to DB

Alex K
Hi all,

I had a previous setup on Debian 7 with squid, where I was using mysar to collect the squid logs, store them in a DB, and produce a browsing report at the end of the day.
Now, on Debian 9, while trying to upgrade the whole setup, I see that mysar does not compile.

Checking around I found mysar-ng, but that has compilation issues on Debian 9 as well.
Can you suggest any tool that does this job? Does squid support logging to a DB natively? (I am using MySQL/MariaDB.)

Some other tool I stumbled on is https://github.com/paranormal/blooper.

Thanx a bunch,
Alex

_______________________________________________
squid-users mailing list
[hidden email]
http://lists.squid-cache.org/listinfo/squid-users

Re: Collecting squid logs to DB

Amos Jeffries
Administrator
On 05/05/18 10:20, Alex K wrote:

> Do you suggest any tool that does this job? Does squid support logging
> to DB natively? (I am using mysql/mariadb)
>

Squid-3 comes with the log_db_daemon helper, which stores log entries to an SQL database in realtime. You still need something else to do the analysis of that data.
 <http://www.squid-cache.org/Versions/v3/3.5/manuals/log_db_daemon.html>
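
For reference, a minimal configuration wiring up that helper might look like the following (the /usr/lib/squid/ path is where the Debian package installs helpers, and the daemon: path is read by log_db_daemon as /host/database/table/username/password; adjust all of these to your setup):

logfile_daemon /usr/lib/squid/log_db_daemon
access_log daemon:/127.0.0.1/squid_log/access_log/squid/squid squid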


> Some other tool I stumbled on is https://github.com/paranormal/blooper.

Blooper is a fork of the logmysqldaemon (aka log_db_daemon) re-written
in ruby instead of native C/C++ code.

Amos

Re: Collecting squid logs to DB

Amos Jeffries
Administrator
On 05/05/18 17:19, Amos Jeffries wrote:

> Blooper is a fork of the logmysqldaemon (aka log_db_daemon) re-written
> in ruby instead of native C/C++ code.

Sorry, that should have been Ruby instead of Perl.

Amos

Re: Collecting squid logs to DB

Michael Pelletier

On Sat, May 5, 2018 at 2:25 AM, Amos Jeffries <[hidden email]> wrote:
> [...]

Re: Collecting squid logs to DB

Eliezer Croitoru
In reply to this post by Alex K

Hey Alex,

 

How did you use to log into the DB? What configuration lines have you used?

Also, what log format have you used?

Is it important to have realtime data in the DB, or is periodic parsing also an option?

 

Eliezer

 

----

Eliezer Croitoru
Linux System Administrator
Mobile: +972-5-28704261
Email: [hidden email]

 


Re: Collecting squid logs to DB

Alex K
+++ Including list +++

Hi Eliezer,

I have used the following lines to instruct squid to log to MariaDB:

logfile_daemon /usr/lib/squid/log_db_daemon
access_log daemon:/127.0.0.1/squid_log/access_log/squid/squid squid


Through testing it seems that sometimes squid is not logging anything; I don't know why. After a restart it seems to unblock and write to the DB again.
The access_log table is currently InnoDB and I am wondering if MyISAM would behave better.

I would prefer to have a real-time access log. My scenario is that when a user disconnects from squid, an aggregated report of the sites that user browsed becomes available under a web portal the user has access to. Usually there will be up to 20 users connected concurrently, so I have to check whether this approach is scalable. If it is not stable, I might go with log parsing instead (perhaps logstash or some custom parser), generating an aggregated report once per hour or day.
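
The aggregated per-user report described above can be a single pass over the logged rows. A rough sketch in Python, with sqlite3 standing in for MySQL/MariaDB (the access_log table and its username/url columns here are illustrative, not the exact log_db_daemon schema):

```python
import sqlite3
from urllib.parse import urlsplit

def browsing_report(db, username):
    """Return (site, request_count) pairs for one user, most visited first."""
    rows = db.execute(
        "SELECT url FROM access_log WHERE username = ?", (username,)
    ).fetchall()
    counts = {}
    for (url,) in rows:
        # CONNECT requests log "host:port" rather than a full URL,
        # so fall back to the raw value when there is no hostname.
        site = urlsplit(url).hostname or url
        counts[site] = counts.get(site, 0) + 1
    return sorted(counts.items(), key=lambda kv: (-kv[1], kv[0]))
```

Running something like this once per user at disconnect time would keep the portal query cheap even with all 20 users active.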

Is there a way to format the log so that only some interesting fields are piped to the DB, in order to lessen the stress on it?



Re: Collecting squid logs to DB

Eliezer Croitoru

I have a daemon written in Ruby and one in GoLang which can do a better job.

Specifically for your scenario, I think the better option is to use a TCP server, as described in:

https://wiki.squid-cache.org/Features/LogModules#Module:_TCP_Receiver

 

Ubuntu and Debian use systemd, so you can spawn a log daemon which is not directly tied to squid.

MyISAM vs. InnoDB is not really a question: InnoDB is the only relevant choice, for a couple of reasons from my experience.

I will try to update you here with a script that you might be able to use for real-time logging.

 

You should be able to use the following line for your purpose:

access_log tcp://127.0.0.1:5000 squid
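
Squid's TCP log module simply writes access-log lines over a plain TCP connection, so the receiving daemon can be quite small. A minimal sketch of such a receiver in Python (sqlite3 stands in for MySQL/MariaDB, and the newline-terminated-record framing and single-column table layout are assumptions for illustration, not Eliezer's implementation):

```python
import socket
import sqlite3

def make_server(host, port):
    """Bind a listening socket; port 0 asks the OS for a free port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    return srv

def accept_and_log(srv, db_path):
    """Accept one connection (squid keeps a single logging socket open)
    and insert each newline-terminated record as it arrives."""
    db = sqlite3.connect(db_path)
    db.execute("CREATE TABLE IF NOT EXISTS access_log (line TEXT)")
    conn, _ = srv.accept()
    buf = b""
    with conn:
        while True:
            chunk = conn.recv(4096)
            if not chunk:  # peer closed the connection
                break
            buf += chunk
            while b"\n" in buf:
                line, buf = buf.split(b"\n", 1)
                db.execute("INSERT INTO access_log (line) VALUES (?)",
                           (line.decode("utf-8", "replace"),))
                db.commit()  # commit per record for real-time visibility
    db.close()
    srv.close()
```

Started on port 5000, the access_log tcp://127.0.0.1:5000 squid line above would feed it directly.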

 

Eliezer

 


Re: Collecting squid logs to DB

Eliezer Croitoru
In reply to this post by Alex K

Hey Alex,

 

I wrote a simple service that now runs locally here on my server.

The sources and installation instructions can be seen here:

http://gogs.ngtech.co.il/elicro/squid-sql-logger

 

Let me know if it works for you.

 

Eliezer

- I will try to write a daemon in GoLang, which might be better for some systems.

 


Re: Collecting squid logs to DB

Amos Jeffries
Administrator
In reply to this post by Alex K
On 13/05/18 10:55, Alex K wrote:
>
> Is there a way I format the log and pipe to DB only some interesting
> fields in order to lessen the stress to DB?
>

You can use the logformat directive to define a format of your choice
and log that instead of the default Squid format.
 <http://www.squid-cache.org/Doc/config/logformat/>
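
For example, a trimmed-down format keeping only the fields needed for a per-user browsing report might look like this (the format name "brief" and the chosen % codes are only an illustration; note that log_db_daemon is written to parse squid's native format, so a custom format pairs better with a custom receiver such as the TCP module):

logformat brief %ts.%03tu %>a %[un %rm %ru %>Hs
access_log tcp://127.0.0.1:5000 brief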

Amos

Re: Collecting squid logs to DB

Eliezer Croitoru
In reply to this post by Alex K

To lessen the stress on the DB you can use a custom format, as Amos suggested, but...

I think that once you define exactly what you want to log, you will get what you need.

The general squid access log is pretty loose, and I believe that with today's hardware the difference will only be seen on systems handling thousands or millions of client requests.

For a small site it is not required.

 

All The Bests,

Eliezer

 


Re: Collecting squid logs to DB

Alex K
Thanx Eliezer and Amos for the feedback. I just saw the logformat directive and will experiment with it.
Yes, it is a small group of users (up to 30-40 devices), but the hardware is a relatively small appliance (4 GB RAM, 4 cores at 2 GHz, SSD).

Alex



Re: Collecting squid logs to DB

Amos Jeffries
Administrator
On 13/05/18 23:22, Alex K wrote:
> Thanx Eliezer and Amos for the feedback. I just saw the logformat
> directive and will experiment with that.
> Yes, I have a small group of users (up to 30 - 40 devices) but the
> hardware is a relatively small appliance (4G RAM, 4 cores 2GHz, SSD).
>

That should be more than enough for Squid.

Amos

Re: Collecting squid logs to DB

Eliezer Croitoru
In reply to this post by Alex K

I updated the repo: http://gogs.ngtech.co.il/elicro/squid-sql-logger

 

The additions are:

- GoLang mysql logging service source code.
- Static pre-compiled binaries for (linux, windows, all bsd, Darwin, linux_arm..., linux_mips...).
- Installation instructions for the pre-compiled binaries for the linux_amd64 version.

Tested to work under very high load.

There is one improvement I had in mind but I will add it later on.


All The Bests,

Eliezer

 
