Elasticsearch as Logwatch

Hello,

We currently use Logwatch in our server farm to get a daily report of all
the logs from all of our machines.
The problem is that receiving hundreds of emails every day makes it
difficult to go through them all.

We were looking for something similar to Splunk that would let us easily
query all the collected logs and inform us daily, for example, about who
failed to log in more than 3 times on our machines, or how many packets
were rejected by the firewall and from and to which ports, etc. (I hope
that gives an idea of what we need).
As I said, Splunk was really good at querying that kind of data, but it's
too expensive and it actually does more than we need (we don't need graphs
or a GUI to check the data in real time, etc.).

Of course, while searching the web for an alternative, the names
Elasticsearch and Logstash came up.
I really like Logstash, as it helps you filter things out and better
categorize the data (which makes later queries easier), but my problem is
that I have a central Logstash server that receives all the logs from
rsyslog, so I can't separate the different kinds of data, for example sshd
logs from firewall logs, etc.
I also don't like Elasticsearch at all: the query system seems too
complicated even for querying an ordinary document, it gets really tricky
when it comes to doing something like what I described above, and there is
no way to get a report of the queried logs other than writing some script
that parses the JSON and, depending on the kind of log, applies regexes and
rewrites the text into an understandable email report, and then sends it.
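
Something like this rough, untested sketch is the sort of script I mean
(Python with the requests library; the index name, the "user" field and the
addresses are just placeholders for whatever your Logstash setup actually
produces):

#!/usr/bin/env python
# Sketch of the kind of report script described above: query Elasticsearch
# for the last day's sshd failures, count them per user, mail a plain-text
# summary.  Index name, field names and addresses are assumptions.
import json
import smtplib
from email.mime.text import MIMEText

import requests

ES_URL = "http://localhost:9200/logstash-*/_search"

query = {
    "size": 0,
    "query": {
        "filtered": {
            "query": {"match_phrase": {"message": "Failed password"}},
            "filter": {"range": {"@timestamp": {"gte": "now-24h"}}},
        }
    },
    # Only report users with 3 or more failures in the last 24 hours.
    "aggs": {
        "by_user": {"terms": {"field": "user", "min_doc_count": 3, "size": 50}}
    },
}

resp = requests.post(ES_URL, data=json.dumps(query)).json()

lines = ["Failed SSH logins in the last 24h (3 or more attempts):", ""]
for bucket in resp["aggregations"]["by_user"]["buckets"]:
    lines.append("%-20s %d failures" % (bucket["key"], bucket["doc_count"]))

msg = MIMEText("\n".join(lines))
msg["Subject"] = "Daily SSH failure report"
msg["From"] = "reports@example.com"
msg["To"] = "admins@example.com"
smtplib.SMTP("localhost").sendmail(msg["From"], [msg["To"]], msg.as_string())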

Does anybody have any suggestions?

Kibana, a web front end for Elasticsearch, provides a much easier query
method and is part of the ELK stack.

Logstash has inbuilt filters that define specific types of log lines and
lets you tag them, so you can have a pattern for Apache logs that tags them
as such, which allows for easier categorisation.
If you want to know more about the Logstash side of things then it's a good
idea to sign up to that mailing list as well -
https://groups.google.com/forum/?hl=en-GB#!forum/logstash-users

Regards,
Mark Walkom

Infrastructure Engineer
Campaign Monitor
email: markw@campaignmonitor.com
web: www.campaignmonitor.com

Thanks for the answer, but as I said, we don't need Kibana: we don't care
about seeing real-time graphs or the like, we need a daily report system,
which Kibana doesn't provide.

I know about those patterns, but they recognise the source file path, and I
don't have one since I receive everything through rsyslog.
And, for example, when an ssh log line passes through Logstash, I want to
filter the message by extracting from it the user, the email, the time of
the attempt, etc.

There is currently no way to do this natively within ELK; unfortunately,
you would need to develop something yourself.
I do recall some other users discussing this same issue a few months ago
and using a couple of interesting options, so it might be worth having a
look at the list archives to see if that helps.

As for your pattern problem, if the rsyslog output contains the name of the
daemon that generated the log entry, as it should, then it is simple to
categorise the entry.
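
To illustrate the idea (rough Python sketch, not Logstash config): the
program name is right there in the syslog header, so a single pattern is
enough to tag or route each entry. Logstash's stock syslog grok patterns
pull out the same kind of "program" field for you.

# Illustration only: the syslog header names the daemon that wrote the
# line, so one regex is enough to route/tag entries by program.
import re

SYSLOG_RE = re.compile(
    r"^(?P<timestamp>\w{3}\s+\d+\s+\d{2}:\d{2}:\d{2})\s+"
    r"(?P<host>\S+)\s+"
    r"(?P<program>[\w/.-]+)(?:\[(?P<pid>\d+)\])?:\s+"
    r"(?P<message>.*)$"
)

line = ("Sep 25 11:47:30 web01 sshd[1234]: "
        "Failed password for root from 10.0.0.5 port 4242 ssh2")
m = SYSLOG_RE.match(line)
if m:
    print(m.group("program"))  # -> sshd; tag/route on this field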

Regards,
Mark Walkom

Infrastructure Engineer
Campaign Monitor
email: markw@campaignmonitor.com
web: www.campaignmonitor.com

Just to throw it out there, is there a reason you wouldn't take the daily
results from Logwatch and pump those into Elasticsearch? If dealing with
the hundreds of emails is the issue, then that could let you make a query
to show (for example) which users had the most login failures across all
systems each day.

It's also worth mentioning that Kibana can be used to help you develop
complex queries; you just need to build the search you want, and inspect
the element containing it to get the JSON it uses. You can put an "index"
variable into a script, change it daily, and use it in each query (removing
the timestamp limitations, obviously) to run the same set of queries every
day against the results of your Logwatch.
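
As a very rough, untested sketch of that "index variable" idea (Python; the
query file names are placeholders, and logstash-YYYY.MM.dd is just the
default Logstash naming convention, so use whatever daily index your
Logwatch data actually lands in):

# Save the JSON bodies Kibana builds for you into files, strip out the
# @timestamp range filters, and run them every day against that day's index.
import datetime
import json

import requests

QUERIES = ["login_failures.json", "firewall_rejects.json"]  # placeholders

index = "logstash-%s" % datetime.date.today().strftime("%Y.%m.%d")
url = "http://localhost:9200/%s/_search" % index

for path in QUERIES:
    with open(path) as f:
        body = f.read()
    result = requests.post(url, data=body).json()
    print("%s:\n%s" % (path, json.dumps(result, indent=2)))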

Greg,

I started researching exactly what you suggested: pumping the daily
Logwatch results into Elasticsearch. I already have an ELK stack in place
and working. Do you know what is needed to implement this? There is no
Logwatch or email input in Logstash.

Thanks in advance.

I believe you can tell Logwatch to output its reports to a file, which
could then be ingested with Logstash.
Alternatively, Logstash has an imap input that you could use to get the
emails into Elasticsearch.
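
If you would rather skip Logstash for this entirely, a third route is to
index the saved report straight into Elasticsearch yourself. A very rough,
untested Python sketch (the report path and index name are assumptions, and
check the Logwatch man page for the exact flag that writes the report to a
file, I believe it is --save):

# Index the whole daily Logwatch report as one document in a per-day index.
import datetime
import json

import requests

REPORT = "/var/log/logwatch/daily.txt"  # wherever you save the report
index = "logwatch-%s" % datetime.date.today().strftime("%Y.%m.%d")
url = "http://localhost:9200/%s/report/" % index  # POST -> auto-generated id

with open(REPORT) as f:
    doc = {
        "@timestamp": datetime.datetime.utcnow().isoformat(),
        "message": f.read(),
    }

resp = requests.post(url, data=json.dumps(doc))
print(resp.json())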
