Log alerting for different applications

Hello,

I'm a bit confused about the way alerting works in the Elastic Stack.
I have multiple small applications that generate logs, and I also have to manage multiple larger applications that each generate several different log files.
I want to introduce something that tells me if something goes wrong, what went wrong, and where to look. I heard that Elasticsearch has many ways to deal with logs, so I'm trying to set up log monitoring for my purpose, but I don't understand how it is intended to work.

So for the small applications I want to receive a notification on every single error, containing the error message. They do stuff like file processing and only generate errors if an input file is corrupt, which doesn't happen often, but if it happens I have to look.

The bigger applications, on the other hand, can generate all kinds of errors, and I want to define some rules that allow me to control which kinds of errors generate a notification.

I use Filebeat to ship all logs to Elasticsearch.

So now my questions:

  1. On Observability > Logs > Stream there is a list with all the logs. But how can I view only the logs of one application? I can set a filter on log.file.path, but obviously there is more than one log file per application, and it is not even possible to use a regex to match all log files of one application.

  2. What can I do so that an email is sent on every error log of a certain application? I don't see a way to do it. I tried using the log threshold rule, but this only does something once a certain number of error messages is reached. So I set it to "is more than or equal to 1", but there is a time window and such. So if two errors happen in a short time frame, only one alert fires and thus only one email is sent. Also, I want to include the error message in the email, but in the connector the {{message}} is empty, which makes sense if the alert always fires on thresholds and not on single error messages.
    And again, there is no way to handle all log files of a single application, as log.file.path only has an IS operator that doesn't allow a regex.

So I'm really confused, as I keep hearing that Elasticsearch is a great way of handling logs, but to me it looks like it can't even deal with an application that happens to create more than one log file. What am I missing here?

Normally you will have a field in your document with the app name or any reference to the app that you can use to filter.

You can create these fields directly in Filebeat using custom fields, or by parsing your message in Elasticsearch using an ingest pipeline.
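For example, once every document carries such a field (here a hypothetical fields.app_name), filtering the Logs Stream down to one application is a single KQL query, with no regex on log.file.path needed:

```text
fields.app_name : "myapp" and log.level : "error"
```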

To send emails directly from Kibana you need to use the email connector, which is only available if you have a paid license. Are you running on-premises or on cloud? I'm not sure if this connector is available on cloud without a Platinum or Enterprise license; I do not use Elastic Cloud.

If I'm not wrong, there was an issue to change this behavior and trigger one action per alert; I'm not sure if it has been merged yet. Which version are you on?

This is one of the main issues with Kibana alerting: sometimes you cannot even access the fields you need. There are a couple of issues about this on GitHub, but I'm not sure about their status, because I went back to using a third-party tool for alerting.

As an alternative you can use a third-party tool called ElastAlert2 to trigger your alerts.


Thank you very much for your response. I was ready to give up on using Elastic for this, but now I will give it another shot.

This will already solve a lot. As it is my first time working with Filebeat, I didn't even know there was an add_fields processor I can use. So I assume I have to add it in the filebeat.yml under processors. So if I take the example from the docs:

processors:
  - add_fields:
      target: project
      fields:
        name: myproject
        id: '574734885120952459'

This will add a project field with the name and id to every document that this Filebeat instance creates, right?
But what if one Filebeat instance monitors multiple applications on the same machine?
Can I have one processor per filestream input/path? To me it looks like a global setting for the whole Filebeat instance.
(Do I need multiple filestream inputs, or one filestream input with multiple paths, in order to be able to assign a different name to the documents coming from the different applications?)

Also, I haven't used any ingest pipelines so far. Can you elaborate a bit further on how this would work, in combination with Filebeat?

I use on-prem and don't have a license, so my workaround was using the index connector instead, with a small script that just reads the index and generates the emails. So that's not a problem if the rest works.

I'm currently on 8.7, but I'm ready to upgrade if newer versions solve my problems. Do you have a link to the issue so I can check?

Thank you for the recommendation. I will probably have a look at the tool, but it is kind of sad that the software which is widely known for log processing and handling needs external software to trigger alerts in a sensible way.

So I assume ElastAlert2 can trigger alerts on the message level and not only on a threshold in a time frame?
And can ElastAlert2 send emails by itself, or will it trigger an alert in Elastic which again needs a connector? I assume it can send email, because otherwise you again wouldn't have the message available in the connector, right?

Again, thanks for taking the time.

Yes, if you use add_fields in the processors section of the filebeat.yml, it will apply the field to every document that the Filebeat instance collects.

In this case, if you have multiple filestream inputs in the same Filebeat instance, you can also add custom fields at the input level; check the example in this documentation.

Basically you will need this:

- type: filestream
  id: filestream-unique-id
  paths:
    - "/path/to/your/logs/*"
  fields:
    customField: "value"
  fields_under_root: false

I recommend setting fields_under_root to false, so your custom fields will be under the top-level field named fields, so you will have fields.customField.

If they are different applications and the logs are in different paths I would recommend using one filestream for each application.
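As a rough sketch of what that could look like in filebeat.yml (the paths and app names are made up), each application gets its own filestream input with its own custom field:

```yaml
filebeat.inputs:
  - type: filestream
    id: app-one-logs            # ids must be unique per filestream input
    paths:
      - "/var/log/app-one/*.log"
    fields:
      app_name: "app-one"
  - type: filestream
    id: app-two-logs
    paths:
      - "/var/log/app-two/*.log"
    fields:
      app_name: "app-two"
```

With fields_under_root left at its default of false, these then show up on the documents as fields.app_name, which you can filter and alert on.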

If you want to use ingest pipelines in Elasticsearch to transform data, you can configure it in Filebeat, so Filebeat's requests will tell Elasticsearch to run the documents through this ingest pipeline.
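A minimal sketch of the Filebeat side, assuming you have already created a pipeline named parse-app-logs in Elasticsearch (for example via Stack Management > Ingest Pipelines in Kibana, or the _ingest/pipeline API) that, say, extracts an app name from log.file.path:

```yaml
# filebeat.yml — the pipeline name is a placeholder; the pipeline itself
# must already exist in Elasticsearch before documents are indexed
output.elasticsearch:
  hosts: ["https://localhost:9200"]
  pipeline: "parse-app-logs"
```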

Yeah, that is one alternative. If you want something with more options you may check ElastAlert2; it is a third-party tool that was built back when Elasticsearch didn't have any alerting features.

If I'm not wrong it was implemented in 8.8, I think it is this issue.

Since ElastAlert2 is a third-party tool it is not supported here, but basically you can create your query in it and select which fields you want to send in the alert. You can send emails, requests to webhooks, and use other connectors that Kibana Alerts still does not have.
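As a rough sketch (the index, field names, and addresses are placeholders), a rule that alerts on every single matching error document instead of on a threshold could look like this:

```yaml
# ElastAlert2 rule sketch — field names assume a custom fields.app_name
# added by Filebeat, as discussed above
name: myapp-error-alert
type: any                       # "any": fire on every document the filter matches
index: filebeat-*
filter:
  - query:
      query_string:
        query: 'fields.app_name: "myapp" AND log.level: "error"'
realert:
  minutes: 0                    # do not suppress repeated alerts
alert:
  - email
email:
  - "you@example.com"
alert_subject: "Error in myapp"
alert_text: "Error message: {0}"
alert_text_type: alert_text_only
alert_text_args:
  - message                     # include the log message field in the email body
```

The "any" rule type together with realert set to 0 minutes means two errors in quick succession produce two separate emails, each containing its own message field.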


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.