Adding a field containing the log level?

In our project we use Logback with the following pattern:
<pattern>%date{ISO8601} [%thread] %-5level %logger{25} - %message%n</pattern>
I.e. all log messages have a format like this:

2013-03-10 10:14:16,458 [default-dispatcher-21] ERROR - __SOME_DETAILS__
or
2015-11-04 15:15:26,458 [event-thread] INFO - __OTHER_DETAILS__

I'd like to provide some kind of filtering depending on the log level. The obvious idea that first came to mind was to just add

fields:
   level: debug # <--- how do I know the level of each message?

or something like this.

The problem is that it's not clear how to extract the log level from the message. Is it even possible? I can specify multiline.pattern, but how do I extract the level from it?

Filebeat doesn't do this kind of parsing, but Logstash does. Even better is to configure Logback to emit JSON logs so that nobody has to do any parsing (although JSON support is only available in the upcoming Filebeat 5).
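For reference, a minimal Logstash grok filter for lines in the sample format shown above might look like the following sketch (the field names timestamp, thread, level, and log_message are illustrative, not prescribed):

```
filter {
  grok {
    # Matches lines such as:
    #   2015-11-04 15:15:26,458 [event-thread] INFO - __OTHER_DETAILS__
    # and stores the captured level in the "level" field.
    match => {
      "message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:thread}\] %{LOGLEVEL:level} - %{GREEDYDATA:log_message}"
    }
  }
}
```

The %{LOGLEVEL} pattern matches the standard level names (DEBUG, INFO, WARN, ERROR, etc.), so the resulting level field can be indexed and filtered on directly.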

If you want to filter early, Filebeat has exclude_lines/include_lines options. A possible pattern is '\[.+\] INFO - ' to capture INFO-level messages. This is pre-filtering support to reduce network usage; Logstash provides additional parsing and filtering capabilities on top of that.
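As a sketch, an include_lines filter in filebeat.yml (1.x-style prospector config; the path is illustrative) could look like this:

```yaml
filebeat:
  prospectors:
    - paths:
        - /var/log/myapp/*.log   # illustrative path, adjust to your setup
      # Ship only lines whose level is ERROR or WARN; the regex is
      # matched against the raw log line before it is sent.
      include_lines: ['\[.+\] (ERROR|WARN) - ']
```

Note that this only drops or keeps whole lines; it does not extract the level into a separate field.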

This does not quite fit. I wanted to store an additional log-level field in Elasticsearch in order to facilitate indexing and filtering with Logstash. This was the reason I wanted to do the parsing on Filebeat's side.

Your flow is Filebeat -> Elasticsearch -> Logstash? That doesn't appear to make sense. What comes after Logstash?

Filebeat --> Graylog --> Elasticsearch, but as far as I understand, Graylog does pretty much the same as Logstash.

Okay. So what's the problem with parsing things in Graylog and sending the results to ES?

This topic was automatically closed after 21 days. New replies are no longer allowed.