Filebeat 1.2.0 multiline


I'm new to ELK, just learning the basics.

Logstash (multiline implemented) => Elasticsearch => Kibana. This is working for me.
However, I want to start using Filebeat => Logstash => Elasticsearch => Kibana.

Now I've run into trouble using the multiline feature of Filebeat.

As far as I understand, it doesn't support grok (which I used in Logstash).
Is this planned to be supported in the near future?
I hope it comes really soon; right now I'm just doing the log-level multiline, and I also have multiple time formats to implement.

So I copied the complete pattern (LOGLEVEL in this case) into the Filebeat configuration.
I took the pattern from:


Filebeat configuration:

```yaml
- "/var/log/xx/*/default.log"
  input_type: log
  document_type: equinox
  pattern: [Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?
  negate: true
  match: after
```

This is not resulting in any hits on my log files.
So, short question: what am I doing wrong? :slight_smile:

If more information is needed, please ask.

Kind regards.

No, Beats will only do basic filtering. Logstash will still be needed for grokking etc.

I don't think Filebeat will support grok-like regexes in the near future.

Can you post some sample logs of yours? I don't really get your use case or why you need such a complicated pattern.

Filebeat uses the Go regex engine, which is quite different from Ruby/Perl regexes. Just copying regexes over will often not work.

Thanks for the answers, Mark and Steffen.

You are probably right: in most cases a basic pattern will do (like LOGLEVEL).
It just makes life easier if there are predefined regex patterns.
My knowledge of regex is still too limited :slight_smile: and I have to debug a lot to get it right.

Some examples of log lines that I find harder to handle:

```
Apr 4, 2016 6:41:50 AM org.apache.karaf.main.SimpleFileLock lock
2016-04-04 06:42:14,323 | WARN  | Event Dispatcher | ResourceFinder                   | example message
```

Very useful tip; I did not know a different regex engine was used. I will google the golang regex syntax.

I prepared a script to run regex tests (just press the Run button).

Replace the pattern, negate and content values when testing your regexes. Check out the docs on regex support.

Thanks for your answer and the helpful example. I will give it a try and create my own regex :slight_smile:

Back again.

I tried the following. 'Run Script' works fine with the example code, I think,
but not with the Filebeat configuration:

```yaml
   # List of prospectors to fetch data.
             - "/tmp/default.log"
          input_type: log
          document_type: equinox
             pattern: '^[[:alpha:]]+[[:space:]]+[[:digit:]]{4}\-[[:digit:]]{2}\-[[:digit:]]{2}|^$'
             negate: true
             match: before
```

All lines become separate events??

Use three backticks to format code, not `>`. I cannot tell if the indentation is correct or not.

Are you sure about `match: before`?
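Also, if I remember the 1.x docs correctly, the multiline settings belong under a `multiline:` key inside the prospector rather than at the top level. Roughly like this (a sketch from memory, with your path and pattern from above):

```yaml
filebeat:
  prospectors:
    - paths:
        - "/tmp/default.log"
      input_type: log
      document_type: equinox
      multiline:
        pattern: '^[[:alpha:]]+[[:space:]]+[[:digit:]]{4}\-[[:digit:]]{2}\-[[:digit:]]{2}|^$'
        negate: true
        match: before
```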

I noticed log levels start with upper-case letters, while Java exceptions normally start with the full class path (lower case). This pattern is working for me: `^[A-Z]`

Your pattern may work if you remove the backslashes (`\`). Escape characters are a little tricky at times, as the YAML parser has its own rules for string parsing, and the parsed string is then interpreted (compiled) by the regex engine. This makes writing regexes difficult at times.

If you want to be stricter in your pattern, try `^[DIWEC]` to capture the log levels 'debug', 'info', 'warn', 'error', 'critical'.

Tried both. I need after.

I'm not sure this is always the case (upper case); I'm not the owner of the log file.

I don't know why, but when I remove the last part, |^$, it works OK with Filebeat.

Maybe due to negate? Why did you include the empty-line check? Do you have empty lines in your log, or did you want 'clean' output in the test script?

The empty-line check is from your example ;). I actually don't need it, so I removed it.