Multi-line logs


I'm currently using fluentd/td-agent to ship logs to Elasticsearch and Kibana for visualization.
In Fluentd, when tailing logs, you can specify which line is the first line of a multi-line log entry (for instance, Java stack traces), and then use regular expressions to split the message into fields.
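For reference, a minimal Fluentd `in_tail` multiline sketch of that setup; the path, tag, and field names here are invented for illustration:

```
<source>
  type tail
  path /var/log/app/app.log                # hypothetical log path
  pos_file /var/log/td-agent/app.log.pos
  tag app.java
  format multiline
  # a new event starts with a date; anything else continues the previous one
  format_firstline /^\d{4}-\d{2}-\d{2}/
  # split the merged message into fields
  format1 /^(?<time>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (?<level>\w+) (?<message>.*)/
</source>
```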

But since td-agent currently doesn't handle millisecond timestamps well, I am looking for other options to ship logs to Elasticsearch.

Skimming the Filebeat documentation, the configuration seems fairly easy, but I see no mention of multi-line log files.

We have a lot of Java applications that generate errors split across multiple lines.
I'm wondering how this works with Beats. Is Filebeat enough on its own, or do we need Logstash to handle it? And in that case, how does Beats handle it?

Hi, you can use the multiline codec if you use one port per log file to receive the logs.
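A minimal sketch of an input using the multiline codec; the port number and start-of-event pattern are assumptions for illustration:

```
input {
  tcp {
    port => 5000   # one port per log file, as suggested above
    codec => multiline {
      # lines that do not start with a date belong to the previous event
      pattern => "^\d{4}-\d{2}-\d{2}"
      negate => true
      what => "previous"
    }
  }
}
```

Because the codec merges lines at the input stage, per-event ordering is preserved without restricting Logstash's filter workers.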

You can use the multiline filter, but then you have to run Logstash on a single thread, like I do:

  • launch Logstash with the option -w 1
  • this is my multiline filter, for example:

    multiline {
      pattern => "(^\d+\serror)|(^.+Exception: .+)|(^\s+at .+)|(^\s+... \d+ more)|(^\s*Caused by:.+)|(--- End of inner exception stack trace ---)|(Parameter name: source)"
      what => "previous"
    }

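To illustrate which lines a pattern like the one above treats as continuations, here is a small Python sketch (the sample log lines are invented) that groups lines the way `what => "previous"` would:

```python
import re

# Continuation pattern for Java stack traces (adapted from the filter above);
# any line matching it is merged into the *previous* event.
CONTINUATION = re.compile(
    r"(^\d+\serror)|(^.+Exception: .+)|(^\s+at .+)|(^\s+\.\.\. \d+ more)"
    r"|(^\s*Caused by:.+)|(--- End of inner exception stack trace ---)"
)

lines = [
    "2015-06-01 12:00:00 ERROR something failed",   # new event
    "java.lang.RuntimeException: boom",             # continuation
    "    at com.example.Foo.bar(Foo.java:10)",      # continuation
    "Caused by: java.io.IOException: disk full",    # continuation
    "    ... 3 more",                               # continuation
    "2015-06-01 12:00:01 INFO recovered",           # new event
]

events = []
for line in lines:
    if CONTINUATION.search(line) and events:
        events[-1] += "\n" + line   # what => "previous"
    else:
        events.append(line)

print(len(events))  # 2 events: the error with its full stack trace, then the INFO line
```

The six input lines collapse into two events, which is exactly what you want before shipping to Elasticsearch: one document per error, stack trace included.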
Yes, you can use Logstash for this at the moment, and we are also working on adding multiline support to Filebeat.