Logstash (multiline implemented) => Elasticsearch => Kibana. This is working for me.
However, I want to start using Filebeat => Logstash => Elasticsearch => Kibana.
Now I am running into trouble with the multiline feature of Filebeat.
As far as I understand, it doesn't support grok (which I used in Logstash). Is this planned to be supported in the near future?
I hope it lands really soon. For now I am only doing multiline on the log level, and I also have multiple time formats to implement.
So I put the complete pattern (LOGLEVEL in this case) directly into the Filebeat configuration, written out as a plain regex.
I copied the pattern from:
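For anyone following along, a minimal sketch of what such a prospector section might look like (Filebeat 1.x style config; the path is a placeholder and the alternation is a simplified, hand-written stand-in for grok's LOGLEVEL, not the full pattern):

```yaml
filebeat:
  prospectors:
    - paths:
        - /var/log/myapp/app.log   # placeholder path
      input_type: log
      multiline:
        # New events start with a log level; everything else is a continuation.
        # Simplified stand-in for grok's LOGLEVEL alternation.
        pattern: '^(TRACE|DEBUG|INFO|WARN|WARNING|ERROR|FATAL|SEVERE)'
        negate: true
        match: after
```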
You are probably right; in most cases a basic pattern (like LOGLEVEL) will do.
It just makes life easier when there are already predefined regex patterns.
My knowledge of regex is still too limited, and I have to debug a lot to get it right.
Some examples of log lines that I find harder to implement are:
```
Apr 4, 2016 6:41:50 AM org.apache.karaf.main.SimpleFileLock lock
2016-04-04 06:42:14,323 | WARN | Event Dispatcher | ResourceFinder | example message
```
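A rough sketch of a multiline section that tries to match either timestamp style at the start of a line (untested beyond the two samples above; the date alternation is my own guess, not something from the Filebeat docs):

```yaml
multiline:
  # Match either "2016-04-04 06:42:14,323" or "Apr 4, 2016 6:41:50 AM"
  # at the start of the line; anything else is treated as a continuation.
  pattern: '^(\d{4}-\d{2}-\d{2}|[A-Z][a-z]{2} \d{1,2}, \d{4})'
  negate: true
  match: after
```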
Very useful tip, I did not know a different regex engine was used. I will google the golang regex syntax.
Use three backticks to format code, not >. I cannot tell whether the indentation is correct or not.
Are you sure about `match: before`?
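For reference, my understanding of how `negate` and `match` combine (worth double-checking against the Filebeat docs for your version):

```yaml
multiline:
  pattern: '^[A-Z]'
  negate: true
  # match: after  -> lines that do NOT match the pattern are appended to the
  #                  previous line that DID match (like Logstash's "previous").
  # match: before -> lines that do NOT match are prepended to the NEXT line
  #                  that matches (like Logstash's "next").
  match: after
```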
I noticed log levels start with upper case letters, while Java exceptions normally start with the full class path (lower case). This pattern is working for me: `^[A-Z]` (see http://play.golang.org/p/Yi2c1lewDK).
Your pattern might work if you remove the backslashes ('\'). Escape characters are a little tricky at times: the YAML parser has its own rules for string parsing, and the parsed string is then interpreted (compiled) by the regex engine. This makes writing regexes difficult at times.
If you want to be more strict in your pattern, try `^[DIWEC]` to capture the log levels 'debug', 'info', 'warn', 'error', 'critical'.
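To illustrate the quoting issue, a small sketch (not taken from the docs): in YAML, single-quoted strings pass backslashes through to the regex engine untouched, while double-quoted strings apply YAML's own escape rules first, so the same regex has to be written differently:

```yaml
multiline:
  # Single quotes: backslashes go straight to the regex engine.
  pattern: '^\d{4}-\d{2}-\d{2}'
  # Double quotes would need the backslashes doubled for the same regex:
  # pattern: "^\\d{4}-\\d{2}-\\d{2}"
  negate: true
  match: after
```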