I'm new to Filebeat/Logstash. I have an application that uses Log4J, and I'd like to know the best way to enable searching/filtering in Kibana based on the information in the log file.

Here is a sample of the line entry:

2017-07-12 03:58:36,718 WARN [ThreadName] com.test.namespace - Message
2017-07-12 03:58:36,719 INFO [ThreadName] com.test.namespace - Message2

I would like to be able to filter based upon:

  • Log Level
  • ThreadName
  • Namespace
  • Message contents, which could span multiple lines

I understand that Filebeat doesn't support grok; instead, it ships the lines to Elasticsearch as-is.
Does anybody have a good example for this?


For Filebeat, consider configuring multiline support (checking whether a line starts with a timestamp), so that stack traces become part of the error event. This is a common pattern, and you can find examples in this forum (search for "multiline") and in the docs. For your timestamps, use the regex '^[0-9]{4}-[0-9]{2}-[0-9]{2} '
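A minimal sketch of that multiline setup, assuming Filebeat 5.x (the log path is hypothetical, and newer versions use `filebeat.inputs` instead of `filebeat.prospectors`):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/myapp/app.log   # hypothetical path to your Log4J output
  multiline:
    # Any line that does NOT start with a timestamp is appended to the
    # previous event, so stack traces and multi-line messages stay together.
    pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2} '
    negate: true
    match: after
```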

Have you had a look at the Elasticsearch ingest node? You can configure an ingest pipeline that does some parsing (via grok) to extract more structured information from your logs. This will simplify searching and filtering a ton — for example by timestamp, log level, class name, or message.
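As a sketch for the sample lines above (the pipeline name `log4j-pipeline` and the target field names are assumptions, not something your setup already has):

```
PUT _ingest/pipeline/log4j-pipeline
{
  "description": "Parse Log4J lines like '2017-07-12 03:58:36,718 WARN [ThreadName] com.test.namespace - Message'",
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:loglevel} \\[%{DATA:thread}\\] %{JAVACLASS:namespace} - %{GREEDYMULTILINE:msg}"
        ],
        "pattern_definitions": {
          "GREEDYMULTILINE": "(.|\\n)*"
        }
      }
    }
  ]
}
```

You would then point Filebeat at the pipeline via the `pipeline` setting under `output.elasticsearch`, and each of loglevel, thread, namespace, and msg becomes a separate field you can filter on in Kibana.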

Filebeat modules do ship some sample configurations, but there is no Java/Log4J-specific module (the Log4J log format is quite configurable). See the existing module implementations for examples: https://github.com/elastic/beats/tree/master/filebeat/module

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.