Dynamically injecting custom fields into Logstash from Filebeat

I have set up Filebeat to send logs from one of our servers to Logstash. It's working fine, but the message view in Kibana is pretty ugly and hard to debug because the message is quite big (it's a web service message, XML or JSON, sent and received). So I was wondering if there's a way to extract some fields from the message, for example an order ID or process ID, and inject them as custom fields so that users can query based on those fields.

Please note that these web service logs are written by a COTS product via an out-of-the-box feature, so pretty-printing before writing to the log file is not an option.

Something like this:


```yaml
- type: log
  paths:
    - /opt/jboss/jboss-eap-7.0/bin/log/webservices.log
  encoding: plain
  multiline.pattern: '^[[:digit:]]{4}-[[:digit:]]{2}-[[:digit:]]{2}'
  multiline.negate: true
  multiline.match: after
  fields:
    application-name: customApp
    # order-Id: value from message?????
    # thread-name: from message???
```

You can use the dissect or grok filter in Logstash to parse your logs.
Logstash also has json and xml filters for structured payloads.
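A minimal Logstash filter sketch of that idea. The grok pattern, the `thread_name`/`payload`/`order_id` field names, and the `//orderId` XPath are all illustrative assumptions; they must be adapted to the actual layout of `webservices.log`:

```
filter {
  # Assumed layout: ISO timestamp, thread name in brackets, then the
  # web service payload. Adjust the pattern to the real log format.
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} \[%{DATA:thread_name}\] %{GREEDYDATA:payload}" }
  }
  # If the payload is JSON, parse it so its keys become queryable fields.
  if [payload] =~ /^\{/ {
    json {
      source => "payload"
      target => "ws"
    }
  }
  # For XML payloads, xpath can lift individual values (e.g. an order id)
  # without storing the whole parsed document.
  if [payload] =~ /^</ {
    xml {
      source    => "payload"
      store_xml => false
      xpath     => [ "//orderId/text()", "order_id" ]
    }
  }
}
```

Once fields like `thread_name` or `order_id` exist on the event, they are indexed alongside the original message and can be used in Kibana queries and filters.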
