Can Logstash parse a JSON log line into an event with multiple fields?


I am trying to feed log data to Elasticsearch. At first I tried Filebeat, but it just read each line of JSON and put the whole line into a single field called "message".

For that reason I switched to Logstash, only to find that it behaves exactly the same: it reads one line of JSON from the log file and puts it into a field called "message".

My problem is that I want to actually work with the data, e.g. create Kibana visualizations. But for that to work I need the data in this format:

1 line of JSON = 1 event with as many fields as there are properties. I guess what I want is basically deserialization. Is Logstash, or any other part of the ELK stack, capable of this?

If your logs are already in JSON, use the json filter in the pipeline:

    filter {
      json {
        source => "message"
      }
    }
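For context, here is a sketch of a complete minimal pipeline around that filter; the log path and Elasticsearch host are assumptions for illustration, not from the original post:

    input {
      file {
        path => "/var/log/app/*.log"   # assumed location of the JSON-lines log
      }
    }
    filter {
      json {
        source => "message"            # parse the raw line into top-level fields
      }
    }
    output {
      elasticsearch {
        hosts => ["http://localhost:9200"]   # assumed local Elasticsearch
      }
    }

With this in place, each JSON property becomes its own field on the event, which is what Kibana visualizations need.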
Of course it also works with Filebeat:

    processors:
      - decode_json_fields:
          fields: ["message"]
          target: "json"

If you have regular single-line or multiline plain-text messages, use the grok or dissect filter to extract the values.
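As a sketch of the dissect approach: for a plain-text line like `2024-01-15 ERROR something failed`, a mapping could split it into fields (the field names here are illustrative assumptions):

    filter {
      dissect {
        mapping => {
          # %{msg} captures the remainder of the line after the second space
          "message" => "%{timestamp} %{level} %{msg}"
        }
      }
    }

dissect is faster than grok for fixed-delimiter formats; grok is the better fit when the line needs regular-expression matching.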


Thank you so much. I did not even think about filters ... :roll_eyes:

You really helped me out! :slight_smile:
