Logstash Filter || JSON Message to Split Fields

Could someone suggest the best and easiest way to split the message into separate fields?

  1. I have incoming messages in JSON format, e.g. a message field containing:

     {"scenarioName":"x","id":"x","type":"x","source":"x","transaction":"x","appId":"x","template":"x","email":"x@blah.com","name":"x","lastName":"x ","data":{"Id":"x","source":"x"},"customData":{}}

and I would like to split out each field so I can build dashboards in Kibana.

  1. Would this put additional load on the data nodes / Elasticsearch?
  2. The flow is Logstash consuming Kafka topics and pushing to Elasticsearch (a rough sketch of such a pipeline is below).
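
For context, a minimal pipeline of that shape might look like the following; the broker address, topic name, and index pattern are placeholders, not the real configuration.

input {
    kafka {
        bootstrap_servers => "kafka:9092"       # placeholder broker list
        topics => ["my-topic"]                  # placeholder topic name
    }
}

output {
    elasticsearch {
        hosts => ["http://localhost:9200"]      # placeholder Elasticsearch endpoint
        index => "my-index-%{+YYYY.MM.dd}"      # placeholder daily index pattern
    }
}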

Since that is JSON, I would use a json filter:

filter {
    json {
        source => "message"
        remove_field => [ "message" ]
    }
}

That will give you

{
        "data" => {
        "Id" => "x",
    "source" => "x"
},
  "@timestamp" => 2019-09-16T19:19:06.661Z,
    "template" => "x",
 "transaction" => "x",
    "lastName" => "x ",
"scenarioName" => "x",
        "name" => "x",
          "id" => "x",
        "type" => "x",
       "email" => "x@blah.com",
       "appId" => "x",
  "customData" => {},
      "source" => "x"
}
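
As a side note, if a message that is not valid JSON ever comes through, the json filter tags the event with _jsonparsefailure by default, so you can route those events to a separate index instead of dropping them. A minimal sketch, with placeholder host and index names:

output {
    if "_jsonparsefailure" in [tags] {
        elasticsearch {
            hosts => ["http://localhost:9200"]   # placeholder endpoint
            index => "events-parse-failures"     # placeholder index for unparsed events
        }
    } else {
        elasticsearch {
            hosts => ["http://localhost:9200"]
            index => "events-%{+YYYY.MM.dd}"
        }
    }
}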

Obviously this parsing is not free, and indexing more fields costs more, but whether it is too expensive is a question only you can answer.
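
If the extra indexed fields do become a concern, one option is to drop the ones the dashboards do not need before they reach Elasticsearch, for example with a mutate filter; the field names below are only examples.

filter {
    mutate {
        # example only: remove fields the Kibana dashboards do not need
        remove_field => [ "customData", "[data][source]" ]
    }
}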


Works perfectly, thanks for the quick help, @Badger!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.