How to read JSON input sent to the HTTP input plugin in the filter section

Hi, I have a sample http input plugin configuration and I am posting JSON data to it. I want to copy the JSON body into another field in order to standardize the output format. Invalid JSON payloads end up in the message field. I followed this link - https://www.elastic.co/blog/introducing-logstash-input-http-plugin. It says the body will be deserialized and expanded to the root of the event. Please suggest how to do this. Thanks.

input {
  http {
    host => "xxx" # default: 0.0.0.0
    port => 8080 # default: 8080
  }
}
filter {
    grok {
        match => { "[headers][request_uri]" => "/app/%{WORD:App}/env/%{WORD:Environment}/dc/%{WORD:DataCenter}" }
    }
}
output {
  kafka {
    codec => json
    bootstrap_servers => "xxx"
    topic_id => "xx-topic"
  }
}

Use a stdout { codec => rubydebug } output to make debugging easier. What does an example event produced by Logstash look like?
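For example, a temporary debug output could sit next to the existing kafka output while you troubleshoot (just a sketch, remove it once you are done):

output {
  # Prints each event in a human-readable form so you can see its fields
  stdout { codec => rubydebug }
}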

Thanks @magnusbaeck. The JSON body is being added to the root of the event. How can I put the body into another field, like message, instead of the root? Is there any configuration option in the input for this?

The json codec that is used by default in this case (since the received data probably declares application/json as its MIME type) always stores the parsed payload at the root of the event, but you should be able to override the codec with the additional_codecs option to force a plain codec. Then you can use the json filter to parse the JSON payload and choose to store the result under another field.
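As a rough sketch of that combination (the [body] target field name is just an example, not something required by the plugins):

input {
  http {
    host => "0.0.0.0"
    port => 8080
    # Treat application/json request bodies as plain text so the raw payload
    # lands in the message field instead of being expanded at the event root.
    additional_codecs => { "application/json" => "plain" }
  }
}
filter {
  json {
    # Parse the raw JSON from message and store the parsed fields under [body]
    source => "message"
    target => "body"
  }
}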
