How to disable parsing of nested fields

Below is my Logstash configuration:

input {
    beats {
        port => 5001
    }

    kafka {
        bootstrap_servers => "kafka-xxxxxxxxxxxxxxxx:9092"
        topics            => [ "java-prod", "java-prod-cron" ]
        codec             => "json"
        tags              => "java-prod"
        consumer_threads  => "20"
        decorate_events   => true
    }
}

filter {
    mutate {
        gsub => [
            "message", "^[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2} ", ""
        ]
    }

    grok {
        match => { "message" => "\[%{DATA:thread}\]\s+%{LOGLEVEL:severity}\s+%{DATA:class}\s+%{DATA:linenum}\s+-\s%{GREEDYDATA:log}" }
    }

    if [message] =~ /http.wire-log.writeResponse/ {
        mutate {
            add_field => { "http_type" => "response" }
        }
        json {
            source => "log"
            target => "response"
            skip_on_invalid_json => true
        }
        mutate {
            convert => { "response.body" => "string" }
        }
    }

    if [message] =~ /http.wire-log.writeRequest/ {
        mutate {
            add_field => { "http_type" => "request" }
        }
        json {
            source => "log"
            target => "request"
            skip_on_invalid_json => true
        }
    }
}

output {
    stdout {
        #codec => rubydebug
        codec => json
    }

    if [@metadata][kafka][topic] == "java-prod" {
        elasticsearch {
            hosts => "elasticsearch-xxxxxxxxxxxxxxxx:9200"
            index => "java-prod-%{+YYYY.MM.dd}"
            manage_template => true
            template => '/etc/logstash/templates/java-prod.json'
            template_name => 'java'
            template_overwrite => true
        }
    }
    else if [@metadata][kafka][topic] == "java-prod-cron" {
        elasticsearch {
            hosts => "elasticsearch-xxxxxxxxxxxxxxxx:9200"
            index => "java-cron-prod-%{+YYYY.MM.dd}"
            manage_template => true
            template => '/etc/logstash/templates/java-prod.json'
            template_name => 'java'
            template_overwrite => true
        }
    }
}

The issue I am facing is that response.body contains a JSON response whose nested JSON can be 100 or more levels deep. I don't want those nested objects indexed as fields. How do I avoid parsing of response.body? I tried

mutate {
    convert => { "response.body" => "string" }
}

But it doesn't seem to work. I tried json_encode too, but that didn't work either. How can I make Logstash treat it as plain text and not parse it into fields?

In short, I want Logstash to treat response.body as a string and not as a JSON object.

Try using [response][body], which is the correct way to address that nested field.

https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html#logstash-config-field-references

This is how my configuration looks now:

json {
    source => "log"
    target => "response"
    skip_on_invalid_json => true
}

mutate {
    convert => { "[response][body]" => "string" }
}

But I still get this error:

[2018-08-21T09:52:18,527][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"java-prod-2018.08.21", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x68ff20cb>], :response=>{"index"=>{"_index"=>"java-prod-2018.08.21", "_type"=>"doc", "_id"=>"6iq4WmUBom8sbLqrFsN_", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [response.body.data.status]", "caused_by"=>{"type"=>"json_parse_exception", "reason"=>"Current token (VALUE_TRUE) not numeric, can not use numeric value accessors\n at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@5cc6fffc; line: 1, column: 1458]"}}}}}

It's still trying to parse response.body.data.status.

I'm not sure the mutate convert option works as you expect. I suggest you use the json_encode filter instead.
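As a sketch of that approach (json_encode ships as a separate plugin, logstash-filter-json_encode, which may need to be installed first, e.g. with bin/logstash-plugin install logstash-filter-json_encode), you could replace the mutate/convert with something like:

```
json_encode {
    # Serialize the parsed [response][body] object back into a single
    # JSON string, so Elasticsearch indexes one text value instead of
    # mapping every deeply nested key as its own field.
    source => "[response][body]"
}
```

With no target option set, the plugin writes the encoded string back over the source field, which is what you want here.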

Secondly, I strongly recommend you use a stdout { codec => rubydebug } output rather than an elasticsearch output while you're debugging this. When the event looks as desired you can refocus your effort on ES.
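Concretely, while debugging you could temporarily swap your whole output section for something like:

```
output {
    # Pretty-print every field of each event to stdout, so you can
    # verify what Logstash actually produced before anything is sent
    # to Elasticsearch.
    stdout {
        codec => rubydebug
    }
}
```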

The json_encode seems to be working. One doubt: what if response.body occasionally arrives as a plain string? How do I handle that? Can I write an if condition?

I suppose the json_encode filter adds a tag to the event.
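One way to handle that (a sketch, untested; the _jsonencodefailure tag name follows the usual Logstash tag-on-failure convention but is worth verifying against the plugin's documentation) would be to check for the failure tag after encoding:

```
json_encode {
    source => "[response][body]"
}

# If encoding ever fails, mark the event explicitly instead of
# letting it pass through silently.
if "_jsonencodefailure" in [tags] {
    mutate {
        add_field => { "encode_error" => "true" }
    }
}
```

Note that encoding a field that is already a plain string should simply produce a quoted JSON string, which still indexes as text, so a failure here would be unusual.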
