Object mapping for [configurationItems.configuration.state] tried to parse field [state] as object, but found a concrete value

Hi,
I am trying to parse AWS Config data into Elasticsearch.
Here is my Logstash configuration:

input {
  s3 {
      access_key_id => "XXXXXXXXXXXXXX"
      secret_access_key => "XXXXXXXXXXXXXXXXXX"
      bucket => "bucketname"
      region => "us-east-1"
      prefix => "Config/AWSLogs/XXXXXXXXXXXXx/Config"
      type => "s3"
      add_field => { "source" => "config_new" }
      codec => "json"
      interval => "30"
    }
}

output {
  elasticsearch {
    hosts => ["http://192.168.1.36:9200"]
    user => "elastic"
    password => "access"
    index => "config_virginia"
  }
  stdout {codec => rubydebug }
}

While running this configuration, Logstash skips some .gz files (which are in my S3 bucket) and throws the error below:

[2017-03-28T18:03:38,424][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"config_virginia", :_type=>"s3", :_routing=>nil}, 2017-03-28T12:33:38.249Z %{host} %{message}], :response=>{"index"=>{"_index"=>"config_virginia", "_type"=>"s3", "_id"=>"AVsU6EZyr2lwLlBjHgGB", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [configurationItems.configuration.state] tried to parse field [state] as object, but found a concrete value"}}}}

How can I overcome this?

I think that you indexed some documents looking like:

{
  "configurationItems.configuration.state": {
    "foo": "bar"
  }
}

But then you have documents like:

{
  "configurationItems.configuration.state":  "bar"
}

Elasticsearch can't accept that. Check your source.
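
To illustrate (a minimal sketch; the index and type names here are made up), you can reproduce the failure with two conflicting documents:

# First document: dynamic mapping maps "state" as an object
curl -XPUT 'http://localhost:9200/test/doc/1' -H 'Content-Type: application/json' -d '
{"configurationItems":{"configuration":{"state":{"foo":"bar"}}}}'

# Second document: "state" is now a concrete value, which conflicts with the
# existing object mapping and is rejected with mapper_parsing_exception
curl -XPUT 'http://localhost:9200/test/doc/2' -H 'Content-Type: application/json' -d '
{"configurationItems":{"configuration":{"state":"bar"}}}'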


Exactly, that is what is happening. But how can we overcome it? Would you suggest any other way to do this? Because of this, Elasticsearch is skipping some events.

Can multiple mappings help me within a single index?

You need to fix that yourself before indexing into Elasticsearch. An object can't be a string and vice versa.

Maybe use Logstash to mutate your data, or perhaps an ingest pipeline?
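
For example, a Logstash ruby filter could normalize the field to a single type before it reaches Elasticsearch. This is only a sketch, with the field path taken from the error message above:

filter {
  ruby {
    code => '
      require "json"
      state = event.get("[configurationItems][configuration][state]")
      # If "state" arrived as an object, flatten it to a JSON string;
      # plain string values pass through untouched
      if state.is_a?(Hash)
        event.set("[configurationItems][configuration][state]", state.to_json)
      end
    '
  }
}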

Thank you very much for your time.
I finally resolved this error with the configuration below:

input {
  s3 {
      access_key_id => "xxxxxxxxxxxxxxxxxx"
      secret_access_key => "xxxxxxxxxxxxxxxxxxxx"
      bucket => "bucketname"
      region => "us-east-1"
      prefix => "Config/AWSLogs/"
      type => "s3"
      add_field => { "source" => "config_new" }
      # json codec disabled: the file must arrive as one raw message so the
      # filter below can split it into individual configuration items
      #codec => "json"
      interval => "30"
    }
}

filter {
  # Put each configuration item on its own line; in the AWS Config snapshot
  # every item begins with the "relatedEvents" key
  mutate { gsub => [ "message", '{"relatedEvents"', '\n{"relatedEvents"' ] }
  # One event per line, i.e. per configuration item
  split { terminator => "\n" }
  # Strip the trailing comma that separated the items
  mutate { gsub => [ "message", ",$", "" ] }
  # Strip the "]}" that closed the configurationItems array
  mutate { gsub => [ "message", "]}$", "" ] }
  # Now each message is a self-contained JSON object
  json { source => "message" }
}

output {
  # Only index events that were parsed successfully as JSON
  if "_jsonparsefailure" not in [tags] {
    # [configuration][state] =~ /.*/ matches only when the field exists
    # (with a string value), so those events and all others go to separate
    # indices and can no longer collide in a single mapping
    if [configuration][state] =~ /.*/ {
      elasticsearch {
        hosts => ["http://192.168.1.36:9200"]
        user => "elastic"
        password => "access"
        index => "ytest_1"
      }
      stdout { codec => rubydebug }
    }
    else {
      elasticsearch {
        hosts => ["http://192.168.1.36:9200"]
        user => "elastic"
        password => "access"
        index => "ytest_2"
      }
      stdout { codec => rubydebug }
    }
  }
}
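
One note for anyone adapting this: the events are now split across ytest_1 and ytest_2, but you can still search them together with a wildcard index pattern, for example:

# Query both indices at once (credentials as configured above)
curl -u elastic:access 'http://192.168.1.36:9200/ytest_*/_search?pretty'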
