Logstash output error: Could not index event to Elasticsearch, "mapper_parsing_exception", "object mapping for [host] tried to parse field [host] as object, but found a concrete value"

I am getting this error:
 [2022-01-11T12:48:53,887][WARN ][logstash.outputs.elasticsearch][main][d60352f946d70780df98c56812b4beacd1633b0b5fb5d9c306d2c578620115b7] Could not index event to Elasticsearch. {:status=>400, :action=>["create", {:_id=>nil, :_index=>"metrics-generic-default", :routing=>nil}, {"path"=>"/app/log_nginx/access.log", "host"=>"a0410pttselfcareapp04", "http.version"=>"1.1", "nginx.access.time"=>"11/Jan/2022:12:48:53 +0600", "user.id"=>"-", "http.request.method"=>"POST", "url.original"=>"/smlaro/api/v5/get_payg_status/3D0FD055649155A8A8665D413B5AA133", "source.ip"=>"192.168.207.87", "event.duration"=>"0.042", "user_agent.original"=>"\"-\"\"okhttp/3.12.12\"", "received_from"=>"a0410pttselfcareapp04", "http.response.status_code"=>200, "type"=>"nginx-access", "received_at"=>"2022-01-11T06:48:53.777Z", "@version"=>"1", "@timestamp"=>2022-01-11T06:48:53.777Z, "message"=>"192.168.207.87 - - [11/Jan/2022:12:48:53 +0600] \"POST /smlaro/api/v5/get_payg_status/3D0FD055649155A8A8665D413B5AA133 HTTP/1.1\" 200 288 0.042 - \"-\"\"okhttp/3.12.12\" \"27.147.220.217\"", "http.response.body.bytes"=>288, "user.name"=>"-", "data_stream"=>{"type"=>"metrics", "dataset"=>"generic", "namespace"=>"default"}}], :response=>{"create"=>{"_index"=>".ds-metrics-generic-default-2022.01.10-000001", "_type"=>"_doc", "_id"=>"aqPlR34BlNONtcbIj2t5", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}

I need suggestions to resolve this issue. Here is the filter from my Logstash config, but it does not work.

filter {
  if [type] == "nginx-access" {

    mutate {
      remove_field => [ "[host]" ]
    }

    grok {
      match => { "message" => "%{IPORHOST:source.ip} %{USER:user.id} %{USER:user.name} \[%{HTTPDATE:nginx.access.time}\] \"%{WORD:http.request.method} %{DATA:url.original} HTTP/%{NUMBER:http.version}\" %{NUMBER:http.response.status_code:int} (?:-|%{NUMBER:http.response.body.bytes:int}) (?:-|%{NUMBER:event.duration:double}) (?:-|%{USER:ident}) (-|%{DATA:user_agent.original}) %{GREEDYDATA}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }

  }
}

See this answer. Once you have indexed a document in which [host] is an object (probably containing [host][name]), any event in which [host] is text will be rejected.
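A minimal sketch of that idea, in case it helps (the [@metadata][host_tmp] name is just an illustration, and this assumes your index mapping expects [host] to be an object with a [host][name] subfield): instead of deleting [host], move the concrete value aside and re-add it in the object shape the mapping expects.

filter {
  if [type] == "nginx-access" {
    # Stash the plain-text host value in @metadata (not indexed), then
    # re-create it as [host][name] so it matches the object mapping.
    mutate { rename => { "[host]" => "[@metadata][host_tmp]" } }
    mutate { add_field => { "[host][name]" => "%{[@metadata][host_tmp]}" } }
  }
}

Also note that if you keep remove_field => [ "[host]" ] as in your filter above, the later add_field for received_from still references %{host}, so once the field is gone it will be stored as the literal string "%{host}".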

Where do I add this code, @Badger? I tried adding it, but got another error.

[2022-01-12T16:57:56,455][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of [ \t\r\n], "#", "=>" at line 8, column 8 (byte 128) after input {\n\n file {\n path => "/app/log_nginx/access.log"\n type => "nginx-access"\n start_position => "beginning"\n\n if ", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/compiler.rb:32:in compile_imperative'", "org/logstash/execution/AbstractPipelineExt.java:187:in initialize'", "org/logstash/execution/JavaBasePipelineExt.java:72:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:47:in initialize'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:52:in execute'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:389:in block in converge_state'"]}

Please tell me how I can get rid of this error.

You cannot use a conditional in the input section.
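A minimal sketch of how the pieces fit together, based only on the path and options visible in your error message (keep whatever other settings your file input already has): the file input just declares the source, and the conditional plus the mutate live in the filter block.

input {
  file {
    path => "/app/log_nginx/access.log"
    type => "nginx-access"
    start_position => "beginning"
  }
}

filter {
  if [type] == "nginx-access" {
    mutate {
      remove_field => [ "[host]" ]
    }
    # grok { ... } and the rest of your nginx-access filter go here
  }
}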
