Logstash "Could not index event to Elasticsearch"

Hi,
I am trying to index my data into Elasticsearch from Logstash, but I keep getting this type of error for all indices:
[2020-05-06T14:38:11,434][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2020.05.06", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x343023b1>], :response=>{"index"=>{"_index"=>"logstash-2020.05.06", "_type"=>"doc", "_id"=>"32H-6XEBEechOczSU02t", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Tax.amount] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:1067"}}}}}

Can you please help me debug what is causing this?
Thanks and Regards

A field in Elasticsearch cannot be a keyword in some documents and an object in others. You are trying to index a document in which the [Tax.amount] field is an object into an index where the [Tax.amount] field is mapped as a string.
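One way to enforce a consistent shape on the Logstash side is a ruby filter that serializes the object form to a string whenever it appears. This is only a sketch, assuming you want to keep the existing text mapping for [Tax][amount]:

```
filter {
  ruby {
    code => '
      t = event.get("[Tax][amount]")
      # If the field arrived as an object (a Hash inside Logstash),
      # replace it with its JSON text so it matches the text mapping.
      event.set("[Tax][amount]", t.to_json) if t.is_a?(Hash)
    '
  }
}
```

If you would rather keep the field as an object, you need the opposite: index into a new index whose mapping (or index template) defines [Tax][amount] as an object, and coerce the string form of the field instead.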

Thanks for the reply.
How do I go about resolving this? By filtering the object? Can you guide me on how?
Here is the pipeline.conf file I am using:

input {
    udp {
        port => 5000
        type => syslog
    }
    udp {
        port => 5001
        type => json
    }
    tcp {
        port => 5001
        type => json
        ssl_enable => true
        ssl_key => "/etc/letsencrypt/live/elastic.acomodeo.com/privkey.pem"
        ssl_cert => "/etc/letsencrypt/live/elastic.acomodeo.com/fullchain.pem"
        ssl_extra_chain_certs => ["/etc/letsencrypt/live/elastic.acomodeo.com/chain.pem"]
        ssl_verify => false
        add_field => {"ssl" => "on"}
    }
}

## Add your filters / logstash plugins configuration here

filter {
    mutate {
      remove_field => [ "port" ]
    }

    if [type] == "syslog" {
        grok {
            match => { "message" => "\A%{TIMESTAMP_ISO8601:tmptimestamp}\|%{HOSTNAME:cluster}\|%{HOSTNAME:hostname}\|%{HOSTNAME:app}\|%{GREEDYDATA:message}\Z" }
            overwrite => [ "message" ]
        }
        date {
            match => [ "tmptimestamp", "ISO8601" ]
            remove_field => [ "tmptimestamp" ]
        }
    }

    if [type] == "json" {
        json {
            source => "message"
        }
        acologs {
        }
    }
}
output {

    if [type] == "syslog" and "_grokparsefailure" in [tags] {
        file { path => "~/log/failed_syslog_events-%{+YYYY-MM-dd}" }
    }

    elasticsearch {
        hosts => "localhost:9200"
    }
}

I suggest you read this post.

Thanks for the suggestion.
I made the modification in my pipeline.conf by adding the following:

if ! [host][name] {
    mutate {
        rename { "[host]" => "[host][name]" }
    }
}

And now I'm getting this error:

[ERROR][logstash.agent           ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, =>

What am I missing here?

Thanks

I found the missing part: there needs to be a => after rename, like this:

if ! [host][name] {
    mutate {
        rename => { "[host]" => "[host][name]" }
    }
}

But I still get the same error as before. Did I miss something?

Hi @Badger ,
I also tried the second solution suggested in the other post, by adding

mutate { replace => { "[host]" => "[host][name]" } }

I am still getting the same errors as before:

[logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2020.06.09", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x4fba72ab>], :response=>{"index"=>{"_index"=>"logstash-2020.06.09", "_type"=>"doc", "_id"=>"-VRFnnIBEechOczSRbyg", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Rate.timeSpan] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:539"}}}}}

I appreciate the help.
Thanks

In Elasticsearch, Rate.timeSpan is a text field, but in your events it is an object. You need to decide which one it should be and adjust accordingly.

Thanks for the reply @Badger
My apologies, I am just starting out with the Elastic Stack.
Do I need to define a filter for it specifically?
Could you give me an example?

Thanks

The post I linked to earlier explains what to do for the field host.name; you need to go through the same steps for Rate.timeSpan.
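Adapted to Rate.timeSpan, the host.name-style conditional rename would look roughly like this. It is a sketch only: the subfield name value is an assumption (use whatever key your object-shaped events actually carry), and note that the current daily index has already mapped the field as text, so object-shaped documents will only index cleanly once a new index is created:

```
filter {
  # If timeSpan arrived as a bare string, nest it so every document
  # carries the same object shape for [Rate][timeSpan].
  if [Rate][timeSpan] and ![Rate][timeSpan][value] {
    mutate {
      rename => { "[Rate][timeSpan]" => "[Rate][timeSpan][value]" }
    }
  }
}
```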

Thanks @Badger,
I will make the adjustments for Rate.timeSpan and check.

Hi @Badger,
I tried the solution you provided by adding this to my pipeline.conf file in Logstash:

if ! [Rate][timeSpan] {
    mutate {
        rename => { "[Rate]" => "[Rate][timeSpan]" }
    }
}

But the error still persists.
Can you please guide me on what I am doing wrong?

Best Regards

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.