Hi,
I am trying to index my data into Elasticsearch from Logstash, but I keep getting this type of error for all indices:

[2020-05-06T14:38:11,434][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2020.05.06", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x343023b1>], :response=>{"index"=>{"_index"=>"logstash-2020.05.06", "_type"=>"doc", "_id"=>"32H-6XEBEechOczSU02t", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Tax.amount] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:1067"}}}}}
Can you please help me debug what is causing this?
Thanks and Regards
A field in Elasticsearch cannot be a keyword in some documents and an object in others. You are trying to index a document in which the [Tax.amount] field is an object into an index in which the [Tax.amount] field is a string.
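To illustrate (a minimal sketch with made-up values, not taken from the actual failing document): once the first event below has been indexed, Tax.amount is mapped as text, and the second event is then rejected with the mapper_parsing_exception shown above.

First event (maps Tax.amount as text):
  { "Tax": { "amount": "19.00" } }
Second event (rejected, because Tax.amount is now an object):
  { "Tax": { "amount": { "value": 19.00, "currency": "EUR" } } }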
Thanks for the reply.
How do I go about resolving this? Should I filter out the object, and if so, can you guide me how?
Here is my pipeline.conf file which I am using:
input {
  # Plain syslog over UDP
  udp {
    port => 5000
    type => syslog
  }
  # JSON over UDP
  udp {
    port => 5001
    type => json
  }
  # JSON over TCP with TLS
  tcp {
    port => 5001
    type => json
    ssl_enable => true
    ssl_key => "/etc/letsencrypt/live/elastic.acomodeo.com/privkey.pem"
    ssl_cert => "/etc/letsencrypt/live/elastic.acomodeo.com/fullchain.pem"
    ssl_extra_chain_certs => ["/etc/letsencrypt/live/elastic.acomodeo.com/chain.pem"]
    ssl_verify => false
    add_field => { "ssl" => "on" }
  }
}

## Add your filters / logstash plugins configuration here
filter {
  mutate {
    remove_field => [ "port" ]
  }
  if [type] == "syslog" {
    # Split "timestamp|cluster|hostname|app|message" into separate fields
    grok {
      match => { "message" => "\A%{TIMESTAMP_ISO8601:tmptimestamp}\|%{HOSTNAME:cluster}\|%{HOSTNAME:hostname}\|%{HOSTNAME:app}\|%{GREEDYDATA:message}\Z" }
      overwrite => [ "message" ]
    }
    date {
      match => [ "tmptimestamp", "ISO8601" ]
      remove_field => [ "tmptimestamp" ]
    }
  }
  if [type] == "json" {
    # Parse the JSON payload into top-level fields
    json {
      source => "message"
    }
    acologs {
    }
  }
}

output {
  if [type] == "syslog" and "_grokparsefailure" in [tags] {
    file { path => "~/log/failed_syslog_events-%{+YYYY-MM-dd}" }
  }
  elasticsearch {
    hosts => "localhost:9200"
  }
}
[ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"LogStash::ConfigurationError", :message=>"Expected one of #, =>
I am still getting the same errors as before:
[logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2020.06.09", :_type=>"doc", :routing=>nil}, #<LogStash::Event:0x4fba72ab>], :response=>{"index"=>{"_index"=>"logstash-2020.06.09", "_type"=>"doc", "_id"=>"-VRFnnIBEechOczSRbyg", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [Rate.timeSpan] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:539"}}}}}
Thanks for the reply @Badger
My apologies, I am just starting out with the Elastic Stack.
Do I need to define a filter for it specifically?
Could you give me an example?
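One way to work around this, as a minimal sketch (the field names [Tax][amount] and [Rate][timeSpan] are taken from the errors above; the "_object" suffix and the ruby filter are only an illustration, not something your acologs plugin or Logstash does for you): add a ruby filter to the json branch that moves object-valued fields to a differently named field, so the original field keeps a single type.

filter {
  if [type] == "json" {
    ruby {
      code => '
        # For each field that sometimes arrives as an object, move the object
        # form to a separate "<name>_object" field so it no longer conflicts
        # with the existing text mapping on the original field.
        [["Tax", "amount"], ["Rate", "timeSpan"]].each do |parent, child|
          value = event.get("[#{parent}][#{child}]")
          if value.is_a?(Hash)
            event.set("[#{parent}][#{child}_object]", value)
            event.remove("[#{parent}][#{child}]")
          end
        end
      '
    }
  }
}

Alternatively, if the object form is the one you actually want, you could define an index template that maps these fields as objects and reindex, or serialize the object to a string before indexing. Which option fits best depends on how you need to query the data.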