Hi there,
I am trying to get a JSON-formatted data stream from Kafka processed by Logstash and sent on to Elasticsearch.
My JSON stream contains multiple arrays filled with objects. All of the arrays except one are working perfectly fine, but that one always fails with the following error:
[2018-09-25T11:09:41,384][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"index_test", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x41f204ec>], :response=>{"index"=>{"_index"=>"index_test", "_type"=>"doc", "_id"=>"iWYXEVx", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [HeaderVariables.Value.VALUE]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"input_string\""}}}}}
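Based on the field path in the error, the failing part of the stream is shaped roughly like this (names and values are placeholders, only the HeaderVariables.Value.VALUE path is taken from the error):

"HeaderVariables": [
  { "Name": "Var1", "Value": { "VALUE": 123 } },
  { "Name": "Var2", "Value": { "VALUE": "input_string" } }
]

The error message suggests that VALUE sometimes holds a number and sometimes a string.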
The JSON string is generated by a Windows service running on a server (a purchased tool), so the JSON should be valid.
The other objects are all built with the same structure, so I really don't understand why some documents are indexed and this one is rejected. My guess is that Elasticsearch dynamically mapped HeaderVariables.Value.VALUE as a numeric field from the first documents it saw and now rejects events where VALUE is a string, but I don't know how to handle that in Logstash.
My Logstash pipeline looks like this:
input {
  kafka {
    topics => ["topic_test"]
    bootstrap_servers => "localhost:9092"
  }
}

filter {
  json {
    source => "message"
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "index_test"
  }
}
I also experimented a bit with the split filter, but without success (see the sketch below). If I remove the HeaderVariables field, the event is forwarded to Elasticsearch and shows up in Kibana, but I really need the HeaderVariables.
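For reference, my experiments looked roughly like this (a sketch from memory, not the exact filter blocks I ran):

filter {
  json {
    source => "message"
  }
  # Split attempt: turn each HeaderVariables entry into its own event.
  split {
    field => "HeaderVariables"
  }
}

and the workaround that gets events through, at the cost of losing the data I need:

filter {
  json {
    source => "message"
  }
  # Dropping the problematic array lets Elasticsearch index the event.
  mutate {
    remove_field => ["HeaderVariables"]
  }
}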
If someone has an idea how to fix this, please let me know; that would help me out very much.