I'm using ES 7.9.0. My goal is to get syslog messages from my firewall into Logstash. I have tried every configuration I found on the net, but could not solve it. I configured rsyslog to forward incoming syslog messages to my Logstash instance, which works, but as soon as rsyslog starts forwarding I get the log below from Logstash.
"[2020-08-20T09:53:29,227][WARN ][logstash.outputs.elasticsearch][main][1e41b167c9426887518a221b53e23a45ea979df4317fd894368eb36f0dfbce44] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil, :_type=>"_doc"}, #LogStash::Event:0x5d300f85], :response=>{"index"=>{"_index"=>"logstash-2020.08.19-000001", "_type"=>"_doc", "_id"=>"PqikCnQBgP1uDgny0g90", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}"
Thank you @Badger. If this is a common problem, somebody needs to fix it. If this problem's history is lost even to people with good knowledge, newbies like me cannot do anything. Is there any "clear" solution for this?
Not sure what you expect anyone to add here. It is working as designed. Fields on a document in Elasticsearch have a type; if you try to index a document where a field has the wrong type, Elasticsearch rejects it. You need to make sure all the fields have the right type.
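One common workaround, sketched here rather than quoted from the thread, is to reshape the string `host` field into the object form the mapping expects before it reaches the elasticsearch output, using a mutate filter. The target sub-field `[host][name]` is an assumption; match it to whatever object structure your index mapping actually defines:

```
filter {
  # The mapping expects "host" to be an object, but the input produced
  # a plain string. Move the string into a sub-field so the shapes agree.
  # "[host][name]" is a placeholder; match it to your mapping.
  mutate {
    rename => { "host" => "[host][name]" }
  }
}
```

Alternatively, you can point the output at a fresh index whose template maps `host` as a plain keyword, so the string value is accepted as-is.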
The issue is simple: I need to collect syslog messages from a firewall, router, etc., and there is clearly a configuration problem in how Logstash receives them. Since I cannot get past Logstash, I don't know how to deal with Elasticsearch.
Someone has already done this, I believe, because it is such a common need. Even commercial documentation doesn't help. What can I do?