Hello all,
I'm using ES 7.9.0. My goal is to get syslog messages from my firewall into Logstash. I have tried every config I could find on the net, but could not solve it. I configured rsyslog to forward incoming syslogs to my Logstash, and that part works, but whenever rsyslog sends a message I get the log below from Logstash:
[2020-08-20T09:53:29,227][WARN ][logstash.outputs.elasticsearch][main][1e41b167c9426887518a221b53e23a45ea979df4317fd894368eb36f0dfbce44] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x5d300f85>], :response=>{"index"=>{"_index"=>"logstash-2020.08.19-000001", "_type"=>"_doc", "_id"=>"PqikCnQBgP1uDgny0g90", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}
I know this is related to the Logstash filter options, but I could not find any working filter set for it. I tried the Elastic recommendations (https://www.elastic.co/guide/en/logstash/current/config-examples.html) but no luck.
I don't think it should be this hard and complicated. Can anyone help me with it?
My Logstash config is:
input {
  tcp {
    port => 5514
    type => syslog
  }
  udp {
    port => 5514
    type => syslog
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}