Syslog input config for logstash

Hello all,

I'm using ES 7.9.0. My goal is to get syslogs from my firewall into Logstash. I tried every config I found on the net, but could not solve it. I configured rsyslog to forward incoming syslogs to my Logstash, which is OK, but when logs arrive I get the error below from Logstash.

[2020-08-20T09:53:29,227][WARN ][logstash.outputs.elasticsearch][main][1e41b167c9426887518a221b53e23a45ea979df4317fd894368eb36f0dfbce44] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x5d300f85>], :response=>{"index"=>{"_index"=>"logstash-2020.08.19-000001", "_type"=>"_doc", "_id"=>"PqikCnQBgP1uDgny0g90", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}

I know it is related to the Logstash filter options, but I could not find any working filter set for it. I tried Elastic's recommendations (https://www.elastic.co/guide/en/logstash/current/config-examples.html) but no luck :frowning:

I know it should not be this hard and complicated. Can anyone help me with it?

My logstash config is:

input {
  tcp {
    port => 5514
    type => syslog
  }
  udp {
    port => 5514
    type => syslog
  }
}
filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:[%{POSINT:syslog_pid}])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

I also tried the default grok pattern (https://www.elastic.co/guide/en/logstash/current/plugins-inputs-syslog.html#plugins-inputs-syslog-grok_pattern)

and the syslog_field option (https://www.elastic.co/guide/en/logstash/current/plugins-inputs-syslog.html#plugins-inputs-syslog-syslog_field)

No luck, still the same error from Logstash :frowning:
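For reference, a minimal version of a config using the dedicated syslog input from those docs (with my port 5514 substituted; a sketch of what I was trying, not a verified fix) would be:

```
input {
  # syslog input parses RFC3164 lines itself, so no separate grok is needed
  syslog {
    port => 5514
  }
}
```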

This is a common problem. I suggest you start here.

Thank you @Badger. If this is a common problem, somebody needs to fix it. If even people with good knowledge have lost track of this problem's history, newbies like me cannot do anything. Is there any "clear" solution for this?

Does anybody have "any" idea about it?

Not sure what you expect anyone to add here. It is working as designed. Fields on a document in Elasticsearch have a type; if you try to index a document where a field has the wrong type, Elasticsearch rejects it. You need to make sure all the fields have the right type.
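For example, your error is about [host]: the index template maps host as an object (e.g. host.name), but the tcp/udp inputs set host to a plain string. One common workaround (a sketch, assuming you want to keep the value) is to rename the field before output:

```
filter {
  mutate {
    # move the plain-string host into the object field the mapping expects
    rename => { "host" => "[host][name]" }
  }
}
```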

The issue is simple: I need to collect syslogs from firewalls, routers, etc., and clearly there is a configuration problem in getting them through Logstash. I cannot get past Logstash, so I don't know yet how to deal with Elasticsearch.

Someone must have done this already, I believe, because it is such a common need. If even commercial documents cannot help, what can I do?

Well, have you tried the Grok Debugger with your expressions? I tested your case with rsyslog and it works fine for me. Here is the pipeline filter:

input {
    udp {
        host => "10.0.0.10"
        port => 6000
        type => rsyslog
      }
    tcp {
        host => "10.0.0.10"
        port => 6000
        type => rsyslog
      }
}
filter {
    if [type] == "rsyslog" {
        grok {
            match => {
                "message" => ["%{SYSLOGTIMESTAMP:rsyslog_timestamp} %{SYSLOGHOST:rsyslog_hostname} %{DATA:rsyslog_program}(?:\[%{POSINT:rsyslog_pid}\])?: %{GREEDYDATA:rsyslog_message} %{USERNAME:rsyslog_user}."]
            }
            overwrite => ["message"]
        }
        mutate {
            add_field => {"[@metadata][indexName]" => "this-server" }
        }
    }
}
output {
    pipeline {
        send_to => [esServer]
    }
}

This part, (?:\[%{POSINT:rsyslog_pid}\])?: (note the escaped square brackets), is the key difference for your case.

The Grok Debugger is your first step; that is the difference, to me... Regards.
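Applied to the pattern from the original post, escaping those brackets gives (a sketch, keeping your original field names):

```
grok {
  # \[ and \] match the literal brackets around the pid, e.g. "sshd[1234]:"
  match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
}
```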