Logstash parsing syslog

Hello all,

I have a syslog server and the ELK stack on the same server, and I create a directory for each syslog source.
I'm trying to parse syslog events with Logstash, and I'd like to keep the IP address of the syslog source in the "host" field. At the moment, after Logstash parsing, the source shows up as 0.0.0.0.
Could you please help?

Thanks.

Please show your configuration.

Below is my logstash.conf:

input {
  file {
    path => ["path/to/file.log"]
    start_position => "beginning"
    type => "linux-syslog"
    ignore_older => 0
  }
}

filter {
  if [type] == "linux-syslog" {
    grok {
      match => { "message" => "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["@IP_Elastic:Port_Elastic"]
  }
  stdout { codec => rubydebug }
}

The name of the host that generated each message should be present in the syslog_hostname field, right? If so, change your grok filter to store that string into the host field instead (you'll have to adjust the grok filter's overwrite option).
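For example, a minimal sketch of that change (same pattern as above, but capturing the hostname straight into host and letting grok overwrite the value the file input put there):

filter {
  if [type] == "linux-syslog" {
    grok {
      # Capture the sending host into "host" instead of "syslog_hostname".
      match => { "message" => "<%{POSINT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:host} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      # The file input already sets "host", so grok must be told to overwrite it.
      overwrite => ["host"]
    }
  }
}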

Yes, indeed, I'd like to keep the source IP or the name of the machine that generated the log, because when I look at Kibana the "host" field is set to 0.0.0.0 for all the syslog sources. Moreover, the original timestamp is not kept, only the timestamp of when Logstash processed the event.

Moreover, the original timestamp is not kept, only the timestamp of when Logstash processed the event.

Use a date filter to parse the syslog_timestamp field.
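Something like this (a sketch; SYSLOGTIMESTAMP carries no year, and the two patterns cover the single- and double-digit day forms):

filter {
  date {
    # Parse e.g. "Jan  2 15:04:05" or "Jan 12 15:04:05" into @timestamp.
    match => ["syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss"]
  }
}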

Thank you, Magnus.

Is it possible to reuse the name of the syslog directory for the "host" field in Logstash?
For example, use $HOST for the host field:

input {
  file {
    path => ["path/to/$HOST/file.log"]
    start_position => "beginning"
    type => "linux-syslog"
    ignore_older => 0
  }
}

Not like that, but you can use a grok filter to extract the hostname from the path field. Examples of this have been posted here and on Stack Overflow many times.
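For instance, assuming the directory named after the host sits right above the log file, as in path/to/$HOST/file.log above (adjust the pattern to your actual layout), a sketch could be:

filter {
  grok {
    # Hypothetical layout: the second-to-last path component names the source host.
    match => { "path" => "/(?<host>[^/]+)/[^/]+$" }
    # Replace the 0.0.0.0 value the file input stored in "host".
    overwrite => ["host"]
  }
}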