Parsing Syslog to Logstash

Hey,

I'm currently trying to ship my syslog events to my Elasticsearch host via Logstash.
I created a new .conf file under /etc/logstash/conf.d called syslog.conf.

This file looks like this:


input {
  syslog {
    port => 514
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => "http://b4d1syslog.b4dom1.local:9200"
    manage_template => false
    index => "syslog"
    document_type => "%{[@metadata][type]}"
  }
}


Sadly, I can't get the logs to show up in Elasticsearch.
When I run "tcpdump -A -i any dst port 514" I can see the right logs coming in, so my client is configured correctly.

I would be grateful if anyone could help me. Thx :slight_smile:

Hi there,

when posting parts of code or responses, please highlight your text and press the </> (preformatted text) icon to format it.

Anyway, can you paste here what is printed in stdout running the following pipeline?

input {
  syslog {
    port => 514
  }
}

filter {}

output {
  stdout{}
}
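One more thing worth ruling out, since tcpdump only proves the packets reach the machine, not that Logstash actually receives them: on Linux, ports below 1024 are privileged, so if Logstash runs as the unprivileged logstash service user it may fail to bind port 514 at all. A quick check, sketched below (assuming a Linux host with iproute2's ss available):

```shell
# Ports below 1024 are privileged on Linux; when Logstash runs as the
# unprivileged "logstash" service user it cannot bind port 514, so
# events never reach the pipeline even though tcpdump sees them arrive.
# Check whether anything is actually listening on the syslog port:
ss -lntu | grep ':514 ' || echo "nothing listening on port 514"
```

If nothing is listening, either run the input on an unprivileged port (e.g. the 5514+ range) or grant Logstash the capability to bind low ports.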

What makes you think that [type] will contain the value "syslog", and what makes you think [@metadata][type] will contain a value?
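For context: the syslog input does not set [type] on its own, so the conditional in your filter only matches if you assign it explicitly on the input. A sketch of what that could look like (the value "syslog" here is just an assumption about what you'd want):

input {
  syslog {
    port => 514
    type => "syslog"    # without this, the "if [type] == \"syslog\"" conditional never matches
  }
}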


Hey,

to be honest, I copied that part of the code... My goal is just to index all syslog data coming in on port 5541 into Elasticsearch. Maybe you could give me a hint on how this can be accomplished?

Thanks in advance :slight_smile:

input {
  syslog {
    port => 5541
  }
}

filter {}

output {
  elasticsearch {
    hosts => "whatever_your_host_is:whatever_your_port_is"
    index => "whatever_your_index_is"
  }
}
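Filled in with the values from your original config (assuming that host and index name are still what you want), that would be for example:

output {
  elasticsearch {
    hosts => "http://b4d1syslog.b4dom1.local:9200"
    index => "syslog"
  }
}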

Thanks! :slight_smile:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.