Logstash - Syslog Help - I am a total pleb

Hey Guys,

I am a noob when it comes to ELK but am really eager to get this set up. I am currently using ELK to store syslog from multiple firewalls. I am using a Fortinet (which is proving to be not much fun to work with). I am sending all of the syslog from the FortiGate to port 514 and attempting to have Logstash parse the logs. I know the following thus far:

  1. I am able to receive syslog on the Ubuntu instance on the server.
  2. Kibana is successfully receiving logs from Beats and is able to parse them with a Logstash parser that I set up (I followed a tutorial video on YouTube).
  3. I cobbled together the franken-code below. Please let me know where you think I should go next. I have been playing around with the configuration file for way too long... This is all in one file (which may be the problem?).

Logstash conf file:
input {
  file {
    path => "/var/log/syslog"
    type => "syslog"
    start_position => "beginning"
  }
  udp {
    port => 514
    type => "fortigate"
  }
  tcp {
    port => 514
    type => "fortigate"
  }
}
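
# (Aside: binding to ports below 1024, like 514, normally requires root
# privileges, so if Logstash runs as an unprivileged user these inputs may
# fail to open. A common workaround is a higher port on both the firewall
# and Logstash -- 5514 here is just an example, not a requirement.)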

# Configure syslog filtering
# for the Fortigate firewall logs

filter {
  if [type] == "fortigate" {
    mutate {
      add_tag => ["fortigate"]
    }
    grok {
      match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
      overwrite => [ "message" ]
      tag_on_failure => [ "failure_grok_fortigate" ]
    }

    kv { }
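    # (kv above works because FortiGate syslog is a series of key=value
    # pairs, e.g. devname=..., srcip=..., action=..., which the kv filter
    # splits into individual event fields)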

    if [msg] {
      mutate {
        replace => [ "message", "%{msg}" ]
      }
    }
    mutate {
      add_field => ["logTimestamp", "%{date} %{time}"]
      add_field => ["loglevel", "%{level}"]
      replace => [ "fortigate_type", "%{type}"]
      replace => [ "fortigate_subtype", "%{subtype}"]
      remove_field => [ "msg", "type", "level", "date", "time" ]
    }
    date {
      locale => "en"
      match => ["logTimestamp", "YYYY-MM-dd HH:mm:ss"]
      remove_field => ["logTimestamp", "year", "month", "day", "time", "date"]
      add_field => ["type", "fortigate"]
    }
  } # end if type fortigate
}

output {
  if ( [type] == "fortigate" ) {
    #stdout { codec => rubydebug }
    elasticsearch {
      index => "logstash_fortigate-%{+YYYY.MM.dd}"
      host => ["localhost:9200"]
      protocol => "http"
      port => "443"
    }
  }
}

And what problem are you experiencing?

host => ["localhost:9200"]
protocol => "http"
port => "443"

Port 443 is normally used for HTTPS. And what port do you want to use, 9200 or 443?
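
For what it's worth, a default local Elasticsearch listens on 9200 over plain HTTP, so the output could be as simple as this (a sketch assuming Logstash 2.x or later, where the separate host/protocol/port options were replaced by a single hosts option):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "logstash_fortigate-%{+YYYY.MM.dd}"
  }
}

You can keep your surrounding if [type] == "fortigate" conditional around it as before.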

Thanks for the help Magnus!

I think 443 is best. The server is hosted on my local device, as is Kibana. But I also gave it my IP so that when you hit the IP in a web browser, it redirects to port 80.

I guess I should've been clearer: I don't think Logstash is parsing the data properly, and I don't really know the best way to go about this for both the grok and kv filters.


Showing an example event produced by Logstash would be a good start. Comment out your elasticsearch output and use a stdout { codec => rubydebug } output for debugging purposes until the events look as expected.

I set it to this, but still nothing is populating in Kibana for me to visualize. Is it possible I shouldn't be looking in Kibana?

This is the code I currently have:

output {
  if ( [type] == "fortigate" ) {
    stdout { codec => rubydebug }
    #elasticsearch {
    #  index => "logstash_fortigate-%{+YYYY.MM.dd}"
    #  host => ["localhost:9200"]
    #  protocol => "http"
    #  port => "443"
    #}
  }
}

The stdout output will dump the events in the Logstash log so that's where you should look.
