Syslog input configuration for Logstash

I want to send syslog data to my Logstash instance, but I'm not sure how to incorporate it with my Beats input. When I have tried adding it, my Beats input stops working. Below is my Logstash config file.

input {
  beats {
    port => 5044
  }
}

filter {
  if [source][ip] =~ /^10\.3\./ {
    mutate { replace      => { "[source][geo][timezone]"      => "America/New_York" } }
    mutate { replace      => { "[source][geo][country_name]"  => "Office" } }
    mutate { replace      => { "[source][geo][country_code2]" => "US" } }
    mutate { replace      => { "[source][geo][country_code3]" => "IMO" } }
    mutate { replace      => { "[source][geo][city_name]"     => "XXX" } }
    mutate { replace      => { "[source][geo][region_name]"   => "XX" } }
    mutate { replace      => { "[source][geo][region_code]"   => "XX" } }
    mutate { replace      => { "[source][geo][postal_code]"   => "XXXXX" } }
    mutate { update       => { "[source][geo][ip]"            => "%{[source][ip]}" } }
    mutate { remove_field => [ "[source][geo][location]" ] }
    mutate { add_field    => { "[source][geo][location]"      => "-XX.XX" } }
    mutate { add_field    => { "[source][geo][location]"      => "XX.XXX" } }
    mutate { convert      => [ "[source][geo][location]",  "float" ] }
    mutate { replace      => [ "[source][geo][latitude]",  "XX.XX" ] }
    mutate { convert      => [ "[source][geo][latitude]",  "float" ] }
    mutate { replace      => [ "[source][geo][longitude]", "-XX.XXX" ] }
    mutate { convert      => [ "[source][geo][longitude]", "float" ] }
  } else {
    geoip {
      source => "[source][ip]"
      target => "[source][geo]"
    }
  }
}
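As an aside, matching internal ranges with a regex like `=~ /^10\.3\./` is easy to get subtly wrong (unescaped dots match any character, and an address such as 10.30.x.x would also match the original pattern). The cidr filter plugin, which ships with the default Logstash distribution, does a proper subnet test instead. A sketch, assuming 10.3.0.0/16 is the office range:

```
filter {
  # Tag events whose source IP falls inside the office subnet;
  # later conditionals can then test: if "internal" in [tags] { ... }
  cidr {
    address => [ "%{[source][ip]}" ]
    network => [ "10.3.0.0/16" ]
    add_tag => [ "internal" ]
  }
}
```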

output {
  if [@metadata][pipeline] {
    elasticsearch {
      ilm_enabled => true
      hosts       => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
      index       => "%{[@metadata][beat]}-%{[@metadata][version]}"
      ilm_pattern => "000001"
      pipeline    => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      hosts       => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
      index       => "%{[@metadata][beat]}-%{[@metadata][version]}"
      ilm_pattern => "000001"
    }
  }
}

I have tried adding this to the input:

  tcp {
    port => 5040
    type => "syslog"
  }
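Since some inputs and filters overwrite [type], a tag is often a more robust discriminator than type. A sketch of the combined input using both (the tag name is arbitrary):

```
input {
  beats {
    port => 5044
  }

  tcp {
    port => 5040
    type => "syslog"
    tags => [ "syslog" ]   # tags survive even if a filter later rewrites [type]
  }
}
```

Downstream conditionals can then use `if "syslog" in [tags]` instead of `if [type] == "syslog"`.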

And this to the filter:

  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => {
        "received_at"   => "%{@timestamp}"
        "received_from" => "%{host}"
      }
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
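One refinement worth considering here: RFC 3164 syslog timestamps carry neither a year nor a time zone, so the date filter assumes the Logstash host's zone unless told otherwise. A sketch (the zone value is an assumption; substitute whatever zone your devices log in):

```
    date {
      match        => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
      timezone     => "America/New_York"        # assumed device zone
      remove_field => [ "syslog_timestamp" ]    # drop the raw copy once parsed
    }
```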

And this to the output:

  elasticsearch {
    ilm_enabled => true
    hosts       => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
    index       => "syslog-000001"
    ilm_pattern => "000001"
  }

Any help will be appreciated.

Anyone have any thoughts?

I am running Logstash 7.3 if that will help.

What is your question?

I want to know how to get my Beats to keep working while adding the syslog portion. I'm having trouble with the input and output. I have tried the input like so:

input {
  beats {
    port => 5044
  }

  syslog {
    port         => 5040
    codec        => cef
    syslog_field => "syslog"
    grok_pattern => "<%{POSINT:priority}>%{SYSLOGTIMESTAMP:timestamp}  %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}"
    add_field    => {
      "received_at"   => "%{@timestamp}"
      "received_from" => "%{host}"
    }
  }
}
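One thing that stands out in this attempt: `codec => cef` tells the input to parse the payload as CEF, while `grok_pattern` changes how the syslog header itself is matched, so combining them only makes sense for CEF-over-syslog sources (e.g. certain firewalls). For plain syslog, a minimal input might look like this (a sketch, untested against your devices):

```
  syslog {
    port => 5040
    # the default grok_pattern and syslog_field handle standard RFC 3164 lines
    add_field => {
      "received_at"   => "%{@timestamp}"
      "received_from" => "%{host}"
    }
  }
```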

I really struggle with the output. I want an if/else statement but am not sure how to structure it correctly. This is what I have so far.

output {
  if [type] == "beats" {
    elasticsearch {
      ilm_enabled => true
      hosts       => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
      index       => "%{[@metadata][beat]}-%{[@metadata][version]}"
      ilm_pattern => "000001"
      pipeline    => "%{[@metadata][pipeline]}"
    }
  } else {
    elasticsearch {
      ilm_enabled => true
      hosts       => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
      index       => "syslog-000001"
      ilm_pattern => "000001"
    }
  }
}
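For what it's worth, the beats input never sets [type] to "beats", so the `if [type] == "beats"` branch never matches and every event falls through to the else. Keying on a tag set at the input avoids this. A sketch, assuming the syslog or tcp input adds `tags => ["syslog"]`:

```
output {
  if "syslog" in [tags] {
    elasticsearch {
      ilm_enabled => true
      hosts       => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
      index       => "syslog-000001"
      ilm_pattern => "000001"
    }
  } else {
    elasticsearch {
      ilm_enabled => true
      hosts       => ["http://10.3.200.45:9200", "http://10.3.200.46:9200"]
      index       => "%{[@metadata][beat]}-%{[@metadata][version]}"
      ilm_pattern => "000001"
      pipeline    => "%{[@metadata][pipeline]}"
    }
  }
}
```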

The problem with this is that Beats stops working and everything goes to the syslog index, but no actual syslogs are getting pulled in.

Can anyone help me understand what I am doing wrong?

Does anyone have any ideas?

Please and thank you.

My two cents:
I tried using the syslog input on Logstash many versions ago. Logstash's syslog input didn't scale, so for syslog ingestion I use syslog-ng, which is relatively simple and scales much better than Logstash in terms of events per second. Syslog-ng drops everything to a flat file, and a separate Logstash pipeline reads the flat file. Thus, my suggestion:

  1. simple syslog-ng setup to drop to flat files. (It can even drop to a json file.)
  2. pipelines.yml to separate beats input and flat file input in logstash processes.
    One of the benefits is that you can see the data that you'll need to parse in flat file. There are plenty of log types that send "type" as a field that will mess with your if [type] logic.
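A minimal pipelines.yml for that split might look like this (pipeline IDs and paths are examples):

```
- pipeline.id: beats
  path.config: "/etc/logstash/conf.d/beats.conf"
- pipeline.id: syslog-files
  path.config: "/etc/logstash/conf.d/syslog_files.conf"
```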

@sampas Let me see if I understand. Set up a syslog application to gather all the logs, then have Filebeat or something ingest them into Logstash? Does Syslog-ng work on Windows?

Thank you

Syslog-ng is very good at collecting traditional events sent via the syslog protocol, but there's no Windows version. I consume the flat files that syslog-ng writes with Logstash, and it scales well. You could also consume the flat files with Beats. Part of the value of flat files is that you get to see what you're trying to parse, e.g. what format the timestamps are in, the field layout, etc.
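For reference, the syslog-ng side of this can be quite small. A sketch (the port and paths are examples):

```
# Listen for UDP syslog and write one flat file per sending host.
source s_net { udp(port(514)); };
destination d_file { file("/var/log/remote/${HOST}.log"); };
log { source(s_net); destination(d_file); };
```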

Running the two listeners in separate Logstash pipelines (controlled via pipelines.yml) will also help keep the logic separate. You can also have Logstash output the syslog to a flat file just so you can see it.
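The Logstash side of the flat-file pipeline would then be a file input, something like this (paths are examples):

```
input {
  file {
    path           => [ "/var/log/remote/*.log" ]
    start_position => "beginning"
    sincedb_path   => "/var/lib/logstash/sincedb_syslog"   # track read position
  }
}
```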

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.