Postfix logs not forwarding to ELK


#1

I thought I had this fixed, but my "exclude_files" line was commented out. I moved the input up to the top and it still does not work. I am getting data from the other two inputs, so I am thinking it has to do with this configuration setting. Any ideas would be appreciated!

filebeat.prospectors:

- input_type: log
  document_type: postfix
  paths:
   - /var/log/mail.*

- input_type: log
  document_type: syslog
  exclude_files: ['mail.log']
  paths:
    - /var/log/*.log

- input_type: log
  json.message_key: message
  json.keys_under_root: true
  document_type: apache
  paths:
    - /var/log/apache2/logstash_access_log
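
(Side note: the `exclude_files` entries are regular expressions matched against the file path, not plain filenames, so an anchored form may be safer. A sketch of what I mean, with the regex escaped:)

```yaml
- input_type: log
  document_type: syslog
  # Anchor the pattern so only files ending in "mail.log" are skipped;
  # an unescaped dot would also match e.g. "mailXlog".
  exclude_files: ['mail\.log$']
  paths:
    - /var/log/*.log
```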

#2

I didn't change anything, but now I am seeing the postfix logs in ELK. This is just a small lab, so it's not heavily used and there shouldn't be any delay.


(ruflin) #3

Best to have a look at your Filebeat logs to see if there are any errors inside that explain the delay. Which version are you using?


#4

My custom filter/pattern is not working and I am getting a _grokparsefailure tag in Kibana. I am seeing this error in the logs. Can someone help me interpret its meaning?

[2017-04-25T09:00:59,495][ERROR][logstash.pipeline ] Error registering plugin {:plugin=>"#<LogStash::FilterDelegator:0x50ee7435 @id=\"d78b782db93e05e8e2cb2b01787934c31753cfe8-3\", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x1ff45cdd @metric=#<LogStash::Instrument::Metric:0x4bdac55 @collector=#<LogStash::Instrument::Collector:0x64b68eaa @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x475a01d5 @store=#<Concurrent::Map:0x7af5af57 @default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x13436b58>, @fast_lookup=#<Concurrent::Map:0x43d06444 @default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :\"d78b782db93e05e8e2cb2b01787934c31753cfe8-3\", :events]>, @logger=#<LogStash::Logging::Logger:0x3cb1380 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x5d79037>>, @filter=<LogStash::Filters::Grok patterns_dir=>[\"/etc/logstash/patterns\"], match=>{\"message\"=>\"%{POSTFIX}\"}, add_tag=>[\"postfix\", \"grokked\"], id=>\"d78b782db93e05e8e2cb2b01787934c31753cfe8-3\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"pattern %{OSTNAME:hostname} not defined"}


(ruflin) #5

That sounds more like a Logstash problem. One issue could be the following message: "pattern %{OSTNAME:hostname} not defined". This should probably be HOSTNAME.


#6

I noticed that, but in my custom pattern it is listed properly. Any idea where else the problem could lie?

filter {
  if [type] == "postfix" {
    grok {
      patterns_dir => ["/etc/logstash/patterns"]
      match => [ "message", "%{POSTFIX}" ]
      add_tag => [ "postfix", "grokked" ]
    }
  }
}





COMP ([\w._\/%-]+)
COMPPID postfix\/%{COMP:component}(?:\[%{POSINT:pid}\])?
QUEUEID ([A-F0-9]{5,15}{1})
#EMAILADDRESSPART [a-zA-Z0-9_.+-=:]+
#EMAILADDRESS %{EMAILADDRESSPART:local}@%{EMAILADDRESSPART:remote}
POSTFIX %{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:hostname} %{COMPPID}: %{QUEUEID:queueid}
#POSTFIXQMGR %{POSTFIX}: (?:removed|from=<(?:%{EMAILADDRESS:from})?>(?:, size=%{POSINT:size}, nrcpt=%{POSINT:nrcpt} \(%{GREEDYDATA:queuestatus}\))?)
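
(To sanity-check the patterns outside Logstash, I translated them into plain Python regexes and ran them against a made-up mail.log line. The sample line, the named groups, and the simplified timestamp/host regexes are my own approximations, not the exact grok definitions. Note the trailing `{1}` on QUEUEID is redundant, so I dropped it here.)

```python
import re

# Rough Python equivalents of the custom grok patterns above.
COMP = r"[\w._/%-]+"
COMPPID = rf"postfix/(?P<component>{COMP})(?:\[(?P<pid>[1-9][0-9]*)\])?"
QUEUEID = r"[A-F0-9]{5,15}"  # the extra {1} in the pattern file is a no-op
SYSLOGTIMESTAMP = r"[A-Z][a-z]{2} +\d{1,2} \d{2}:\d{2}:\d{2}"  # simplified
SYSLOGHOST = r"[\w.-]+"  # simplified
POSTFIX = (rf"(?P<timestamp>{SYSLOGTIMESTAMP}) (?P<hostname>{SYSLOGHOST}) "
           rf"{COMPPID}: (?P<queueid>{QUEUEID})")

# Made-up sample line for testing only.
sample = "Apr 25 09:00:59 mailhost postfix/smtpd[1234]: 3C1D22100A: client=unknown"
m = re.match(POSTFIX, sample)
print(m.groupdict() if m else "no match")
```

The pattern itself matches a well-formed line, which points back at the %{OSTNAME} typo coming from a stale/cached pattern file rather than the definitions above.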

#7

OK, I am VERY frustrated, because it's now working. The only thing I've done is reboot the server; until then I was just restarting logstash & elasticsearch from the command line. Logstash sure is black magic at times.


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.