Error logs for nginx and apache not working

Hi there!

I'm adding some config files to my Logstash setup based on https://www.elastic.co/guide/en/logstash/current/logstash-config-for-filebeat-modules.html
So far I have two config files that work great:

File 1:

input {
  beats {
    port => 5044
    ssl => false
  }
}

filter {
  if "access_log" in [tags] {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    date {
      match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    }
    geoip {
      source => "clientip"
    }
    useragent {
      source => "agent"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    sniffing => true
    manage_template => true
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

File 2:

filter {
  if "nginx" in [tags] and "error_log" in [tags] {
    grok {
      match => { "message" => ["%{DATA:[nginx][error][time]} \[%{DATA:[nginx][error][level]}\] %{NUMBER:[nginx][error][pid]}#%{NUMBER:[nginx][error][tid]}: (\*%{NUMBER:[nginx][error][connection_id]} )?%{GREEDYDATA:[nginx][error][message]}"] }
      remove_field => "message"
    }
    mutate {
      rename => { "@timestamp" => "read_timestamp" }
    }
    date {
      match => [ "[nginx][error][time]", "YYYY/MM/dd H:m:s" ]
      remove_field => "[nginx][error][time]"
    }
  }
}

But when I add a third file to match Apache errors:

filter {
  if "apache" in [tags] and "error_log" in [tags] {
    grok {
      match => { "message" => ["\[%{APACHE_TIME:[apache2][error][timestamp]}\] \[%{LOGLEVEL:[apache2][error][level]}\]( \[client %{IPORHOST:[apache2][error][client]}\])? %{GREEDYDATA:[apache2][error][message]}",
        "\[%{APACHE_TIME:[apache2][error][timestamp]}\] \[%{DATA:[apache2][error][module]}:%{LOGLEVEL:[apache2][error][level]}\] \[pid %{NUMBER:[apache2][error][pid]}(:tid %{NUMBER:[apache2][error][tid]})?\]( \[client %{IPORHOST:[apache2][error][client]}\])? %{GREEDYDATA:[apache2][error][message1]}" ] }
      pattern_definitions => {
        "APACHE_TIME" => "%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR}"
      }
      remove_field => "message"
    }
    mutate {
      rename => { "[apache2][error][message1]" => "[apache2][error][message]" }
    }
    date {
      match => [ "[apache2][error][timestamp]", "EEE MMM dd H:m:s YYYY", "EEE MMM dd H:m:s.SSSSSS YYYY" ]
      remove_field => "[apache2][error][timestamp]"
    }
  }
}

every Apache error log entry gets ignored and never shows up in Kibana.

What am I doing wrong?

Thanks in advance

Standard advice: Use a stdout { codec => rubydebug } output until you're sure that events are being processed as expected and that they look good. Only then do you add more complexity by enabling an elasticsearch output.
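A minimal sketch of that debugging setup, keeping your existing input and filter sections and temporarily swapping the elasticsearch output for stdout, could look like this:

```
output {
  # Print every event in full, human-readable form so you can
  # verify the tags and parsed fields before re-enabling the
  # elasticsearch output.
  stdout { codec => rubydebug }
}
```

Once the events printed here look right, switch back to the elasticsearch output and any remaining problem is on the indexing side rather than in the filters.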

I'd also read the Logstash log and look for clues.

Hi Magnus, I followed your advice, but nothing suspicious showed up in the Logstash log.

But using a file output (instead of stdout), I see this logged:

{
  "@timestamp": "2017-10-27T17:00:17.352Z",
  "offset": 1298823,
  "apache2": {
    "error": {
      "pid": "58839",
      "message": "[client 201.148.293.147:60848] PHP Stack trace:, referer: https://www.google.com.ar/",
      "level": "error"
    }
  },
  "@version": "1",
  "input_type": "log",
  "beat": {
    "name": "troya",
    "hostname": "troya",
    "version": "5.6.2"
  },
  "host": "troya",
  "source": "/var/log/apache2/error.log",
  "type": "log",
  "fields": {
    "serverName": "troya",
    "datacenter": "azure",
    "appName": "apache",
    "serverType": "virtual"
  },
  "tags": ["apache", "error_log", "beats_input_codec_plain_applied"]
}

and when I switch back to the elasticsearch output, that line is ignored.

As you can see in that line's tags, "error_log" is present and the filter parsed the event, so why is it being ignored?

What am I doing wrong?

How are you verifying that nothing gets into ES? Have you checked all indexes?
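One quick way to check, assuming Elasticsearch is reachable on localhost:9200 as in the config above, is the cat indices API:

```shell
# List every index with its document count; a filebeat-* index
# that is missing, or whose docs.count stopped growing, points
# at the indexing side rather than the filters.
curl 'localhost:9200/_cat/indices?v'

# Search all indices for events tagged _grokparsefailure, which
# would mean the filter ran but the grok patterns didn't match.
curl 'localhost:9200/_all/_search?q=tags:_grokparsefailure&size=1&pretty'
```

If the documents are in Elasticsearch but under an unexpected index or date, the Kibana index pattern or time range is usually the culprit.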

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.