Nginx module | Processor drop_event HTTP 200 not working

I'm trying to exclude HTTP 200 events from the Nginx module using processors.

My config file nginx.yml is:

# Module: nginx
# Docs: https://www.elastic.co/guide/en/beats/filebeat/7.4/filebeat-module-nginx.html

- module: nginx
  # Access logs
  access:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/var/log/nginx/*access.log"]
    input:
      processors:
      - drop_event:
          when:
            equals:
              http.response.status_code: 200

  # Error logs
  error:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/var/log/nginx/*error.log"]

Logs flooding

Oct 13 13:40:16 KTA10-CG12-DH2 filebeat[4207]: 2019-10-13T13:40:16.197-0300#011WARN#011[conditions]#011conditions/equals.go:100#011unexpected type []string in equals condition as it accepts only integers, strings, or booleans.

Using when.equals, when.contains, and when.range.gt: 200 all give the same result.

No logs in the Elasticsearch index =(

Hi @lfraga, thanks for trying the Elastic Stack, Filebeat, and the nginx module.

Per the exported fields defined here, http.response.status_code appears to be a string... perhaps your drop_event condition should use "200", although I agree the examples seem to indicate what you have should work.
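That would mean quoting the value in the condition, i.e. something like this fragment (only valid if the field really is mapped as a string, which the EDIT below calls into question):

```yaml
    input:
      processors:
        - drop_event:
            when:
              equals:
                http.response.status_code: "200"
```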

EDIT: I looked at the mapping for http.response.status_code and it is a long:

"http" : {
  "properties" : {
 ......
    "response" : {
      "properties" : {
        "body" : {
          "properties" : {
            "bytes" : {
              "type" : "long"
            },
            "content" : {
              "type" : "keyword",
              "ignore_above" : 1024
            }
          }
        },
        "bytes" : {
          "type" : "long"
        },
        "status_code" : {
          "type" : "long"
        }
      }
    },

So now I am not sure why it is not working.... Hmm

Ahh take a look at this post. Now it makes more sense.

With Filebeat, the whole log line is shipped as the message field and only parsed by the ingest pipeline on the Elasticsearch side. The structured fields are therefore not yet available to the drop_event processor on the harvester side: the condition cannot find the field, the event is never dropped, and that is probably what is producing those warning logs.

You will need to use a different approach.

For example, exclude_lines, or a drop_event with a regexp condition on the message field.
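For reference, an exclude_lines variant could look like the sketch below. The pattern assumes the default "combined" access-log format, where the status code follows the quoted request string:

```yaml
- module: nginx
  access:
    enabled: true
    var.paths: ["/var/log/nginx/*access.log"]
    input:
      # Drop raw lines whose status code is 200 before they are shipped.
      # The leading quote anchors on the end of the request string.
      exclude_lines: ['" 200 ']
```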

NOTE: I got this to work in nginx.yml:

- module: nginx

  # Access logs
  access:
    enabled: true

    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/Users/sbrown/workspace/sample-data/nginx/nginx.log"]

    input:
      processors:
      - add_locale: ~
      - drop_event.when.regexp.message: " 200 "
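One caveat with matching " 200 " anywhere in the message: in the combined log format the response body size also sits between spaces, so a 404 response that happens to be 200 bytes long would match too. A small Python sketch (using Python's re as a stand-in for the Go regexp the condition actually uses, with hypothetical log lines) shows the difference a leading quote makes:

```python
import re

# Hypothetical nginx access-log lines in the default "combined" format.
ok = '1.2.3.4 - - [13/Oct/2019:13:40:16 -0300] "GET / HTTP/1.1" 200 612 "-" "curl/7.58"'
notfound = '1.2.3.4 - - [13/Oct/2019:13:40:17 -0300] "GET /x HTTP/1.1" 404 200 "-" "curl/7.58"'

loose = re.compile(r" 200 ")    # pattern from the config above
strict = re.compile(r'" 200 ')  # also require the quote that closes the request

print(bool(loose.search(ok)), bool(loose.search(notfound)))    # True True: the 404 line matches via its 200-byte body size
print(bool(strict.search(ok)), bool(strict.search(notfound)))  # True False
```

So `drop_event.when.regexp.message: '" 200 '` would be a slightly safer pattern if your traffic can produce 200-byte responses.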

BTW, I had to add the add_locale processor: it seems it is added automatically, but it needs to be explicitly defined when adding other processors. Perhaps that is a minor bug.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.