_grokparsefailure on [fields][document_type]

Hey guys,

I'm having a hard time understanding my mistake: I'm getting a _grokparsefailure from a Logstash filter. I'm using an Apache access log and an NGINX access log, which are collected by two Filebeat prospectors and then sent to Logstash:

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/apache2/*access*log
  encoding: plain
  fields_under_root: false
  fields:
    document_type: apache-access-log
  scan_frequency: 10s
  harvester_buffer_size: 16384
  tail_files: false
  backoff: 1s
  max_backoff: 10s
  backoff_factor: 2
  max_bytes: 10485760

# second prospector, under the same filebeat.prospectors key as above
- input_type: log
  paths:
    - /var/log/nginx/*access*log
  encoding: plain
  fields_under_root: false
  fields:
    document_type: nginx-access-log
  scan_frequency: 10s
  harvester_buffer_size: 16384
  tail_files: false
  backoff: 1s
  max_backoff: 10s
  backoff_factor: 2
  max_bytes: 10485760
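
Since fields_under_root is set to false, the document_type field ends up nested under a top-level fields key, which is why I'm checking [fields][document_type] in my Logstash conditionals below.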

On the Logstash side I'm using the following:

input {
  beats {
    port => 5050
    type => "logs"
  }
}

filter {
  if [fields][document_type] == "apache-access-log" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}"}
      add_tag => [ "apache-access-log" , "grokked" ]
    }
    geoip {
      source => "clientip"
    }
  }
}

output {
  elasticsearch { hosts => ["10.6.247.12:9200"] }
}

This works fine, and Kibana shows my defined tags "apache-access-log" and "grokked". But when I add a second filter for my NGINX access log, like this:

input {
  beats {
    port => 5050
    type => "logs"
  }
}

filter {
  if [fields][document_type] == "apache-access-log" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}"}
      add_tag => [ "apache-access-log" , "grokked" ]
    }
    geoip {
      source => "clientip"
    }
  }

  if [fields][document_type] == "nginx-access-log" {
    grok {
      add_tag => [ "nginx-access-log" , "grokked" ]
    }
  }
}

output {
  elasticsearch { hosts => ["10.6.247.12:9200"] }
}

My NGINX access logs don't get the additional tags; instead they get a _grokparsefailure tag.

Running Filebeat in debug mode, I can see my defined fields:

2018/01/27 15:06:42.978815 client.go:214: DBG  Publish: {
  "@timestamp": "2018-01-27T15:06:41.052Z",
  "beat": {
    "hostname": "SOME-HOSTNAME",
    "name": "SOME-DOMAIN",
    "version": "5.6.6"
  },
  "fields": {
    "document_type": "nginx-access-log"
  },
  "input_type": "log",
  "message": "[27/Jan/2018:16:06:38 +0100] Cache: - 10.6.247.12:8080 0.093 200 15121 SOMEIP /mountain-air-adventure/",
  "offset": 1598409,
  "source": "/var/log/nginx/SOME-DOMAIN.access_log",
  "type": "log"
}

Even when I run the NGINX access log filter on its own, the result stays the same.

The Apache access log grok is working as expected:

(screenshot: kibana_apache_grok)

You have a match => pattern defined for the apache-access-log type, but not for the nginx-access-log one. The _grokparsefailure tag is added whenever a grok filter fails to match.
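
If you do want to actually parse the NGINX line rather than just tag it, a grok along these lines could be a starting point. This is only a sketch: the pattern and the field names (cache_status, upstream_host, request_time, and so on) are guesses based on the sample line in your debug output, so adjust them to your real log format:

filter {
  if [fields][document_type] == "nginx-access-log" {
    grok {
      # pattern guessed from the sample message above; field names are placeholders
      match => { "message" => "\[%{HTTPDATE:timestamp}\] Cache: %{DATA:cache_status} %{IPORHOST:upstream_host}:%{POSINT:upstream_port} %{NUMBER:request_time} %{NUMBER:response_code} %{NUMBER:bytes} %{NOTSPACE:clientip} %{NOTSPACE:request_path}" }
      add_tag => [ "nginx-access-log" , "grokked" ]
    }
  }
}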

Yeah, and since I'm just adding a tag, I have to use the mutate filter rather than grok:

filter {
  if [fields][document_type] == "nginx-access-log" {
    mutate {
      add_tag => [ "nginx-access-log" , "grokked" ]
    }
  }
}
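
The difference being that grok only applies add_tag on a successful match, while mutate adds the tag unconditionally, so there's nothing that can fail.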

This topic can be closed.

Cheers,
David
