Logstash filters not working when all are applied at once, but working when applied individually

I have added three filters (conditional branches) in Logstash. Any single filter works fine on its own, and two together also work fine, but when all three are applied at once, one of the filters stops working and throws an error.

Below is my filter configuration:

else if [type] == "tls_log" {
    json {
        source => "message"
        target => "json"
        remove_field => ["message"]
    }
    mutate {
        rename => {
            "path" => "filename"
        }
    }
    if "_jsonparsefailure" in [tags] {
        mutate {
            add_field => {
                "checker" => "value_tls_parseerror"
                "logplane" => {{ .Values.log.logplane.default | quote }}
            }
            remove_field => [ "json" ]
        }
    }
    else {
        mutate {
            add_field => {
                "checker" => "value_tls"
                "service_id" => "%{[json][service_id]}"
                "version" => "%{[json][version]}"
                "[metadata][container_name]" => "%{[json][metadata][container_name]}"
                "[metadata][node_name]" => "%{[json][metadata][node_name]}"
                "[metadata][namespace]" => "%{[json][metadata][namespace]}"
                "[metadata][pod_name]" => "%{[json][metadata][pod_name]}"
                "[metadata][pod_uid]" => "%{[json][metadata][pod_uid]}"
                "logplane" => {{ .Values.log.logplane.default | quote }}
                "severity" => "%{[json][severity]}"
                "message" => "%{[json][message]}"
                "timestamp" => "%{[json][timestamp]}"
            }
            remove_field => [ "json" ]
        }
    }
}
else if [type] == "metrics_log" {
    json {
        source => "message"
        target => "json"
        remove_field => ["message"]
    }
    mutate {
        rename => {
            "path" => "filename"
        }
        add_field => {
            "checker" => "value_metric"
            "service_id" => "%{[json][service_id]}"
            "version" => "%{[json][version]}"
            "timestamp" => "%{[json][timestamp]}"
            "[metadata][container_name]" => "%{[json][metadata][container_name]}"
            "logplane" => {{ .Values.log.logplane.default | quote }}
            "severity" => "%{[json][severity]}"
            "message" => "%{[json][message]}"
        }
        remove_field => [ "json" ]
    }
}
else if [type] == "log" {
    json {
        source => "message"
        target => "json"
        remove_field => ["message"]
    }

    if [json][facility] {
        mutate {
            add_field => { "facility" => "%{[json][facility]}" }
        }
    }

    if [json][metadata][proc_id] {
        mutate {
            add_field => { "proc_id" => "%{[json][metadata][proc_id]}" }
        }
    }

    if [json][metadata][category] {
        mutate {
            add_field => { "category" => "%{[json][metadata][category]}" }
        }
    }

    mutate {
        rename => {
            "path" => "filename"
        }
        add_field => {
            "checker" => "value"
            "logplane" => {{ .Values.log.logplane.default | quote }}
            "version" => "%{[json][version]}"
            "severity" => "%{[json][severity]}"
            "service_id" => "%{[json][service_id]}"
            "[kubernetes][pod][name]" => "%{[json][metadata][pod_name]}"
            "[kubernetes][pod][uid]" => "%{[json][metadata][pod_uid]}"
            "[kubernetes][namespace]" => "%{[json][metadata][namespace]}"
            "[kubernetes][node][name]" => "%{[json][metadata][node_name]}"
            "[metadata][container_name]" => "%{[json][metadata][container_name]}"
            "[metadata][node_name]" => "%{[json][metadata][node_name]}"
            "[metadata][namespace]" => "%{[json][metadata][namespace]}"
            "[metadata][pod_name]" => "%{[json][metadata][pod_name]}"
            "[metadata][pod_uid]" => "%{[json][metadata][pod_uid]}"
            "message" => "%{[json][message]}"
        }
        remove_field => [ "type", "host", "json" ]
    }
}

Below is the error in the case of [type] == "log":

{
        "_index" : "adp-app-logs-2022.07.20",
        "_type" : "_doc",
        "_id" : "JpSlHIIBBZFaCg47krlI",
        "_score" : 0.0023733466,
        "_source" : {
          "version" : "%{[json][version]}",
          "kubernetes" : {
            "namespace" : "%{[json][metadata][namespace]}",
            "pod" : {
              "uid" : "%{[json][metadata][pod_uid]}",
              "name" : "%{[json][metadata][pod_name]}"
            },
            "node" : {
              "name" : "%{[json][metadata][node_name]}"
            }
          },
          "logplane" : "adp-app-logs",
          "checker" : "value",
          "@timestamp" : "2022-07-20T17:26:09.911Z",
          "filename" : "/logs/logtransformer.log",
          "severity" : "%{[json][severity]}",
          "@version" : "1",
          "metadata" : {
            "namespace" : "%{[json][metadata][namespace]}",
            "container_name" : "%{[json][metadata][container_name]}",
            "pod_name" : "%{[json][metadata][pod_name]}",
            "pod_uid" : "%{[json][metadata][pod_uid]}",
            "node_name" : "%{[json][metadata][node_name]}"
          },
          "service_id" : "%{[json][service_id]}",
          "tags" : [
            "_jsonparsefailure"
          ],
          "message" : [
            "{\"version\": \"1.1.0\", \"timestamp\": \"2022-07-20T17:26:04.694Z\", \"severity\": \"warning\", \"service_id\": \"eric-log-transformer\", \"metadata\" : {\"namespace\": \"zyadros\", \"pod_name\": \"eric-log-transformer-59577c5f7b-24btq\", \"node_name\": \"node-10-63-142-143\", \"pod_uid\": \"58b1e1ef-318a-4a06-ba00-ba0345972b13\", \"container_name\": \"logtransformer\"}, \"message\": \"Error parsing json {:source=>'message', :raw=>' at [Source: (byte[])'{'version': '1.1.0', 'timestamp': '2022-07-20T17:25:50.100Z', 'severity': 'warning', 'service_id': 'eric-log-transformer', 'metadata' : {'namespace': 'zyadros', 'pod_name': 'eric-log-transformer-59577c5f7b-24btq', 'node_name': 'node-10-63-142-143', 'pod_uid': '58b1e1ef-318a-4a06-ba00-ba0345972b13', 'container_name': 'logtransformer'}, 'message': 'Error parsing json {:source=>'message', :raw=>'{\\\\'version\\\\': \\\\'1.1.0\\\\', \\\\'timestamp\\\\': \\\\'2022-07-20T17:25:48.011Z\\\\', \\\\'severity\\\\': \\\\'warning\\\\', \\\\'servi'[truncated 1147 bytes]; line: 1, column: 400]>}\\'}', :exception=>#<LogStash::Json::ParserError: Unrecognized token 'at': was expecting ('true', 'false' or 'null')",
            "%{[json][message]}"
          ]
        }
      }

But when only a single filter is applied, each one gives all the values correctly.

Which filters? You need to provide more context.

Share your logstash configuration with the filters that are not working and also share some sample messages.

Without more information it is not possible to know what the issue is.

Added the configuration and the error.

The mutate filter has no order guarantee between add_field and remove_field, so you should not use them on the same field in the same mutate block.

There is a note in the documentation about it:

Each mutation must be in its own code block if the sequence of operations needs to be preserved.

Since you need to run remove_field after add_field, you need to use them in different mutate blocks.

Remove the json field from your remove_field and add another mutate block with it afterwards:

mutate {
    remove_field => ["json"]
}

See if this solves your issue.
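
For example, the log branch could be split roughly like the following. This is only a trimmed-down sketch with a few of the fields shown; keep your full add_field list and the Helm-templated logplane value. The point is that remove_field runs in its own mutate block after the one that does the add_field lookups.

mutate {
    rename => { "path" => "filename" }
}
mutate {
    # copy the values out of the parsed [json] object first
    add_field => {
        "checker" => "value"
        "severity" => "%{[json][severity]}"
        "message" => "%{[json][message]}"
    }
}
mutate {
    # only then drop the temporary [json] object
    remove_field => [ "json" ]
}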

Although it is not documented, the order of the common filter options is fixed. The code processes them in the order add_field, remove_field, add_tag, remove_tag.


Nothing happened; I am getting another error after adding this:

mutate { gsub => [ "message", "(\W)at(\W)", '\1""\2' ] }

The error:

"message" : "{\"version\": \"1.1.0\", \"timestamp\": \"2022-07-21T11:19:08.308Z\", \"severity\": \"warning\", \"service_id\": \"eric-log-transformer\", \"metadata\" : {\"namespace\": \"zyadros\", \"pod_name\": \"eric-log-transformer-846fc647f5-hhj46\", \"node_name\": \"node-10-63-142-142\", \"pod_uid\": \"b9aced00-0645-439c-a548-41ba593b9a97\", \"container_name\": \"logtransformer\"}, \"message\": \"Error parsing json {:source=>'message', :raw=>' \\'\\' [Source: (byte[])'{'version': '1.1.0', 'timestamp': '2022-07-21T11:19:05.628Z', 'severity': 'warning', 'service_id': 'eric-log-transformer', 'metadata' : {'namespace': 'zyadros', 'pod_name': 'eric-log-transformer-846fc647f5-hhj46', 'node_name': 'node-10-63-142-142', 'pod_uid': 'b9aced00-0645-439c-a548-41ba593b9a97', 'container_name': 'logtransformer'}, 'message': 'Error parsing json {:source=>'message', :raw=>' \\\\'\\\\' [Source: (byte[])'{'version': '1.1.0', 'timestamp': '2022-07-21T11:19:03.507Z', 'severity': 'warni'[truncated 588 bytes]; line: 1, column: 400]>}\\'}', :exception=>#<LogStash::Json::ParserError: Unrecognized token 'Source': was expecting ('true', 'false' or 'null')",

Even after this I am getting another Unrecognized token 'Source' error, and then more errors one after another.

Please share your full configuration and some sample messages to make it possible to try to replicate the issue.

I resolved the issue. I moved the json {} decoding from the filter section to the input section and used codec => json_lines. Now it is working perfectly fine.
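
For reference, the input-side decoding described above would look roughly like the sketch below. The file path is hypothetical and the input plugin in the real setup may differ; the idea is simply that the codec parses each JSON line before it reaches the filter section, so the filters no longer need a json {} block or a temporary [json] field. Depending on the input plugin, the plain json codec can also be a common choice for line-oriented inputs such as file.

input {
    file {
        path => "/logs/*.log"        # hypothetical path, adjust to the real log location
        codec => json_lines          # parse each line as JSON on the way in
    }
}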
