Adding a field creates an array, not a static string

When I add this in my filter, the field is added, but my other pipelines also add a field of the same name, and the values turn into an array instead of a single static value per log.

     mutate {
       add_field => {
        "service_name" => "company"
       }
     }

If that filter is executed twice, you will get an array instead of a text field. How are your pipelines configured? Are you using pipelines.yml?
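As an illustration (the duplicated filter below is hypothetical, not taken from your config): `mutate`'s `add_field` appends when the field already exists, so running the same `add_field` twice against one event produces an array:

```
filter {
  # first pipeline file adds the field
  mutate { add_field => { "service_name" => "company" } }
  # a second file merged into the same pipeline adds it again,
  # so the value is appended rather than overwritten
  mutate { add_field => { "service_name" => "company" } }
}
# resulting event: "service_name" => ["company", "company"]
```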

It's in K8s under logstashPipeline: I have multiple pipelines, each with an input/filter/output.

  input {
    beats {
      port => 5052
    }
  }
  filter {
    dissect {
      mapping => {
        "message" => "%{TIMESTAMP_ISO8601:timestamp}||%{LOGLEVEL:level}||%{DATA:thread_id}||%{IPV4:originating_ip}||%{USERNAME:username}||%{USERNAME:legacy_person_id}||%{UUID:person_id}||%{USER:amzn_trace_id}||%{DATA:source_location}||%{GREEDYDATA:message}"
      }
    }

    mutate {
      rename => {
        "TIMESTAMP_ISO8601:timestamp" => "timestamp"
        "LOGLEVEL:level" => "level"
        "DATA:thread_id" => "thread_id"
        "IPV4:originating_ip" => "originating_ip"
        "USERNAME:username" => "username"
        "USERNAME:legacy_person_id" => "legacy_person_id"
        "UUID:person_id" => "person_id"
        "USER:amzn_trace_id" => "amzn_trace_id"
        "DATA:source_location" => "source_location"
        "GREEDYDATA:message" => "message"
      }
    }

    mutate {
      add_field => {
        "service_name" => "company"
      }
    }
  }
  output {
    amazon_es {
      hosts =>
      ssl => true
      region => "us-east-1"
      index => "${env}-services-company-logs-%{+YYYY.MM}"
    }
  }

Are you expecting each configuration file to automatically run in a separate pipeline? That does not happen unless you configure it that way using pipelines.yml. If you point path.config (or -f) at a directory that contains more than one configuration file, they are combined into a single pipeline. So if you have two files

input { http { ... } }
filter { mutate { ... } }
output { stdout {... } }

and

input { file { ... } }
filter { csv { ... } }
output { elasticsearch {... } }

that is equivalent to

input {
    http { ... }
    file { ... }
}
filter {
    mutate { ... }
    csv { ... }
}
output {
    stdout {... }
    elasticsearch {... }
}
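To actually get separate pipelines, each file needs its own entry in pipelines.yml; a sketch (the ids and file names here are hypothetical):

```yaml
# pipelines.yml — each entry runs as an isolated pipeline with
# its own inputs, filters, and outputs
- pipeline.id: http_pipeline
  path.config: "/usr/share/logstash/pipeline/http.conf"
- pipeline.id: file_pipeline
  path.config: "/usr/share/logstash/pipeline/file.conf"
```

With that in place, events from the http input never pass through the csv filter or the elasticsearch output, and vice versa.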

How do you separate them from a Helm values file perspective?

  logstash.yml: |
    http.host: "0.0.0.0"
    path.config: /usr/share/logstash/pipeline

# Allows you to add any pipeline files in /usr/share/logstash/pipeline/
# Warning: there is a hardcoded logstash.conf in the image; override it first
logstashPipeline:
  logstash.conf: |
    input {
      beats {
        port => 5044
      }
    }
    output {
      amazon_es {
        hosts => 
        ssl => true
        region => "us-east-1"
        index => "${env}-logs-%{+YYYY.MM}"
      }
    }

I added the pipelines.yml config, but I am still getting the service_name field with values from the other pipelines.

    - pipeline.id: main
      path.config: "/usr/share/logstash/pipeline/logstash.conf"
    - pipeline.id: services 
      path.config: "/usr/share/logstash/pipeline/services.conf"
    - pipeline.id: avenger 
      path.config: "/usr/share/logstash/pipeline/avenger.conf"
    - pipeline.id: homeval 
      path.config: "/usr/share/logstash/pipeline/homeval.conf"
    - pipeline.id: address 
      path.config: "/usr/share/logstash/pipeline/address.conf"
    - pipeline.id: company 
      path.config: "/usr/share/logstash/pipeline/company.conf"

I have no idea why that would happen.
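One thing worth checking (an assumption about this setup, not something confirmed in the thread): Logstash only reads pipelines.yml when no pipeline is defined elsewhere. If logstash.yml still sets path.config to the shared /usr/share/logstash/pipeline directory, every .conf file in it can end up merged into one pipeline, which would reproduce exactly this behaviour. A sketch of a logstash.yml that defers to pipelines.yml:

```yaml
# logstash.yml (sketch) — note there is no path.config entry here,
# so pipeline definitions are taken from pipelines.yml instead
http.host: "0.0.0.0"
```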

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.