Logstash with docker logs and tags

Hello, I am trying to ship a Docker log from one machine with Filebeat, adding a tag to it, and when it reaches Logstash it should match on that tag and then give it an ilm_rollover_alias. The log appears in Kibana, but under the original index name instead of the ilm_rollover_alias. This has worked for all the other logs I have sent, but none of them were Docker logs, so I'm guessing I am missing something.

logstash.conf

input {
    beats {
        port => "5044"
    }
}

filter {
  if "logs-docker" in [tags] {
    mutate {
      add_tag => ["logs-docker"]
    }
  }
}

output {
  if "logs-docker" in [tags] {
    elasticsearch {
      hosts => ["https://localhost:9200"]
      user => "username"
      password => "pass"
      ssl => true
      cacert => "/home/ubuntu/elk/ca/ca.crt"
      ilm_rollover_alias => "logs-docker"
      ilm_pattern => "000001"
      ilm_policy => "logstash-policy"
    }
  }
}
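One way to see exactly what Logstash receives from Filebeat (a temporary debugging sketch, not part of the pipeline above) is to add a stdout output alongside the elasticsearch one; the rubydebug codec prints each event in full, including the top-level tags array that Filebeat populates:

```
output {
  # Temporary debug output: prints every event with all of its
  # fields, so you can confirm whether "logs-docker" actually
  # appears in the top-level "tags" array.
  stdout {
    codec => rubydebug
  }
}
```

Remove this output once the tag routing is confirmed, since it logs every event to the Logstash console.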


filebeat.yml

# ============================== Filebeat modules ==============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s


filebeat.inputs:
- type: docker
  combine_partial: true
  containers:
    ids:
      - "*"
  tags: ["logs-docker"]
  exclude_files: ['\.gz$']
  ignore_older: 3000m

processors:
  # Decode the log field (a sub-JSON document) if it is JSON encoded,
  # then map its fields to Elasticsearch fields.
  #- decode_json_fields:
  #    fields: ["log", "message"]
  #    target: ""
  #    # Overwrite existing target Elasticsearch fields while decoding JSON fields.
  #    overwrite_keys: true
  - add_docker_metadata:
      host: "unix:///var/run/docker.sock"


output.logstash:
  hosts: ["my_ip:5044"]
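As a side note, in recent Filebeat versions the docker input type is deprecated in favour of the container input. An equivalent sketch (assuming Docker's default log location under /var/lib/docker; adjust the path if your Docker data root differs) would be:

```
filebeat.inputs:
- type: container
  # Default location of Docker's JSON log files; change this if
  # your Docker daemon uses a non-default data root.
  paths:
    - /var/lib/docker/containers/*/*.log
  tags: ["logs-docker"]
  exclude_files: ['\.gz$']
  ignore_older: 3000m
```

The tags setting behaves the same way with either input type: the values end up in the event's top-level tags array when it reaches Logstash.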
