Elasticsearch index not created after adding logstash (works fine with filebeat -> es)

I spin up an ELK stack using docker-compose (see below). I have managed to successfully enable basic authentication (using xpack + certs + built-in passwords) for the ES cluster, connect filebeat to the ES cluster, and verify that it can create indices in ES that can be visualized in Kibana.

I am now struggling to get the ELK stack working after adding Logstash to the mix. Basically I don't see any errors in any logs (the Logstash application log explodes with filebeat-related messages), but I don't see any new indices after adding Logstash, either directly in ES or from Kibana.

Example of the stdout log from logstash:

logstash    | {
logstash    |            "log" => {
logstash    |         "offset" => 327249,
logstash    |           "file" => {
logstash    |             "path" => "/var/lib/docker/containers/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a-json.log"
logstash    |         }
logstash    |     },
logstash    |          "input" => {
logstash    |         "type" => "container"
logstash    |     },
logstash    |          "agent" => {
logstash    |             "hostname" => "72023ae57024",
logstash    |                   "id" => "3661cf95-1132-4bc9-b8d1-ff20fccbb204",
logstash    |                 "type" => "filebeat",
logstash    |         "ephemeral_id" => "f6a19670-82bf-4f74-bfbb-ff6d2a719211",
logstash    |              "version" => "7.7.0"
logstash    |     },
logstash    |        "message" => "             \"version\"\e[0;37m => \e[0m\e[0;33m\"7.7.0\"\e[0m",
logstash    |       "@version" => "1",
logstash    |            "ecs" => {
logstash    |         "version" => "1.5.0"
logstash    |     },
logstash    |           "tags" => [
logstash    |         [0] "beats_input_codec_plain_applied"
logstash    |     ],
logstash    |         "stream" => "stdout",
logstash    |      "container" => {
logstash    |             "id" => "00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a",
...
logstash    |                      "com_docker_compose_config-hash" => "ed767746a4356329ad15d43cf459c4ba2cfa7edf0c01094681aa433167fb329e",
...
logstash    |                    "org_opencontainers_image_created" => "2020-05-04 00:00:00+01:00",
...
logstash    |         },
logstash    |           "name" => "logstash",
logstash    |          "image" => {
logstash    |             "name" => "docker.elastic.co/logstash/logstash:7.7.0"
logstash    |         }
logstash    |     },
logstash    |           "host" => {
logstash    |         "name" => "72023ae57024"
logstash    |     },
logstash    |     "@timestamp" => 2020-06-16T22:51:43.117Z
logstash    | }

Example of the filebeat log (the only thing I changed in the filebeat configuration after adding Logstash was adding the output.logstash part and commenting out the output.elasticsearch part):

{"level":"debug","timestamp":"2020-06-17T05:36:28.051Z","logger":"processors","caller":"processing/processors.go:112","message":"Fail to apply processor client{add_docker_metadata=[match_fields=[] match_pids=[process.pid, process.ppid]], decode_json_fields=message}: multiple json elements found"}
{"level":"debug","timestamp":"2020-06-17T05:36:28.051Z","logger":"processors","caller":"processing/processors.go:187","message":"Publish event: {\n  \"@timestamp\": \"2020-06-17T05:35:37.545Z\",\n  \"@metadata\": {\n    \"beat\": \"filebeat\",\n    \"type\": \"_doc\",\n    \"version\": \"7.7.0\"\n  },\n  \"host\": {\n    \"name\": \"72023ae57024\"\n  },\n  \"agent\": {\n    \"ephemeral_id\": \"f6a19670-82bf-4f74-bfbb-ff6d2a719211\",\n    \"hostname\": \"72023ae57024\",\n    \"id\": \"3661cf95-1132-4bc9-b8d1-ff20fccbb204\",\n    \"version\": \"7.7.0\",\n    \"type\": \"filebeat\"\n  },\n  \"ecs\": {\n    \"version\": \"1.5.0\"\n  },\n  \"log\": {\n    \"file\": {\n      \"path\": \"/var/lib/docker/containers/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a-json.log\"\n    },\n    \"offset\": 2090197\n  },\n  \"stream\": \"stdout\",\n  \"message\": \"        \\\"labels\\\"\\u001b[0;37m => \\u001b[0m{\",\n  \"input\": {\n    \"type\": \"container\"\n  },\n  \"container\": {\n    \"id\": \"00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a\",\n    \"image\": {\n      \"name\": \"docker.elastic.co/logstash/logstash:7.7.0\"\n    },\n    \"name\": \"logstash\",\n    \"labels\": {\n      \"org_label-schema_url\": \"https://www.elastic.co/products/logstash\",\n      \"org_label-schema_name\": \"logstash\",\n      \"org_label-schema_version\": \"7.7.0\",\n      \"org_opencontainers_image_licenses\": \"GPL-2.0-only\",\n      \"license\": \"Elastic License\",\n      \"com_docker_compose_project\": \"elasticsearch\",\n      \"org_label-schema_schema-version\": \"1.0\",\n      \"org_label-schema_vcs-url\": \"https://github.com/elastic/logstash\",\n      \"com_docker_compose_project_config_files\": \"docker-compose.yml\",\n      \"com_docker_compose_service\": \"logstash\",\n      \"org_label-schema_build-date\": \"20200504\",\n      \"org_opencontainers_image_title\": \"CentOS Base Image\",\n      \"com_docker_compose_container-number\": \"1\",\n      \"com_docker_compose_config-hash\": \"ed767746a4356329ad15d43cf459c4ba2cfa7edf0c01094681aa433167fb329e\",\n      \"org_opencontainers_image_vendor\": \"CentOS\",\n      \"com_docker_compose_version\": \"1.25.4\",\n      \"com_docker_compose_project_working_dir\": \"/home/user/repos/k8s-samples/elasticsearch\",\n      \"org_label-schema_license\": \"GPLv2\",\n      \"org_label-schema_vendor\": \"Elastic\",\n      \"com_docker_compose_oneoff\": \"False\",\n      \"org_opencontainers_image_created\": \"2020-05-04 00:00:00+01:00\"\n    }\n  }\n}"}

Details below:

docker-compose file:

version: '2.2'
services:
  es01:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.7.0
    container_name: es01
    environment:
      - node.name=es01
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es02,es03
      - cluster.initial_master_nodes=es01,es02,es03
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - data01:/usr/share/elasticsearch/data
      - ./es01.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - ./certs/instance/instance.key:/usr/share/elasticsearch/config/instance.key
      - ./certs/instance/instance.crt:/usr/share/elasticsearch/config/instance.crt
      - ./certs/ca/ca.crt:/usr/share/elasticsearch/config/ca.crt
    ports:
      - 9200:9200
    networks:
      - elastic

  es02:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.7.0
    container_name: es02
    environment:
      - node.name=es02
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es01,es03
      - cluster.initial_master_nodes=es01,es02,es03
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - data02:/usr/share/elasticsearch/data
      - ./es02.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - ./certs/instance/instance.key:/usr/share/elasticsearch/config/instance.key
      - ./certs/instance/instance.crt:/usr/share/elasticsearch/config/instance.crt
      - ./certs/ca/ca.crt:/usr/share/elasticsearch/config/ca.crt
    ports:
      - 9201:9200
    networks:
      - elastic

  es03:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.7.0
    container_name: es03
    environment:
      - node.name=es03
      - cluster.name=es-docker-cluster
      - discovery.seed_hosts=es01,es02
      - cluster.initial_master_nodes=es01,es02,es03
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - data03:/usr/share/elasticsearch/data
      - ./es03.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - ./certs/instance/instance.key:/usr/share/elasticsearch/config/instance.key
      - ./certs/instance/instance.crt:/usr/share/elasticsearch/config/instance.crt
      - ./certs/ca/ca.crt:/usr/share/elasticsearch/config/ca.crt

    ports:
      - 9202:9200
    networks:
      - elastic

  kib01:
    image: docker.elastic.co/kibana/kibana:7.7.0
    container_name: kib01
    ports:
      - 5601:5601
    environment:
      ELASTICSEARCH_URL: http://es01:9200
    volumes:
      - ./kibana.yml:/usr/share/kibana/config/kibana.yml
    networks:
      - elastic

  filebeat:
    image: docker.elastic.co/beats/filebeat:7.7.0
    container_name: filebeat

    user: root
    # sudo chown root filebeat.yaml
    # sudo chmod 600 filebeat.yaml
    volumes:
      - ./filebeat.yaml:/usr/share/filebeat/filebeat.yml:ro
      - /var/lib/docker:/var/lib/docker:ro
      - /var/run/docker.sock:/var/run/docker.sock
    depends_on: ['es01']
    networks:
      - elastic


  logstash:
    image: docker.elastic.co/logstash/logstash:7.7.0
    container_name: logstash
    ulimits:
      memlock:
        soft: -1
        hard: -1    

    volumes:
      - ./poc-logstash.conf:/usr/share/logstash/config/logstash.conf:ro
      - ./logstash.yaml:/usr/share/logstash/config/logstash.yml:ro
    ports:
      - "5000:5000"
      - "5044:5044"
    stdin_open: true
    tty: true    
    networks:
      - elastic
    logging:
      driver: "json-file"
      options:
        max-size: "10m"
        max-file: "50"


volumes:
  data01:
    driver: local
  data02:
    driver: local
  data03:
    driver: local

networks:
  elastic:
    driver: bridge

es01.yml (same for es02 and es03)

cluster.name: "docker-cluster"
network.host: 0.0.0.0
xpack.security.enabled: "true"
xpack.security.transport.ssl.enabled: "true"
xpack.security.transport.ssl.verification_mode: certificate
xpack.security.transport.ssl.key: /usr/share/elasticsearch/config/instance.key
xpack.security.transport.ssl.certificate: /usr/share/elasticsearch/config/instance.crt
xpack.security.transport.ssl.certificate_authorities: /usr/share/elasticsearch/config/ca.crt

filebeat.yaml

filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log

    processors:
      - add_docker_metadata:
          host: "unix:///var/run/docker.sock"

      - decode_json_fields:
          fields: ["message"]
          target: "json"
          overwrite_keys: true


# E.g. for debugging/connecting directly to es        
#output.elasticsearch:
# username: elastic
# password: qwerty
# hosts: ["es01:9200"] 
# indices:
#   - index: "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"

output.logstash:
  hosts: ["logstash:5044"]
#================================ Logging =====================================

logging.json: true
logging.metrics.enabled: false
  
# Sets log level. The default log level is info.
# Available log levels are: error, warning, info, debug
logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
logging.selectors: ["*"]

logging.to_files: true
logging.to_syslog: false

logging.files:
  path: /tmp
  name: filebeat.log
  keepfiles: 2
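
As a sanity check, filebeat ships with a built-in connectivity test for whichever output is configured; it can be run inside the container (assuming the default config path baked into the image):

$ docker exec -it filebeat filebeat test output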

logstash.yaml

log.level: error
path.logs: /var/log/logstash

poc-logstash.conf

input { 
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["es01:9200"]
    index => "logstash-%{[agent.version]}-%{+yyyy.MM.dd}"
    user => "elastic"
    password => "qwerty"
  }
}
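
(As a side note, while debugging one can temporarily add a stdout output next to the elasticsearch one to see each event as it leaves the pipeline - a minimal sketch, not currently part of my config:)

output {
  # temporary debug output: prints every event in rubydebug format
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["es01:9200"]
    index => "logstash-%{[agent.version]}-%{+yyyy.MM.dd}"
    user => "elastic"
    password => "qwerty"
  }
}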

According to the above format I would expect to see some logstash-* indices but they are not created:

$ curl -u elastic:qwerty http://localhost:9200/_cat/indices
green open .security-7                      Dw8XpnyKRtqbR53Ig9gJVQ 1 1    42 0 195.6kb 111.7kb
green open .apm-custom-link                 rhNlihx4QvydwGrYxVA5RA 1 1     0 0    416b    208b
green open filebeat-7.7.0-2020.06.16        2BYYtD2TScahZ4LNi3n0eQ 1 1 11587 0  18.4mb   9.2mb
green open .kibana_task_manager_1           S8t9FisxQkSK3lAcF5qTyw 1 1     5 1  61.7kb  30.8kb
green open .apm-agent-configuration         4q9YUYO7QR-hGZByrX-cUg 1 1     0 0    416b    208b
green open .async-search                    EMrwH1yFT92dG1wETi1svw 1 1     5 2  17.3mb   8.6mb
green open filebeat-7.7.0-2020.06.16-000001 BB6DWhlASOGdoltSCWawEg 1 1     0 0    416b    208b
green open .kibana_1                        8mR1ZJchQZyCVnTcIcY7bQ 1 1    49 6 384.6kb 198.2kb
green open filebeat-7.7.0-2020.06.10        dNxjQF1xSqaX1pHQxcjy_Q 1 1  2163 0 892.6kb 402.7kb

There are some filebeat-* indices, but they are from when I only had filebeat -> ES (which also works fine with basic authentication).
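
One more thing worth checking is the Logstash node stats API from inside the container (9600 is the default HTTP API port), to see whether events are flowing through the pipeline and its outputs at all:

$ docker exec -it logstash curl -s 'localhost:9600/_node/stats/pipelines?pretty'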

You should really look at the filebeat-related messages in the Logstash logs to diagnose the issue.

I have updated the original post with some info from the logs - had to limit it because of the post length limit.

Below I have added some more info.

logstash stdout log:

kib01       | {"type":"log","@timestamp":"2020-06-17T06:42:09Z","tags":["listening","info"],"pid":6,"message":"Server running at http://0:5601"}
kib01       | {"type":"log","@timestamp":"2020-06-17T06:42:10Z","tags":["info","http","server","Kibana"],"pid":6,"message":"http server running at http://0:5601"}
logstash    | /usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/awesome_print-1.7.0/lib/awesome_print/formatters/base_formatter.rb:31: warning: constant ::Fixnum is deprecated
logstash    | {
logstash    |            "log" => {
logstash    |           "file" => {
logstash    |             "path" => "/var/lib/docker/containers/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a-json.log"
logstash    |         },
logstash    |         "offset" => 310526
logstash    |     },
logstash    |        "message" => "    },",
logstash    |          "input" => {
logstash    |         "type" => "container"
logstash    |     },
logstash    |           "tags" => [
logstash    |         [0] "beats_input_codec_plain_applied"
logstash    |     ],
logstash    |      "container" => {
logstash    |             "id" => "00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a",
logstash    |          "image" => {
logstash    |             "name" => "docker.elastic.co/logstash/logstash:7.7.0"
logstash    |         },
logstash    |           "name" => "logstash",
logstash    |         "labels" => {
logstash    |                     "org_label-schema_schema-version" => "1.0",
logstash    |                     "org_opencontainers_image_vendor" => "CentOS",
logstash    |                                             "license" => "Elastic License",
logstash    |                               "org_label-schema_name" => "logstash",
logstash    |             "com_docker_compose_project_config_files" => "docker-compose.yml",
logstash    |              "com_docker_compose_project_working_dir" => "/home/user/repos/samples/elasticsearch",
logstash    |                          "com_docker_compose_project" => "elasticsearch",
logstash    |                          "com_docker_compose_version" => "1.25.4",
logstash    |                            "org_label-schema_version" => "7.7.0",
logstash    |                      "org_opencontainers_image_title" => "CentOS Base Image",
logstash    |                    "org_opencontainers_image_created" => "2020-05-04 00:00:00+01:00",
logstash    |                         "org_label-schema_build-date" => "20200504",
logstash    |                   "org_opencontainers_image_licenses" => "GPL-2.0-only",
logstash    |                            "org_label-schema_license" => "GPLv2",
logstash    |                          "com_docker_compose_service" => "logstash",
logstash    |                 "com_docker_compose_container-number" => "1",
logstash    |                           "com_docker_compose_oneoff" => "False",
logstash    |                             "org_label-schema_vendor" => "Elastic",
logstash    |                            "org_label-schema_vcs-url" => "https://github.com/elastic/logstash",
logstash    |                      "com_docker_compose_config-hash" => "ed767746a4356329ad15d43cf459c4ba2cfa7edf0c01094681aa433167fb329e",
logstash    |                                "org_label-schema_url" => "https://www.elastic.co/products/logstash"
logstash    |         }
logstash    |     },
logstash    |         "stream" => "stdout",
logstash    |           "host" => {
logstash    |         "name" => "72023ae57024"
logstash    |     },
logstash    |          "agent" => {
logstash    |         "ephemeral_id" => "dba47690-02a3-43ff-8953-462682fc1eea",
logstash    |                 "type" => "filebeat",
logstash    |             "hostname" => "72023ae57024",
logstash    |                   "id" => "3661cf95-1132-4bc9-b8d1-ff20fccbb204",
logstash    |              "version" => "7.7.0"
logstash    |     },
logstash    |       "@version" => "1",
logstash    |     "@timestamp" => 2020-06-17T05:56:38.759Z,
logstash    |            "ecs" => {
logstash    |         "version" => "1.5.0"
logstash    |     }
logstash    | }
logstash    | {
logstash    |            "log" => {
logstash    |           "file" => {
logstash    |             "path" => "/var/lib/docker/containers/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a-json.log"
logstash    |         },
logstash    |         "offset" => 311733
logstash    |     },
logstash    |        "message" => "    },",
logstash    |          "input" => {
logstash    |         "type" => "container"
logstash    |     },
logstash    |           "tags" => [
logstash    |         [0] "beats_input_codec_plain_applied"
logstash    |     ],
logstash    |         "stream" => "stdout",
logstash    |      "container" => {
logstash    |             "id" => "00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a",
logstash    |          "image" => {
logstash    |             "name" => "docker.elastic.co/logstash/logstash:7.7.0"
logstash    |         },
logstash    |           "name" => "logstash",
logstash    |         "labels" => {
logstash    |                     "org_label-schema_schema-version" => "1.0",
logstash    |                     "org_opencontainers_image_vendor" => "CentOS",
logstash    |                                             "license" => "Elastic License",
logstash    |                               "org_label-schema_name" => "logstash",
logstash    |             "com_docker_compose_project_config_files" => "docker-compose.yml",
logstash    |              "com_docker_compose_project_working_dir" => "/home/user/repos/samples/elasticsearch",
logstash    |                          "com_docker_compose_project" => "elasticsearch",
logstash    |                          "com_docker_compose_version" => "1.25.4",
logstash    |                            "org_label-schema_version" => "7.7.0",
logstash    |                      "org_opencontainers_image_title" => "CentOS Base Image",
logstash    |                    "org_opencontainers_image_created" => "2020-05-04 00:00:00+01:00",
logstash    |                 "com_docker_compose_container-number" => "1",
logstash    |                   "org_opencontainers_image_licenses" => "GPL-2.0-only",
logstash    |                            "org_label-schema_license" => "GPLv2",
logstash    |                          "com_docker_compose_service" => "logstash",
logstash    |                         "org_label-schema_build-date" => "20200504",
logstash    |                             "org_label-schema_vendor" => "Elastic",
logstash    |                           "com_docker_compose_oneoff" => "False",
logstash    |                            "org_label-schema_vcs-url" => "https://github.com/elastic/logstash",
logstash    |                                "org_label-schema_url" => "https://www.elastic.co/products/logstash",
logstash    |                      "com_docker_compose_config-hash" => "ed767746a4356329ad15d43cf459c4ba2cfa7edf0c01094681aa433167fb329e"
logstash    |         }
logstash    |     },
logstash    |           "host" => {
logstash    |         "name" => "72023ae57024"
logstash    |     },
logstash    |          "agent" => {
logstash    |         "ephemeral_id" => "dba47690-02a3-43ff-8953-462682fc1eea",
logstash    |             "hostname" => "72023ae57024",
logstash    |                   "id" => "3661cf95-1132-4bc9-b8d1-ff20fccbb204",
logstash    |                 "type" => "filebeat",
logstash    |              "version" => "7.7.0"
logstash    |     },
logstash    |       "@version" => "1",
logstash    |     "@timestamp" => 2020-06-17T05:56:38.759Z,



And some more from filebeat.log:


{"level":"debug","timestamp":"2020-06-17T06:51:31.411Z","logger":"processors","caller":"processing/processors.go:112","message":"Fail to apply processor client{add_docker_metadata=[match_fields=[] match_pids=[process.pid, process.ppid]], decode_json_fields=message}: multiple json elements found"}
{"level":"debug","timestamp":"2020-06-17T06:51:31.412Z","logger":"processors","caller":"processing/processors.go:187","message":"Publish event: {\n  \"@timestamp\": \"2020-06-17T06:45:51.297Z\",\n  \"@metadata\": {\n    \"beat\": \"filebeat\",\n    \"type\": \"_doc\",\n    \"version\": \"7.7.0\"\n  },\n  \"ecs\": {\n    \"version\": \"1.5.0\"\n  },\n  \"host\": {\n    \"name\": \"72023ae57024\"\n  },\n  \"agent\": {\n    \"version\": \"7.7.0\",\n    \"type\": \"filebeat\",\n    \"ephemeral_id\": \"71c91c6a-8595-40cd-9035-a6d748c4e4ad\",\n    \"hostname\": \"72023ae57024\",\n    \"id\": \"3661cf95-1132-4bc9-b8d1-ff20fccbb204\"\n  },\n  \"message\": \"          \\\"file\\\"\\u001b[0;37m => \\u001b[0m{\",\n  \"log\": {\n    \"offset\": 801979,\n    \"file\": {\n      \"path\": \"/var/lib/docker/containers/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a-json.log\"\n    }\n  },\n  \"stream\": \"stdout\",\n  \"input\": {\n    \"type\": \"container\"\n  },\n  \"container\": {\n    \"name\": \"logstash\",\n    \"labels\": {\n      \"org_label-schema_license\": \"GPLv2\",\n      \"org_label-schema_url\": \"https://www.elastic.co/products/logstash\",\n      \"org_opencontainers_image_vendor\": \"CentOS\",\n      \"com_docker_compose_oneoff\": \"False\",\n      \"org_label-schema_schema-version\": \"1.0\",\n      \"org_label-schema_vcs-url\": \"https://github.com/elastic/logstash\",\n      \"org_label-schema_version\": \"7.7.0\",\n      \"com_docker_compose_project_working_dir\": \"/home/user/repos/k8s-samples/elasticsearch\",\n      \"org_opencontainers_image_created\": \"2020-05-04 00:00:00+01:00\",\n      \"org_label-schema_name\": \"logstash\",\n      \"com_docker_compose_project_config_files\": \"docker-compose.yml\",\n      \"com_docker_compose_version\": \"1.25.4\",\n      \"com_docker_compose_project\": \"elasticsearch\",\n      \"org_label-schema_build-date\": \"20200504\",\n      \"license\": \"Elastic License\",\n      \"com_docker_compose_container-number\": \"1\",\n      \"com_docker_compose_service\": \"logstash\",\n      \"com_docker_compose_config-hash\": \"ed767746a4356329ad15d43cf459c4ba2cfa7edf0c01094681aa433167fb329e\",\n      \"org_opencontainers_image_title\": \"CentOS Base Image\",\n      \"org_opencontainers_image_licenses\": \"GPL-2.0-only\",\n      \"org_label-schema_vendor\": \"Elastic\"\n    },\n    \"id\": \"00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a\",\n    \"image\": {\n      \"name\": \"docker.elastic.co/logstash/logstash:7.7.0\"\n    }\n  }\n}"}
{"level":"debug","timestamp":"2020-06-17T06:51:31.412Z","logger":"truncate_fields","caller":"actions/decode_json_fields.go:119","message":"Error trying to unmarshal             \"path\"\u001b[0;37m => \u001b[0m\u001b[0;33m\"/var/lib/docker/containers/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a/00475a24e42f152d3e7a6044c456c315ec7fd4feac908137f7d32d36d8e0ea7a-json.log\"\u001b[0m"}
{"level":"debug","timestamp":"2020-06-17T06:51:31.412Z","logger":"processors","caller":"processing/processors.go:112","message":"Fail to apply processor client{add_docker_metadata=[match_fields=[] match_pids=[process.pid, process.ppid]], decode_json_fields=message}: multiple 

Any thoughts/input?

Based on this, you're sending output to ES, not Logstash.
However:

This shows stdout output. Any chance you're looking at the wrong config or the wrong output?

Any connectivity error to ES should appear in /var/log/logstash. Try increasing the log level to info or debug if you don't see anything.

Yes, that is what I have in poc-logstash.conf in the Logstash container, and it's supposed to send to Elasticsearch, not to Logstash (that would result in Logstash sending to itself...). Or am I misreading your comment?

Yes, I wasn't expecting any stdout log based on that config, but your previous post showed stdout output, which was the reason for my previous comment.

Anyway, did you manage to find any ES-related error in /var/log/logstash?

It does not explain why you would not see an index, but note that when the [agent] field on an event is an object that contains a [version] field, in Logstash that is referred to as [agent][version]. [agent.version] would refer to a single field with a period in its name.
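
So with the nested-field reference syntax, the index option would look like this (a sketch):

output {
  elasticsearch {
    hosts => ["es01:9200"]
    # [agent][version] references the nested field; agent.version would be a literal field name containing a period
    index => "logstash-%{[agent][version]}-%{+yyyy.MM.dd}"
    user => "elastic"
    password => "qwerty"
  }
}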

Hm /var/log/logstash does not exist:

$ docker exec -it logstash bash
bash-4.2$ ls -la /var/log/
total 368
drwxr-xr-x 1 root root   4096 May  4 15:37 .
drwxr-xr-x 1 root root   4096 May  4 15:36 ..
-rw------- 1 root utmp      0 May  4 15:36 btmp
-rw-r--r-- 1 root root    193 May  4 15:36 grubby_prune_debug
-rw-r--r-- 1 root root 292292 May 12 04:44 lastlog
-rw------- 1 root root  64064 May 12 04:44 tallylog
-rw-rw-r-- 1 root utmp      0 May  4 15:36 wtmp
-rw------- 1 root root   3664 May 12 04:44 yum.log

Even though logstash.yml contains:

log.level: debug
path.logs: /var/log/logstash

So I still only have the stdout that does NOT contain any errors.

I do see:

logstash    | Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties

That indicates that Logstash logging needs to be configured using log4j2 (which is not the case for the other parts of the ELK stack). So I tried updating /usr/share/logstash/config/log4j2.properties:

status = debug
name = LogstashPropertiesConfig


appender.default.file=org.apache.log4j.FileAppender
appender.default.file.append=true
appender.default.file.file=/var/log/mylogfile.log

appender.console.type = Console
appender.console.name = plain_console
appender.console.layout.type = PatternLayout
appender.console.layout.pattern = [%d{ISO8601}][%-5p][%-25c]%notEmpty{[%X{pipeline.id}]}%notEmpty{[%X{plugin.id}]} %m%n

appender.json_console.type = Console
appender.json_console.name = json_console
appender.json_console.layout.type = JSONLayout
appender.json_console.layout.compact = true
appender.json_console.layout.eventEol = true

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.console.ref = ${sys:ls.log.format}_console

but then Logstash terminates with:

logstash    | [FATAL] 2020-06-20 10:56:25.510 [main] runner - An unexpected error occurred! {:error=>org.apache.logging.log4j.core.config.ConfigurationException: No name attribute provided for Appender default, 
...
logstash    | [ERROR] 2020-06-20 10:35:52.439 [main] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
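
Reading up on log4j2 properties syntax, appenders apparently need both a type and a name attribute, so a minimal file appender would presumably look more like this (untested on my side):

appender.default.type = File
appender.default.name = default_file
appender.default.fileName = /var/log/mylogfile.log
appender.default.layout.type = PatternLayout
appender.default.layout.pattern = [%d{ISO8601}][%-5p][%-25c] %m%n

rootLogger.level = ${sys:ls.log.level}
rootLogger.appenderRef.default.ref = default_file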

But what is the point of directing the logs to a file if they don't contain any errors in the first place (before spending more time on debugging log4j configuration)?

Not exactly sure what you mean, but I also tried naming the index like so:

output {
  elasticsearch {
    hosts => ["es01:9200"]
    #index => "logstash-%{[agent.version]}-%{+yyyy.MM.dd}"
    index => "logstash-%{+yyyy.MM.dd}"
    user => "elastic"
    password => "qwerty"
  }
}

Still nothing gets created.

I doubt that the log4j config is the cause. I've encountered cases where Logstash ran fine from the CLI as the root user, but when run from systemctl, the logstash user was unable to write the log file due to a permission issue (with no error whatsoever).

Are you using a self-signed CA? If so, maybe you need to add the CA to the config?
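
Something along these lines, if you were connecting to ES over HTTPS (a sketch; cacert is the elasticsearch output plugin's option for a custom CA certificate, and the path assumes the CA cert is mounted into the Logstash container):

output {
  elasticsearch {
    # only relevant if the ES HTTP layer has TLS enabled
    hosts => ["https://es01:9200"]
    cacert => "/usr/share/logstash/config/ca.crt"
    user => "elastic"
    password => "qwerty"
  }
}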

OK, seems like I have to stick with reading Logstash logs from stdout for now.

Regarding CA I am using the approach described here:
https://www.elastic.co/guide/en/elasticsearch/reference/7.8/configuring-security.html
https://www.elastic.co/guide/en/elasticsearch/reference/7.8/configuring-tls.html

and configured that using xpack (see details in my original post), and I have verified that it works. I have also uploaded my configuration here:

A bit surprised that it is so difficult to debug this "simple" issue considering all the input above - it makes me wonder whether it would be better to leave Logstash out of the mix, or would that mean we would be better off looking into a complete alternative to the ELK stack?

I'm not really familiar with running Logstash in Docker, but based on the documentation, Logstash in Docker looks for pipeline config under /usr/share/logstash/pipeline/, and you need to either overwrite logstash.conf in that directory or overwrite the entire default pipeline directory altogether. Your config above uses /usr/share/logstash/config.

Nothing in your config indicates that you're sending output to stdout, yet you're seeing documents on stdout. My guess is that Logstash is picking up the wrong config due to that behavior mentioned in the documentation.
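
If I remember correctly, the default pipeline config shipped in the Logstash image is roughly the following (quoted from memory, so treat it as an assumption), which would explain the rubydebug-formatted documents you're seeing on stdout:

input {
  beats {
    port => 5044
  }
}

output {
  stdout {
    codec => rubydebug
  }
}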

Updating the compose file so that the conf file gets mounted into the pipeline folder fixed the issue:

      # WRONG: mounted into the config folder
      #- ./poc-logstash.conf:/usr/share/logstash/config/logstash.conf:ro
      # CORRECT: the conf file needs to go into the pipeline folder
      - ./poc-logstash.conf:/usr/share/logstash/pipeline/logstash.conf:ro

I now see the new indices in ES after this change - thanks for the patience!
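
For completeness, the new indices can be verified the same way as before:

$ curl -u elastic:qwerty 'http://localhost:9200/_cat/indices/logstash-*'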
