Logstash starts with an error and does not receive filebeat data

Hello!
I am new to the ELK Stack. I am setting up ELK in Docker to collect logs from my containers, and I have run into some problems.

  1. When I run the filebeat container, I get a warning: "Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning." (see the output fragment right after this list)
  2. In Kibana, the filebeat-* index pattern does not appear.
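
For reference, here is the output section from my filebeat.yml (the full file is further down). If I understand the warning correctly, it appears because the Logstash output is enabled instead of the Elasticsearch output, so Filebeat cannot load the Ingest Node pipelines itself:

output.logstash:
  hosts: ["195.168.122.45:5044"]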

Please help me identify the cause of these errors. What have I done wrong?

Thank you very much!

Warning log:

filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.573Z      INFO    instance/beat.go:571    Home path: [/usr/share/filebeat] Config path: [/usr/share/filebeat] Data path: [/usr/share/filebeat/data] Logs path: [/usr/share/filebeat/logs]
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.578Z      INFO    instance/beat.go:579    Beat ID: 1bd8d5cd-f352-49df-85be-213e43ea38f9
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.578Z      INFO    [index-management.ilm]  ilm/ilm.go:129  Policy name: filebeat-7.1.0
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.588Z      INFO    [seccomp]       seccomp/seccomp.go:116  Syscall filter successfully installed
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.588Z      INFO    [beat]  instance/beat.go:827    Beat info       {"system_info": {"beat": {"path": {"config": "/usr/share/filebeat", "data": "/usr/share/filebeat/data", "home": "/usr/share/filebeat", "logs": "/usr/share/filebeat/logs"}, "type": "filebeat", "uuid": "1bd8d5cd-f352-49df-85be-213e43ea38f9"}}}
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.588Z      INFO    [beat]  instance/beat.go:836    Build info      {"system_info": {"build": {"commit": "03b3db2a1d9d76fdf10475e829fce436c61901e4", "libbeat": "7.1.0", "time": "2019-05-15T23:59:19.000Z", "version": "7.1.0"}}}
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.588Z      INFO    [beat]  instance/beat.go:839    Go runtime info {"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":8,"version":"go1.11.5"}}}
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.590Z      INFO    [beat]  instance/beat.go:843    Host info       {"system_info": {"host": {"architecture":"x86_64","boot_time":"2019-05-22T10:02:45Z","containerized":true,"name":"9dd693197533","ip":["127.0.0.1/8","172.23.0.5/16"],"kernel_version":"4.15.0-50-generic","mac":["02:42:ac:17:00:05"],"os":{"family":"redhat","platform":"centos","name":"CentOS Linux","version":"7 (Core)","major":7,"minor":6,"patch":1810,"codename":"Core"},"timezone":"UTC","timezone_offset_sec":0}}}
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.590Z      INFO    [beat]  instance/beat.go:872    Process info    {"system_info": {"process": {"capabilities": {"inheritable":["chown","dac_override","fowner","fsetid","kill","setgid","setuid","setpcap","net_bind_service","net_raw","sys_chroot","mknod","audit_write","setfcap"],"permitted":["chown","dac_override","fowner","fsetid","kill","setgid","setuid","setpcap","net_bind_service","net_raw","sys_chroot","mknod","audit_write","setfcap"],"effective":["chown","dac_override","fowner","fsetid","kill","setgid","setuid","setpcap","net_bind_service","net_raw","sys_chroot","mknod","audit_write","setfcap"],"bounding":["chown","dac_override","fowner","fsetid","kill","setgid","setuid","setpcap","net_bind_service","net_raw","sys_chroot","mknod","audit_write","setfcap"],"ambient":null}, "cwd": "/usr/share/filebeat", "exe": "/usr/share/filebeat/filebeat", "name": "filebeat", "pid": 1, "ppid": 0, "seccomp": {"mode":"filter","no_new_privs":true}, "start_time": "2019-05-27T13:25:12.780Z"}}}
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.590Z      INFO    instance/beat.go:280    Setup Beat: filebeat; Version: 7.1.0
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.590Z      INFO    [publisher]     pipeline/module.go:97   Beat name: 9dd693197533
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.591Z      INFO    instance/beat.go:391    filebeat start running.
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.591Z      INFO    registrar/migrate.go:104        No registry home found. Create: /usr/share/filebeat/data/registry/filebeat
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.591Z      INFO    registrar/migrate.go:112        Initialize registry meta file
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.592Z      INFO    [monitoring]    log/log.go:117  Starting metrics logging every 30s
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.594Z      INFO    registrar/registrar.go:108      No registry file found under: /usr/share/filebeat/data/registry/filebeat/data.json. Creating a new registry file.
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.598Z      INFO    registrar/registrar.go:145      Loading registrar data from /usr/share/filebeat/data/registry/filebeat/data.json
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.598Z      INFO    registrar/registrar.go:152      States Loaded from registrar: 0
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.598Z      WARN    beater/filebeat.go:357  Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.598Z      INFO    crawler/crawler.go:72   Loading Inputs: 1
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.598Z      INFO    log/input.go:138        Configured paths: [/usr/share/filebeat/dockerlogs/*/*.log]
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.598Z      INFO    input/input.go:114      Starting input of type: log; ID: 3859208192045851309
filebeat_1_33eec2dba793 | 2019-05-27T13:25:13.598Z      INFO    crawler/crawler.go:106  Loading and starting Inputs completed. Enabled inputs: 1

Elasticsearch and Kibana use the standard images of the latest version (7.1.0). The Filebeat and Logstash containers use custom images.

Container configuration files:

Dockerfile for logstash:

FROM docker.elastic.co/logstash/logstash:7.1.0

RUN rm -f /usr/share/logstash/pipeline/logstash.conf

COPY pipeline/ /usr/share/logstash/pipeline/

logstash.conf:

input {
  beats {
    port => 5044
  }
}


output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => "http://195.168.122.45:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
      pipeline => "%{[@metadata][pipeline]}" 
    }
  } else {
    elasticsearch {
      hosts => "http://195.168.122.45:9200"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}

filebeat.yml:

filebeat.inputs:
- type: log
  json.keys_under_root: true
  json.message_key: log
  enabled: true
  encoding: utf-8
  paths: 
    - /usr/share/filebeat/dockerlogs/*/*.log
  document_type: docker

processors:
- decode_json_fields:
    fields: ["log"]
    target: ""
    overwrite_keys: true
- add_docker_metadata: ~

setup.template.settings:
  index.number_of_shards: 3
setup.template.name: "filebeat"
setup.template.pattern: "filebeat-*"
  
setup.kibana.host: "http://195.168.122.45:5601"

output.logstash:
  hosts: ["195.168.122.45:5044"]

logging.to_files: true
logging.to_syslog: false

Dockerfile for filebeat:

FROM docker.elastic.co/beats/filebeat:7.1.0

COPY filebeat.yml /usr/share/filebeat/filebeat.yml

USER root

RUN mkdir /usr/share/filebeat/dockerlogs

RUN chown -R root /usr/share/filebeat/

RUN chmod -R go-w /usr/share/filebeat/

And the common docker-compose.yml:

version: '3'
services:
     elasticsearch:
       image: docker.elastic.co/elasticsearch/elasticsearch:7.1.0
       environment:
          - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
       ports:
          - "9200:9200"
          - "9300:9300"
       volumes:
          - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro

     kibana:
       image: docker.elastic.co/kibana/kibana:7.1.0
       environment:
          - "ELASTICSEARCH_URL=http://elasticsearch:9200"
          - "LOGGING_QUIET=true"
       ports:
          - "5601:5601"
       volumes:
          - ./kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml:ro
       links:
          - elasticsearch
          - logstash
       depends_on:
          - elasticsearch
          
     logstash:
       image: logstash:0.1
       ports:
          - "5044:5044"
       environment:
          - "LOG_LEVEL: error"
       volumes:
          - ./ELK/logstash/pipeline/:/usr/share/logstash/pipeline:ro
          - ./ELK/logstash/logstash/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
       links:
          - elasticsearch
          
     filebeat:
       image: filebeat:0.1
       volumes:
          - /var/lib/docker/containers:/usr/share/dockerlogs/data:ro
          - /var/run/docker.sock:/var/run/docker.sock
       links:
          - logstash
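
One more thing I am not sure about: filebeat.yml above is configured to read /usr/share/filebeat/dockerlogs/*/*.log, but here the host's /var/lib/docker/containers directory is mounted at /usr/share/dockerlogs/data inside the Filebeat container. Should the volume instead point at the path Filebeat actually reads from? For example (hypothetical adjustment, not my current file):

     filebeat:
       image: filebeat:0.1
       volumes:
          # mount the host's container logs at the path configured in filebeat.yml
          - /var/lib/docker/containers:/usr/share/filebeat/dockerlogs:ro
          - /var/run/docker.sock:/var/run/docker.sock
       links:
          - logstash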

In Kibana I get this result:

The command http://195.168.122.45:9200/_cat/indices?v displays the following result:

I hope you can help me.
