Filebeat, Kibana, and Elasticsearch are configured, but Filebeat is not reading log files

Hi there,

I'm setting up Filebeat to read log files from an Ubuntu machine and send them to Elasticsearch/Kibana. I made a few changes to the .yml settings after reading through previous posts (https://github.com/elastic/beats/issues/4333 and "Able to configure Filebeat and index in Kibana, but no data is appearing"). However, Elasticsearch doesn't seem to receive anything from the folder I pointed to, even though the connections are set up.

Here are my settings in /etc/filebeat/filebeat.yml:

    # ============================== Filebeat inputs ===============================

    filebeat.inputs:

    # Each - is an input. Most options can be set at the input level, so
    # you can use different inputs for various configurations.
    # Below are the input specific configurations.

    - type: log

      # Change to true to enable this input configuration.
      enabled: true

      # Paths that should be crawled and fetched. Glob based paths.
      paths:
        - ubuntuABC@xys.abc.com:/data/fscrawler/*.log
        #- c:\programdata\elasticsearch\logs\*

    # ============================== Filebeat modules ==============================

    filebeat.config.modules:
      # Glob pattern for configuration loading
      path: ${path.config}/modules.d/*.yml

      # Set to true to enable config reloading
      reload.enabled: true

      # Period on which files under path should be checked for changes
      #reload.period: 10s

    # =================================== Kibana ===================================

    # Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
    # This requires a Kibana endpoint configuration.
    setup.kibana:

      # Kibana Host
      # Scheme and port can be left out and will be set to the default (http and 5601)
      # In case you specify an additional path, the scheme is required: http://localhost:5601/path
      # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
      host: "http://xy.abc.def.gh:5601"

    # ================================== Outputs ===================================

    # Configure what output to use when sending the data collected by the beat.

    # ---------------------------- Elasticsearch Output ----------------------------
    output.elasticsearch:
      # Array of hosts to connect to.
      hosts: ["http://xy.abc.def.gh:9200"]
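Note for anyone reading later: Filebeat only harvests files that are local to the host it runs on, so a remote user@host:/path spec like the one above is not a supported path; the usual approach is to install Filebeat on the machine where the logs live. A minimal input sketch, assuming /data/fscrawler is a local directory on the Filebeat host:

```yaml
filebeat.inputs:
- type: log
  enabled: true
  # Local filesystem glob; Filebeat cannot read files over SSH/remote specs.
  paths:
    - /data/fscrawler/*.log
```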

Then I edited /etc/filebeat/filebeat.yml as root; these are the permissions in /etc/filebeat:

    -rwxrwxrwx 1 root root 690K Jun 14 19:15 fields.yml
    -rwxrwxrwx 1 root root  99K Jun 14 19:15 filebeat.reference.yml
    -rwxr-xr-x 1 root root 8.2K Jul 15 15:26 filebeat.yml
    drwxrwxrwx 2 root root 4.0K Jul 15 11:52 modules.d

These are the permissions in my log folder:

    total 1.8G
    -rw-rw-r-- 1 ubuntuABC ubuntuABC  81K Jun 27 11:38 fsc_uploader_1.log
    -rw-rw-r-- 1 ubuntuABC ubuntuABC  508 Jun 27 12:33 fsc_uploader_2.log
    -rw-rw-r-- 1 ubuntuABC ubuntuABC 185K Jun 29 19:07 fsc_uploader_3.log

When I ran sudo filebeat setup and started the service, I saw messages like "loaded dashboards", "loaded ingest pipeline", and "index setup finished". However, when I look at my Elasticsearch indices, Filebeat is running but not reading any files inside the folder:

    yellow open filebeat-7.8.0-2020.07.15-000001 UxxxxxxxxxxxxxxA 1 1 0 0 208b 208b

I can't figure out whether this is down to how I configured Filebeat or to the folder permissions. Thanks!

Hi!

Can you run Filebeat in debug mode (./filebeat -e -d "*") and check if events are being shipped?

C.

Hi ChrsMark, here's what I ran just now. Thanks.

    filebeat -e -d "*"
    2020-07-16T09:45:17.561+0100 INFO instance/beat.go:647 Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
    2020-07-16T09:45:17.562+0100 DEBUG [beat] instance/beat.go:699 Beat metadata path: /var/lib/filebeat/meta.json
    2020-07-16T09:45:17.566+0100 INFO instance/beat.go:655 Beat ID: 57361b38-5d63-4a1b-8261-nnnnnnnnnn
    2020-07-16T09:45:17.567+0100 DEBUG [add_cloud_metadata] add_cloud_metadata/providers.go:126 add_cloud_metadata: starting to fetch metadata, timeout=3s
    2020-07-16T09:45:17.568+0100 DEBUG [docker] docker/client.go:48 Docker client will negotiate the API version on the first request.
    2020-07-16T09:45:17.585+0100 DEBUG [add_docker_metadata] add_docker_metadata/add_docker_metadata.go:87 add_docker_metadata: docker environment not detected: Cannot connect to the Docker daemon at unix:///var/run/docker.sock. Is the docker daemon running?
    2020-07-16T09:45:17.608+0100 DEBUG [kubernetes] add_kubernetes_metadata/kubernetes.go:138 Could not create kubernetes client using in_cluster config: unable to build kube config due to error: invalid configuration: no configuration has been provided {"libbeat.processor": "add_kubernetes_metadata"}
    2020-07-16T09:45:20.568+0100 DEBUG [add_cloud_metadata] add_cloud_metadata/providers.go:169 add_cloud_metadata: timed-out waiting for all responses
    2020-07-16T09:45:20.568+0100 DEBUG [add_cloud_metadata] add_cloud_metadata/providers.go:129 add_cloud_metadata: fetchMetadata ran for 3.000304077s
    2020-07-16T09:45:20.568+0100 INFO [add_cloud_metadata] add_cloud_metadata/add_cloud_metadata.go:89 add_cloud_metadata: hosting provider type not detected.
    2020-07-16T09:45:20.568+0100 DEBUG [processors] processors/processor.go:101 Generated new processors: add_host_metadata=[netinfo.enabled=[true], cache.ttl=[5m0s]], add_cloud_metadata={}, add_docker_metadata=[match_fields=[] match_pids=[process.pid, process.ppid]], add_kubernetes_metadata
    2020-07-16T09:45:20.581+0100 INFO instance/beat.go:404 filebeat stopped.
    2020-07-16T09:45:20.582+0100 ERROR instance/beat.go:958 Exiting: data path already locked by another beat. Please make sure that multiple beats are not sharing the same data path (path.data).
    Exiting: data path already locked by another beat. Please make sure that multiple beats are not sharing the same data path (path.data).

It seems that Filebeat stopped because there is an issue with path.data. Please make sure there is no other Filebeat process already running, since this is what the log indicates.

C.

Thanks, and yes, there are two Filebeat instances running. How can I tell which Filebeat is running on my machine?

You either need to stop one of them if it's not needed, or make sure you configure a different path.data for the two Filebeat instances.
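For example, something like the following can show which instances are running and whether a data-path lock is being held. The paths shown are the defaults for the deb/rpm packages and may differ on your install:

```shell
# Show any running Filebeat processes with their full command lines.
# The [f] bracket trick stops grep from matching its own command line.
ps aux | grep '[f]ilebeat' || echo "no filebeat process found"

# If Filebeat was installed as a service, you can also check:
#   systemctl status filebeat --no-pager

# A running instance holds a lock under its path.data while it runs
# (/var/lib/filebeat is the default data path for the packages):
ls -l /var/lib/filebeat/filebeat.lock 2>/dev/null \
  || echo "no lock file at the default data path"
```

The command line printed by ps includes any -c or --path.data flags, which tells you which config and data path each instance is using.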

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.