Filebeat doesn't harvest logs

Filebeat can't harvest logs after restart

  • I deploy Filebeat with Docker, but it can't harvest logs after a restart, even though log lines are still being appended to the log file. The Filebeat log never mentions the log paths I configured in filebeat.yml. I'm confused. What's wrong with my config?
  • Here is the log:
        2020-12-10T07:02:59.332Z	INFO	instance/beat.go:645	Home path: [/usr/share/filebeat] Config path: [/usr/share/filebeat] Data path: [/usr/share/filebeat/data] Logs path: [/usr/share/filebeat/logs]
        2020-12-10T07:02:59.401Z	INFO	[seccomp]	seccomp/seccomp.go:124	Syscall filter successfully installed
        2020-12-10T07:02:59.430Z	INFO	[beat]	instance/beat.go:981	Beat info	{"system_info": {"beat": {"path": {"config": "/usr/share/filebeat", "data": "/usr/share/filebeat/data", "home": "/usr/share/filebeat", "logs": "/usr/share/filebeat/logs"}, "type": "filebeat", "uuid": "121dba48-c9d9-4555-93dd-38bd165a3ac1"}}}
        2020-12-10T07:02:59.463Z	INFO	[beat]	instance/beat.go:990	Build info	{"system_info": {"build": {"commit": "1428d58cf2ed945441fb2ed03961cafa9e4ad3eb", "libbeat": "7.10.0", "time": "2020-11-09T19:57:04.000Z", "version": "7.10.0"}}}
        2020-12-10T07:02:59.463Z	INFO	[beat]	instance/beat.go:993	Go runtime info	{"system_info": {"go": {"os":"linux","arch":"amd64","max_procs":8,"version":"go1.14.7"}}}
        2020-12-10T07:02:59.464Z	INFO	[beat]	instance/beat.go:997	Host info	{"system_info": {"host": {"architecture":"x86_64","boot_time":"2020-08-03T08:22:14Z","containerized":true,"name":"cb3e7abe2cbc","ip":["127.0.0.1/8","172.17.0.2/16"],"kernel_version":"4.9.0-7-amd64","mac":["02:42:ac:11:00:02"],"os":{"family":"redhat","platform":"centos","name":"CentOS Linux","version":"7 (Core)","major":7,"minor":8,"patch":2003,"codename":"Core"},"timezone":"UTC","timezone_offset_sec":0}}}
        2020-12-10T07:02:59.465Z	INFO	[beat]	instance/beat.go:1026	Process info	{"system_info": {"process": {"capabilities": {"inheritable":["chown","dac_override","fowner","fsetid","kill","setgid","setuid","setpcap","net_bind_service","net_raw","sys_chroot","mknod","audit_write","setfcap"],"permitted":null,"effective":null,"bounding":["chown","dac_override","fowner","fsetid","kill","setgid","setuid","setpcap","net_bind_service","net_raw","sys_chroot","mknod","audit_write","setfcap"],"ambient":null}, "cwd": "/usr/share/filebeat", "exe": "/usr/share/filebeat/filebeat", "name": "filebeat", "pid": 1, "ppid": 0, "seccomp": {"mode":"filter"}, "start_time": "2020-12-10T07:02:55.100Z"}}}
        2020-12-10T07:02:59.465Z	INFO	instance/beat.go:299	Setup Beat: filebeat; Version: 7.10.0
        2020-12-10T07:02:59.518Z	INFO	[publisher]	pipeline/module.go:113	Beat name: cb3e7abe2cbc
        2020-12-10T07:02:59.519Z	WARN	beater/filebeat.go:178	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
        2020-12-10T07:02:59.519Z	INFO	[monitoring]	log/log.go:118	Starting metrics logging every 30s
        2020-12-10T07:02:59.520Z	INFO	instance/beat.go:455	filebeat start running.
        2020-12-10T07:03:01.033Z	INFO	memlog/store.go:124	Finished loading transaction log file for '/usr/share/filebeat/data/registry/filebeat'. Active transaction id=741182808
        2020-12-10T07:03:01.034Z	WARN	beater/filebeat.go:381	Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
        2020-12-10T07:03:01.556Z	INFO	[registrar]	registrar/registrar.go:109	States Loaded from registrar: 63752
        2020-12-10T07:03:01.557Z	INFO	[crawler]	beater/crawler.go:71	Loading Inputs: 2
        2020-12-10T07:03:29.523Z	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cgroup":{"cpu":{"id":"docker-cb3e7abe2cbc3a7cfef5e202f56ad076e307db3339ad5d9a86cfd569c79374e9.scope"},"cpuacct":{"id":"docker-cb3e7abe2cbc3a7cfef5e202f56ad076e307db3339ad5d9a86cfd569c79374e9.scope"},"memory":{"id":"docker-cb3e7abe2cbc3a7cfef5e202f56ad076e307db3339ad5d9a86cfd569c79374e9.scope"}},"cpu":{"system":{"ticks":1420,"time":{"ms":1420}},"total":{"ticks":5720,"time":{"ms":5724},"value":5720},"user":{"ticks":4300,"time":{"ms":4304}}},"handles":{"limit":{"hard":1048576,"soft":1048576},"open":11},"info":{"ephemeral_id":"3bbded11-0713-4e5d-8274-7ec5a10e36ab","uptime":{"ms":30777}},"memstats":{"gc_next":286144480,"memory_alloc":224387440,"memory_total":552371192,"rss":325902336},"runtime":{"goroutines":19}},"filebeat":{"events":{"active":3,"added":3},"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"logstash"},"pipeline":{"clients":1,"events":{"active":1,"filtered":2,"total":3}}},"registrar":{"states":{"current":63752,"update":1},"writes":{"success":1,"total":1}},"system":{"cpu":{"cores":8},"load":{"1":1.93,"15":2.42,"5":2.22,"norm":{"1":0.2413,"15":0.3025,"5":0.2775}}}}}}
        2020-12-10T07:03:59.523Z	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":2720,"time":{"ms":1308}},"total":{"ticks":9600,"time":{"ms":3892},"value":9600},"user":{"ticks":6880,"time":{"ms":2584}}},"handles":{"limit":{"hard":1048576,"soft":1048576},"open":11},"info":{"ephemeral_id":"3bbded11-0713-4e5d-8274-7ec5a10e36ab","uptime":{"ms":60777}},"memstats":{"gc_next":265895856,"memory_alloc":146215808,"memory_total":751770192,"rss":43409408},"runtime":{"goroutines":19}},"filebeat":{"events":{"added":1,"done":1},"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":1,"filtered":1,"total":1}}},"registrar":{"states":{"current":63752,"update":1},"writes":{"success":1,"total":1}},"system":{"load":{"1":2.01,"15":2.41,"5":2.22,"norm":{"1":0.2512,"15":0.3013,"5":0.2775}}}}}}
        2020-12-10T07:04:29.523Z	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":4000,"time":{"ms":1276}},"total":{"ticks":13300,"time":{"ms":3700},"value":13300},"user":{"ticks":9300,"time":{"ms":2424}}},"handles":{"limit":{"hard":1048576,"soft":1048576},"open":11},"info":{"ephemeral_id":"3bbded11-0713-4e5d-8274-7ec5a10e36ab","uptime":{"ms":90778}},"memstats":{"gc_next":264274256,"memory_alloc":218662488,"memory_total":951243992,"rss":-4939776},"runtime":{"goroutines":19}},"filebeat":{"events":{"added":1,"done":1},"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":1,"filtered":1,"total":1}}},"registrar":{"states":{"current":63752,"update":1},"writes":{"success":1,"total":1}},"system":{"load":{"1":2.06,"15":2.4,"5":2.21,"norm":{"1":0.2575,"15":0.3,"5":0.2763}}}}}}
        2020-12-10T07:04:59.524Z	INFO	[monitoring]	log/log.go:145	Non-zero metrics in the last 30s	{"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":5360,"time":{"ms":1360}},"total":{"ticks":17080,"time":{"ms":3780},"value":17080},"user":{"ticks":11720,"time":{"ms":2420}}},"handles":{"limit":{"hard":1048576,"soft":1048576},"open":11},"info":{"ephemeral_id":"3bbded11-0713-4e5d-8274-7ec5a10e36ab","uptime":{"ms":120780}},"memstats":{"gc_next":263833088,"memory_alloc":162086408,"memory_total":1150809504,"rss":-8777728},"runtime":{"goroutines":19}},"filebeat":{"events":{"added":1,"done":1},"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":1,"events":{"active":1,"filtered":1,"total":1}}},"registrar":{"states":{"current":63752,"update":1},"writes":{"success":1,"total":1}},"system":{"load":{"1":2.26,"15":2.41,"5":2.24,"norm":{"1":0.2825,"15":0.3013,"5":0.28}}}}}}
    
  • My filebeat.yml config:
    filebeat.inputs:
    - type: log
      paths:
        - /var/maillog/**/*.log
      clean_inactive: 72h
      clean_removed: true
      ignore_older: 24h
      multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}|^web|^grpc'
      multiline.negate: true
      multiline.match: after
    - type: log
      paths:
        - /var/mailsearch/**/*.json
      fields:
        logType: elasticDoc
      clean_inactive: 72h
      clean_removed: true
      ignore_older: 24h

    output.logstash:
      hosts: ["logstash:5044"]
    
    
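  • To see why the configured paths never show up in the Filebeat log, it may help to turn on debug logging. This is a minimal troubleshooting sketch appended to the same filebeat.yml; logging.level and logging.selectors are standard Filebeat settings, and the values here are only for debugging:

    logging.level: debug      # debug output includes each file that is scanned, started, or skipped
    logging.selectors: ["*"]  # keep all components at first; narrow down once the relevant ones are known

    With this in place, the log should show lines for every path pattern it scans, including files skipped because of ignore_older.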

Hello @W1nter-3Z, any progress? I am seeing the same behavior when trying to customize filebeat.inputs.

Please share if you got any response.

I'm also having the same issue. It seems that if the ingest pipelines or index template are already present, it will not work. I'm using Elasticsearch directly; the workaround is to stop Filebeat, delete the ingest pipelines and index template from Elasticsearch, and then the next time Filebeat starts up it will load them again and work as expected.
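
Concretely, that workaround maps onto two delete calls that can be run from Kibana Dev Tools. A sketch only; the 7.10.0 names below are an assumption based on the version in the log above, so check the actual names first with GET _ingest/pipeline/filebeat-* and GET _template/filebeat-*:

    DELETE _ingest/pipeline/filebeat-7.10.0-*
    DELETE _template/filebeat-7.10.0

After that, the next Filebeat start should load them again, as described above.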

Hello @Nicholas_Hiser,

Just to make sure:

1 - Stop Filebeat
2 - In Kibana, delete the index pattern and templates related to Filebeat
3 - Deploy Filebeat again with the custom log path

Should it be something like that?

Thanks in advance.

That's what worked for me: stop Filebeat -> delete the ingest node pipelines -> delete the legacy index template for 7.10.1 -> start Filebeat.


And keep your fingers crossed.

Thanks

I will try with filebeat.inputs:, following your steps.

I have nothing related to Filebeat in the "Ingest Node Pipelines" option, only xpack_monitoring_6 and xpack_monitoring_7.

I tried some variations with filebeat.autodiscover: and unfortunately they did not work.

1 - Tried to force Filebeat to harvest logs from /opt/data/indexing-logs/, and only for the datastore namespace, while trying to ignore the kube-system namespace:

filebeatConfig:
  filebeat.yml: |
     filebeat.autodiscover:
       providers:
         - type: kubernetes
           node: ${NODE_NAME}
           hints.enabled: true
           templates:
             - condition:
                 equals:
                   kubernetes.namespace: datastore
               config:
                 - type: log
                   paths:
                     - /opt/data/indexing-logs/*.log
     output.elasticsearch:
       host: '${NODE_NAME}'
       hosts: '${ELASTICSEARCH_HOSTS:elasticsearch-master:9200}'
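
For this first attempt, /opt/data/indexing-logs/ also has to be visible inside the Filebeat pod itself, not just inside the application pods. A rough sketch of how that could look in the same Helm values, assuming the Elastic Filebeat chart's extraVolumes and extraVolumeMounts keys and an illustrative hostPath:

    extraVolumes:
      - name: indexing-logs
        hostPath:
          path: /opt/data/indexing-logs    # illustrative; wherever the application actually writes on the node
    extraVolumeMounts:
      - name: indexing-logs
        mountPath: /opt/data/indexing-logs
        readOnly: true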

2 - Tried to use Filebeat to harvest logs from the standard containers path /var/log/containers/, but again only for the datastore namespace and trying to ignore the kube-system namespace:

filebeatConfig:
  filebeat.yml: |
     filebeat.autodiscover:
       providers:
         - type: kubernetes
           node: ${NODE_NAME}
           hints.enabled: true
           templates:
             - condition:
                 equals:
                   kubernetes.namespace: datastore
               config:
                 - type: container
                   paths:
                     - /var/log/containers/*.log
     output.elasticsearch:
       host: '${NODE_NAME}'
       hosts: '${ELASTICSEARCH_HOSTS:elasticsearch-master:9200}'
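
As for ignoring kube-system while hints are enabled, one alternative is to drop those events with a processor at the top level of filebeat.yml instead of relying only on the template condition. A sketch, assuming autodiscover attaches the usual kubernetes.namespace field to each event:

    processors:
      - drop_event:
          when:
            equals:
              kubernetes.namespace: kube-system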

For the tests above, I applied your hints as a prerequisite before deploying the new version.

It might just be coincidence rather than correlation.

I found it was a Logstash problem. When I increased the pipeline.workers parameter in logstash.yml, things got back on track.
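
For reference, that is a single setting in logstash.yml. The value below is only illustrative, since the default is the number of the host's CPU cores and the right number depends on the load:

    pipeline.workers: 8    # illustrative; logstash.yml defaults this to the number of CPU cores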
