Filebeat unable to harvest ISC Bind logs

Hi,

I'm trying to use Filebeat to ship ISC Bind logs (heavily written, around 8 GB by the end of the day) to Logstash, grok them, and finally store them in Elasticsearch.

In the Filebeat config I have:

  - type: log
    enabled: true
    paths:
      - /var/log/named/queries.log
    fields:
      document_type: bind
    fields_under_root: true

The Bind logs are owned by named:named with permissions rw-r--r--.

I'm able to tail -f them and watch the queries flowing in real time.

But when I try to start Filebeat with debug mode on, I see these messages:

2019-11-08T23:18:46.046Z INFO crawler/crawler.go:72 Loading Inputs: 3
2019-11-08T23:18:46.046Z DEBUG [processors] processors/processor.go:66 Processors:
2019-11-08T23:18:46.047Z DEBUG [input] log/config.go:201 recursive glob enabled
2019-11-08T23:18:46.047Z DEBUG [input] log/input.go:147 exclude_files: . Number of stats: 0
2019-11-08T23:18:46.047Z DEBUG [input] log/input.go:168 input with previous states loaded: 0
2019-11-08T23:18:46.047Z INFO log/input.go:138 Configured paths: [/var/log/named/queries.log]
2019-11-08T23:18:46.047Z INFO input/input.go:114 Starting input of type: log; ID: 5379400643241004272
2019-11-08T23:18:46.047Z DEBUG [processors] processors/processor.go:66 Processors:
2019-11-08T23:18:46.048Z INFO crawler/crawler.go:139 Stopping Crawler
2019-11-08T23:18:46.048Z INFO crawler/crawler.go:149 Stopping 1 inputs
2019-11-08T23:18:46.048Z DEBUG [registrar] registrar/registrar.go:278 Starting Registrar
2019-11-08T23:18:46.048Z DEBUG [input] log/input.go:174 Start next scan
2019-11-08T23:18:46.048Z INFO log/input.go:453 Scan aborted because input stopped.
2019-11-08T23:18:46.048Z DEBUG [input] log/input.go:195 input states cleaned up. Before: 0, After: 0, Pending: 0
2019-11-08T23:18:46.048Z INFO input/input.go:149 input ticker stopped
2019-11-08T23:18:46.048Z INFO input/input.go:167 Stopping Input: 5379400643241004272
2019-11-08T23:18:46.048Z DEBUG [publisher] pipeline/client.go:149 client: closing acker
2019-11-08T23:18:46.048Z DEBUG [publisher] pipeline/client.go:151 client: done closing acker
2019-11-08T23:18:46.048Z DEBUG [publisher] pipeline/client.go:155 client: cancelled 0 events
2019-11-08T23:18:46.048Z INFO crawler/crawler.go:165 Crawler stopped
2019-11-08T23:18:46.048Z INFO registrar/registrar.go:367 Stopping Registrar
2019-11-08T23:18:46.048Z INFO registrar/registrar.go:293 Ending Registrar
2019-11-08T23:18:46.048Z DEBUG [registrar] registrar/registrar.go:411 Write registry file: /var/lib/filebeat/registry/filebeat/data.json
2019-11-08T23:18:46.049Z DEBUG [publisher] pipeline/client.go:149 client: closing acker
2019-11-08T23:18:46.050Z DEBUG [publisher] pipeline/client.go:151 client: done closing acker
2019-11-08T23:18:46.050Z DEBUG [publisher] pipeline/client.go:155 client: cancelled 0 events
2019-11-08T23:18:46.107Z DEBUG [registrar] registrar/registrar.go:404 Registry file updated. 0 states written.
2019-11-08T23:18:46.111Z INFO [monitoring] log/log.go:152 Total non-zero metrics {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":70,"time":{"ms":76}},"total":{"ticks":760,"time":{"ms":775},"value":760},"user":{"ticks":690,"time":{"ms":699}}},"handles":{"limit":{"hard":4096,"soft":1024},"open":8},"info":{"ephemeral_id":"be77e323-0663-4d5b-b0b9-a0b26ae3b8e2","uptime":{"ms":45117}},"memstats":{"gc_next":4564400,"memory_alloc":3270216,"memory_total":47457008,"rss":40169472}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"type":"logstash"},"pipeline":{"clients":0,"events":{"active":0}}},"registrar":{"states":{"current":0},"writes":{"success":1,"total":1}},"system":{"cpu":{"cores":1},"load":{"1":0.31,"15":0.08,"5":0.16,"norm":{"1":0.31,"15":0.08,"5":0.16}}}}}}
2019-11-08T23:18:46.111Z INFO [monitoring] log/log.go:153 Uptime: 45.121468366s
2019-11-08T23:18:46.111Z INFO [monitoring] log/log.go:130 Stopping metrics logging.
2019-11-08T23:18:46.111Z DEBUG [monitoring] pipeline/client.go:149 client: closing acker
2019-11-08T23:18:46.111Z DEBUG [monitoring] pipeline/client.go:151 client: done closing acker
2019-11-08T23:18:46.111Z DEBUG [monitoring] pipeline/client.go:155 client: cancelled 0 events
2019-11-08T23:18:46.111Z DEBUG [monitoring] pipeline/pipeline.go:242 close pipeline
2019-11-08T23:18:46.111Z INFO instance/beat.go:401 filebeat stopped.
2019-11-08T23:18:46.111Z ERROR instance/beat.go:802 Exiting: Error while initializing input: No paths were defined for input accessing 'filebeat.inputs.1'
Exiting: Error while initializing input: No paths were defined for input accessing 'filebeat.inputs.1'

And the service just keeps restarting over and over.

In Kibana / X-Pack monitoring I see system information but no events or bytes registered.

I've tried tail_files, encoding, and a few other options, but with no success at all.

Can you please help me?

Thank you, Pedro

Hi @pestevao,

From the debug lines, it seems like there is an issue with your config. Could you please paste it without formatting (use the preformatted text button)? That way we can analyze what's going on.

Best regards

You're right @exekias!

I had a typo after the first - type: log.

A stray extra - sign, with no configuration of its own, made Filebeat expect another input definition.

I removed it and everything was solved!
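
For anyone who lands here with the same error, this is a minimal sketch of the kind of config that produces it (assuming the stray - ends up in front of fields:; mine may have sat in a slightly different place). YAML reads the extra - as the start of a second item under filebeat.inputs, so Filebeat sees a second input (filebeat.inputs.1) with no paths defined:

  filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - /var/log/named/queries.log
    # The stray "-" below starts a second input (filebeat.inputs.1)
    # that has no paths, which triggers the "No paths were defined" error.
    - fields:
        document_type: bind
      fields_under_root: true

With the extra - removed, everything belongs to the single input again and the harvester starts normally.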

So many times looking at the .yml file and...

Thank you for pointing it out.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.