Yet another case of Filebeat not reading/sending logs to LS

I have Filebeat version 6.5.4 (amd64), libbeat 6.5.4, running on Ubuntu 16.04.5 LTS. It is configured to monitor the Bro logs (I mean Zeek logs) and send them to Logstash. Up until this past weekend it was working just fine. This past Monday I came in and found that logs were not being indexed into Elasticsearch. Tracing back, I found that no logs were being sent by Filebeat at all. This is apparent from a tcpdump on the Logstash box filtered for traffic from the Bro/Filebeat box, either by IP or by port: I saw nothing but SYN/ACK traffic and no data payloads being sent, when there should be a steady stream. Yet Filebeat was running, there were logs being written to conn.log, and the config was correct. Following some of the usual troubleshooting tips (restarting, rebooting, narrowing the config to a single log file, turning on debugging) hasn't found the problem yet. Wondering if anyone has additional suggestions.
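For reference, the capture on the Logstash box was along these lines (the interface name and the host placeholder here are stand-ins for my setup; 5044 is the Beats port from the config below):

tcpdump -nn -A -i eth0 'host <filebeat-box> and port 5044'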

Here is my yml:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /usr/local/bro/logs/current/conn.log
  tags: ["bro_conn"]

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
  reload.period: 5s

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.logstash:
  hosts: ["1.1.2.1:5044"]
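Two sanity checks worth running with the 6.x binary are the built-in config and output tests, which confirm the YAML parses and that Filebeat can reach the Logstash endpoint:

/usr/share/filebeat/bin/filebeat test config -c /etc/filebeat/test.yml
/usr/share/filebeat/bin/filebeat test output -c /etc/filebeat/test.yml

Note that "test output" only goes as far as the TCP (and TLS, if configured) handshake, so it rules out basic connectivity but won't prove events are being acknowledged.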

According to the debug output shown below, Filebeat reads the file, but nothing seems to get sent. This is what the tcpdump looks like:

11:55:36.098583 IP 1.1.2.35.5044 > 1.1.2.43.38784: Flags [P.], seq 139:145, ack 297779, win 1288, options [nop,nop,TS val 507072759 ecr 1018976], length 6
11:55:36.098774 IP 1.1.2.43.38784 > 1.1.2.35.5044: Flags [.], ack 145, win 229, options [nop,nop,TS val 1020226 ecr 507072759], length 0
11:55:36.688229 IP 1.1.2.43.38784 > 1.1.2.35.5044: Flags [F.], seq 297779, ack 145, win 229, options [nop,nop,TS val 1020373 ecr 507072759], length 0
11:55:36.688439 IP 1.1.2.35.5044 > 1.1.2.43.38784: Flags [R.], seq 145, ack 297780, win 1288, options [nop,nop,TS val 507073349 ecr 1020373], length 0

Remember, during all this time Zeek is writing to "/usr/local/bro/logs/current/conn.log" and appending events.

Filebeat is running as a service or started directly as root, and /usr/local/bro/logs/current/conn.log has these permissions:

-rw-r--r-- 1 root root 36605035 Jan 29 16:59 /usr/local/bro/logs/current/conn.log
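One wrinkle specific to Zeek: the logs under current/ are rotated periodically, and conn.log is replaced by a fresh file with a new inode (which would match the "Start harvester for new file ... Offset: 0" lines in the debug output below). Watching the inode across a rotation makes that easy to confirm:

stat -c 'inode=%i size=%s %n' /usr/local/bro/logs/current/conn.log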

So why isn't Filebeat reading and sending the logs that Zeek is writing?

(Not sure if this is a duplicate of "FileBeat is not sending logfile to logstash" or https://github.com/elastic/beats/issues/7908, but it might be similar...)

Here is part one of the debug output from running /usr/share/filebeat/bin/filebeat -c /etc/filebeat/test.yml -e -d "*":

First, a bunch of lines like this:

2019-01-29T16:53:37.026Z        DEBUG   [publish]       pipeline/processor.go:308       Publish event: {
  "@timestamp": "2019-01-29T16:53:37.026Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.5.4"
  },
  "input": {
    "type": "log"
  },
  "prospector": {
    "type": "log"
  },
  "host": {
    "name": "bro-01"
  },
  "beat": {
    "name": "bro-01",
    "hostname": "bro-01",
    "version": "6.5.4"
  },
  "message": "1548779412.854212\tCnLhEa2war705yPm0k\t1.1.2.242\t45018\t1.1.1.85\t79\ttcp\t-\t-\t-\t-\tS0\tT\tT\t0\tS\t1\t48\t0\t0\t(empty)",
  "source": "/usr/local/bro/logs/current/conn.log",
  "offset": 501617,
  "tags": [
    "bro_conn"
  ]
}
2019-01-29T16:53:37.026Z        DEBUG   [publish]       pipeline/processor.go:308       Publish event: {
  "@timestamp": "2019-01-29T16:53:37.026Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "doc",
    "version": "6.5.4"
  },
  "beat": {
    "name": "bro-01",
    "hostname": "bro-01",
    "version": "6.5.4"
  },
  "host": {
    "name": "bro-01"
  },
  "message": "1548779412.854274\tCV9QH43fjbPqkjxOba\t1.1.2.242\t30760\t1.1.1.68\t79\ttcp\t-\t-\t-\t-\tS0\tT\tT\t0\tS\t1\t48\t0\t0\t(empty)",
  "source": "/usr/local/bro/logs/current/conn.log",
  "offset": 501726,
  "tags": [
    "bro_conn"
  ],
  "prospector": {
    "type": "log"
  },
  "input": {
    "type": "log"
  }
}

Here is part two of the debug output, where the following block repeats:

2019-01-29T15:29:46.898Z        DEBUG   [input] input/input.go:152      Run input
2019-01-29T15:29:46.898Z        DEBUG   [input] log/input.go:174        Start next scan
2019-01-29T15:29:46.898Z        DEBUG   [input] log/input.go:404        Check file for harvesting: /usr/local/bro/logs/current/conn.log
2019-01-29T15:29:46.898Z        DEBUG   [input] log/input.go:494        Update existing file for harvesting: /usr/local/bro/logs/current/conn.log, offset: 544989
2019-01-29T15:29:46.898Z        DEBUG   [input] log/input.go:546        Harvester for file is still running: /usr/local/bro/logs/current/conn.log
2019-01-29T15:29:46.898Z        DEBUG   [input] log/input.go:195        input states cleaned up. Before: 1, After: 1, Pending: 0
2019-01-29T15:29:56.899Z        DEBUG   [input] input/input.go:152      Run input
2019-01-29T15:29:56.899Z        DEBUG   [input] log/input.go:174        Start next scan
2019-01-29T15:29:56.899Z        DEBUG   [input] log/input.go:404        Check file for harvesting: /usr/local/bro/logs/current/conn.log
2019-01-29T15:29:56.899Z        DEBUG   [input] log/input.go:494        Update existing file for harvesting: /usr/local/bro/logs/current/conn.log, offset: 544989
2019-01-29T15:29:56.899Z        DEBUG   [input] log/input.go:546        Harvester for file is still running: /usr/local/bro/logs/current/conn.log
2019-01-29T15:29:56.899Z        DEBUG   [input] log/input.go:195        input states cleaned up. Before: 1, After: 1, Pending: 0
2019-01-29T15:30:06.899Z        INFO    [monitoring]    log/log.go:144  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":270,"time":{"ms":276}},"total":{"ticks":770,"time":{"ms":784},"value":770},"user":{"ticks":500,"time":{"ms":508}}},"handles":{"limit":{"hard":1048576,"soft":1024},"open":7},"info":{"ephemeral_id":"0b5e9d91-96bb-4bd6-b872-b5e12a8e141a","uptime":{"ms":30021}},"memstats":{"gc_next":13084736,"memory_alloc":8785872,"memory_total":38430336,"rss":30441472}},"filebeat":{"events":{"active":4117,"added":4119,"done":2},"harvester":{"open_files":1,"running":1,"started":1}},"libbeat":{"config":{"module":{"running":0},"reloads":1},"output":{"events":{"active":4096,"batches":3,"total":4096},"read":{"bytes":30},"type":"logstash","write":{"bytes":339583}},"pipeline":{"clients":1,"events":{"active":4117,"filtered":2,"published":4116,"retry":2048,"total":4119}}},"registrar":{"states":{"current":6,"update":2},"writes":{"success":2,"total":2}},"system":{"cpu":{"cores":2},"load":{"1":1.18,"15":0.45,"5":0.57,"norm":{"1":0.59,"15":0.225,"5":0.285}}}}}}
2019-01-29T15:30:06.900Z        DEBUG   [input] input/input.go:152      Run input
2019-01-29T15:30:06.901Z        DEBUG   [input] log/input.go:174        Start next scan
2019-01-29T15:30:06.901Z        DEBUG   [input] log/input.go:404        Check file for harvesting: /usr/local/bro/logs/current/conn.log
2019-01-29T15:30:06.901Z        DEBUG   [input] log/input.go:477        Start harvester for new file: /usr/local/bro/logs/current/conn.log
2019-01-29T15:30:06.901Z        DEBUG   [harvester]     log/harvester.go:489    Setting offset for file based on seek: /usr/local/bro/logs/current/conn.log
2019-01-29T15:30:06.901Z        DEBUG   [harvester]     log/harvester.go:475    Setting offset for file: /usr/local/bro/logs/current/conn.log. Offset: 0
2019-01-29T15:30:06.901Z        DEBUG   [harvester]     log/harvester.go:390    Update state: /usr/local/bro/logs/current/conn.log, offset: 0
2019-01-29T15:30:06.901Z        DEBUG   [input] file/states.go:68       New state added for /usr/local/bro/logs/current/conn.log
2019-01-29T15:30:36.899Z        INFO    [monitoring]    log/log.go:144  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":270},"total":{"ticks":780,"time":{"ms":4},"value":780},"user":{"ticks":510,"time":{"ms":4}}},"handles":{"limit":{"hard":1048576,"soft":1024},"open":8},"info":{"ephemeral_id":"0b5e9d91-96bb-4bd6-b872-b5e12a8e141a","uptime":{"ms":60019}},"memstats":{"gc_next":13084736,"memory_alloc":9089936,"memory_total":38734400}},"filebeat":{"events":{"active":1,"added":1},"harvester":{"open_files":2,"running":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"read":{"bytes":36}},"pipeline":{"clients":1,"events":{"active":4117}}},"registrar":{"states":{"current":6}},"system":{"load":{"1":1.27,"15":0.48,"5":0.66,"norm":{"1":0.635,"15":0.24,"5":0.33}}}}}}
2019-01-29T15:31:06.908Z        INFO    [monitoring]    log/log.go:144  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":280,"time":{"ms":4}},"total":{"ticks":830,"time":{"ms":48},"value":830},"user":{"ticks":550,"time":{"ms":44}}},"handles":{"limit":{"hard":1048576,"soft":1024},"open":8},"info":{"ephemeral_id":"0b5e9d91-96bb-4bd6-b872-b5e12a8e141a","uptime":{"ms":90026}},"memstats":{"gc_next":13084736,"memory_alloc":9345344,"memory_total":38989808}},"filebeat":{"harvester":{"open_files":2,"running":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"read":{"bytes":36}},"pipeline":{"clients":1,"events":{"active":4117}}},"registrar":{"states":{"current":6}},"system":{"load":{"1":1,"15":0.49,"5":0.66,"norm":{"1":0.5,"15":0.245,"5":0.33}}}}}}
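Reading the monitoring lines above: the Logstash output shows 4096 events active with 2048 retries and only a few dozen bytes read back, which suggests events are reaching the output but not being fully acknowledged. To poll these same counters without rerunning in debug mode, Beats 6.x also has an (experimental) HTTP stats endpoint; the keys below are the stock ones, the port choice is mine:

http.enabled: true
http.host: localhost
http.port: 5066

and then:

curl -s http://localhost:5066/stats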
