Hi,
We restarted Filebeat on our server today, and since the restart no new logs are being pushed to Logstash.
Here is the config that we use for the input:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/remotelogs/*/*.log
  scan_frequency: 2s
  ignore_older: 72h
  clean_inactive: 96h

output.logstash:
  hosts: ["logstash:5044"]
  ssl.enabled: true
Logstash works correctly and can receive data from other servers.
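For what it's worth, the output connection itself can also be checked from this host (assuming the same config file is used) with:

filebeat test output   # reports whether logstash:5044 is reachable with the configured SSL settings

so the problem seems to be on the harvesting side rather than the output.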
There is no indication of any log files being read by a harvester; lsof shows that no log files are open. I also ran Filebeat in debug mode: there are no errors, but no data is pushed to Logstash.
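For reference, this is roughly how I checked (the excerpt below is from the debug run):

sudo lsof -p $(pidof filebeat) | grep remotelogs   # no harvested log files listed
filebeat -e -d "*"                                 # debug run; output excerpt below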
2021-08-16T12:30:08.760+0200 INFO [registrar] registrar/registrar.go:109 States Loaded from registrar: 341619
2021-08-16T12:30:08.761+0200 INFO [crawler] beater/crawler.go:71 Loading Inputs: 1
2021-08-16T12:30:08.797+0200 DEBUG [registrar] registrar/registrar.go:140 Starting Registrar
2021-08-16T12:30:08.801+0200 DEBUG [input] log/config.go:207 recursive glob enabled
2021-08-16T12:30:08.802+0200 DEBUG [input] log/input.go:178 exclude_files: []. Number of states: 341619 {"input_id": "7c05bd9c-b9b3-45de-b720-0a502e416637"}
2021-08-16T12:30:08.803+0200 DEBUG [input] file/states.go:68 New state added for /var/log/remotelogs/X.X.X.X/1319244.log
2021-08-16T12:30:08.803+0200 DEBUG [acker] beater/acker.go:59 stateful ack {"count": 1}
2021-08-16T12:30:08.804+0200 DEBUG [publisher] pipeline/client.go:231 Pipeline client receives callback 'onFilteredOut' for event: {Timestamp:0001-01-01 00:00:00 +0000 UTC Meta:null Fields:null Private:{Id:native::695236-64769 PrevId: Finished:true Fileinfo:<nil> Source:/var/log/remotelogs/X.X.X.X/1319244.log Offset:231 Timestamp:2021-08-14 13:04:48.85553314 +0200 CEST TTL:96h0m0s Type:log Meta:map[] FileStateOS:695236-64769 IdentifierName:native} TimeSeries:false}
2021-08-16T12:30:08.804+0200 DEBUG [input] file/states.go:68 New state added for /var/log/remotelogs/X.X.X.X/6299244.log
2021-08-16T12:30:08.804+0200 DEBUG [acker] beater/acker.go:59 stateful ack {"count": 1}
2021-08-16T12:30:08.805+0200 DEBUG [publisher] pipeline/client.go:231 Pipeline client receives callback 'onFilteredOut' for event: {Timestamp:0001-01-01 00:00:00 +0000 UTC Meta:null Fields:null Private:{Id:native::2706842-64769 PrevId: Finished:true Fileinfo:<nil> Source:/var/log/remotelogs/X.X.X.X/6299244.log Offset:231 Timestamp:2021-08-13 16:45:38.862771576 +0200 CEST TTL:96h0m0s Type:log Meta:map[] FileStateOS:2706842-64769 IdentifierName:native} TimeSeries:false}
2021-08-16T12:30:08.805+0200 DEBUG [input] file/states.go:68 New state added for /var/log/remotelogs/X.X.X.X/2606788.log
2021-08-16T12:30:08.806+0200 DEBUG [acker] beater/acker.go:59 stateful ack {"count": 1}
2021-08-16T12:30:08.806+0200 DEBUG [registrar] registrar/registrar.go:263 Processing 1 events
2021-08-16T12:30:08.807+0200 DEBUG [registrar] registrar/registrar.go:230 Registrar state updates processed. Count: 1
2021-08-16T12:30:08.811+0200 DEBUG [registrar] registrar/registrar.go:253 Registrar states cleaned up. Before: 341619, After: 341619, Pending: 1
2021-08-16T12:30:57.521+0200 INFO [monitoring] log/log.go:145 Non-zero metrics in the last 30s {"monitoring": {"metrics": {"beat":{"cgroup":{"cpuacct":{"total":{"ns":3146741874}},"memory":{"mem":{"usage":{"bytes":61538304}}}},"cpu":{"system":{"ticks":2490,"time":{"ms":1055}},"total":{"ticks":17040,"time":{"ms":3148},"value":17040},"user":{"ticks":14550,"time":{"ms":2093}}},"handles":{"limit":{"hard":1048576,"soft":1024},"open":10},"info":{"ephemeral_id":"bf4579f9-2851-4d81-a48c-61005fc9d9c1","uptime":{"ms":60044},"version":"7.14.0"},"memstats":{"gc_next":1306827200,"memory_alloc":1037048024,"memory_sys":262144,"memory_total":1751020024,"rss":1361379328},"runtime":{"goroutines":16}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"active":0}},"pipeline":{"clients":1,"events":{"active":1}}},"registrar":{"states":{"current":341619}},"system":{"load":{"1":1.27,"15":1.22,"5":1.17,"norm":{"1":1.27,"15":1.22,"5":1.17}}}}}}
We have a large number of log files, which is why I added ignore_older and clean_inactive.
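As far as I understand the documentation, clean_inactive must be greater than ignore_older + scan_frequency, which should be satisfied here:

  ignore_older: 72h      # files not modified in the last 72h are not picked up
  clean_inactive: 96h    # state removed after 96h of inactivity; 96h > 72h + 2s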
What could be the issue? Is there a way to fix this without deleting the registry, which would cause data to be duplicated?
Thank you.
Regards,
Nugroho