Hello.
I'm currently running a 6.4 ELK stack, with Filebeat 6.4 on another server.
Squid is running on that server and its access.log keeps getting new lines, but Filebeat is not able to read them.
I even tried to "echo >>" into the file, but Filebeat still reads nothing.
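For the record, the append test was just something along these lines (the exact content of the line doesn't matter, any new line should show up):
echo "manual test line" >> /var/log/squid/access.log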
Here is the Filebeat config:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    – /var/log/squid/access.log
output.logstash:
  hosts: ["10.137.200.18:5044"]
logging.level: debug
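As far as I can tell this follows the documented example for the log input, which I understand to look roughly like this (with my own path and Logstash host filled in):
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/squid/access.log   # one path per "-" list entry
output.logstash:
  hosts: ["10.137.200.18:5044"]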
Filebeat loads the input correctly:
2018-09-19T15:31:01.031+0200 INFO instance/beat.go:273 Setup Beat: filebeat; Version: 6.4.0
2018-09-19T15:31:01.031+0200 DEBUG [beat] instance/beat.go:290 Initializing output plugins
2018-09-19T15:31:01.031+0200 DEBUG [processors] processors/processor.go:66 Processors:
2018-09-19T15:31:01.034+0200 DEBUG [publish] pipeline/consumer.go:137 start pipeline event consumer
2018-09-19T15:31:01.034+0200 INFO pipeline/module.go:98 Beat name: squid
2018-09-19T15:31:01.035+0200 INFO instance/beat.go:367 filebeat start running.
2018-09-19T15:31:01.035+0200 DEBUG [registrar] registrar/registrar.go:114 Registry file set to: /var/lib/filebeat/registry
2018-09-19T15:31:01.036+0200 INFO registrar/registrar.go:134 Loading registrar data from /var/lib/filebeat/registry
2018-09-19T15:31:01.036+0200 INFO registrar/registrar.go:141 States Loaded from registrar: 0
2018-09-19T15:31:01.036+0200 WARN beater/filebeat.go:371 Filebeat is unable to load the Ingest Node pipelines for the configured modules because the Elasticsearch output is not configured/enabled. If you have already loaded the Ingest Node pipelines or are using Logstash pipelines, you can ignore this warning.
2018-09-19T15:31:01.036+0200 INFO crawler/crawler.go:72 Loading Inputs: 1
2018-09-19T15:31:01.036+0200 DEBUG [processors] processors/processor.go:66 Processors:
2018-09-19T15:31:01.037+0200 DEBUG [input] log/config.go:201 recursive glob enabled
2018-09-19T15:31:01.037+0200 DEBUG [input] log/input.go:147 exclude_files: []. Number of stats: 0
2018-09-19T15:31:01.037+0200 DEBUG [input] log/input.go:168 input with previous states loaded: 0
**2018-09-19T15:31:01.037+0200 INFO log/input.go:138 Configured paths: [/– /var/log/squid/access.log]**
2018-09-19T15:31:01.037+0200 INFO input/input.go:114 Starting input of type: log; ID: 1805975221398277003
2018-09-19T15:31:01.037+0200 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2018-09-19T15:31:01.038+0200 INFO [monitoring] log/log.go:114 Starting metrics logging every 30s
2018-09-19T15:31:01.038+0200 DEBUG [registrar] registrar/registrar.go:267 Starting Registrar
2018-09-19T15:31:01.038+0200 DEBUG [input] log/input.go:174 Start next scan
But even when I add lines to the file, Filebeat does not seem to notice anything:
2018-09-19T15:31:01.038+0200 DEBUG [input] log/input.go:174 Start next scan
2018-09-19T15:31:01.039+0200 DEBUG [input] log/input.go:195 input states cleaned up. Before: 0, After: 0, Pending: 0
2018-09-19T15:31:11.040+0200 DEBUG [input] input/input.go:152 Run input
2018-09-19T15:31:11.040+0200 DEBUG [input] log/input.go:174 Start next scan
2018-09-19T15:31:11.040+0200 DEBUG [input] log/input.go:195 input states cleaned up. Before: 0, After: 0, Pending: 0
2018-09-19T15:31:21.042+0200 DEBUG [input] input/input.go:152 Run input
2018-09-19T15:31:21.043+0200 DEBUG [input] log/input.go:174 Start next scan
2018-09-19T15:31:21.043+0200 DEBUG [input] log/input.go:195 input states cleaned up. Before: 0, After: 0, Pending: 0
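Could this be a permissions problem for the Filebeat process? From my own shell the file is readable and I can append to it (that's how I did the echo test above), for example:
ls -l /var/log/squid/access.log
tail -n 1 /var/log/squid/access.log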
Am I missing something?
Thanks in advance.