Hi,
I have Filebeat configured to read log files from a folder on my Linux box. The logs are rotated into 10 MB slices, so I configured Filebeat to scan the entire folder; there are approximately 10k entries per day. Filebeat forwards the events to Kafka, Kafka feeds Logstash, and Logstash writes to Elasticsearch:
FB --> KFK --> LS --> ES
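For context, the Logstash side is just a Kafka input feeding an Elasticsearch output, roughly like this (the Elasticsearch host and index name here are simplified placeholders):

input {
  kafka {
    bootstrap_servers => "192.168.1.1:9092"
    topics => ["filebeatlogs"]
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "dummylog-%{+YYYY.MM.dd}"
  }
}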
The problem is that sometimes, about a day after I start Logstash, the processes are all still running, but there is no index in Elasticsearch for that day, or only 5 or 6 entries show up. Once I terminate the Filebeat process and restart it, the entries get populated as normal and everything works fine again.
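To be concrete, this is how I check and recover (the Elasticsearch host and Filebeat paths are placeholders for my setup; I run Filebeat directly from the binary rather than as a service):

# List indices with doc counts; on bad days the day's index is missing or has only a few docs
curl 'localhost:9200/_cat/indices?v'

# The only workaround I have found: kill Filebeat and start it again
kill $(pidof filebeat)
/usr/share/filebeat/bin/filebeat -c /etc/filebeat/filebeat.yml &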
Below is my Filebeat configuration.
filebeat.prospectors:
- input_type: log
  paths:
    - /logfolder/logfile*
  exclude_files: [".lck"]
  fields:
    logtype: LOGFILE
  document_type: dummylog
  # Look for new files every minute; close a file handle after 10s without new data
  scan_frequency: 1m
  close_inactive: 10s
  # A new event starts at a bracketed timestamp; lines not matching the pattern
  # are appended to the previous event
  multiline.pattern: '^<[A-Za-z_]{3} [[:digit:]]{2}, [[:digit:]]{4} ([[:digit:]]{1}|[[:digit:]]{2}):[[:digit:]]{2}:[[:digit:]]{2}:([[:digit:]]{1}|[[:digit:]]{2}|[[:digit:]]{3}) [A-Z]{2}>'
  multiline.negate: true
  multiline.match: after
  multiline.max_lines: 5000

filebeat.registry_file: .regfile

output.kafka:
  enabled: true
  hosts: ["192.168.1.1:9092"]
  topic: filebeatlogs
  worker: 1
  # Retry a failed publish up to twice before giving up on the event
  max_retries: 2

# Also write the events to a local file
output.file:
  enabled: true
  path: "/outputfolder/logdata"
  filename: filebeatlogs
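For reference, each record in the log starts with a bracketed timestamp that the multiline pattern matches, and continuation lines (stack traces and so on) belong to the previous record. A made-up example record:

<Mar 15, 2017 1:23:45:678 PM> ERROR something went wrong
    at com.example.Foo.bar(Foo.java:42)
    at com.example.Main.main(Main.java:7)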
How can I fix this?