I am trying to ingest some large log files into ELK. I recently set up Filebeat to monitor a watch folder that I drop logs into, and this has been working great on my sample data. But when I copy over a directory with a day's worth of real data, Filebeat processes a few hundred thousand lines and then stops.
I can see that the process is still running, but I notice the following error in /var/log/messages:
/usr/bin/filebeat: file.go:84: Fail to convert the event to JSON: reflect: call of reflect.Value.IsNil on zero Value
A 'service filebeat restart' repairs it, and it continues processing logs (though I'm uncertain whether there are gaps). This has failed about half a dozen times today, and I know it isn't stable enough for what we need long term.
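As a stopgap, I have been considering automating the restart with a small cron watchdog. This is only a sketch of that workaround, not a fix; the error string is the one from /var/log/messages above, and the demo log file here just stands in for it:

```shell
#!/bin/sh
# Watchdog sketch: restart filebeat when new occurrences of the
# JSON-conversion error show up in the log since the last run.
ERR='Fail to convert the event to JSON'
LOG=$(mktemp)    # in production: /var/log/messages
STATE=$(mktemp)  # persists the last-seen error count between cron runs

# Simulate one new occurrence of the error appearing in the log.
echo '/usr/bin/filebeat: file.go:84: Fail to convert the event to JSON' >> "$LOG"
echo 0 > "$STATE"

current=$(grep -c "$ERR" "$LOG")
previous=$(cat "$STATE")
if [ "$current" -gt "$previous" ]; then
    # In production, replace this echo with: service filebeat restart
    echo "restarting filebeat ($previous -> $current errors)"
fi
echo "$current" > "$STATE"
rm -f "$LOG" "$STATE"
```

Obviously this just papers over the bug, which is why I would rather understand why the error occurs in the first place.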
My filebeat.yml is relatively simple. Here is the prospector being used:
filebeat:
  prospectors:
    -
      paths:
        - /data/filebeat/watch/*/sn.log.*
      fields:
        logsrc-addr: 188.8.131.52
        logsrc-name: name
        logsrc-site: site
        logsrc-cust: tenant
        logsrc-tz: MST
      document_type: logstash_type
      fields_under_root: true
      input_type: log
      exclude_files: [".gz$"]
...
output:
  logstash:
    hosts: ["mylogstashhost:5044"]
  file:
    path: "/data/filebeat/diags"
    number_of_files: 20
    rotate_every_kb: 10000