I am trying to ship a CSV file to my Logstash instance, and every time I start the process it gets stuck at 2016/08/04 04:38:18.693034 log.go:113: INFO Harvester started for file: ../logs/ldap-access.csv
It works fine if I just send a normal log file, but for some reason the CSV file does not ship, and there are no obvious errors.
Here is my filebeat configuration:
The logs I am getting are:
[root@ip-0-0-0-0 filebeat-1.2.3-x86_64]# ./filebeat -e -d "publish"
2016/08/04 04:38:18.690593 geolite.go:24: INFO GeoIP disabled: No paths were set under output.geoip.paths
2016/08/04 04:38:18.690750 logstash.go:106: INFO Max Retries set to: 3
2016/08/04 04:38:18.691240 outputs.go:126: INFO Activated logstash as output plugin.
2016/08/04 04:38:18.691253 publish.go:232: DBG Create output worker
2016/08/04 04:38:18.691310 publish.go:274: DBG No output is defined to store the topology. The server fields might not be filled.
2016/08/04 04:38:18.691341 publish.go:288: INFO Publisher name: ip-0-0-0-0.ap-southeast-2.compute.internal
2016/08/04 04:38:18.691791 async.go:78: INFO Flush Interval set to: 1s
2016/08/04 04:38:18.691801 async.go:84: INFO Max Bulk Size set to: 2048
2016/08/04 04:38:18.691810 async.go:92: DBG create bulk processing worker (interval=1s, bulk size=2048)
Any help will be greatly appreciated.