I am trying to send some reports in CSV format to Logstash and then on to Elasticsearch, but I am getting this pipeline error:
[WARN ] [io.netty.channel.DefaultChannelPipeline] An exceptionCaught() event was fired, and it reached at the tail of the pipeline. It usually means the last handler in the pipeline did not handle the exception. java.net.SocketException: Connection reset ...
sample.csv
Title,Author,Subject,Pages,Publication Date
Test,John,Shipping,42,16 Dec 2018
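The pipeline config further down relies on autodetect_column_names. In case that autodetection is part of the problem, a variant of the csv filter with the columns from this header spelled out explicitly would look roughly like the following; this is only a sketch based on the header above, not something from my actual setup:

filter {
  csv {
    separator => ","
    # Column names taken from the header line of sample.csv
    columns => ["Title", "Author", "Subject", "Pages", "Publication Date"]
  }
}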
filebeat.yml
filebeat.inputs:
  - type: log
    paths:
      - /path/to/input.csv
    fields:
      report_type: book

output.logstash:
  hosts: ["10.0.0.1:5044"]

processors:
  - decode_csv_fields:
      fields:
        message: decoded.csv
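Since the Logstash pipeline below already parses the CSV with its csv filter, I am not sure the decode_csv_fields processor is needed at all, and I don't know whether decoding the CSV twice is related to the error. A stripped-down variant of the Filebeat config that just ships the raw lines and leaves all parsing to Logstash would look roughly like this (a sketch, untested):

filebeat.inputs:
  - type: log
    paths:
      - /path/to/input.csv
    fields:
      report_type: book

# Ship raw lines straight to Logstash; CSV parsing happens in the Logstash csv filter
output.logstash:
  hosts: ["10.0.0.1:5044"]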
logstash-pipeline.conf
input {
  beats {
    port => "5044"
    host => "10.0.0.1"
  }
}

filter {
  csv {
    autodetect_column_names => true
    separator => ","
  }
}

output {
  elasticsearch {
    hosts => ["http://10.0.0.1:9200"]
    index => "bookreports-%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
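To help narrow down whether the problem is on the Elasticsearch side or already at the Beats input, a minimal debug pipeline that keeps the same beats input and csv filter but writes events to stdout (rubydebug codec) instead of Elasticsearch would look roughly like this; again only a sketch, not my current config:

input {
  beats {
    port => "5044"
    host => "10.0.0.1"
  }
}

filter {
  csv {
    autodetect_column_names => true
    separator => ","
  }
}

output {
  # Print each parsed event to the Logstash console to confirm events arrive from Filebeat
  stdout {
    codec => rubydebug
  }
}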