Best Logstash plugin or advice to ingest this CSV file into Elasticsearch

Hello all,

I have this CSV file that I need to send to Elasticsearch:

It always has the same CSV structure:

field_1,field_2,field_3,field_4,field_5,field_6
888,0000000A,S,4,2020-09-08 00:00:00.0,name
999,0000000C,S,4,2020-09-08 00:00:00.0,name
222,0000000B,S,4,2020-09-08 00:00:00.0,name

....

This file arrives every day in a folder (and Filebeat sends the CSV to Logstash). I need to read the whole file, which contains several lines, and index each line into Elasticsearch as a separate document. What would be the best way to go about this?
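For the Filebeat side of the setup described above, a minimal `filebeat.yml` sketch might look like the following. The folder path, host, and port are assumptions for illustration; adjust them to your environment:

```
filebeat.inputs:
  - type: log
    paths:
      - /var/data/csv/*.csv        # hypothetical drop folder; change to yours
    exclude_lines: ['^field_1,']   # skip the header row so it is not sent as data

output.logstash:
  hosts: ["localhost:5044"]        # assumed Logstash host and port
```

The `exclude_lines` pattern matches the header line from the sample above, so only the data rows reach Logstash.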

Best regards

Unless you have taken steps to prevent it, Filebeat will send each line of the file as a separate event. You can parse those events using the csv filter.
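As a rough sketch of that approach, a Logstash pipeline could receive the Beats events, parse each line with the csv filter, and send each resulting event to Elasticsearch as its own document. The port, column names, index name, and Elasticsearch endpoint here are assumptions based on the sample data:

```
input {
  beats {
    port => 5044                    # assumed port matching the Filebeat output
  }
}

filter {
  csv {
    separator => ","
    columns => ["field_1", "field_2", "field_3", "field_4", "field_5", "field_6"]
  }
  date {
    match => ["field_5", "yyyy-MM-dd HH:mm:ss.S"]   # parse the timestamp column
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]   # assumed Elasticsearch endpoint
    index => "csv-%{+YYYY.MM.dd}"        # hypothetical daily index name
  }
}
```

The date filter is optional but lets the parsed timestamp in `field_5` drive `@timestamp` instead of the ingest time.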

Thanks Badget,

let me try the csv filter plugin to see if it fits.

Best regards

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.