Configure Logstash to receive different CSV files from Filebeat and send them to different indices

I am trying to configure Logstash to receive different CSV files (/var/ticket/adr/adr_20171204101010.csv and /var/ticket/idr/idr_20171204101012.csv) from another server using Filebeat. How do I configure Logstash to send these files to different indices? Currently all of these CSVs end up in a single index.
The fields differ between the files, so how do I define the columns for each of these CSV files as well?

Use a csv filter to parse the CSV data into different fields. Use the fields configuration option on the Filebeat side to add fields that describe the kind of data in each file. With that in place you can, for example, add conditionals in your Logstash output to choose between different elasticsearch outputs. This is a commonly asked question, so there should be plenty of examples in the archives.
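As a rough sketch (untested; the paths, port, the csv_type field name, the column names and the index names are all placeholders to adjust for your setup), the Filebeat side could tag each prospector with a custom field:

filebeat.prospectors:
- paths:
    - /var/ticket/adr/*.csv
  fields:
    csv_type: adr
- paths:
    - /var/ticket/idr/*.csv
  fields:
    csv_type: idr

and the Logstash pipeline could then parse and route on that field:

input {
  beats {
    port => 5044
  }
}

filter {
  if [fields][csv_type] == "adr" {
    csv {
      columns => ["adr_col1", "adr_col2"]   # replace with the real adr columns
    }
  } else if [fields][csv_type] == "idr" {
    csv {
      columns => ["idr_col1", "idr_col2"]   # replace with the real idr columns
    }
  }
}

output {
  if [fields][csv_type] == "adr" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "adr-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "idr-%{+YYYY.MM.dd}"
    }
  }
}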

Thanks for the response. I will try it out.

Is there any option to define the first row as the column names?

autodetect_column_names => true
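In context, that option goes on the csv filter, roughly like this (the separator is just shown for illustration):

csv {
  separator => ","
  autodetect_column_names => true   # take the column names from the first row seen
}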

I tried enabling the autodetect_column_names option, but it didn't work out.

It only works if I define the columns manually:

columns => ["userID" , "Date(ms)" , "Type" , "StartDate(ms)" , "Duration(s)" ]
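For reference, the explicit version inside the filter would look roughly like this (separator assumed). One likely reason autodetect_column_names misbehaves is that it takes the names from the first event the filter happens to process; if I remember right, the csv filter docs note that pipeline workers must be set to 1 for that option to work reliably. With explicit columns that ordering issue goes away:

csv {
  separator => ","
  columns => ["userID", "Date(ms)", "Type", "StartDate(ms)", "Duration(s)"]
}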
