Bulk export of several directories of CSV files to Elasticsearch

I have several directories, each containing several subdirectories, and each of those contains CSV files. I would like to export all of these CSV files to Elasticsearch using Logstash and have it pick up the column names automatically. Note that different CSV files have different column names; there are about 7 types of CSV files in these directories, and each type has the same column names.

Please suggest what I should do in such a case. This is my current configuration:

input {
  file {
    # Recursive glob: pick up every *.csv under every subdirectory of /dir/dir*
    path => "/dir/dir*/**/*.csv"
    start_position => "beginning"
  }
}

filter {
  # Drop lines containing a space, otherwise parse the line as CSV
  if ([message] =~ " ") {
    drop { }
  } else {
    csv { }
  }
}

output {
  elasticsearch {
    hosts => ["http://10.0.0.4:9200"]
  }
  stdout { }
}

The csv filter has an autodetect_column_names option that probably does what you're looking for.
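
For instance, something along these lines (an untested sketch; autodetect_column_names takes the column names from the first event the filter sees, so as far as I know you also need to run the pipeline with a single worker, e.g. pipeline.workers: 1, so events stay in order):

filter {
  csv {
    # Use the first line of input as the column names
    autodetect_column_names => true
    # Drop the header event itself once the names have been read
    skip_header => true
  }
}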

But nothing in the documentation about how to use it.

Well, there's this: Csv filter plugin | Logstash Reference [8.11] | Elastic

Actually the columns are in the second row, not in the first row. What should I do in such a case?

That's probably not supported.
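
A possible workaround, if the first row can be identified by some pattern (the "^Report" prefix below is just a placeholder): drop that row before the csv filter, so the header on the second row becomes the first event the filter sees and autodetect_column_names picks the names up from there. This is an untested sketch and, like above, it relies on event order, so it would only be reliable with pipeline.workers set to 1.

filter {
  # Placeholder condition: drop the non-header first row, assuming it
  # starts with something recognisable such as "Report"
  if [message] =~ /^Report/ {
    drop { }
  }
  csv {
    autodetect_column_names => true
    skip_header => true
  }
}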
