What is the best way to load multiple CSV files into different indexes from one config file?
Hello @rp12,
This isn't something I have done before, but since no one else has replied, I'll describe how I would attempt it.
I would probably start with Filebeat and see how far I get with it:
https://www.elastic.co/guide/en/beats/filebeat/master/decode-csv-fields.html
You could ingest directly into Elasticsearch (https://www.elastic.co/guide/en/beats/filebeat/master/elasticsearch-output.html) and set the `index` option to something like `%{[csv_file_name]}` (which could get tricky with path delimiters), or to a field value, if you have one, that determines which index each document should go to. A sketch of that idea follows.
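Here is a minimal, untested Filebeat sketch of what I mean. The paths, hosts, the `dataset` field, and the index naming scheme are all placeholder assumptions, not something you can copy as-is:

```yaml
filebeat.inputs:
  # One input per CSV file (or glob), each tagged with a field
  # that picks the target index.
  - type: log
    paths: ["/data/csv/users.csv"]   # hypothetical file
    fields:
      dataset: users                 # hypothetical routing field
  - type: log
    paths: ["/data/csv/orders.csv"]
    fields:
      dataset: orders

processors:
  # Split each raw CSV line into an array of column values.
  - decode_csv_fields:
      fields:
        message: csv_columns
      separator: ","

output.elasticsearch:
  hosts: ["10.1.1.1:9200"]
  # Route each event to an index named after its dataset field.
  index: "csv-%{[fields.dataset]}-%{+yyyy.MM.dd}"

# Overriding the index also requires a matching template name/pattern,
# and, in versions where ILM is on by default, disabling ILM.
setup.template.name: "csv"
setup.template.pattern: "csv-*"
setup.ilm.enabled: false
```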
Logstash has a CSV filter as well. You could use a file input with a glob pattern to pick up several CSV files and, on the output, again use a field (or whatever else identifies a single file) as part of the index name; see the pipeline sketch below.
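A minimal, untested sketch of such a pipeline. The glob, the column names, and the grok pattern that derives `csv_file_name` from the path are assumptions you would need to adapt:

```
input {
  file {
    path => "/data/csv/*.csv"        # assumed location; the file input supports globs
    start_position => "beginning"
    sincedb_path => "/dev/null"      # re-read files on every run; not for production
  }
}

filter {
  csv {
    columns => ["col1", "col2", "col3"]  # assumed column names
  }
  # Derive an index-friendly name from the source file, e.g.
  # "/data/csv/users.csv" -> "users". Older Logstash exposes the
  # path as [path]; with ECS compatibility it is [log][file][path].
  grok {
    match => { "path" => "/(?<csv_file_name>[^/]+)\.csv$" }
  }
  mutate {
    lowercase => ["csv_file_name"]   # Elasticsearch index names must be lowercase
  }
}

output {
  elasticsearch {
    hosts => ["10.1.1.1:9200"]
    index => "%{[csv_file_name]}"
  }
}
```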
My own Logstash output looks like this; maybe it can give you some ideas:
```
output {
  elasticsearch {
    hosts => ["10.1.1.1:9200"]
    index => "%{[@metadata][log_prefix]}-%{[@metadata][index]}-%{[@metadata][date]}"
  }
}
```
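One design note on that snippet: fields under `[@metadata]` are visible to filters and outputs, so they work well for this kind of index routing, but they are not written into the indexed document itself.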