Importing a huge quantity of CSV files into Elasticsearch with Logstash

Hi!
I'm new to ELK and have some questions about importing multiple CSV files into Elasticsearch via Logstash.
I have a folder containing more than 425 CSV files, and each file has different column headers. I want to import all of these CSV files into Elasticsearch via Logstash with a single config file.
Can you please tell me what my config file should look like?

Thanks.

If every file is different then you will likely need to have separate configs :frowning:

Thanks for your quick reply @warkolm. So if I understand correctly, that means if I absolutely want to import them with one config file, I must differentiate the path of each CSV file with conditionals, right?

You'd probably be better off running Logstash with -e and passing the config in via the command line. Then you can programmatically iterate over the various files and generate appropriate configs.

Sorry @warkolm, I'm new to ELK and I don't really understand what you mean :sweat_smile:
Can you please give me an example?

You can call Logstash with the -e flag and then pass in a config on the shell/CLI, per https://www.elastic.co/guide/en/logstash/6.0/running-logstash-command-line.html#command-line-flags.

Then you could script something to pass in changing configs based on the files and the headers.
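To make that concrete, here is a minimal POSIX shell sketch of the idea, assuming your files sit under a hypothetical `/path/to/csv` directory and that you want one index per file (both the path and the `csv-<name>` index naming are just placeholder assumptions):

```shell
# generate_config prints a minimal per-file Logstash config.
# The index naming scheme here is illustrative, not prescribed.
generate_config() {
  csv="$1"
  index="csv-$(basename "$csv" .csv)"
  printf 'input { stdin { } }\nfilter { csv { autodetect_column_names => true } }\noutput { elasticsearch { index => "%s" } }\n' "$index"
}

# Iterate over every CSV file and run Logstash once per file,
# feeding each file on stdin:
for f in /path/to/csv/*.csv; do
  generate_config "$f"                                # inspect the generated config
  # bin/logstash -e "$(generate_config "$f")" < "$f"  # actual run, one file at a time
done
```

The actual `logstash` invocation is left commented out so you can first check the generated configs look sane; starting a fresh JVM per file is slow, so for 425 files you may prefer to batch them or use one shared config if the headers allow it.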

Also we’ve renamed ELK to the Elastic Stack, otherwise Beats and APM feel left out! :wink:

Hi @warkolm,
I didn't find anything about iteration in Logstash. Can you please tell me how I can iterate over these various files?
It would be great if you could give me an example and tell me what the config file should look like.

That'll really depend on what OS you are on. Even then you're asking for a lot of things to be done, and I don't want to be rude, but I don't have the time to properly help you, I am sorry to say :frowning:

If you're relatively new to doing this sort of thing, are you able to ask someone at your company that might be able to walk you through it?

If the first line of each file has the column headers then you might be able to use the same config, assuming you want the fields of the document to be named after the columns.

```
filter { csv { autodetect_column_names => true } }
```
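If that holds for your files, a single config along these lines might cover the whole folder (the path, `sincedb_path`, and index name below are placeholders to adapt):

```
input {
  file {
    path => "/path/to/csv/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    autodetect_column_names => true
  }
}
output {
  elasticsearch {
    index => "csv-import"
  }
}
```

One caveat: `autodetect_column_names` takes the column names from the first line the filter sees, so you'd want to run with a single worker (`-w 1`) to keep events in order, and files with differing headers processed in the same running pipeline can still end up with another file's column names.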

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.