Hi!
I'm new to the ELK stack and have some questions about importing multiple CSV files into Elasticsearch via Logstash.
I have a folder containing more than 425 CSV files, and each CSV file has different column headers. I want to import all of these CSV files into Elasticsearch via Logstash with one config file.
Can you please tell me what my config file should look like?
Thanks for your quick reply @warkolm. So if I understand correctly, that means if I absolutely want to import them with one config file, I have to differentiate the path of each CSV file with conditionals, right?
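Something like this is what I have in mind (just a rough sketch; the paths, column names, and hosts are made up):

    input {
      file {
        path => "/data/csv/*.csv"              # made-up folder
        start_position => "beginning"
      }
    }
    filter {
      # branch on the source file; the file input stores it in the "path" field
      if [path] =~ "file1\.csv" {
        csv { columns => ["colA", "colB"] }
      } else if [path] =~ "file2\.csv" {
        csv { columns => ["colX", "colY", "colZ"] }
      }
    }
    output {
      elasticsearch { hosts => ["localhost:9200"] }
    }

With 425 files that would mean 425 branches, which is why I'm asking whether there is a better way.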
You'd probably be better off running Logstash with -e and passing the config in via the command line. Then you can programmatically iterate over the various files and generate appropriate configs.
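For example, a script could loop over the files and, for each one, substitute the path, columns, and index into a config template and run it. One generated invocation might look roughly like this (untested sketch; the file name, columns, hosts, and index are placeholders):

    bin/logstash -e '
      input {
        file {
          path => "/data/csv/file1.csv"        # substituted per file
          start_position => "beginning"
        }
      }
      filter {
        csv { columns => ["colA", "colB"] }    # substituted per file
      }
      output {
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "file1"                     # substituted per file
        }
      }
    '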
Hi @warkolm,
I didn't find anything about iteration in Logstash. Can you please tell me how I can iterate over these various files?
It would be great if you could give me an example and show me what the config file should look like.
That'll really depend on what OS you are on. Even then, you're asking for quite a lot to be done, and I don't want to be rude, but I'm sorry to say I don't have the time to help you properly.
If you're relatively new to doing this sort of thing, are you able to ask someone at your company who might be able to walk you through it?
If the first line of each file has the column headers, then you might be able to use the same config, assuming you want the fields of the document to be named after the columns.
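For example, the csv filter can pick the column names up from each file's header line. A minimal sketch (the paths, hosts, and index name are placeholders, and this assumes a one-off import):

    input {
      file {
        path => "/data/csv/*.csv"         # placeholder folder with all the CSV files
        start_position => "beginning"
        sincedb_path => "/dev/null"       # re-read the files on every run (one-off import)
      }
    }
    filter {
      csv {
        autodetect_column_names => true   # take field names from the header line
        skip_header => true               # don't index the header line itself
      }
    }
    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "csv-import"             # placeholder index name
      }
    }

Note that autodetect_column_names needs a single pipeline worker (run Logstash with -w 1) to behave deterministically, so it's worth testing this on a couple of files first.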