CSV and XLS import to Elastic Cloud

Hello, I would like to automatically integrate some CSV and XLS files into Elastic Cloud. How could I do this?

These files are updated every morning.

Hi @Vog93,

Welcome to the community. I'm assuming you don't want to manually upload the CSV file each day?

Could you make use of a script running on a schedule, using a tool like cron, to process and upload the file? An example would be a Python script that reads the CSV and then uploads the documents using the Elasticsearch Python client.
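For example, here is a minimal sketch of that idea (the file path, index name, and Cloud ID/API key are hypothetical placeholders, not something specific to your setup):

```python
# daily_csv_ingest.py - run on a schedule, e.g. from cron:
#   0 6 * * * python3 /opt/scripts/daily_csv_ingest.py
import csv

from elasticsearch import Elasticsearch, helpers

# Hypothetical credentials - copy the Cloud ID and an API key
# from your Elastic Cloud deployment.
es = Elasticsearch(cloud_id="YOUR_CLOUD_ID", api_key="YOUR_API_KEY")

def actions(path, index):
    """Yield one bulk-index action per CSV row."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield {"_index": index, "_source": row}

if __name__ == "__main__":
    # Hypothetical path and index name.
    helpers.bulk(es, actions("/data/daily.csv", "daily-csv"))
```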

Thanks for your reply; yes, I don't want to manually upload the CSV file each day.
Is there any alternative using an Elastic Agent rather than the Python script?

Filebeat or Elastic Agent can upload daily if you name the file differently each day...

Neither reads Excel .xls files natively, just CSV.
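If you need the .xls files too, one option is converting them to CSV first; here is a minimal sketch with pandas (hypothetical paths; reading legacy .xls also requires the xlrd package):

```python
# xls_to_csv.py - convert a legacy .xls sheet to CSV so Filebeat can read it.
import pandas as pd

# Hypothetical paths; pandas uses xlrd under the hood for old .xls files.
df = pd.read_excel("/data/report.xls")   # reads the first sheet by default
df.to_csv("/data/report.csv", index=False)
```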

Thanks Stephenb, how can I automate the daily upload, naming the file differently each day, with Filebeat or Elastic Agent?
Do you have any links to documentation or tutorials?
Thank you very much for your help!

Here is the quick start for filebeat...

See if you can follow that and read the file into Elastic Cloud..

You don't need any modules....

Get that working then we can show you how to parse the CSV....
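To give you a rough idea of the target, a minimal filebeat.yml could look something like this sketch (the paths, input id, and credentials are placeholders):

```yaml
# filebeat.yml - minimal sketch: ship raw CSV lines to Elastic Cloud.
filebeat.inputs:
  - type: filestream
    id: daily-csv                 # any unique id for this input
    paths:
      - /data/csv/*.csv           # picks up each day's differently named file

cloud.id: "YOUR_CLOUD_ID"         # from the Elastic Cloud console
cloud.auth: "elastic:YOUR_PASSWORD"
```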

Thank you Stephen, I am going to try it!

You can also use Logstash: Enriching Your Postal Addresses With the Elastic Stack - Part 1 | Elastic Blog

Or filebeat: 4 years at elastic! - David Pilato (although it's very old, it's still accurate).
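As a rough sketch (hypothetical file path and column names; `sincedb_path => "/dev/null"` just forces a full re-read while you test), a Logstash pipeline for a CSV could look like:

```conf
# csv-pipeline.conf - minimal sketch: parse CSV rows and print them.
input {
  file {
    path => "/data/csv/daily.csv"       # hypothetical path
    start_position => "beginning"
    sincedb_path => "/dev/null"         # testing only: always re-read the file
  }
}

filter {
  csv {
    separator => ","
    skip_header => true
    columns => ["id", "name", "city"]   # hypothetical column names
  }
}

output {
  stdout { codec => rubydebug }         # swap for elasticsearch once it looks right
}
```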

I managed to do it using Logstash. What does the output look like if I want to use Elastic Cloud?

It could be similar...

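For Elastic Cloud specifically, the output section could be a minimal sketch like this (cloud_id, cloud_auth, and the index name are placeholders from your own deployment):

```conf
output {
  elasticsearch {
    cloud_id   => "YOUR_CLOUD_ID"          # from the Elastic Cloud console
    cloud_auth => "elastic:YOUR_PASSWORD"  # or an api_key instead
    index      => "daily-csv"              # hypothetical index name
  }
}
```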

Thanks for all your responses.
I managed to upload the CSV file to Elastic Cloud with Filebeat.
First I uploaded the file to Elastic and copied the filebeat.yml that Elastic created for that file.
The thing is that I don't want to do this every time I want to upload a file. I don't want to modify the YML and reinstall every time, because I will add more CSVs in the future and I need a scalable solution.
What alternative can I implement?

Once filebeat has started and is monitoring a directory, you can normally just add your new csv file and it will be picked up.

You can ask for help in Beats with the filebeat tag.

Thanks for your response, but the CSV files are different, with different headers and so on. How is it going to be able to recognize the structure without modifying the filebeat.yml every time I want to ingest a new CSV?

No easy solution here. I guess you could maybe do something with a ruby filter, but as this is probably too generic, I'm not sure about the outcome.

I'm moving your question to Logstash in case others have ideas.

But in general I think you should do that manually every time. Specifically with the mapping.
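If you do want to try the generic route, one possibility (instead of a ruby filter) is the csv filter's autodetect_column_names option. A minimal, untested sketch, assuming every file starts with a header row (the docs note this needs pipeline.workers set to 1 so the header line is processed first):

```conf
filter {
  csv {
    separator => ","
    autodetect_column_names => true   # take column names from each file's first line
  }
}
```

Even then, the index mapping would still need manual attention, as said above.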

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.