I have a bunch of CSV files I wish to ingest into Elasticsearch/Kibana, which are running locally in Docker containers.
I'd like advice on how best to ingest these CSV files, which have 100,000+ rows, so that they are represented as a single index within Elasticsearch.
The data I have on hand is not allowed to be uploaded to any online cloud.
Do I need Logstash for my use case? I am having trouble setting up the Logstash container properly; I cannot get it to connect to my Elasticsearch container.
With regard to bulk-uploading CSV files, it doesn't matter how the ELK components are installed: they can run in Docker, on separate machines, or all locally on a single host.
You can bulk upload data from CSV files into Elasticsearch using a Logstash configuration like the one below.
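Here is a minimal sketch of such a pipeline. The file path, column names, and index name are placeholders you would replace with your own; it assumes your CSV directory is mounted into the Logstash container at /usr/share/logstash/data and that the Elasticsearch container is reachable under the hostname elasticsearch on the same Docker network.

    input {
      file {
        path => "/usr/share/logstash/data/*.csv"   # hypothetical mount point for your CSV files
        start_position => "beginning"
        sincedb_path => "/dev/null"                # re-read files on every run; convenient for one-off imports
      }
    }

    filter {
      csv {
        separator => ","
        skip_header => true
        columns => ["id", "name", "value"]         # placeholder column names; replace with your CSV headers
      }
    }

    output {
      elasticsearch {
        hosts => ["http://elasticsearch:9200"]     # the Elasticsearch container's name on the Docker network
        index => "csv-data"                        # single index that all the CSV rows go into
      }
      stdout { codec => rubydebug }                # optional: print each event for debugging
    }

One note on the connection problem you mention: from inside the Logstash container, localhost refers to Logstash itself, so hosts must point at the Elasticsearch container by its Docker service or container name (or its network address), and both containers need to be on the same Docker network.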