Bulk Upload CSV Files Into Standalone ELK Stack In Docker

I have a bunch of CSV files I wish to ingest into Elasticsearch/Kibana, which are running locally in Docker containers.

I would like advice on how best to ingest these CSV files, which have 100,000+ rows, so that they end up as a single index within Elasticsearch.

The data I have on hand cannot be uploaded to any online cloud service.

Do I require Logstash for my use case? I am having trouble setting up the Logstash container properly; I cannot get my Logstash container to connect to my Elasticsearch container.

Please help me.

Yes, the best way to achieve this would be to use Logstash.

You can use the file input to read the data into a Logstash pipeline.
Then you can use the csv filter to parse the input text as CSV data.

You can try posting the error message to the folks in the Logstash forum area.

Thank you.

So is it just the Logstash container connecting to Kibana?

Do I even need an Elasticsearch Docker container?

Regarding how to bulk upload CSV files: it doesn't matter how the ELK components are installed. They can run in Docker, on separate hardware, or all locally on a single machine.

Using a Logstash configuration like this, you can bulk upload data from CSV files into Elasticsearch.

input {
    file {
        # Absolute path to the CSV file, as seen from inside the Logstash container/host
        path => "/path/to/your-file.csv"
        # Read the file from the top instead of only tailing new lines
        start_position => "beginning"
    }
}
filter {
    csv {
        # Name the columns in the order they appear in the CSV
        columns => [ "column1", "column2", "column3" ]
    }
}
output {
    elasticsearch {
        # Hostname of Elasticsearch (the container/service name when running in Docker)
        hosts => ["your-elasticsearch:9200"]
        # All documents land in this single index
        index => "your-index"
    }
}
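Since everything here runs in Docker, below is a minimal sketch of how the Logstash container could be wired up, assuming Elasticsearch and Kibana are defined as services in the same docker-compose.yml (Compose then puts all services on a shared default network); the image tag, service names, and volume paths are placeholders:

services:
  logstash:
    image: docker.elastic.co/logstash/logstash:${STACK_VERSION}
    volumes:
      - ./pipeline/:/usr/share/logstash/pipeline/:ro   # the pipeline .conf above goes here
      - ./csv-data/:/data/:ro                          # the CSV files referenced by path =>
    depends_on:
      - elasticsearch

With a layout like that, hosts => ["http://elasticsearch:9200"] uses the Compose service name rather than localhost, which is usually the reason a Logstash container cannot reach the Elasticsearch container.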

Thanks for the help.

Can I just check about path => "/path/to/your-file.csv"?

This path will need to cover multiple folders of CSV files within the same directory tree.

How do I include that in this configuration?

You can provide an array for the path, and use patterns in the path strings.
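For example, something like this would pick up every CSV file underneath a directory tree (the /data path is a placeholder for wherever your folders are mounted):

input {
    file {
        # "**" recurses into subdirectories; "*.csv" matches every CSV file in them
        path => ["/data/**/*.csv"]
        start_position => "beginning"
        # Optional: do not remember read positions, so files are re-read on each run
        sincedb_path => "/dev/null"
    }
}

There is no need to loop over folders yourself; the file input watches all matching files.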

I recommend reading through the docs for the parts involved and getting familiar with how you can configure things.

@tsullivan @stephenb

I have just git cloned all 3 repos.

I'm not sure if I should clone them into the same directory as my Logstash Docker image folder.

Where/how should I organise my Logstash workspace?

PS: docker-env is one level up; ES and Kibana are there. This docker-compose.yml is used to create the docker-env image.

You shouldn't have to install the plugins for the file input, elasticsearch output, and csv filter; I believe those come bundled with Logstash.
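If you want to confirm what is bundled, you can list the installed plugins from inside the container (the container name here is a placeholder):

docker exec your-logstash-container bin/logstash-plugin list | grep -E 'file|csv|elasticsearch'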

Since you have questions about using the Logstash plugins, you should create a new topic in the Logstash category.

@Ethan777100 ....

Let's stick to one thread, please, as we have covered much of this in the other thread; two topics just confuse everyone.

I am going to close this topic, and we can continue in the other one.

We have solved most of this.