How to use Logstash to import a CSV file into Elasticsearch

Need help, guys,

I need to store data (CSV files) in Elasticsearch, but I'm confused about how to do that.

Do I have to install Logstash via Docker Compose first and then configure it manually?
What are the main steps?

Thank you,

idris

We aren't all guys :slight_smile:

How many CSV files are there? Do they change in number and format?

4 CSV files to store in Elasticsearch. I think they are the same format, but how can I check that? I'm not sure.

If there are just four and they are static, then look at this: https://www.elastic.co/blog/importing-csv-and-log-data-into-elasticsearch-with-file-data-visualizer
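
To check whether the four files really share a format, you can compare their header rows. Here's a quick sketch in Python (the filenames are placeholders; use your actual files):

    import csv

    # Hypothetical filenames -- replace with your four CSV files.
    files = ["file1.csv", "file2.csv", "file3.csv", "file4.csv"]

    headers = {}
    for name in files:
        with open(name, newline="") as f:
            headers[name] = next(csv.reader(f))  # first row = column names

    # If all four header rows are identical, the files share a format.
    if len({tuple(h) for h in headers.values()}) == 1:
        print("All files share the same header row")
    else:
        for name, h in headers.items():
            print(name, h)

If the header rows differ, you'll likely need a separate index mapping (or pipeline) per format.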

How can I add the Data Visualizer to Kibana?
I installed Kibana and Elasticsearch using Docker Compose:

services:

  zookeeper:
    image: rawmind/alpine-zk:3.4.10-0 # wurstmeister/zookeeper
    ports:
      - 2181:2181
      - 2888:2888
      - 3888:3888

  kafka:
    image: wurstmeister/kafka:0.11.0.1
    depends_on:
      - zookeeper
    ports:
      - "9092:9092"
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_RESERVED_BROKER_MAX_ID: "1001"
      KAFKA_BROKER_ID: "42"
      KAFKA_ADVERTISED_HOST_NAME: "kafka"
      KAFKA_CREATE_TOPICS: "newsin:1:1,newsout:1:1,legalin:1:1,legalout:1:1,legalfr:1:1,lematinin:1:1,lematinout:1:1,24heuresin:1:1,24heuresout:1:1"

  # search engine
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:6.0.0 # kiasaki/alpine-elasticsearch # docker.elastic.co/elasticsearch/elasticsearch:5.6.3
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      - xpack.security.enabled=false
    volumes:
      - $PWD/elasticsearch:/usr/share/elasticsearch/data

  # elasticsearch dashboard
  kibana:
    image: perriea/alpine-kibana
    ports:
      - "5601:5601"
    environment:
      ELASTICSEARCH_HOSTS: "http://elasticsearch:9200"
    links:
      - elasticsearch

  connect:
    image: confluentinc/cp-kafka-connect:3.3.0
    ports:
      - 8083:8083
    depends_on:
      - zookeeper
      - kafka
    volumes:
      - $PWD/connect-plugins:/connect-plugins
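
If you'd rather use Logstash itself, as in the topic title, one option is to add a logstash service to this compose file. This is only a sketch: the image tag, mount paths, and index name below are assumptions, not something confirmed in this thread.

  logstash:
    image: docker.elastic.co/logstash/logstash:6.0.0
    depends_on:
      - elasticsearch
    volumes:
      - $PWD/logstash/pipeline:/usr/share/logstash/pipeline   # pipeline .conf files
      - $PWD/csv:/data/csv                                    # the CSV files to import

And a minimal pipeline file to go with it (e.g. $PWD/logstash/pipeline/csv.conf):

input {
  file {
    path => "/data/csv/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # re-read the files on every restart; fine for a one-off import
  }
}

filter {
  csv {
    # Assumes the first row of each file is a header row.
    # With more than one pipeline worker, list the columns explicitly instead.
    autodetect_column_names => true
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "csv-data"   # placeholder index name
  }
}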
