Ingesting a CSV file with Logstash

Hello,
I am trying to ingest a CSV file with Logstash.

Can someone check my logstash.conf file? I want the content to be stored in Elasticsearch as well as printed on stdout.

```
input {
  file {
    path => ["/home/abhinavkumar.gurung/Applications/csv/devian/data/Admission_Predict.csv"]
    start_position => "beginning"   # read from the beginning of the file
    sincedb_path => "/dev/null"     # don't persist the read position, so the file is re-read on every run
    codec => plain {
      charset => "UTF-8"
    }
  }
}

filter {
  csv {
    separator => ","
    columns => ["Serial No","GRE Score","TOEFL Score","University Rating","SOP","LOR","CGPA","Research","Chance of Admit"]
  }

  mutate {
    convert => {
      "Serial No"         => "integer"
      "GRE Score"         => "integer"
      "TOEFL Score"       => "integer"
      "University Rating" => "integer"
      "SOP"               => "float"
      "LOR"               => "float"
      "CGPA"              => "float"
      "Research"          => "integer"
      "Chance of Admit"   => "float"
    }
  }
}

output {
  elasticsearch {
    action        => "index"
    hosts         => ["elasticsearch:9200"]
    document_type => "_doc"
    user          => "elastic"
    password      => "changeme"
    index         => "cars"
  }
  stdout {
    codec => rubydebug
  }
}
```

Did you try to run it? Did you get an error?

I did, and I don't see any errors, but there is no output on stdout and nothing stored in Elasticsearch.
Which log file should I post for more clarification?

Did you try to run:
`bin/logstash -f /etc/logstash/<your_conf_file_name>.conf`
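If Logstash is running under Docker Compose rather than directly from a shell, the pipeline's stdout ends up in the container logs instead of your terminal. Assuming the service is named `logstash` (as in the compose file below), something like this would show it:

```shell
# follow the Logstash container's output, where the rubydebug events would appear
docker-compose logs -f logstash

# open a shell listing inside the container to confirm the pipeline files
# are where Logstash expects them
docker-compose exec logstash ls /usr/share/logstash/pipeline
```

These commands are only a sketch; they assume a Compose v2-era CLI and that the containers are already up.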

I am running with docker-compose up.

```
version: '2'

services:

  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: 7.0.1
    volumes:
      - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      ELASTIC_PASSWORD: changeme
    networks:
      - elk

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: 7.0.1
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
    ports:
      - "5000:5000"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on:
      - elasticsearch

  kibana:
    build:
      context: kibana/
      args:
        ELK_VERSION: 7.0.1
    volumes:
      - ./kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml:ro
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch

networks:
  elk:
    driver: bridge
```

Can anyone with some knowledge of this share their thoughts? I have been stuck on this.
Thanks

In the past, whenever a file input completely failed to read a file in a Docker container, I have always found the answer to be that the file is not mounted in the container.
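To make that concrete, here is a minimal sketch of what the bind mount could look like under the `logstash` service in docker-compose.yml (the host folder `./data` and the container path `/usr/share/logstash/data` are assumptions for illustration, not taken from this thread):

```
    volumes:
      # bind-mount the host folder that holds the CSV into the container (read-only)
      - ./data:/usr/share/logstash/data:ro
```

The `path` in the file input then has to use the container-side path, e.g. `/usr/share/logstash/data/Admission_Predict.csv`, not the path as it appears on the host.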

In my case, I can see that the file is being copied from my local folder into the Logstash container. While Logstash was running, I checked the folder in the container and the data is there. Now I need to send that data to Elasticsearch and to stdout as well. Any suggestions on how I can do that?
Thanks

Hello,
Hope this example helps you.
[Load csv file in logstash]

I have the same config file for Logstash. Could I see your docker-compose file? My local CSV file is being copied into the Logstash container, but nothing shows up in Elasticsearch or on stdout.
What can I inspect, and where? I will post my compose and config files in an hour.

Hello,
I think the problem is not with the input config file but with the way you are executing Logstash. Please check thoroughly.

I am executing Logstash with docker-compose; could you check it and see if you spot anything? Thanks.

```
version: '2'

services:

  elasticsearch:
    build:
      context: elasticsearch/
      args:
        ELK_VERSION: 7.0.1
    volumes:
      - ./elasticsearch/config/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml:ro
    ports:
      - "9200:9200"
      - "9300:9300"
    environment:
      ES_JAVA_OPTS: "-Xmx256m -Xms256m"
      ELASTIC_PASSWORD: changeme
    networks:
      - elk

  logstash:
    build:
      context: logstash/
      args:
        ELK_VERSION: 7.0.1
    volumes:
      - ./logstash/config/logstash.yml:/usr/share/logstash/config/logstash.yml:ro
      - ./logstash/pipeline:/usr/share/logstash/pipeline:ro
      - ./path/to/storage:/usr/share/logstash/storage1:ro
    ports:
      - "5000:5000"
      - "9600:9600"
    environment:
      LS_JAVA_OPTS: "-Xmx256m -Xms256m"
    networks:
      - elk
    depends_on:
      - elasticsearch

  kibana:
    build:
      context: kibana/
      args:
        ELK_VERSION: 7.0.1
    volumes:
      - ./kibana/config/kibana.yml:/usr/share/kibana/config/kibana.yml:ro
    ports:
      - "5601:5601"
    networks:
      - elk
    depends_on:
      - elasticsearch

networks:
  elk:
    driver: bridge
```

Also, since Logstash is not outputting anything, even to the console, does that mean it is not ingesting at all? Logstash is able to copy the file from my local host into its container folder, though.
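One thing stands out when reading the compose file and the pipeline together: the compose file mounts `./path/to/storage` at `/usr/share/logstash/storage1` inside the container, but the pipeline's file input points at `/home/abhinavkumar.gurung/Applications/csv/devian/data/Admission_Predict.csv`, which only exists on the host. Inside the container that path matches nothing, so the file input would silently produce no events and no errors. A sketch of the input with a container-side path (the file name is taken from the pipeline above; that it sits directly under the mounted folder is an assumption):

```
input {
  file {
    # path as seen from inside the Logstash container, matching the volume mount
    path => ["/usr/share/logstash/storage1/Admission_Predict.csv"]
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
```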

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.