Import CSV file as input in Logstash and pass the output data to Elasticsearch

I am new to Elasticsearch and I am trying to import data from a CSV file into Elasticsearch using Logstash.
I am using the Logstash conf file below:

input {
  file {
    path => ["D:/AnalyticsTool/Exe/logstash-5.2.0/logstash-5.2.0/bin/data.csv"]
    type => "core2"
    start_position => "beginning"
  }
}
filter {
  csv {
    separator => ","
    columns => ["Date","Open","High","Low","Close","Volume","Adj Close"]
  }
  mutate { convert => ["High", "float"] }
  mutate { convert => ["Open", "float"] }
  mutate { convert => ["Low", "float"] }
  mutate { convert => ["Close", "float"] }
  mutate { convert => ["Volume", "float"] }
}
output {
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    index => "stock"
    workers => 1
    user => "elastic"
    password => "changeme"
  }
  stdout { codec => rubydebug }
}

I am not able to visualize the data in Kibana. Do we need to create the index manually in Elasticsearch, or will Kibana create the above-mentioned index automatically?
Please let me know if I am missing anything.

Thanks in advance


It would be helpful to know where exactly you are stuck with this. Do you see any errors in Logstash? Is the index missing in Elasticsearch? Or is it just a visualization issue?

For example, to see whether your "stock" index has been created, you can try issuing commands like:

curl http://localhost:9200/_cat/indices
curl http://localhost:9200/stock/_search

The issue is resolved. Actually, I had missed adding sincedb in the input section.
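For anyone hitting the same problem, the fix can be sketched roughly like this (the "NUL" value is just one option on Windows; it disables sincedb persistence so the file is re-read from the beginning on every run, and any writable file path would also work):

input {
  file {
    path => ["D:/AnalyticsTool/Exe/logstash-5.2.0/logstash-5.2.0/bin/data.csv"]
    type => "core2"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

Without a fresh sincedb, the file input plugin remembers how far it has already read the file, so a previously processed CSV produces no new events even with start_position => "beginning".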

Thanks for your reply

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.