Logstash is neither uploading the CSV nor creating an index in Elasticsearch

Hi All,
I am new to the ELK stack. I have read multiple articles about this same issue, but none of them solved my problem. Below is the log output I get after starting Logstash.
I have a CSV file of around 20-25 columns and 822 rows, but I am not able to send it. Can you please help!

Sending Logstash's logs to C:/ELK/logstash/logstash-5.3.1/logs which is now configured via log4j2.properties
[2017-04-27T08:38:26,530][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2017-04-27T08:38:26,534][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2017-04-27T08:38:26,684][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x4327b99 URL:http://localhost:9200/>}
[2017-04-27T08:38:26,686][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-04-27T08:38:26,746][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-04-27T08:38:26,750][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::HTTP:0x5729e2c0 URL:http://localhost:9200>]}
[2017-04-27T08:38:26,754][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500}
[2017-04-27T08:38:27,498][INFO ][logstash.pipeline ] Pipeline main started
[2017-04-27T08:38:27,687][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9601}

What does your Logstash configuration file look like? Please read the file input documentation carefully and pay special attention to what's said about sincedb files.
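For reference, a minimal file-input pipeline for a CSV on Windows might look like the sketch below. The paths, column names, and index name are placeholders, not from this thread. The usual gotcha when a file seems to be silently ignored is the sincedb: the file input remembers how far it has read, so on Logstash 5.x you typically set `start_position => "beginning"` and point `sincedb_path` at `"NUL"` (Windows) to force a re-read from the top while testing.

```
input {
  file {
    path => "C:/ELK/data/mydata.csv"   # placeholder path to your CSV
    start_position => "beginning"      # read the file from the top
    sincedb_path => "NUL"              # Windows: don't persist the read position
  }
}
filter {
  csv {
    separator => ","
    # columns => ["col1", "col2"]      # optionally name your columns
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mydata"                  # placeholder index name
  }
  stdout { codec => rubydebug }        # echo parsed events to the console for debugging
}
```

The `stdout` output is handy here: if events show up on the console but not in Elasticsearch, the problem is on the output side; if nothing prints at all, the input (usually the sincedb) is the culprit.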

Thanks @magnusbaeck, I figured it out. My conf was correct. I am doing a comparison between Splunk and ELK, so I was trying to migrate some data from Splunk to ELK, and the issue was that my data format was not correct.
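Since the root cause here turned out to be the data format, a quick sanity check before feeding a CSV to Logstash can save time. The following is a small Python sketch (the function and sample data are illustrative, not from this thread) that flags rows whose column count differs from the header, a common way a CSV exported from another tool ends up malformed:

```python
import csv
import io

def check_csv(f):
    """Return (number of header columns, list of (line_number, column_count)
    for rows that don't match the header width)."""
    reader = csv.reader(f)
    header = next(reader)
    bad = [(lineno, len(row))
           for lineno, row in enumerate(reader, start=2)
           if len(row) != len(header)]
    return len(header), bad

# Demo on an in-memory sample; line 3 is missing a column.
sample = io.StringIO("a,b,c\n1,2,3\n4,5\n")
cols, bad = check_csv(sample)
print(cols, bad)  # → 3 [(3, 2)]
```

For a real file, pass `open("mydata.csv", newline="")` instead of the `StringIO` sample; any rows reported can then be fixed before the import.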

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.