I am trying to import a CSV file into Elasticsearch using Logstash. No error is displayed on the screen after running logstash.bat, but no index is created in Elasticsearch. Please help.
Below is the content of the most recent log file, followed by a rough sketch of the pipeline configuration I am using.
[2018-10-01T00:21:52,539][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.1"}
[2018-10-01T00:21:59,852][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-10-01T00:22:00,730][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-10-01T00:22:00,743][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-10-01T00:22:01,164][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-10-01T00:22:01,300][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-10-01T00:22:01,312][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-10-01T00:22:01,436][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-10-01T00:22:01,404][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2018-10-01T00:22:01,651][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-10-01T00:22:02,983][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"C:/Users/ahkartika/Downloads/logstash-6.4.1/data/plugins/inputs/file/.sincedb_c810ab01ee71279c8ef52f6ad226d496", :path=>["C:\TESLA\ELASTIC\data\cars.csv"]}
[2018-10-01T00:22:03,053][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x6f6a12e4 run>"}
[2018-10-01T00:22:03,172][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-10-01T00:22:03,173][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-10-01T00:22:03,845][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
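
For reference, the pipeline configuration I am running follows roughly this shape. The file path and host match the log above; the csv filter columns, the index name, and start_position below are simplified placeholders rather than my exact settings:

input {
  file {
    path => "C:\TESLA\ELASTIC\data\cars.csv"
    start_position => "beginning"            # assumed setting
  }
}

filter {
  csv {
    separator => ","
    columns => ["maker", "model", "price"]   # placeholder column names
  }
}

output {
  elasticsearch {
    hosts => ["localhost"]
    index => "cars"                          # placeholder index name
  }
}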