Logstash not indexing CSV to Elasticsearch 6.7.1

I have an ELK setup on Windows 7. When I try to index a CSV file into Elasticsearch using Logstash, no data gets indexed. Please help me.

logstash.conf:

input {
  file {
    path => "C:\Users\260210\Naresh\ELK\data\test.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    separator => ","
    columns => ['IncidentID', 'Status', 'AssignmentGroup']
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "hpsm"
  }
  stdout {}
}

logstash.log in debug mode:

[2019-05-10T10:26:13,872][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2019-05-10T10:26:13,887][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_f0052972-c704-47b8-8ebf-6207fa194d54"
[2019-05-10T10:26:13,887][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2019-05-10T10:26:13,887][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2019-05-10T10:26:15,338][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-05-10T10:26:15,432][DEBUG][logstash.outputs.stdout ] config LogStash::Outputs::Stdout/@id = "2cdf91d5a41f9dd9d5506650518b698f23e3f5e6528a2e6389c522aaa20c79d0"
[2019-05-10T10:26:15,432][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-05-10T10:26:15,447][DEBUG][logstash.outputs.stdout ] config LogStash::Outputs::Stdout/@enable_metric = true
[2019-05-10T10:26:15,447][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-05-10T10:26:15,447][DEBUG][logstash.outputs.stdout ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_f0052972-c704-47b8-8ebf-6207fa194d54", enable_metric=>true, metadata=>false>
[2019-05-10T10:26:15,447][DEBUG][logstash.outputs.stdout ] config LogStash::Outputs::Stdout/@workers = 1
[2019-05-10T10:26:15,494][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2019-05-10T10:26:15,572][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2019-05-10T10:26:16,056][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2019-05-10T10:26:16,071][DEBUG][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2019-05-10T10:26:16,290][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-05-10T10:26:16,336][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-05-10T10:26:16,336][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2019-05-10T10:26:16,368][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2019-05-10T10:26:16,383][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-05-10T10:26:16,383][DEBUG][logstash.filters.csv ] CSV parsing options {:col_sep=>",", :quote_char=>"\""}
[2019-05-10T10:26:16,414][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-05-10T10:26:16,508][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2019-05-10T10:26:17,132][INFO ][logstash.inputs.file ] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"D:/Naresh/ELK/logstash-6.7.1/logstash-6.7.1/data/plugins/inputs/file/.sincedb_aab74b5297e068e48e79b8faf28a80cb", :path=>["C:\Users\260210\Naresh\ELK\data\test.csv"]}
[2019-05-10T10:26:17,179][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x1682afe1 run>"}
[2019-05-10T10:26:17,241][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-05-10T10:26:17,241][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-05-10T10:26:17,272][DEBUG][logstash.agent ] Starting puma
[2019-05-10T10:26:17,288][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2019-05-10T10:26:17,366][DEBUG][logstash.api.service ] [api-service] start
[2019-05-10T10:26:17,631][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

On Windows you need to write the path with forward slashes, as shown below. The file input treats the path as a glob pattern, so backslashes are not handled as path separators:

path => "C:/Users/260210/Naresh/ELK/data/test.csv"

Thanks, it works!!
