Logstash starts but does not pass CSV data to index

Hey all,
I am new to Logstash and the ELK stack altogether, and I am trying to accomplish a simple task: reading data from a CSV file into Elasticsearch via Logstash. I followed the tutorial, and it seems as if Logstash starts a pipeline with no errors, but it doesn't read the CSV file. Below is my config file:

input {
  file {
    path => ["C:\toolkit\ELKstack\SamplData\SalesJan2009.csv"]
    start_position => "beginning"
    sincedb_path => "NUL"
    ignore_older => 0
  }
}

filter {
  csv {
    separator => ","
    columns => ["TransactionID","Product","Payment_Type","Name","City","State","Country"]
  }
  mutate {
    convert => ["TransactionID","integer"]
  }
}

output {
  elasticsearch {
    hosts => "127.0.0.1:9200"
    index => "customer"
    document_type => "customer"
    document_id => "%{id}"
    workers => 1
  }
  stdout {}
}
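To check whether anything actually makes it into the index, the document count can be queried directly (customer is the index name from my config; I am using curl here, but any HTTP client works):

curl -XGET "http://127.0.0.1:9200/customer/_count?pretty"

which is how I can tell nothing is arriving in the customer index.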

This is what Logstash outputs:

C:\toolkit\ELKstack\logstash-6.4.2\bin>logstash -f C:\toolkit\ELKstack\logstash-6.4.2\config\samplCof.config

Sending Logstash logs to C:/toolkit/ELKstack/logstash-6.4.2/logs which is now configured via log4j2.properties
[2018-10-12T11:35:20,929][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-10-12T11:35:21,471][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.2"}
[2018-10-12T11:35:23,890][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch index=>"customer", id=>"e8eafe54e92f8ada95b3e0c687a4b32406bcd43e331f6660727e42e74e05d2ca", document_id=>"%{id}", workers=>1, hosts=>[//127.0.0.1:9200], document_type=>"customer", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_560b3d5f-2b6b-4e1f-92aa-511f29f3db58", enable_metric=>true, charset=>"UTF-8">, manage_template=>true, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-10-12T11:35:25,422][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-10-12T11:35:25,820][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2018-10-12T11:35:25,830][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2018-10-12T11:35:25,979][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2018-10-12T11:35:26,027][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-10-12T11:35:26,030][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-10-12T11:35:26,058][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2018-10-12T11:35:26,079][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-10-12T11:35:26,102][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-10-12T11:35:26,618][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x774bb6f5 run>"}
[2018-10-12T11:35:26,671][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-10-12T11:35:26,684][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2018-10-12T11:35:26,987][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Should it not display the data it reads after it starts? Please advise.
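(I left stdout{} with the default codec; I believe the stdout output can also take the rubydebug codec to pretty-print each event, something like

stdout { codec => rubydebug }

in case the plain codec is just hiding the events.)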
Oh, and I am using Windows 10 64-bit.
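One thing I was unsure about when writing the input block: I have read that the file input wants forward slashes in path even on Windows, and I could not work out what ignore_older => 0 actually does (if it means "ignore files last modified more than 0 seconds ago", it might skip my file entirely). So a variant I have been meaning to try looks like this; treat it as a guess on my part, not something I know works:

input {
  file {
    # forward slashes instead of backslashes, as suggested for Windows paths
    path => ["C:/toolkit/ELKstack/SamplData/SalesJan2009.csv"]
    start_position => "beginning"
    sincedb_path => "NUL"
    # ignore_older dropped; with 0 it may ignore any file older than 0 seconds
  }
}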

Thanks,
Vlad

UPDATE:

I ran Logstash in debug mode and it seems I am stuck in a loop after it starts.
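For reference, I understand debug logging can be turned on from the command line with the --log.level flag (it can also be set in log4j2.properties; the exact invocation below is just how I would write it):

C:\toolkit\ELKstack\logstash-6.4.2\bin>logstash -f C:\toolkit\ELKstack\logstash-6.4.2\config\samplCof.config --log.level=debug

After startup, the log just cycles through entries like these: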

[2018-10-16T15:42:08,879][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-10-16T15:42:08,879][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-10-16T15:42:10,752][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x2d1b5d15 sleep>"}
[2018-10-16T15:42:13,756][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu

Has anyone run into this before? Help would be greatly appreciated.

Thanks in advance
