Can't send CSV data to Elasticsearch using Logstash

Hi there,

I'm new to ELK and I've been facing this issue for two days: I can't send CSV data to Elasticsearch with Logstash.

I'm running ELK on Windows 10, and I'm using WAMP server (not sure if this is relevant).

I've read a lot of threads before posting here; please note that I've tried different configs and sincedb values.

This is the CSV data I've used to run my tests:

"name","gender"
"john","m"
"alicia","f"
"dan","m"
"sara","f" 

This is the last config I've used (sincedb_path "nul" since I'm on Windows):

input {
  file {
    path => "D:\myuser\Clients\folder\retest.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}

filter {
  csv {
    separator => ","
    columns => ["name","gender"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "blablaidx"
  }

  stdout { codec => dots }
}
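(Side note on the output: the dots codec prints a single dot per event, so an empty pipeline and a silent console look the same. A minimal sketch of a more talkative output, swapping in the stock rubydebug codec and leaving everything else unchanged:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "blablaidx"
  }
  # rubydebug prints each event in full on the console, which makes it
  # obvious whether the file input is reading anything at all
  stdout { codec => rubydebug }
})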

And this is some (truncated) debug output:

[2018-11-29T12:50:51,488][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-11-29T12:50:51,541][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-11-29T12:50:51,886][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-11-29T12:50:51,894][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-11-29T12:50:52,045][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-11-29T12:50:52,100][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-11-29T12:50:52,105][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-11-29T12:50:52,137][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
[2018-11-29T12:50:52,161][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-11-29T12:50:52,161][DEBUG][logstash.filters.csv     ] CSV parsing options {:col_sep=>",", :quote_char=>"\""}
[2018-11-29T12:50:52,185][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-11-29T12:50:52,234][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-11-29T12:50:52,594][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2a1ae7ac sleep>"}
[2018-11-29T12:50:52,653][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2018-11-29T12:50:52,654][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-11-29T12:50:52,687][DEBUG][logstash.agent           ] Starting puma
[2018-11-29T12:50:52,703][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2018-11-29T12:50:52,760][DEBUG][logstash.api.service     ] [api-service] start
[2018-11-29T12:50:52,955][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-29T12:50:53,960][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2018-11-29T12:50:54,206][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-29T12:50:54,209][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-29T12:50:57,604][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x2a1ae7ac sleep>"}
[2018-11-29T12:50:58,981][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2018-11-29T12:50:59,220][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-29T12:50:59,221][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-29T12:51:02,610][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x2a1ae7ac sleep>"}
[2018-11-29T12:51:03,987][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2018-11-29T12:51:04,239][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-29T12:51:04,240][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-29T12:51:07,613][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x2a1ae7ac sleep>"}
[2018-11-29T12:51:08,992][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu

I don't know if I've made a mistake somewhere, but any help will be much appreciated. Thank you.

Change the path to use forward slashes:

D:/myuser/Clients/gateaucreation/retest.csv
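Applied to the config in the question, the input block would look like this (a sketch that keeps the original settings and just flips the slashes):

input {
  file {
    # the file input needs forward slashes in the path, even on Windows;
    # with the backslash path, the file is never picked up
    path => "D:/myuser/Clients/folder/retest.csv"
    start_position => "beginning"
    sincedb_path => "nul"
  }
}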

Try it like this:

input {
  file {
    path => "D:/Balu/ELK-stack/csvfile.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  csv {
    separator => ","
    columns => ["open","high","low","close","volume"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "wallet-address-index"
  }
  stdout { codec => rubydebug }
}
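(One caveat: the columns and index name in this config come from the replier's own test data. Adapted to the CSV sample in the question, the filter would be:

filter {
  csv {
    separator => ","
    # column names matching the header row of the sample CSV above
    columns => ["name","gender"]
  }
})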


Hey, thank you!

I mistakenly (due to Akismet) duplicated this post, and the issue was solved with the help of a community member. Yes, the problem was my path. Thank you very much for taking the time to answer :wink:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.