Failing to get CSV data indexed into Elasticsearch using Logstash on Windows

Hi everybody, please excuse my English.

I'm new to ELK, running on Windows 10 for the moment. I'm using WAMP server (not sure if that's relevant information).

I've been facing an issue for two days now, and I've tried to resolve it by reading a lot of threads about similar problems, but with no luck. I'm probably missing something, which is why I'm asking for help :slight_smile:

I succeeded in making the connection between MySQL data and Elasticsearch with Logstash and creating indices, but I can't achieve the same thing starting from a CSV file.

To help anyone who can help me, I'm posting the CSV sample I'm trying to index into Elasticsearch, the Logstash config, and the debug lines obtained with --debug.

For information, I've found a lot of threads discussing this issue, especially things about sincedb. Please bear in mind that I've tried nearly all the suggested configs: omitting sincedb_path, using "nul" as the sincedb_path value (since I'm running Logstash on Windows), and also "/dev/null", but no way to make it work. Here is my config.

Any help would be much appreciated, thank you!

The CSV Sample


The config file

    input {
      file {
        path => "D:\myuser\Clients\testfolder\retest.csv"
        start_position => "beginning"
        sincedb_path => "nul"
      }
    }

    filter {
      csv {
        separator => ","
        columns => ["name","gender"]
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "blablaidx"
      }
      stdout { codec => dots }
    }

Here are some debug lines (truncated):

 [2018-11-29T12:50:52,594][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2a1ae7ac sleep>"}
[2018-11-29T12:50:52,653][INFO ][filewatch.observingtail  ] START, creating Discoverer, Watch with file and sincedb collections
[2018-11-29T12:50:52,654][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2018-11-29T12:50:52,687][DEBUG][logstash.agent           ] Starting puma
[2018-11-29T12:50:52,703][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2018-11-29T12:50:52,760][DEBUG][logstash.api.service     ] [api-service] start
[2018-11-29T12:50:52,955][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-11-29T12:50:53,960][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2018-11-29T12:50:54,206][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-11-29T12:50:54,209][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-11-29T12:50:57,604][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x2a1ae7ac sleep>"}

Any help would be much appreciated, thank you :slight_smile:

Try this:

    file {
      path => "D:/myuser/Clients/testfolder/retest.csv"
      start_position => "beginning"
      sincedb_path => "NUL"
    }
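
For reference, here is what the whole pipeline looks like with that fix applied (forward slashes in the path so Logstash doesn't misread the backslash escapes, and "NUL" as the sincedb path so nothing is persisted on Windows). This is just a sketch reusing the same columns and index name from your post:

    input {
      file {
        path => "D:/myuser/Clients/testfolder/retest.csv"
        start_position => "beginning"
        sincedb_path => "NUL"
      }
    }

    filter {
      csv {
        separator => ","
        columns => ["name","gender"]
      }
    }

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        index => "blablaidx"
      }
      stdout { codec => dots }
    }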

Hi Lewis

Thank you so much for the help, it worked! I now have my data indexed, not yet with the correct data types, but that's a good first step. Thank you again :slight_smile:
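
For anyone hitting the same data-type issue: the csv filter parses every column as a string, so numeric fields need an explicit conversion with a mutate filter before the output. A minimal sketch, assuming a hypothetical numeric column called "age" alongside name and gender (adjust to your actual columns):

    filter {
      csv {
        separator => ","
        columns => ["name","gender","age"]
      }
      mutate {
        # convert the string value parsed from the CSV into an integer
        convert => { "age" => "integer" }
      }
    }

Alternatively, an Elasticsearch index template or explicit mapping for the index can control the field types on the Elasticsearch side.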

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.