Problem importing CSV with Logstash

I have read a lot of similar problems in this nice forum, but none of them fit my situation.

My .conf file:

input {
  file {
    path => "E:/test/*.csv"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}

filter {
  csv {
    separator => ","
    columns => ["phone","id","name"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "test"
  }
  stdout {
    codec => rubydebug
  }
}
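
One note on the config above: the null device that disables sincedb persistence is platform-specific, NUL on Windows and /dev/null on Linux ("NULL" is just an ordinary file name on either). A sketch of the input block for a Linux machine, with a placeholder path:

input {
  file {
    path => "/path/to/test/*.csv"    # placeholder; the Windows E:/ path will not exist on Linux
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}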

I have also tried sincedb_path => "NULL", checked my file permissions, and checked the forward slashes,
but I always get stuck here:

[2021-04-16T08:35:04,132][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-04-16T08:35:04,133][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-04-16T08:35:07,057][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2021-04-16T08:35:08,673][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu

I have tried feeding the data from stdin, without the file input, and it worked.
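
For reference, a sketch of the stdin test config, assuming the same filter and output sections as above (run Logstash with it and paste a CSV line into the console):

input {
  stdin { }
}

# filter and output blocks unchanged from the config above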

I have tried this on both my Windows machine and my Ubuntu VPS, and I could not import the data on either machine.
I really hope I can find a solution soon.

Enable --log.level trace and review the messages from the filewatch module. See here.

Thanks a lot for your fast response.

Here is the loop I have been stuck in for a long time, without any index being added to my Elasticsearch.

I typed logstash -f test.conf --log.level trace

and this is what appears in the console:

[2021-04-16T20:46:40,279][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2021-04-16T20:46:40,573][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-04-16T20:46:40,576][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-04-16T20:46:40,653][TRACE][filewatch.discoverer     ] discover_files {"count"=>0}
[2021-04-16T20:46:44,435][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu

If your configuration includes a file input then there will always be trace level messages from filewatch. Did you redirect the output to a file and search for them?

I am sorry, but what does it mean to redirect the output to a file?

Try

logstash -f test.conf --log.level trace > logstash.log
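
Then search the file for the filewatch lines, for example:

findstr filewatch logstash.log    (Windows)
grep filewatch logstash.log       (Linux)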

[TRACE][filewatch.discoverer ] discover_files {"count"=>0}

It is finding zero files that match that pattern. Is the E: drive mounted where you are running Logstash? Is the test directory readable by the user running Logstash?
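
A quick sanity check is to list the files with the same pattern from the account that runs Logstash, for example:

dir E:\test\*.csv

on Windows, or on the Ubuntu VPS (where an E:/ path can never match, so substitute a local directory):

ls /path/to/test/*.csv

If these list nothing, the file input will not discover anything either.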

I have also tried copying the CSV file to the C: drive, where Logstash is located. Logstash is also running as administrator, but I am still getting the same result:
[filewatch.discoverer ] discover_files {"count"=>0}
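
For reference, a sketch of the input block I would expect to work for the copy under C:, with a hypothetical directory name and forward slashes throughout:

input {
  file {
    path => "C:/test/*.csv"          # hypothetical directory; forward slashes even on Windows
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}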
