I have also tried sincedb_path => "NULL", checked my file permissions, and checked the forward slashes,
but I always get stuck here:
[2021-04-16T08:35:04,132][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-04-16T08:35:04,133][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-04-16T08:35:07,057][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2021-04-16T08:35:08,673][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
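For reference, a minimal pipeline of the kind being tested might look like the sketch below. The path, index name, and Elasticsearch host are assumptions, not taken from the thread. Note that on Windows the documented special sincedb value is "NUL" (one L), not "NULL" — "NULL" is treated as an ordinary relative filename:

```conf
input {
  file {
    # Use forward slashes even on Windows, and an absolute path
    path => "E:/test/data.csv"          # assumed location of the CSV
    start_position => "beginning"
    # On Windows the special value is "NUL", not "NULL"
    sincedb_path => "NUL"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]  # assumed local Elasticsearch
    index => "test-index"               # assumed index name
  }
  stdout { codec => rubydebug }
}
```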
I have tried feeding the data from stdin alone, without the file path, and that worked.
I have also tried on my Windows machine and on my Ubuntu VPS, and I couldn't import data on either machine.
I really hope I can find a solution soon.
Here is the loop I have been stuck in for a long time, without any index being added to my Elasticsearch.
I ran logstash -f test.conf --log.level trace
and this is what appears in the console:
[2021-04-16T20:46:40,279][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2021-04-16T20:46:40,573][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2021-04-16T20:46:40,576][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2021-04-16T20:46:40,653][TRACE][filewatch.discoverer ] discover_files {"count"=>0}
[2021-04-16T20:46:44,435][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
If your configuration includes a file input then there will always be trace-level messages from filewatch. Did you redirect the output to a file and search for them?
It is finding zero files that match that pattern. Is the E: drive mounted where you are running Logstash, and is the test directory readable by the user running Logstash?
I have also tried copying the CSV file to the C: drive, where Logstash is located. Logstash is also running as administrator, but I still get the same result: [filewatch.discoverer ] discover_files {"count"=>0}
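Since discover_files {"count"=>0} means the glob in path matched nothing, it can help to expand the same pattern in a shell before blaming Logstash. The sketch below builds a throwaway directory with one CSV and shows what a matching glob count looks like (the directory and filename are made up for illustration):

```shell
# Create a scratch directory containing a single CSV file
dir=$(mktemp -d)
touch "$dir/data.csv"

# Expand the same kind of glob Logstash's file input would use;
# if this prints count=0, the path pattern is wrong or unreadable
count=$(ls "$dir"/*.csv 2>/dev/null | wc -l | tr -d ' ')
echo "count=$count"

rm -rf "$dir"
```

Running the equivalent glob against the real path used in test.conf (for example ls E:/test/*.csv from Git Bash, or dir from cmd) quickly shows whether the pattern, drive letter, or permissions are the problem.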