I have the following configuration:
input {
  file {
    path => "C:\logstash-7.2.0\samples\test_10.log"
    start_position => "beginning"
    ignore_older => 0
  }
}
filter {
}
output {
  stdout { codec => rubydebug }
}
I ran Logstash in debug mode after deleting the sincedb file from its default location. I keep getting the following messages in the log:
[2019-07-30T21:59:19,509][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-07-30T21:59:19,722][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-07-30T21:59:19,723][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2019-07-30T21:59:21,970][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
Thanks to both of you. I implemented both suggestions and am now getting the following:
[2019-07-31T09:46:54,623][TRACE][filewatch.sincedbcollection] associate: finding {"inode"=>"3054569709-95108-11403264", "path"=>"C:/logstash-7.2.0/samples/test_10.log"}
[2019-07-31T09:46:54,629][TRACE][filewatch.sincedbcollection] associate: unmatched
[2019-07-31T09:46:54,647][TRACE][filewatch.discoverer ] Discoverer discover_files: C:/logstash-7.2.0/samples/test_10.log: skipping because it was last modified more than 0.0 seconds ago
I am not able to understand the disconnect. I am deleting the sincedb file before every run.
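For what it's worth, the `skipping because it was last modified more than 0.0 seconds ago` trace points at the `ignore_older => 0` setting: on the file input, `ignore_older` is a number of seconds, so `0` makes every existing file count as "too old" and it is skipped. A minimal sketch of the input without that setting (the `sincedb_path => "NUL"` line is an assumption for Windows, to stop Logstash remembering read positions between runs instead of deleting the sincedb by hand):

```
input {
  file {
    path => "C:/logstash-7.2.0/samples/test_10.log"
    start_position => "beginning"
    # Assumption for Windows: writing the sincedb to the NUL device
    # discards read positions, so the file is re-read on every run.
    sincedb_path => "NUL"
    # ignore_older removed: with 0, any file modified more than
    # 0 seconds ago is skipped, which matches the trace message.
  }
}
output {
  stdout { codec => rubydebug }
}
```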