Good morning,
I have a process on my server, where Logstash is installed, that generates data files in an output folder. I have configured my conf file so that Logstash 7 collects these files as soon as they are published and pushes their contents into an Elasticsearch server.
Below is a very simple implementation of this setup:
pipelines.yml:
- pipeline.id: test
  path.config: "C:\\logstash-7.0.0\\config\\conf\\test.conf"
test.conf:
input {
  file {
    mode => "read"
    path => "C:/output/test.txt"
    file_completed_action => "delete"
    sincedb_path => "NUL"
  }
}
output {
  stdout {}
}
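For reference, this is roughly how I create the test file (sketched here in POSIX shell with a placeholder /tmp path; on the server it is C:/output/test.txt). The trailing newline matters, since the file input splits events on "\n" by default:

```shell
# Sketch of creating the test file (placeholder path; the real one
# is C:/output/test.txt). The trailing \n lets the file input's
# default "\n" delimiter find a complete line.
printf 'CONTENT TEST\n' > /tmp/test.txt
cat /tmp/test.txt
```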
When I start Logstash and then create a test.txt file, it is successfully read. But if I create it once again, it is never read.
I tried specifying a real sincedb path and then deleting the file's content to force the process to read my file again, but that doesn't work either.
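What I tried, roughly, sketched in POSIX shell with a hypothetical sincedb path (the entry format shown is only an illustration): stop Logstash, empty the sincedb file, and restart:

```shell
# Hypothetical sincedb location used for this test (not my real path).
SINCEDB=/tmp/sincedb-test
printf 'example-entry\n' > "$SINCEDB"  # pretend Logstash wrote an entry here
: > "$SINCEDB"                         # empty the file's content, as described above
test -s "$SINCEDB" || echo "sincedb is now empty"
```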
Logs:
[2019-04-12T09:46:14,623][DEBUG][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"test"}
[2019-04-12T09:46:14,645][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"test", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x118fa46a run>"}
[2019-04-12T09:46:15,296][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"test"}
[2019-04-12T09:46:15,342][DEBUG][logstash.javapipeline ] Pipeline started successfully {:pipeline_id=>"test", :thread=>"#<Thread:0x118fa46a run>"}
[2019-04-12T09:46:15,343][DEBUG][org.logstash.execution.PeriodicFlush] Pushing flush onto pipeline.
[2019-04-12T09:46:15,453][INFO ][filewatch.observingread ] START, creating Discoverer, Watch with file and sincedb collections
[2019-04-12T09:46:15,550][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:test], :non_running_pipelines=>[]}
[2019-04-12T09:46:15,596][DEBUG][logstash.agent ] Starting puma
[2019-04-12T09:46:15,649][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2019-04-12T09:46:14,743][DEBUG][org.logstash.config.ir.CompiledPipeline] Compiled output
P[output-stdout{}|[str]pipeline:11:5:```
stdout {}
```]
into
org.logstash.config.ir.compiler.ComputeStepSyntaxElement@c96a6eaa
[2019-04-12T09:46:15,873][DEBUG][logstash.api.service ] [api-service] start
[2019-04-12T09:46:16,306][INFO ][filewatch.readmode.handlers.readfile] buffer_extract: a delimiter can't be found in current chunk, maybe there are no more delimiters or the delimiter is incorrect or the text before the delimiter, a 'line', is very large, if this message is logged often try increasing the `file_chunk_size` setting. {"delimiter"=>"\n", "read_position"=>0, "bytes_read_count"=>12, "last_known_file_size"=>12, "file_path"=>"C:/output/test.txt"}
[2019-04-12T09:46:16,322][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-04-12T09:46:16,368][DEBUG][filewatch.sincedbcollection] writing sincedb (delta since last write = 1555055176)
[2019-04-12T09:46:16,415][DEBUG][logstash.inputs.file ] Received line {:path=>"C:/output/test.txt", :text=>"CONTENT TEST"}
And then it repeatedly shows these logs:
[2019-04-12T09:46:28,203][DEBUG][logstash.instrument.periodicpoller.cgroup] One or more required cgroup files or directories not found: /proc/self/cgroup, /sys/fs/cgroup/cpuacct, /sys/fs/cgroup/cpu
[2019-04-12T09:46:28,528][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2019-04-12T09:46:28,529][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
Thanks for your help.