Logstash service discover_files {:count=>0}

hello,

I have an issue I can't figure out. I am also not sure how Logstash's sincedb_path works, so if someone could help me understand that as well, I'd highly appreciate it.

I am running the ELK stack on a Red Hat Linux virtual machine. If I run Logstash from the terminal with:

sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/new.conf

everything works and I even see the logs in Kibana, which is awesome, by the way.

However, if I try to run it as a service, everything stops here (running with the info log level):

[2022-10-11T22:42:26,668][INFO ][filewatch.observingtail  ][main][08eaaedb6ad66113aed8069a590d6833a9770ddcf4e214fd8b2500e546978c6e] START, creating Discoverer, Watch with file and sincedb collections
[2022-10-11T22:42:26,681][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
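(For reference, I start and inspect the service with the usual systemd commands, roughly:)

sudo systemctl start logstash
sudo systemctl status logstash
sudo journalctl -u logstash -f    # follow the service's log output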

Running with the trace log level gives me this:

[2022-10-11T23:16:07,000][TRACE][filewatch.discoverer     ][main][08eaaedb6ad66113aed8069a590d6833a9770ddcf4e214fd8b2500e546978c6e] discover_files {:count=>0}
[2022-10-11T23:16:07,755][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-10-11T23:16:08,001][DEBUG][filewatch.sincedbcollection][main][08eaaedb6ad66113aed8069a590d6833a9770ddcf4e214fd8b2500e546978c6e] writing sincedb (delta since last write = 15)
[2022-10-11T23:16:08,001][TRACE][filewatch.sincedbcollection][main][08eaaedb6ad66113aed8069a590d6833a9770ddcf4e214fd8b2500e546978c6e] sincedb_write: /dev/null (time = 2022-10-11 23:16:08 +0300)
[2022-10-11T23:16:08,001][TRACE][filewatch.sincedbcollection][main][08eaaedb6ad66113aed8069a590d6833a9770ddcf4e214fd8b2500e546978c6e] non_atomic_write:  {:time=>2022-10-11 23:16:08 +0300}
[2022-10-11T23:16:09,888][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-10-11T23:16:09,888][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-10-11T23:16:12,755][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-10-11T23:16:14,892][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-10-11T23:16:14,892][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-10-11T23:16:17,755][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-10-11T23:16:19,896][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-10-11T23:16:19,896][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-10-11T23:16:22,005][TRACE][filewatch.discoverer     ][main][08eaaedb6ad66113aed8069a590d6833a9770ddcf4e214fd8b2500e546978c6e] discover_files {:count=>0}
[2022-10-11T23:16:22,755][DEBUG][org.logstash.execution.PeriodicFlush][main] Pushing flush onto pipeline.
[2022-10-11T23:16:23,006][DEBUG][filewatch.sincedbcollection][main][08eaaedb6ad66113aed8069a590d6833a9770ddcf4e214fd8b2500e546978c6e] writing sincedb (delta since last write = 15)
[2022-10-11T23:16:23,006][TRACE][filewatch.sincedbcollection][main][08eaaedb6ad66113aed8069a590d6833a9770ddcf4e214fd8b2500e546978c6e] sincedb_write: /dev/null (time = 2022-10-11 23:16:23 +0300)

which I guess means that it can't find my JSON file. Also, the
"writing sincedb (delta since last write = 15)" value never
changes.

I have tried changing the ownership of the files to the logstash user, but still no luck.
Maybe someone has had a similar issue and could help me out.

Thanks in advance.

M.

You need to share your configuration.

Also, share the pipelines.yml file.

new.conf:

input {
  file {
    path => "/home/mariussur/logstash/c.json"
    codec => "json"
    sincedb_path => "/dev/null"
    start_position => "beginning"
  }
}

filter {}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["http://***.**.*.**:9200/"]
    user => "*******"
    password => "*******"
    index => "some-tag"
    action => "create"
  }
}

pipelines.yml:

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"

When you run Logstash as a service, it runs under the logstash user, which does not have permission to read this file.

Changing the permissions on c.json alone will not solve the issue, because the logstash user would still lack permission to traverse the path leading to it, and I do not recommend changing the permissions of a user's home directory.

You should put the file in another path that the logstash user has permission to read.
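For example, something along these lines should show whether the logstash user can actually read the file, and give it a location it can reach (the /var/lib/logstash-input directory is just an illustration, use whatever path fits your setup):

# check whether the logstash service user can read the file and its parent directories
sudo -u logstash head -n 1 /home/mariussur/logstash/c.json

# put the file somewhere the logstash user can reach, e.g. a dedicated directory
sudo mkdir -p /var/lib/logstash-input
sudo cp /home/mariussur/logstash/c.json /var/lib/logstash-input/
sudo chown -R logstash:logstash /var/lib/logstash-input

Then update the path => setting in new.conf to point at the new location and restart the service.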

Brilliant, thank you so much.

Maybe you could help me understand how sincedb works as well?
I have tried adding new lines to the c.json file while the service is running, but
it does not detect the change.

[2022-10-12T09:32:21,829][TRACE][filewatch.tailmode.processor][main][bae877d8831ddf84ad30d1d61739ea69f9829aa6d10cdae2e5542394d7ed1f4c] process_restat_for_watched_and_active
[2022-10-12T09:32:21,830][TRACE][filewatch.tailmode.processor][main][bae877d8831ddf84ad30d1d61739ea69f9829aa6d10cdae2e5542394d7ed1f4c] process_rotation_in_progress
[2022-10-12T09:32:21,830][TRACE][filewatch.tailmode.processor][main][bae877d8831ddf84ad30d1d61739ea69f9829aa6d10cdae2e5542394d7ed1f4c] process_watched
[2022-10-12T09:32:21,830][TRACE][filewatch.tailmode.processor][main][bae877d8831ddf84ad30d1d61739ea69f9829aa6d10cdae2e5542394d7ed1f4c] process_active
[2022-10-12T09:32:21,830][TRACE][filewatch.tailmode.processor][main][bae877d8831ddf84ad30d1d61739ea69f9829aa6d10cdae2e5542394d7ed1f4c] process_active no change {:path=>"c.json"}

I already figured out that you have to truncate c.json before adding new data; then it works as expected.
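In case it helps anyone else: if the goal is to re-ingest the whole JSON file whenever it is rewritten, rather than to tail appended lines, the file input's read mode might be worth a look. A minimal sketch (the completed-log path below is just an example, not something from my setup):

input {
  file {
    path => "/home/mariussur/logstash/c.json"
    mode => "read"                                # read the whole file, then mark it as done
    file_completed_action => "log"                # log completed files instead of deleting them
    file_completed_log_path => "/tmp/logstash_completed.log"
    codec => "json"
    sincedb_path => "/dev/null"                   # do not remember positions across restarts
  }
}

In read mode the file is ingested as one unit instead of being tailed line by line, which seems closer to what I was trying to do.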
