Logstash not showing generated output

The pipeline file name is newstash.conf

        path => [ "/etc/logstash/sample.txt" ]

    output {

The sample.txt file contains only a plain string.

This is all the output I get — no events are ever printed:

# logstash --path.settings=/etc/logstash -f /etc/logstash/newstash.conf
Using bundled JDK: /usr/share/logstash/jdk
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
Sending Logstash logs to /var/log/logstash which is now configured via log4j2.properties
[2021-06-02T00:26:32,912][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2021-06-02T00:26:32,929][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.13.0", "jruby.version"=>"jruby (2.5.7) 2021-03-03 f82228dc32 OpenJDK 64-Bit Server VM 11.0.10+9 on 11.0.10+9 +indy +jit [linux-x86_64]"}
[2021-06-02T00:26:33,487][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2021-06-02T00:26:35,306][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2021-06-02T00:26:36,054][INFO ][org.reflections.Reflections] Reflections took 39 ms to scan 1 urls, producing 24 keys and 48 values
[2021-06-02T00:26:37,063][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/newstash.conf"], :thread=>"#<Thread:0x13b75cb9 run>"}
[2021-06-02T00:26:37,863][INFO ][logstash.javapipeline    ][main] Pipeline Java execution initialization time {"seconds"=>0.8}
[2021-06-02T00:26:38,123][INFO ][logstash.inputs.file     ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/etc/logstash/plugins/inputs/file/.sincedb_62f3cbc77d9db9fa307e825263c60826", :path=>["/etc/logstash/sample.txt"]}
[2021-06-02T00:26:38,140][INFO ][logstash.javapipeline    ][main] Pipeline started {"pipeline.id"=>"main"}
[2021-06-02T00:26:38,204][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2021-06-02T00:26:38,278][INFO ][filewatch.observingtail  ][main][f880caa4fd4eaba258b92e6df7499886dc59ac0d1916c75599a5e498bc7c20fc] START, creating Discoverer, Watch with file and sincedb collections


I think it's just the Logstash sincedb caching tricking you. Add this to your file directive:

file {
    sincedb_path => "/dev/null"
    start_position => "beginning"
}

Maybe take a look at this: File input plugin | Logstash Reference [7.13] | Elastic
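For context, a complete minimal newstash.conf incorporating that fix might look like the sketch below. Only the `path` line and the `output {` opener were posted, so the stdout output section here is an assumption — adjust it to whatever output you actually use:

```
input {
  file {
    path => [ "/etc/logstash/sample.txt" ]
    start_position => "beginning"   # read existing content from the top of the file
    sincedb_path => "/dev/null"     # don't persist read positions, so the file is re-read on every run
  }
}

output {
  stdout { codec => rubydebug }     # assumed output for testing: print each event to the console
}
```

Note that `sincedb_path => "/dev/null"` is handy while testing, but in production you would normally keep the default sincedb so Logstash does not re-ingest the whole file after every restart.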

Thank you very much @grumo35.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.