Logstash is not outputting my file, with or without grok

Hello,

I'm trying to use a simple .txt file as an input, and I want a parsed .txt file as the output.
In between, I'm using a grok filter to match a specific pattern.
This is a simple test, and it doesn't work:

My.conf :

input {
  file {
    path => "C:/Users/x/Desktop/logstash-7.2.0/bin/test.txt"
    start_position => "beginning"
    #sincedb_path => "NUL"
  }
}

filter {
  grok {
    match => {
      "message" => "MIAM(%{GREEDYDATA:test_filter})"
    }
    remove_field => "message"
  }
}

output {
  file {
    path => "C:/Users/x/Desktop/logstash-7.2.0/bin/test3.txt"
  }
  stdout { codec => rubydebug }
}
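For reference, the grok match above is essentially a regex: the literal text MIAM followed by a greedy capture (GREEDYDATA maps to `.*`). A rough Python sketch of what it should extract (the parentheses are escaped here so they match the literal characters in the line):

```python
import re

# Rough equivalent of the grok pattern MIAM(%{GREEDYDATA:test_filter}).
# GREEDYDATA is essentially .*; the parentheses are escaped so they
# match the literal ( and ) surrounding the captured value.
pattern = re.compile(r"MIAM\((?P<test_filter>.*)\)")

m = pattern.match("MIAM(DONE)")
print(m.group("test_filter"))  # -> DONE
```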

Started with: logstash -f my.conf

C:/Users/x/Desktop/logstash-7.2.0/bin/test.txt content :
MIAM(DONE)

C:/Users/x/Desktop/logstash-7.2.0/bin/test3.txt is not created at all.

An interesting fact: if I use input { stdin {} } instead of the file input, everything works fine. I just have to type MIAM(test) in the console, and it outputs the expected result, with the expected file test3.txt containing test_filter : DONE. So I think the problem comes from the input.

OS : Windows 10 x64

Output :

Sending Logstash logs to C:/Users/x/Desktop/logstash-7.2.0/logs which is now configured via log4j2.properties
[2019-07-09T17:09:38,774][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2019-07-09T17:09:38,787][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.2.0"}
[2019-07-09T17:09:45,999][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge] A gauge metric of an unknown type (org.jruby.RubyArray) has been create for key: cluster_uuids. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2019-07-09T17:09:46,004][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, :thread=>"#<Thread:0x2218960d run>"}
[2019-07-09T17:09:46,777][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}
[2019-07-09T17:09:46,833][INFO ][filewatch.observingtail ] START, creating Discoverer, Watch with file and sincedb collections
[2019-07-09T17:09:46,836][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-07-09T17:09:47,393][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Thanks :elasticheart: :arrow_backward:

start_position => "beginning" only does anything the first time the file input sees a file. Thereafter it checks the sincedb to see how much of the file it has already read.

Do you get events if you append lines to the file?


It works great when I modify it while Logstash is running ! Thanks.
I was expecting it to read and parse the file right after I started Logstash.
Is there a way to get that kind of behavior?

What I read on the doc is :

Choose where Logstash starts initially reading files: at the beginning or at the end. The default behavior treats files like live streams and thus starts at the end. If you have old data you want to import, set this to `beginning`. (About start_position)

Why do I have to "reload" the log file, or refresh its content, while Logstash is running in order to get my output? Why can't it do that at start?

To be honest, this is my first five hours on the Elastic Stack, I may have misunderstood/misread something.

If you uncomment the sincedb_path => "NUL" line, then logstash will read the file from the beginning every time it starts.

Without that option, logstash reads the file from the beginning the first time it sees it, and persists how much of the file it has read in the sincedb. When it restarts, it starts tailing the file from that point.

If there is some problem with your output the first time it runs, it still persists the fact that it has read the file, so when you fix the output it does not re-read it.
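Putting that together, the input block that re-reads the file on every start would look like this (same paths as in the original config, with the sincedb_path line uncommented; "NUL" is the Windows null device, so the sincedb is discarded):

```
input {
  file {
    path => "C:/Users/x/Desktop/logstash-7.2.0/bin/test.txt"
    start_position => "beginning"
    sincedb_path => "NUL"
  }
}
```

Note that with this setting, every restart re-emits the whole file, so it is suited to testing rather than production tailing.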


Thanks Badger :wink:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.