Logstash starts successfully but no result

I really need help importing my log files.
If I create a file right now, it is imported successfully, but if I try to import a file that has already existed for about 11 hours, Logstash shows no output!
I get this:

[INFO ] 2020-06-18 09:45:24.043 [Converge PipelineAction::Create<main>] pipeline - Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x698e3f67 run>"}
[INFO ] 2020-06-18 09:45:24.097 [[main]<file] observingtail - START, creating Discoverer, Watch with file and sincedb collections
[INFO ] 2020-06-18 09:45:24.114 [Agent thread] agent - Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[INFO ] 2020-06-18 09:45:24.588 [Api Webserver] agent - Successfully started Logstash API endpoint {:port=>9600}

and my config file is:

        file {
            path => ["/home/...../Gest/*"]
            start_position => "beginning"
            ignore_older => 86400
            codec => multiline {
                charset => "BINARY"
                pattern => "^%{YEAR}%{MONTHDAY}%{MONTHNUM2}"
                negate => true
                what => "previous"
            }
        }


Maybe the file was already read before and won't give any new results. Add the line below to test that.

sincedb_path => "/dev/null"

Or you can just add a line entry in the file to see if it just reads that single one.

I tried this, but I get the same result:

        file {
            path => ["/home/user_ftp/..../*.txt"]
            start_position => "beginning"
            ignore_older => 0
            sincedb_path => "/dev/null"
            codec => multiline {
                charset => "BINARY"
                pattern => "^%{YEAR}%{MONTHDAY}%{MONTHNUM2}"
                negate => true
                what => "previous"
            }
        }

I just want to point out that this configuration works with the test files I create myself, but I never managed to import the files already in the folder.

What happens if you remove the multiline codec and test again?

I've commented out every line of the codec, but there's still no output:

        #codec => multiline {
        #    charset => "BINARY"
        #    pattern => "^%{YEAR}%{MONTHDAY}%{MONTHNUM2}"
        #    negate => true
        #    what => previous
        #}

In filebeat, setting ignore_older to zero tells filebeat not to filter files based on age.

In a logstash file input, setting ignore_older to zero tells the input to ignore any files more than zero seconds old, which is every file.
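The contrast, sketched with placeholder paths (not the paths from this thread):

```
# Filebeat (filebeat.yml): 0 disables the age filter entirely.
filebeat.inputs:
  - type: log
    paths: ["/var/log/app/*.log"]
    ignore_older: 0          # filebeat: no age limit

# Logstash file input: the value is an age in seconds, so 0 means
# "ignore anything older than 0 seconds", i.e. every existing file.
input {
  file {
    path => ["/var/log/app/*.log"]
    start_position => "beginning"
    # ignore_older => 0     # would skip everything; omit it instead
  }
}
```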


I think it works now, but it was just a test (I left some lines commented out), so I need to import it again. Do I need to change sincedb_path?

If you want to re-read files, you could stop Logstash and remove entries from the sincedb, or point sincedb_path to a new location to start over completely.
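As a sketch of that reset, with a temporary directory standing in for the real sincedb location (on a packaged install the sincedb files usually live under Logstash's data directory, e.g. `<path.data>/plugins/inputs/file`, but that path is an assumption; check your own `path.data`):

```shell
# Simulate clearing the file input's sincedb state in a throwaway directory.
# Against a real install you would first run: systemctl stop logstash
sincedb_dir=$(mktemp -d)
touch "$sincedb_dir/.sincedb_2f4e25"     # pretend this is the tracking file

rm -f "$sincedb_dir"/.sincedb_*          # drop all recorded read positions
ls -A "$sincedb_dir" | wc -l             # 0: nothing tracked any more
```

On the next start, every matching file is treated as new and read from start_position again.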


Is there another way to stop Logstash other than Ctrl+C?


After the last line, Logstash is still waiting, so I can't execute systemctl stop logstash:

      "@version" => "1",
          "path" => "/home/.../.txt",
           "seq" => 374167,
         "count" => "1",
     "timestamp" => "20201806\t00:07:09.993",
    "@timestamp" => 2020-06-18T00:07:09.993Z,
          "data" => "MPC CarteAbsente A:3,I:1,P:0,R:0,E:",
       "message" => "20201806\t00:07:09.993\tMPC CarteAbsente A:3,I:1,P:0,R:0,E:1\\n"

My problem is not really solved:
I can retrieve yesterday's logs with ignore_older => 86400, but with ignore_older => 7200 there is no output!
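That matches ignore_older being a plain modification-age cutoff in seconds. A minimal sketch of the comparison (GNU coreutils assumed; the file here is a throwaway stand-in, not one of the real logs):

```shell
# Create a stand-in log file whose mtime is three hours in the past.
logfile=$(mktemp)
touch -d '3 hours ago' "$logfile"

# Age in seconds since last modification -- the value compared to ignore_older.
age=$(( $(date +%s) - $(stat -c %Y "$logfile") ))
echo "$age"    # about 10800

# age > 7200  -> skipped when ignore_older => 7200
# age < 86400 -> read when ignore_older => 86400
```

So a log written early in the morning is already past a 7200-second cutoff by the time it is tested hours later.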

Have you removed the ignore_older parameter and tried it? It doesn't sound like you even need it.

I have a lot of files, and I only need the log that was generated today, that's why I'm using ignore_older

Sometimes I succeed in importing the file and other times not.
Sometimes I change the path or ignore_older.
But I don't really understand this situation!

Is it an option for you to create a folder that only contains the files you want imported, and then remove the ignore_older parameter? Every file currently in that folder, and all new ones, would get processed.

I don't understand what's going on either.


You can reload the Logstash configuration by sending it a SIGHUP signal, and since you specified reading lines from the beginning, it should ingest the file again.

To do so, you can just send kill -1 to Logstash's PID, for instance:
ps -ef |grep [l]ogstash | awk -F' ' '{print $2}' | xargs -I {} kill -1 {}
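A sandbox-safe sketch of what that pipeline relies on: kill -1 sends SIGHUP, which the target process can trap to trigger its reload logic. Here a toy background job stands in for Logstash:

```shell
# Toy stand-in for logstash: traps SIGHUP and reports a "reload".
out=$(
  ( trap 'echo reloaded; exit 0' HUP; sleep 30 >/dev/null & wait ) &
  pid=$!
  sleep 0.5
  kill -1 "$pid"    # same signal the ps | awk | xargs pipeline delivers
  wait "$pid"
)
echo "$out"         # prints: reloaded
```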

If that's not enough, you should delete the sincedb files used to track the reader's position in each file, e.g. by locating them with find and removing them.


I didn't think about moving the already-imported files to another directory; I thought I could handle it with ignore_older.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.