Logstash reading the input files but not parsing them

Logstash is reading the input files but it is not displaying anything on the output. No errors are generated either.

When I run it with --verbose it gives me the following output:

    Sending Logstash's logs to /usr/local/Cellar/logstash/6.2.4/libexec/logs which is now configured via log4j2.properties
    [2018-08-14T10:24:09,278][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/local/Cellar/logstash/6.2.4/libexec/modules/netflow/configuration"}
    [2018-08-14T10:24:09,305][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/local/Cellar/logstash/6.2.4/libexec/modules/fb_apache/configuration"}
    [2018-08-14T10:24:09,809][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
    [2018-08-14T10:24:10,692][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.4"}
    [2018-08-14T10:24:11,256][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
    [2018-08-14T10:24:28,028][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
    [2018-08-14T10:24:28,797][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2018-08-14T10:24:28,827][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
    [2018-08-14T10:24:29,172][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
    [2018-08-14T10:24:29,268][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
    [2018-08-14T10:24:29,274][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
    [2018-08-14T10:24:29,315][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
    [2018-08-14T10:24:29,343][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
    [2018-08-14T10:24:29,345][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
    [2018-08-14T10:24:29,356][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
    [2018-08-14T10:24:29,371][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
    [2018-08-14T10:24:29,372][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
    [2018-08-14T10:24:29,377][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost:9200"]}
    [2018-08-14T10:24:30,782][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x57ecfb15 run>"}
    [2018-08-14T10:24:31,070][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}

Nothing appears after that. When I run with --debug it gives me very long output, with the following as the most important lines:

    [2018-08-14T09:51:48,833][INFO ][logstash.agent           ] Pipelines running {:count=>1, :pipelines=>["main"]}
    [2018-08-14T09:51:49,044][DEBUG][logstash.inputs.file     ] _open_file: /Users/mine/Desktop/New_Data/input/log.txt: opening
    [2018-08-14T09:51:49,048][DEBUG][logstash.inputs.file     ] _open_file: /Users/mine/Desktop/New_Data/input/log2: opening
    [2018-08-14T09:51:49,065][DEBUG][logstash.inputs.file     ] /Users/mine/Desktop/New_Data/input/log.txt: initial create, no sincedb, seeking to end 239876960
    [2018-08-14T09:51:49,065][DEBUG][logstash.inputs.file     ] /Users/mine/Desktop/New_Data/input/log2: initial create, no sincedb, seeking to end 570524675
    [2018-08-14T09:51:49,113][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log2.txt: old size 0, new size 570524675
    [2018-08-14T09:51:49,122][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log.txt: old size 0, new size 239876960
    [2018-08-14T09:51:50,125][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log2.txt: old size 0, new size 570524675
    [2018-08-14T09:51:50,127][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log.txt: old size 0, new size 239876960
    [2018-08-14T09:51:51,133][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log.txt: old size 0, new size 239876960
    [2018-08-14T09:51:51,133][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log2.txt: old size 0, new size 570524675
    [2018-08-14T09:51:52,136][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log.txt: old size 0, new size 239876960
    [2018-08-14T09:51:52,136][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log2.txt: old size 0, new size 570524675
    [2018-08-14T09:51:53,143][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log2.txt: old size 0, new size 570524675
    [2018-08-14T09:51:53,143][DEBUG][logstash.inputs.file     ] each: file grew: /Users/mine/Desktop/New_Data/input/log.txt: old size 0, new size 239876960
    [2018-08-14T11:08:43,822][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>"main", :thread=>"#<Thread:0x20ebcb42 sleep>"}

And this output repeats again and again.

Here is link to my config file https://pastebin.com/pZTAzCFd

You have told it to read the file from the end, so it will only read new data appended to the file. If the "new size 570524675" numbers are not changing, then nothing is being appended.
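For reference, a minimal file input sketch that reads existing files from the beginning instead of tailing them (the path is taken from the debug log above; `sincedb_path => "/dev/null"` is a common trick to make reruns re-read the whole file, adjust to taste):

    input {
      file {
        path => "/Users/mine/Desktop/New_Data/input/*.txt"
        start_position => "beginning"   # read existing content, not just new appends
        sincedb_path => "/dev/null"     # don't persist offsets, so a rerun re-reads the file
      }
    }

Note that `start_position` only applies to files Logstash has not seen before; once an offset is recorded in the sincedb, it takes precedence.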

Telling it to ignore any file more than zero seconds old may not be a good idea.

I see, but getting rid of `ignore_older => 0` doesn't make any difference either.

Is the "new size" number changing?

I am not adding content to the file, but I am modifying it to change its last-modified date before running the program: just adding extra spaces and saving it, so that it looks like it was recently modified.

That will not work. Read the documentation. It tracks device, inode, and offset. It cares not one whit about file modification times, except for excluding files.
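In other words, to see events flow you have to actually grow the file past the recorded offset. A quick sketch of what "appending new data" means here (the path is a hypothetical stand-in for the real log file being tailed):

```shell
LOG=/tmp/demo-log.txt
: > "$LOG"                              # start with an empty file
printf 'new event %s\n' 1 2 >> "$LOG"   # append two new lines; >> grows the file in place
wc -l < "$LOG"                          # the file is now 2 lines larger than the tracked offset
```

Because `>>` appends rather than truncating, the inode stays the same and the size increases, which is exactly the condition the file input watches for.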


I finally managed to make it run, thanks for your help. I wrote a script to add more data to the log file and it is working now.
