Logstash file input not working for a large file

Hello,
I am using ELK (v5.6.9-1). I have a 1.3 GB file containing debug logs. When I take a small snippet of the file (e.g. a file with 10 lines of the original), Logstash correctly reads it and indexes it into Elasticsearch, as expected. But when I point the logstash file input at the whole file (as it is, without trimming), it does nothing, and the debug logs show something like:

[2018-05-27T10:27:27,526][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>24, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>3000}
[2018-05-27T10:27:27,723][INFO ][logstash.pipeline        ] Pipeline main started
[2018-05-27T10:27:27,733][DEBUG][logstash.agent           ] Starting puma
[2018-05-27T10:27:27,733][DEBUG][logstash.inputs.file     ] _globbed_files: /root/Downloads/debug: glob is: ["/root/Downloads/debug"]
[2018-05-27T10:27:27,735][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2018-05-27T10:27:27,735][DEBUG][logstash.inputs.file     ] _discover_file: /root/Downloads/debug: new: /root/Downloads/debug (exclude is [])
[2018-05-27T10:27:27,736][DEBUG][logstash.api.service     ] [api-service] start
[2018-05-27T10:27:27,736][DEBUG][logstash.inputs.file     ] _open_file: /root/Downloads/debug: opening
[2018-05-27T10:27:27,737][DEBUG][logstash.inputs.file     ] /root/Downloads/debug: sincedb last value 1324520770, cur size 1324520770
[2018-05-27T10:27:27,737][DEBUG][logstash.inputs.file     ] /root/Downloads/debug: sincedb: seeking to 1324520770
[2018-05-27T10:27:27,758][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-05-27T10:27:32,725][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2018-05-27T10:27:37,725][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2018-05-27T10:27:41,752][DEBUG][logstash.inputs.file     ] _globbed_files: /root/Downloads/debug: glob is: ["/root/Downloads/debug"]
[2018-05-27T10:27:42,726][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2018-05-27T10:27:47,726][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2018-05-27T10:27:52,727][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2018-05-27T10:27:56,764][DEBUG][logstash.inputs.file     ] _globbed_files: /root/Downloads/debug: glob is: ["/root/Downloads/debug"]
[2018-05-27T10:27:57,728][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2018-05-27T10:28:02,728][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
[2018-05-27T10:28:07,728][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline
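
Note the two sincedb lines above: "sincedb last value 1324520770, cur size 1324520770" means the file input believes it has already read the file to its end (the recorded offset equals the current file size), so it seeks to EOF and simply waits for new lines instead of re-reading. If the goal is to re-ingest the whole file, one option (a sketch — `sincedb_path` is a standard file-input setting, but pointing it at /dev/null is only suitable for testing, since it disables resume tracking across restarts) would be:

```
input {
  file {
    path => "/root/Downloads/debug"
    type => "juniper_poc"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # testing only: forget the read position on restart
  }
}
```

Alternatively, stopping Logstash and deleting the corresponding sincedb file under Logstash's data directory has the same effect for a single run.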

My Logstash configuration is:

input {
  file {
    path => "/root/Downloads/debug"
    type => "juniper_poc"
    start_position => "beginning"
  }
}

filter {
  if [type] == "juniper_poc" {
    grok {
      patterns_dir => "./patterns"
      match => { "message" => [ "%{DATESTAMP:log_date} \(%{NUMBER:pid}\): %{GREEDYDATA:message}" ] }
      overwrite => ["message"]
    }
    date {
      timezone => "Asia/Karachi"
      match => [ "log_date", "MM/dd/yy HH:mm:ss.SSS", "ISO8601" ]
    }
    mutate {
      remove_field => ["@version", "log_date", "path"]
    }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch { hosts => "http://192.168.100.10:9200" }
}
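
To rule out the grok filter as the cause, the pattern can be checked against a sample line outside Logstash. A minimal Python sketch, using simplified regex approximations of the `DATESTAMP`, `NUMBER`, and `GREEDYDATA` grok patterns (assumptions, not the exact grok definitions) and a hypothetical sample log line:

```python
import re

# Rough regex stand-ins for the grok patterns used in the filter;
# these are simplified assumptions, not the exact grok definitions.
DATESTAMP = r"(?P<log_date>\d{2}/\d{2}/\d{2} \d{2}:\d{2}:\d{2}\.\d{3})"
NUMBER = r"(?P<pid>\d+)"
GREEDYDATA = r"(?P<message>.*)"

# Mirrors: %{DATESTAMP:log_date} \(%{NUMBER:pid}\): %{GREEDYDATA:message}
PATTERN = re.compile(rf"{DATESTAMP} \({NUMBER}\): {GREEDYDATA}")

sample = "05/27/18 10:27:27.526 (1234): device rebooted"  # hypothetical line
m = PATTERN.match(sample)
if m:
    print(m.group("log_date"), m.group("pid"), m.group("message"))
    # → 05/27/18 10:27:27.526 1234 device rebooted
```

If a real line from the debug file does not match here, it would also fail in grok and the event would get a `_grokparsefailure` tag rather than being silently dropped, so this check only isolates one variable.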
