File input processing question

I have a Logstash config file set up as shown below. It monitors a demo directory for a specific file pattern. It finds the log file and reads it, but once it has read it I can't ever convince it to re-read it from the beginning. Normally it's a good thing that it remembers, but I'm still working out bugs in my conf file and I need to reload the index from the beginning of the file from time to time. How can I convince it to read that input file from the beginning once again?!

I've tried the start_position => "beginning", but no luck...

Here is my config today:
input {
    # Log file
    file {
        path => "C:/logstash-1.5.3/demo/System*.log"
        #type => "sysout"
        start_position => "beginning"
    }
    # Standard input file
    #stdin { }
}

filter {
    if [path] =~ "SLIC" {
        mutate { replace => { "type" => "slic" } }
    } else {
        mutate { replace => { "type" => "sysout" } }
    }
    grok {
        match => [
            "message", "^\[%{DATESTAMP:tslice} GMT\] %{NUMBER} CurrencyExcha %{GREEDYDATA} (%{WORD:status}(\.)?)"
        ]
    }
    if "_grokparsefailure" in [tags] {
        drop { }
    }
}

output {
    elasticsearch {
        host => "localhost"
        index => "sprint"
    }
    stdout {
        codec => rubydebug
    }
}
This is a sincedb issue; there are a bunch of other threads on this, so just do a search on that and you should be good :slight_smile:
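In short: the file input records how far it has read each file in a sincedb file (by default hidden in the home directory of the user running Logstash), and start_position => "beginning" only applies to files Logstash has never seen before. While you're iterating on your config, one common workaround is to point sincedb_path at the null device so the read position is never persisted; a minimal sketch, assuming Windows (use "/dev/null" instead of "NUL" on Linux/macOS):

```
input {
    file {
        path => "C:/logstash-1.5.3/demo/System*.log"
        start_position => "beginning"
        # Discard the remembered read position on every restart,
        # so the file is re-read from the top each time.
        sincedb_path => "NUL"
    }
}
```

Alternatively, stop Logstash and delete the .sincedb* files from the relevant home directory before restarting; that has the same effect for a single run without changing the config.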

I have an open pull request that attempts to document the sincedb files and related matters a little better. Please feel free to comment if it isn't clear or if it doesn't answer your questions.
