I am trying to set up a pipeline that reads an entire file every time Logstash is started and writes it to stdout. I set start_position to beginning and sincedb_path to /dev/null. My understanding is that this should be equivalent to telling Logstash that it has "always" never seen the file before, so it should process the file from the start each time it is "seen" for the first time — which should be every time in this case. Here is my config:
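To make sure I understand the mechanism, here is a loose sketch of it in Python (my own analogy, not Logstash's actual implementation): the sincedb stores the last byte offset read, and pointing it at /dev/null means every write is discarded and every read comes back empty, so the offset is always 0:

```python
import os

def read_new_lines(path, sincedb_path):
    """Loosely mimic the Logstash file input: resume from the byte
    offset stored in the sincedb, or from 0 if none is stored."""
    try:
        with open(sincedb_path) as f:
            offset = int(f.read().strip() or 0)
    except (FileNotFoundError, ValueError):
        offset = 0  # no sincedb yet -> like start_position => beginning

    with open(path) as f:
        f.seek(offset)
        lines = f.readlines()
        new_offset = f.tell()

    # With sincedb_path => /dev/null this write is silently discarded,
    # so the next run starts from offset 0 again.
    with open(sincedb_path, "w") as f:
        f.write(str(new_offset))

    return lines
```

With a real sincedb file, a second run returns nothing; with os.devnull, every run re-reads the whole file, which is the behavior I'm after.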
input {
  file {
    path => [ '/home/cpattonj/deduping/test.log' ]
    codec => 'json_lines'
    start_position => 'beginning'
    sincedb_path => '/dev/null'
    type => 'testing'
  }
}
filter {
  date {
    match => ['timestamp', 'ISO8601']
  }
}
output {
  stdout {
  }
}
and here is the test.log (short and sweet, just for testing):
{"count":10, "timestamp":"2015-09-22T00:00:00.000Z"}
{"count":11, "timestamp":"2015-09-22T00:00:00.001Z"}
{"count":12, "timestamp":"2015-09-22T00:00:00.002Z"}
{"count":13, "timestamp":"2015-09-22T00:00:00.003Z"}
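As a sanity check outside Logstash (plain Python, not part of the pipeline), each line parses as standalone JSON and the timestamps match the ISO8601 pattern the date filter expects:

```python
import json
from datetime import datetime

lines = [
    '{"count":10, "timestamp":"2015-09-22T00:00:00.000Z"}',
    '{"count":11, "timestamp":"2015-09-22T00:00:00.001Z"}',
]
for line in lines:
    event = json.loads(line)  # raises ValueError on malformed JSON
    ts = datetime.strptime(event["timestamp"], "%Y-%m-%dT%H:%M:%S.%fZ")
    print(event["count"], ts.isoformat())
```

So the input data itself looks clean.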
So far the file input behaves as expected: every time I start Logstash, the debug output reports that it has received each line of my test log file.
...
Received line {:path=>"/home/cpattonj/deduping/test.log", :text=>"{\"count\":13, \"timestamp\":\"2015-09-22T00:00:00.003Z\"}", :level=>:debug, :file=>"logstash/inputs/file.rb", :line=>"134"}
writing sincedb (delta since last write = 1443014785) {:level=>:debug, :file=>"filewatch/tail.rb", :line=>"177"}
...
However, no matter what kind of output I use, I get nothing out of Logstash. I would think that with stdout, at the very least, I would see the raw log lines. Any pointers?
EDIT: in case it matters, I have a separate Logstash agent running in the background in addition to the one I'm running as part of this test. AFAIK that shouldn't be an issue?