File input plugin under Windows is continuously looping through the file


(discuss03) #1

I am trying to use Logstash under Windows (don't ask)... and everything seems fine: my Logstash configuration, the file input definition, the grok filters, and my outputs to debug and Elasticsearch.

To test, I used a small 10-line file (syslog format) and noticed that the Logstash agent is continuously re-reading the file and re-inserting the same lines over and over, so the number of entries keeps multiplying. At this point, after 10 minutes, I have over 423 docs (I started from 0) in my current logstash-* index.

This is a test environment, so nobody else is pushing data into Elasticsearch but me... and only one Logstash instance is running.

Here is my definition:

input {
  file {
    path => "e:/logs/*.txt"
    type => "syslog"
    #delimiter => "\n\r"
    #start_position => "beginning"
  }
}

filter {
  grok {
    patterns_dir => "c:/logstash/grok_patterns"
    match => { "message" => "%{DATE_SYSLOG:syslog_timestamp} %{DATA:syslog_category} %{IPORHOST:host} %{GREEDYDATA:message}" }
    overwrite => [ "message", "host" ]
    add_tag => [ "network" ]
    tag_on_failure => [ "BadBadSucks" ]
  }
  grok {
    patterns_dir => "c:/logstash/grok_patterns"
    match => { "message" => "%{DATE_FIREWALL:firewall_timestamp}: %{DATA:fw_event} %{GREEDYDATA:message}" }
    add_tag => [ "%{fw_event}", "firewall" ]
    tag_on_failure => [ "BadBadSucks2" ]
  }
  date {
    match => [ "syslog_timestamp", "YYYY-MM-dd HH:mm:ss" ]
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    host => "127.0.0.1"
  }
}
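For context on the input block above: the file input remembers how far it has read each file in a "sincedb" file, and on Windows a common suggestion is to pin that location explicitly so positions persist between runs. A sketch of the same input with an explicit sincedb_path (the path here is hypothetical, not something from my setup):

```
input {
  file {
    path => "e:/logs/*.txt"
    type => "syslog"
    # Hypothetical path: persist read positions explicitly on Windows
    sincedb_path => "c:/logstash/sincedb"
    # Only read existing content from the top on first discovery
    start_position => "beginning"
  }
}
```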

Any help is greatly appreciated.

Thanks.

