Updating data doubles and then triples the hit count in Kibana

I am trying to parse logs. Whenever new lines are written to the input file, Logstash re-reads the whole log file and appends the events to the existing parsed logs, duplicating them. My configuration is below; please guide me on what I am doing wrong.

**Configuration**

```
input {
  file {
    path => "/home/elk/Desktop/policypermit.log"
    start_position => "beginning"
    type => "ppermit"
    sincedb_path => "/dev/null"
  }
  file {
    path => "/home/elk/Desktop/policydeny.log"
    type => "pdeny"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
  if [type] == "pdeny" {
    grok {
      match => [ "message", "\ USG6300 %%01POLICY/6/POLICYDENY\(l\):\ vsys=public, protocol=%{WORD:pnum}, source-ip=%{IP:src}, source-port=%{WORD:port}, destination-ip=%{IP:dest}, destination-port=%{WORD:destp}, time=%{GREEDYDATA:timestamp}, source-zone=%{WORD:szone}, destination-zone=%{WORD:dzone}, rule-name=%{GREEDYDATA:rname}" ]
    }
    mutate {
      remove_field => [ "message" ]
    }
  }
  else if [type] == "ppermit" {
    grok {
      match => [ "message", "\ USG6300 %%01POLICY/6/POLICYPERMIT\(l\):\ vsys=public, protocol=%{WORD:pnum}, source-ip=%{IP:src}, source-port=%{WORD:port}, destination-ip=%{IP:dest}, destination-port=%{WORD:destp}, time=%{GREEDYDATA:timestamp}, source-zone=%{WORD:szone}, destination-zone=%{WORD:dzone}, rule-name=%{GREEDYDATA:rname}" ]
    }
    mutate {
      remove_field => [ "message" ]
    }
  }
}

output {
  if [type] == "pdeny" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "alog"
    }
  }
  else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "blog"
    }
  }
  stdout { codec => rubydebug }
}
```

Note: the parentheses in `POLICYDENY(l):` and `POLICYPERMIT(l):` are escaped as `\(l\)` above; unescaped, grok treats them as a regex capture group rather than literal characters.

Setting `sincedb_path => "/dev/null"` means that Logstash will not keep track of what it has read across restarts, so every restart re-reads the files from the beginning. Removing this setting, or pointing it at a valid file path, should resolve the problem.
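For example, a sincedb file could be kept per input; the path below is only an illustration, any writable location works:

```
input {
  file {
    path => "/home/elk/Desktop/policypermit.log"
    start_position => "beginning"
    type => "ppermit"
    # Persist the read offset so a restart resumes where it left off
    # instead of re-ingesting the whole file. Example path only.
    sincedb_path => "/home/elk/.sincedb_policypermit"
  }
}
```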

I removed it, but I am still getting the duplicates. Does it have something to do with start_position?

Do you have any other files in the config directory, e.g. older versions? Logstash concatenates all config files, which means all data will go to all outputs unless conditionals are used.

Nope, no other files. I have tried using the fingerprint filter and it seems to solve my problem for now.
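For reference, the fingerprint approach typically deduplicates by hashing each message into a fixed Elasticsearch document ID, so a re-ingested line overwrites the existing document instead of creating a new one. A minimal sketch (the metadata field name is illustrative, and some older plugin versions require a `key` option for HMAC-based methods):

```
filter {
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA256"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "alog"
    # Same message => same ID => update, not duplicate.
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```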

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.