Logstash pipeline-to-pipeline - outputs too many documents

Hi, I am using Filebeat to write to a pipeline:

input {
    beats {
        port => "5050"
    }
}

output {
    if [message] =~ /value/ {
        pipeline {
            send_to => [reciving_pipeline]
        }
    }
}

and this pipeline is writing to Elasticsearch:

input {
    pipeline {
        address => reciving_pipeline
    }
}

filter {
    json {
        source => "message"
    }
}

output {
    elasticsearch {
        hosts => ["elasticsearch:9200"]
        index => "reciving_pipeline_index-%{+YYYY.MM}"
    }
}
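
For context, the two pipelines are wired together in pipelines.yml, roughly like this (a minimal sketch - the pipeline ids and file paths here are illustrative, not my exact setup):

```yaml
# pipelines.yml - illustrative sketch; ids and paths are assumptions
- pipeline.id: beats_intake
  path.config: "/etc/logstash/conf.d/beats_intake.conf"
- pipeline.id: reciving_pipeline
  path.config: "/etc/logstash/conf.d/reciving.conf"
```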

And for some reason, more and more documents get inserted into Elasticsearch, although the log file I am testing with has only 2 lines. In the last try the count was around 728 documents.

Does anyone know anything about this?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.