Logstash does not correctly send data from a MongoDB collection to Elasticsearch

Hi,
Here is my Logstash config:
input {
  mongodb {
    uri => 'mongodb://boomrang:*******/boomrang'
    placeholder_db_dir => '/tmp'
    placeholder_db_name => 'boomrang_tarrifs.db'
    collection => 'tariffs'
    batch_size => 5000
  }
}
filter {
  mutate {
    remove_field => ["_id", "__v", "wageFormulas"]
  }
  date {
    match => [ "logdate", "ISO8601" ]
  }
}

output {
  elasticsearch {
    action => "update"
    doc_as_upsert => true
    hosts => ["localhost:9200"]
    index => "tariffs"
    document_id => "%{mongo_id}%{date}"
  }
  stdout { codec => rubydebug }
}
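One thing I am not sure about in this config (my own reading, not something from the plugin docs): the date filter matches a field called logdate, which the sample record below does not have, and even when it fires it writes to @timestamp by default rather than to a field named date, so the %{date} reference in document_id most likely never resolves and stays as the literal text %{date}. For comparison, a minimal sketch of the output with the id built from mongo_id alone, the field the mongodb input adds for each source document's _id:

output {
  elasticsearch {
    action => "update"
    doc_as_upsert => true
    hosts => ["localhost:9200"]
    index => "tariffs"
    # With the id derived only from the Mongo _id, every re-read of
    # the same collection row updates the same Elasticsearch document
    # instead of creating a new one.
    document_id => "%{mongo_id}"
  }
}

The stdout output with the rubydebug codec that is already in the pipeline is a handy way to check what the id template actually resolves to for each event.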

And here is one of the records in the "tariffs" collection that I want to send to Elasticsearch:
{
  "_id" : ObjectId("5b98d8114ef6cb5bcebb9894"),
  "name" : "Usual 9706",
  "farsiName" : "کارمزد استاندارد 9706",
  "type" : "PUBLIC",
  "createdDate" : "13970621133938",
  "lastUpdate" : "13970621133938",
  "supportWage" : 1000000,
  "wageFormulas" : [ ],
  "__v" : 0
}
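Note that createdDate and lastUpdate are 14-digit strings in a yyyyMMddHHmmss layout, and 1397 looks like a Jalali (Solar Hijri) year rather than a Gregorian one, so they are not ISO8601; there is also no logdate field at all, which means the date filter above never matches anything. A hedged sketch of a date filter that at least parses the digit layout (created_at is a target name I made up; a real pipeline would need a Jalali-to-Gregorian conversion first, because the date filter will read 1397 as a Gregorian year):

filter {
  date {
    # Parse the 14-digit yyyyMMddHHmmss layout of createdDate into a
    # separate field instead of @timestamp.
    # Caveat: 1397 is a Jalali year but is parsed here as Gregorian.
    match => [ "createdDate", "yyyyMMddHHmmss" ]
    target => "created_at"
  }
}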

The problem is that the tariffs collection has only 23 records, but Logstash reads more than that: in Kibana I see 100 hits for the tariffs index, and the hit count keeps growing every time I refresh Kibana.
How can I solve this problem?
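For what it is worth, one way to narrow the problem down (a sketch, assuming the default dynamic mapping where the string field mongo_id gets a .keyword sub-field; per_source_row is just a name I picked for the aggregation) is to aggregate on mongo_id in the tariffs index. If the buckets show doc_count well above 1, the same source rows are being re-indexed under fresh document ids; if there are far more than 23 buckets, something else is writing to the index.

GET tariffs/_search
{
  "size": 0,
  "aggs": {
    "per_source_row": {
      "terms": { "field": "mongo_id.keyword", "size": 50 }
    }
  }
}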
