Filebeat reads 1 event log multiple times

I have Elasticsearch, Logstash, Kibana, and Filebeat (5.1.1) running on the same server.
When Filebeat picks up a JSON log file and I check the output in Kibana, the same event is added 4 times to the Elasticsearch index.

Here is my filebeat.yml:

filebeat.prospectors:
- input_type: log
  paths:
    - /path_to_json_log_file/*.json
  exclude_files: [".gz$"]
  tags: ["JSON"]

  scan_frequency: 1s
  close_inactive: 1m
  clean_removed: true

  multiline.pattern: '^{'
  multiline.negate: true
  multiline.match: after

  document_type: JSON

output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5045"]

output.console:
  pretty: true

logging.level: debug
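Since each log entry is a single-line JSON object, the prospector-level JSON decoding in Filebeat 5.x could be used instead of multiline stitching. A minimal sketch using the standard `json.*` prospector options (this is an alternative, not the poster's actual config):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /path_to_json_log_file/*.json
  # Decode each line as JSON in Filebeat itself, so no
  # multiline pattern (and no Logstash json filter) is needed.
  json.keys_under_root: true
  json.add_error_key: true
```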

Logstash config file:

input {
  beats {
    port => 5045
  }
}

filter {
  json {
    source => "message"
  }
  date {
    match => ["timeMillis", "UNIX_MS"]
    target => "@timestamp"
  }
}

output {
  stdout { codec => json_lines }
  elasticsearch {
    index => "ap-index"
    hosts => ["localhost:9222"]
  }
}

JSON Format:

{"timeMillis":1491826809179,"thread":"main","level":"INFO","loggerName":"some logger","message":"Data load Driver","endOfBatch":true,"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","contextMap":{"hostName":"some hostn_name","instanceName":"node3"},"threadId":1,"threadPriority":5}

Which Filebeat version are you using? Could you share your Filebeat log output? Could you also try writing to a file output and check whether a single event still ends up multiple times there?
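To rule out the Logstash side, the `output.logstash` section could be disabled and Filebeat pointed at the file output instead. A sketch using the standard `output.file` settings (the path and filename here are arbitrary):

```yaml
output.file:
  # Write events to a local file so duplicates can be
  # counted without Logstash in the path.
  path: "/tmp/filebeat"
  filename: filebeat.out
```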

I am using Filebeat 5.1.1.
The Filebeat log output says it does not receive an acknowledgement.

Then that is probably the problem: Filebeat provides at-least-once delivery, so when it does not receive an acknowledgement from Logstash, it resends the batch, which produces duplicates. Can you share the log output?
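If Logstash is merely slow to acknowledge rather than broken, raising the Logstash output's network timeout in filebeat.yml may reduce resends. A sketch using the standard `timeout` setting of `output.logstash` (it defaults to 30 seconds in 5.x; 90 is an arbitrary example value):

```yaml
output.logstash:
  hosts: ["localhost:5045"]
  # Wait longer for Logstash to acknowledge a batch
  # before treating it as failed and resending it.
  timeout: 90
```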

Have you tried upgrading the logstash-input-beats plugin (`bin/logstash-plugin update logstash-input-beats`)? Older versions of the plugin had known issues with acknowledgements under load.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.