I have Elasticsearch, Logstash, Kibana, and Filebeat (5.1.1) running on the same server.
When I load a JSON file through Filebeat and view the output in Kibana, I see the same log event added four times to the Elasticsearch index.
Here is my filebeat.yml:
filebeat.prospectors:
- input_type: log
  paths:
    - /path_to_json_log_file/*.json
  exclude_files: [".gz$"]
  tags: ["JSON"]
  scan_frequency: 1s
  close_inactive: 1m
  clean_removed: true
  multiline.pattern: '^{'
  multiline.negate: true
  multiline.match: after
  document_type: JSON

output.logstash:
  # The Logstash hosts
  hosts: ["localhost:5045"]

output.console:
  pretty: true

logging.level: debug
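If it helps, here is how I understand the multiline settings above (a rough Python sketch of the behavior, not Filebeat's actual code): with pattern '^{', negate: true, and match: after, any line that does not start with '{' should be appended to the previous event.

```python
import re

# Pattern from the config above; negate: true flips the match,
# so a NON-matching line is treated as a continuation.
pattern = re.compile(r"^{")

def starts_new_event(line):
    return bool(pattern.match(line))

# Toy input: two single-line JSON events and one stray continuation line.
lines = ['{"level":"INFO"}', 'continuation text', '{"level":"WARN"}']
events = []
for line in lines:
    if starts_new_event(line) or not events:
        events.append(line)
    else:
        events[-1] += " " + line  # match: after -> glue onto previous event

print(len(events))  # two events expected
```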
Here is my Logstash config file:
input {
  beats {
    port => 5045
  }
}
filter {
  json {
    source => "message"
  }
  date {
    match => ["timeMillis", "UNIX_MS"]
    target => "@timestamp"
  }
}
output {
  stdout { codec => json_lines }
  elasticsearch {
    index => "ap-index"
    hosts => ["localhost:9222"]
  }
}
JSON Format:
{"timeMillis":1491826809179,"thread":"main","level":"INFO","loggerName":"some logger","message":"Data load Driver","endOfBatch":true,"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","contextMap":{"hostName":"some hostn_name","instanceName":"node3"},"threadId":1,"threadPriority":5}
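To rule out malformed input, I confirmed that a sample line parses as a single self-contained JSON object (a quick Python check, independent of the pipeline):

```python
import json

# The sample event from above, as one line.
line = ('{"timeMillis":1491826809179,"thread":"main","level":"INFO",'
        '"loggerName":"some logger","message":"Data load Driver",'
        '"endOfBatch":true,'
        '"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger",'
        '"contextMap":{"hostName":"some hostn_name","instanceName":"node3"},'
        '"threadId":1,"threadPriority":5}')

event = json.loads(line)          # raises ValueError if the line is not valid JSON
print(event["timeMillis"])        # 1491826809179
print(event["contextMap"]["instanceName"])  # node3
```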