Duplicate entries in Kibana for logs generated in the last minute

Hi there,

I am trying to visualize logs in Kibana from a log file that is updated in real time.

The log file is split every minute, generating a new log file named after that particular minute. I have written a bash script to perform this per-minute splitting, as follows.

fileCreator.sh file

#!/bin/bash
# Execute at the time the request-response-logger.log file is created.

file="request-response-logger"
current_date_time="$(date "+%Y-%m-%d %H:%M")"

sleep 5

# Extract and move the logs recorded between the start of the main log
# file and the time this script is executed.
sed -i -e '/'"$current_date_time"'/{w '"$file-$current_date_time before" -e 'd}' "$file"
echo "split logs on $current_date_time before"

sleep 1m

while [ -s "$file" ]
do
    before_time="$(date "+%Y-%m-%d %H:%M" -d "1 min ago")"
    sed -i -e '/'"$before_time"'/{w '"$file-$before_time" -e 'd}' "$file"
    echo "split logs on $before_time"
    sleep 1m
done

Executing this script generates a new log file every minute, as follows.

(Screenshot: the per-minute log files generated by the script, taken 2018-10-26 20:02.)

The main log file (request-response-logger.log) is configured as the Filebeat input file in filebeat.yml.

When the logs are loaded into Kibana, the entries recorded in the last minute are duplicated every time: two documents with the same id appear in Kibana, but only for the entries in the log file generated in the last minute. I have tried altering the functions in the script several times, but none of the attempts worked. The resulting issue is as follows.

Please help me to resolve this.

Filebeat, Logstash, Elasticsearch, Kibana versions : 6.4.0

Hi @Chamani_Shiranthika,

Can you post your Filebeat configuration? I see you're also using Logstash; could you provide that configuration as well?


Thanks @Larry_Gregory, these are my Filebeat and Logstash configurations.

Filebeat configuration (in filebeat.yml)

#------------ input for request-response-logger ------------------
- type: log
  # Change to true to enable this input configuration.
  enabled: true
  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    - /home/playground/elk-analytics/logs/request-response-logger.log

Logstash configuration (in the .conf file)

output {
  if [type] == "request-response" {
    elasticsearch {
      hosts => [ "localhost:9200" ]
      index => "transactions"
    }
  }
}

Thanks in advance. :)

Nothing in the configuration you posted looks out of place to me. Is it possible that Filebeat is processing the input file multiple times?
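One thing worth checking, since your script rewrites the main log with `sed -i`: GNU sed's `-i` writes a temporary file and renames it over the original, so the file's inode changes on every edit, and a tailing reader can treat it as a brand-new file and re-read it from the start. A minimal sketch to observe this (the file name and contents here are made up for illustration, assuming GNU sed and coreutils):

```shell
#!/bin/bash
# Sketch: show that GNU sed -i replaces the file (new inode) rather
# than editing it in place. The demo file and its contents are made up.
tmpdir="$(mktemp -d)"
printf '2018-10-26 20:01 request\n2018-10-26 20:02 request\n' > "$tmpdir/demo.log"

inode_before="$(stat -c %i "$tmpdir/demo.log")"   # inode before the edit
sed -i -e '/2018-10-26 20:01/d' "$tmpdir/demo.log"
inode_after="$(stat -c %i "$tmpdir/demo.log")"    # inode after the edit

echo "before=$inode_before after=$inode_after"
rm -rf "$tmpdir"
```

If the two inodes differ on your system, every one-minute split is effectively replacing the file Filebeat is watching, which could explain a re-read of the lines that were still in it.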

You might have better luck posting this question in the Beats topic, as Kibana isn't doing anything wrong here.
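If the file does turn out to be re-ingested, one common mitigation (a sketch, not something from this thread; the field names and key are illustrative and would need adapting) is to give each event a deterministic document id, so a re-read event overwrites the existing document instead of creating a duplicate:

```
# Sketch only: a Logstash fingerprint filter plus document_id on the
# elasticsearch output. Source field, key, and method are assumptions.
filter {
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA256"
    key    => "dedup-key"
  }
}
output {
  if [type] == "request-response" {
    elasticsearch {
      hosts       => [ "localhost:9200" ]
      index       => "transactions"
      document_id => "%{[@metadata][fingerprint]}"
    }
  }
}
```

With a stable _id, writing the same event twice is an overwrite rather than a second copy; the trade-off is that genuinely identical log lines would also collapse into one document.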


@Larry_Gregory Thanks very much. Yes, I will post this on Beats.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.