How to prevent old logs from being re-indexed alongside existing logs in Elasticsearch

I am using Filebeat to collect logs from a remote server and ship them to Logstash, and that part works fine. But whenever new lines are appended to the source log file, Filebeat reads the file from the beginning and sends everything to Logstash, so Logstash indexes all of the old logs into Elasticsearch again even though those logs are already there. The result is duplicated log entries.

So my question is: how do I ship only the newly added log lines to Logstash? When new lines are appended to the log file, only those new lines should be shipped through Logstash to Elasticsearch.

Here is my filebeat.yml:

- type: log
  enabled: true
  paths:
    - /home/user/Documents/ELK/locals/*.log

Here is my Logstash input config, logstash-input.conf:

input {
  beats {
    port => 5044
  }
}

Exactly how is data appended to the log files? Is Filebeat reading from a local disk?

Hi @Christian_Dahlqvist,

Thanks for the reply. Filebeat is reading from both local and remote servers.

The appending works like this:

I have 4 lines in the log file; all 4 lines are read by Filebeat and sent through Logstash to Elasticsearch. The log count in Elasticsearch is 4.

Now if 2 new lines are written to the log file, Filebeat reads the whole file again (the old 4 lines plus the new 2 lines) and sends everything through Logstash to Elasticsearch. The count in Elasticsearch becomes 10, but it should be 6.

One more issue: Filebeat isn't reading the last line of the log file. Is there a solution for that as well?

It is not recommended to read files over networked storage using Filebeat as this can cause problems. It is also important that all lines end with a newline (could be the reason the last line is not read correctly) and that data is truly appended to the file so the inode does not change (which will cause the file to be reread). Replacing a file with a new one with appended data will generally cause it all to be reread as it appears as a new file to Filebeat even though it has the same name as the old file.

This is why I was asking whether files were on local storage and exactly how they are appended to.

@Christian_Dahlqvist First, as of now my log files are on local storage, but I am also doing this for my remote logs, which are in the cloud.

Second, while Filebeat is running, if I add anything new to the local log file it reads the new logs along with the old ones and sends them all through Logstash to Elasticsearch. I expect it to read only the new logs, because the old logs have already been read and shipped through Logstash to Elasticsearch.

Exactly how are you adding new data to the local file? If you use an editor to do this, it will usually create a new file with a new inode behind the scenes. If you make sure you properly append data, e.g. echo "New log line" >> test.txt on Linux (echo adds the trailing newline itself), it should not resend all data.
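The inode behaviour described above can be checked directly. Here is a small sketch (file names are made up for illustration) that compares a true append against the rewrite-and-rename pattern many editors use, using ls -i to read the inode each time:

```shell
#!/bin/sh
# Sketch: a true append (>>) keeps the file's inode, so Filebeat would
# treat it as the same file; a rewrite-via-rename changes the inode,
# which makes the file look brand new and get reread from the start.
tmpdir=$(mktemp -d)
f="$tmpdir/test.log"

printf 'line 1\n' > "$f"
inode_before=$(ls -i "$f" | awk '{print $1}')

# True append: same inode as before.
printf 'line 2\n' >> "$f"
inode_after_append=$(ls -i "$f" | awk '{print $1}')

# What many editors do on save: write a temp file, then rename it
# over the original. The path is unchanged, but the inode is new.
printf 'line 1\nline 2\nline 3\n' > "$f.tmp" && mv "$f.tmp" "$f"
inode_after_rewrite=$(ls -i "$f" | awk '{print $1}')

echo "append kept inode:  $([ "$inode_before" = "$inode_after_append" ] && echo yes || echo no)"
echo "rewrite kept inode: $([ "$inode_before" = "$inode_after_rewrite" ] && echo yes || echo no)"

rm -rf "$tmpdir"
```

On a typical Linux filesystem the first check prints yes and the second prints no, which matches the explanation: only genuine appends leave the inode (and therefore Filebeat's read offset) intact.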

Right now I am adding data to the local log file manually, but on the remote server new log files are added automatically. After adding the trailing newline to the local log file, Filebeat is able to read all the lines.

My other problem: why is Filebeat reading the old logs along with the new ones and sending them all through Logstash to Elasticsearch?

I only want the new lines that are appended to my log file to be shipped through Logstash to Elasticsearch.

I explained this in my previous response: you need to append to the same file, which you cannot do with an editor.

