Dynamically change a log file without re-adding previous lines in Elasticsearch and Kibana


I have a log file that is generated by a script and exported to the ELK machine, where it is ingested by Logstash. Every month or two I regenerate the file: the new logs are appended to the initial file, and if there is new information for a previous log line, that line is modified.

Is there any way to put this new file on the ELK machine without either erasing all the previous information or adding all the logs on top of the previous ones in Kibana?
I tried appending the new logs to the end of the previously ingested log file, and the result is:
1200 previous entries
1000 new entries

=> in Kibana => 1200 + 1200 + 1000
I want => 1200 + 1000, with each previous log line replaced by the new one whenever it was modified.
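For illustration, the outcome wanted here corresponds to upsert semantics: if each log line is stored under a deterministic key, re-ingesting the regenerated file overwrites existing entries instead of appending duplicates. A minimal pure-Python sketch of that idea, with hypothetical line format and counts (the key function and field layout are assumptions, not from the thread):

```python
import hashlib

def line_key(line: str) -> str:
    """Deterministic key for a log line; hypothetical choice:
    hash of the line's stable identifier (here, its first |-separated field)."""
    ident = line.split("|", 1)[0]
    return hashlib.sha1(ident.encode("utf-8")).hexdigest()

# 1200 "previous" lines already ingested.
store = {line_key(f"{i}|old"): f"{i}|old" for i in range(1200)}

# The regenerated file: the same 1200 lines (some modified) plus 1000 new ones.
regenerated = [f"{i}|old" for i in range(1100)]             # unchanged lines
regenerated += [f"{i}|updated" for i in range(1100, 1200)]  # modified lines
regenerated += [f"{i}|new" for i in range(1200, 2200)]      # new lines

for line in regenerated:
    store[line_key(line)] = line  # upsert: overwrite if present, insert if not

print(len(store))  # 2200, i.e. 1200 + 1000 with no duplicates
```

This is only a model of the desired behavior, not an ingestion pipeline; the point is that deduplication needs a stable per-line identity, which plain file appends do not give you.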

I don't know if my question is clear enough. Thank you.

Reading the docs, what I want to do is impossible.
So, when I want to import the new logs, I delete the old ones and the old indices, import the new ones, and recreate the visualizations. I automated this with a Python script.
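The delete-and-reimport step could look something like the sketch below, assuming a default local Elasticsearch on port 9200 and a hypothetical index pattern `mylogs-*` (the actual script is not shown in the thread):

```python
import urllib.request

ES_HOST = "http://localhost:9200"  # assumption: default local Elasticsearch
INDEX_PATTERN = "mylogs-*"         # hypothetical index name pattern

def delete_url(host: str, pattern: str) -> str:
    """Build the URL for the delete-index API for a name or wildcard pattern."""
    return f"{host}/{pattern}"

def delete_old_indices() -> None:
    # DELETE /mylogs-* removes every matching index. Recent Elasticsearch
    # versions require allowing wildcard deletes
    # (action.destructive_requires_name: false) for this to work.
    req = urllib.request.Request(delete_url(ES_HOST, INDEX_PATTERN),
                                 method="DELETE")
    with urllib.request.urlopen(req) as resp:
        resp.read()

if __name__ == "__main__":
    delete_old_indices()
    # After this, point Logstash at the regenerated file. With the file
    # input, the sincedb remembers the old read position, so you would
    # also need a fresh sincedb (or sincedb_path => "/dev/null") for
    # Logstash to re-read the file from the start.
```

The sincedb detail matters here: simply deleting the indices is not enough if Logstash still believes it has already read the file.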

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.