Spontaneously duplicated logs - need advice - thanks

Hi there,

I have the following setup (version 7.3.1):

A live system logs into a CSV file.
With xcopy.bat I copy this CSV to a specific folder on my ES server every 5 minutes.
Filebeat is watching that CSV file in that specific folder.
Logstash listens to Beats and writes to my index!
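For context, the copy step looks roughly like this. This is only a sketch; the paths and exact xcopy flags are my assumptions, not the contents of the real xcopy.bat:

```bat
@echo off
rem Hypothetical xcopy.bat, scheduled every 5 minutes.
rem /d copies only files newer than the destination copy, /y suppresses the overwrite prompt.
rem Source and destination paths are placeholders.
xcopy "C:\LiveSystem\logs\live.csv" "\\ES-SERVER\watched\" /d /y
```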

I know it would be better to have Beats installed on the live system itself, but this setup has been running for a very long time without any problems or duplicated entries!

But for a few days now I have been getting thousands of duplicated logs in my index.
Wow, that's spooky!

What could have caused this? My setup has not changed!

Any suggestions?

Have you looked at the documents in Kibana to verify they are duplicated?
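For example, one quick way to confirm duplicates beyond spot-checking documents is a terms aggregation with `min_doc_count: 2`, run from Kibana Dev Tools. The index name (`my-index`) and field (`message.keyword`) below are placeholders; substitute your own index and a field that should be unique per event:

```
POST my-index/_search
{
  "size": 0,
  "aggs": {
    "possible_duplicates": {
      "terms": {
        "field": "message.keyword",
        "min_doc_count": 2,
        "size": 10
      }
    }
  }
}
```

Any bucket returned here has two or more documents with the same value, which is a strong hint of duplicate ingestion.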

Hi, yes, I checked them in Kibana, also generated a CSV report, and double-checked.
It is strange and I have no idea why.

Hi @Datakids

We always used to joke: when there is no change... there is always a change :slight_smile: ... it's just the who, what, when, and where that we do not know (of course, it could also be a latent bug).

Can you check the dates on the files? Perhaps the dates/timestamps on the files are being touched/updated, so Filebeat is reading them multiple times.

Are the past files being removed?

Filebeat keeps track of what it has read in its registry (under data/registry/filebeat). Perhaps that has become corrupt, or permissions have changed so it can no longer write to it. You can clean it up, or just look inside it; it is all human readable.
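A minimal sketch of what inspecting a registry entry looks like. Assumptions: a Filebeat 7.x registry stored as newline-delimited JSON (e.g. data/registry/filebeat/log.json); here a sample entry is written to a temp file so the sketch is self-contained, and the path/key values are invented:

```shell
# Write a sample registry entry (stand-in for the real data/registry/filebeat/log.json).
cat > /tmp/sample-filebeat-registry.json <<'EOF'
{"op":"set","id":1}
{"k":"filebeat::logs::native::1234-5678","v":{"source":"C:\\logs\\live.csv","offset":1048576,"timestamp":["2024-01-01T00:00:00Z"]}}
EOF

# List which files Filebeat believes it has read, and how far into each (the offset).
grep -o '"source":"[^"]*"' /tmp/sample-filebeat-registry.json
grep -o '"offset":[0-9]*' /tmp/sample-filebeat-registry.json
```

If an offset is smaller than the file size after a copy, or the same source appears under multiple keys, that points at re-reads of data Filebeat should already have shipped.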

Have you simply restarted each component, Filebeat and Logstash?

How long has this been running? Has there been any kind of rollover in filenames, etc.?

Have any of the boxes been patched or rebooted lately?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.