Duplicate content when using azure-blob-storage input

Hi @suman.kumar

Thanks for your reply! Filebeat is sending data to Graylog, which in turn stores it in Elasticsearch. I could try setting up a Logstash instance as an intermediate step, but that would add complexity to the solution and introduce another possible point of failure.
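For context, the output side of my setup looks roughly like this (the host name is a placeholder); Filebeat ships to Graylog's Beats input, which speaks the Logstash protocol, so there is no Logstash in between today:

```yaml
# Sketch of the current Filebeat output section (host is a placeholder).
# Graylog's Beats input listens on the Logstash/Lumberjack protocol,
# so Filebeat uses the logstash output to reach it directly.
output.logstash:
  hosts: ["graylog.example.com:5044"]
```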

I'm also seeing a worse issue: when this happens (a 403 auth error from the azure-blob-storage input, with the message "Request date header too old") and I have to restart Filebeat, it re-downloads everything from the blob storage again. It seems the local Filebeat state database that tracks which blobs have already been downloaded gets corrupted.
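For reference, a minimal sketch of how the input is configured (account name, key, and container name below are placeholders, and I'm paraphrasing the options from memory):

```yaml
# azure-blob-storage input (sketch; account, key, and container are placeholders)
filebeat.inputs:
  - type: azureblobstorage
    account_name: mystorageaccount
    auth.shared_credentials.account_key: "<account-key>"
    containers:
      - name: logs-container
        # keep polling the container so new blobs are picked up
        poll: true
        poll_interval: 5m
```

After a restart following the 403 error, blobs that were already processed from this container get ingested again, which is what produces the duplicates in Graylog.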