Duplicate Documents

When there are communication problems, the logs sometimes arrive two or three times: all of the logs from a single invocation are duplicated.

Functionbeat config:

functionbeat:
  provider:
    aws:
      deploy_bucket: functionbeat-deploy
      endpoint: s3.amazonaws.com
      functions:
      - description: lambda function for cloudwatch logs
        enabled: true
        name: elk-shipper
        triggers:
        - log_group_name: /aws/lambda/secure-downloads
        type: cloudwatch_logs
keystore:
  path: /tmp/functionbeat.keystore
output:
  logstash:
    hosts:
    - logstash.net:1999
    pipelining: 0
    ttl: 0
    worker: 1
path:
  config: /etc/functionbeat
  data: /tmp
  home: /etc/functionbeat
  logs: /tmp/logs
processors:
- add_fields:
    fields:
      app_name: lss
      component_type: aws
      hostname: -lambda
      location: aws
      log_format: json
      log_type: shipper
      log_signal: ASDFGH
    target: ""

AWS logs:

2022-01-25T10:15:14.519Z	ERROR	[publisher_pipeline_output]	pipeline/output.go:180	failed to publish events: write tcp 169.254.76.1:52578->1.1.1.1:1999: write: connection reset by peer
2022-01-25T10:15:14.519Z	INFO	[publisher_pipeline_output]	pipeline/output.go:143	Connecting to backoff(tcp://logstash.net:1999)

I tried changing ttl and pipelining, but it did not help.
The Logstash logs are empty, with no errors.
My guess is that the Lambda times out mid-send, AWS retries the whole invocation, and Functionbeat re-ships events that were already published. I would be glad for any help; a possible Logstash-side workaround is sketched below.
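
If the retries cannot be avoided, one common workaround (which I have not verified in this setup) is to deduplicate on the Logstash side: compute a fingerprint of each event and use it as the Elasticsearch document ID, so a re-shipped event overwrites the first copy instead of creating a second document. A minimal sketch, assuming events ultimately go to Elasticsearch; the host is a placeholder:

filter {
  # Hash the raw message so identical events map to the same ID.
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "SHA256"
  }
}

output {
  elasticsearch {
    hosts => ["es.example.net:9200"]   # placeholder host
    # A re-sent duplicate overwrites the same document instead of
    # creating a new one.
    document_id => "%{[@metadata][fingerprint]}"
  }
}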

[Image: graph of incoming log events over time. Each burst is a duplicate of the same call's logs.]

I increased the timeout in the Lambda settings; now AWS no longer restarts the Lambda, and the duplicates are gone.
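
For anyone hitting the same thing: the timeout can be raised in the AWS console under the function's configuration, or with the AWS CLI. A sketch using my function name from the config above; 120 seconds is just an example value:

aws lambda update-function-configuration --function-name elk-shipper --timeout 120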
