Filebeat always duplicating events (twice)

Hello all, I've spent two days on this issue without finding a solution.

Environment:

Filebeat on Windows sending .txt files to Logstash on another machine. One file = one event for the whole file (multiline).

Problem

Filebeat always sends the event twice to Logstash (and Logstash then sends two events to Elasticsearch).

filebeat conf

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - f:\statements\*.txt

  multiline.pattern: ' \A.*'
  multiline.negate: true
  multiline.match: after

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
  close_inactive: 2h

output.logstash: 
  hosts: ["192.168.0.24:5044"]

processors:
  - add_host_metadata: ~

logging.level: debug
logging.selectors: ["*"]

filebeat log

2020-01-29T08:51:30.467+0100    DEBUG   [processors]    processing/processors.go:186    Publish event: {
  "@timestamp": "2020-01-29T07:51:25.418Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "7.5.2"
  },
  "host": {
    "name": "",
    "architecture": "x86_64",
    "os": {
      "platform": "windows",
      "version": "6.1",
      "family": "windows",
      "name": "Windows 7 Professional",
      "kernel": "6.1.7601.17514 (win7sp1_rtm.101119-1850)",
      "build": "7601.0"
    },
    "id": "54ae6b7a-0fd5-4060-ab52-7dbc69e1f71d",
    "hostname": ""
  },
  "log": {
    "file": {
      "path": "f:\\statements\\test.txt"
    },
    "offset": 0
  },
  "message": "hello",
  "input": {
    "type": "log"
  },
  "agent": {
    "id": "787b704f-406c-4675-a2d3-f0de03061d6d",
    "version": "7.5.2",
    "type": "filebeat",
    "ephemeral_id": "547a9731-de28-4af5-8724-40e5ee7721e6",
    "hostname": ""
  },
  "ecs": {
    "version": "1.1.0"
  }
}
2020-01-29T08:51:31.544+0100    INFO    pipeline/output.go:95   Connecting to backoff(async(tcp://192.168.0.24:5044))
2020-01-29T08:51:31.546+0100    DEBUG   [logstash]      logstash/async.go:111   connect
2020-01-29T08:51:31.551+0100    INFO    pipeline/output.go:105  Connection to backoff(async(tcp://192.168.0.24:5044)) established
2020-01-29T08:51:31.556+0100    DEBUG   [logstash]      logstash/async.go:159   1 events out of 1 events sent to logstash host 192.168.0.24:5044. Continue sending
2020-01-29T08:51:32.430+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: f:\statements\test.txt; Backoff now.
2020-01-29T08:51:32.848+0100    DEBUG   [publisher]     memqueue/ackloop.go:160 ackloop: receive ack [0: 0, 1]
2020-01-29T08:51:32.851+0100    DEBUG   [publisher]     memqueue/eventloop.go:535       broker ACK events: count=1, start-seq=1, end-seq=1
2020-01-29T08:51:32.856+0100    DEBUG   [publisher]     memqueue/ackloop.go:128 ackloop: return ack to broker loop:1
2020-01-29T08:51:32.858+0100    DEBUG   [publisher]     memqueue/ackloop.go:131 ackloop:  done send ack
2020-01-29T08:51:32.857+0100    DEBUG   [acker] beater/acker.go:64      stateful ack    {"count": 1}
2020-01-29T08:51:32.862+0100    DEBUG   [registrar]     registrar/registrar.go:356      Processing 1 events
2020-01-29T08:51:32.864+0100    DEBUG   [registrar]     registrar/registrar.go:326      Registrar state updates processed. Count: 1
2020-01-29T08:51:32.868+0100    DEBUG   [registrar]     registrar/registrar.go:411      Write registry file: C:\Program Files\Filebeat\data\registry\filebeat\data.json (2)
2020-01-29T08:51:32.885+0100    DEBUG   [registrar]     registrar/registrar.go:404      Registry file updated. 2 states written.
2020-01-29T08:51:35.422+0100    DEBUG   [input] input/input.go:152      Run input
2020-01-29T08:51:35.424+0100    DEBUG   [input] log/input.go:191        Start next scan
2020-01-29T08:51:35.426+0100    DEBUG   [input] log/input.go:421        Check file for harvesting: f:\statements\test.txt
2020-01-29T08:51:35.430+0100    DEBUG   [input] log/input.go:511        Update existing file for harvesting: f:\statements\test.txt, offset: 7
2020-01-29T08:51:35.434+0100    DEBUG   [input] log/input.go:563        Harvester for file is still running: f:\statements\test.txt
2020-01-29T08:51:35.441+0100    DEBUG   [input] log/input.go:212        input states cleaned up. Before: 2, After: 2, Pending: 0
2020-01-29T08:51:35.445+0100    DEBUG   [input] log/input.go:243        State for file not removed because harvester not finished: f:\statements\test.txt
2020-01-29T08:51:35.449+0100    DEBUG   [input] log/input.go:233        Remove state for file as file removed or renamed: f:\statements\test.txt
2020-01-29T08:51:40.182+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: f:\statements\test.txt; Backoff now.
2020-01-29T08:51:40.434+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: f:\statements\test.txt; Backoff now.
2020-01-29T08:51:42.815+0100    INFO    [monitoring]    log/log.go:145  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":2480,"time":{"ms":47}},"total":{"ticks":4929,"time":{"ms":63},"value":4929},"user":{"ticks":2449,"time":{"ms":16}}},"handles":{"open":119},"info":{"ephemeral_id":"547a9731-de28-4af5-8724-40e5ee7721e6","uptime":{"ms":32490580}},"memstats":{"gc_next":7625072,"memory_alloc":5869256,"memory_total":114599504,"rss":2269184},"runtime":{"goroutines":35}},"filebeat":{"events":{"added":2,"done":2},"harvester":{"open_files":2,"running":2,"started":1}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"acked":1,"batches":1,"total":1},"read":{"bytes":6},"write":{"bytes":483}},"pipeline":{"clients":1,"events":{"active":0,"filtered":1,"published":1,"retry":1,"total":2},"queue":{"acked":1}}},"registrar":{"states":{"current":2,"update":2},"writes":{"success":2,"total":2}}}}}
2020-01-29T08:51:45.452+0100    DEBUG   [input] input/input.go:152      Run input
2020-01-29T08:51:45.454+0100    DEBUG   [input] log/input.go:191        Start next scan
2020-01-29T08:51:45.456+0100    DEBUG   [input] log/input.go:421        Check file for harvesting: f:\statements\test.txt
2020-01-29T08:51:45.460+0100    DEBUG   [input] log/input.go:511        Update existing file for harvesting: f:\statements\test.txt, offset: 7
2020-01-29T08:51:45.464+0100    DEBUG   [input] log/input.go:563        Harvester for file is still running: f:\statements\test.txt
2020-01-29T08:51:45.467+0100    DEBUG   [input] log/input.go:212        input states cleaned up. Before: 2, After: 2, Pending: 0
2020-01-29T08:51:45.471+0100    DEBUG   [input] log/input.go:243        State for file not removed because harvester not finished: f:\statements\test.txt
2020-01-29T08:51:45.477+0100    DEBUG   [input] log/input.go:233        Remove state for file as file removed or renamed: f:\statements\test.txt
2020-01-29T08:51:50.186+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: f:\statements\test.txt; Backoff now.
2020-01-29T08:51:50.442+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: f:\statements\test.txt; Backoff now.
2020-01-29T08:51:55.482+0100    DEBUG   [input] input/input.go:152      Run input
2020-01-29T08:51:55.484+0100    DEBUG   [input] log/input.go:191        Start next scan
2020-01-29T08:51:55.487+0100    DEBUG   [input] log/input.go:421        Check file for harvesting: f:\statements\test.txt
2020-01-29T08:51:55.494+0100    DEBUG   [input] log/input.go:511        Update existing file for harvesting: f:\statements\test.txt, offset: 7
2020-01-29T08:51:55.499+0100    DEBUG   [input] log/input.go:563        Harvester for file is still running: f:\statements\test.txt
2020-01-29T08:51:55.506+0100    DEBUG   [input] log/input.go:212        input states cleaned up. Before: 2, After: 2, Pending: 0
2020-01-29T08:51:55.512+0100    DEBUG   [input] log/input.go:243        State for file not removed because harvester not finished: f:\statements\test.txt
2020-01-29T08:51:55.522+0100    DEBUG   [input] log/input.go:233        Remove state for file as file removed or renamed: f:\statements\test.txt
2020-01-29T08:52:00.192+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: f:\statements\test.txt; Backoff now.
2020-01-29T08:52:00.447+0100    DEBUG   [harvester]     log/log.go:107  End of file reached: f:\statements\test.txt; Backoff now.
2020-01-29T08:52:03.690+0100    DEBUG   [service]       service/service.go:53   Received sigterm/sigint, stopping
2020-01-29T08:52:03.690+0100    DEBUG   [service]       service/service.go:64   Received svc stop/shutdown request
2020-01-29T08:52:03.692+0100    INFO    beater/filebeat.go:443  Stopping filebeat
2020-01-29T08:52:03.696+0100    INFO    crawler/crawler.go:139  Stopping Crawler
2020-01-29T08:52:03.698+0100    INFO    crawler/crawler.go:149  Stopping 1 inputs
2020-01-29T08:52:03.699+0100    INFO    cfgfile/reload.go:229   Dynamic config reloader stopped
2020-01-29T08:52:03.699+0100    INFO    input/input.go:149      input ticker stopped
2020-01-29T08:52:03.703+0100    INFO    input/input.go:167      Stopping Input: 7136789230235349743
2020-01-29T08:52:03.705+0100    DEBUG   [multiline]     multiline/multiline.go:146      Multiline event flushed because timeout reached.
2020-01-29T08:52:03.709+0100    INFO    log/harvester.go:272    Reader was closed: f:\statements\test.txt. Closing.
2020-01-29T08:52:03.705+0100    DEBUG   [multiline]     multiline/multiline.go:146      Multiline event flushed because timeout reached.
2020-01-29T08:52:03.714+0100    INFO    log/harvester.go:272    Reader was closed: f:\statements\test.txt. Closing.
2020-01-29T08:52:03.711+0100    DEBUG   [harvester]     log/harvester.go:522    Stopping harvester for file: f:\statements\test.txt
2020-01-29T08:52:03.720+0100    DEBUG   [harvester]     log/harvester.go:532    Closing file: f:\statements\test.txt
2020-01-29T08:52:03.716+0100    DEBUG   [harvester]     log/harvester.go:522    Stopping harvester for file: f:\statements\test.txt
2020-01-29T08:52:03.725+0100    DEBUG   [harvester]     log/harvester.go:532    Closing file: f:\statements\test.txt
2020-01-29T08:52:03.722+0100    DEBUG   [harvester]     log/harvester.go:406    Update state: f:\statements\test.txt, offset: 7
2020-01-29T08:52:03.727+0100    DEBUG   [harvester]     log/harvester.go:406    Update state: f:\statements\test.txt, offset: 7
2020-01-29T08:52:03.731+0100    DEBUG   [harvester]     log/harvester.go:543    harvester cleanup finished for file: f:\statements\test.txt
2020-01-29T08:52:03.735+0100    DEBUG   [harvester]     log/harvester.go:543    harvester cleanup finished for file: f:\statements\test.txt
2020-01-29T08:52:03.742+0100    DEBUG   [publisher]     pipeline/client.go:162  client: closing acker
2020-01-29T08:52:03.744+0100    DEBUG   [publisher]     pipeline/client.go:173  client: done closing acker
2020-01-29T08:52:03.746+0100    DEBUG   [publisher]     pipeline/client.go:176  client: cancelled 0 events
2020-01-29T08:52:03.748+0100    DEBUG   [publisher]     pipeline/client.go:147  client: wait for acker to finish
2020-01-29T08:52:03.750+0100    DEBUG   [publisher]     pipeline/client.go:149  client: acker shut down
2020-01-29T08:52:03.752+0100    INFO    crawler/crawler.go:165  Crawler stopped
2020-01-29T08:52:03.753+0100    INFO    registrar/registrar.go:367      Stopping Registrar
2020-01-29T08:52:03.755+0100    INFO    registrar/registrar.go:293      Ending Registrar
2020-01-29T08:52:03.757+0100    DEBUG   [registrar]     registrar/registrar.go:411      Write registry file: C:\Program Files\Filebeat\data\registry\filebeat\data.json (2)
2020-01-29T08:52:03.775+0100    DEBUG   [registrar]     registrar/registrar.go:404      Registry file updated. 2 states written.

logstash pipeline

input {
  
  beats {
    port => "5044"
    client_inactivity_timeout => 1880
  }  
  
}

filter {}


output{

stdout { codec => rubydebug } 

   #elasticsearch {
   #   manage_template => false
   #   hosts => ["http://192.168.0.25:9200"]
   #   index => "test_index"
   #   user => ""
   #   password => ""
   # }

}

I simplified the pipeline to check whether the problem came from Filebeat, and it seems it does: Filebeat is always duplicating events.
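(Side note for anyone reading along: if duplicates do reach Elasticsearch, a common way to make them harmless is to give each event a deterministic document ID so the second copy overwrites the first instead of creating a new doc. A sketch using Logstash's fingerprint filter — the `[@metadata][fingerprint]` field name is my own choice, and this is a mitigation, not the root cause here:)

```
filter {
  # Hash the message so identical events get identical IDs
  fingerprint {
    source => "message"
    target => "[@metadata][fingerprint]"
    method => "MURMUR3"
  }
}

output {
  elasticsearch {
    hosts       => ["http://192.168.0.25:9200"]
    index       => "test_index"
    # Duplicate events now overwrite the same document instead of doubling up
    document_id => "%{[@metadata][fingerprint]}"
  }
}
```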

Best regards

No luck; maybe it's a bug. I haven't been able to solve it, and I can't figure out why Filebeat is always duplicating events.

Anyone with a similar problem?

Best regards

Hi!

There is nothing in the Filebeat logs that looks problematic. Could you try console output instead of sending the events to Logstash and see if this behaviour persists?
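Something along these lines should work (Filebeat only allows one output at a time, so comment out the `output.logstash` section first):

```yaml
# filebeat.yml — print events to stdout instead of shipping to Logstash
output.console:
  pretty: true
```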

C.

Hello,

thanks ChrisMark, I finally solved it (it was my fault), after two days with this... I had two instances of Filebeat running: the Filebeat Windows service and a console instance started with `filebeat -e`. Damn it!
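For anyone who hits the same symptom: before digging into the config, check how many Filebeat processes are actually running, since each instance harvests every file independently and publishes its own copy of each event. On Windows, something like:

```
sc query filebeat                 rem is the Filebeat service running?
tasklist | findstr /i filebeat    rem how many filebeat.exe processes exist?
```

If `tasklist` shows more than one `filebeat.exe`, that alone explains exact N-fold duplicates.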

Best regards


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.