Filebeat OOM error

We've been running Filebeat to tail logs with the following prospector configuration:

    - input_type: log
      scan_frequency: "30s"
      close_older: "24h"
      ignore_older: "15h"
      clean_inactive: "24h"
      tail_files: false
      exclude_files: ['\.gz$']
      json.message_key: log
      paths:
        - /var/log/docker/highgarden/docker-process-highgarden.log
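
One thing worth double-checking in this config: the Filebeat 5.x docs state that `clean_inactive` must be greater than `ignore_older` + `scan_frequency`, otherwise registry state can be removed while a file is still eligible for harvesting. The values above appear to satisfy that constraint (this annotated fragment is just a restatement for reference, not a suggested change):

    # constraint from the Filebeat 5.x docs:
    #   clean_inactive > ignore_older + scan_frequency
    scan_frequency: "30s"
    ignore_older: "15h"     # 15h + 30s ...
    clean_inactive: "24h"   # ... is less than 24h, so the constraint holds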

Every once in a while, I see Filebeat run out of memory and the process stops. Here is a stack trace from one such crash:

    goroutine 199734 [running]:
    runtime.systemstack_switch()
        /usr/local/go/src/runtime/asm_amd64.s:252 fp=0xc42035b5b0 sp=0xc42035b5a8
    runtime.mallocgc(0x3ad64000, 0x0, 0x25a7c000, 0xc4227ca000)
        /usr/local/go/src/runtime/malloc.go:670 +0x903 fp=0xc42035b650 sp=0xc42035b5b0
    runtime.growslice(0x8f5780, 0xc4227ca000, 0x2f11c000, 0x2f11c000, 0x2f11c400, 0xc4227ca000, 0x79a45d, 0xc4210a0800)
        /usr/local/go/src/runtime/slice.go:126 +0x24e fp=0xc42035b6e0 sp=0xc42035b650
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/libbeat/common/streambuf.(*Buffer).doAppend(0xc420a60aa0, 0xc4210a0800, 0x400, 0x400, 0x2cdd2000, 0xffffffffffffffff, 0x400, 0x400)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/libbeat/common/streambuf/streambuf.go:143 +0x48f fp=0xc42035b770 sp=0xc42035b6e0
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/libbeat/common/streambuf.(*Buffer).Write(0xc420a60aa0, 0xc4210a0800, 0x400, 0x400, 0xc501efa001, 0x1e2d348c, 0x2cdd2000)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/libbeat/common/streambuf/io.go:83 +0x5b fp=0xc42035b7d0 sp=0xc42035b770
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader.(*Line).decode(0xc42031c310, 0x4d3ef48c, 0xc420485b69, 0x1, 0x1)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader/line.go:175 +0x216 fp=0xc42035b890 sp=0xc42035b7d0
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader.(*Line).advance(0xc42031c310, 0x0, 0x1)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader/line.go:131 +0x1f8 fp=0xc42035b958 sp=0xc42035b890
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader.(*Line).Next(0xc42031c310, 0x429efc, 0xc42035b950, 0xc42035b8f0, 0x0, 0x0, 0x0)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader/line.go:57 +0x3d fp=0xc42035b9e0 sp=0xc42035b958
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader.Encode.Next(0xc42031c310, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x40f509, ...)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader/encode.go:30 +0x5e fp=0xc42035ba98 sp=0xc42035b9e0
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader.(*JSON).Next(0xc420cf0f00, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0xc420ba0c68, ...)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader/json.go:79 +0x66 fp=0xc42035bb70 sp=0xc42035ba98
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader.(*StripNewline).Next(0xc420485b70, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0xc420ba0cc0, ...)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader/strip_newline.go:16 +0x66 fp=0xc42035bc18 sp=0xc42035bb70
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader.(*Limit).Next(0xc420cf0f20, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0xc420ba0e58, ...)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/reader/limit.go:17 +0x66 fp=0xc42035bcc0 sp=0xc42035bc18
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester.(*Harvester).Harvest(0xc421ed0d00, 0xc97da0, 0xc420cf0f20)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/harvester/log.go:93 +0x24d fp=0xc42035bf68 sp=0xc42035bcc0
    code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/prospector.(*Prospector).startHarvester.func1(0xc42028a000, 0xc421ed0d00, 0xc97da0, 0xc420cf0f20)
        /go/src/code.abc.xyz/search/filebeat/vendor/github.com/elastic/beats/filebeat/prospector/prospector.go:248 +0x65 fp=0xc42035bf90 sp=0xc42035bf68

The data is produced in bursts at ~500 EPS. It looks like the GC can't keep up, which eventually leads to the OOM.
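
The `growslice`/`doAppend` frames in the trace suggest the line reader's internal buffer growing without bound. As a minimal sketch (not the actual streambuf code), this is what happens when a reader keeps appending chunks that never contain a line terminator, e.g. a very long or truncated JSON line:

```go
package main

import (
	"bytes"
	"fmt"
)

// bufferedSizes simulates a line reader that appends fixed-size chunks
// (containing no '\n') to its internal buffer. Because no line terminator
// is ever found, nothing is consumed and the buffer only ever grows --
// the failure mode visible in the growslice frames of the trace above.
func bufferedSizes(reads, chunkSize int) []int {
	var buf bytes.Buffer
	chunk := make([]byte, chunkSize) // zero bytes, so no newline anywhere
	sizes := make([]int, 0, reads)
	for i := 0; i < reads; i++ {
		buf.Write(chunk) // each Write may reallocate the backing slice
		if bytes.IndexByte(buf.Bytes(), '\n') < 0 {
			sizes = append(sizes, buf.Len()) // still no complete line
		}
	}
	return sizes
}

func main() {
	// With 1 KiB reads and no newline, buffered data grows monotonically.
	fmt.Println(bufferedSizes(4, 1024)) // → [1024 2048 3072 4096]
}
```

At a few hundred events per second, any input that stalls line detection like this makes the buffer balloon quickly.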

Which filebeat version are you using? Can you share your full config with the output part?

@ruflin,

Sorry for the delayed reply. We are using Filebeat 5.2.0 with a custom output plugin. There was an issue in the way we were shipping the data that led to a lot of copying and object creation; refactoring the code to handle that cleanly fixed the problem.
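
For anyone hitting something similar: the kind of refactor described above usually means replacing per-event allocations with a reused scratch buffer. This is a hypothetical sketch (not our plugin code, names are made up) of the difference in allocation behavior:

```go
package main

import "fmt"

// copyPerEvent allocates a fresh buffer for every event -- at a few
// hundred events per second this creates heavy GC pressure.
func copyPerEvent(events [][]byte) int {
	allocs := 0
	for _, e := range events {
		out := make([]byte, len(e)) // new allocation on every event
		copy(out, e)
		allocs++
		_ = out
	}
	return allocs
}

// reuseBuffer keeps one scratch buffer across events and only grows it
// when an event is larger than the current capacity.
func reuseBuffer(events [][]byte) int {
	allocs := 0
	var scratch []byte
	for _, e := range events {
		if cap(scratch) < len(e) {
			scratch = make([]byte, len(e)) // grow only when needed
			allocs++
		}
		scratch = scratch[:len(e)]
		copy(scratch, e)
	}
	return allocs
}

func main() {
	// Simulate one burst of 500 events of 256 bytes each (~500 EPS).
	events := make([][]byte, 500)
	for i := range events {
		events[i] = make([]byte, 256)
	}
	fmt.Println(copyPerEvent(events), reuseBuffer(events)) // → 500 1
}
```

The reused buffer is only safe if each event is fully consumed (serialized, sent) before the next one overwrites the scratch space.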

Thanks!