Hi,
I'm getting the following panic when starting Filebeat:
2017/04/07 11:35:58.534945 outputs.go:108: INFO Activated elasticsearch as output plugin.
2017/04/07 11:35:58.535113 publish.go:295: INFO Publisher name: weblogic
2017/04/07 11:35:58.535350 async.go:63: INFO Flush Interval set to: 1s
2017/04/07 11:35:58.535364 async.go:64: INFO Max Bulk Size set to: 50
2017/04/07 11:35:58.535904 beat.go:221: INFO filebeat start running.
2017/04/07 11:35:58.536076 registrar.go:85: INFO Registry file set to: /home/.../filebeat/data/registry
2017/04/07 11:35:58.536371 registrar.go:106: INFO Loading registrar data from /home/.../filebeat/data/registry
2017/04/07 11:35:58.536719 registrar.go:123: INFO States Loaded from registrar: 5
2017/04/07 11:35:58.536745 crawler.go:38: INFO Loading Prospectors: 3
2017/04/07 11:35:58.536853 prospector_log.go:61: INFO Prospector with previous states loaded: 1
2017/04/07 11:35:58.537001 prospector.go:124: INFO Starting prospector of type: log; id: 13661519049829733422
2017/04/07 11:35:58.537094 prospector_log.go:61: INFO Prospector with previous states loaded: 4
2017/04/07 11:35:58.537221 spooler.go:101: INFO Stopping spooler
2017/04/07 11:35:58.537585 prospector.go:232: INFO Stopping Prospector: 13661519049829733422
2017/04/07 11:35:58.537692 registrar.go:236: INFO Starting Registrar
2017/04/07 11:35:58.537725 sync.go:41: INFO Start sending events to output
2017/04/07 11:35:58.537756 spooler.go:63: INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017/04/07 11:35:58.538572 prospector.go:134: INFO Prospector channel stopped
panic: send on closed channel
goroutine 104 [running]:
panic(0x9345e0, 0xc4202a7c10)
/usr/local/go/src/runtime/panic.go:500 +0x1a1
github.com/elastic/beats/filebeat/beater.(*spoolerOutlet).OnEvent(0xc42018f480, 0xc4202977a0, 0x7f6d498d24b0)
/go/src/github.com/elastic/beats/filebeat/beater/channels.go:57 +0x194
github.com/elastic/beats/filebeat/prospector.(*Prospector).updateState(0xc4201e1e00, 0xc4202977a0, 0x27, 0x61394)
/go/src/github.com/elastic/beats/filebeat/prospector/prospector.go:203 +0x6e
github.com/elastic/beats/filebeat/prospector.(*Prospector).startHarvester(0xc4201e1e00, 0xc4200d79b0, 0x27, 0x61394, 0x0, 0xc831e0, 0xc420213a00, 0x1524, 0x1f, 0xed079709e, ...)
/go/src/github.com/elastic/beats/filebeat/prospector/prospector.go:303 +0x17d
github.com/elastic/beats/filebeat/prospector.(*ProspectorLog).harvestExistingFile(0xc4201d3980, 0xc4200d79b0, 0x27, 0x0, 0x0, 0xc831e0, 0xc420213a00, 0x1524, 0x1f, 0xed079709e, ...)
/go/src/github.com/elastic/beats/filebeat/prospector/prospector_log.go:265 +0xb9b
github.com/elastic/beats/filebeat/prospector.(*ProspectorLog).scan(0xc4201d3980)
/go/src/github.com/elastic/beats/filebeat/prospector/prospector_log.go:247 +0x717
github.com/elastic/beats/filebeat/prospector.(*ProspectorLog).Run(0xc4201d3980)
/go/src/github.com/elastic/beats/filebeat/prospector/prospector_log.go:81 +0xce
github.com/elastic/beats/filebeat/prospector.(*Prospector).Run(0xc4201e1e00)
/go/src/github.com/elastic/beats/filebeat/prospector/prospector.go:170 +0x4f
github.com/elastic/beats/filebeat/prospector.(*Prospector).Start.func2(0xc4201e1e00)
/go/src/github.com/elastic/beats/filebeat/prospector/prospector.go:161 +0x51
created by github.com/elastic/beats/filebeat/prospector.(*Prospector).Start
/go/src/github.com/elastic/beats/filebeat/prospector/prospector.go:162 +0x1a1
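For context, "send on closed channel" is the generic Go runtime panic raised when a goroutine sends on a channel that has already been closed — here it looks like a prospector keeps publishing after the spooler shut down. A minimal standalone reproduction of that runtime error (not Filebeat code, just an illustration):

```go
package main

import "fmt"

// sendAfterClose closes a channel and then sends on it, which the Go
// runtime turns into the panic "send on closed channel". The deferred
// recover converts the panic into an error so we can print it.
func sendAfterClose() (err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("%v", r)
		}
	}()
	ch := make(chan int, 1)
	close(ch)
	ch <- 1 // panics: send on closed channel
	return nil
}

func main() {
	fmt.Println(sendAfterClose()) // prints: send on closed channel
}
```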
This is my config file:
filebeat.prospectors:
- input_type: log
  paths:
    - /.../weblogic.log
  fields:
    source_type: weblogic
  multiline.pattern: '^#'
  multiline.negate: true
  multiline.match: after
  multiline.max_lines: 100000
- input_type: log
  paths:
    - /.../*/*.log
  exclude_files: ['isp_.*']
  fields:
    source_type: logback-application
  multiline.pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2} '
  multiline.negate: true
  multiline.match: after
  multiline.max_lines: 100000
  harvester_limit: 5
  close_eof: true
- input_type: log
  paths:
    - /.../*/isp_*.log
  fields:
    source_type: logback-isp-application
  multiline.pattern: '^[0-9]{2}:[0-9]{2}:[0-9]{2}.[0-9]{3} '
  multiline.negate: true
  multiline.match: after
  multiline.max_lines: 100000
  harvester_limit: 1
  close_eof: true

output.elasticsearch:
  hosts: ["http://H1:9200", "http://H2:9200", "http://H3:9200", "http://H4:9200"]
  indices:
    - index: "weblogic-%{+yyyy.MM.dd}"
      when.contains:
        fields.source_type: "weblogic"
    - index: "logback-application-%{+yyyy.MM.dd}"
      when.contains:
        fields.source_type: "logback-application"
    - index: "logback-application-%{+yyyy.MM.dd}"
      when.contains:
        fields.source_type: "logback-performance"
    - index: "logback-application-%{+yyyy.MM.dd}"
      when.contains:
        fields.source_type: "logback-isp-application"
  pipelines:
    - pipeline: "weblogic"
      when.equals:
        fields.source_type: "weblogic"
    - pipeline: "logback-application"
      when.equals:
        fields.source_type: "logback-application"
    - pipeline: "logback-performance"
      when.equals:
        fields.source_type: "logback-performance"
    - pipeline: "logback-isp-application"
      when.equals:
        fields.source_type: "logback-isp-application"
  template.name: "filebeat"
  template.path: "filebeat.template.json"
  template.overwrite: true
  loadbalance: true
  worker: 2
And this is the OS version:
Red Hat Enterprise Linux Server release 5.8 (Tikanga)
I'm using Filebeat 5.3.
I noticed that if I comment out the second prospector, everything works fine, but I don't know why.
Thanks