Hi,
when I start filebeat reading from stdin, the pipeline sometimes ends with exit status 141 (SIGPIPE) for zcat, even though filebeat itself exits 0.
# run 1 (log file: filebeat.1)
zcat somefile.gz | filebeat -c filebeat.yml --once; rc=${PIPESTATUS[*]}; echo $rc
0 0
# run 2 (log file: filebeat)
zcat somefile.gz | filebeat -c filebeat.yml --once; rc=${PIPESTATUS[*]}; echo $rc
141 0
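As far as I can tell, 141 means zcat was killed by SIGPIPE (128 + 13), i.e. filebeat closed its stdin before zcat had finished writing. This is just a sketch of how I inspect the statuses (bash-specific, since PIPESTATUS is a bash array; same files as above):

#!/usr/bin/env bash
# run the pipeline and keep the exit status of every stage
zcat somefile.gz | filebeat -c filebeat.yml --once
rc=("${PIPESTATUS[@]}")            # rc[0] = zcat, rc[1] = filebeat
echo "zcat=${rc[0]} filebeat=${rc[1]}"

# exit codes above 128 mean "terminated by signal (code - 128)"
if [ "${rc[0]}" -gt 128 ]; then
    echo "zcat killed by SIG$(kill -l $((rc[0] - 128)))"   # prints SIGPIPE for 141
fi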
filebeat version
filebeat version 6.4.1 (amd64), libbeat 6.4.1 [37b5f2d2a20f2734b2373a454b4b4cbb2627e841 built 2018-09-13 21:25:47 +0000 UTC]
Logs:

filebeat.1:
2018-10-18T10:22:27.732+0200 INFO elasticsearch/client.go:163 Elasticsearch url: https://XXX
2018-10-18T10:22:27.733+0200 INFO pipeline/module.go:98 Beat name: XXX
2018-10-18T10:22:27.733+0200 INFO instance/beat.go:367 filebeat start running.
2018-10-18T10:22:27.733+0200 INFO [monitoring] log/log.go:114 Starting metrics logging every 30s
2018-10-18T10:22:27.733+0200 INFO registrar/registrar.go:134 Loading registrar data from XXX
2018-10-18T10:22:27.734+0200 INFO registrar/registrar.go:141 States Loaded from registrar: 6
2018-10-18T10:22:27.734+0200 INFO crawler/crawler.go:72 Loading Inputs: 1
2018-10-18T10:22:27.734+0200 INFO input/input.go:114 Starting input of type: stdin; ID: 5005293166929256556
2018-10-18T10:22:27.734+0200 INFO input/input.go:167 Stopping Input: 5005293166929256556
2018-10-18T10:22:27.734+0200 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2018-10-18T10:22:27.734+0200 INFO log/harvester.go:251 Harvester started for file: -
2018-10-18T10:22:27.734+0200 INFO beater/filebeat.go:387 Running filebeat once. Waiting for completion ...
2018-10-18T10:22:27.734+0200 INFO beater/filebeat.go:389 All data collection completed. Shutting down.
2018-10-18T10:22:27.734+0200 INFO crawler/crawler.go:139 Stopping Crawler
2018-10-18T10:22:27.734+0200 INFO crawler/crawler.go:149 Stopping 1 inputs
2018-10-18T10:22:27.734+0200 INFO crawler/crawler.go:165 Crawler stopped
2018-10-18T10:22:27.818+0200 INFO pipeline/output.go:95 Connecting to backoff(elasticsearch(XXX))
2018-10-18T10:22:27.863+0200 INFO elasticsearch/client.go:712 Connected to Elasticsearch version 6.1.1
2018-10-18T10:22:27.865+0200 INFO template/load.go:129 Template already exists and will not be overwritten.
2018-10-18T10:22:27.865+0200 INFO pipeline/output.go:105 Connection to backoff(elasticsearch(XXX)) established
2018-10-18T10:22:43.072+0200 INFO log/harvester.go:274 End of file reached: -. Closing because close_eof is enabled.
2018-10-18T10:22:44.337+0200 INFO beater/signalwait.go:93 Continue shutdown: All enqueued events being published.
2018-10-18T10:22:44.337+0200 INFO registrar/registrar.go:356 Stopping Registrar
2018-10-18T10:22:44.337+0200 INFO registrar/registrar.go:282 Ending Registrar
2018-10-18T10:22:44.343+0200 INFO [monitoring] log/log.go:150 Uptime: 16.623762982s
2018-10-18T10:22:44.343+0200 INFO [monitoring] log/log.go:127 Stopping metrics logging.
2018-10-18T10:22:44.343+0200 INFO instance/beat.go:373 filebeat stopped.

filebeat:
2018-10-18T10:30:43.546+0200 INFO elasticsearch/client.go:163 Elasticsearch url: XXX
2018-10-18T10:30:43.547+0200 INFO pipeline/module.go:98 Beat name: XXX
2018-10-18T10:30:43.547+0200 INFO [monitoring] log/log.go:114 Starting metrics logging every 30s
2018-10-18T10:30:43.547+0200 INFO instance/beat.go:367 filebeat start running.
2018-10-18T10:30:43.547+0200 INFO registrar/registrar.go:134 Loading registrar data from XXX
2018-10-18T10:30:43.547+0200 INFO registrar/registrar.go:141 States Loaded from registrar: 6
2018-10-18T10:30:43.547+0200 INFO crawler/crawler.go:72 Loading Inputs: 1
2018-10-18T10:30:43.548+0200 INFO input/input.go:114 Starting input of type: stdin; ID: 5005293166929256556
2018-10-18T10:30:43.548+0200 INFO input/input.go:167 Stopping Input: 5005293166929256556
2018-10-18T10:30:43.548+0200 INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2018-10-18T10:30:43.548+0200 INFO beater/filebeat.go:387 Running filebeat once. Waiting for completion ...
2018-10-18T10:30:43.548+0200 INFO beater/filebeat.go:389 All data collection completed. Shutting down.
2018-10-18T10:30:43.548+0200 INFO crawler/crawler.go:139 Stopping Crawler
2018-10-18T10:30:43.548+0200 INFO crawler/crawler.go:149 Stopping 1 inputs
2018-10-18T10:30:43.548+0200 INFO crawler/crawler.go:165 Crawler stopped
2018-10-18T10:30:43.548+0200 INFO log/harvester.go:251 Harvester started for file: -
2018-10-18T10:30:43.548+0200 INFO beater/signalwait.go:93 Continue shutdown: All enqueued events being published.
2018-10-18T10:30:43.548+0200 INFO registrar/registrar.go:356 Stopping Registrar
2018-10-18T10:30:43.548+0200 INFO registrar/registrar.go:282 Ending Registrar
2018-10-18T10:30:43.554+0200 INFO harvester/forwarder.go:52 Input outlet closed
2018-10-18T10:30:43.558+0200 INFO [monitoring] log/log.go:150 Uptime: 26.752154ms
2018-10-18T10:30:43.558+0200 INFO [monitoring] log/log.go:127 Stopping metrics logging.
2018-10-18T10:30:43.558+0200 INFO instance/beat.go:373 filebeat stopped.
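For now I'm tempted to treat 141 from zcat as non-fatal as long as filebeat itself exits 0, roughly like the sketch below (not sure it is the right approach). What worries me is the second log: the input is stopped before the harvester even starts and uptime is only ~27ms, so in that run the data may never have been read at all. Is this expected behaviour of the stdin input with --once?

zcat somefile.gz | filebeat -c filebeat.yml --once
rc=("${PIPESTATUS[@]}")

# filebeat (rc[1]) must succeed; for zcat (rc[0]) accept 0 or 141 (SIGPIPE),
# since the SIGPIPE only means filebeat closed stdin before zcat was done
if [ "${rc[1]}" -eq 0 ] && { [ "${rc[0]}" -eq 0 ] || [ "${rc[0]}" -eq 141 ]; }; then
    echo "pipeline ok"
else
    echo "pipeline failed: zcat=${rc[0]} filebeat=${rc[1]}" >&2
    exit 1
fi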
Thanks
krrz