Logstash 7.6.2 (EPIPE) Broken pipe Error (pipeline stopped processing new events)

Hi All,

I just upgraded Logstash from version 6.2.3 to 7.6.2 via RPM, and I have 9 pipelines. Logstash pushes logs to Elasticsearch Cloud version 7.4. After the upgrade, logs can be seen in Kibana, but every day the error below appears in the Logstash log and logs stop showing up in Kibana.

The Logstash service/instance goes down every day around 6-9 AM. Could anyone help me figure out what causes the error below and why all the pipelines stop working?

Log before the Logstash service went down:

[2020-06-23T22:46:30,274][WARN ][logstash.outputs.elasticsearch][internal-fuse1] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"esbuat-int-fuse1-2020.06", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x47b34099>], :response=>{"index"=>{"_index"=>"esbuat-int-fuse1-2020.06", "_type"=>"_doc", "_id"=>"mlyl4XIBmyJ2rBHTDvc6", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [esbuat-int-fuse1-2020.06] has been exceeded"}}}}
[2020-06-23T22:50:20,201][INFO ][logstash.inputs.jdbc     ][esbdata] (0.000820s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:00:20,199][INFO ][logstash.inputs.jdbc     ][esbdata] (0.000958s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:02:39,554][WARN ][logstash.outputs.elasticsearch][internal-fuse3] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"esbuat-int-fuse3-2020.06", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x78c10e73>], :response=>{"index"=>{"_index"=>"esbuat-int-fuse3-2020.06", "_type"=>"_doc", "_id"=>"-mCz4XIBmyJ2rBHT2BO8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [jsondoc.status] of type [long] in document with id '-mCz4XIBmyJ2rBHT2BO8'. Preview of field's value: 'success'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"success\""}}}}}
[2020-06-23T23:10:20,229][INFO ][logstash.inputs.jdbc     ][esbdata] (0.001145s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:20:20,380][INFO ][logstash.inputs.jdbc     ][esbdata] (0.000928s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:30:20,156][INFO ][logstash.inputs.jdbc     ][esbdata] (0.000997s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:40:20,230][INFO ][logstash.inputs.jdbc     ][esbdata] (0.000869s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:48:20,534][WARN ][logstash.outputs.elasticsearch][internal-fuse3] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"esbuat-int-fuse3-2020.06", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x53951624>], :response=>{"index"=>{"_index"=>"esbuat-int-fuse3-2020.06", "_type"=>"_doc", "_id"=>"Wmfd4XIBmyJ2rBHTq2eu", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [jsondoc.status] of type [long] in document with id 'Wmfd4XIBmyJ2rBHTq2eu'. Preview of field's value: 'success'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"success\""}}}}}
[2020-06-23T23:50:20,191][INFO ][logstash.inputs.jdbc     ][esbdata] (0.001075s) select top 20000 * from t_mail_item_status order by id desc

Logstash stops:

[2020-06-24T05:40:15,641][ERROR][org.logstash.execution.WorkerLoop][internal-fuse3] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.SystemCallError: (EPIPE) Broken pipe - <STDOUT>
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1477) ~[jruby-complete-9.2.9.0.jar:?]
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1432) ~[jruby-complete-9.2.9.0.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:43) ~[?:?]
        at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1814) ~[jruby-complete-9.2.9.0.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:42) ~[?:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:87) ~[?:?]
        at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:118) ~[logstash-core.jar:?]
        at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101) ~[logstash-core.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:262) ~[?:?]
[2020-06-24T05:40:15,617][ERROR][org.logstash.execution.WorkerLoop][internal-fuse2] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.SystemCallError: (EPIPE) Broken pipe - <STDOUT>
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1477) ~[jruby-complete-9.2.9.0.jar:?]
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1432) ~[jruby-complete-9.2.9.0.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:43) ~[?:?]
        at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1814) ~[jruby-complete-9.2.9.0.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:42) ~[?:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:87) ~[?:?]
        at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:118) ~[logstash-core.jar:?]
        at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101) ~[logstash-core.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:262) ~[?:?]

Thanks,
Usman

ES enforces a maximum number of fields in an index to avoid mapping explosion; see here.

If you need more than 1000 fields, you need to update your index template settings, for example:
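Something along these lines (the index name, template name, and the new limit of 2000 are just examples, adjust them to your own indices). For the index that is already complaining you can raise the limit directly:

PUT esbuat-int-fuse1-2020.06/_settings
{
  "index.mapping.total_fields.limit": 2000
}

and to have new monthly indices pick it up, put the same setting in the index template:

PUT _template/esbuat-int-fuse1
{
  "index_patterns": ["esbuat-int-fuse1-*"],
  "settings": {
    "index.mapping.total_fields.limit": 2000
  }
}

Keep in mind the limit exists for a reason, so if possible it is better to reduce the number of fields you index than to keep raising it.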

Hi @ptamba, thanks for your quick response and help.
OK, let me try to stop sending this log to Elasticsearch; I will put a condition in Logstash to ignore it (something along the lines of the sketch at the end of this post).
What I don't get is: if this is really the cause, why does it make all 9 of my Logstash pipelines stop sending to Elasticsearch? And the weird thing is that it happens the next day.

I will test and update again here.
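For reference, this is the kind of conditional I have in mind (the field names come from the error above; whether to drop or rename, and the status_text target name, are just my own placeholders):

filter {
  # jsondoc.status is mapped as long in the index, so a value like "success"
  # makes the bulk request fail; move non-numeric values to a separate field
  if [jsondoc][status] and [jsondoc][status] !~ /^[0-9]+$/ {
    mutate {
      rename => { "[jsondoc][status]" => "[jsondoc][status_text]" }
    }
  }
}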

Hi All,

I'm really frustrated with the Logstash error below, which stops all events on all my pipelines (I have about 7 pipelines using different ports). As in my initial issue: there was no problem at all with Logstash version 6.2.3, everything worked as expected with all my filter configs.

I tested 7.6.2 and hit the same issue, then I did a fresh installation to move up to Logstash 7.8, which has the same problem.

Could anyone help with what could be wrong here, and why it STOPS all my pipelines? This is so weird on the new Logstash version.

There is no error in Logstash, and then it suddenly stops sending events, daily or every 2 days.

[2020-07-10T02:03:06,537][ERROR][org.logstash.execution.WorkerLoop][internal-fuse1] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.SystemCallError: (EPIPE) Broken pipe - <STDOUT>
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1475) ~[jruby-complete-9.2.11.1.jar:?]
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1430) ~[jruby-complete-9.2.11.1.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:43) ~[?:?]
        at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809) ~[jruby-complete-9.2.11.1.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:42) ~[?:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:103) ~[?:?]
        at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:138) ~[logstash-core.jar:?]
        at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121) ~[logstash-core.jar:?]
        at RUBY.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:278) ~[?:?]
[2020-07-10T02:03:06,788][ERROR][org.logstash.execution.WorkerLoop][internal-fuse3] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.SystemCallError: (EPIPE) Broken pipe - <STDOUT>
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1475) ~[jruby-complete-9.2.11.1.jar:?]
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1430) ~[jruby-complete-9.2.11.1.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:43) ~[?:?]
        at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809) ~[jruby-complete-9.2.11.1.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:42) ~[?:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:103) ~[?:?]
        at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:138) ~[logstash-core.jar:?]
        at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:121) ~[logstash-core.jar:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:278) ~[?:?]
[2020-07-10T02:03:06,728][ERROR][org.logstash.execution.WorkerLoop][cpi-server] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.SystemCallError: (EPIPE) Broken pipe - <STDOUT>
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1475) ~[jruby-complete-9.2.11.1.jar:?]
        at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1430) ~[jruby-complete-9.2.11.1.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:43) ~[?:?]
        at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1809) ~[jruby-complete-9.2.11.1.jar:?]
        at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:42) ~[?:?]
        at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:103) ~[?:?]
        at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:138) ~[logstash-core.jar:?]
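One thing I notice is that every one of these traces dies inside the logstash-output-stdout plugin writing to <STDOUT>. As a next step I'm planning to comment out the debug stdout output in my pipeline configs and keep only the elasticsearch output, to see whether the EPIPE goes away. Roughly (the elasticsearch block is just a placeholder for my existing settings):

output {
  # the stdout output was only there for debugging; if whatever is reading
  # Logstash's stdout goes away, every write to it fails with EPIPE
  # stdout { codec => rubydebug }

  elasticsearch {
    # existing elasticsearch settings stay unchanged
  }
}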

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.