Hi All,
I just upgraded Logstash from version 6.2.3 to 7.6.2 via RPM upgrade, and I have 9 pipelines. Logstash pushes logs to Elasticsearch Cloud version 7.4. After the upgrade, logs can be seen in Kibana, but every day the errors below appear in the Logstash log and logs stop showing up in Kibana.
The Logstash service/instance goes down every day around 6-9 AM. Could anyone help me understand what is causing the errors below and why all the pipelines stop working?
Logs before the Logstash service goes down:
[2020-06-23T22:46:30,274][WARN ][logstash.outputs.elasticsearch][internal-fuse1] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"esbuat-int-fuse1-2020.06", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x47b34099>], :response=>{"index"=>{"_index"=>"esbuat-int-fuse1-2020.06", "_type"=>"_doc", "_id"=>"mlyl4XIBmyJ2rBHTDvc6", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"Limit of total fields [1000] in index [esbuat-int-fuse1-2020.06] has been exceeded"}}}}
[2020-06-23T22:50:20,201][INFO ][logstash.inputs.jdbc ][esbdata] (0.000820s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:00:20,199][INFO ][logstash.inputs.jdbc ][esbdata] (0.000958s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:02:39,554][WARN ][logstash.outputs.elasticsearch][internal-fuse3] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"esbuat-int-fuse3-2020.06", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x78c10e73>], :response=>{"index"=>{"_index"=>"esbuat-int-fuse3-2020.06", "_type"=>"_doc", "_id"=>"-mCz4XIBmyJ2rBHT2BO8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [jsondoc.status] of type [long] in document with id '-mCz4XIBmyJ2rBHT2BO8'. Preview of field's value: 'success'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"success\""}}}}}
[2020-06-23T23:10:20,229][INFO ][logstash.inputs.jdbc ][esbdata] (0.001145s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:20:20,380][INFO ][logstash.inputs.jdbc ][esbdata] (0.000928s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:30:20,156][INFO ][logstash.inputs.jdbc ][esbdata] (0.000997s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:40:20,230][INFO ][logstash.inputs.jdbc ][esbdata] (0.000869s) select top 20000 * from t_mail_item_status order by id desc
[2020-06-23T23:48:20,534][WARN ][logstash.outputs.elasticsearch][internal-fuse3] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"esbuat-int-fuse3-2020.06", :routing=>nil, :_type=>"_doc"}, #<LogStash::Event:0x53951624>], :response=>{"index"=>{"_index"=>"esbuat-int-fuse3-2020.06", "_type"=>"_doc", "_id"=>"Wmfd4XIBmyJ2rBHTq2eu", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [jsondoc.status] of type [long] in document with id 'Wmfd4XIBmyJ2rBHTq2eu'. Preview of field's value: 'success'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"For input string: \"success\""}}}}}
[2020-06-23T23:50:20,191][INFO ][logstash.inputs.jdbc ][esbdata] (0.001075s) select top 20000 * from t_mail_item_status order by id desc
Logs when Logstash stops:
[2020-06-24T05:40:15,641][ERROR][org.logstash.execution.WorkerLoop][internal-fuse3] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.SystemCallError: (EPIPE) Broken pipe - <STDOUT>
at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1477) ~[jruby-complete-9.2.9.0.jar:?]
at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1432) ~[jruby-complete-9.2.9.0.jar:?]
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:43) ~[?:?]
at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1814) ~[jruby-complete-9.2.9.0.jar:?]
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:42) ~[?:?]
at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:87) ~[?:?]
at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:118) ~[logstash-core.jar:?]
at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101) ~[logstash-core.jar:?]
at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:262) ~[?:?]
[2020-06-24T05:40:15,617][ERROR][org.logstash.execution.WorkerLoop][internal-fuse2] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.
org.jruby.exceptions.SystemCallError: (EPIPE) Broken pipe - <STDOUT>
at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1477) ~[jruby-complete-9.2.9.0.jar:?]
at org.jruby.RubyIO.write(org/jruby/RubyIO.java:1432) ~[jruby-complete-9.2.9.0.jar:?]
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:43) ~[?:?]
at org.jruby.RubyArray.each(org/jruby/RubyArray.java:1814) ~[jruby-complete-9.2.9.0.jar:?]
at usr.share.logstash.vendor.bundle.jruby.$2_dot_5_dot_0.gems.logstash_minus_output_minus_stdout_minus_3_dot_1_dot_4.lib.logstash.outputs.stdout.multi_receive_encoded(/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-stdout-3.1.4/lib/logstash/outputs/stdout.rb:42) ~[?:?]
at usr.share.logstash.logstash_minus_core.lib.logstash.outputs.base.multi_receive(/usr/share/logstash/logstash-core/lib/logstash/outputs/base.rb:87) ~[?:?]
at org.logstash.config.ir.compiler.OutputStrategyExt$AbstractOutputStrategyExt.multi_receive(org/logstash/config/ir/compiler/OutputStrategyExt.java:118) ~[logstash-core.jar:?]
at org.logstash.config.ir.compiler.AbstractOutputDelegatorExt.multi_receive(org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101) ~[logstash-core.jar:?]
at usr.share.logstash.logstash_minus_core.lib.logstash.java_pipeline.start_workers(/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:262) ~[?:?]
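From the stack trace, the crash happens inside the stdout output plugin (logstash-output-stdout 3.1.4), not in the elasticsearch output. For reference, the pipelines include a stdout output for debugging, roughly like this (simplified sketch; the exact codec in my config may differ):

output {
  # Debug output that writes each event to the Logstash process's STDOUT.
  # The "(EPIPE) Broken pipe - <STDOUT>" error above means whatever was
  # reading that STDOUT went away while Logstash was still writing to it.
  stdout { codec => rubydebug }
}
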
Thanks,
Usman