Encountered a retryable error. Will Retry with exponential backoff code=>400

Hello
I am using Logstash 7.0.1, and while running the config file below I continuously get a retryable 400 error:

input {
  elasticsearch {
    hosts => ["https://ful-elastic.com:443/_bulk"]
    user => "username"
    password => "password"
    index => "testing"
    size => 100
    scroll => "1m"
  }
}

filter {
}

output {
  amazon_es {
    hosts => ["https://example.com:443"]
    region => "us-east-1"
    aws_access_key_id => "accessKeyId"
    aws_secret_access_key => "password"
    index => "testing"
  }
}
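Editor's note, an observation rather than a confirmed fix: the input's `hosts` includes a `/_bulk` path, and the error logs below show pool URLs ending in `/_bulk` as well. The `hosts` option of both the elasticsearch input and the amazon_es output expects a base URL; the plugins append API paths such as `/_bulk` themselves, so configuring the path in `hosts` can produce malformed request URLs. A sketch with bare host URLs (hostnames are the placeholders from the post):

```conf
input {
  elasticsearch {
    hosts    => ["https://ful-elastic.com:443"]   # base URL, no /_bulk path
    user     => "username"
    password => "password"
    index    => "testing"
    size     => 100
    scroll   => "1m"
  }
}

output {
  amazon_es {
    hosts                 => ["https://example.com:443"]  # base URL only
    region                => "us-east-1"
    aws_access_key_id     => "accessKeyId"
    aws_secret_access_key => "password"
    index                 => "testing"
  }
}
```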

This is the error we are getting continuously:

[2019-08-29T19:55:49,399][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://example.com:443/_bulk"}
[2019-08-29T19:55:49,620][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://example.com:443/_bulk"}
[2019-08-29T19:55:52,818][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://example.com:443/_bulk"}

I can't find the reason why it keeps happening.
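For context on what the output plugin is sending when this error fires (a hedged sketch, not the plugin's actual code): the `_bulk` endpoint expects an NDJSON body, one action line plus one source line per document, terminated by a trailing newline. A request whose URL or body does not match this shape is rejected with a 400:

```python
import json

def bulk_body(index: str, docs: list) -> str:
    """Build an NDJSON _bulk body: an action line then a source line per doc."""
    lines = []
    for doc in docs:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(doc))
    # The _bulk API requires a trailing newline after the last line.
    return "\n".join(lines) + "\n"

body = bulk_body("testing", [{"field": "value"}])
```

Sending such a body to `https://<domain>/_bulk` with `Content-Type: application/x-ndjson` is the request Logstash makes; if a plain `curl` of it also returns 400, the cluster's response body usually names the exact problem.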

Does the "testing" index exist in aws? I've seen similar errors when the index didn't exist and it couldn't be automatically created for some reason.

Is there a more informative error in the elasticsearch logs?
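To answer the existence question above, one quick check is a `HEAD /<index>` request, which returns 200 if the index exists and 404 if it doesn't. A minimal sketch (the endpoint is a placeholder for your AWS ES domain; signed requests or basic auth would need to be added for a secured cluster):

```python
import urllib.request
import urllib.error

def interpret_head_status(code: int) -> bool:
    """HEAD /<index> semantics: 200 = index exists, 404 = index missing."""
    if code == 200:
        return True
    if code == 404:
        return False
    raise RuntimeError(f"unexpected status {code}")

def index_exists(base_url: str, index: str) -> bool:
    """Issue HEAD <base_url>/<index> and interpret the status code."""
    req = urllib.request.Request(f"{base_url}/{index}", method="HEAD")
    try:
        with urllib.request.urlopen(req) as resp:
            return interpret_head_status(resp.status)
    except urllib.error.HTTPError as e:
        return interpret_head_status(e.code)

# Example (placeholder domain):
# index_exists("https://example.com:443", "testing")
```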

Yes. Even after creating the "testing" index in AWS ES with the same mapping type, it throws the same retryable 400 error.

Hey, these are the entire logs we got on our end:
Sending Logstash logs to /Users/thilak/Desktop/logstash-7.0.1/logs which is now configured via log4j2.properties

[2019-08-30T09:55:24,262][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified

[2019-08-30T09:55:24,306][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.0.1"}

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/http_client/pool.rb:33: warning: already initialized constant ROOT_URI_PATH

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/http_client/pool.rb:36: warning: already initialized constant DEFAULT_OPTIONS

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/http_client/pool.rb:160: warning: already initialized constant ES1_SNIFF_RE_URL

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/http_client/manticore_adapter.rb:7: warning: already initialized constant DEFAULT_HEADERS

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/http_client.rb:24: warning: already initialized constant TARGET_BULK_BYTES

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/common.rb:8: warning: already initialized constant DOC_DLQ_CODES

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/common.rb:9: warning: already initialized constant DOC_SUCCESS_CODES

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/common.rb:10: warning: already initialized constant DOC_CONFLICT_CODE

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/common.rb:16: warning: already initialized constant VERSION_TYPES_PERMITTING_CONFLICT

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/common.rb:133: warning: already initialized constant VALID_HTTP_ACTIONS

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/common.rb:247: warning: already initialized constant DEFAULT_EVENT_TYPE_ES6

/Users/thilak/Desktop/logstash-7.0.1/vendor/bundle/jruby/2.5.0/gems/logstash-output-amazon_es-6.4.2-java/lib/logstash/outputs/amazon_es/common.rb:248: warning: already initialized constant DEFAULT_EVENT_TYPE_ES7

url template
{:scheme=>"https", :user=>nil, :password=>nil, :host=>"URLTEMPLATE", :port=>443, :path=>nil}

[2019-08-30T09:55:40,246][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://example.com:443/_bulk]}}

[2019-08-30T09:55:40,265][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://search-full-es-test-ow3zdiqyfv2dxu75hxwczayzxi.us-east-1.es.amazonaws.com:443/, :path=>"/"}

[2019-08-30T09:55:42,813][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://example.com:443/_bulk/"}

[2019-08-30T09:55:43,196][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}

[2019-08-30T09:55:43,201][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}

[2019-08-30T09:55:43,237][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://example.com:443/_bulk"]}

[2019-08-30T09:55:43,263][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}

[2019-08-30T09:55:43,280][INFO ][logstash.javapipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x1bef95e0 run>"}

[2019-08-30T09:55:43,293][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}

[2019-08-30T09:55:44,137][INFO ][logstash.javapipeline ] Pipeline started {"pipeline.id"=>"main"}

[2019-08-30T09:55:44,233][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

[2019-08-30T09:55:44,845][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

[2019-08-30T09:55:48,895][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://example.com:443/_bulk"}
[2019-08-30T09:55:48,895][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://example.com:443/_bulk"}
[2019-08-30T09:55:48,895][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://example.com:443/_bulk"}
[2019-08-30T09:55:48,895][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://example.com:443/_bulk"}
[2019-08-30T09:55:48,895][ERROR][logstash.outputs.elasticsearch] Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"https://example.com:443/_bulk"}
