Error: Elasticsearch encountered a retryable error. Will retry with exponential backoff (code 400)


#1

[ERROR] 2018-07-02 16:23:20.482 [[main]>worker5] elasticsearch - Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"http://ip:9200/_bulk"}

logstash 6.0.0, elasticsearch 6.0.0

I increased the Elasticsearch heap size in jvm.options from 1 GB to 26 GB. This improved the numbers, but there are still errors.

now:

-- loading eventually stops
-- logs loaded: 44419
-- error count: 2058, continues to increase as new logs are processed
-- index docs.deleted: 7701

before:

-- loading eventually stops
-- logs loaded: 8412
-- error count: 12600, continues to increase as new logs are processed
-- index docs.deleted: 6109

Run the Logstash job.

input {
  file {
    path => [
      "/path/nifi-app.log",
      "/path/nifi-user.log",
      "/path/nifi-bootstrap.log"
    ]
    type => "nifiapp"
    sincedb_path => "/path/nifiapp"
  }
}
filter {
  if [type] == "nifiapp" {
    grok {
      match => {
        # Square brackets are literal characters in grok and must be escaped
        "message" => "%{GREEDYDATA:log_date} %{GREEDYDATA:log_time} %{EMAILLOCALPART:log_level} \[%{GREEDYDATA:log_type}\] %{GREEDYDATA:log_text}"
      }
    }
  }
}
output {
  if [type] == "nifiapp" {
    elasticsearch {
      hosts => "ip:9200"
      index => "nifiapp"
      document_id => "%{log_date}%{log_time}%{log_level}%{log_type}%{log_text}"
    }
  }
}

Run the NiFi job to copy the log files from the NiFi server to the ELK server.

At this point, I get this error:

[ERROR] 2018-07-02 16:23:20.482 [[main]>worker5] elasticsearch - Encountered a retryable error. Will Retry with exponential backoff {:code=>400, :url=>"http://ip:9200/_bulk"}

curl -XGET 'ip:9200/_cat/indices?v&pretty'

index nifiapp, docs.count 44419, docs.deleted 7701


(Thiago Souza) #2

It is hard to tell what it is exactly without more logs from either Logstash or Elasticsearch. An error with code 400 indicates that Logstash tried to index invalid data, so only more logs will tell exactly what the error is.

But judging from your configuration, I would say there is a good chance it is because your document_id is built from fields that produce really long values (Elasticsearch rejects _id values longer than 512 bytes). If you really want to use those fields to build the document's id, you should consider building a hash of these fields using the Fingerprint filter plugin.
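A minimal sketch of that approach, using the field names from the config above (the fingerprint filter ships with Logstash, but verify the exact options against your plugin version):

filter {
  fingerprint {
    # Hash the same fields currently concatenated into document_id
    source => ["log_date", "log_time", "log_level", "log_type", "log_text"]
    concatenate_sources => true
    method => "SHA1"
    # Stash the hash in @metadata so it is not indexed as a field
    target => "[@metadata][doc_id]"
  }
}
output {
  elasticsearch {
    hosts => "ip:9200"
    index => "nifiapp"
    # A SHA1 hex digest is a fixed 40 characters, well under the _id limit
    document_id => "%{[@metadata][doc_id]}"
  }
}

This keeps the same deduplication behavior (identical log lines map to the same _id) while keeping the _id short and fixed-length.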


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.