Elasticsearch.Net.ElasticsearchClientException: The remote server returned an error: (413) Request Entity Too Large

Hi,
I am trying to send a bulk request of about 1 GB, but it fails with the error below. I need help.

Elasticsearch.Net.ElasticsearchClientException: The remote server returned an error: (413) Request Entity Too Large. Call: Status code 413 from: POST /_bulk?_source=false&refresh=false&pipeline=attachments ---> System.Net.WebException: The remote server returned an error: (413) Request Entity Too Large.
at System.Net.HttpWebRequest.GetResponse()
at Elasticsearch.Net.HttpWebRequestConnection.Request[TResponse](RequestData requestData)
--- End of inner exception stack trace ---

My elasticsearch.yml configuration (Elasticsearch 6.5):

bootstrap.memory_lock: true
indices.recovery.max_bytes_per_sec: 2g
indices.memory.index_buffer_size: 60%
thread_pool.bulk.queue_size: 3000
thread_pool.index.queue_size: 3000
http.max_content_length: 15000mb
http.max_header_size: 1500mb
http.max_initial_line_length: 2500mb
http.port: 9200
node.data: true
node.ingest: true
node.master: true
node.max_local_storage_nodes: 1
xpack.license.self_generated.type: basic
xpack.security.enabled: false

Can you help me?

Send smaller bulk requests. It does not make sense to send such big requests, IMO.


My bulk requests are e-mails with attachments.

I use an attachment pipeline with the ingest-attachment processor plugin.

I have a volume of 3 million e-mails to index.

Thanks for the reply. :slight_smile:

Send the attachments one by one, or one e-mail per request...

At most, I would not send bulk requests with more than 50 MB of data to Elasticsearch.
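The idea above could be sketched like this (a minimal illustration in Python; `chunk_bulk`, the empty `index` action, and the 50 MB cap are assumptions for the sketch, not part of any client library):

```python
import json

# Hypothetical helper: split documents into bulk batches whose serialized
# size stays under a byte limit, instead of sending one giant request.
def chunk_bulk(docs, max_bytes=50 * 1024 * 1024):
    batches, current, current_size = [], [], 0
    for doc in docs:
        # Each bulk item is an action line plus a source line (NDJSON).
        action = json.dumps({"index": {}})
        source = json.dumps(doc)
        size = len(action.encode()) + len(source.encode()) + 2  # + newlines
        if current and current_size + size > max_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(doc)
        current_size += size
    if current:
        batches.append(current)
    return batches

# Each batch would then be sent as its own _bulk request.
batches = chunk_bulk([{"a": 1}, {"b": 2}, {"c": 3}], max_bytes=50)
```

The same batching logic applies in any client, including Elasticsearch.Net/NEST: accumulate items until the serialized payload approaches your limit, flush, and start a new batch.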

The other solution is to do the text extraction outside Elasticsearch and send just the extracted text to Elasticsearch.
FSCrawler might help, although I am aware of its limitations as well.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.