Hello,
On Elasticsearch 1.7.3
On a laptop with 64 GB of RAM
Number of instances: 3 × 12 GB
File size: 405 MB
My config file:
path:
  logs: /var/log/company/elasticsearch/instance1
  data: /opt/company/elasticsearch/instance1
cluster.name: company
node.name: "company-1"
index.merge.scheduler.max_thread_count: 1
threadpool.bulk.type: fixed
threadpool.bulk.size: 8
threadpool.bulk.queue_size: 300
indices.memory.index_buffer_size: 30%
index.refresh_interval: 30s
index.fielddata.cache: node
indices.fielddata.cache.size: 40%
http.compression: true
index.number_of_replicas: 0
bootstrap.mlockall: true
script.indexed: on
script.dynamic: on
script.file: on
script.inline: on
script.groovy.sandbox.enabled: true
http.max_content_length: 1gb
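Since there are three instances on the laptop, one way to confirm that each of them actually applied this file is to read the per-node settings from the `_nodes` API rather than grepping the raw response body. A minimal sketch of that idea (the node names and the abridged sample response below are hypothetical, not taken from a real cluster):

```python
import json

# Hypothetical, abridged _nodes response: one node picked up the 1gb
# override, the other has no explicit setting and so runs the default.
sample = json.loads("""
{"nodes": {
  "abc": {"name": "company-1",
          "settings": {"http": {"max_content_length": "1gb"}}},
  "def": {"name": "company-2",
          "settings": {}}
}}""")

# Report each node's effective http.max_content_length.
for node in sample["nodes"].values():
    length = node["settings"].get("http", {}).get("max_content_length",
                                                  "100mb (default)")
    print(node["name"], "->", length)
```

In a real check, `sample` would come from `curl -XGET 'http://127.0.0.1:9200/_nodes'`; the point is to look at each node entry separately instead of the concatenated output.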
When I curl for debugging with curl -XPOST 'http://127.0.0.1:9200/test/test/1' -d @file, I get:
curl: (56) Recv failure: Connection reset by peer
Elasticsearch log:
Caught exception while handling client http traffic, closing connection [id: 0x18d0d451, /127.0.0.1:58268 => /127.0.0.1:9200]
org.elasticsearch.common.netty.handler.codec.frame.TooLongFrameException: HTTP content length exceeded 104857600 bytes.
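For what it's worth, the byte count in that exception is exactly Elasticsearch's built-in default for http.max_content_length (100mb), which suggests the node answering on 127.0.0.1:9200 never picked up the 1gb override. A quick check of the numbers:

```python
# The figure from the TooLongFrameException is the 100mb default limit.
default_limit = 100 * 1024 * 1024       # = 104857600 bytes
configured_limit = 1 * 1024 * 1024 * 1024  # the 1gb set in the config file
file_size = 405 * 1024 * 1024           # the ~405M test file

print(default_limit)                    # matches the exception message
print(file_size < configured_limit)     # the upload fits under 1gb...
print(file_size < default_limit)        # ...but not under the 100mb default
```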
Now on my server:
On Elasticsearch 1.7.3
On a server with 256 GB of RAM
Number of instances: 6 × 12 GB
File size: 405 MB
curl -XPOST 'http://127.0.0.1:9200/test/test/1' -d @file returns success, and there is no error in the ES log.
For information: on both the server side and the client side, when I run curl -XGET 'http://localhost:9200/_nodes' | grep http.max_content_length, I see "http.max_content_length":"1gb".
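Note that a plain grep on the `_nodes` output succeeds as soon as any single node reports the 1gb value, so it does not prove that all instances on the laptop agree. A small sketch of a stricter check (the node names and values below are made-up examples, e.g. as they might come back from curl 'localhost:9200/_nodes?flat_settings=true'):

```python
# Hypothetical per-node values of http.max_content_length.
per_node = {
    "company-1": "1gb",
    "company-2": "1gb",
    "company-3": "100mb",  # e.g. a node not restarted after the config change
}

# Flag any node whose effective limit differs from the intended 1gb.
mismatched = {name: v for name, v in per_node.items() if v != "1gb"}
if mismatched:
    print("nodes still on another limit:", mismatched)
else:
    print("all nodes report 1gb")
```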