Bulk insert not inserting

(James) #1

I've been trying to insert a JSONL file into Elasticsearch using:

curl -H 'Content-Type: application/x-ndjson' -XPOST 'localhost:9200/ri/RiAll/_bulk?pretty' --data-binary @riall.jsonl

The command doesn't throw an error but nothing gets imported. This is the response:

% Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0  602M    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0

A smaller file works, but I'm sure Elasticsearch can handle a 600MB file; I know Lucene can. Do I need to configure the heap or something like that?

(Clinton Gormley) #2

http.max_content_length defaults to 100MB, so your 600MB file is being rejected.

That said, the answer is not to increase the limit, but rather to break your file down into smaller (much smaller!) chunks. We usually recommend bulk requests in the 5-10MB range.
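One way to chunk the file is with `split`. A minimal sketch, assuming the usual bulk layout where each document is two lines (an action line followed by a source line), so splitting on an even line count never cuts a pair in half. The sample data stands in for `riall.jsonl`, and the loop only echoes the curl command; drop the `echo` to actually POST each chunk:

```shell
# Demo data standing in for riall.jsonl: 4 documents = 8 lines.
for i in 1 2 3 4; do
  printf '{"index":{}}\n{"text":"doc %s"}\n' "$i"
done > sample.jsonl

# 4 lines per chunk = 2 complete documents per chunk.
# An even -l value keeps every action/source pair intact.
split -l 4 -a 3 sample.jsonl chunk_

# POST each chunk; remove 'echo' to send the requests for real.
for f in chunk_*; do
  echo curl -H 'Content-Type: application/x-ndjson' \
       -XPOST 'localhost:9200/ri/RiAll/_bulk' --data-binary "@$f"
done
```

For a real 600MB file you'd pick a line count that lands each chunk in the 5-10MB range, and check each response for `"errors": true`.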

(James) #3

Thanks. I did end up bumping max_content_length to 700MB and increasing the heap to 8GB. In my case it worked, probably because the documents are very short, a couple of sentences max.
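For reference, a sketch of those two settings, assuming a standard install (exact config file locations and the jvm.options mechanism vary by Elasticsearch version and packaging):

```
# elasticsearch.yml — raise the HTTP request body limit (default 100mb)
http.max_content_length: 700mb

# jvm.options — give the JVM an 8 GB heap (set min and max equal)
-Xms8g
-Xmx8g
```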


(system) #4

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.