For a few days now I have had problems sending a large JSON file (approx. 6 GB) to Elasticsearch using the Bulk API. Before posting this question I did a lot of research and found that there are two main ways to send data to Elasticsearch: the Bulk API or Logstash. In fact, Logstash uses the Bulk API under the hood. I also know that when you send large files to Elasticsearch you have to take the HTTP request-size limitation into consideration, which is approx. 2 GB, because the data is first loaded into memory and then sent to Elasticsearch. Consequently, I split the large JSON file into smaller files, each of 350 MB (100,000 lines), using:
split -l 100000 -a 6 large_data.json /home/.../Chunk_Files.
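One thing I also checked (assuming here that the file is already in Bulk/NDJSON form, i.e. each action line is followed by its document line) is that every chunk keeps those pairs intact and ends with a trailing newline, since the Bulk API rejects bodies that don't:

```shell
# Sanity-check each chunk produced by split (adjust the path/prefix
# to wherever split wrote the chunks):
# - the line count must be even, because every bulk action line
#   must be paired with a document line;
# - the file must end with a newline, or Elasticsearch rejects it.
for f in Chunk_Files.*; do
  lines=$(wc -l < "$f")
  if [ $((lines % 2)) -ne 0 ]; then
    echo "ODD line count in $f: $lines"
  fi
  if [ -n "$(tail -c 1 "$f")" ]; then
    echo "Missing trailing newline in $f"
  fi
done
```

(Since I split on an even line count, the pairs should survive, but the trailing-newline check caught problems for me before.)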
Afterwards, I tried to send each chunk using the following curl command:
curl -s -H "Content-Type: application/x-ndjson" -XPOST localhost:9200/_bulk --data-binary "@Chunk_Filesaaaaaa.json"
However, I get nothing in the terminal (neither success nor error), and nothing shows up in Elasticsearch either. I should mention that my file contains 100,000 lines of this form:
If you have any idea where I went wrong, or can suggest other alternatives that might work, I would be very grateful!
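Update: to help diagnose the silence, this is the loop I would run to send every chunk and surface each response instead of discarding it (the chunk name pattern is an assumption based on my split prefix above; adjust the path as needed):

```shell
# Send each chunk to the Bulk API, capture the response, and report
# empty replies and item-level errors ("errors":true) explicitly.
for f in Chunk_Files.*; do
  response=$(curl -s -H "Content-Type: application/x-ndjson" \
                  -XPOST "localhost:9200/_bulk" \
                  --data-binary "@$f")
  if [ -z "$response" ]; then
    echo "$f: empty response (is Elasticsearch reachable?)"
  elif printf '%s' "$response" | grep -q '"errors":true'; then
    echo "$f: bulk request reported item errors"
  else
    echo "$f: ok"
  fi
done
```

An empty response here would at least tell me the problem is connectivity rather than the bulk payload itself.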
Thanks in advance!