When using the _bulk API via a curl command, I'm seeing this error. I passed in 3 JSON files: 2 files worked, but one file throws this error. I don't understand the reason. The 3rd file's input data is in the format below. Is there any mistake in the input file?

    {"index":{}}
    {"rawmessage":"{\"client_id\":\"6b3c549d\",\"sensor_id\":\"BE_JH_S01\",\"log\":{\"file\":{\"path\":\"/usr/local/logs/current/dns.log\"}},\"id.resp_p\":53,\"beat\":{\"version\":\"6.8.7\",\"hostname\":\"localhost.localdomain\",\"name\":\"localhost.localdomain\"},\"@version\":\"1\",\"AA\":false,\"trans_id\":14828,\"ts\":1.603953834579946E9,\"query\":\"api.cylance.com\",\"qtype\":1,\"host\":{\"name\":\"localhost.localdomain\"},\"uid\":\"CzLnfD204cdNneMa34\",\"rejected\":false,\"fields\":{\"type\":\"dns\"},\"qclass_name\":\"C_INTERNET\",\"id.orig_p\":55351,\"RD\":false,\"qclass\":1,\"@timestamp\":\"2020-10-29T06:44:05.449Z\",\"offset\":909106,\"RA\":false,\"source\":\"/usr/local/logs/current/dns.log\",\"Z\":0,\"id.resp_h\":\"192.203.230.10\",\"tags\":[\"beats_input_raw_event\"],\"TC\":false,\"qtype_name\":\"A\",\"id.orig_h\":\"10.40.1.4\",\"proto\":\"udp\"}","client_id":"6b3c549d","sensor_id":"BE_JH_S01","dns":{"qtype":1,"uid":"CzLnfD204cdNneMa34","dpt":53,"rejected":false,"qclass_name":"C_INTERNET","RD":false,"qclass":1,"src":"10.40.1.4","RA":false,"Z":0,"dst":"192.203.230.10","AA":false,"start":"2020-10-29T06:43:54.579Z","trans_id":14828,"spt":55351,"TC":false,"query":"api.cylance.com","qtype_name":"A","proto":"udp"},"log":{"file":{"path":"/usr/local/logs/current/dns.log"}},"host":{"name":"localhost.localdomain"},"timestamp":"2020-10-29T06:44:05.449Z","fields":{"type":"dns"},"@timestamp":"2020-10-29T06:43:54.579Z"}
My 1st and 2nd files are 100 MB and 150 MB, and the 3rd file is around 500 MB. When I split the 3rd file into 100 MB pieces, this command works:

    curl -X POST -s -u elastic:XXXXXXXXX "http://{HOSTNAME}:9200/{index[i]}/_doc/_bulk" -H "Content-Type: application/x-ndjson" --data-binary "@${path[i]}"

So what file size should I use here? And if I have a 1 GB file, can you suggest another way to write the data to Elasticsearch?
You must split it into smaller pieces.
The bulk API is designed to send batches of documents to Elasticsearch to make indexing more efficient, not to upload very large files. A general recommendation is to keep each bulk request below 5MB.
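One thing to watch when splitting: the bulk body is pairs of lines, an action line (`{"index":{}}`) followed by its document line, and a chunk boundary must never separate the two, or the request fails to parse. A minimal sketch of size-bounded chunking in Python (the function name `chunk_bulk_ndjson` and the 5 MB constant are illustrative, not from this thread, and assume every action line is immediately followed by exactly one document line):

```python
MAX_BYTES = 5 * 1024 * 1024  # keep each bulk request under ~5 MB


def chunk_bulk_ndjson(lines, max_bytes=MAX_BYTES):
    """Group (action, document) line pairs into payloads of at most max_bytes.

    `lines` is an iterable of newline-terminated bytes, e.g. from
    open("bulk.ndjson", "rb"). Pairs are never split across payloads.
    """
    payload, size = [], 0
    it = iter(lines)
    for action, source in zip(it, it):  # consume two lines at a time
        pair_len = len(action) + len(source)
        # Start a new payload if adding this pair would exceed the limit
        if payload and size + pair_len > max_bytes:
            yield b"".join(payload)
            payload, size = [], 0
        payload += [action, source]
        size += pair_len
    if payload:
        yield b"".join(payload)
```

Each yielded chunk can then be sent as its own bulk request, either written to a temp file and posted with the same curl command as before, or directly from Python (e.g. `requests.post(url, data=chunk, headers={"Content-Type": "application/x-ndjson"}, auth=("elastic", "..."))`).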
Thanks guys,
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.