Bulk insert from file, SSL record too big.

Hello all,
I am trying to follow along with an acloud.guru ES deep dive, and there is a section where you bulk insert NDJSON files.

I also tried the curl examples from elastic.co.

I receive the following from curl:


* About to connect() to localhost port 9200 (#0)
*   Trying ::1...
* Connected to localhost (::1) port 9200 (#0)
* Initializing NSS with certpath: sql:/etc/pki/nssdb
* NSS error -12263 (SSL_ERROR_RX_RECORD_TOO_LONG)
* SSL received a record that exceeded the maximum permissible length.
* Closing connection 0
curl: (35) SSL received a record that exceeded the maximum permissible length.

I increased the heap to 4 GB and ran the request against a data-only node to rule out any memory issues on the master node, and I still get the same error.

Any suggestions? Is this maybe a curl issue?
Thanks in advance for any help.

Hey,

Could you share how you're starting Elasticsearch locally, and your curl example, please?

It's a 3-node cluster, and I'm running curl from the nodes themselves to localhost. Here is the curl command:

curl -u elastic --insecure -H 'Content-Type: application/x-ndjson' -X POST https://localhost:9200/bank/_bulk --data-binary @accounts.json -v
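
For reference, accounts.json is the sample NDJSON data set from the tutorial; each document is an action line followed by a source line, roughly of this shape (fields abbreviated):

{ "index" : { "_id" : "1" } }
{ "account_number" : 1, "balance" : 39225, ... }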

Can you perform a normal curl request to Elasticsearch? Just to double-check it's not to do with the SSL cert here.
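
For example, something along these lines (same credentials and flags as your bulk call) should confirm TLS itself is working:

curl -u elastic --insecure https://localhost:9200/_cluster/health -v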

Yes, normal curl requests work, including creating indices, so POST, PUT, and GET all work; it is only when I try to do a bulk insert. I also tried a bulk with only a few records, and I even shortened the records themselves. I also tried the elastic.co example, which uses small records and a small record count.
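
For example, even a two-line file like this (the index and field names are just placeholders for the test; note the trailing newline the bulk API requires) fails the same way:

printf '{ "index" : { "_index" : "test" } }\n{ "field1" : "value1" }\n' > tiny.ndjson
curl -u elastic --insecure -H 'Content-Type: application/x-ndjson' -X POST https://localhost:9200/_bulk --data-binary @tiny.ndjson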

Could you try with a single-node setup? It's worth following our documentation to start that node. I get the feeling there's something up with the communication between the nodes as the change is propagated across them.
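
For example, one way to start a standalone node (assuming the tar/zip distribution; the exact path depends on your install):

./bin/elasticsearch -E discovery.type=single-node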