haborda
(Hugo)
September 14, 2024, 4:36pm
1
Hi all:
I have a file (my-file.json) that conforms to the ES Bulk REST API format. I can send it like this:
curl --data-binary "@./<my-file>.json" -X POST <ES-cloud-URL>:443/_bulk?pretty \
-H "Authorization: <my-API-key>" \
-H "Content-Type: application/json"
and the data gets loaded into my index. So I know the file is correct.
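For reference, the bulk body is newline-delimited JSON, alternating one action line and one document line, and it has to end with a newline. Something like this (the index name and fields are just placeholders):
{ "index": { "_index": "my-index" } }
{ "title": "first doc", "value": 1 }
{ "index": { "_index": "my-index" } }
{ "title": "second doc", "value": 2 }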
Instead of using the ES client libraries, I would like to call the same REST endpoint directly from Node.js:
const fs = require('fs/promises');

async function sendItToES() {
  const filePath = '<my-file>.json';
  const stats = await fs.stat(filePath); // added this to be sure I was reading the right file
  const fileSizeInBytes = stats.size;

  const formData = new FormData();
  formData.append('file', filePath);

  const theURL = '<ES-cloud-URL>:443/_bulk?pretty';
  const theLoad = await fetch(theURL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      // 'Content-Type': 'application/x-ndjson', // this one did not work either
      'Authorization': '<my-API-key>'
    },
    body: formData
  })
    .then(response => {
      if (response.ok) {
        console.log('the response:', response);
      } else {
        console.log('upload failed');
      }
      return response.json();
    })
    .then(data => {
      console.log('server response', data);
    })
    .catch(err => {
      console.error('error uploading file:', err);
    });
}
I tried many different permutations and I always get "server response {error: {…}, status: 400}".
Thanks in advance.
I think you need to send the file data as binary. Also, could you share the exact error you are getting?
haborda
(Hugo)
September 15, 2024, 11:00pm
3
Thanks Ashish, I finally got it to work. Below is the code, in case somebody else runs into the same issue:
const fs = require('fs/promises'); // for file manipulation; notice we are using fs/promises, not fs

async function sendItToES() {
  const theURL = 'https://<your ES cluster>:443/_bulk?pretty';
  const theLoad = await fetch(theURL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/x-ndjson',
      'Authorization': 'ApiKey <your API key>'
    },
    body: await fs.readFile('<your file and path>')
  })
    .then(response => {
      if (response.ok) {
        console.log('succeeded sending payload to ES');
      } else {
        console.log('upload to ES failed: ', response);
      }
      return response.json();
    })
    .then(data => {
      console.log('data sent to ES: ', data);
    })
    .catch(err => {
      console.error('error uploading file: ', err);
    });
}
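One caveat if you reuse this: the _bulk endpoint reports per-document failures inside the response body, so it can return HTTP 200 even when some documents were rejected. A rough check on the parsed response (just a sketch; reportBulkResult is a name I made up, and the shape follows the standard bulk response with an errors flag and an items array) could be called from the second .then instead of only logging the data:
function reportBulkResult(data) {
  if (data.errors) {
    // keep only the items that actually failed (index actions carry their result under "index")
    const failures = data.items.filter(item => item.index && item.index.error);
    console.error('some documents were rejected: ', JSON.stringify(failures, null, 2));
  } else {
    console.log('all documents indexed in ', data.took, 'ms');
  }
}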
1 Like
joemcelroy
(Joseph McElroy)
September 17, 2024, 8:09am
4
Curious - Why not use the Node Elasticsearch Client here?
The Node Elasticsearch client has some nice helpers which allow you to import docs very easily:
const split = require('split2')
const { createReadStream } = require('fs') // needed for createReadStream below
const { Client } = require('@elastic/elasticsearch')

const client = new Client({
  node: 'https://<your ES cluster>:443/',
  auth: { apiKey: 'base64EncodedKey' }
})

async function run() {
  const result = await client.helpers.bulk({
    // stream the ndjson file line by line
    datasource: createReadStream('<your ndjson file and path>').pipe(split()),
    onDocument (doc) {
      return {
        index: { _index: 'my-index' }
      }
    }
  })
  console.log(result)
}

run().catch(console.log)
For more information: Client helpers | Elasticsearch JavaScript Client [8.15] | Elastic
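If you also want to see which documents get rejected, the helper accepts an onDrop callback, and the returned stats include counts of successful and failed documents. Roughly (just a sketch, reusing the client, split and createReadStream from the snippet above, inside the same async function):
const result = await client.helpers.bulk({
  datasource: createReadStream('<your ndjson file and path>').pipe(split()),
  onDocument (doc) {
    return { index: { _index: 'my-index' } }
  },
  // called for each document that could not be indexed after retries
  onDrop (doc) {
    console.error('dropped document:', doc)
  }
})
console.log(`${result.successful} indexed, ${result.failed} failed`)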
1 Like
system
(system)
Closed
October 15, 2024, 8:09am
5
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.