ELK: How do I retrieve more than 10,000 results using Elasticsearch in Node.js

```
function getElasticData(endTime, startTime) {
    return new Promise(function (resolve, reject) {
        elasticClient.search({
            index: 'logstash-*',
            type: 'callEnd',
            size: 10000,
            body: {
                query: {
                    range: {
                        "@timestamp": {
                            "gte": startTime,
                            "lte": endTime
                        }
                    }
                }
            }
        }, function (error, response, status) {
            if (error) {
                reject(error);
            } else {
                resolve(response);
            }
        });
    });
}
```

Please format your code, logs, or configuration files using the </> icon as explained in this guide, and not the citation button. It will make your post more readable.

Or use markdown style like:

```
CODE
```

There's a live preview panel for exactly this reason.

Lots of people read these forums, and many of them will simply skip over a post that is difficult to read, because it's just too large an investment of their time to try and follow a wall of badly formatted text.
If your goal is to get an answer to your questions, it's in your interest to make it as easy to read and understand as possible.
Please update your post.

To extract a lot of data in a consistent way, use the scroll API.
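For reference, the advice above could be sketched in Node.js roughly like this. It assumes the same legacy `elasticsearch` npm client, index pattern, and range query as the original post; the function name and the client-as-parameter shape are illustrative choices, not the poster's actual code.

```javascript
// Hypothetical sketch: page through all matching documents with the scroll
// API instead of one oversized search. The client is passed in so the
// function matches whatever `elasticClient` the original post created.
async function getAllElasticData(elasticClient, startTime, endTime) {
    const allHits = [];

    // The first request opens a scroll context. Here `size` is the page
    // size per request, not a cap on the total number of results.
    let response = await elasticClient.search({
        index: 'logstash-*',
        scroll: '30s',   // keep the scroll context alive between pages
        size: 1000,
        body: {
            query: {
                range: {
                    "@timestamp": {
                        "gte": startTime,
                        "lte": endTime
                    }
                }
            }
        }
    });

    // Keep fetching pages until one comes back empty.
    while (response.hits.hits.length > 0) {
        allHits.push(...response.hits.hits);
        response = await elasticClient.scroll({
            scrollId: response._scroll_id,
            scroll: '30s'
        });
    }

    // Free the scroll context on the server once we are done.
    await elasticClient.clearScroll({ scrollId: response._scroll_id });
    return allHits;
}
```

Because each page is fetched with the scroll ID from the previous response, this can return well past the 10,000-document `from + size` limit.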

I have changed the code and removed the console.log

Thanks,
Jasso

Thank you!

Hi dadoonet,
I am using Node.js to query Elasticsearch, and I want all the output that matches my conditions. However, I am not able to get more than 10,000 records; this is the error message I'm getting:

Result window is too large, from + size must be less than or equal to: [10000] but was [20000]. See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window].

Please let me know how we can achieve this.

I already answered you with:

To extract a lot of data in a consistent way, use the scroll API.

The error message says it all as well:

See the scroll api for a more efficient way to request large data sets. This limit can be set by changing the [index.max_result_window]

Is there something unclear?

Ok thanks dadoonet. Everything is clear now

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.