Best approach for fetching a large dataset using NEST

Hi,
I'm using the NEST API to fetch data from an Elasticsearch index. My index contains more than 700,000 records and I want to fetch the data in chunks. I have tried the From and Size parameters: on each request I query according to the page number and fetch 10 records per request. But at a certain point the query stops returning data, once the From parameter is larger than 50,000. I tried changing "index.max_result_window" to 100000 and still couldn't resolve the issue.

Is this the correct way to query a large dataset or should I use any other approach?

        var settings = new ConnectionSettings(new Uri("http://10.80.20.125:9200/"))
            .DefaultIndex("parcels");

        if (page == null || page == 1)
        {
            start = 0;
        }
        else
        {
            // From is zero-based: page 2 should start at document 10, not 11
            start = ((int)page - 1) * 10;
        }

        client = new ElasticClient(settings);
        var searchResponse = client.Search<parcels>(s => s
            .From(start)
            .Size(10)
            .AllTypes()
            .Query(q => q
                .MultiMatch(c => c
                    .Fields(f => f
                        .Field("Parcel_Code")
                        .Field("CityName")
                        .Field("PlanName")
                        .Field("DistrictName"))
                    .Query(search) // user input is passed here as the query
                    .Operator(Operator.Or)
                    .FuzzyRewrite(MultiTermQueryRewrite.ConstantScoreBoolean)
                    .TieBreaker(1.1) // note: tie_breaker is normally expected to be between 0.0 and 1.0
                    .Lenient())));

Above is my code.
Any help would be appreciated. TIA
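For reading an entire index in chunks, the scroll API avoids the deep-pagination cost of From/Size entirely, rather than raising `index.max_result_window` (which increases heap pressure on the cluster). Below is a minimal sketch assuming NEST 6.x; the `Parcel` POCO name, the 1000-document chunk size, and the `2m` scroll timeout are illustrative, not from the original post:

```csharp
// Pull the whole index in fixed-size chunks with the scroll API
// instead of paging with From/Size past max_result_window.
var client = new ElasticClient(new ConnectionSettings(new Uri("http://10.80.20.125:9200/"))
    .DefaultIndex("parcels"));

var response = client.Search<Parcel>(s => s
    .Size(1000)     // documents returned per round-trip
    .Scroll("2m")   // keep the scroll context alive for 2 minutes between calls
    .Query(q => q.MatchAll()));

while (response.IsValid && response.Documents.Any())
{
    foreach (var parcel in response.Documents)
    {
        // process each document here
    }

    // fetch the next chunk using the scroll id from the previous response
    response = client.Scroll<Parcel>("2m", response.ScrollId);
}

// release the server-side scroll context when done
client.ClearScroll(c => c.ScrollIds(response.ScrollId));
```

Note the trade-off: a scroll is a point-in-time snapshot, so it suits batch export rather than live, user-facing pagination. If users need to jump to arbitrary pages of fresh results, `search_after` (paired with a stable sort key) is the alternative that also sidesteps the `max_result_window` limit.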
