Can someone please help me locate the file containing this parameter?

I am trying to parse a single log file of around 3.2 GB using the ELK stack. It has been indexed, but when I try to discover the indexed logs in Kibana I get the following error.

Error: Discover: Trying to query 2051 shards, which is over the limit of 1000. This limit exists because querying many shards at the same time can make the job of the coordinating node very CPU and/or memory intensive. It is usually a better idea to have a smaller number of larger shards. Update [] to a greater value if you really want to query that many shards at the same time.

This is just a test setup, and I am using the Elasticsearch 5.0 beta for it.

You can reduce the number of index shards!

Thanks. Can you please guide me on how this can be done?

It's set in elasticsearch.yml.

Don't think so, Mark. With Elasticsearch 5.0 they have decoupled these settings from the .yml file.

How much data do you have in the cluster? How many shards in total?

The check is there for a reason, so unless you have a very large amount of data in your cluster and the average shard size is over a few GB, I would recommend looking into reducing the number of shards even if it is just a test system.
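If the logs are being shipped with Logstash, one way to reduce the shard count going forward (a sketch; the template name, index pattern, and replica count here are illustrative, not from the thread) is an index template that creates new daily indices with a single shard:

```shell
# Create an index template so that new logstash-* indices get one
# primary shard and no replicas (reasonable for a single-node test setup).
# In Elasticsearch 5.x the match pattern goes in the "template" field.
curl -XPUT 'http://localhost:9200/_template/logstash_single_shard' \
     -H 'Content-Type: application/json' -d'
{
    "template" : "logstash-*",
    "settings" : {
        "index.number_of_shards" : 1,
        "index.number_of_replicas" : 0
    }
}'
```

Existing indices keep their shard count; the template only applies to indices created after it is installed.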

Well we're both wrong :stuck_out_tongue:
It's a cluster level setting, so _cluster/settings.


This is the way to change the shard-count limit in a search via the cluster settings (in 5.x the setting is action.search.shard_count.limit):

curl -u admin:admin -XPUT 'https://localhost:9200/_cluster/settings' -H 'Content-Type: application/json' -d'
{
    "persistent" : {
        "action.search.shard_count.limit" : "1500"
    }
}'
-u admin:admin (authenticates as user admin with password admin)
https (only on clusters with SSL enabled on the REST layer)