Search for any word without mappings using URI query

Hi,

I am indexing a very big JSON document and it does not have mappings defined. My requirement is to be able to search for any word in this JSON.

I have created an nGram filter and custom analyzers like below:

{
  "index": {
    "index": "my_idx",
    "type": "my_type",
    "analysis": {
      "index_analyzer": {
        "my_index_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "mynGram"
          ]
        }
      },
      "search_analyzer": {
        "my_search_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": [
            "standard",
            "lowercase",
            "mynGram"
          ]
        }
      },
      "filter": {
        "mynGram": {
          "type": "nGram",
          "min_gram": 2,
          "max_gram": 50
        }
      }
    }
  }

}
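
(As a side note, a quick way to check which tokens an analyzer actually produces would be the _analyze API. A minimal sketch, assuming the settings above were accepted and the analyzer ends up registered under analysis.analyzer for my_idx:)

curl -XGET 'es_host:9200/my_idx/_analyze?pretty' -H 'Content-Type: application/json' -d'
{
  "analyzer": "my_index_analyzer",
  "text": "findByStatus"
}
'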

When I run this query, it does not return anything.

http://es_host:9200/apihub-*/_search?pretty=true&q=findByStatus

I think the URI query looks into the _all field, and it does not have ngrams.
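
(If I understand correctly, the q= parameter is just shorthand for a query_string query against the default field, so the equivalent body search would be something like the below; findByStatus is just the sample term I am searching for.)

curl -XGET 'es_host:9200/apihub-*/_search?pretty' -H 'Content-Type: application/json' -d'
{
  "query": {
    "query_string": {
      "query": "findByStatus"
    }
  }
}
'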

Please suggest how to do this. Is there any other query I can use to search?

Thanks in advance.

You did not apply your analyzer to the _all field.

https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-all-field.html

Thanks @dadoonet. How do we apply an analyzer to the _all field? I could not find it online...

Is there a better way to achieve my requirements?

Thanks

How do we apply an analyzer to the _all field? I could not find it online...

The link I gave you says:

The _all field is just a text field, and accepts the same parameters that other string fields accept, including analyzer, term_vectors, index_options, and store.

So like any normal text field.
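
A minimal sketch (my_idx and my_type are just example names here; adjust the analyzer definition to yours):

curl -XPUT 'es_host:9200/my_idx?pretty' -H 'Content-Type: application/json' -d'
{
  "settings": {
    "analysis": {
      "filter": {
        "mynGram": {
          "type": "nGram",
          "min_gram": 2,
          "max_gram": 50
        }
      },
      "analyzer": {
        "my_index_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "mynGram"
          ]
        }
      }
    }
  },
  "mappings": {
    "my_type": {
      "_all": {
        "analyzer": "my_index_analyzer"
      }
    }
  }
}
'

Note that _all is declared at the type level, next to properties, not inside it.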

Is there a better way to achieve my requirements?

I always prefer using the copy_to feature. As explained in the same doc: Mapping | Elasticsearch Guide | Elastic
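
For example (the field names here are made up, and this assumes an ngram analyzer like yours is already defined in the index settings):

curl -XPUT 'es_host:9200/my_idx?pretty' -H 'Content-Type: application/json' -d'
{
  "mappings": {
    "my_type": {
      "properties": {
        "status": {
          "type": "text",
          "copy_to": "everything"
        },
        "description": {
          "type": "text",
          "copy_to": "everything"
        },
        "everything": {
          "type": "text",
          "analyzer": "my_index_analyzer"
        }
      }
    }
  }
}
'

Then you search that field explicitly, e.g. q=everything:findByStatus, instead of relying on _all.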

Thanks again. I tried it like this but I am getting a "Field [_all] is defined twice in [main]" error.

curl -XPUT 'es_host:9200/_template/sapihub_template?pretty' -H 'Content-Type: application/json' -d'
{
  "template": "sapihub-*",
  "settings": {
    "index": {
      "number_of_shards": 1,
      "number_of_replicas": 2,
      "mapping.total_fields.limit": 4000
    },
    "analysis": {
      "index_analyzer": {
        "my_index_analyzer": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": [
            "lowercase",
            "mynGram"
          ]
        }
      },
      "analyzer": {
        "my_search_analyzer": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": [
            "lowercase",
            "mynGram"
          ]
        }
      },
      "filter": {
        "mynGram": {
          "type": "nGram",
          "min_gram": 2,
          "max_gram": 50
        }
      }
    }
  },
  "mappings": {
    "main": {
      "properties": {
        "_all": {
          "type": "text",
          "analyzer": "my_search_analyzer"
        }
      }
    }
  }
}
'

Finally I came up with the below template (defining _all at the type level instead of under properties, which got rid of the mapping error), but when I query, it returns all documents for every string. What am I doing wrong? Thanks

http://es_host:9200/sapihub-*/_search?pretty=true&q=internal

curl -XPUT 'es_host:9200/_template/sapihub_template?pretty' -H 'Content-Type: application/json' -d'
{
  "template": "sapihub-*",
  "settings": {
    "index": {
      "number_of_shards": 1,
      "number_of_replicas": 2,
      "mapping.total_fields.limit": 4000
    },
    "analysis": {
      "filter": {
        "mynGram": {
          "type": "nGram",
          "min_gram": 2,
          "max_gram": 50
        }
      },
      "analyzer": {
        "index_ngram_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": [
            "lowercase",
            "mynGram"
          ]
        },
        "search_ngram_analyzer": {
          "type": "custom",
          "tokenizer": "standard",
          "filter": [
            "standard",
            "lowercase",
            "mynGram"
          ]
        }
      }
    }
  },
  "mappings": {
    "main": {
      "_all": {
        "type": "text",
        "search_analyzer": "search_ngram_analyzer",
        "analyzer": "index_ngram_analyzer"
      }
    }
  }
}
'

Hi, any help is greatly appreciated. Thanks.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.