Searching special characters in Elasticsearch

Hello, I have a problem: I cannot index special characters in Elasticsearch. I tried to do it like this:

PUT my-index-000001
{
    "settings": {
        "index": {
            "max_result_window": 100000,
            "number_of_replicas": 0
        },
        "analysis": {
            "filter": {
                "autocomplete_filter": {
                    "type": "edge_ngram",
                    "min_gram": 1,
                    "max_gram": 35
                }
            },
            "analyzer": {
                "autocomplete": {
                    "type": "custom",
                    "tokenizer": "my_tokenizer",
                    "filter": [
                        "lowercase",
                        "autocomplete_filter"
                    ]
                }
            },
            "tokenizer": {
                "my_tokenizer": {
                    "type": "edge_ngram",
                    "min_gram": 1,
                    "max_gram": 35,
                    "tokenize_on_chars": [
                        "%"
                    ]
                }
            }
        }
    },
    "mappings": {
        "properties": {
        }
    }
}
POST my-index-000001/_doc/
{
  "@timestamp": "2099-11-15T13:12:00",
  "message": "GET /search HTTP/1.1 200 1070000",
  "user": {
    "id": "kimchy % D"
  }
}

but I don't get anything back from this search request:

GET my-index-000001/_search
{
    "size": 10000,
    "query": {
        "term": {
            "user.id": "%"
        }
    }
}

Could you please help me?

Welcome!

In the mappings you did not define any fields or the analyzer to apply to them, so user.id is using the standard analyzer by default.
You need to fix that part.

Could you please show me how to fix this?

Sure. Look at: Specify an analyzer | Elasticsearch Guide [8.12] | Elastic
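
For example, here is a rough sketch of the same index creation request with the mappings section filled in: user.id is declared as a text field that uses your custom autocomplete analyzer at index time, with standard as the search_analyzer so the query text is not edge-ngrammed again. The settings block is copied unchanged from your request. I have not verified that this alone makes the term query for "%" return the document, since a term query looks for an exact token, and the tokenizer definition itself may also need adjusting (tokenize_on_chars is a char_group tokenizer parameter; the edge_ngram tokenizer uses token_chars / custom_token_chars).

PUT my-index-000001
{
    "settings": {
        "index": {
            "max_result_window": 100000,
            "number_of_replicas": 0
        },
        "analysis": {
            "filter": {
                "autocomplete_filter": {
                    "type": "edge_ngram",
                    "min_gram": 1,
                    "max_gram": 35
                }
            },
            "analyzer": {
                "autocomplete": {
                    "type": "custom",
                    "tokenizer": "my_tokenizer",
                    "filter": [
                        "lowercase",
                        "autocomplete_filter"
                    ]
                }
            },
            "tokenizer": {
                "my_tokenizer": {
                    "type": "edge_ngram",
                    "min_gram": 1,
                    "max_gram": 35,
                    "tokenize_on_chars": [
                        "%"
                    ]
                }
            }
        }
    },
    "mappings": {
        "properties": {
            "user": {
                "properties": {
                    "id": {
                        "type": "text",
                        "analyzer": "autocomplete",
                        "search_analyzer": "standard"
                    }
                }
            }
        }
    }
}

Note that you will need to delete and recreate the index for the new mapping to take effect, then reindex the document. To check which tokens a value actually produces, you can use the _analyze API:

GET my-index-000001/_analyze
{
    "analyzer": "autocomplete",
    "text": "kimchy % D"
}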
