Elasticsearch - searching with "keyword" type doesn't work

I use the Elasticsearch Java REST client 6.1. When I try to find logs by a few parameters that can appear in one field, I get nothing.
Here is my code:

    @Override
    public SearchResponse findLogsByValues(ElasticSearchLogRequest esLogRequest, Pageable pageable) {
        SearchRequest searchRequest = new SearchRequest("portal-logs-*");

        SearchSourceBuilder sourceBuilder = new SearchSourceBuilder();
        BoolQueryBuilder bqb = QueryBuilders.boolQuery();

        if (esLogRequest.getLevels() != null) {
            Iterator<String> iterator = esLogRequest.getLevels().iterator();
            int counter = 0;
            SpanOrQueryBuilder spanOrQueryBuilder = null;
            while (iterator.hasNext()) {
                if (counter == 0) {
                    // the first level starts the span_or query
                    spanOrQueryBuilder = new SpanOrQueryBuilder(
                            QueryBuilders.spanTermQuery("level", iterator.next().toLowerCase()));
                } else {
                    // every further level is added as another clause
                    spanOrQueryBuilder.addClause(
                            QueryBuilders.spanTermQuery("level", iterator.next().toLowerCase()));
                }
                counter++;
            }
            bqb.filter(spanOrQueryBuilder);
        }

        // attach the query to the search request
        sourceBuilder.query(bqb);
        searchRequest.source(sourceBuilder);

        SearchResponse searchResponse = null;
        try {
            searchResponse = client.search(searchRequest);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return searchResponse;
    }

Here is my JSON request:

{
  "levels": ["TRACE", "INFO"]
}

Here is my mapping template:

PUT _template/portal-logs
{
  "template": "portal-logs-*",
  "settings": { "number_of_shards": 5 },
  "mappings": {
      "logs_info": {
        "_all": {
          "enabled": false
        },
        "properties": {
          "device": {"type": "keyword"},
          "header": {"type": "text"},
          "ip": {"type": "keyword"},
          "level": {"type": "keyword"},
          "location": {"type": "geo_point"},
          "message": {"type": "text"},
          "module": {"type": "keyword"},
          "node": {"type": "keyword"},
          "office": {"type": "keyword"},
          "operation": {"type": "keyword"},
          "port": {"type": "integer"},
          "sessionId": {"type": "keyword"},
          "submodule": {"type": "keyword"},
          "system": {"type": "keyword"},
          "thread": {"type": "keyword"},
          "timeStamp": {"type": "date"},
          "userLogin": {"type": "keyword"},
          "userName": {"type": "keyword"}
        }
      }
    }
  }

So when in the mapping I set the field "level" to "text", it works fine, but when I set it to "keyword", I receive an empty JSON response.
I need the field "level" to have the strict type "keyword", and the query has to return all logs whose "level" is "INFO" or "TRACE".
What should I do in such a case? Why is it not working with "keyword"?


I believe that this is wrong:

iterator.next().toLowerCase()

And should be

iterator.next().toUpperCase()

I just tried to use toLowerCase() after your answer in the topic "Searching for a few fields from Set Collection", and it works with the "text" type but didn't work with "keyword".
Could you help explain this case?

With iterator.next().toUpperCase() it's also not working.
Even if I use just iterator.next(), it's not working.
But all the letters in the "level" field in ES are upper case.

I don't know exactly as I don't fully understand what you are doing. It would be better to provide a full recreation script as described in About the Elasticsearch category. It will help to better understand what you are doing. Please, try to keep the example as simple as possible.

I'm not using Java here, just a simple recreation that you can run in the Kibana console.

Most likely, when you index a field "foo": "BAR" with the default analyzer, the value of foo is indexed as bar. With a keyword type, it's indexed as BAR.

When running a match query on that field, for example, searching for BaR in the former case will be transformed to bar and it will match.
Searching for BaR in the latter case won't match BAR.
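
For example, something like this minimal recreation (the index, type, and field names are only placeholders) shows the difference:

PUT test
{
  "mappings": {
    "doc": {
      "properties": {
        "as_text":    { "type": "text" },
        "as_keyword": { "type": "keyword" }
      }
    }
  }
}

PUT test/doc/1
{
  "as_text": "BAR",
  "as_keyword": "BAR"
}

# matches: "BaR" is analyzed to "bar", the same term that was indexed
GET test/_search
{
  "query": { "match": { "as_text": "BaR" } }
}

# no hit: the keyword field was indexed as "BAR" and the query term stays "BaR"
GET test/_search
{
  "query": { "match": { "as_keyword": "BaR" } }
}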

You can understand all that by using the _analyze API.
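
For instance, something like this (the standard analyzer here stands in for a text field's default analyzer; a keyword field is not analyzed at all, but the keyword analyzer illustrates the same behavior):

GET _analyze
{
  "analyzer": "standard",
  "text": "TRACE"
}

# returns the single token "trace"

GET _analyze
{
  "analyzer": "keyword",
  "text": "TRACE"
}

# returns the single token "TRACE"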

I run this query:

GET /_search
{
    "query": {
        "span_or" : {
            "clauses" : [
                { "span_term" : { "level" : "TRACE" } },
                { "span_term" : { "level" : "INFO" } }
            ]
        }
    }
}

and in response I get this:

{
  "took": 16,
  "timed_out": false,
  "_shards": {
    "total": 26,
    "successful": 25,
    "skipped": 0,
    "failed": 1,
    "failures": [
      {
        "shard": 3,
        "index": "portal-logs-02.02.2018",
        "node": "SAPKQLGGSJiR8bMBdkNvfQ",
        "reason": {
          "type": "illegal_state_exception",
          "reason": """field "level" was indexed without position data; cannot run SpanTermQuery (term=TRACE)"""
        }
      }
    ]
  },
  "hits": {
    "total": 0,
    "max_score": null,
    "hits": []
  }
}

But when I try to set another mapping for "level": "level": {"type": "keyword", "term_vector": "with_positions"},
I get this exception:

{
  "error": {
    "root_cause": [
      {
        "type": "mapper_parsing_exception",
        "reason": "Mapping definition for [level] has unsupported parameters:  [term_vector : with_positions]"
      }
    ],
    "type": "mapper_parsing_exception",
    "reason": "Failed to parse mapping [logs_info]: Mapping definition for [level] has unsupported parameters:  [term_vector : with_positions]",
    "caused_by": {
      "type": "mapper_parsing_exception",
      "reason": "Mapping definition for [level] has unsupported parameters:  [term_vector : with_positions]"
    }
  },
  "status": 400
}

bqb.filter(QueryBuilders.termsQuery("level", esLogRequest.getLevels()));

This helps.
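
On a keyword field the value is indexed exactly as sent, so a terms query with the exact upper-case values matches directly. The filter above should translate to roughly this request (index pattern as in the mapping template):

GET portal-logs-*/_search
{
  "query": {
    "bool": {
      "filter": {
        "terms": {
          "level": ["TRACE", "INFO"]
        }
      }
    }
  }
}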
