Filtering by date field with DSL returning invalid results

I am trying to query one of my indices for all records whose date field (labels.expiresAt) is gte now. In other words, the date field should be later than today. That seems super straightforward, but when I set the range to "gte": "now" I get 0 results, and when I flip it to "lte" the results show up. In the example hit below, you can see the expiresAt value is 2023-06-03T17:31:50.000Z, which is three months from today, so why would it show up in a search for dates lte now? What am I doing wrong here? This seems incredibly unintuitive if it isn't a bug.

DSL Query:

GET /logs-*/_search
{
  "query": {
    "range": {
      "labels.expiresAt": {
        "gte": "now"
      }
    }
  }
}

Hit that shows up but only if I set it to lte instead of gte:

{
  "_index": "xxx",
  "_id": "xxx",
  "_score": 1,
  "_source": {
    "host": {
      "hostname": "xxx"
    },
    "transaction": {
      "id": "xxx"
    },
    "message": "Example",
    "@timestamp": "2023-03-03T17:31:53.915Z",
    "service": {
      "name": "server"
    },
    "event": {
      "dataset": "server.log"
    },
    "ecs": {
      "version": "1.6.0"
    },
    "log.level": "info",
    "process": {
      "pid": 75
    },
    "labels": {
      "expiresAt": "2023-06-03T17:31:50.000Z"
    },
    "environment": "development",
    "trace": {
      "id": "xxx"
    },
    "@version": "1",
    "data_stream": {
      "type": "logs",
      "dataset": "generic",
      "namespace": "default"
    }
  }
}

What is the mapping of that field?

Where would I find that information?

Get the index mappings using the get mapping API.
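For example, using the index pattern from your query you can ask for just that one field:

```
GET /logs-*/_mapping/field/labels.expiresAt
```

The response shows the type Elasticsearch actually assigned to labels.expiresAt in each backing index.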

If the field was incorrectly mapped as a keyword field, I believe that would explain the behaviour you are seeing: "now" and the timestamps would be compared as plain strings, and "now" would never be converted to a date.
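You can see the lexicographic effect with a quick sketch outside Elasticsearch (plain Python, just to illustrate how keyword terms compare):

```python
# On a keyword field, a range query compares terms lexicographically,
# so the literal string "now" is never parsed into a date.
expires_at = "2023-06-03T17:31:50.000Z"

# "gte": "now" asks for terms >= "now". The digit '2' sorts before
# the letter 'n', so the timestamp fails the check -> 0 hits.
print(expires_at >= "now")  # False

# "lte": "now" matches for the same reason, which is why flipping
# the operator appeared to "fix" the query.
print(expires_at <= "now")  # True
```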

Yes! That's the problem: it looks like it was incorrectly mapped as a keyword field. I'm very new to field mapping. Any tips on how I can get started fixing this?

You would need to create or update an index template with the correct mapping. However, it looks like you are using ECS, so I am not sure whether that would break something. The index template takes care of new indices, but since you cannot change the mapping of a field in an existing index, you would need to reindex any data that is already indexed.
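A minimal sketch of such a template (the template name and priority here are placeholders; check how it interacts with any existing ECS/built-in logs templates before applying it):

```
PUT _index_template/logs-expiresat-date
{
  "index_patterns": ["logs-*"],
  "priority": 500,
  "data_stream": {},
  "template": {
    "mappings": {
      "properties": {
        "labels": {
          "properties": {
            "expiresAt": { "type": "date" }
          }
        }
      }
    }
  }
}
```

This only affects indices created after the template exists, which is why the reindex step is still needed for old data.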

How are you indexing data into Elasticsearch?

The data is currently coming in in ECS format, as you noted. At the moment I'm using a data_stream output from Logstash. I haven't set up any index templates or other indexing configuration, so I guess I'm indexing data with whatever the defaults are in Elastic Cloud. Looks like I should probably go through this guide: Set up a data stream | Elasticsearch Guide [8.6] | Elastic

holy jesus I figured out how to create all of the necessary index mapping stuff and reindex (moving everything from the main index to a new one, deleting the old one, and then moving it back), and now my query is working! That was exhausting, but now I know how index mapping works and what I need to do next time I run into a similar issue.
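For anyone finding this later, the round trip described above looks roughly like this (index names are placeholders; if the destination is a data stream you also need "op_type": "create" in the dest block):

```
POST _reindex
{
  "source": { "index": "my-old-index" },
  "dest": { "index": "my-temp-index" }
}

DELETE /my-old-index

# recreate my-old-index (now picking up the corrected mapping
# from the new index template), then copy the data back:

POST _reindex
{
  "source": { "index": "my-temp-index" },
  "dest": { "index": "my-old-index" }
}
```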
