Searching or filtering on a multi-value field


(Cyril BOGNOU) #1

Hello Everyone,

I have a MySQL database with a table that contains two important fields: title and age_range.

The age_range field stores values like 45;60 for documents aimed at users between 45 and 60 years old, 18;70 for users between 18 and 70 years old, and so on...

Now I would like to run the query test on the title field with the filter 18;50 on the age_range field, and get back all documents matching test whose age range overlaps this interval, including the two cases above for example.

For context, I use Logstash to index my data.
How can I achieve this?
Is there any processing to apply while indexing my data with Logstash?
Any filter or tokenizer to use at indexing time with an ES analyzer?

Thank you in advance


(David Pilato) #2

No need to transform the data before it gets indexed in elasticsearch.

As for the mapping, I'd recommend using the Range datatype, which is designed for that kind of use case IMO.

I'd do something like:

PUT range_index
{
  "mappings": {
    "my_type": {
      "properties": {
        "title": {
          "type": "text"
        },
        "age": {
          "type": "integer_range"
        }
      }
    }
  }
}

PUT range_index/my_type/1
{
  "title": "whatever",
  "age" : { 
    "gte" : 45,
    "lte" : 60
  }
}
PUT range_index/my_type/2
{
  "title": "whatever",
  "age" : { 
    "gte" : 18,
    "lte" : 70
  }
}

Then search with something like:

POST range_index/_search
{
  "query" : {
    "range" : {
      "age" : { 
        "gte" : 18,
        "lte" : 50 
      }
    }
  }
}
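To also match test on the title field at the same time (as in the original question), one option is to put the range clause in the filter of a bool query, e.g.:

```json
POST range_index/_search
{
  "query": {
    "bool": {
      "must": {
        "match": { "title": "test" }
      },
      "filter": {
        "range": {
          "age": {
            "gte": 18,
            "lte": 50
          }
        }
      }
    }
  }
}
```

By default the range query on a range field matches documents whose stored range intersects the queried interval, which is what you described.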

But again, you need to parse your documents before indexing them.
You can use a grok filter in Logstash or use the ingest grok processor in Elasticsearch directly.
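On the Logstash side, a minimal filter sketch (untested; it assumes the incoming field is named age_range, like your MySQL column) could split "45;60" into the gte/lte sub-fields that the integer_range mapping expects:

```
filter {
  grok {
    # capture "45;60" into the nested [age][gte] / [age][lte] fields
    match => { "age_range" => "%{NUMBER:[age][gte]};%{NUMBER:[age][lte]}" }
  }
  mutate {
    # grok captures are strings; convert them to integers
    convert => {
      "[age][gte]" => "integer"
      "[age][lte]" => "integer"
    }
    # drop the raw string once it has been parsed
    remove_field => [ "age_range" ]
  }
}
```

If your Logstash version does not accept nested field references in grok capture names, capture into flat fields first and then move them with a mutate rename.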


(Cyril BOGNOU) #3

Thank you for your quick response @dadoonet .

But it seems the answer you provided is for integer fields, or am I wrong?
Because, as I said, the ages are saved in a string field (a MySQL varchar) in the format 'bottom_age;top_age' for each row.


(David Pilato) #4

Yes. This is why I said that:

But again, you need to parse your documents before indexing them.
You can use a grok filter in Logstash or use the ingest grok processor in Elasticsearch directly.
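If you'd rather do the parsing in Elasticsearch itself, an ingest pipeline sketch could look like this (assuming the raw field is called age_range; the :int suffix asks grok to convert the captures to integers):

```json
PUT _ingest/pipeline/parse_age_range
{
  "processors": [
    {
      "grok": {
        "field": "age_range",
        "patterns": [ "%{NUMBER:age.gte:int};%{NUMBER:age.lte:int}" ]
      }
    },
    {
      "remove": {
        "field": "age_range"
      }
    }
  ]
}
```

You would then index with something like `PUT range_index/my_type/1?pipeline=parse_age_range` so the pipeline runs before the document is stored.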


(Cyril BOGNOU) #5

Oh, alright.
Sorry, I missed that part.
Thank you very much!


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.