ElasticSearch Version: 0.90.2
Here's the problem: I want to find documents in the index such that:
- they match all query tokens across multiple fields
- each field's own analyzer is used
So if there are 4 documents:
{ "_id" : 1, "name" : "Joe Doe", "mark" : "1", "message" : "Message
First" }
{ "_id" : 2, "name" : "Ann", "mark" : "3", "message" : "Yesterday
Joe Doe got 1 for the message First"}
{ "_id" : 3, "name" : "Joe Doe", "mark" : "2", "message" : "Message
Second" }
{ "_id" : 4, "name" : "Dan Spencer", "mark" : "2", "message" : "Message
Third" }
And if the query is "Joe First 1", it should find ids 1 and 2. I.e., it should
find documents which contain all the tokens from the search query, no matter
which fields they are in (maybe all tokens are in one field, or maybe each
token is in its own field).
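Just to make the intended semantics concrete: if I split the query into tokens myself (not ideal, since a client-side whitespace split ignores the fields' analyzers), the behaviour I'm after is roughly a bool query with one clause per token, where each clause may match any of the fields:

{
    "query": {
        "bool": {
            "must": [
                { "multi_match": { "query": "Joe",   "fields": ["name", "mark", "message"] } },
                { "multi_match": { "query": "First", "fields": ["name", "mark", "message"] } },
                { "multi_match": { "query": "1",     "fields": ["name", "mark", "message"] } }
            ]
        }
    }
}

That would return ids 1 and 2 for the documents above, but building the query by hand for every token doesn't feel right.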
One solution would be to use Elasticsearch's "_all" field functionality: that
way it will merge all the fields I need (name, mark, message) into one, and
I'll be able to query it with something like:
"match": {
"_all": {
"query": "Joe First 1",
"operator": "and"
}
}
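(For completeness, the full request would be something like the following; the index and type names are just placeholders:)

curl -XGET 'localhost:9200/myindex/message/_search' -d '{
    "query": {
        "match": {
            "_all": {
                "query": "Joe First 1",
                "operator": "and"
            }
        }
    }
}'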
But this way I can specify an analyzer for the "_all" field only. And I need
the "name" and "message" fields to have different sets of tokenizers/token
filters (let's say name will have a phonetic analyzer and message will have
some stemming token filter), something like the mapping sketched below.
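For example, what I'd ideally want on the indexing side is a mapping along these lines (just a sketch; the index/type/analyzer/filter names are made up, and the phonetic filter needs the analysis-phonetic plugin installed):

curl -XPUT 'localhost:9200/myindex' -d '{
    "settings": {
        "analysis": {
            "filter": {
                "my_metaphone": { "type": "phonetic", "encoder": "metaphone", "replace": false },
                "my_stemmer":   { "type": "snowball", "language": "English" }
            },
            "analyzer": {
                "name_analyzer":    { "type": "custom", "tokenizer": "standard", "filter": ["lowercase", "my_metaphone"] },
                "message_analyzer": { "type": "custom", "tokenizer": "standard", "filter": ["lowercase", "my_stemmer"] }
            }
        }
    },
    "mappings": {
        "message": {
            "properties": {
                "name":    { "type": "string", "analyzer": "name_analyzer" },
                "mark":    { "type": "string" },
                "message": { "type": "string", "analyzer": "message_analyzer" }
            }
        }
    }
}'

The indexing side is fine; the problem is how to query it with "all tokens must match somewhere" semantics while each field keeps its own analyzer.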
Is there a way to do this?
Thanks.