Separate tokenizers for Search and Indexing

Can we set up a configuration in which indexing uses the standard tokenizer and
searching uses the whitespace tokenizer?


Yes, you can have different tokenizers at index time and query time, but you
have to be careful: if the tokens produced during indexing differ from the
tokens produced during searching, you will get incorrect results or no
results at all.
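For illustration, here is a minimal sketch of such a mapping. Both "standard" and "whitespace" are available as built-in analyzers (wrapping the corresponding tokenizers), so you can reference them directly. The index name my_index, type my_type, and field title are placeholders, and the exact parameter name depends on your Elasticsearch version (older releases use "index_analyzer", newer ones use "analyzer" alongside "search_analyzer"):

curl -XPUT 'http://localhost:9200/my_index' -d '
{
  "mappings": {
    "my_type": {
      "properties": {
        "title": {
          "type": "string",
          "index_analyzer": "standard",
          "search_analyzer": "whitespace"
        }
      }
    }
  }
}'

You can compare what tokens each analyzer actually produces with the _analyze API, for example:

curl -XGET 'http://localhost:9200/_analyze?analyzer=whitespace' -d 'some sample text'

which makes it easier to spot index/search mismatches before they lead to missing results.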

On Monday, 16 December 2013 11:02:25 UTC+5:30, deep saxena wrote:

Can we set up a configuration in which indexing uses the standard tokenizer and
searching uses the whitespace tokenizer?
