Searching for multi-word phrases like "New York" with a whitespace analyzer

Hi,

I am using a whitespace analyzer on my content, but when I search for phrases whose words really belong together, like "New York" or "San Francisco", Elasticsearch breaks the query into separate tokens and matches documents containing "San", "New", or "York" on their own. How do I solve this kind of problem?
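To illustrate what I mean, here is a minimal sketch using the built-in `_analyze` endpoint (the analyzer name here is the stock `whitespace` analyzer; my actual mapping is configured the same way):

```
GET /_analyze
{
  "analyzer": "whitespace",
  "text": "New York"
}
```

This returns two separate tokens, `New` and `York`, so my search ends up matching documents that contain either word by itself rather than the phrase as a whole.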

(Sorry if the title doesn't make sense; I wasn't sure how to describe this problem without making the title two miles long.)
