Multiple words but same token

Hi,

I would like to index:

"driving with car A is a bad thing"

knowing that I have special terms composed of several words, e.g. "car A" and "car B".

If I use the standard approach, the "A" or "B" is lost. Ideally, for "driving with car A is a bad thing" the analyzer should return this:
("driving", "car A", "bad", "thing")

How would you do that?


Here are my filters so far:

filter: {
  protect_words: {
    type: "keyword_marker",
    keywords: ["car a", "car b"]
  },
  protect_words2: {
    type: "word_delimiter",
    protected_words: ["car a", "car b"]
  }
},
analyzer: {
  custom_level2: {
    type: "custom",
    tokenizer: "standard",
    filter: ["asciifolding", "lowercase", "protect_words"]
  },
  custom_level1: {
    type: "custom",
    tokenizer: "keyword",
    filter: ["asciifolding", "lowercase", "protect_words2"]
  }
}

Any help?


This will not produce exactly what you are looking for, but I would use shingles: see the shingle token filter documentation on the Elastic website.
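Something along these lines, for example (the filter/analyzer names and shingle sizes here are just illustrative, not taken from your config):

filter: {
  my_shingle_filter: {
    type: "shingle",
    min_shingle_size: 2,
    max_shingle_size: 2,
    output_unigrams: true
  }
},
analyzer: {
  my_shingle_analyzer: {
    type: "custom",
    tokenizer: "standard",
    filter: ["asciifolding", "lowercase", "my_shingle_filter"]
  }
}

With output_unigrams set to true the single words are kept as well, so "driving with car A is a bad thing" produces both "car" and "car a" as tokens.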

Then boost results found with the shingle analyzer higher than those found with the standard analyzer, so results containing "car A" will appear first.
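For example, assuming the text is indexed both in a "body" field with the standard analyzer and in a "body.shingles" multi-field with the shingle analyzer (field names are just an example), you could query both fields and boost the shingled one:

{
  "query": {
    "multi_match": {
      "query": "driving with car A is a bad thing",
      "fields": ["body", "body.shingles^3"]
    }
  }
}

Documents that match the shingle "car a" get the extra boost from body.shingles, so they come first.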

David

