curl -XGET 'localhost:9200/_analyze?tokenizer=keyword&filters=lowercase&char_filter=myCharFilter' -d 'this is a test'

and

curl -XGET 'localhost:9200/_analyze?tokenizer=keyword&filters=lowercase&char_filters=myCharFilter' -d 'this is a test'
I have never tried a custom char filter, but with analyzers, they only work if you analyze against an index that uses them. The analyzer needs to be loaded/instantiated before it can be used, and that only happens when an index that references the analyzer (in the mapping) is created. I would assume char filters have the same limitation.
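If that is the limitation, then registering the char filter in an index's analysis settings and using the index-scoped `_analyze` form should work. A rough sketch of what that would look like (the index name `test_index`, the analyzer name `my_analyzer`, and the mapping-type filter body are illustrative assumptions, not from the thread; only `myCharFilter` comes from the commands above):

```shell
# Create an index whose settings register myCharFilter and an
# analyzer that uses it (hypothetical names, mapping-type filter).
curl -XPUT 'localhost:9200/test_index' -d '{
  "settings": {
    "analysis": {
      "char_filter": {
        "myCharFilter": { "type": "mapping", "mappings": ["&=>and"] }
      },
      "analyzer": {
        "my_analyzer": {
          "tokenizer": "keyword",
          "filter": ["lowercase"],
          "char_filter": ["myCharFilter"]
        }
      }
    }
  }
}'

# Analyze against that index, so the registered analyzer
# (and its char filter) is actually instantiated and used.
curl -XGET 'localhost:9200/test_index/_analyze?analyzer=my_analyzer' -d 'this is a test'
```

This is a configuration sketch and assumes a running cluster on localhost:9200; the point is only that the char filter is resolved from the index settings rather than passed ad hoc on the query string.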
Ha, I didn't think that would be the limitation! It shouldn't be too hard to add. The eventual consumer, TransportAnalyzeAction, is just using an empty char filter array.