I pass custom logs to Elasticsearch using Logstash.
For example, a field named "fields1" contains the string "A+B=AB".
My problem is that when I search for the literal string "A+B=AB", documents whose fields contain only A, B, or AB are also returned as partial matches.
I will list my questions in order.
Question 1: I know that my problem occurs because Elasticsearch automatically analyzes the search terms, so can I pass the type of "fields1" from Logstash to Elasticsearch as "keyword"? When I checked, the type of fields1 had automatically been mapped as text.
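The field type itself is controlled by the index mapping in Elasticsearch (an index template or dynamic mapping), not by the Logstash event. A minimal sketch of an index template that maps fields1 as keyword, assuming Elasticsearch 7.8+ and a placeholder index pattern of custom-logs-*:

```
# Map fields1 as keyword on every new index matching the pattern
PUT _index_template/custom-logs
{
  "index_patterns": ["custom-logs-*"],
  "template": {
    "mappings": {
      "properties": {
        "fields1": { "type": "keyword" }
      }
    }
  }
}
```

A template only applies to indices created after it is installed, so existing indices would need to be reindexed.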
Question 2: After a lot of searching, I found that the reserved characters (+, -, ...) defined by Elasticsearch are treated as blanks. Is that correct?
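As far as I understand, + and - are reserved operators in the query_string syntax and can be escaped with a backslash (doubled inside JSON), but escaping only affects query parsing; against a text field the query text is still analyzed, so you get the same partial matching rather than an exact match. A rough sketch, with the index name as a placeholder:

```
# Escape the reserved + and = so the query parser treats them literally
GET custom-logs-*/_search
{
  "query": {
    "query_string": {
      "default_field": "fields1",
      "query": "A\\+B\\=AB"
    }
  }
}
```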
Question 3: I suspect question 2 is the cause. I searched for "\\+" but nothing was found. As a workaround, I found that you can replace "+" with a string like "plus" in Logstash before passing it to Elasticsearch. However, I haven't been able to find how to do that in Logstash; which filter should I use?
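If you do want to rewrite the value before it reaches Elasticsearch, the mutate filter's gsub option does regex-based string replacement. A minimal sketch, assuming the field is literally named fields1:

```
filter {
  mutate {
    # "[+]" is a regex character class matching a literal plus sign;
    # every "+" in fields1 is replaced with the word "plus"
    gsub => [ "fields1", "[+]", "plus" ]
  }
}
```

That said, fixing the mapping (question 1) is usually a better option than rewriting the data.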
How fields are mapped in Elasticsearch is determined by whether you have matching index templates in place or rely on dynamic mappings. By default, dynamic mapping creates a fields1.keyword subfield that can be used to search for exact matches.
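For example, if the default dynamic mapping is in effect, a term query on the keyword subfield should return only documents whose fields1 is exactly "A+B=AB" (the index name below is a placeholder):

```
# Exact match against the un-analyzed keyword subfield
GET custom-logs-*/_search
{
  "query": {
    "term": {
      "fields1.keyword": "A+B=AB"
    }
  }
}
```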
The thing I am most curious about: suppose I put the string "abcd+efgh-hijk" into Elasticsearch through Logstash.
Is the string stored in Elasticsearch as-is (including special characters such as + and -)?
If so, why do I get no results when I search for "\\+"?
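On that last point: the original string is kept verbatim in the document's _source, but full-text search runs against the inverted index built by the field's analyzer, and the standard analyzer used for text fields splits on and discards characters like + and -. You can see this with the _analyze API:

```
# Show how the standard analyzer tokenizes the example string
POST _analyze
{
  "analyzer": "standard",
  "text": "abcd+efgh-hijk"
}
```

This returns only the tokens abcd, efgh, and hijk; the + and - never make it into the index, which is why searching for an escaped "+" on the text field matches nothing even though _source still shows the original value.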