I am new to ELK, and I am sorry if my question has already been asked.
I am having trouble creating a query that displays only the messages containing the string [error] (including the square brackets) in the message field. I tried to escape the [ and ], but without success. The result always shows all messages that contain the string error, whether or not the square brackets are present.
Is there any solution for that?
If your field is a text field, its contents will have been chopped into individual words in the search index. This process is called "analysis", and the default configuration uses the "standard" analyzer, which is optimised for free text like this comment. Analyzers typically throw away punctuation and lowercase the words. You can see the effect they have using the _analyze API:
GET /_analyze
{
  "analyzer": "standard",
  "text": "[error] Something happened."
}
Note that the tokens put into the index are stripped of punctuation, including the square brackets, so you cannot search for a square bracket: it is simply not in the index.
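For the request above the response looks roughly like this: the brackets and the full stop are gone and the words are lowercased.

{
  "tokens": [
    { "token": "error",     "start_offset": 1,  "end_offset": 6,  "type": "<ALPHANUM>", "position": 0 },
    { "token": "something", "start_offset": 8,  "end_offset": 17, "type": "<ALPHANUM>", "position": 1 },
    { "token": "happened",  "start_offset": 18, "end_offset": 26, "type": "<ALPHANUM>", "position": 2 }
  ]
}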
You can modify your choice of analyzer to a less aggressive one that preserves the brackets in the index, but I expect a better approach is to pre-process the docs so that the log level (warn/error/debug etc.) is a separate structured keyword field in your JSON docs. This structured field would also allow you to display analytics charts summarising logs by type etc.
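As a rough sketch of that approach (the index name my-logs and the field name level are made up for illustration), the mapping and a query filtering on the level could look like this:

PUT /my-logs
{
  "mappings": {
    "properties": {
      "level":   { "type": "keyword" },
      "message": { "type": "text" }
    }
  }
}

GET /my-logs/_search
{
  "query": {
    "term": { "level": "ERROR" }
  }
}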
You can pre-process your docs in any number of ways:
using regex patterns in custom client code
using logstash or other forms of ETL tool (many of which recognise common log formats)
Thank you for your answer.
I solved the issue by using filters in logstash. The solution was to add a new field based on whether the string "[ERROR]" appears in the message part.
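A minimal sketch of that kind of filter, assuming the log line is in the message field and using an illustrative field name level, could look like this in the logstash pipeline config:

filter {
  # If the raw log line contains the literal string "[ERROR]",
  # add a structured field that can be queried and aggregated on.
  if "[ERROR]" in [message] {
    mutate {
      add_field => { "level" => "ERROR" }
    }
  }
}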