Hi All,
I'm getting started with custom logs. Currently I ship log lines without any additional processing.
My first problem is that I want to search for "ERROR", but my logs look like:
timestamp ERROR example message
timestamp INFO example message finished without error
By default _source.message is a text type field, so when I search for ERROR I receive both lines in the output; as far as I know, text type fields are case-insensitive.
I was looking for solutions on the internet, and now I have a few ideas but I'm not sure which is best:
1. Dissect processor. Add a processor to Filebeat. Example definition:

```yaml
- dissect:
    tokenizer: '%{timestamp} %{log_level} %{log_message}'
```
But I'm not sure whether it works with multiline messages, for example when a log line is followed by a .NET exception stack trace; ideally I think it's good to have %{log_message} and the exception in the same field. And I don't know whether a multiline message gets an additional field for the situation when we need to search for all errors and exceptions.
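To make the question concrete, this is roughly what I imagine (the paths and the timestamp pattern are just placeholders, not my real config):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log        # placeholder path
    # Treat every line that does NOT start with a date as a continuation
    # of the previous event (e.g. a .NET stack trace). The pattern is an
    # assumption -- it must match my real timestamp format.
    multiline.pattern: '^\d{4}-\d{2}-\d{2}'
    multiline.negate: true
    multiline.match: after

processors:
  - dissect:
      tokenizer: '%{timestamp} %{log_level} %{log_message}'
      field: "message"
      target_prefix: ""
```

If I understand correctly, multiline joining happens before processors run, so the stack trace should end up inside %{log_message} together with the first line, but I'd like confirmation of that.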
2. add_fields processor + regex, but this may require as many processors as there are log levels (INFO, WARN, ERROR, ...).
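Something like this is what I mean, one conditional processor per level (the regexes are rough guesses based on my log format):

```yaml
processors:
  - add_fields:
      when.regexp.message: '^\S+ ERROR '
      target: ""
      fields:
        log_level: ERROR
  - add_fields:
      when.regexp.message: '^\S+ INFO '
      target: ""
      fields:
        log_level: INFO
  # ...and again for WARN, DEBUG, etc.
```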
3. Ingest node. As in the examples above, I can try to split the log line into different fields. I could copy an existing module provided with Filebeat and try to customize it for my case.
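For example, something like this ingest pipeline (the pipeline name is made up), which Filebeat would then reference via output.elasticsearch.pipeline:

```
PUT _ingest/pipeline/my-custom-logs
{
  "description": "split custom log lines (hypothetical example)",
  "processors": [
    {
      "dissect": {
        "field": "message",
        "pattern": "%{timestamp} %{log_level} %{log_message}"
      }
    }
  ]
}
```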
4. Change the type of the _source.message field to keyword in the Filebeat index settings; that should make the field case-sensitive for search, but I'm not sure how it would affect Elasticsearch performance.
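Or maybe, instead of replacing the text type, I could add a keyword sub-field in the index template, so I keep full-text search and also get a case-sensitive field (template name and index pattern below are placeholders):

```
PUT _template/my-logs
{
  "index_patterns": ["my-logs-*"],
  "mappings": {
    "properties": {
      "message": {
        "type": "text",
        "fields": {
          "raw": { "type": "keyword", "ignore_above": 1024 }
        }
      }
    }
  }
}
```

Then I could query message.raw when I need an exact, case-sensitive match.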
Do you have an idea which option is best and why? Or maybe you have a better solution for this situation?
I played with the _search API in lots of ways, but it didn't help me; searches were always case-insensitive. I tried using a different analyzer for the query, but it only analyzes my query and doesn't affect the search results. To me it looks like data stored in a text field is saved as lowercase, but then I don't understand why my data looks unchanged in the search results. Could you explain how this works?
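To illustrate what I mean, checking one of my lines with the _analyze API shows the standard analyzer lowercases the tokens:

```
GET _analyze
{
  "analyzer": "standard",
  "text": "timestamp ERROR example message"
}
```

It returns the tokens timestamp, error, example, message, all lowercase, while the search results still show the original line. My guess is that the lowercased tokens go into the inverted index but _source is stored verbatim. Is that the right mental model?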
Additionally, I'd kindly like to ask: is it possible to send logs from one log file to a different index and the rest of the log files to the filebeat index, with one running Filebeat agent? And if yes, how? I want to play with my data, structure, etc., but I don't want to make a mess in the filebeat index.
Even if I make a mess in the default index, I can override all index settings from the agent, can't I?
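What I imagine is something like this, using a marker field plus conditional index routing in the output (paths and index names are placeholders; I'm not sure this is the right approach in 7.6):

```yaml
filebeat.inputs:
  - type: log
    paths: ["/var/log/myapp/experiment.log"]   # the file I want separated
    fields:
      routing: experiment                      # marker field (my own naming)
    fields_under_root: true

  - type: log
    paths: ["/var/log/other/*.log"]            # everything else

output.elasticsearch:
  hosts: ["localhost:9200"]
  indices:
    - index: "experiment-%{+yyyy.MM.dd}"
      when.equals:
        routing: "experiment"
  # events matching no condition should fall back to the default filebeat-* index
```

I also read that using a custom index name may require adjusting setup.template.name, setup.template.pattern, and the ILM settings; is that correct?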
I'm using Elasticsearch and Filebeat version 7.6.
Thanks,
Marcin