TL;DR: Can I use an Elasticsearch backend plus some sort of front end to filter single log files in (near) real time, somewhat like grep in the terminal?
In the company I work for, we often get logs to analyze when things crash or behave weirdly. Normally this is done by finding the file on a disk and then finding the relevant information, using either less or grep.
This works somewhat OK, but especially for our automated tests it is quite cumbersome to find the right file on disk, etc. That's why I was happy to find the ELK stack and also Graylog. I set up both an ELK stack and a Logstash --> Graylog/Elasticsearch pipeline, and both work well for getting the logs into these systems.
My main use case is to look at a single log file and find and filter the relevant information in that file. What I found is that Kibana and Graylog are geared more toward aggregating multiple files and generating graphs, etc. This is useful for spotting overall trends (e.g. the number of crashes) but not very useful for single-log analysis. The filtering part is more or less what I am looking for, but it works best when you know in advance what you are looking for (backtraces, for example).
Is there a way to use Elasticsearch + (something else?) to replace our current grep-centric way of working?
A few wanted features (in order of priority :-)):

0. Output format should be simple. (Our logs become harder to read when put in a table format rather than plain text, since the lines are typically quite wide.)
1. Free-text search (grep "pattern_to_match" file.log)
2. Exclude pattern (grep -v "pattern_to_exclude" file.log)
3. Ability to combine filters (grep "pattern_to_match" file.log | grep -v "pattern_to_exclude")
4. Ability to work on a single file, e.g. kibana_url:9000/?file=file.log
5. Highlight matches (grep --color=always "pattern_to_filter_and_highlight" file.log)
6. Turn "columns" such as the timestamp on and off (awk could do this, for example)
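For context, here is a rough sketch of how I imagine features 1-5 could map onto a single Elasticsearch query (a bool query with must/must_not clauses, a filter on the source file, and highlighting). The field names here (message, log.file.path, @timestamp) are assumptions on my part; they depend on how Logstash/Filebeat index the logs.

```json
{
  "query": {
    "bool": {
      "must": [
        { "match_phrase": { "message": "pattern_to_match" } }
      ],
      "must_not": [
        { "match_phrase": { "message": "pattern_to_exclude" } }
      ],
      "filter": [
        { "term": { "log.file.path": "file.log" } }
      ]
    }
  },
  "highlight": {
    "fields": { "message": {} }
  },
  "sort": [ { "@timestamp": "asc" } ]
}
```

So the backend side seems expressible; what I am missing is a front end that renders the hits as plain, grep-like text lines instead of a table.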
Sorry for the long post!