Hi all.
What we are trying to do is use Kibana to analyze post-mortem logs after a crash of our systems.
We have various components generating logs, each with a different line format.
We collect them and feed them to Filebeat, which, using patterns, sends them to Elasticsearch; we then view them in Kibana.
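To give an idea of what I mean by patterns, one common way to wire this up is a grok ingest pipeline in Elasticsearch that Filebeat points at. This is just a placeholder sketch, not our real configuration; the endpoint, pipeline name, and pattern are made up:

```python
# Rough sketch: register a grok-based ingest pipeline in Elasticsearch.
# The endpoint, pipeline name, and pattern are placeholders, not our real config.
import requests

ES = "http://localhost:9200"  # assumption: local, unsecured cluster

pipeline = {
    "description": "Parse one component's log format",
    "processors": [
        {
            "grok": {
                "field": "message",
                "patterns": [
                    "%{TIMESTAMP_ISO8601:log_time} %{LOGLEVEL:log_level} %{GREEDYDATA:log_text}"
                ],
            }
        }
    ],
}

resp = requests.put(f"{ES}/_ingest/pipeline/component-a-logs", json=pipeline)
resp.raise_for_status()
print(resp.json())  # expect {'acknowledged': True}
```

Filebeat can then reference such a pipeline through the pipeline option under output.elasticsearch.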
All good, but Kibana Discover seems to have some limitations, like showing only 500 events and not searching the log message part as plain text (it ignores some characters). I guess this can be overcome, but do you think Kibana is the right instrument for viewing all the logs from different sources together, as if they were just one big log?
I suppose it depends. You might consider using multiple indices and index patterns. But keep in mind that many visualizations only allow displaying data from one index pattern.
Or you may change the discover:sampleSize setting under Management - Advanced Settings to something greater than 500 (the default).
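If you prefer to script it, the same change can be made through the advanced-settings endpoint that the Kibana UI itself uses. The host, the lack of authentication, and the new value below are only examples for your setup:

```python
# Sketch: raise discover:sampleSize via Kibana's advanced-settings endpoint.
# Host/port and the new value are assumptions about your environment.
import requests

KIBANA = "http://localhost:5601"  # assumption: local Kibana, no auth

resp = requests.post(
    f"{KIBANA}/api/kibana/settings",
    headers={"kbn-xsrf": "true"},  # Kibana requires this header on write requests
    json={"changes": {"discover:sampleSize": 2000}},  # raise the 500-row default
)
resp.raise_for_status()
print(resp.json())
```

Keep in mind that very large sample sizes can slow Discover down, so raise it only as far as you actually need.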
Hi Nick
Thank you for pointing out sampleSize.
Why would it be good to use multiple indices? Right now we have the filebeat-* ones.
Also, I was wondering whether Kibana is the right tool to explore the logs post mortem, taking advantage of the fact that events from different sources are interleaved by timestamp...
Kibana can be very useful for quickly analyzing events from many different data sources. You might also want to check out our Metrics Solution for Infrastructure Monitoring.
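To illustrate why that works well for your use case, the sketch below shows roughly what Discover does under the hood: one query across the filebeat-* indices, sorted by @timestamp, so events from every component come back as a single time-ordered stream. The host, index pattern, and time range are only examples:

```python
# Sketch: one query across several indices, merged by @timestamp,
# which is essentially what Discover does with a filebeat-* index pattern.
# Host, index pattern, and time range are examples only.
import requests

ES = "http://localhost:9200"  # assumption: local, unsecured cluster

query = {
    "size": 1000,
    "sort": [{"@timestamp": {"order": "asc"}}],
    "query": {
        "range": {"@timestamp": {"gte": "now-1h"}}  # e.g. the hour before the crash
    },
}

# Searching "filebeat-*" (or a comma-separated list of indices) returns one
# time-ordered stream of events regardless of which component produced them.
resp = requests.post(f"{ES}/filebeat-*/_search", json=query)
resp.raise_for_status()
for hit in resp.json()["hits"]["hits"]:
    src = hit["_source"]
    print(src.get("@timestamp"), src.get("message"))
```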