I have configured the ELK stack locally with Layer 7 logs taken from the gateway. I can see the logs in the Kibana dashboard, but they are not displayed in order. The log below is a single complete entry, but Kibana splits it into multiple lines.
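(For anyone hitting the same splitting problem: the usual way to join a multi-line log entry into one event is a multiline codec on the input. A minimal sketch, assuming a hypothetical log path and that each new entry starts with an ISO8601 timestamp — adjust the pattern to whatever actually begins your entries:)

```conf
input {
  file {
    path => "/var/log/gateway/access.log"   # hypothetical path, replace with yours
    codec => multiline {
      # Lines NOT starting with a timestamp belong to the previous event
      pattern => "^%{TIMESTAMP_ISO8601}"
      negate => true
      what => "previous"
    }
  }
}
```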
I can see a couple of spelling mistakes in the file (messge and rebydebug). Your log is also not a COMBINEDAPACHELOG, so I doubt your grok expression will work. What output are you seeing? What do the resulting events look like when output to stdout with a rubydebug codec?
Thanks for your reply.
The configuration file has the correct spellings; the mistakes crept in while copying and pasting.
Yes, my log is not a COMBINEDAPACHELOG, but I don't know what to use as a grok expression. Please guide me on this.
I am seeing the logs in the Kibana dashboard, but row by row rather than event by event. As I said above, I need each complete event displayed as a single entry.
Have you read through the documentation? When building a grok expression, it is often recommended to do so gradually. Start with a simple expression that captures the first field and uses a GREEDYDATA pattern to capture the rest. Test it, and then gradually expand on the pattern until all data has been captured.
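As a rough illustration of that incremental approach (the field names here are placeholders, and I'm assuming your entries begin with an ISO8601 timestamp — substitute the pattern that matches your log):

```conf
filter {
  grok {
    # Step 1: capture only the leading timestamp, dump everything else
    # into "rest". Once this matches, replace GREEDYDATA piece by piece
    # with patterns for the next fields.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:rest}" }
  }
}
```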
As you have a JSON object at the end, you may want to capture all fields up to that point and store the whole JSON object in a field, so that you can apply the json filter to it after the grok filter has completed.
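A sketch of that grok-then-json chain, assuming hypothetical field names and that the JSON object is the last thing on the line:

```conf
filter {
  grok {
    # "json_payload" captures the trailing JSON object as raw text
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{DATA:prefix} %{GREEDYDATA:json_payload}" }
  }
  json {
    # Parse the captured string into structured fields under "payload"
    source => "json_payload"
    target => "payload"
  }
}
```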
I have gone through the docs and written the configuration, but when it comes to the grok and multiline codec parts I don't understand how to proceed. That is why I posted the question here, to understand it and get a working configuration, but it still isn't working out.
My main goal is to display the logs per event, not per row.
If you have gone through the documentation and have started building your grok pattern(s), you must surely have something other than the configuration file above that you can share?
When debugging Logstash config it often helps to remove the Elasticsearch output and work with the representation written to stdout using the rubydebug codec (still not correct in your config). Please show us an example event written to stdout so we can see exactly what the events you are talking about look like.
And what do you get from a stdout { codec => rubydebug } output?
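For reference, the debug output section we keep asking about is just this (swap it in for your elasticsearch output while testing):

```conf
output {
  # Prints each event as a readable Ruby hash, one event per block,
  # so you can see exactly which fields Logstash produced
  stdout { codec => rubydebug }
}
```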
The pattern in your date filter doesn't match reality so the @timestamp field (used by Kibana for sorting) won't get the correct value from the log. Try "ISO8601" instead of "dd/MMM/yyyy:HH:mm:ss Z".
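That would make the date filter look something like this (assuming the grok filter stored the raw timestamp in a field named "timestamp" — use whatever field name your pattern actually captures):

```conf
filter {
  date {
    # ISO8601 is a built-in keyword that matches timestamps like
    # 2016-05-10T14:23:05.123Z, so no hand-written pattern is needed
    match => ["timestamp", "ISO8601"]
  }
}
```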
Please replace the elasticsearch output with stdout as described earlier and show us the output from this. This makes debugging a lot quicker and easier.
Please copy and paste the output from the stdout output you have configured. This will show us the structure and content of your events and make it all much easier to debug.