I'm 3 days into learning Logstash, and during this learning process I tried to filter an unstructured log that contains key/value pairs, and I want to assign each value to a variable.
It looks like the JSON part of the log is malformed. If it were valid JSON, your configuration should work as long as you modify GREEDYMESSAGE to capture just {"Key1":"Value1","Key2":Value2,"Key3":"Value3"}.
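A minimal sketch of what that could look like, assuming the key/value pairs are the only braced portion of the line; the field name json_payload is chosen here purely for illustration:

    filter {
      grok {
        # Capture everything from the first "{" through the last "}" into its own field.
        match => { "message" => "(?<json_payload>\{.*\})" }
      }
      json {
        # This parse will only succeed once the payload is valid JSON;
        # in the sample above, Value2 is missing its quotes.
        source => "json_payload"
      }
    }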
Okay, so what follows "notify_event=" looks like a stringified Python array containing a single string value. You can use grok to only capture what's between notify_event=[u' and '], then feed the extracted JSON string to a json filter.
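Roughly like this, assuming the event text really does contain notify_event=[u'...']; the field name notify_json is just a placeholder:

    filter {
      grok {
        # Capture what sits between notify_event=[u' and '] into a new field.
        match => { "message" => "notify_event=\[u'(?<notify_json>[^']+)'\]" }
      }
      json {
        # Parse the extracted string as JSON if it turns out to be valid.
        source => "notify_json"
      }
    }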
How can I view these logs in Kibana and search them in Elasticsearch?
Go to Kibana, configure an index pattern that matches your index(es), and you're done. With your configuration Logstash is going to write to indexes named logstash-YYYY.MM.DD, where YYYY.MM.DD is the date when an event occurred.
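For reference, this is roughly what that output configuration looks like; localhost:9200 is a placeholder for your own Elasticsearch host:

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
        # Daily index naming; a Kibana index pattern of
        # "logstash-*" will match all of these indexes.
        index => "logstash-%{+YYYY.MM.dd}"
      }
    }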
How are these outputs redirected to Elasticsearch? Is it real-time data, and how can I save it to a file?
Logstash sends the data to Elasticsearch continuously. If things are working okay, events written to log files should be searchable in Kibana within a few seconds.
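If you also want a copy of the events on disk, you can add a file output alongside the elasticsearch output; the path below is only an example:

    output {
      elasticsearch {
        hosts => ["localhost:9200"]
      }
      # Additionally write each processed event to a local file.
      file {
        path => "/var/log/logstash/events.log"
      }
    }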