I just created an index using Elasticsearch and Logstash, but the fields I specified in the Logstash grok filter aren't showing up in Kibana.
I am not sure I understand the problem. The message field contains a timestamp, a number, a number, and a username. Could you provide an example of what you want the message field to look like?
I want the fields 'time', 'temp', 'light', and 'node' to be extracted from the message as separate fields, just as I've specified in the grok filter. Any idea how that can be done?
If you would post copy/pasteable text instead of screenshots it would be easier to help you. While debugging I strongly recommend that you use a simple stdout { codec => rubydebug } output. Adding ES and Kibana prematurely is a source of confusion and a general waste of time.
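For example, a minimal test pipeline could look like this (stdin input assumed here; swap in whatever input you're actually using):

```
input { stdin {} }

filter {
  grok {
    # Replace this placeholder with your actual grok expression.
    match => ["message", "%{GREEDYDATA:rest}"]
  }
}

output {
  # Prints every event with all its fields, so grok failures are easy to spot.
  stdout { codec => rubydebug }
}
```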
But as the _grokparsefailure tag indicates, your grok expression isn't working, and comparing your expression to the original message it's easy to see why. If we for a moment pretend that DATESTAMP_RFC2822 matches your timestamp, you'd want something like this (a sketch, assuming the values in the message are whitespace-separated):
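```
filter {
  grok {
    # Assumes whitespace-separated values: timestamp, two numbers, username.
    match => ["message", "%{DATESTAMP_RFC2822:time} %{NUMBER:temp} %{NUMBER:light} %{USERNAME:node}"]
  }
}
```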
but before that actually works you have to find an expression that matches your timestamp. DATESTAMP_RFC2822 is close, but the timezone and the comma throw it off. Perhaps something like this (a guess, since I can only infer the format from your screenshot; adjust the comma placement and timezone token to whatever your timestamp actually contains):
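```
%{DAY}, %{MONTHDAY} %{MONTH} %{YEAR} %{TIME} %{TZ}
```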
would work? Afterwards you'll have to assemble the separate fields into a single field that you can feed to the date filter. Or maybe the whole timestamp would be acceptable to the date filter, come to think of it.
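In the latter case the date filter could consume the captured field directly; something like this (the Joda-Time format string is a guess based on the pattern above, so tweak it to match your actual timestamp):

```
filter {
  date {
    # Parses the whole timestamp in one go and sets @timestamp from it.
    match => ["time", "EEE, dd MMM yyyy HH:mm:ss zzz"]
  }
}
```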