I have used the split filter to parse a multiline XML document, and one of the fields obtained as a result is parsed.logEntry.logTimestamp, as you can see in the following image:
What I would like now is a way to get this date information into a timestamp form that Kibana can interpret to manage indices.
I have tried to use the json filter as follows:
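(The filter configuration itself was not reproduced in this excerpt.) For reference, a json filter block in Logstash generally takes the shape sketched below; the source and target field names here are illustrative assumptions, not the configuration actually used in the thread.

```
filter {
  # Parse a JSON string held in one field into a nested object.
  # "message" and "parsed" are placeholder names, not the thread's real config.
  json {
    source => "message"
    target => "parsed"
  }
}
```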
I have tried what you mentioned, but what I am getting as the "ts" parameter is the literal string "%{year}/%{month}/%{day} %{hour}:%{minute}:%{second}". So it seems that we somehow have to extract the values from the parsed.logEntry.logTimestamp field.
As a check, I added a new field with the value obtained from [parsed][logEntry][logTimestamp], and the result was: {month=11, hour=9, year=2013, day=4, second=12, minute=24}. I therefore tried to extract the fields with a grok/dissect filter, but without success.
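The literal "%{year}/…" string suggests the sprintf references pointed at top-level fields that do not exist; if logTimestamp really is a hash with year/month/day/hour/minute/second sub-keys, referencing the full field paths should work. Below is a minimal sketch of that idea, assuming those sub-field names and reusing "ts" as the intermediate field; the date pattern is an assumption based on the values shown above.

```
filter {
  # Build a date string from the nested sub-fields; bare %{year} etc.
  # resolve to nothing, which is why the literal string ended up in "ts".
  mutate {
    add_field => {
      "ts" => "%{[parsed][logEntry][logTimestamp][year]}/%{[parsed][logEntry][logTimestamp][month]}/%{[parsed][logEntry][logTimestamp][day]} %{[parsed][logEntry][logTimestamp][hour]}:%{[parsed][logEntry][logTimestamp][minute]}:%{[parsed][logEntry][logTimestamp][second]}"
    }
  }
  # Turn that string into @timestamp so Kibana can use it for the index.
  date {
    match  => [ "ts", "yyyy/M/d H:m:s" ]
    target => "@timestamp"
  }
}
```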
To be honest I was ignoring your filter and just looking at the JSON you posted from Kibana.
We might do better if you removed the json filter and showed what %{[parsed][logEntry][logTimestamp]} looks like in output { stdout { codec => rubydebug } }. Sometimes (not always) there is a more idiomatic solution in Logstash.
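For reference, a minimal output block that prints every event in a readable form with the rubydebug codec looks like this:

```
output {
  # Dump each event to the console so the full field structure is visible.
  stdout { codec => rubydebug }
}
```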