I am aware that there have been many questions asked about this, and this is probably a repeat of sorts, but I have to ask someone to guide me on this. My log file looks like this -
<event timestamp="2017-04-17T14:00:59.9604138-04:00" .......> my log message </event>
I tried the grok filter and xml filter as below -

filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp}"]
  }
  date {
    match => ["logmessage.timestamp", "yyyy-MM-dd HH:mm:ss.SSSZ"]
  }
}
The xml just takes every attribute of the xml node "event" and populates into the "logmessage" field as - logmessage.timestamp etc.
The first match compares against the ISO8601 format, but the value doesn't make it into the Elasticsearch index as a date. I basically need the time from my log, in the given format, to be stored as a date and not a string, so that I can set up an index pattern based on this value - or update the default @timestamp to the log time from my file.
Can you please point out what I am doing wrong here and what the fix is? Sorry if this is a repeat question, but I couldn't find anyone with this field format for the timestamp.
You're asking Logstash to store the whole XML document in the logmessage field but also store the timestamp attribute in the logmessage field. Pick one of them, not both. If you remove the target option remember to disable store_xml.
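If you keep the target option, the filter could look like this - a sketch only, assuming your raw line is in the message field:

```
xml {
  source => "message"
  # everything parsed from the <event> element lands under this field,
  # so the timestamp attribute becomes [logmessage][timestamp]
  target => "logmessage"
}
```

If instead you drop target and pull out individual values with xpath, also set store_xml => false so the whole document isn't stored on top of that.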
The xml just takes every attribute of the xml node "event" and populates into the "logmessage" field as - logmessage.timestamp etc.
Yes, and you need to reference the field with the syntax [logmessage][timestamp].
match => ["logmessage.timestamp", "yyyy-MM-dd HH:mm:ss.SSSZ"]
Since your timestamps have seven fractional digits rather than three, I suspect the .SSS pattern won't match and you'll have to use something like "yyyy-MM-dd'T'HH:mm:ss.SSSSSSSZZ" instead. Or simply "ISO8601", which copes with the 'T' separator and the extra precision.
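Putting that together, a sketch of the corrected date filter (assuming the xml filter keeps target => "logmessage" as above):

```
date {
  # "ISO8601" tolerates the 'T' separator and the long fractional seconds
  match => ["[logmessage][timestamp]", "ISO8601"]
  # the date filter writes the parsed value to @timestamp by default,
  # which gives you a proper date field to base your index pattern on
}
```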