I have the following Logstash config for adding two new fields, "timestamp1" and "response1". When I use a mutate block I can see the new fields, but the data parsing from the log file is failing.
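A minimal sketch of the kind of mutate block being described, assuming (the actual config is not shown in this excerpt) that add_field simply references the same field names the grok filter is expected to produce:

```
# hypothetical reconstruction of the mutate block under discussion
filter {
  mutate {
    add_field => {
      "timestamp1" => "%{timestamp1}"   # copies the field onto itself
      "response1"  => "%{response1}"
    }
  }
}
```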
That sets the [timestamp1] field to the value of the [timestamp1] field (i.e. it does nothing) and similarly for [response1]. What are you trying to do with that?
If you are getting a _grokparsefailure tag then your grok patterns do not match. How are RESPPATTERN and DATEPATTERN defined?
RESPPATTERN [0-9]{1,2}ms (to match "6ms")
DATEPATTERN [0-9]{4}-[0-9]{2}-[0-9]{1,2}.?[0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2},[0-9]{3} (to match the timestamp)
I am trying to match the timestamp and "6ms" in the log.
By using mutate I get the new fields in Kibana, but because of the parse error the values are not coming through.
How do I make Logstash ship/assign the extracted values to the new fields? As I understand it, mutate creates a new field and assigns a value to it, while grok patterns extract the matched values from the logs.
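For reference, grok assigns whatever a pattern matches to a field through the %{PATTERN:fieldname} syntax, so a separate mutate is not needed for that. A sketch using the custom patterns exactly as posted above, declared inline via pattern_definitions rather than a patterns file:

```
filter {
  grok {
    # custom patterns as posted; they could equally live in a patterns_dir file
    pattern_definitions => {
      "DATEPATTERN" => "[0-9]{4}-[0-9]{2}-[0-9]{1,2}.?[0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2},[0-9]{3}"
      "RESPPATTERN" => "[0-9]{1,2}ms"
    }
    # whatever DATEPATTERN matches is stored in timestamp1, and RESPPATTERN in response1
    match => { "message" => "%{DATEPATTERN:timestamp1}.*%{RESPPATTERN:response1}" }
  }
}
```

Whether this actually matches depends on the log line itself, which is what the next reply addresses.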
The grok pattern has to match the log message. Your message starts with DATEPATTERN, it does not end with it. Also, the "6 ms" has a space in it, so you could try something along these lines:
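The exact snippet suggested in the reply is not reproduced in this excerpt; a sketch consistent with the advice, anchoring the date pattern at the start of the message and allowing an optional space before "ms", would be:

```
filter {
  grok {
    pattern_definitions => {
      "DATEPATTERN" => "[0-9]{4}-[0-9]{2}-[0-9]{1,2}.?[0-9]{1,2}:[0-9]{1,2}:[0-9]{1,2},[0-9]{3}"
      "RESPPATTERN" => "[0-9]{1,2} ?ms"   # optional space handles "6 ms"
    }
    # anchor the date at the start of the message; the response time can appear anywhere after it
    match => { "message" => "^%{DATEPATTERN:timestamp1}.*%{RESPPATTERN:response1}" }
  }
}
```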