Hi all,
Can someone help me convert a column value in a CSV from the date 11/09/2019 15:47 to the 2019-09-11T15:47:00Z format?
Thanks
Try this filter; it should work for you:
filter {
  grok {
    match => { "message" => "%{DATE:date} %{HOUR:hour}:%{MINUTE:minute}" }
  }
  mutate {
    add_field => { "timestamp" => "%{date} %{hour}:%{minute}" }
  }
  date {
    match => [ "timestamp", "dd/MM/yyyy HH:mm" ]
  }
}
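Outside Logstash, the same conversion the date filter performs can be sketched in plain Ruby with `DateTime.strptime` (the `raw` variable and format strings below are illustrative):

```ruby
require 'date'

# Parse "dd/MM/yyyy HH:mm" and re-emit as an ISO 8601 string with a Z suffix,
# mirroring what the Logstash date filter does to @timestamp.
raw = "11/09/2019 15:47"
parsed = DateTime.strptime(raw, "%d/%m/%Y %H:%M")
iso = parsed.strftime("%Y-%m-%dT%H:%M:%SZ")
puts iso  # prints "2019-09-11T15:47:00Z"
```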
When passing the timestamp into a ruby filter, as below, there is an exception:
Ruby exception occurred: no implicit conversion of LogStash::Timestamp into String
ruby {
  init => "require 'date'"
  code => '
    duration = 0.0
    createtime = event.get("@timestamp")
    ctime = DateTime.parse(createtime)   # this line raises the exception above
    event.set("Total_duration_sec", ctime)
  '
}
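The exception occurs because `DateTime.parse` only accepts a String, while `event.get("@timestamp")` returns a `LogStash::Timestamp` object. A minimal sketch of the failure and the fix, using a plain `Time` object to stand in for `LogStash::Timestamp`:

```ruby
require 'date'

# Passing a non-String raises TypeError, just like the LogStash::Timestamp case.
begin
  DateTime.parse(Time.now)  # Time stands in for LogStash::Timestamp here
rescue TypeError => e
  puts e.message  # "no implicit conversion of Time into String"
end

# Converting to a String first makes the parse succeed.
ctime = DateTime.parse(Time.utc(2019, 9, 11, 15, 47).to_s)
puts ctime.hour  # prints 15
```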
I need to parse this date using DateTime.parse.
Thanks
Add the filter configuration I gave; the output should look like the below. I hope this is what you are looking for.
The @timestamp field is updated with the time field you provided.
Hi Chandu, thanks for your time on this.
I used the date filter below, which gives me @timestamp in the 2019-09-11T15:47:00Z format, but it is of 'date' type. I need this @timestamp converted to a 'string' so that it can be used for parsing in ruby:
createtime = event.get("@timestamp")
ctime = DateTime.parse(createtime)
date {
  match => ["Time_Closed", "dd/MM/yyyy HH:mm"]
  timezone => "Europe/London"
  target => "@timestamp"
}
I used Ruby's .to_s for the string conversion, which worked. Thanks Chandu for your time.
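For completeness, a sketch of the corrected ruby filter with the `.to_s` conversion applied (the target field name `create_time_parsed` is illustrative, not from the original thread):

```
ruby {
  init => "require 'date'"
  code => '
    # @timestamp is a LogStash::Timestamp; convert it to a String before parsing
    createtime = event.get("@timestamp").to_s
    ctime = DateTime.parse(createtime)
    event.set("create_time_parsed", ctime.to_s)
  '
}
```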
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.