Below is my CSV data as inserted into Elasticsearch through Logstash:
"build_end_time" => "2021-01-13 01:29:49",
"build_duration" => "6409651",
"build_start_time" => "2021-01-12 23:43:00",
"build_date" => "2021-01-12",
"@timestamp" => 2021-02-02T11:40:50.747Z,
Currently "build_duration" => "6409651" is in milliseconds. Before inserting into Elasticsearch, is it possible to convert it to HH:mm:ss format in my logstash.conf file itself? If yes, please help me achieve this. I tried the approach below, but it did not succeed:
filter {
  csv {
    separator => ","
    columns => ["build_date", "start_time", "build_start_time", "build_end_time", "build_duration", "requester", "full-name", "metric_id", "config", "status"]
  }
  ruby {
    code => "event.set('build_duration', event.get('build_duration').to_f / 1000 * 60)"
  }
}
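For reference, this is the conversion I am aiming for, written as plain Ruby outside of Logstash (a sketch only; the helper name ms_to_hms is mine, and I assume build_duration always holds a whole number of milliseconds). With this math, 6409651 ms comes out as 01:46:49, which matches the gap between build_start_time and build_end_time above:

```ruby
# Convert a millisecond duration to an "HH:mm:ss" string.
# Sketch of the logic a logstash ruby filter's code block could run.
def ms_to_hms(ms)
  total_seconds = ms.to_i / 1000            # drop the millisecond fraction
  hours   = total_seconds / 3600
  minutes = (total_seconds % 3600) / 60
  seconds = total_seconds % 60
  format('%02d:%02d:%02d', hours, minutes, seconds)
end

puts ms_to_hms(6_409_651)   # => 01:46:49
```

Inside logstash.conf this body would go into the ruby filter's code option, reading the field with event.get('build_duration') and writing the result back with event.set.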