Converting milliseconds into HH:mm:ss in Logstash

Below is my CSV output data, as inserted into Elasticsearch through Logstash:

"build_end_time" => "2021-01-13 01:29:49",
"build_duration" => "6409651",
"build_start_time" => "2021-01-12 23:43:00",
"build_date" => "2021-01-12",
"@timestamp" => 2021-02-02T11:40:50.747Z,

Currently I have "build_duration" => "6409651" in milliseconds. Is it possible to convert it into HH:mm:ss format in my logstash.conf file itself, before inserting into Elasticsearch? If yes, please help me achieve this. I tried the below way but was unable to succeed.

filter {
  csv {
    separator => ","
    columns => ["build_date", "start_time", "build_start_time", "build_end_time", "build_duration", "requester", "full-name", "metric_id", "config", "status"]
  }
  ruby {
    code => "event.set('build_duration', event.get('build_duration').to_f / 1000 * 60))"
  }
}

Any help will be highly appreciated.

You could do that with a scripted field in Kibana (see here and here for examples).

In Logstash you could do it using a ruby filter. See here and here for code examples.
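As a sketch of what the ruby filter's `code` string would need to do: the attempt in the question divides by 1000 and multiplies by 60, which yields neither seconds nor a formatted string (and has an unbalanced parenthesis). The conversion itself is integer arithmetic on the millisecond count. A minimal, hedged example in plain Ruby (the helper name `format_duration` and the surrounding Logstash wiring are assumptions, not from the original thread):

```ruby
# Convert a millisecond duration into an "HH:mm:ss" string.
# Inside a Logstash ruby filter, this logic would operate on
# event.get('build_duration') / event.set('build_duration', ...).
def format_duration(ms)
  total_seconds = ms.to_i / 1000        # drop the millisecond remainder
  hours   = total_seconds / 3600
  minutes = (total_seconds % 3600) / 60
  seconds = total_seconds % 60
  format('%02d:%02d:%02d', hours, minutes, seconds)
end

puts format_duration(6409651)  # => "01:46:49"
```

Note this matches the sample event above: 6,409,651 ms is 1 h 46 m 49 s, which is exactly the gap between build_start_time 23:43:00 and build_end_time 01:29:49.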
