Add a new field in CSV based on a value in a column, and calculate the time difference

Hi all

I have a set of CSV data as below.

I am interested in adding a new field 'status' based on the 'endtime' column value, i.e. if it is not 'null' then the status should be 'completed', else 'open'.

I also need to calculate the time difference between 'endtime' and 'createtime' if the endtime value is not 'null', and then add a new field 'timetaken'.

Can anyone help me with this?


Take a look at this post for calculating the difference. You can set the status based on whether you get a _dateparsefailure on the endtime.
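For what it's worth, the arithmetic the linked post relies on is just subtracting the epoch-seconds of the two timestamps. A plain-Ruby sketch (using `Time` in place of Logstash's timestamp objects; the sample values are made up):

```ruby
require 'time'

# Two sample timestamps standing in for createtime and endtime
created = Time.parse("2019-08-22 10:00:00 UTC")
ended   = Time.parse("2019-08-22 10:05:30 UTC")

# to_f converts each to seconds since the epoch, so the
# difference is the duration in seconds, as in a ruby filter
duration = ended.to_f - created.to_f
puts duration  # 330.0
```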

Hi @Badger

I have the below config, but it is resulting in only the 'open' status:

csv {
    separator => ","
    columns => ["product","createtime","endtime"]
    skip_header => true
}

date { match => [ "createtime", "dd/MM/yyyy HH:mm" ] target => "@timestamp" }

date { match => [ "createtime", "ISO8601" ] target => "createtime" }

date { match => [ "endtime", "ISO8601" ] target => "endtime" }

ruby {
    code => '
        duration = 0.0
        ended = event.get("endtime")
        created = event.get("createtime")
        if created and ended then
            duration = ended.to_f - created.to_f
        end
        event.set("req_duration_sec", duration)
    '
}

if "_dateparsefailure" in [tags] {
    mutate { add_field => { "Status" => "open" } }
} else {
    mutate { add_field => { "Status" => "closed" } }
}


Can you suggest if there are any mistakes?


Change the pattern in the last two to match the first. Your timestamps are not ISO8601.
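Applying that suggestion, the last two date filters would reuse the dd/MM/yyyy HH:mm pattern from the first (a sketch, assuming endtime is formatted the same way as createtime):

```
date { match => [ "createtime", "dd/MM/yyyy HH:mm" ] target => "createtime" }
date { match => [ "endtime", "dd/MM/yyyy HH:mm" ] target => "endtime" }
```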

I tried the changes, but I'm still not able to index the fields with a null value for endtime.

The output section is as below:

output {
    elasticsearch {
        hosts => ["localhost:9200"]
        document_id => "%{productname}-%{pid}"
        index => "records"
    }
}
[2019-08-22T10:01:42,774][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"Express product-pid", :_index=>"records", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2cdf70c4>], :response=>{"index"=>{"_index"=>"records", "_type"=>"doc", "_id"=>"Express product-pid", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [endtime]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"NULL\""}}}}}

From the "reason"=>"Invalid format: \"NULL\"", it is surely because of the null value, but I need these events to be indexed into the same index "records".
Can you please validate?

elasticsearch has learned, from the documents where endtime is a timestamp, that endtime should be a timestamp. "NULL" is not a valid timestamp.

I would configure logstash to delete [endtime] if it is equal to the string "NULL".

if [endtime] == "NULL" {
    mutate { remove_field => [ "endtime" ] }
}
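One thing to watch with that removal (my own note, not from the thread): if the field is removed before the endtime date filter runs, that filter is skipped for those events and _dateparsefailure is never tagged, so a Status conditional based on that tag would stop firing for the null rows. A sketch that sets Status in the same conditional instead:

```
if [endtime] == "NULL" {
    mutate { remove_field => [ "endtime" ] }
    mutate { add_field => { "Status" => "open" } }
} else {
    mutate { add_field => { "Status" => "closed" } }
}
```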

thank you @Badger

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.