@timestamp format while exporting from elasticsearch to csv

Hi,
I want to know how I can change the format of @timestamp while exporting from Elasticsearch to CSV via Logstash. In Kibana I have set the format to "Feb-03-2021 20:03:04.603" using Advanced Settings. I want the same format in the CSV as well. I have added the following filter in Logstash, which unfortunately is not working.

filter {
      date {
        match => [ "@timestamp", "MMM-dd-yyyy HH:mm:ss.SSS" ]
      }
    }

Currently I am getting the date as "2021-02-03T12:30:00.315Z" in the CSV.

If you want to reformat a LogStash::Timestamp then use a ruby filter and strftime.
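As a minimal sketch of the strftime pattern involved (plain Ruby outside Logstash; the example timestamp is made up to match the Kibana format above):

```ruby
require 'time'

# A UTC time matching the format shown in Kibana: Feb-03-2021 20:03:04.603
t = Time.utc(2021, 2, 3, 20, 3, 4, 603_000)

# %b = abbreviated month name, %L = milliseconds.
formatted = t.strftime('%b-%d-%Y %H:%M:%S.%L')
puts formatted  # => Feb-03-2021 20:03:04.603
```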

Hi Badger,

ruby {
	code => "event.set('@timestamp' , event.get('@timestamp').time.strftime('%b-%d-%Y %H:%M:%S.%L'))"
	}

I tried this and got a RubyException. Can you please tell me the correct way?

There is an example here.


Hi Badger,
I tried that. Will it change the original @timestamp format, or do I have to add a new field?
Here is what I have tried:

ruby {
        code => '
            t = Time.at(event.get("@timestamp").to_f)
            event.set("@timestamp", t.strftime("%Y-%m-%d"))
        '
    }

I am still getting a RubyException.

It's resolved :slightly_smiling_face:
Thanks Badger
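For reference, the RubyException in the attempts above happens because @timestamp may only hold a LogStash::Timestamp, not a String, so it cannot be overwritten with the strftime result. A sketch of the usual workaround, writing the formatted value to a separate field (the field name formatted_timestamp is my choice, not from the thread):

```
filter {
  ruby {
    # @timestamp must stay a LogStash::Timestamp, so put the
    # formatted string in a new field instead of overwriting it.
    code => 'event.set("formatted_timestamp", event.get("@timestamp").time.strftime("%b-%d-%Y %H:%M:%S.%L"))'
  }
}
```

The csv output's fields list would then reference formatted_timestamp instead of @timestamp.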