Parse string to timestamp in csv

Hi, I'm trying to parse the 1st column of each line in my csv into a timestamp field, but Elasticsearch is still treating it as a string, not a timestamp. Please help.

    filter {
        csv {
            columns => ["nowdatetime","Total_sum","DIFF"]
            convert => {
                "Total_sum" => "integer"
                "DIFF" => "integer"
            }
        }
        date {
            match => [ "nowdatetime", "yyyy-MM-dd'T'HH:mm:ss'.'SSS" ]
            target => "nowdatetime"
        }
    }

Where did I go wrong in this Logstash config?

Do you get a _dateparsefailure tag? If so, what does the nowdatetime field look like? If not, then the field is mapped as a string in Elasticsearch, and that's not going to change for the current index. If you start over with a new index, does it get mapped as a date?

Ok, I deleted the index. nowdatetime does get a value like '2019-03-08 10:25:17.105', but it's a string, not a timestamp.
When I import it into Kibana, Kibana only offers the default @timestamp in the 'Time Filter field name' dropdown. I want 'nowdatetime' to be an option there.

What should I change?

'2019-03-08 10:25:17.105' does not match the format in your date filter.
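The mismatch can be demonstrated with an analogous check in Python. Logstash date patterns are Joda-style, so the strptime patterns below are only an approximation of the filter's behavior, not Logstash itself:

```python
from datetime import datetime

value = "2019-03-08 10:25:17.105"

# yyyy-MM-dd'T'HH:mm:ss'.'SSS expects a literal 'T' between date and time,
# but the value uses a space, so this pattern fails:
try:
    datetime.strptime(value, "%Y-%m-%dT%H:%M:%S.%f")
    print("matched")
except ValueError:
    print("no match: value has a space separator, not 'T'")

# A pattern with a space separator parses it fine:
print(datetime.strptime(value, "%Y-%m-%d %H:%M:%S.%f"))
```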

Got it... I had an extra T in it.

In my logstash logs, I found

"nowdatetime" => 2019-03-08T15:41:17.960Z,

In the Kibana, I got March 8th 2019, 10:40:17.916

What are those 'T' and 'Z' in the Logstash output?

Still does not match. Try

    date {
        match => [ "nowdatetime", "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'" ]
        target => "nowdatetime"
    }
I changed it to match => [ "nowdatetime", "yyyy-MM-dd' 'HH:mm:ss'.'SSS" ] and it works. The column is now a timestamp type.

Here is a line from my csv
2019-03-08 11:02:19.08, 0, 0

I had the 'T' in the expression, and Logstash treated the column as a string. My timestamp values don't have a 'Z' either.
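What the working pipeline does with that sample row can be sketched in Python: split the csv fields, parse the space-separated timestamp, and convert the two numeric columns to integers (the equivalent of the csv filter's convert option). This is an illustration only, not Logstash code:

```python
import csv
from datetime import datetime
from io import StringIO

line = "2019-03-08 11:02:19.08, 0, 0"  # the sample row from the thread

row = next(csv.reader(StringIO(line), skipinitialspace=True))
# Space-separated timestamp with no 'T' and no 'Z';
# %f accepts 1-6 fractional digits, so ".08" parses fine.
nowdatetime = datetime.strptime(row[0], "%Y-%m-%d %H:%M:%S.%f")
total_sum, diff = int(row[1]), int(row[2])  # the convert => "integer" step
print(nowdatetime, total_sum, diff)
```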

I don't understand why logstash would print logs containing 'T' and 'Z'.
"nowdatetime" => 2019-03-08T15:41:17.960Z

It's the ISO 8601 format: the 'T' separates the date from the time, and the 'Z' means the timestamp is in UTC. Logstash normalizes parsed dates to UTC, and Kibana renders them in your browser's local time zone, which is why the displayed time differs.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.