Logstash : Time Field

Hi,

I'm trying to parse a CSV file on my local machine via logstash and visualize it on Kibana. But the date field ("timestamp") is being parsed as a string, and not as a date type.

This is how my config looks:

input {
  file {
    path => "C:\Users\310309705\Desktop\events-jan2.csv"
    start_position => beginning
  }
}

filter {
  csv {
    columns => ["siteid", "state", "hostaddress", "hostname", "service", "timestamp"]
  }
  date {
    match => ["timestamp", "MM-dd-yyyy HH:mm:ss"]
  }
}

output {
  stdout { }
  elasticsearch {
    hosts => "localhost"
    index => "events-jan2"
  }
}


Please help.

Thank you,
Rahul

It's taking the time at which Logstash parsed the file, not the actual time from the data itself.

What does an event look like?

siteid   state      hostaddress   hostname   service                timestamp
IDM02    CRITICAL   192.16.9.22   19.22      VirtualizationStatus   12/11/2017 2:40

Should I try renaming the "timestamp" field to something else? (Just a thought.)

The pattern you provided in the date filter does not match the format of the timestamp in the event, which is why it fails. You need to update the pattern to match what you actually have in the event, e.g. dd/MM/yyyy h:mm (if that is the correct order for day and month).

The data is in MM/dd/yyyy format.

Then adjust the pattern accordingly.
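For the sample event above, the filter section would look something like this (a sketch, assuming the column names from your original config and that the month comes first in the timestamp):

filter {
  csv {
    columns => ["siteid", "state", "hostaddress", "hostname", "service", "timestamp"]
  }
  date {
    match => ["timestamp", "MM/dd/yyyy h:mm"]
  }
}

Note that h (single-digit hour) is used rather than HH, since the sample shows 2:40 without a leading zero, and that by default the date filter writes the parsed result into @timestamp.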

That's how I adjusted the pattern, but still no luck.

This test of the date format parsing seems to work for me:

input {
  generator {
    lines => ['12/11/2017 2:40']
    count => 1
  } 
} 

filter {
    date {
        match => ["message", "MM/dd/yyyy h:mm"]
    }
}

output {
  stdout { codec => rubydebug }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.