Logstash : Time Field


(Rahul Pm) #1

Hi,

I'm trying to parse a CSV file on my local machine via Logstash and visualize it in Kibana, but the date field ("timestamp") is being parsed as a string, not as a date type.

This is how my config looks:

input {
  file {
    path => "C:\Users\310309705\Desktop\events-jan2.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["siteid", "state", "hostaddress", "hostname", "service", "timestamp"]
  }
  date {
    match => ["timestamp", "MM-dd-yyyy HH:mm:ss"]
  }
}

output {
  stdout { }
  elasticsearch {
    hosts => "localhost"
    index => "events-jan2"
  }
}


Please help.

Thank you,
Rahul


(Rahul Pm) #2

It's using the time at which Logstash parsed the file, not the actual time in the data itself.


(Christian Dahlqvist) #3

What does an event look like?


(Rahul Pm) #4
siteid  state     hostaddress  hostname  service               timestamp
IDM02   CRITICAL  192.16.9.22  19.22     VirtualizationStatus  12/11/2017 2:40

(Rahul Pm) #5

Should I try renaming the "timestamp" field to something else? (Just a thought.)


(Christian Dahlqvist) #6

The pattern you provided to the date filter does not match the format in the event, which is why it fails. You need to update the pattern to match what you actually have in the event, e.g. dd/MM/yyyy h:mm (if that is the correct order for day and month).
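As a sketch, assuming the timestamps really are in MM/dd/yyyy order (and reusing the field name from the config in the first post), the date filter would become:

filter {
  date {
    # "h" matches a one- or two-digit hour like "2"; "HH" would require two digits
    match => ["timestamp", "MM/dd/yyyy h:mm"]
  }
}

Note that "MM-dd-yyyy HH:mm:ss" from the original config uses hyphens and expects seconds, neither of which appears in the event, so the match can never succeed; when a date filter fails to match, Logstash tags the event with _dateparsefailure and leaves @timestamp at the ingest time, which is the behavior described above.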


(Rahul Pm) #7

The data is in MM/dd/yyyy format.


(Christian Dahlqvist) #8

Then adjust the pattern accordingly.


(Rahul Pm) #9

I have adjusted the pattern accordingly, but still no luck.


(Christian Dahlqvist) #10

This test of the date format parsing seems to work for me:

input {
  generator {
    lines => ['12/11/2017 2:40']
    count => 1
  } 
} 

filter {
    date {
        match => ["message", "MM/dd/yyyy h:mm"]
    }
}

output {
  stdout { codec => rubydebug }
}

(system) #11

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.