Date/time field formatting from CSV input

Please excuse the ignorance of my question; I'm still trying to get my arms around working with ELK, which is not my forte.
I'm generating a CSV for a report that contains a few columns with date/time stamps.
During generation I'm normalizing these fields from their original formatted state to their (presumably more flexible) millisecond representation.
Now I'm trying to work out what I need to do with these fields when sending the data to ES. Do I configure a plugin to convert them back to a human-readable format? What is the suggested workflow for this?

Do you mean milliseconds since the epoch? If so, I would suggest using a date filter to match the UNIX_MS format. That will create a Timestamp object (Logstash has a Timestamp data type).
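Something like the following pipeline sketch might be a starting point. The file path, column names, and index name are hypothetical placeholders; swap in whatever your report actually contains:

```
input {
  file {
    path => "/path/to/report.csv"   # hypothetical path to your report
    start_position => "beginning"
    sincedb_path => "/dev/null"     # re-read the file on each run while testing
  }
}

filter {
  csv {
    separator => ","
    columns => ["id", "created_at", "message"]  # hypothetical column names
  }

  date {
    # Parse epoch milliseconds into a Logstash Timestamp.
    match  => ["created_at", "UNIX_MS"]
    # Overwrite the original field; omit target to set @timestamp instead.
    target => "created_at"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "report-%{+YYYY.MM.dd}"
  }
}
```

Note that by default the date filter writes to @timestamp; setting target keeps the parsed value in the original field.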

When a Timestamp field is sent to Elasticsearch, date detection will map the field as a date (really date+time) in the index.
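You can verify this once a document has been indexed by checking the mapping. Assuming the hypothetical index and field names from the sketch above, the response excerpt might look like:

```
GET report-2024.01.15/_mapping

# Hypothetical response excerpt: created_at was detected as a date.
{
  "report-2024.01.15": {
    "mappings": {
      "properties": {
        "created_at": { "type": "date" }
      }
    }
  }
}
```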

Yep, that's the format I'm talking about.
I see you answered a similar question a few years back.

Sorry, I'll work on my forum searching chops.

Regards!
