Elastic timestamp ingest via Logstash

I'm not sure what you mean by a JSON event. This is a CSV file I am ingesting.

Also, I renamed accessdat and createdat to what you see in the Kibana screenshot.

If you mean the full line of the actual event, I cannot post that here.

Logstash transforms the CSV data into JSON documents, which is what is indexed into Elasticsearch. Please show the data in JSON form; you can remove fields that are unused or that contain the results of this logic.

If you are renaming fields in your Logstash config, make sure the filters run in the correct order so that the date filters reference the field names that actually exist at that point in the pipeline.

This was the order in Logstash:

  date {
    match => ["accessdat", "yyyy-MM-dd HH:mm:ss"]
    #target => "accessdat"
  }
  date {
    match => ["createdat", "yyyy-MM-dd HH:mm:ss"]
    #target => "createdat"
  }
  mutate {
    rename => { "accessdat" => "access date" }
    rename => { "createdat" => "create date" }
  }

This is a sample row:

"id","user","accessedat","createdat","type","createdipaddress","organization_id","email_domain"
"11111111","22222222","2024-02-19 03:46:15","2024-02-19 03:46:11","persistent","3.3.3.3","","abc.aa"

OK, it is working now. I moved the mutate rename above the date match, and I now see those columns as proper date fields.
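For anyone following along, a sketch of the reordered filters, assuming the date filters were also updated to reference the renamed fields and to set target so each renamed field holds the parsed date (rather than overwriting @timestamp):

  mutate {
    rename => { "accessdat" => "access date" }
    rename => { "createdat" => "create date" }
  }
  date {
    match => ["access date", "yyyy-MM-dd HH:mm:ss"]
    target => "access date"
  }
  date {
    match => ["create date", "yyyy-MM-dd HH:mm:ss"]
    target => "create date"
  }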

I also have a timestamp like this in a different file I need ingested:

2024-05-15T07:25:55+00:00

What would you suggest I use instead of this:

    match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
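That value is in ISO 8601 format, so the date filter's built-in ISO8601 pattern should match it in place of an explicit format string:

  date {
    match => ["timestamp", "ISO8601"]
  }

The ISO8601 keyword handles the T separator and the +00:00 time zone offset, which the yyyy-MM-dd HH:mm:ss pattern would not.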