Elastic timestamp ingest via Logstash

I have a CSV file I am ingesting. The 'timestamp' field shows the time like this: 2024-03-14 09:30:58.000.

In Logstash, my date filter is shown below. When the ingest runs, the Elasticsearch index shows the timestamp field as text and not a proper timestamp field. Can someone tell me what is wrong here?

  date {
    match => ["Timestamp","YYYY-MM-dd HH:mm:ss"]
  }

I have tried yyyy in both upper and lower case; neither works.

There are two issues in your case.

First, this is related to the mapping of the field: you need to adjust the mapping of the Timestamp field in your template so that it is a date type.

Second, this filter would parse your Timestamp field using the specified format and store the result in the @timestamp field, which is the default target. The issue here is that the format is wrong: it does not match your date string.

Your date string is 2024-03-14 09:30:58.000, but your format will not match because it is missing the milliseconds; it should be YYYY-MM-dd HH:mm:ss.SSS.

Are you suggesting this config? I had this mutate convert filter earlier, but Logstash errored out. It said the timestamp convert needed to be string, boolean, or keyword.

  mutate {
    convert => { "timestamp" => "date" }
  }
  date {
    match => ["timestamp","YYYY-MM-dd HH:mm:ss.SSS"]
  }

No, the mapping is done on the Elasticsearch side, not in Logstash. If you didn't specify a mapping for your index when creating it, or via an index template, Elasticsearch will try to infer the type of the field, but sometimes it does not infer the correct type, which seems to be your case.

You need to change the type in Elasticsearch by recreating the index with the correct mapping, or by using an index template and then creating a new index.
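
If you want an example, here is a minimal index template sketch (the template name and index pattern are placeholders, adjust them to match your own index; the custom format is only needed if you index the raw string instead of a parsed date):

PUT _index_template/csv-template
{
  "index_patterns": ["csv-*"],
  "template": {
    "mappings": {
      "properties": {
        "Timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss.SSS||strict_date_optional_time"
        }
      }
    }
  }
}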

Try something like this:

date {
    match => ["Timestamp","YYYY-MM-dd HH:mm:ss.SSS"]
    target => "Timestamp"
}

This should reformat the field into ISO8601, which is a format Elasticsearch can dynamically map as a date.

You will need to start with a new index as the existing mapping will not be changed.


Thanks, this worked. Now I have the correct timestamp showing up. Thanks, folks, for your assistance.

I now have two columns to add in a date filter. I thought this config would work, but it does not:

 date {
     match => ["accessdat","yyyy-MM-dd HH:mm:ss", "createdat","yyyy-MM-dd HH:mm:ss"]
     target => ["accessdat", "createdat"]
 }

Use the date filter twice, once for each field.
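
Something like this, using the field names from your config:

date {
    match => ["accessdat", "yyyy-MM-dd HH:mm:ss"]
    target => "accessdat"
}
date {
    match => ["createdat", "yyyy-MM-dd HH:mm:ss"]
    target => "createdat"
}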


Thanks, will try it now.

For a CSV ingest, should pipeline.workers be set to 1?


No, that is not required.

Thanks. Restarting this process. Logstash is running, but the index so far has zero size.

The Timestamp fields are not working.

Here is my config:

 date {
     match => ["accessdat", "yyyy-MM-dd HH:mm:ss"]
     target => "accessdat"
 }
 date {
     match => ["createdat", "yyyy-MM-dd HH:mm:ss"]
     target => "createdat"
 }

Here is how they appear in the log:

2024-02-19 03:46:11

and here is how they show up:

2024-03-11T09:59:29.000Z

Change the target field so it does not overwrite the original, then please share a sample event showing the issue.
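
For example, something like this, where accessdat_parsed is just a hypothetical name for debugging:

date {
    match => ["accessdat", "yyyy-MM-dd HH:mm:ss"]
    target => "accessdat_parsed"
}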

Meaning change the target to a different name altogether, or just comment it out?

When I comment out the target =>, they show up like this. The index shows them as text columns, not actual date columns:

[screenshot omitted]

Please show the full JSON document.

I can only show a partial view with the times:

It looks like it is setting @timestamp equal to the create date.

[screenshot omitted]

It doesn't look like the field names match what is in the event. Please show the JSON event, not screenshots from Kibana.