Logstash pulling in Date as a Timestamp

I'm losing my mind here so I'm hoping someone can point out where I'm being dumb.

My DB view has a date field and the dates are formatted "yyyy-MM-dd" (2023-08-15). However, when ingesting the data from the view with Logstash it freaking turns it into a Timestamp (2023-08-15T06:00:00.000Z) and I cannot figure out how to stop it from doing that.

I've tried:

  1. Enforcing conversion to a date in the statement (created::date).
  2. Using the date filter:
date {
        match => ["created", "yyyy-MM-dd"]
        target => "created"
    }
  3. Using a date filter to push it into a whole new field:
date {
        match => ["created", "yyyy-MM-dd"]
        target => "created_date"
}

and it just keeps the timestamp format no matter what I try. There are no errors during startup or indexing (unless I try to enforce the format in the custom-mappings file, in which case I get a date_time_parse exception because it's trying to parse the timestamp as a date).

What am I missing?

Installed the date_formatter plugin and I got it to format the date properly, but for some fields it just spits out a bunch of warning log messages (typo included) if the field's column is empty in the DB.

date_formatter  {
        source => "created"
        target => "created_date"
        pattern => "yyyy-MM-dd"
}

^ that one works with no warning messages.

This one throws the warning message below because there are some nulls in the column:

date_formatter  {
        source => "last_updated"
        target => "last_updated_date"
        pattern => "yyyy-MM-dd"
    }
"Unsupporter source field. It is neither a ruby Time or a Logstash::Timestamp"

Can I just do this somehow with Ruby Code? By just converting the date to a string and formatting it from there? Or maybe a better way? It really shouldn't be this difficult, especially since the DB already has the data in the date format I want! Why does Logstash convert it to a Timestamp?!

Came up with a (dirty) solution for the time being, unless someone can show me how to stop Logstash from ingesting a date field as a timestamp.

I had two issues to address. First, the date field on the main query:

if ([created]) {
        date_formatter  {
            source => "created"
            target => "Creation_Date__TYPE_DATE__FORMAT_DATE"
            pattern => "yyyy-MM-dd"
        }
    }

This converts the date field from the main query properly. The if statement is needed as a null check; otherwise you get a flood of warning messages whenever the object has a null for that particular field. Remember to remove the old source field later in the mutate {} block.
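For example, the cleanup can happen in the same filter chain (a sketch; `created` is the source field from the block above):

```
mutate {
    remove_field => ["created"]
}
```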

The next issue: we join in child objects using the jdbc_streaming plugin, so I also needed to parse any dates that occur there:

if ([Monitorings]) {
    ruby {
        code => "
            visitDate = []
            event.get('[Monitorings]')&.each { |k|
                myVisitDate = k['visit_date']
                if myVisitDate
                    myVisitDate = Time.parse(myVisitDate.to_s).strftime('%Y-%m-%d')
                end
                k['Visit_Date__TYPE_DATE__FORMAT_DATE'] = myVisitDate

                k.delete('visit_date')
                visitDate << k['Visit_Date__TYPE_DATE__FORMAT_DATE'].to_s
            }
            event.set('Visit_Date__TYPE_DATE__FORMAT_DATE', visitDate&.compact)
        "
    }
}

But why do you want to stop this? It's the right thing to do IMO. If you don't want the text transformed into a date, just don't use the date filter.

The jdbc input will automatically convert a date column fetched from the DB to a LogStash::Timestamp, no date filter required. And yes, that is usually a good thing.

It can be converted to a string in any desired date format using ruby + strftime.
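For instance, the core conversion is a one-liner in plain Ruby (sample timestamp value assumed):

```ruby
require 'time'

# The jdbc input hands the DATE column over as a timestamp; parse it
# and strftime it back down to a plain yyyy-MM-dd string.
ts = Time.parse('2023-08-15T06:00:00.000Z')
puts ts.strftime('%Y-%m-%d')  # => 2023-08-15
```

Inside a ruby filter the same call just operates on event.get / event.set instead of a local variable.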


Because the field isn't a timestamp in the DB; it's just a date. And the users wouldn't/don't care to see a "00:00:00.000" at the end of date fields for all their records.

ruby+strftime is what I ended up using (see above) for the jdbc_streaming areas. I'm planning on converting the date_formatter blocks to use it as well.

Users have a direct access to Elasticsearch? There's no frontend?

No, they get the data returned through a frontend UI based on their search criteria. And instead of having to loop through tens of thousands of records to parse dates into the format they want every time they run a report, it's (or should be) easier to just have the data indexed that way in the first place.