Timestamp field reordering events from patterns?

Hi,

Yesterday I came across something I can't understand.

Let's say I stopped my Logstash at 10 AM and started it again at noon. Usually the backlog of logs comes in and gets re-ordered as normal (I can't figure out how, since the @timestamp field should be the ingest time). By "as normal" I mean every event ends up near its creation time and everything is put back into the timeline.

Yesterday my pattern had a conflict or a missing field, and when I started Logstash all the logs spiked with @timestamp set to the time I started Logstash, which created a false timeline.

I tried to reproduce it, but it looks like I am missing some pieces of information.

My logs are shipped through Filebeat to Logstash, then to Elasticsearch.
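For context, the shipping side is just Filebeat tailing files; a simplified sketch of that kind of config (the paths and hosts here are made up):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/myapp/*.log

output.logstash:
  hosts: ["logstash:5044"]

Filebeat keeps a registry of how far it has read in each file, so when Logstash comes back after an outage it re-sends the whole backlog rather than only new lines.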

I've tried to:

  • Set a mapping with a date type and format, to try to replace @timestamp in the index pattern with event.created (or something more accurate) to keep track of the event timeline:
PUT index
{
  "mappings": {
    "properties": {
      "datedate": {
        "type": "date" ,
        "format": "YYYY MM DD HH:mm:ss"
      }
    }
  }
}
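(It turned out the uppercase letters in that format matter; more on that further down. For reference, a version of the same mapping using calendar year and day of month would look like this, with the same hypothetical index name:)

PUT index
{
  "mappings": {
    "properties": {
      "datedate": {
        "type": "date",
        "format": "yyyy MM dd HH:mm:ss"
      }
    }
  }
}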

Combining this with a Logstash filter using the matching date format:

mutate {
  add_field => { "datedate" => "2020 10 07 16:01:01" }
}

date {
  # no target is set, so this overwrites @timestamp with the parsed value
  match => [ "datedate", "YYYY MM dd HH:mm:ss" ]
}
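To check what the filter actually emits before Kibana gets involved, a throwaway output like this is handy (a debugging sketch, not part of the real pipeline):

output {
  stdout { codec => rubydebug }
}

It prints each event with @timestamp and datedate exactly as Logstash produced them.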

It seemed to work, but Kibana refused to understand the first part of the year despite the date format configured in Kibana's settings (note that "dd" in Logstash's Joda-style patterns is not "dd" in Kibana's Moment.js-style format strings).

And the final JSON doc :

"datedate": "2020 10 07 16:01:01",
"fields": {
    "datedate": [
      "2019-12-30T16:01:01.000Z"
    ],
    "@timestamp": [
      "2020-10-07T14:01:01.000Z"
    ]
  }, 

The original field in my doc is still shown as 2020 10 07, but Kibana shows a second, different value, which I believe it built from the index pattern.
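That "fields" block is not something Kibana computes on its own; it comes back from Elasticsearch when doc values are requested, so the 2019 value should be reproducible outside Kibana with something like this (a sketch against the same index):

GET index/_search
{
  "docvalue_fields": [ "datedate", "@timestamp" ]
}

Elasticsearch parsed the _source string at index time using the mapping's format, and that parsed value is what the doc value holds.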

I don't know how all this works, and I'd like to find answers on how the @timestamp field behaves and how we can improve our configuration.

From what I've always understood, @timestamp is the ingestion time, but I've seen it re-order logs into the correct timeline after a production crash or similar.

I'd like to understand how all of this works.

By default, logstash adds the @timestamp field with the current time (i.e. when it is ingested). The date filter will overwrite this with whatever it parses. @timestamp is always in UTC, so if you are two hours ahead of UTC a date filter will convert "2020 10 07 16:01:01" to 2020-10-07T14:01:01.000Z
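A minimal pipeline that shows both behaviours (the grok pattern and field names are just examples, not the poster's config):

input {
  stdin {}
}

filter {
  # pull the event's own time out of the message into log_time
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:log_time} %{GREEDYDATA:text}$" }
  }

  # without this block, @timestamp keeps the ingest time;
  # with it, @timestamp is overwritten with log_time, converted to UTC
  date {
    match    => [ "log_time", "yyyy-MM-dd HH:mm:ss" ]
    timezone => "Europe/Paris"  # assumed UTC+2 zone, matching the two-hour shift above
  }
}

output {
  stdout { codec => rubydebug }
}

Feeding it the line 2020-10-07 16:01:01 hello should print @timestamp => 2020-10-07T14:01:01.000Z.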

By default, logstash adds the @timestamp field with the current time

What I can't understand is how this value gets set to a time when Logstash was down?

And why does the date parsing change the field's value in Kibana from 2020 10 07 to 2019 12 30?
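As far as I can tell (this is my reading of the pattern syntax, not verified against this exact setup): in both Joda-Time (Logstash) and java.time (Elasticsearch mappings), uppercase YYYY is the week-based year and uppercase DD is the day of year, while the calendar year and day of month are lowercase yyyy and dd. When a week-based year is parsed without any week or day-of-week fields, the date appears to collapse to the first day of that week-year, and the first day of ISO week-year 2020 is Monday 2019-12-30, exactly the value in the "fields" block above. Lowercase letters in both the mapping format and the date filter should avoid it:

date {
  # yyyy = calendar year, dd = day of month
  match => [ "datedate", "yyyy MM dd HH:mm:ss" ]
}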

Sorry if I'm not being very clear about the issue.

First on pattern X: [screenshot]

Then on pattern Y: [screenshot]

Based on the definition of @timestamp, isn't this behaviour weird?
