Problem with date filter to replace timestamp

Hello,

I'm running the latest ELK Docker image, and I send application logs from Windows with Filebeat. Almost everything works :wink:

I have this kind of message:

2019-02-21 10:35:00,085 [3140] INFO  - Début du batch EDNotif 
2019-02-21 10:35:00,993 [3140] INFO  - Récupération des cheptels à traiter... 
2019-02-21 10:35:01,005 [3140] INFO  - Récupération des cheptels à traiter... OK ! (15ms) 
2019-02-21 10:35:01,006 [3140] INFO  - Récupération des événements à traiter par cheptels... 
2019-02-21 10:35:01,006 [3140] INFO  - Récupération des événements à traiter par cheptels... OK ! (0ms) 
2019-02-21 10:35:01,007 [3140] INFO  - Fin du batch EDNotif (Code retour : 0) 
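As an aside, every line above has the same shape: an ISO-like timestamp with comma-separated milliseconds, a PID in brackets, a severity, and a free-text message. A minimal Python sketch of that split (the regex and field names are my own approximation of what a grok pattern has to capture, not Logstash's actual patterns):

```python
import re

# Approximate split of one log line: timestamp, [pid], severity, message.
LINE = re.compile(
    r"^(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})\s+"
    r"\[(?P<pid>\d+)\]\s+"
    r"(?P<severity>\w+)\s+-\s+"
    r"(?P<message>.*)$"
)

sample = "2019-02-21 10:35:00,085 [3140] INFO  - Début du batch EDNotif"
m = LINE.match(sample)
print(m.group("timestamp"))  # 2019-02-21 10:35:00,085
print(m.group("severity"))   # INFO
```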

I'm working with this Logstash config:

input {
  beats {
    port => 5044
    codec => plain {
     charset => "UTF-8"
     }
  }
}

filter {
  if [fields][infra] {

    date {
        match => ["timestamp", "yyyy-MM-dd HH:mm:ss,SSS"]
        target => "@timestamp"
     }


    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601}%{SPACE}%{EXIM_PID:PID} %{WORD:severity} %{GREEDYDATA:message}" }
      overwrite => "message"
    }

  }
}
output {
  if [fields][infra] {


    elasticsearch {
      hosts => ["localhost"]
      manage_template => false
      index => "osmos-%{+YYYY.MM.dd}"
    }
#    file {
#       path => "/etc/logstash/conf.d/osmos.log"
#    }
  }
}

The message is correctly sent to ES, and in Kibana I can see my messages with fields.
BUT I have a problem with the timestamp: my date filter is not applied. Kibana shows me this data:

The timestamp is not correct, so the events are not in the right order. Can anybody help me?

Thank you :slight_smile:

The grok that parses the timestamp has to come before the date filter that is meant to parse it.

The grok filter does not capture the timestamp. Change %{TIMESTAMP_ISO8601} to %{TIMESTAMP_ISO8601:timestamp}. That said, that timestamp does not match TIMESTAMP_ISO8601 so you need a different pattern.

It appears that your input is not UTF-8. Maybe ISO-8859-1?
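The symptom of such a charset mismatch is easy to reproduce: bytes written as ISO-8859-1 but decoded as UTF-8 either fail outright or mangle accented characters like the é in "Récupération". A quick Python sketch:

```python
# "Récupération" encoded as ISO-8859-1 (Latin-1): é becomes the single byte 0xE9.
raw = "Récupération".encode("iso-8859-1")

# Decoding those bytes as UTF-8 fails: 0xE9 starts a multi-byte sequence,
# but the following byte is not a valid continuation byte.
try:
    raw.decode("utf-8")
except UnicodeDecodeError as e:
    print("not UTF-8:", e.reason)

# Decoding with the right charset round-trips cleanly.
print(raw.decode("iso-8859-1"))  # Récupération
```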

Hello @Badger, thank you for your response !

You're right. I have modified the filter like this:

filter {
  if [fields][infra] {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:date_batch}%{SPACE}%{EXIM_PID:PID} %{WORD:severity} %{GREEDYDATA:message}" }
      overwrite => "message"
    }

    date {
      match => ["date_batch", "yyyy-MM-dd HH:mm:ss,SSS"]
#     target => "@timestamp"
      add_field => ["date_OK"]
    }
  }
}

And now it works !! My messages come in the correct order :wink:

For the encoding I have done this:

In Filebeat:

encoding: ISO-8859-1

In the Logstash beats input:

input {
  beats {
    port => 5044
    codec => plain {
     charset => "UTF-8"
     }
  }
}

And everything is fine, thank you !!

Oops, in fact it doesn't work; I'm not able to reproduce a working setup :frowning:

So my filter looks like this now:

filter {
  if [fields][infra] {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:date_batch}%{SPACE}%{EXIM_PID:PID} %{WORD:severity} %{GREEDYDATA:message}" }
      overwrite => "message"
    }

    date {
        match => ["date_batch", "yyyy-MM-dd HH:mm:ss,SSS"]
        target => "@timestamp"
    }
  }
}

You said my timestamp is not ISO8601; is it because the milliseconds are separated by a comma and not a dot?

That filter works for me.

I was mistaken when I said the timestamp does not match ISO8601. It does match.
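Indeed, grok's TIMESTAMP_ISO8601 pattern accepts a dot, comma or colon before the fractional seconds, so the comma is not a problem. A simplified Python approximation of that pattern (field widths reduced; this is my sketch, not the actual grok definition):

```python
import re

# Approximation of grok's TIMESTAMP_ISO8601: the fractional-seconds
# separator may be '.', ',' or ':', so '10:35:00,085' is valid.
TS = re.compile(r"\d{4}-\d{2}-\d{2}[T ]\d{2}:\d{2}:\d{2}(?:[.,:]\d+)?")

assert TS.fullmatch("2019-02-21 10:35:00,085")
assert TS.fullmatch("2019-02-21 10:35:00.085")
print("both separators match")
```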

It's strange: it works, but with a one-hour offset :smiley:

I think I must define a timezone?

It's OK !!

With this :

date {
    match => ["date_batch", "yyyy-MM-dd HH:mm:ss,SSS"]
    timezone => "Europe/Paris"
    target => "@timestamp"
}
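That one-hour shift is exactly what the timezone option fixes: without it, the local Paris time was being treated as UTC. A Python sketch of the same conversion the date filter performs before writing @timestamp, using the stdlib zoneinfo module:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Parse the log timestamp (comma-separated milliseconds, as in the batch logs).
local = datetime.strptime("2019-02-21 10:35:00,085", "%Y-%m-%d %H:%M:%S,%f")

# Interpret it as Europe/Paris (CET in February, UTC+1), then convert to UTC.
paris = local.replace(tzinfo=ZoneInfo("Europe/Paris"))
utc = paris.astimezone(ZoneInfo("UTC"))
print(utc.isoformat())  # 2019-02-21T09:35:00.085000+00:00
```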

Everything is fine :smiley:

Thank you very much !!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.