Elasticsearch stops indexing with custom timestamp

Hi All,
I'm processing two different log types with Elasticsearch (each log type is stored in a different index).
I have a problem processing the timestamp field with only one of the log types.
I would like to build @timestamp from the date and time fields, in order to have the event date, and not the processing date, as the reference.

Type 1 works great with the following configuration, and everything is OK:

******* LOG EXAMPLE *******
Date,Time,...
"29-07-2016","10:19:04",...


mutate {
  add_field => {
    "datetime" => "%{Date} %{Time}"
  }
}
date {
  locale => "en"
  match => ["datetime", "dd-MM-yyyy HH:mm:ss"]
  timezone => "UTC"
  target => "@timestamp"
}
mutate {
  remove_field => ["datetime"]
}

When the second type is enabled, Elasticsearch stops indexing and no more events are shown in the Kibana dashboard (logstash.log doesn't report any error or warning at all).

******* LOG EXAMPLE *******
DateSent,TimeSent,...
2016/07/29,10:27:20.405,...


mutate {
  gsub => ["TimeSent", "\.\d{3}$", ""]
  add_field => {
    "tempts" => "%{DateSent} %{TimeSent}"
  }
}

date {
  locale => "en"
  match => [ "tempts", "yyyy/MM/dd HH:mm:ss" ]
  timezone => "UTC"
  target => "@timestamp"
}
mutate {
  remove_field => ["tempts", "DateSent", "TimeSent"]
}

If I comment out the whole date section, the logs are processed (with the received timestamp).
Could you help me please ?
Thanx in advance

Hi,

I haven't tried it, but why are you doing a gsub? You can simplify your filters:

Type 1

mutate {
  add_field => {
    "datetime" => "%{Date} %{Time}"
  }
}
date {
  locale => "en"
  match => ["datetime", "dd-MM-yyyy HH:mm:ss"]
  timezone => "UTC"
  remove_field => ["datetime"]
}
  • target => "@timestamp": @timestamp is the default value, so you can drop it
  • You can remove fields directly in date

Type 2

mutate {
  add_field => {
    "tempts" => "%{DateSent} %{TimeSent}"
  }
}
date {
  locale => "en"
  match => [ "tempts", "yyyy/MM/dd HH:mm:ss.SSS" ]
  timezone => "UTC"
  remove_field => ["tempts", "DateSent", "TimeSent"]
}
  • Use .SSS for the fraction of a second in the date pattern
  • target => "@timestamp": @timestamp is the default value, so you can drop it
  • You can remove fields directly in date

Have you got some _dateparsefailure in your tags?
Try not removing the fields, to see what they look like.
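
For example, you can temporarily add a stdout output with the rubydebug codec to print every event to the console, tags and original fields included (a minimal sketch, to keep alongside your elasticsearch output while debugging):

output {
  # Temporary debug output: prints each event as a Ruby-style hash, so you
  # can inspect the tags and the raw DateSent/TimeSent values
  stdout { codec => rubydebug }
}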

Hi Rom1, and thanx for your help.
I changed my configuration as suggested, and now both log types are shown in the Kibana dashboard.
I confirm I have many _dateparsefailure tags on the second type.

**TYPE 1 LOG CONFIG**

mutate {
  add_field => {
    "datetime" => "%{Date} %{Time}"
  }
}
date {
  locale => "en"
  match => ["datetime", "dd-MM-yyyy HH:mm:ss"]
  timezone => "UTC"
}

mutate {
  remove_field => ["datetime"]
}


**TYPE 2 LOG CONFIG**

mutate {
  add_field => {
    "tempts" => "%{DateSent} %{TimeSent}"
  }
}

date {
  match => [ "tempts", "yyyy/MM/dd HH:mm:ss.SSS" ]
  timezone => "UTC"
  target => "@timestamp"
}
mutate {
  remove_field => ["tempts", "DateSent", "TimeSent"]
}

Now in the logstash.log file I have tons of warnings like this one:

{:timestamp=>"2016-07-29T14:17:07.012000+0200", :message=>"Failed parsing date from field", :field=>"tempts", :value=>"2016/07/29 14:17:06", :exception=>"Invalid format: \"2016/07/29 14:17:06\" is too short", :config_parsers=>"yyyy/MM/dd HH:mm:ss.SSS", :config_locale=>"default=en_US", :level=>:warn}

In fact, when I look at your log, there is no fraction of a second:

2016/07/29 14:17:06

Are you sure that your input is always in the same format? Or maybe some lines don't match your grok or csv pattern.
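
If some rows have the fraction of a second and others don't, note that the date filter accepts several patterns in match and tries them in order, so you can handle both in one go (a small sketch with your field name):

date {
  locale => "en"
  # Patterns are tried in order: first rows with milliseconds, then rows without
  match => [ "tempts", "yyyy/MM/dd HH:mm:ss.SSS", "yyyy/MM/dd HH:mm:ss" ]
  timezone => "UTC"
}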
If there are two input files, you should separate the filters with a condition testing the type:

input {
   file {
      ...
      type => "log1"   # "log1" for the 1st file, "log2" for the 2nd
   }
}

filter {
   if [type] == "log1" {
      ...
   } else {
      ...
   }
}

I checked the input and every row has the milliseconds. The error in the post above was due to a mistake on my side: I had left the gsub that strips the milliseconds in place before saving. Now I have fixed that error and I still have the same issue: Type 1 logs are shown in Kibana, Type 2 logs are not, and nothing is logged in logstash.log. I confirm I have an if condition for both log types (in the input, output and filter sections).

This is an example of the Type 2 input log:

MSG5,111111,2016/07/29,14:28:45.511,,29,,,,,,,0,,0,0

UPDATE:

As far as I understand, if the date pattern in the config file matches the date format I have in input, then there are no errors in the logs and no events in Kibana.
If I change the date pattern in the config file (i.e. removing .SSS for the milliseconds), I get tons of warnings in the log file, but the events are shown in Kibana with the _dateparsefailure tag applied...
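
Could it be a timezone problem? This is just a guess from the warning above, nothing confirmed: that warning was emitted at 14:17:07 local time (+0200) for an event timestamped 14:17:06, so the logs look like they are written in local time. With timezone => "UTC" the date filter would read 14:17:06 as UTC and set @timestamp about two hours in the future, and Kibana's default time window, which ends at "now", would hide those events even though they were indexed. If so, this is roughly what I'd change (Europe/Rome is only an example; the real zone of the machine writing the logs would go here):

date {
  match => [ "tempts", "yyyy/MM/dd HH:mm:ss.SSS" ]
  # The logs appear to be written in local time (+0200), not UTC, so declare
  # their real zone; the date filter then converts to UTC for @timestamp
  timezone => "Europe/Rome"
}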

Is it possible to share your entire conf file?

Thx

input {
  file {
    path => "/logs/type1.log"
    type => "type1"
    start_position => "beginning"
    codec => multiline {
      pattern => "^\x22[\d]{1,2}-[\d]{1,2}-[\d]{4}\x22,\x22[\d]{1,2}:[\d]{1,2}:[\d]{1,2}\x22,"
      negate => true
      what => previous
    }
  }

  file {
    path => "/logs/type2.log"
    type => "type2"
    start_position => "beginning"
  }
}


filter {
    if [type] == "type1" {
       csv {
           columns => ["Date", "Time", "DailyID", "Src", "Dst", "Reg", "Ident", "Label", "Number", "Payload"]
           separator => ","
       }
       mutate {
         gsub => [
                  "Reg","\.",""
                 ]
         add_field => {
                       "datetime" => "%{Date} %{Time}"
                      }
       }
       translate {
          field => "Label"
          destination => "Label_Description"
          dictionary_path => "/etc/logstash/dataset/labels.yaml"
       }
       translate {
          field => "Number"
          destination => "Number_Type"
          dictionary_path => "/etc/logstash/dataset/numbers.yaml"
       }
       date {
         locale => "en"
         match => ["datetime", "dd-MM-yyyy HH:mm:ss"]
         timezone =>  "UTC"
       }
       mutate {
         remove_field => ["datetime"]
       }
    }


    if [type] == "type2" {
       csv {
           columns => [ "Fix", "Type", "Fix1", "Fix2", "SrcIp", "Fix3", "DateRx", "TimeRx", "DateSent", "TimeSent", "DstIp", "ID", "Fix4", "Fix5", "Lat", "Lon", "Fix6", "Fix7", "Fix8", "Fix9", "Fix10", "Fix00" ]
           separator => ","
           remove_field => [ "Fix1", "Fix2", "Fix3", "Fix4", "Fix5", "Fix6", "Fix7", "Fix8", "Fix9", "Fix10", "Fix00", "Fix", "TimeRx", "DateRx" ]
       }
       translate {
          field => "SrcIp"
          destination => "SrcIp_match"
          dictionary_path => "/etc/logstash/dataset/ip1.yaml"
       }
       translate {
          field => "DstIp"
          destination => "DstIp_match"
          dictionary_path => "/etc/logstash/dataset/ip2.yaml"
       }
       mutate {
          add_field => {
             "tempts" => "%{DateSent} %{TimeSent}"
          }
       }
       date {
          match => [ "tempts", "yyyy/MM/dd HH:mm:ss.SSS" ]
          timezone => "UTC"
          remove_field => [ "tempts" ]
          add_tag => [ "tsmatch" ]
       }
       mutate {
          remove_field => ["DateSent", "TimeSent"]
       }
    }
}

output {
  if [type] == "type1" {
     elasticsearch {
        action => "index"
        hosts => "localhost"
        index => "type1-%{+YYYY.MM.dd}"
        workers => 1
     }
  }

  if [type] == "type2" {
     elasticsearch {
        action => "index"
        hosts => "localhost"
        index => "type2-%{+YYYY.MM.dd}"
        workers => 1
     }
  }
}

Dear all,
unfortunately I'm still stuck with this issue. Everything looks OK in the config file.
I'm using the logstash* template as the field mapping.
Could anyone help me?
Thanx