Logstash does not create proper timestamp from field

Hello, I have a rule where I am trying to convert the timestamp from the logfile into @timestamp. However, since Logstash 6.5.0 this no longer works, which results in the data not being imported into Elasticsearch.

The rules look like this:

if [year] {
    mutate {
      add_field => {
        "timestamp_match" => "%{month} %{day} %{year} %{time} %{day_period}"
      }
      remove_field => [ "month", "day", "year", "time", "day_period" ]
    }

    mutate {
      convert => { "timestamp_match" => "string" }
    }

    date {
      match => [ "timestamp_match",
                 "MMM dd YYYY KK:mm:ss aa",
                 "MMM dd YYYY K:mm:ss aa" ]
      timezone => "UTC"
      target => "@timestamp"
    }
}
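For reference, the date filter above parses the composed string using Joda-time patterns; a rough Python equivalent of that parse (a sketch only, not what Logstash runs internally):

```python
from datetime import datetime, timezone

# "MMM dd YYYY" with a 12-hour clock and AM/PM marker, roughly
# equivalent to the Joda pattern used in the date filter above.
raw = "Nov 22 2018 7:16:23 PM"
ts = datetime.strptime(raw, "%b %d %Y %I:%M:%S %p").replace(tzinfo=timezone.utc)
print(ts.isoformat())  # 2018-11-22T19:16:23+00:00
```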
The result looks like this: "@timestamp" => 2018-11-22T19:16:23.000Z

Any idea how to fix this issue?

Does it look like this in Kibana or in raw elastic?

I cannot see any of the data with that timestamp format in Kibana. I assume it is either not loaded into Elasticsearch, or Kibana cannot show it because it has no valid timestamp.

So where are you seeing the result?

stdout and file output.
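To rule Kibana out, it can help to query Elasticsearch directly for the time window in question. A sketch that only builds the query body; the host and index pattern (localhost:9200, logstash-*) are assumptions to adjust for your setup:

```python
import json

# Hypothetical endpoint; adjust host and index pattern to your setup.
ES_URL = "http://localhost:9200/logstash-*/_search"

# Range query over @timestamp to confirm whether documents actually
# landed in Elasticsearch for the window Kibana appears to be missing.
query = {
    "size": 5,
    "sort": [{"@timestamp": "desc"}],
    "query": {
        "range": {
            "@timestamp": {
                "gte": "2018-11-22T00:00:00Z",
                "lte": "2018-11-23T23:59:59Z",
            }
        }
    },
}

print(json.dumps(query, indent=2))
# Send it with e.g.:
#   curl -H 'Content-Type: application/json' -d "$(python build_query.py)" "$ES_URL"
```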

Can you output the timestamp_match field also and see how that looks?

Nov 23 05:45:04 vps188864 logstash[28555]:     "timestamp_match" => "Nov 23 2018 5:45:03 AM",
Nov 23 05:45:04 vps188864 logstash[28555]:               "input" => {
Nov 23 05:45:04 vps188864 logstash[28555]:         "type" => "log"
Nov 23 05:45:04 vps188864 logstash[28555]:     },

Sorry one more question, what do you want it to look like?

I don't care, as long as it is a valid timestamp so that the data gets loaded into Elasticsearch and shows up in Kibana.

Can you try the following:

date {
  match => [ "timestamp_match",
             "MMM dd YYYY hh:mm:ss aa",
             "MMM dd YYYY h:mm:ss aa" ]
  timezone => "UTC"
  target => "@timestamp"
}
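The change here is `h`/`hh` (clock-hour, 1-12) in place of `K`/`KK` (hour-of-halfday, 0-11). For an input like "5:45:03 AM" both parse identically; they only diverge around midnight and noon. Python's `%I` follows the 1-12 clock-hour convention, so a sketch of that edge case:

```python
from datetime import datetime

# Joda 'h' is clock-hour 1-12 (so "12:05 AM" parses to hour 0);
# Joda 'K' is hour-of-halfday 0-11 (the same instant is "0:05 AM").
# Python's %I matches the 1-12 clock-hour convention.
dt = datetime.strptime("Nov 23 2018 12:05:00 AM", "%b %d %Y %I:%M:%S %p")
print(dt.hour, dt.minute)  # 0 5
```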

Also, your filter is currently working, so I don't understand why you are unhappy with the converted timestamp.

It is not that I don't like it; it just prevents the data from loading into Elasticsearch.

I don't understand how; it is a correct and valid timestamp as far as I can see. It's the same as my timestamp. What errors do you get?

Not working. The timestamp format is the same, and I cannot see the data loaded in Kibana:

Nov 23 06:08:58 vps188864 logstash[6968]: "@timestamp" => 2018-11-23T06:08:50.000Z,

That is the trick: no errors. The data just does not show up in Kibana, unless I keep the timestamp generated by Filebeat at import time, which has a few seconds of delay.

Just for information, this shows the amount of missing data that is not loaded into Kibana: http://prntscr.com/llzmq3

Can you try changing the target in the date filter to a new field to see?

That is what I was doing as a workaround.

Interesting, so I wonder if it doesn't like the timestamp field being overwritten.

Could you try a remove_field on the timestamp field directly before your date plugin?

It does not like it.

#<LogStash::Error: timestamp field is missing>, :backtrace=>["org/logstash/ext/JrubyEventExtLibrary.java:177:in `sprintf'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:68:in `event_action_tuple'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `block in multi_receive'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:114:in `multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:97:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:373:in `block in output_batch'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:372:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:324:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:286:in `block in start_workers'"]}
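For context, the backtrace points at `sprintf` inside the elasticsearch output's `event_action_tuple`: the output interpolates the index name from `@timestamp` (assuming the common default of `logstash-%{+YYYY.MM.dd}`), so removing that field crashes the pipeline. A toy Python sketch of that failure mode, illustrative only and not Logstash code:

```python
from datetime import datetime, timezone

def index_name(event: dict, pattern: str = "logstash-%Y.%m.%d") -> str:
    # Mimics the output plugin formatting the index name from @timestamp;
    # a missing field raises, much like the real output's
    # "timestamp field is missing" error.
    ts = event.get("@timestamp")
    if ts is None:
        raise ValueError("timestamp field is missing")
    return ts.strftime(pattern)

event = {"@timestamp": datetime(2018, 11, 23, tzinfo=timezone.utc)}
print(index_name(event))  # logstash-2018.11.23

del event["@timestamp"]
try:
    index_name(event)
except ValueError as e:
    print(e)  # timestamp field is missing
```

This suggests the workaround of writing the parsed date to a new field (as tried above) rather than deleting `@timestamp` outright.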
Nov 23 06:32:31 vps188864 logstash[7882]: [2018-11-23T06:32:31,472][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
Nov 23 06:32:31 vps188864 systemd[1]: logstash.service: Main process exited, code=exited, status=1/FAILURE