Logstash does not create a proper timestamp from field


(Peter) #1

Hello, I have a rule where I am trying to convert the timestamp from the logfile into @timestamp. However, since Logstash 6.5.0 this no longer works, which results in the data not being imported into Elasticsearch.

The rules look like this:

if [year] {
    mutate {
      add_field => {
        "timestamp_match" => "%{month} %{day} %{year} %{time} %{day_period}"
      }
      remove_field => [ "month", "day", "year", "time", "day_period" ]
    }

    mutate {
      convert => { "timestamp_match" => "string" }
    }

    date {
      match => [ "timestamp_match",
                 "MMM dd YYYY KK:mm:ss aa",
                 "MMM dd YYYY K:mm:ss aa" ]
      timezone => "UTC"
      target => "@timestamp"
    }
}

The result looks like this: "@timestamp" => 2018-11-22T19:16:23.000Z

Any idea how to fix this issue?


(Lewis Barclay) #2

Does it look like this in Kibana or in raw elastic?


(Peter) #3

I cannot see any of the data with that timestamp format in Kibana. I assume it is either not loaded into Elasticsearch, or Kibana cannot show it because it has no valid timestamp.


(Lewis Barclay) #4

So where are you seeing the result?


(Peter) #5

In the stdout and file outputs.


(Lewis Barclay) #6

Can you output the timestamp_match field also and see how that looks?


(Peter) #7
Nov 23 05:45:04 vps188864 logstash[28555]:     "timestamp_match" => "Nov 23 2018 5:45:03 AM",
Nov 23 05:45:04 vps188864 logstash[28555]:               "input" => {
Nov 23 05:45:04 vps188864 logstash[28555]:         "type" => "log"
Nov 23 05:45:04 vps188864 logstash[28555]:     },

(Lewis Barclay) #8

Sorry one more question, what do you want it to look like?


(Peter) #9

I don't care, as long as it is a valid timestamp so that the data gets loaded into Elasticsearch and shows up in Kibana.


(Lewis Barclay) #10

Can you try the following:

date {
  match => [ "timestamp_match",
             "MMM dd YYYY hh:mm:ss aa",
             "MMM dd YYYY h:mm:ss aa" ]
  timezone => "UTC"
  target => "@timestamp"
}

(Lewis Barclay) #11

Also, your filter is currently working, so I don't understand why you don't like the converted timestamp.


(Peter) #12

It's not that I don't like it. It just prevents the data from loading into Elasticsearch.


(Lewis Barclay) #13

I don't understand how; it is a correct and valid timestamp as far as I can see. It's the same as my timestamp. What errors do you get?


(Peter) #14

Not working. The timestamp format is the same, and I still cannot see the data in Kibana.

Nov 23 06:08:58 vps188864 logstash[6968]: "@timestamp" => 2018-11-23T06:08:50.000Z,


(Peter) #15

That is the trick: no errors. The data just does not show up in Kibana, unless I keep the timestamp generated by Filebeat at import time. That one has a few seconds of delay.


(Peter) #16

Just for information, this shows the amount of missing data that is not loaded into Kibana: http://prntscr.com/llzmq3


(Lewis Barclay) #17

Can you try changing the target in the date filter to a new field to see?
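Something like this, for example (the target field name here is just a placeholder, pick whatever you like):

date {
  match => [ "timestamp_match",
             "MMM dd YYYY KK:mm:ss aa",
             "MMM dd YYYY K:mm:ss aa" ]
  timezone => "UTC"
  target => "parsed_timestamp"
}

If the new field shows up correctly, the parsing itself is fine and the problem is specific to overwriting @timestamp.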


(Peter) #18

That is what I was doing as a workaround.


(Lewis Barclay) #19

Interesting, so I wonder if it doesn't like the timestamp field being overwritten.

Could you try a "remove field" on the timestamp field directly before your date plugin?
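I mean something like this (just a sketch, using your existing timestamp_match field):

mutate {
  remove_field => [ "@timestamp" ]
}

date {
  match => [ "timestamp_match",
             "MMM dd YYYY KK:mm:ss aa",
             "MMM dd YYYY K:mm:ss aa" ]
  timezone => "UTC"
  target => "@timestamp"
}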


(Peter) #20

It does not like it:

#<LogStash::Error: timestamp field is missing>, :backtrace=>["org/logstash/ext/JrubyEventExtLibrary.java:177:in `sprintf'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:68:in `event_action_tuple'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `block in multi_receive'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.2.1-java/lib/logstash/outputs/elasticsearch/common.rb:38:in `multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:114:in `multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:97:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:373:in `block in output_batch'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:372:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:324:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:286:in `block in start_workers'"]}
Nov 23 06:32:31 vps188864 logstash[7882]: [2018-11-23T06:32:31,472][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
Nov 23 06:32:31 vps188864 systemd[1]: logstash.service: Main process exited, code=exited, status=1/FAILURE
Nov 23 06:32:31 vps188864 systemd