Date filter in logstash.conf not parsing the date, although grok matches it with TIMESTAMP_ISO8601

I have a date field in a log file that looks like 2015-09-26 03:45:12:051-0400.
I can extract it successfully using the TIMESTAMP_ISO8601 grok pattern.
But the result is a string field, and it is shipped to Elasticsearch as such. I want to use this field as the timestamp for Kibana, so I need to convert it to a timestamp field. To achieve this I am using the date filter in the Logstash config. Unfortunately, despite every combination I have tried, I am unable to get it to work. Can anybody please help me?
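For reference, the grok stage that extracts the field is presumably something along these lines (a sketch; the surrounding pattern will differ with the actual log format):

grok {
  # TIMESTAMP_ISO8601 captures "2015-09-26 03:45:12:051-0400" into time_stamp as a string
  match => [ "message", "%{TIMESTAMP_ISO8601:time_stamp}" ]
}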
The date filter I use is:
date {
  match => [
    "time_stamp",
    "MMM d HH:mm:ss",
    "MMM dd HH:mm:ss",
    "yyyy-MM-dd HH:mm:ss,SSS",
    "yyyy-MM-dd HH:mm:ss:SSS-zzzz",
    "ISO8601"
  ]
  target => "time_stamp"
}

  • yyyy-MM-dd HH:mm:ss,SSS: Won't work because of the timezone offset in the input.
  • yyyy-MM-dd HH:mm:ss:SSS-zzzz: Won't work because "Z" (not "z") is the correct token for timezone offsets like -0400.
  • ISO8601: Won't work because you have a non-standard colon separating seconds from milliseconds.

Try yyyy-MM-dd HH:mm:ss:SSSZ.
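Putting that together, the filter might look like this (a sketch, assuming the field is still named time_stamp):

date {
  # Z matches offsets like -0400; the colon before SSS is taken literally
  match => [ "time_stamp", "yyyy-MM-dd HH:mm:ss:SSSZ" ]
  target => "time_stamp"
}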


Yes, I got it parsed. Your solution worked, thanks a lot!
But I am still getting time_stamp as a string field.
Could you please tell me why?
And how shall I convert it to a timestamp?

Sorry, it started working without any other changes.

I have the same problem, just with a different string; maybe you can help me out on this one too.
My timestamp string gets parsed by Logstash in JSON format:
{"timestamp":"2015-10-02T10:46:52.235010"}

In the logstash config:
date {
  match => [ "timestamp", "ISO8601" ]
}
but the line just disappears and never makes it through Logstash. The timestamp looks like valid ISO 8601 to me; do you see where it goes wrong?


Use the format yyyy-MM-dd HH:mm:ss.SSSSSS in logstash.conf and see if it works.
Delete the index once before trying this.
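A sketch of what that suggestion looks like as a filter. Note, as an aside, that the sample value 2015-10-02T10:46:52.235010 has a literal T between the date and time, which this pattern does not account for:

date {
  # pattern as suggested; the T separator may require yyyy-MM-dd'T'HH:mm:ss.SSSSSS instead
  match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSSSSS" ]
}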

It shows up in Elasticsearch/Kibana now, but it gets tagged with _dateparsefailure.

Edit: that is with the parsing suggestion from Bubu.

but the line just disappears and never makes it through Logstash.

What do you mean by this? Can you give us a minimal reproducible example?

The logstash-forwarder reports that it is sending 1 new entry when I add the line, but nothing ever shows up in my Kibana, not even with a _dateparsefailure tag. I am also unable to find anything about the lost line in logstash.log or logstash.err.

This is the minimal setup I tried that reproduces the issue for me.

logstash.conf

input {
  lumberjack {
    port => 5000
    codec => "json"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}
filter {
  if [type] == "SuricataIDPS" {
    date {
      match => [ "timestamp", "ISO8601" ]
    }
  }
}

logstash-forwarder.conf

  "files": [
    {
      "paths": [
        "/var/log/logstashfake.log"
      ],
      "fields": { "type": "test" }
    }
  ]

logstashfake.log contains the following lines:

{"field1":"This is not a UFO"}
{"timestamp":"2015-10-02T10:46:52.235010"}
{"timestamp":"2015-10-02T10:46:52.235010Z"}
{"timestamp":"2015-10-02T10:46:52.235010+02:00"}

The first line is sent as a test to check that the whole setup is working; it shows up in Kibana as expected.
The second line does not appear in Kibana at all, and no error appears in the Logstash logs. This is also the timestamp format that Suricata emits, and it cannot be changed on the Suricata side.
I added the third and fourth lines to test whether it was the date format that caused the issue, which turned out to be true: both lines get added to Kibana.

I suggest you simplify things even more. Use a local file (or stdin) as input instead of lumberjack and use a stdout output (with codec => rubydebug) instead of elasticsearch. I'd also consider cranking up the loglevel by starting Logstash with --verbose. You could go as far as --debug but it might make it hard to see the forest for all the trees.
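A minimal pipeline along those lines might look like this (a sketch, keeping the same date filter and reading JSON lines from stdin):

input {
  stdin {
    codec => "json"
  }
}
filter {
  date {
    match => [ "timestamp", "ISO8601" ]
  }
}
output {
  # prints each event with all fields, so you can see what the date filter produced
  stdout {
    codec => rubydebug
  }
}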

What timezone are you in? I wonder if the second line with no timezone gets a really weird timestamp by the date filter and ends up outside the interval that you're looking in.
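If the missing offset does turn out to be the culprit, the date filter's timezone option controls how zoneless timestamps are interpreted; a sketch, assuming CET/CEST:

date {
  match => [ "timestamp", "ISO8601" ]
  # applies only when the value itself carries no offset
  timezone => "Europe/Paris"
}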

I will try tomorrow with a local file and a different output. As for logging, I have already started it with --verbose, but I will check it again, or with --debug if it doesn't show anything.

I'm in the Central European Time zone.

I didn't realize tomorrow would be after the weekend, but here it comes:
It works as expected with the file input and file output, but still doesn't work via Elasticsearch. Nothing is logged with --verbose, but this is the --debug output that belongs to the parsing:

Local file input with file output

{:timestamp=>"2015-10-05T03:32:22.964000-0400", :message=>"Received line", :path=>"/var/log/logstashfake.log", :text=>"{\"timestamp\":\"2015-10-02T10:46:52.235010\"}", :level=>:debug, :file=>"logstash/inputs/file.rb", :line=>"142", :method=>"run"}
{:timestamp=>"2015-10-05T03:32:22.969000-0400", :message=>"writing sincedb (delta since last write = 1444030342)", :level=>:debug, :file=>"filewatch/tail.rb", :line=>"195", :method=>"_read_file"}
{:timestamp=>"2015-10-05T03:32:22.974000-0400", :message=>"filter received", :event=>{"timestamp"=>"2015-10-02T10:46:52.235010", "@version"=>"1", "@timestamp"=>"2015-10-05T07:32:22.965Z", "host"=>"localhost.localdomain", "path"=>"/var/log/logstashfake.log", "type"=>"SuricataIDPS"}, :level=>:debug, :file=>"(eval)", :line=>"398", :method=>"filter_func"}
{:timestamp=>"2015-10-05T03:32:22.977000-0400", :message=>"Date filter: received event", :type=>"SuricataIDPS", :level=>:debug, :file=>"logstash/filters/date.rb", :line=>"206", :method=>"filter"}
{:timestamp=>"2015-10-05T03:32:22.977000-0400", :message=>"Date filter looking for field", :type=>"SuricataIDPS", :field=>"timestamp", :level=>:debug, :file=>"logstash/filters/date.rb", :line=>"209", :method=>"filter"}
{:timestamp=>"2015-10-02T14:46:52.235Z", :message=>"Date parsing done", :value=>"2015-10-02T10:46:52.235010", :level=>:debug, :file=>"logstash/filters/date.rb", :line=>"239", :method=>"filter"}
{:timestamp=>"2015-10-05T03:32:22.983000-0400", :message=>"output received", :event=>{"timestamp"=>"2015-10-02T10:46:52.235010", "@version"=>"1", "@timestamp"=>"2015-10-02T14:46:52.235Z", "host"=>"localhost.localdomain", "path"=>"/var/log/logstashfake.log", "type"=>"SuricataIDPS"}, :level=>:debug, :file=>"(eval)", :line=>"409", :method=>"output_func"}
{:timestamp=>"2015-10-05T03:32:23.025000-0400", :message=>"File, writing event to file.", :filename=>"/opt/logs/testlog.log", :level=>:debug, :file=>"logstash/outputs/file.rb", :line=>"133", :method=>"write_event"}
{:timestamp=>"2015-10-05T03:32:23.025000-0400", :message=>"Opening file", :path=>"/opt/logs/testlog.log", :level=>:info, :file=>"logstash/outputs/file.rb", :line=>"213", :method=>"open"}
{:timestamp=>"2015-10-05T03:32:23.026000-0400", :message=>"Starting flush cycle", :level=>:debug, :file=>"logstash/outputs/file.rb", :line=>"183", :method=>"flush_pending_files"}

Output in the local file:

{"timestamp":"2015-10-02T10:46:52.235010","@version":"1","@timestamp":"2015-10-02T14:46:52.235Z","host":"localhost.localdomain","path":"/var/log/logstashfake.log","type":"SuricataIDPS"}

continued, post limit is 5000 characters.

Local file input with Elasticsearch output

{:timestamp=>"2015-10-05T03:36:27.039000-0400", :message=>"/var/log/logstashfake.log: file grew, old size 414, new size 457", :level=>:debug, :file=>"filewatch/watch.rb", :line=>"96", :method=>"each"}
{:timestamp=>"2015-10-05T03:36:27.040000-0400", :message=>"Received line", :path=>"/var/log/logstashfake.log", :text=>"{\"timestamp\":\"2015-10-02T10:46:52.235010\"}", :level=>:debug, :file=>"logstash/inputs/file.rb", :line=>"142", :method=>"run"}
{:timestamp=>"2015-10-05T03:36:27.044000-0400", :message=>"writing sincedb (delta since last write = 1444030587)", :level=>:debug, :file=>"filewatch/tail.rb", :line=>"195", :method=>"_read_file"}
{:timestamp=>"2015-10-05T03:36:27.045000-0400", :message=>"filter received", :event=>{"timestamp"=>"2015-10-02T10:46:52.235010", "@version"=>"1", "@timestamp"=>"2015-10-05T07:36:27.041Z", "host"=>"localhost.localdomain", "path"=>"/var/log/logstashfake.log", "type"=>"SuricataIDPS"}, :level=>:debug, :file=>"(eval)", :line=>"394", :method=>"filter_func"}
{:timestamp=>"2015-10-05T03:36:27.062000-0400", :message=>"Date filter: received event", :type=>"SuricataIDPS", :level=>:debug, :file=>"logstash/filters/date.rb", :line=>"206", :method=>"filter"}
{:timestamp=>"2015-10-05T03:36:27.063000-0400", :message=>"Date filter looking for field", :type=>"SuricataIDPS", :field=>"timestamp", :level=>:debug, :file=>"logstash/filters/date.rb", :line=>"209", :method=>"filter"}
{:timestamp=>"2015-10-02T14:46:52.235Z", :message=>"Date parsing done", :value=>"2015-10-02T10:46:52.235010", :level=>:debug, :file=>"logstash/filters/date.rb", :line=>"239", :method=>"filter"}
{:timestamp=>"2015-10-05T03:36:27.072000-0400", :message=>"output received", :event=>{"timestamp"=>"2015-10-02T10:46:52.235010", "@version"=>"1", "@timestamp"=>"2015-10-02T14:46:52.235Z", "host"=>"localhost.localdomain", "path"=>"/var/log/logstashfake.log", "type"=>"SuricataIDPS"}, :level=>:debug, :file=>"(eval)", :line=>"405", :method=>"output_func"}
{:timestamp=>"2015-10-05T03:36:27.096000-0400", :message=>"Flushing output", :outgoing_count=>1, :time_since_last_flush=>36.269, :outgoing_events=>{nil=>[["index", {:_id=>nil, :_index=>"logstash-2015.10.02", :_type=>"SuricataIDPS", :_routing=>nil}, #<LogStash::Event:0x146acc5 @metadata_accessors=#<LogStash::Util::Accessors:0x165405f @store={"path"=>"/var/log/logstashfake.log", "retry_count"=>0}, @lut={"[path]"=>[{"path"=>"/var/log/logstashfake.log", "retry_count"=>0}, "path"]}>, @cancelled=false, @data={"timestamp"=>"2015-10-02T10:46:52.235010", "@version"=>"1", "@timestamp"=>"2015-10-02T14:46:52.235Z", "host"=>"localhost.localdomain", "path"=>"/var/log/logstashfake.log", "type"=>"SuricataIDPS"}, @metadata={"path"=>"/var/log/logstashfake.log", "retry_count"=>0}, @accessors=#<LogStash::Util::Accessors:0x55de3673 @store={"timestamp"=>"2015-10-02T10:46:52.235010", "@version"=>"1", "@timestamp"=>"2015-10-02T14:46:52.235Z", "host"=>"localhost.localdomain", "path"=>"/var/log/logstashfake.log", "type"=>"SuricataIDPS"}, @lut={"host"=>[{"timestamp"=>"2015-10-02T10:46:52.235010", "@version"=>"1", "@timestamp"=>"2015-10-02T14:46:52.235Z", "host"=>"localhost.localdomain", "path"=>"/var/log/logstashfake.log", "type"=>"SuricataIDPS"},... cut off, too many characters

It seems like the date is processed properly, but somehow it won't go through the Elasticsearch output. It's strange, though, that after adding a Z it does get sent through.

Also, I notice that Logstash keeps logging in a strange timezone: it is currently 9:45 am here and Logstash still reports 3:36 am, while my system's date reports the correct time.

Output in the local file:

{"timestamp":"2015-10-02T10:46:52.235010","@version":"1","@timestamp":"2015-10-02T14:46:52.235Z","host":"localhost.localdomain","path":"/var/log/logstashfake.log","type":"SuricataIDPS"}

Just like the quote below, this indicates that the Logstash machine has a UTC-4 timezone setting. Given that configuration it's doing the right thing: 10:46 UTC-4 is indeed 14:46 UTC.

Also, I notice that Logstash keeps logging in a strange timezone: it is currently 9:45 am here and Logstash still reports 3:36 am, while my system's date reports the correct time.

09:45 UTC+2 is just 9 minutes later than 03:36 UTC-4. The Logstash logs report time in UTC-4 so I wonder if you actually have the timezone setting you expect on the Logstash box.

The UTC-4 logging is not consistent with what you're saying about being in the CET timezone (currently UTC+2).

My system's timezone has been changed with the following command:

# cp /usr/share/zoneinfo/Europe/Paris /etc/localtime
# date
Mon Oct  5 10:31:09 CEST 2015

This did the trick for Kibana, which now shows the proper timezone, but Logstash still reports the same as before.

Have you restarted Logstash?

Yes, Logstash has been restarted multiple times already to reload the config for above tests and other things.

@sarthak2016, please don't post new problems in old and unrelated threads. You've already posted your question once elsewhere.

My log has this format: 11-30-2016 12:13:57 INFO

How do I specify the timestamp format?
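Assuming the timestamp has been extracted into a field (here hypothetically named log_timestamp), a pattern along these lines should match 11-30-2016 12:13:57:

date {
  # MM-dd-yyyy HH:mm:ss matches "11-30-2016 12:13:57"; the field name is hypothetical
  match => [ "log_timestamp", "MM-dd-yyyy HH:mm:ss" ]
}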