Issue while parsing date

I created the following Logstash pipeline file, but I was not able to get the date filter working, so it is commented out below.

input { 
  file {
    path => "/Users/viaggarw/Documents/ELK/logstash-6.2.4/bplogs.log.bak"
    start_position => "beginning"
  }
  #stdin {
  #}
}
filter {
  grok {
    match => {"message" => ["%{TIMESTAMP_ISO8601:logdate} %{HOSTNAME:hostname} %{WORD:container_name}: %{GREEDYDATA:[@metadata][messageline]}",
    "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:container}\[%{INT:haproxy_id}\]: %{GREEDYDATA:[@metadata][messageline]}"]}
  }
  if "_grokparsefailure" in [tags] {
    drop {}
  }
  mutate {
    remove_field => ["message", "@timestamp"]
  }
  json {
    source => "[@metadata][messageline]"
  }
  if "_jsonparsefailure" in [tags] {
    drop {}
  }
  #date {
  #  match => ["logdate", "yyyy-mm-dd HH:mm:ss.SZ"]
  #}
}
output {
  #elasticsearch {
    #hosts => ["127.0.0.1:9200"]
    #index => "logs-%{+yyyy-MM-dd}"
  #  document_type => "applicationlogs"
  #}
  stdout {
    codec => rubydebug
  }
}

My input log file contains entries with dates in this format:

2018-06-08T06:30:20.675790+00:00 vikrant-1810 raopenstack: {"msg": "Target service = neutron, Path tail = /v2.0/networks", "namespace": "raopenstack.rest_auth", "priority": 6, "pid": 1, "tid": 140335537149696, "code_file": "/bp2/src/raopenstack/rest_auth.py", "code_line": 310, "code_func": "sign", "timestamp": "2018-06-08T06:30:20.675650Z", "app": "raopenstack", "app_instance": "0", "container": "af70ae7a3c72"}

Can anyone please help me to understand why date filter is not working?

If your date is 2018-06-08T06:30:20.675790+00:00 then it simply does not match "yyyy-mm-dd HH:mm:ss.SZ". It has a T in the middle instead of a space, and it has six digits of sub-second precision. (Note also that mm means minute-of-hour in these patterns; month is MM.) Use

"yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ"
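The mismatch is easy to reproduce outside Logstash. This is a rough Python equivalent (the date filter actually uses Joda-Time patterns, not strptime): %f stands in for the six sub-second digits and %z for the +00:00 offset.

```python
from datetime import datetime

ts = "2018-06-08T06:30:20.675790+00:00"

# The space-separated pattern from the original config fails outright,
# because the literal space never matches the 'T' in the input:
try:
    datetime.strptime(ts, "%Y-%m-%d %H:%M:%S.%f%z")
except ValueError:
    print("space-separated pattern does not match")

# The ISO8601-style pattern with a literal 'T' parses cleanly:
dt = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")
print(dt.isoformat())
```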

I used exactly what you mentioned, but no luck.

input { 
  #file {
  #  path => "/Users/viaggarw/Documents/ELK/logstash-6.2.4/bplogs.log.bak"
  #  start_position => "beginning"
  #}
  stdin {
  }
}
filter {
  grok {
    match => {"message" => ["%{TIMESTAMP_ISO8601:logdate} %{HOSTNAME:hostname} %{WORD:container_name}: %{GREEDYDATA:[@metadata][messageline]}",
    "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:container}\[%{INT:haproxy_id}\]: %{GREEDYDATA:[@metadata][messageline]}"]}
  }
  if "_grokparsefailure" in [tags] {
    drop {}
  }
  mutate {
    remove_field => ["message", "@timestamp"]
  }
  json {
    source => "[@metadata][messageline]"
  }
  if "_jsonparsefailure" in [tags] {
    drop {}
  }
  date {
    match => ["logdate", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSZ"]
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "logs-%{+yyyy-MM-dd}"
    document_type => "applicationlogs"
  }
}

When I give it this input:

2018-06-08T06:30:17.217783+00:00 vikrant-1810 haproxy_0[823]: {"http_status":"200","msg":"200 GET /solutionmanager/api/v1/containers/mano_18.02.28_0 HTTP/1.1","haproxy_backend":"solutionmanager_read","retries":"0","request_bytes":"228","response_bytes":"5788","dst_response_time":"3","dst_connect_time":"0","session_duration":"3","haproxy_termination_state":"--","src_ip":"172.16.0.23","src_port":"43330","dst_ip":"172.16.0.6","dst_port":"9999","tls_version":"-","tls_ciphers":"-","pid":"823","container":"43aa78c56891","host":"vikrant-1810","app":"haproxy","namespace":"http","app_instance":"0"}

it gives the following error:

[2018-06-09T00:04:29,917][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Error: timestamp field is missing>, :backtrace=>["org/logstash/ext/JrubyEventExtLibrary.java:168:in `sprintf'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:46:in `event_action_tuple'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:36:in `block in multi_receive'", "org/jruby/RubyArray.java:2486:in `map'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.1.1-java/lib/logstash/outputs/elasticsearch/common.rb:36:in `multi_receive'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:13:in `multi_receive'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/logstash-core/lib/logstash/output_delegator.rb:49:in `multi_receive'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:477:in `block in output_batch'", "org/jruby/RubyHash.java:1343:in `each'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:476:in `output_batch'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:428:in `worker_loop'", "/Users/viaggarw/Documents/ELK/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:386:in `block in start_workers'"]}
[2018-06-09T00:04:29,981][ERROR][org.logstash.Logstash    ] java.lang.IllegalStateException: org.jruby.exceptions.RaiseException: (SystemExit) exit
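For context on where that FATAL comes from: the elasticsearch output interpolates the %{+...} date pattern in the index option from the event's @timestamp. A hypothetical Python sketch of that step (not Logstash's actual code) shows why an event whose @timestamp was removed and never restored blows up there:

```python
from datetime import datetime, timezone

def index_name(event):
    # Stand-in for event.sprintf('logs-%{+yyyy-MM-dd}'): the date maths
    # need @timestamp, which the mutate filter removed from every event.
    ts = event.get("@timestamp")
    if ts is None:
        raise ValueError("timestamp field is missing")
    return "logs-" + ts.strftime("%Y-%m-%d")

ok = {"@timestamp": datetime(2018, 6, 8, tzinfo=timezone.utc)}
print(index_name(ok))  # logs-2018-06-08

# @timestamp was removed by mutate and the date filter never re-added it:
broken = {"hostname": "vikrant-1810"}
try:
    index_name(broken)
except ValueError as e:
    print(e)  # timestamp field is missing
```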

I do not get an exception for that with 6.2.4, but I do not know which version you built.

It could be because you have timestamp rather than logdate in the second grok pattern. Your mutate filter deletes @timestamp, and because the date filter only matches on logdate, it never adds it back for events caught by the second pattern, so the elasticsearch output has no @timestamp to build the index name from.
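The mismatch is easy to see if you sketch the two patterns as plain regexes (hypothetical simplifications of the grok patterns, with named groups standing in for the captured fields): the haproxy line only matches the second pattern, which stores the date under timestamp, so a date filter looking for logdate has nothing to work with.

```python
import re

line = ('2018-06-08T06:30:17.217783+00:00 vikrant-1810 haproxy_0[823]: '
        '{"http_status":"200"}')

# Simplified stand-ins for the two grok patterns:
pat1 = re.compile(r'(?P<logdate>\S+) (?P<hostname>\S+) '
                  r'(?P<container_name>\w+): (?P<rest>.*)')
pat2 = re.compile(r'(?P<timestamp>\S+) (?P<hostname>\S+) '
                  r'(?P<container>\w+)\[(?P<haproxy_id>\d+)\]: (?P<rest>.*)')

m = pat1.match(line) or pat2.match(line)
fields = m.groupdict()
print("logdate" in fields)    # False: only the first pattern captures logdate
print("timestamp" in fields)  # True: the second pattern matched instead
```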

Thanks, that was the issue. After replacing timestamp with logdate in the second grok pattern, it's working fine.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.