_dateparsefailure in tags

Hello Team,
I am using Filebeat to ship all logs to Logstash; these logs are Docker container logs.
Here is my filebeat.yml file:

filebeat.inputs:
- type: log
  paths:
    - '/var/lib/docker/containers/ec037f29884667f4e271052c077afc1d9076c9642103efef006cb50c2c34dff9/ec037f29884667f4e271052c077afc1d9076c9642103efef006cb50c2c34dff9-json.log'
  fields:
    type: service-log
  fields_under_root: true 
  multiline.pattern: ^[0-9]{4}-[0-9]{2}-[0-9]{2}
  multiline.negate: true
  multiline.match: after  


filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:
  host: "kibana:5601"

processors:
- add_docker_metadata:
    host: "unix:///var/run/docker.sock"

- decode_json_fields:
    fields: ["message"]
    target: ""
    overwrite_keys: true 
    add_error_key: true

setup.template.enabled: false
setup.template.name: "filebeat"
setup.template.pattern: "filebeat-*"
setup.ilm.enabled: false


output.logstash:
  hosts: ["logstash:5044"] 
  

And here is my logstash.conf file:


input {
  beats {
    port => "5044"
  }
}


filter {
  mutate {
    gsub => [ "message", "r", "" ]
  }
  if [container][name] == "zuul-service"  {
    mutate { remove_field => ["agent","host","container","offset","log","ecs","@version","@timestamp","input"]}
    grok {
         match => [ "message", 
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- \[(?<thread>[A-Za-z0-9-]+)\] [A-Za-z0-9.]*\.(?<class>[A-Za-z0-9#_]+)\s*:\s+(?<logmessage>.*)",
               "message",
               "(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{TIME})  %{LOGLEVEL:level} %{NUMBER:pid} --- .+? :\s+(?<logmessage>.*)"
             ]
      overwrite => [ "message" ]
    }
  }

  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS" ]
    timezone => "UTC"
    add_field => { "Status" => "Matched" }
    remove_field => ["timestamp"]
  }
}

 
output {

  stdout {
    codec => rubydebug
  }

  elasticsearch {
    hosts => ["elasticsearch:9200"]
    manage_template => false
    index => "%{[type]}"
  }
}

This is the output of Logstash, which gives a _dateparsefailure error in tags:

{
         "level" => "INFO",
       "message" => "{\"log\":\"2020-07-10 10:14:15.880  INFO 1 --- [tap-executo-0] c.n.d.s..aws.ConfigClusteResolve      : Resolving eueka endpoints via configuation\\n\",\"steam\":\"stdout\",\"time\":\"2020-07-10T10:14:15.881357215Z\"}",
          "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_dateparsefailure"
    ],
        "thread" => "tap-executo-0",
        "stream" => "stdout",
          "time" => "2020-07-10T10:14:15.881357215Z",
     "timestamp" => "2020-07-10 10:14:15.880",
    "logmessage" => "Resolving eueka endpoints via configuation\\n\",\"steam\":\"stdout\",\"time\":\"2020-07-10T10:14:15.881357215Z\"}",
          "type" => "zuul-log",
           "pid" => "1",
         "class" => "ConfigClusteResolve"
}

I don't know how to fix this error. Please help me keep my logmessage field clear for the user.
Regards,
Govinda

That timestamp has a . between the seconds and milliseconds, while your date filter pattern has a , there.
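As a minimal sketch, assuming the timestamp always uses a . before the milliseconds, the date filter would look something like this (keeping your other settings as they are):

  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    timezone => "UTC"
    add_field => { "Status" => "Matched" }
    remove_field => ["timestamp"]
  }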

Thanks, that helps a lot.
