Hi all,
I have two types of timestamps coming into my Logstash input from different logfiles:
[6/13/18 8:11:25:022 CEST]
2018-04-17T15:19:20.313
My grok filters below work for both:
if [fields][log_type] == "p8_server_error" {
  grok {
    match => [ "message",
      "%{TIMESTAMP_ISO8601:logdate} %{DATA:thread} %{DATA:sub} [ ]* %{DATA:category} \- %{LOGLEVEL:sev} %{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  mutate {
    replace => [ "type", "p8_server_error_log" ]
  }
}
if [fields][log_type] == "SystemOut-ICN-JVM" {
  grok {
    match => [ "message",
      "%{DATESTAMP:logdate} %{DATA} %{DATA:thread} %{DATA:source} [ ]* %{DATA:sev} %{DATA:module} %{DATA:log-level} %{DATA} \[ \] %{DATA:java-method} %{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }
  mutate {
    replace => [ "type", "SystemOut-ICN-JVM_log" ]
  }
}
And here's the date filter, which I think is where it's failing:
date {
  match => [ "logdate", "yyyy-MM-dd'T'HH:mm:ss.SSS", "M/dd/yy HH:mm:ss.SSS", "ISO8601" ]
}
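In case it helps with debugging, here's a minimal sketch I can use to exercise just the date parsing, feeding timestamps in on stdin (the stdin/stdout plugins and the mutate copy are only for this test; the field name and pattern list are taken from my config above):

input { stdin { } }
filter {
  # copy the raw line into logdate so the date filter sees the same field as in the real pipeline
  mutate { copy => { "message" => "logdate" } }
  date {
    match => [ "logdate", "yyyy-MM-dd'T'HH:mm:ss.SSS", "M/dd/yy HH:mm:ss.SSS", "ISO8601" ]
  }
}
output { stdout { codec => rubydebug } }

Pasting one timestamp per line then shows whether @timestamp gets set or a _dateparsefailure tag is added.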
The problem is that only one of them, the ISO one (2018-04-17T15:19:20.313), makes it into the ES index. Here's the error I get for the other:
[2018-07-05T07:26:17,226][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bab_4", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x39e1735b>], :response=>{"index"=>{"_index"=>"bab_4", "_type"=>"doc", "_id"=>"NCPnaGQBAzFCu_yrxMDS", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [logdate]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"6/13/18 5:29:50:575\" is malformed at \"/13/18 5:29:50:575\""}}}}}
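For completeness, my elasticsearch output is roughly the following (the host is a placeholder; bab_4 is the index from the error above):

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder
    index => "bab_4"
  }
}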
What am I doing wrong here? Can someone please help?