Hi there
I've read a lot of examples but I still can't solve this. My filter looks like this:
filter {
  grok {
    match => { "message" => "[%{TIMESTAMP_ISO8601:date_log}] | [%{WORD:ENV}] | [%{WORD:APPLICATION}] | [%{WORD:TYPE}] | [%{GREEDYDATA:LOGMS}]" }
  }
}
My log lines have this format:
[2018-06-08 11:20:23] | [TEST] | [WSRESTALG] | [CRITICAL] | [this is an example log message ]
The error in Logstash is:
Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"docker-2018.06.08", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x93ca06a], :response=>{"index"=>{"_index"=>"docker-2018.06.08", "_type"=>"doc", "_id"=>"4cRE4GMBWr8dcirVLSft", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse ", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: "2018-06-08 11:20:23" is malformed at " 11:20:23""}}}}}
If I try with an example date like [2016-09-19T18:19:00], the filter works fine. The problem is that the date field in my logs is in this format:
2018-06-08 11:20:23
I have not found a pattern that matches it. I also tried
%{DATESTAMP:date_log}
but that doesn't work either. Any help would be appreciated.
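From the mapper_parsing_exception, my guess is that grok actually extracts date_log fine, but the Elasticsearch index mapping expects an ISO8601 date with a "T" separator, so "2018-06-08 11:20:23" fails at index time. This is just a sketch of what I was thinking of trying: escaping the literal brackets and pipes in the grok pattern, then adding a date filter to parse my format into @timestamp before indexing (the yyyy-MM-dd HH:mm:ss pattern is my assumption from the log sample):

```
filter {
  grok {
    # \[ \] and \| escape the literal brackets/pipes so they are not
    # treated as regex metacharacters
    match => { "message" => "\[%{TIMESTAMP_ISO8601:date_log}\] \| \[%{WORD:ENV}\] \| \[%{WORD:APPLICATION}\] \| \[%{WORD:TYPE}\] \| \[%{GREEDYDATA:LOGMS}\]" }
  }
  date {
    # parse "2018-06-08 11:20:23" (space instead of "T") into @timestamp
    match  => ["date_log", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}
```

I'm not sure whether this is the right approach or whether I should instead change the date format in the index mapping.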
Thanks in advance.