Date format match

Hi there

I have read and seen a lot of examples but I can't solve it. I have my filter set up this way:

    filter {
        grok {
            match => { "message" => "[%{TIMESTAMP_ISO8601:date_log}] | [%{WORD:ENV}] | [%{WORD:APPLICATION}] | [%{WORD:TYPE}] | [%{GREEDYDATA:LOGMS}]" }
        }
    }

My log has this format:

    [2018-06-08 11:20:23] | [TEST] | [WSRESTALG] | [CRITICAL] | [this is an example log message ]

The error in Logstash is:

    Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"docker-2018.06.08", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x93ca06a>], :response=>{"index"=>{"_index"=>"docker-2018.06.08", "_type"=>"doc", "_id"=>"4cRE4GMBWr8dcirVLSft", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse ", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2018-06-08 11:20:23\" is malformed at \" 11:20:23\""}}}}}

If I try with an example date such as [2016-09-19T18:19:00] the filter works fine. The problem is that the date field in my logs is in this format:

    2018-06-08 11:20:23

I have not found a pattern to match that. I have tried with

    %{DATESTAMP:date_log}

but it doesn't work either. Any help would be appreciated.

Thanks in advance.

I do not know why you are getting that error, but what I would do is this. Note that you need to escape the | characters.

    grok {
        match => { "message" =>"\[%{TIMESTAMP_ISO8601:date_log}\] \| \[%{WORD:ENV}\] \| \[%{WORD:APPLICATION}\] \| \[%{WORD:TYPE}\] \| \[%{GREEDYDATA:LOGMS}\]" }
    }
    date {
        match => [ "date_log", "yyyy-MM-dd HH:mm:ss" ]
        timezone => "Europe/Moscow"
        remove_field => "date_log"
    }
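
For what it's worth, TIMESTAMP_ISO8601 accepts both the space and the T separator, so the grok pattern itself should match either of your examples. With the date filter above, your sample line should come out roughly like this (a trimmed rubydebug-style sketch, not actual output; Europe/Moscow is UTC+3, so adjust the timezone option to wherever your logs are actually written):

    {
         "@timestamp" => 2018-06-08T08:20:23.000Z,
                "ENV" => "TEST",
        "APPLICATION" => "WSRESTALG",
               "TYPE" => "CRITICAL",
              "LOGMS" => "this is an example log message "
    }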

The error comes from Elasticsearch and is related to the mapping of a field (your log entry appears to be garbled after "failed to parse").

Just use a stdout { codec => rubydebug } output while you're developing your filter. Once that looks okay you can try to index it in ES.
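
That is, something along these lines while you are developing (swap the elasticsearch output back in once the events look right):

    output {
        stdout { codec => rubydebug }
    }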

Thanks for answering. I still get this error. This is my conf:

    input {
        beats {
            # The port to listen on for filebeat connections.
            port => 5044
            # The IP address to listen for filebeat connections.
            host => "0.0.0.0"
        }
    }

    filter {
        grok {
            match => { "message" => "[%{TIMESTAMP_ISO8601:date_log}] | [%{WORD:ENV}] | [%{WORD:APPLICATION}] | [%{WORD:TYPE}] | [%{GREEDYDATA:LOGMS}]" }
        }
    }

    output {
        elasticsearch {
            hosts => ["11.224.212:9200"]
            manage_template => false
            index => "docker-%{+YYYY.MM.dd}"
            document_type => "%{[@metadata][type]}"
        }
    }

    output {
        file {
            path => "/tmp/file.txt"
            codec => line { format => "custom format: %{message}" }
        }
    }

This is the output of the debug:

    custom format: [2018-06-08 18:19:00] | [DEV] | [WSRESTALG] | [INFO] | [No active profile set, falling back to default profiles: default]
    custom format: [2018-06-08 18:19:00] | [DEV] | [WSRESTALG] | [INFO] | [No active profile set, falling back to default profiles: default]
    custom format: [2018-06-08T18:19:00] | [DEV] | [WSRESTALG] | [INFO] | [No active profile set, falling back to default profiles: default]

The last log works, the previous ones don't. The date field alone is the problem:

    [2018-06-11T10:09:36,671][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"docker-2018.06.11", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x1a1a2461>], :response=>{"index"=>{"_index"=>"docker-2018.06.11", "_type"=>"doc", "_id"=>"YcT37mMBWr8dcirVVUUY", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [date_log]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2018-06-08 18:19:00\" is malformed at \" 18:19:00\""}}}}}

In http://grokconstructor.appspot.com/do/match#result the filter works:

(screenshot of the grokconstructor match result)

This is the output of the debug:

(screenshot of the debug output)

No, that's not what I asked for.

Your problem has nothing to do with your grok filter. It appears to be working fine. The problem is that the Elasticsearch mapping of your date_log field doesn't match the contents of the field. I suggest you use a date filter to parse date_log into @timestamp and then remove date_log. @Badger has already shown you how to do that.
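
For completeness, here is a minimal filter section that puts both pieces together: the escaped grok pattern plus a date filter that also accepts the T-separated variant that showed up in your debug output (again, adjust the timezone to wherever your logs are produced):

    filter {
        grok {
            match => { "message" => "\[%{TIMESTAMP_ISO8601:date_log}\] \| \[%{WORD:ENV}\] \| \[%{WORD:APPLICATION}\] \| \[%{WORD:TYPE}\] \| \[%{GREEDYDATA:LOGMS}\]" }
        }
        date {
            # Both patterns are tried in order; the quoted 'T' is a literal.
            match => [ "date_log", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd'T'HH:mm:ss" ]
            timezone => "Europe/Moscow"
            remove_field => [ "date_log" ]
        }
    }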

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.