Python log Logstash grok filter


(Brian) #1

I have a filter setup:

filter {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} - %{DATA:module} - %{LOGLEVEL:Level} - %{GREEDYDATA:logmessage}" ]
    }
    date {
        timezone => "Australia/Perth"
        locale => "en"
        match => [ "timestamp", "yyyy-MM-dd HH:mm:ss,SSS"  ]
    }
}
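For reference, the date filter's match pattern is a Joda-time-style format. A rough Python equivalent of what it parses can be sketched with strptime (assuming the comma-millisecond timestamps that Python's logging emits by default):

```python
from datetime import datetime

# The date filter pattern "yyyy-MM-dd HH:mm:ss,SSS" (Joda-style) roughly
# corresponds to this strptime format. %f accepts one to six digits and
# zero-pads on the right, so ",189" parses as 189 milliseconds.
ts = datetime.strptime("2018-02-22 12:16:14,189", "%Y-%m-%d %H:%M:%S,%f")
print(ts.microsecond)  # 189000
```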

an example message is:

2018-02-21T10:22:08.fZ - pipeline.features_extraction.feature - INFO - array of shape: (4000,) generated

but I keep getting _grokparsefailure tags. Any ideas what I am doing wrong?


(Magnus Bäck) #2

You have an "f" in your timestamp (the ".fZ" part) where the TIMESTAMP_ISO8601 pattern expects digits for the fractional seconds, so the grok match fails.


(Brian) #3

What does the date {} bit actually do? I got it off Stack Overflow, so I'm not sure if that is causing issues.

I changed my Python log format; it now emits:

2018-02-22 12:16:14,189 - statsd.client.Gauge - INFO - encoder_time: 3541.82878648

which matches fully according to the Grok Constructor web app, but it still raises a _grokparsefailure?


(Magnus Bäck) #4

Yeah, it should work. Do you have any extra files in the config directory, like a backup file with an old grok filter configuration? Otherwise, be systematic: comment out the grok filter completely. Is the error gone? Add the filter back but reduce the expression to just %{TIMESTAMP_ISO8601:timestamp}. Is the error gone? And so on.

Oh, and don't use more than one DATA or GREEDYDATA pattern in the same expression. It's inefficient and can have surprising effects. In this case, use e.g. NOTSPACE instead of DATA.
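Grok patterns compile down to regular expressions (in logstash-patterns-core, DATA is the lazy .*? and NOTSPACE is \S+). A rough Python sketch of the tightened expression, with NOTSPACE standing in for the module field since module names contain no spaces:

```python
import re

line = ("2018-02-22 12:16:14,189 - statsd.client.Gauge - INFO - "
        "encoder_time: 3541.82878648")

# Regex equivalents of the grok patterns: NOTSPACE (\S+) for the module,
# LOGLEVEL approximated as [A-Z]+, GREEDYDATA (.*) for the message.
# Using \S+ instead of the lazy .*? of DATA removes ambiguity and backtracking.
pattern = (r"(?P<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})"
           r" - (?P<module>\S+)"
           r" - (?P<level>[A-Z]+)"
           r" - (?P<logmessage>.*)")

m = re.match(pattern, line)
print(m.group("module"))  # statsd.client.Gauge
```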


(system) #5

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.