_grokparsefailure, but it's working


(Jack Judge) #1

I'm getting a weird _grokparsefailure even though everything seems to be working; it's a bit of a puzzler. I'm running Logstash 1.5.0-1.

This is a sample log message:

<B 2015Jul30 10:29:26.859> Workbook references the 'today' function but 'today' calendar position is not in the Workbook. Hence, calculations which use 'today' may contain wrong results.

And this is the relevant snippet from the Logstash config:

grok {
    match => { "message" => "\<%{USERNAME:loglevel} %{GREEDYDATA:logdate}\> %{GREEDYDATA:log_msg}" }
}
date {
    timezone => "PST8PDT"
    match => [ "logdate", "YYYYMMMdd HH:mm:ss.SSS" ]
}
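As a sanity check, the grok pattern can be tried outside Logstash. Here's a rough Python sketch of the match (a hand-made translation, not Logstash's actual grok engine), where %{USERNAME} is expanded to its stock definition [a-zA-Z0-9._-]+ and %{GREEDYDATA} to .*:

```python
import re

# Hand-expanded version of the grok pattern above:
# %{USERNAME} -> [a-zA-Z0-9._-]+ and %{GREEDYDATA} -> .*
# in the stock grok pattern set.
GROK = re.compile(
    r"<(?P<loglevel>[a-zA-Z0-9._-]+) (?P<logdate>.*)> (?P<log_msg>.*)"
)

sample = ("<B 2015Jul30 10:29:26.859> Workbook references the 'today' "
          "function but 'today' calendar position is not in the Workbook.")
m = GROK.match(sample)
print(m.group("loglevel"))  # B
print(m.group("logdate"))   # 2015Jul30 10:29:26.859
```

The pattern matches the sample cleanly, which lines up with the loglevel, logdate, and log_msg fields being populated in the output below.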

And this is the output. The grok and date filters are clearly working, so I'm puzzled about why the event is still tagged with _grokparsefailure.
{
  "_index": "logstash-2015.08.17",
  "_type": "rpas",
  "_id": "AU89BH2s8IpA7GMMxcYe",
  "_score": null,
  "_source": {
    "message": "<D 2015Aug17 11:55:47.297> TcpStream::closeSocket(SSL 2 Server70, 8);",
    "@version": "1",
    "@timestamp": "2015-08-17T18:55:47.297Z",
    "type": "rpas",
    "host": "qa-ip-rpas-02",
    "path": "/u01/app/oracle/mfp/logs/daemon_D201508170000b00.log",
    "tags": [
      "_grokparsefailure"
    ],
    "loglevel": "D",
    "logdate": "2015Aug17 11:55:47.297",
    "log_msg": "TcpStream::closeSocket(SSL 2 Server70, 8);"
  },
  "fields": {
    "@timestamp": [
      1439837747297
    ]
  },
  "sort": [
    1439837747297
  ]
}

I'm outputting to an Elasticsearch machine using the node protocol, which is our preferred choice by a country mile. I'm not specifying any codecs, and I don't get the tag when I output to stdout with the rubydebug codec, so I guess it's something to do with the elasticsearch output.
I have experimented with a couple of output codecs (json and json_lines), but there was no discernible difference.
Does anyone have any ideas? Thanks.


(Magnus Bäck) #2

And this is your only grok filter? No stray files in /etc/logstash/conf.d that you've forgotten about?
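(The point being: Logstash concatenates every file in the config directory into a single pipeline, so a forgotten file with its own grok filter will run against every event and can add the tag even when this filter matches. A quick, hypothetical Python sketch to list grok occurrences in a directory; the /etc/logstash/conf.d path is an assumption about a typical install:)

```python
import glob
import os
import re

def find_grok_filters(conf_dir):
    """Yield (path, line_number, line) for each grok reference in conf_dir.

    Logstash concatenates all files in the config directory into one
    pipeline, so every match here is a filter that runs on every event.
    """
    for path in sorted(glob.glob(os.path.join(conf_dir, "*"))):
        if not os.path.isfile(path):
            continue
        with open(path) as f:
            for lineno, line in enumerate(f, 1):
                if re.search(r"\bgrok\b", line):
                    yield path, lineno, line.strip()

# Path is an assumption; adjust to your install.
for path, lineno, line in find_grok_filters("/etc/logstash/conf.d"):
    print("%s:%d: %s" % (path, lineno, line))
```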


(system) #3