Logstash grok _grokparsefailure tag

Hi. I'm using syslog-ng and loggen to generate my logs; here is an example line:

<38>2020-04-01T23:30:02 localhost prg00000[1234]: seq: 0000000096, thread: 0000, runid: 1585767601, stamp: 2020-04-01T23:30:02 PADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADDPADD

and this is my logstash configuration file:

input {
  tcp {
    port => 9000
  }
  udp {
    port => 9000
  }
}

filter {
  grok {
    match => { "message" => "%{GREEDYDATA:nonsense}: {NUMBER:seq}, %{NUMBER:thread}, %{NUMBER:runid}, %{TIMESTAMP_ISO8601:stamp} %{GREEDYDATA:message}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

Although I can see the logs in Kibana, all of them have the tag "_grokparsefailure". Can someone please help me?

That is not going to match

0000000096, thread: 0000, runid: 1585767601,

You could try

%{NUMBER:seq}, thread: %{NUMBER:thread}, runid: %{NUMBER:runid},

Also, get rid of the leading '%{GREEDYDATA:nonsense}'; it makes the pattern much more expensive when a line does not match. Starting the pattern with the literal text that follows it limits the number of places in the log line where grok has to start trying to match.
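Putting both suggestions together, a corrected filter could look something like this (just a sketch; seq, thread, runid, and stamp follow your original field names, while "padding" is only an illustrative name for the trailing PADD filler):

filter {
  grok {
    # "padding" is an illustrative field name for the trailing filler text
    match => { "message" => "seq: %{NUMBER:seq}, thread: %{NUMBER:thread}, runid: %{NUMBER:runid}, stamp: %{TIMESTAMP_ISO8601:stamp} %{GREEDYDATA:padding}" }
  }
}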

I would actually dissect that instead of using grok

dissect {
  mapping => {
    "message" => "<%{pri}>%{ts} %{host} %{program}[%{pid}]: seq: %{seq}, thread: %{thread}, runid: %{runid}, stamp: %{stamp} %{restOfLine}"
  }
}

If you do not want some fields you can replace %{pri} with %{} etc. -- it will still consume the text between the delimiters but not store it as a field.
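For example, if you only wanted the sequence number and the timestamp, something like this would still parse the whole line but discard everything else (a sketch based on the same mapping; which fields to keep is just an illustrative choice):

dissect {
  mapping => {
    # %{} consumes the text between delimiters without storing it; only seq and stamp are kept
    "message" => "<%{}>%{} %{} %{}[%{}]: seq: %{seq}, thread: %{}, runid: %{}, stamp: %{stamp} %{}"
  }
}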


Thanks, it worked perfectly and now I got exactly what I wanted :partying_face:
