Unable to grok Filebeat's MySQL log output


(Rajya Vardhan Mishra) #1

Hi,

I have been trying to grok Filebeat messages (in JSON format) so that I can build a Kibana dashboard, but I haven't been able to crack this even after spending a couple of days on it!

Step 1:
My Filebeat setup writes messages to a log file. Example message:

{"@timestamp":"2018-01-31T12:50:09.481Z","@metadata":{"beat":"filebeat","type":"doc","version":"6.1.2","topic":"abc"},"offset":1696118,"prospector":{"type":"log"},"grok_key":"mysql.errorlog","beat":{"name":"host1","hostname":"host1","version":"6.1.2"},"message":"2018-01-31T12:50:08.436179Z 31898 [Note] Aborted connection 123 to db: 'product' user: 'user' host: '0.0.0.0' (Got timeout reading communication packets)","source":"/logs/mysql-error.log"}
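To be clear about the structure: the Filebeat event is itself JSON, and the raw MySQL error line sits inside its "message" field, so the event has to be JSON-decoded before the MySQL line can be grokked. A quick Python sketch with an abbreviated copy of the sample event above:

```python
import json

# Abbreviated Filebeat event (only the fields that matter here)
event = """{"@timestamp":"2018-01-31T12:50:09.481Z","message":"2018-01-31T12:50:08.436179Z 31898 [Note] Aborted connection 123 to db: 'product' user: 'user' host: '0.0.0.0' (Got timeout reading communication packets)","source":"/logs/mysql-error.log"}"""

decoded = json.loads(event)
# The inner "message" field is the actual MySQL error line that grok must parse
print(decoded["message"])
```

This is what the json filter in the Logstash config below does: it replaces the event with the decoded fields, leaving the inner MySQL line in "message" for grok.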

I want to grok this message so that I can plot the extracted fields in Kibana. I'm referring to the grok examples from here.

So my logstash.conf looks like:

input {
  file {
    path => "/tmp/slow.log"
    type => "mysql.errorlog"
    start_position => "beginning"
  }
}

filter {
    json {
      source => "message"
    }
    grok {
      match => { "message" => ["%{LOCALDATETIME:[mysql][error][timestamp]} (\[%{DATA:[mysql][error][level]}\] )?%{GREEDYDATA:[mysql][error][message]}",
        "%{TIMESTAMP_ISO8601:[mysql][error][timestamp]} %{NUMBER:[mysql][error][thread_id]} \[%{DATA:[mysql][error][level]}\] %{GREEDYDATA:[mysql][error][message1]}",
        "%{GREEDYDATA:[mysql][error][message2]}"] }
      pattern_definitions => {
        "LOCALDATETIME" => "[0-9]+ %{TIME}"
      }
      remove_field => "message"
    }
    mutate {
      rename => { "[mysql][error][message1]" => "[mysql][error][message]" }
    }
    mutate {
      rename => { "[mysql][error][message2]" => "[mysql][error][message]" }
    }
    date {
      match => [ "[mysql][error][timestamp]", "ISO8601", "YYMMdd H:m:s" ]
      remove_field => "[mysql][error][time]"
    }
  }

output {
  elasticsearch { hosts => ["localhost"] }
  stdout { codec => rubydebug }
}
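As an aside, the second pattern in the grok list above is the one that should fire for the sample line. Here's a rough Python equivalent (my own approximations of TIMESTAMP_ISO8601, NUMBER, DATA and GREEDYDATA, not the official grok definitions) just to confirm the capture groups line up:

```python
import re

# Rough stand-ins for %{TIMESTAMP_ISO8601}, %{NUMBER}, %{DATA}, %{GREEDYDATA}
# (illustrative approximations, not Logstash's exact grok patterns)
pattern = re.compile(
    r"(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+Z) "
    r"(?P<thread_id>\d+) "
    r"\[(?P<level>[^\]]*)\] "
    r"(?P<message>.*)"
)

line = ("2018-01-31T12:50:08.436179Z 31898 [Note] Aborted connection 123 "
        "to db: 'product' user: 'user' host: '0.0.0.0' "
        "(Got timeout reading communication packets)")

m = pattern.match(line)
print(m.group("thread_id"), m.group("level"))  # 31898 Note
```

The captures match what shows up under [mysql][error] in the rubydebug output further down, so the grok side of the config seems fine.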

I start logstash as:

/opt/logstash/bin/logstash -f /usr/share/logstash/pipeline/logstash.conf --debug --verbose

When the log message is parsed, nothing is pushed to Elasticsearch.
From the rubydebug output on stdout, grok appears to be working fine and the regexes match successfully, but no output reaches ES:

{
        "offset" => 1696118,
    "prospector" => {
        "type" => "log"
    },
        "source" => "/logs/mysql-error.log",
          "type" => "mysql.errorlog",
          "path" => "/tmp/slow.log",
    "@timestamp" => 2018-01-31T12:50:08.436Z,
      "@version" => "1",
          "host" => "1111",
          "beat" => {
        "hostname" => "host1",
            "name" => "host1",
         "version" => "6.1.2"
    },
         "mysql" => {
        "error" => {
            "thread_id" => "31898",
                "level" => "Note",
              "message" => "Aborted connection 31898 to db: 'product' user: 'user' host: '0.0.0.0' (Got timeout reading communication packets)",
            "timestamp" => "2018-01-31T12:50:08.436179Z"
        }
    },
      "grok_key" => "mysql.errorlog"
}
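One thing the output above does confirm: the date filter worked. @timestamp was moved from the Filebeat read time to the MySQL event time, with the microseconds truncated to milliseconds (436179 becomes .436). A quick sanity check of that truncation in Python:

```python
from datetime import datetime

ts = "2018-01-31T12:50:08.436179Z"
parsed = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ")

# Logstash/Elasticsearch store timestamps at millisecond precision,
# so the microsecond component is truncated
millis = parsed.microsecond // 1000
print(f"{parsed:%Y-%m-%dT%H:%M:%S}.{millis:03d}Z")  # 2018-01-31T12:50:08.436Z
```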

Weirdly, Logstash doesn't print any debug/error messages either.

I have tried many variations of this config, and nothing seems to work.
If I remove the grok filter and keep just the filter below, then messages do reach ES:

filter {
    json {
      source => "message"
    }
}

Please help.


(Magnus Bäck) #2

If you're not getting any log messages at all from Logstash, that's something you should address first.


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.