Logstash grok not parsing Multiline pattern properly

Hi,

We are using the standard ELK stack with Filebeat for our logging solution.
We are trying to parse a multiline trace with a grok filter and store it in Elasticsearch.

The logs contain newlines; the format is as follows:


2020-05-29 10:54:57,571 [main-EventThread] INFO  com.cpl.dops.zkwatchdog.MonitoringService - Node Modified: http://stage-zoo-001:8080/node?path=/cpl/game-server/config
Added "/thirdPartyMatchMakingGameIds/11" Value: 1000134
Added "/gameConfigs/1000105" Value: {"name":"BrickBreakerz","matchMakerClass":"com.cpl.gs.sfs2x.core.mm.InOutMatchMaker","cancelBattleIfNotStarted":false,"useBattleBasedUserCount":true}
Added "/gameConfigs/1000134" Value: {"name":"Poker Puzzle","matchMakerClass":"com.cpl.gs.sfs2x.core.mm.InOutMatchMaker","cancelBattleIfNotStarted":false,"useBattleBasedUserCount":true}
Copied "/gameConfigs/1000061" to "/gameConfigs/1000145"

We are using the following multiline pattern in the Filebeat config:


 multiline:
    pattern: '^[0-9]{4}-[0-9]{2}-[0-9]{2}'
    negate: true
    match: after
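
To illustrate what these three settings do (this is just a sketch in Python, not Filebeat's actual implementation): every line that does *not* start with a date (`negate: true`) is appended `after` the previous event, so the whole stack of `Added`/`Copied` lines travels as one message:

```python
import re

# Same pattern as in the Filebeat config above.
starts_new_event = re.compile(r"^[0-9]{4}-[0-9]{2}-[0-9]{2}")

def group_events(lines):
    """Group raw log lines into multiline events (illustrative only)."""
    events = []
    for line in lines:
        if starts_new_event.match(line) or not events:
            events.append(line)            # a leading date starts a new event
        else:
            events[-1] += "\n" + line      # continuation line: append after
    return events

lines = [
    "2020-05-29 10:54:57,571 [main-EventThread] INFO ...",
    'Added "/thirdPartyMatchMakingGameIds/11" Value: 1000134',
    'Copied "/gameConfigs/1000061" to "/gameConfigs/1000145"',
]
# group_events(lines) returns a single 3-line event
```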

In the Kibana Grok Debugger the pattern works as expected.

But when the same grok is used in Logstash, it does not work as expected.
Sample Logstash output is below:

{
                "agent" => {
             "version" => "7.7.0",
                "type" => "filebeat",
                  "id" => "407a7096-5984-4d71-9f37-07589619ad90",
            "hostname" => "curator_cli",
        "ephemeral_id" => "aa60cc74-f02a-4b9a-92ee-7b96136b47f0"
    },
              "message" => "2020-05-29 10:54:57,571 [main-EventThread] INFO  com.mpl.dops.zkwatchdog.MonitoringService - Node Modified: http://prod-zoo-001:8080/node?path=/mpl/game-server/config\nAdded \"/thirdPartyMatchMakingGameIds/11\" Value: 1000134\nAdded \"/gameConfigs/1000105\" Value: {\"name\":\"BrickBreakerz\",\"matchMakerClass\":\"com.mpl.gs.sfs2x.core.mm.InOutMatchMaker\",\"cancelBattleIfNotStarted\":false,\"useBattleBasedUserCount\":true}\nAdded \"/gameConfigs/1000134\" Value: {\"name\":\"Poker Puzzle\",\"matchMakerClass\":\"com.mpl.gs.sfs2x.core.mm.InOutMatchMaker\",\"cancelBattleIfNotStarted\":false,\"useBattleBasedUserCount\":true}\nCopied \"/gameConfigs/1000061\" to \"/gameConfigs/1000145\"",
                 "host" => {
        "name" => "curator_cli"
    },
           "@timestamp" => 2020-05-30T15:17:52.409Z,
                  "log" => {
         "flags" => [
            [0] "multiline"
        ],
        "offset" => 0,
          "file" => {
            "path" => "/var/log/mpl/cur.log"
        }
    },
            "timestamp" => "2020-05-29 10:54:57,571",
           "threadname" => "main-EventThread",
                 "tags" => [
        [0] "zk-grokparse-success"
    ],
         "service-name" => "/mpl/game-server/config\nAdded \"/thirdPartyMatchMakingGameIds/11\" Value: 1000134\nAdded \"/gameConfigs/1000105\" Value: {\"name\":\"BrickBreakerz\",\"matchMakerClass\":\"com.mpl.gs.sfs2x.core.mm.InOutMatchMaker\",\"cancelBattleIfNotStarted\":false,\"useBattleBasedUserCount\":true}\nAdded \"/gameConfigs/1000134\" Value: {\"name\":\"Poker Puzzle\",\"matchMakerClass\":\"com.mpl.gs.sfs2x.core.mm.InOutMatchMaker\",\"cancelBattleIfNotStarted\":false,\"useBattleBasedUserCount\":true}",
                  "ecs" => {
        "version" => "1.5.0"
    },
            "javaclass" => "com.mpl.dops.zkwatchdog.MonitoringService",
            "log-level" => "INFO",
             "@version" => "1",
    "zk-changed-values" => "Copied \"/gameConfigs/1000061\" to \"/gameConfigs/1000145\"",
                "input" => {
        "type" => "log"
    }
}

Does anyone have a clue what I am doing wrong? Any suggestions are greatly appreciated.

Thanks,
Paresh

What is the configuration of your grok filter?

The Logstash config file looks as follows:


input {
    beats {
        port => "5045"
        include_codec_tag => false
    }
}

filter {

    grok {
        id => "zk-watcher"
        add_tag => "zk-grokparse-success"
        tag_on_failure => ["logstash-2", "zk-grokparsefailure"]
        match => {
            "message" => [
                "%{TIMESTAMP_ISO8601:timestamp} \[(?<threadname>[^\]]+)\] %{LOGLEVEL:log-level}  %{JAVACLASS:javaclass} -(?:.*)[=](?<service-name>.*)\n(?<zk-changed-values>(.|\r|\n)*)"
            ]
        }
    }


    }

output{
      stdout {}

}


Output is sent to stdout for testing the grok; in production it will point to the Elasticsearch endpoint.

You need to disambiguate which newline should separate the service-name from zk-changed-values. Grok is picking one (the last one possible) whereas Kibana picks a different one (the first).

In your pattern for `service-name`, change `.*` to `[^\n]*`.
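
The difference is reproducible with any backtracking regex engine. Here is an illustration in Python (purely for demonstration; it is not the engine grok uses), with `(?s)` standing in for "dot can cross newlines": the greedy `.*` backtracks to the *last* `\n`, while `[^\n]*` is forced to stop at the *first* one:

```python
import re

msg = "first line\nsecond line\nthird line"

# Greedy ".*" that can match newlines: the engine picks the LAST "\n"
# as the separator, like Logstash did with service-name here.
greedy = re.match(r"(?s)(.*)\n(.*)", msg)

# "[^\n]*" cannot cross a newline, so the FIRST "\n" separates the
# captures, matching what the Kibana Grok Debugger showed.
anchored = re.match(r"(?s)([^\n]*)\n(.*)", msg)

# greedy.group(1)   == "first line\nsecond line"
# anchored.group(1) == "first line"
```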

This resolved the issue. Thanks a ton! :+1:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.