Multiline messages are not displayed in the "Visualize" or "Dashboard" screens

Hello Dear Community,

I am having problems displaying multiline messages in Kibana 7.5.2 after they have been parsed.

In the Kibana visualization and dashboard, "message.keyword" is not shown when the logs contain multiline messages.
Let me explain my configuration and my multiline issue. I hope I can give the information needed to troubleshoot it, so here we go:

I want some sample logs to be displayed; here are examples of these logs:

2020-01-15 08:01:02,575 [DefaultMessageListenerContainer-1] INFO  - a
2020-01-15 08:01:02,575 [DefaultMessageListenerContainer-1] ERROR - b
2020-01-15 08:01:02,575 [DefaultMessageListenerContainer-1] WARN  - c
2020-01-15 08:01:02,575 [DefaultMessageListenerContainer-1] DEBUG - d

Something more realistic:

2020-01-21 08:01:02,575 [DefaultMessageListenerContainer-1] DEBUG - <<<<<The values of DDMEAS are>>>>>
2020-01-21 08:01:02,575 [DefaultMessageListenerContainer-1] DEBUG - Measurement  type udc id is: <XXXXX>
2020-01-21 08:01:02,575 [DefaultMessageListenerContainer-1] DEBUG - External measurement code: <YYYYY>
2020-01-21 08:01:02,575 [DefaultMessageListenerContainer-1] DEBUG - DtsContext SvcPtId: <7415>

When the messages are on a single line, all of them are displayed correctly after I created the visualization and the dashboard:

So far so good. Now let's drop the following log file:

2020-01-22 08:01:02,576 [DefaultMessageListenerContainer-1] DEBUG - <<<<<The values of XXXXX are>>>>>
2020-01-22 08:01:02,577 [DefaultMessageListenerContainer-1] DEBUG - Measurement  type udc id is: <9>
2020-01-22 08:01:02,578 [DefaultMessageListenerContainer-1] DEBUG - Returning BO : TimezoneBO
id[2307]
lastUpdDt[Tue Nov 06 07:34:13 UTC 2018]
insertDt[Tue Nov 06 07:34:13 UTC 2018]
insertBy[203]
lastUpdBy[203]
orgId[102]
recVersionNum[1]
dataSrc[SOURCE1]
name[Europe/Vienna]
javaName[Europe/Vienna]
displayStdOffset[GMT+01:00]
hasDst[Y] from the boBag
2020-01-22 08:01:02,579 [DefaultMessageListenerContainer-1] DEBUG - External measurement code: <10>
2020-01-22 08:01:02,580 [DefaultMessageListenerContainer-1] DEBUG - DtsContext SvcPtId: <7415>

As you can see, the multiline message is the one that begins with "Returning...".

When I open the "Discover" view, I can see the line was parsed correctly:

When I open it in the same view, it says:

"No cached mapping for this field. Refresh field list from the Management > Index Patterns page"

And that row does not appear in the Dashboard:

So, as suggested, I refresh the field list:

I go back to the "Discover" screen and I can see the exclamation mark (warning) is gone:

But it still does not appear in the visualization:

Any file I drop that contains multiline messages is not shown in the dashboard. Could someone please advise what needs to be done to get those logs displayed?

Here is my configuration:

  • filebeat.yml (only the relevant part; shared as an image because the formatting kept breaking, see the sketch below)
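
Since the filebeat.yml is only attached as an image, here is roughly what the relevant multiline part looks like (just a sketch; the path and the exact pattern are assumptions based on the log samples above):

filebeat.inputs:
  - type: log
    enabled: true
    paths:
      - /vagrant/*.log
    # lines that do NOT start with a timestamp are appended to the previous event
    # (assumed pattern, adjust it to the real log format)
    multiline.pattern: '^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}'
    multiline.negate: true
    multiline.match: after

output.logstash:
  hosts: ["localhost:5044"]

With negate: true and match: after, every line that does not match the timestamp pattern is appended to the previous line, which matches what the Logstash output further down shows (one event with embedded newlines and the "multiline" flag).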


Logstash configuration:

input {
  beats {
    port => 5044
    ssl => false
  }
}

# filter extracting the log4j fields from the message: timestamp, thread, severity
filter {
  fingerprint {
    method => "SHA1"
    key => "KEY"
  }

  grok {
    # useful: http://grokdebug.herokuapp.com/
    match => [ "message", "(?m)%{TIMESTAMP_ISO8601:timestamp}\s{0,4}\[%{DATA:thread}\]\s{0,4}%{LOGLEVEL:loglevel}\s{0,4}-\s{0,4}%{GREEDYDATA:message}" ]
    overwrite => [ "message" ]
  }

  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss,SSS" ]
  }
}

output {
  elasticsearch {
    hosts => "localhost:9200"
    ilm_rollover_alias => "index-vagrant"
    ilm_pattern => "000001"
    ilm_policy => "vagrant-policy"
    document_id => "%{fingerprint}"
  }
  stdout {
    codec => rubydebug
  }
}
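
In case it helps, the pipeline syntax can be checked without actually starting Logstash (the paths below are assumptions, adjust them to your installation):

/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/beats.conf --config.test_and_exit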

This is the script used to create the ILM policy, the index template, and the initial index with its write alias:

#!/bin/bash

echo "--- creating policy ---"
curl -0 -v -X PUT http://localhost:9200/_ilm/policy/vagrant-policy \
-H "Expect:" \
-H 'Content-Type: application/json; charset=utf-8' \
-d @- << EOF

{
  "policy": {
    "phases": {
      "hot": {
        "actions": {
          "rollover": {
            "max_age": "5d",
            "max_size": "10G"
          },
          "set_priority": {
            "priority": 100
          }
        }
      },
      "warm": {
        "min_age": "5d",
        "actions": {
          "forcemerge": {
            "max_num_segments": 1
          },
          "allocate": {
            "number_of_replicas": 0
          },
          "set_priority": {
            "priority": 50
          }
        }
      },
      "cold": {
        "min_age": "14d",
        "actions": {
          "freeze": {}
        }
      },
      "delete": {
        "min_age": "14d",
        "actions": {
          "delete": {}
        }
      }
    }
  }
}
EOF


echo "--- creating template ---"
curl -0 -v -X PUT http://localhost:9200/_template/vagrant-policy \
-H "Expect:" \
-H 'Content-Type: application/json; charset=utf-8' \
-d @- << EOF
{
  "index_patterns": [
    "index-vagrant-*"
  ],
  "settings": {
    "index": {
      "number_of_shards": 1,
      "number_of_replicas": 1,
      "lifecycle.name": "vagrant-policy",
      "lifecycle.rollover_alias": "index-vagrant"
    }
  },
  "mappings": {
    "properties": {
      "HTTP_RESULT_CODE": { "type": "short" },
      "PROCESSING_TIME": { "type": "integer" },
      "propertyValue": { "type": "integer" }
    }
  }
}
EOF


echo "--- creating index alias ---"
curl -0 -v -X PUT http://localhost:9200/index-vagrant-000001 \
-H "Expect:" \
-H 'Content-Type: application/json; charset=utf-8' \
-d @- << EOF
{
  "aliases": {
    "index-vagrant": {
      "is_write_index": true
    }
  }
}
EOF
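
For reference, these plain GET calls can be used afterwards to double-check that the policy, the template and the write alias were really created (only the names from the script above are used):

curl http://localhost:9200/_ilm/policy/vagrant-policy?pretty
curl http://localhost:9200/_template/vagrant-policy?pretty
curl http://localhost:9200/_alias/index-vagrant?pretty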

Not sure whether this is relevant, but here is part of the Elasticsearch log:

[2020-01-24T13:44:04,217][INFO ][o.e.c.r.a.AllocationService] [ubuntu1804.localdomain] Cluster health status changed from [RED] to [YELLOW] (reason: [shards s
[2020-01-24T13:44:10,448][INFO ][o.e.c.m.MetaDataIndexTemplateService] [ubuntu1804.localdomain] adding template [.management-beats] for index patterns [.manag
[2020-01-24T13:44:41,333][INFO ][o.e.c.m.MetaDataDeleteIndexService] [ubuntu1804.localdomain] [index-vagrant-000001/mKZJEITVQUaoAj9NU5uBag] deleting index
[2020-01-24T13:45:01,231][INFO ][o.e.c.m.MetaDataCreateIndexService] [ubuntu1804.localdomain] [index-vagrant] creating index, cause [auto(bulk api)], template
[2020-01-24T13:45:01,459][INFO ][o.e.c.m.MetaDataMappingService] [ubuntu1804.localdomain] [index-vagrant/BPsCIvDnQuWY7t7Aae1oBA] create_mapping [_doc]
[2020-01-24T13:49:18,427][INFO ][o.e.c.m.MetaDataDeleteIndexService] [ubuntu1804.localdomain] [index-vagrant/BPsCIvDnQuWY7t7Aae1oBA] deleting index
[2020-01-24T13:49:29,842][INFO ][o.e.c.m.MetaDataCreateIndexService] [ubuntu1804.localdomain] [index-vagrant] creating index, cause [auto(bulk api)], template
[2020-01-24T13:49:30,020][INFO ][o.e.c.m.MetaDataMappingService] [ubuntu1804.localdomain] [index-vagrant/vpUJlWvVTmaH2eoQyFilqg] create_mapping [_doc]
[2020-01-24T13:52:10,597][INFO ][o.e.c.m.MetaDataMappingService] [ubuntu1804.localdomain] [.kibana_1/gyRLAhm8TGaB1o7qsn5WAQ] update_mapping [_doc]
[2020-01-24T14:03:21,438][INFO ][o.e.c.m.MetaDataMappingService] [ubuntu1804.localdomain] [index-vagrant/vpUJlWvVTmaH2eoQyFilqg] update_mapping [_doc]

I can also see the message received in Logstash:

{
     "@timestamp" => 2020-01-11T07:01:02.578Z,
    "fingerprint" => "7f4b845356d25eaf7e507accd319382632522f72",
        "message" => "Returning BO : TimezoneBO\nid[2307]\nlastUpdDt[Tue Nov 06 07:34:13 UTC 2018]\ninsertDt[Tue Nov 06 07:34:13 UTC 2018]\ninsertBy[203]\nlastUpdBy[203]\norgId[102]\nrecVersionNum[1]\ndataSrc[SOURCE1]\nname[Europe/Vienna]\njavaName[Europe/Vienna]\ndisplayStdOffset[GMT+01:00]\nhasDst[Y] from the boBag",
           "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
       "@version" => "1",
          "input" => {
        "type" => "log"
    },
          "agent" => {
        "ephemeral_id" => "e7fce2eb-1edd-4ce4-bff8-2ac30deefcb4",
                  "id" => "0353296c-cc91-4943-9278-1cb4716bea08",
             "version" => "7.5.2",
            "hostname" => "ubuntu1804.localdomain",
                "type" => "filebeat"
    },
            "log" => {
          "file" => {
            "path" => "/vagrant/uu.log"
        },
        "offset" => 203,
         "flags" => [
            [0] "multiline"
        ]
    },
           "host" => {
        "name" => "ubuntu1804.localdomain"
    },
         "thread" => "DefaultMessageListenerContainer-1",
      "timestamp" => "2020-01-11 08:01:02,578",
            "ecs" => {
        "version" => "1.1.0"
    },
       "loglevel" => "DEBUG"
}

I hope all the information is OK. Thanks and regards,

~Marco T.

I have tried adjusting the truncate:maxHeight advanced setting, but no luck yet :frowning:

Even the numbers match:

I created this visualization to test the following logs:

2020-01-15 08:01:02,575 [DefaultMessageListenerContainer-1] INFO  - a
2020-01-15 08:01:02,575 [DefaultMessageListenerContainer-1] ERROR - b
2020-01-15 08:01:02,575 [DefaultMessageListenerContainer-1] WARN  - c
2020-01-15 08:01:02,575 [DefaultMessageListenerContainer-1] DEBUG - d

And:

2020-01-01 08:01:02,576 [DefaultMessageListenerContainer-1] DEBUG - <<<<<The values of XXXXX are!!!!>>>>>
2020-01-01 08:01:02,577 [DefaultMessageListenerContainer-1] DEBUG - Measurement  type udc id is: <1234>
2020-01-01 08:01:02,578 [DefaultMessageListenerContainer-1] DEBUG - Returning BO : LocoBO
id[2307]
lastUpdDt[Tue Nov 06 07:34:13 UTC 2018]
insertDt[Tue Nov 06 07:34:13 UTC 2018]
insertBy[203]
lastUpdBy[203]
orgId[102]
recVersionNum[1]
dataSrc[SOURCE1]
name[Europe/Vienna]
javaName[Europe/Vienna]
displayStdOffset[GMT+01:00]
hasDst[Y] from the boBag
2020-01-01 08:01:02,579 [DefaultMessageListenerContainer-1] DEBUG - External measurement code: <99>
2020-01-01 08:01:02,580 [DefaultMessageListenerContainer-1] DEBUG - DtsContext SvcPtId: <8383>

Everything would be excellent if the multiline message were displayed:

The only thing I found is that when I select "Show missing values", the multiline message that is supposed to be there appears as "Missing".
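
One thing I still want to verify (only an assumption on my side): if the dynamic mapping gave "message" a keyword sub-field with the default ignore_above of 256, then values longer than 256 characters are never indexed into message.keyword, and a terms aggregation on that field would show exactly this, with the long multiline documents ending up in the "Missing" bucket. A query like the following sketch should list the documents that have no message.keyword value:

curl -X GET "http://localhost:9200/index-vagrant/_search?pretty" \
-H 'Content-Type: application/json' \
-d '{
  "query": {
    "bool": {
      "must_not": { "exists": { "field": "message.keyword" } }
    }
  }
}'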

I am testing with a single multiline message, and somehow it does appear in the dashboard:

[root@ubuntu1804 20:23:19] vagrant $ cat mul.log
2020-01-24 08:01:02,578 [DefaultMessageListenerContainer-1] INFO  - a
2020-01-24 08:01:02,579 [DefaultMessageListenerContainer-1] ERROR - b
2020-01-24 08:01:02,580 [DefaultMessageListenerContainer-1] WARN  - c
2020-01-24 08:01:02,581 [DefaultMessageListenerContainer-1] WARN  - kd kf
kfk
2020-01-24 08:01:02,582 [DefaultMessageListenerContainer-1] DEBUG - d

Result:

So it seems the problem is the multiline message. Does someone know how to display something bigger, or can you please advise which visualization would be better?

Regards

~M

Hello,

I am digging deeper into this issue; it looks like strings like this:

id[2307]
lastUpdDt[Tue Nov 06 07:34:13 UTC 2018]
insertDt[Tue Nov 06 07:34:13 UTC 2018]
insertBy[203]
lastUpdBy[203]
orgId[102]
recVersionNum[1]
dataSrc[SOURCE1]
name[Europe/Vienna]
javaName[Europe/Vienna]
displayStdOffset[GMT+01:00]

cause a problem. The advice of an expert is very much welcome :slight_smile:

I can see that when the logs do not contain these multiline strings, things work quite normally. How can I clean up the multilines in Logstash?

It kind of works with the following filter in Logstash:

filter {

  mutate {
    gsub => ["message", "\n", " (LF) "]
  }

  fingerprint {
    method => "SHA1"
    key => "KEY"
  }

  grok {
    # useful: http://grokdebug.herokuapp.com/
    break_on_match => false
    #match => [ "message", "(?m)%{TIMESTAMP_ISO8601:timestamp}\s{0,4}\[%{DATA:thread}\]\s{0,4}%{LOGLEVEL:loglevel}\s{0,4}-\s{0,4}%{GREEDYDATA:message}" ]
    match => [ "message", "(?m)%{TIMESTAMP_ISO8601:timestamp}\s{0,4}\[%{DATA:thread}\]\s{0,4}%{LOGLEVEL:loglevel}\s{0,4}-\s{0,4}(?<message>(.|\r|\n)*)" ]
    overwrite => [ "message" ]
  }

  date {
    match => [ "timestamp" , "yyyy-MM-dd HH:mm:ss,SSS" ]
  }

  truncate {
    length_bytes => 100
    fields => "message"
    add_tag => ["shortened_message"]
  }
}
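
Another option might be to raise the limit on the keyword sub-field in the index template instead of truncating, assuming the visualization aggregates on message.keyword and the default ignore_above of 256 is what drops the long multiline values (just a sketch; 8191 is an arbitrary upper bound, and long keywords make the terms aggregation heavier):

"mappings": {
  "properties": {
    "message": {
      "type": "text",
      "fields": {
        "keyword": { "type": "keyword", "ignore_above": 8191 }
      }
    }
  }
}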

Maybe someone has a better idea?

Regards

~M

The Discover view is just another visualization that can be added to the dashboard; issue solved.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.