Need help getting key values into Elastic


#1

ELK version 6.2.3

Sample data is below:

{'_links': {'doc': {'href': '/mgmt/docs/status/MQSystemResources'}, 'self': {'href': '/mgmt/status/default/MQSystemResources'}}, 'MQSystemResources': {'HAStatus': 'Online', 'TotalErrorsStorage': 16179, 'UsedStorage': 528408, 'UsedErrorsStorage': 495, 'TotalTraceStorage': 32256, 'UsedTraceStorage': 691, 'TotalStorage': 3051008, 'HAPartner': 'MQA1 (Online)'}}

The Logstash config has a Beats input and an output section.

The expectation is to have the keys defined in ES and searchable in Kibana.

When looking in Kibana, there is only a message field.

stdout from Logstash looks like this:

```
{
"@timestamp" => 2018-06-07T05:01:56.544Z,
"prospector" => {
    "type" => "log"
},
   "message" => "{'_links': {'doc': {'href': '/mgmt/docs/status/MQSystemResources'}, 'self': {'href': '/mgmt/status/default/MQSystemResources'}}, 'MQSystemResources': {'HAStatus': 'Online', 'TotalErrorsStorage': 16179, 'UsedStorage': 528408, 'UsedErrorsStorage': 495, 'TotalTraceStorage': 32256, 'UsedTraceStorage': 691, 'TotalStorage': 3051008, 'HAPartner': 'MQA1 (Online)'}}",
  "@version" => "1",
    "source" => "/var/log/om.log",
    "offset" => 363,
    "fields" => {
        "topic" => "dummy",
    "logsource" => "dummy"
},
      "beat" => {
    "hostname" => "elk",
     "version" => "6.2.3",
        "name" => "elk"
}
}
```

How do I get the key values indexed in ES?


(Robert Cowart) #2

https://www.elastic.co/guide/en/logstash/current/plugins-filters-json.html
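For reference, a minimal filter block using that plugin might look like the sketch below (field names assumed from the stdout output above; `target` is an optional setting of the json filter):

```
filter {
  json {
    source => "message"
    target => "parsed"   # optional: nest the parsed keys under [parsed]
  }
}
```

If parsing succeeds, the keys become fields that Elasticsearch will index and Kibana can search.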


#3

@rcowart I have tried that, and it did not work.
Are there any other options for parsing it?


(Robert Cowart) #4

Exactly what did you try? Exactly what were the results?


#5

@rcowart As the message is already JSON, I tried adding and removing the json filter, but no luck.

Filebeat configuration is as below:

```
filebeat.prospectors:
  - type: log
    enabled: true
    paths:
      - /var/log/*.json
    fields:
      logsource: mq_event
    processors:
      - decode_json_fields:
          fields: ['message']

output.logstash:
  hosts: ["logstash:5000"]
```

Logstash pipeline:

```
input {
  beats {
    port => 5000
  }
}

filter {
  json {
    source => "[message]"
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "dev-%{+YYYY.MM.dd}"
  }

  stdout { codec => rubydebug }
}
```

Logstash stdout output is below:

```
elk_logstash | {
elk_logstash | "message" => "{'_links': {'doc': {'href': '/mgmt/docs/status/MQSystemResources'}, 'self': {'href': '/mgmt/status/default/MQSystemResources'}}, 'MQSystemResources': {'HAStatus': 'Online', 'TotalErrorsStorage': 16179, 'UsedStorage': 528408, 'UsedErrorsStorage': 495, 'TotalTraceStorage': 32256, 'UsedTraceStorage': 691, 'TotalStorage': 3051008, 'HAPartner': 'MQA1 (Online)'}}",
elk_logstash | "@version" => "1",
elk_logstash | "fields" => {
elk_logstash | "logsource" => "mq_event"
elk_logstash | },
elk_logstash | "tags" => [
elk_logstash | [0] "beats_input_codec_plain_applied",
elk_logstash | [1] "_jsonparsefailure"
elk_logstash | ],
elk_logstash | "@timestamp" => 2018-06-07T14:36:31.495Z,
elk_logstash | "prospector" => {
elk_logstash | "type" => "log"
elk_logstash | },
elk_logstash | "host" => "58b63133dc90",
elk_logstash | "beat" => {
elk_logstash | "name" => "58b63133dc90",
elk_logstash | "version" => "6.2.4",
elk_logstash | "hostname" => "58b63133dc90"
elk_logstash | },
elk_logstash | "source" => "/var/log/om.json",
elk_logstash | "offset" => 363
elk_logstash | }
```


#6

The json filter does not like the single quotes. Try:

```
mutate { gsub => [ "message", "'", '"' ] }
```
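A quick way to see why the filter fails, and why the substitution helps, is the Python sketch below (independent of Logstash; the sample payload is abridged from the data above):

```python
import json

# Single-quoted sample, as produced by the log source
raw = "{'MQSystemResources': {'HAStatus': 'Online', 'UsedStorage': 528408}}"

try:
    json.loads(raw)           # strict JSON requires double-quoted strings
except json.JSONDecodeError:
    print("parse failed")     # this branch is taken

# The same substitution the mutate/gsub filter performs
fixed = raw.replace("'", '"')
doc = json.loads(fixed)       # now parses
print(doc["MQSystemResources"]["HAStatus"])  # -> Online
```

One caveat: a blanket quote substitution would corrupt any value that legitimately contains an apostrophe; for the sample data in this thread it is safe.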

#7

@Badger Thanks. This resolved my issue.


(system) #8

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.