Parsing JSON objects

I have JSON logs coming from websites, like this:

"{\"path\":\"/visitors/events/\",\"value\":{\"fingerprint\":\"b9fcd389db7f5ca5fad9c767909c481a\",\"sessionId\":\"d7c41580-6155-1b2c-27f5-65c7e19a7174\",\"visitorId\":\"82a9178c-09f8-b021-1815-b558ce48778f\",\"trackingId\":\"INF-XXXXXXX\",\"userId\":null,\"userProfile\":null,\"target\":{\"selector\":\"p:nth-child(3)\"},\"duration\":2587.200000009034,\"timestamp\":\"2018-04-05T11:12:22.045Z\",\"event\":\"engage\",\"source\":{\"url\":{\"hash\":\"#nodeunit-header\",\"host\":\"localhost:8000\",\"hostname\":\"localhost\",\"pathname\":\"/test-api.html\",\"protocol\":\"http:\",\"query\":{\"q\":\"super duper query\"}}}}}

or like this:

"{\"path\":\"/visitors/events/\",\"value\":{\"fingerprint\":\"b9fcd389db7f5ca5fad9c767909c481a\",\"sessionId\":\"d7c41580-6155-1b2c-27f5-65c7e19a7174\",\"visitorId\":\"82a9178c-09f8-b021-1815-b558ce48778f\",\"trackingId\":\"INF-XXXXXXX\",\"userId\":null,\"userProfile\":null,\"timestamp\":\"2018-04-05T11:12:18.786Z\",\"event\":\"click\",\"source\":{\"url\":{\"hash\":\"#nodeunit-header\",\"host\":\"localhost:8000\",\"hostname\":\"localhost\",\"pathname\":\"/test-api.html\",\"protocol\":\"http:\",\"query\":{\"q\":\"super duper query\"}}}}}",

I am running the Elastic Stack with Docker, shipping the above via Filebeat --> Logstash --> Elasticsearch.

I am using Kibana for visualisation, and what I need is to filter on the keys inside the JSON. I have gone through various posts about logging with JSON decoding and other methods like splitting, but none of them worked. My configuration is as below.

filebeat.prospectors:
- type: log
  paths:
    - /var/logs/websocket.log
  multiline.pattern: '^{'
  multiline.negate: true
  multiline.match: after
  processors:
  - decode_json_fields:
      fields: ['message']
      target: json

#output.elasticsearch:
#  hosts: ['elasticsearch:9200']
#  username: elastic
#  password: changeme
output.logstash:
  hosts: ["logstash:5044"]

And my Logstash conf

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => [ 'elasticsearch' ]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
    user     => 'elastic'
    password => 'changeme'
  }
}
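If you would rather parse on the Logstash side than in Filebeat, one common approach (a sketch, not tested against your setup) is to add a `json` filter between the `input` and `output` sections, assuming each event arrives with the raw JSON text in the `message` field:

```
filter {
  # Parse the JSON text in the Beats "message" field and expand its
  # keys (path, value.event, value.timestamp, ...) into event fields
  # that Kibana can then filter on.
  json {
    source => "message"
  }
}
```

If parsing fails (for example because the lines are double-encoded, as discussed below), Logstash tags the event with `_jsonparsefailure`, which is a useful signal when debugging.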

Can you point me in the right direction?

The log has to contain a JSON object on each line. Are the lines in the log exactly as you pasted them? If so, they contain escape characters and cannot be parsed as JSON objects directly.
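To illustrate why the escapes matter: a line like the ones pasted is a JSON *string* that happens to contain JSON text, so a single decode yields a string, not an object. A small Python sketch (with a shortened, hypothetical payload):

```python
import json

# Hypothetical log line: the JSON object is wrapped in a JSON string,
# so the inner quotes are escaped, just like in the pasted samples.
line = '"{\\"event\\":\\"click\\",\\"userId\\":null}"'

# First decode unwraps the outer JSON string -> still plain text.
inner = json.loads(line)
assert isinstance(inner, str)

# Second decode parses the inner text into an actual object.
obj = json.loads(inner)
print(obj["event"])  # click
```

This is why a single `decode_json_fields` pass (or a single Logstash `json` filter) cannot produce structured fields from such lines as-is.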

Regarding configuration, you should use the json options of the Filebeat prospector instead of the `decode_json_fields` processor.
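For example, assuming Filebeat 6.x and one JSON object per line, a minimal prospector sketch using the built-in json options (replacing the multiline settings, which are not needed for single-line JSON) might look like this:

```yaml
filebeat.prospectors:
- type: log
  paths:
    - /var/logs/websocket.log
  # Decode each line as JSON and place the decoded keys at the
  # top level of the event instead of under a sub-field.
  json.keys_under_root: true
  # On decode failure, add an "error" key so bad lines are visible
  # in Kibana instead of silently passing through unparsed.
  json.add_error_key: true
```

With this in place, the fields inside the JSON become regular event fields that you can filter on in Kibana without any extra Logstash filtering.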

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.