JSON Parsing Error - Logstash

Hi,

I am trying to ingest a JSON file, but I'm getting errors and none of the key/value pairs are being extracted to any fields.

I currently have this configuration:

input {
    file {
        path => "/home/path_to_json/test.json"
        start_position => "beginning"
        sincedb_path => "/dev/null"
    }
}

filter {
    json {
        source => "message"
    }
}

output {
    elasticsearch {
        hosts => "http://localhost:9200"
        index => "logs"
    }
    stdout { codec => rubydebug }
}

And here is a sample of the response the application in question returns to requests made via its API:

{
  "events": [
    {
      "sourceIP": "1.1.1.1",
      "destinationIP": "127.0.0.1",
      "qid": 1004
    },
    {
      "sourceIP": "1.1.1.1",
      "destinationIP": "127.0.0.1",
      "qid": 1005
    }
  ]
}

The above is just a sample; the real results will contain a lot more data, with a large number of key/value pairs. I am hoping Logstash can parse these so that, once they arrive in Elasticsearch, the pairs are extracted to fields, like:

sourceIP: 1.1.1.1
destinationIP: 127.0.0.1

Etc, etc,...

Any ideas as to what could be going wrong here?

A file input, by default, creates one event for each line of the file. If your JSON is spread across multiple lines, you will need a multiline codec to combine the lines into a single event.
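For example, a common workaround is to give the codec a pattern that never matches, so every line gets appended to the previous one, and then let auto_flush_interval flush the accumulated event. This is only a rough sketch, untested against your data: the pattern string and the one-second flush interval are arbitrary choices, and for large files you may also need to raise max_lines (the codec's default limit is 500 lines):

input {
    file {
        path => "/home/path_to_json/test.json"
        start_position => "beginning"
        sincedb_path => "/dev/null"
        codec => multiline {
            # Deliberately never matches; with negate => true every line
            # is therefore combined with the previous one into one event.
            pattern => "^__NEVER_MATCHES__"
            negate => true
            what => "previous"
            # Flush the buffered event after one second with no new lines,
            # otherwise the single accumulated event would never be emitted.
            auto_flush_interval => 1
        }
    }
}

With that in place, your existing json filter should be able to parse the whole document. Note that the parsed array will end up under an events field rather than as top-level sourceIP / destinationIP fields.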
