I'm trying to import data from a JSON file. The root-level item in the file is an object containing two keys. One of those keys, datapoints, contains an array of objects that I would like to use as events.
I cannot find a way to tell Logstash (or Filebeat, if there's a mechanism in there) to generate events from the array under the datapoints key.
Here is an example of the JSON file I'm receiving:
{
  "messages": [
    "message 1",
    "message 2"
  ],
  "datapoints": [
    {"key1": "value 1", "key2": "value 4"},
    {"key1": "value 2", "key2": "value 5"},
    {"key1": "value 3", "key2": "value 6"}
  ]
}
I would like to extract the array inside the datapoints key and treat each object in the array as an event, as if that array were the only item in the file. I am not concerned with the messages key or its value at all.
If I manually replace the contents of the file with the array from the datapoints key, resulting in the file contents below, the processing goes ahead mostly fine (although I get a validation error on the first item in the array - that's a different issue, though).
[
  {"key1": "value 1", "key2": "value 4"},
  {"key1": "value 2", "key2": "value 5"},
  {"key1": "value 3", "key2": "value 6"}
]
This generates the desired output:
{
  "key1": "value 1",
  "key2": "value 4"
}
{
  "key1": "value 2",
  "key2": "value 5"
}
{
  "key1": "value 3",
  "key2": "value 6"
}
In this case, I'm using a simple pipeline like this:
input {
  file {
    # the file input expects an absolute path
    path => "/path/to/myjsonfile.json"
    start_position => "beginning"
    codec => "json"
  }
}
output {
  stdout { codec => rubydebug }
}
Is there a way to have Logstash generate events from the array when it's nested inside an object, as in the first example, so that I don't have to write something to modify the file contents before they reach Logstash?
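For reference, the closest thing I've found is the split filter, and I'm wondering whether something along these lines is the intended approach - this is untested guesswork on my part and assumes the whole JSON object has already been parsed into a single event:
filter {
  # turn the datapoints array into one event per element
  split {
    field => "datapoints"
  }
  # drop the messages key, which I don't need
  mutate {
    remove_field => ["messages"]
  }
}
I'm not sure whether that's even the right direction, though, hence the question.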
Thanks in advance.