Parsing a JSON array of objects

I have a log file that's an array of objects that looks something like this:

[
  {
    "cate1": "data1a",
    "cate2": "data2a"
  },
  {
    "cate1": "data1b",
    "cate2": "data2b"
  }
]

and I need each object in the array to become a separate document in Elasticsearch, with each "cate" as its own field. How would I go about configuring Logstash to do this?


Use the split filter.
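A bare sketch of what that looks like, assuming the array has already been parsed into an event field (the field name events here is just a placeholder, not anything from your file):

split {
  field => "events"   # emits one new event per element of the array
}

Each resulting event then holds a single object from the array in that field.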


I've tried a few variations of using the split filter (by itself, with the json and mutate filters, etc.) but to no avail. Could you demonstrate the proper config for this filter? Thank you. 🙂

The documentation contains an example that looks very much like what you have: https://www.elastic.co/guide/en/logstash/master/plugins-filters-split.html


Respectfully, I do not see a way to use the split filter to do what gigvinyl is proposing. Do you propose having a split filter like this (it wouldn't work, but at least it's a strawman):
split {
  field => "foo"
  terminator => "},"
  target => "bar"
}
json {
  source => "bar"
  target => "abc"
}

The field being split is an array, so the terminator option doesn't apply. And why would you use a json filter? The original question indicates an array of objects, not an array of JSON strings.
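To make the distinction concrete: terminator is for splitting a field that holds one delimited string, while a field that already holds an array gets split element by element with no terminator at all. A sketch of both forms (field names are only placeholders):

# "lines" holds a single string such as "a\nb\nc"
split {
  field => "lines"
  terminator => "\n"
}

# "foo" already holds an array of objects; each element becomes its own event
split {
  field => "foo"
}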