How to parse an array of multi-line JSON objects as separate documents in ES?

I have a use case where I will be receiving a new file every hour that is picked up by Filebeat and then sent to Logstash. The file will look like this:

[
	{
		"Results": {
			"StartWeekSec": [
				2000,
				198000.0
			],
			"ReportStartPeriod": "2019-01-15T07:00:00.000Z",
			"Constellation": "GPS",
			"ReferenceTime": [
				2000,
				31984.00
			],
			"Status": 0
		},
		"ForecastResults": [
			{
				"Point": [
					29.67,
					-7.5
				],
				"BestAccuracy": 10.98,
				"WorstAccuracy": 14.59				
			},
			{
				"Point": [
					55.37,
					54.1
				],
				"BestAccuracy": 20,
				"WorstAccuracy": 13.99		
			}
		]	
	},
	{
		"Results": {
			"StartWeekSec": [
				2000,
				198000.0
			],
			"ReportStartPeriod": "2019-01-15T07:00:00.000Z",
			"Constellation": "GPS",
			"ReferenceTime": [
				2000,
				31984.00
			],
			"Status": 0
		},
		"ForecastResults": [
			{
				"Point": [
					29.67,
					-7.5
				],
				"BestAccuracy": 10.98,
				"WorstAccuracy": 14.59				
			},
			{
				"Point": [
					55.37,
					54.1
				],
				"BestAccuracy": 20,
				"WorstAccuracy": 13.99		
			}
		]	
	}	
]

It is an array of JSON objects, some of which contain nested arrays of objects, such as "ForecastResults".

I want to take each object in the array and index it as a separate document in an Elasticsearch index that follows the above mapping. How can I achieve this using Logstash? Could it be done with Filebeat alone?
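For Filebeat to be useful here, the whole multi-line array needs to reach Logstash as a single event rather than one event per line. A minimal sketch of one way to do that with Filebeat's multiline settings (the path is a placeholder, and the exact option names may vary by Filebeat version):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /path/to/incoming/*.json   # placeholder path
    # Join every line that does NOT start with "[" onto the
    # preceding "[" line, so the whole array becomes one event.
    multiline.pattern: '^\['
    multiline.negate: true
    multiline.match: after
    multiline.max_lines: 5000      # raise if your files are larger

output.logstash:
  hosts: ["localhost:5044"]        # placeholder host
```

Splitting the array into separate documents would then still be done downstream in Logstash.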

Hi,
The split filter plugin can help you.
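A minimal pipeline sketch along those lines, assuming Filebeat delivers each file's full array as a single event (the port, host, and index name are placeholders):

```conf
input {
  beats {
    port => 5044
  }
}

filter {
  # Parse the raw JSON. Because the top level is an array,
  # the json filter needs a target field to hold it.
  json {
    source => "message"
    target => "report"
  }
  # Emit one event per element of the "report" array.
  split {
    field => "report"
  }
  mutate {
    remove_field => ["message"]
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "forecast-reports"
  }
}
```

Each resulting event then carries one `Results`/`ForecastResults` pair under the `report` field and is indexed as its own document.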

Thank you, this worked perfectly!

Hello @reillye, can you please share your configuration file? I have the same kind of scenario and have tried numerous things to achieve this, but nothing seems to work.