How to read json file using filebeat and send it to elasticsearch

Hi Andrew,

Got the error and fixed it... now the Filebeat service is running, but it is still not parsing the JSON file. It only shows the fields below:

@timestamp	July 20th 2017, 22:09:19.311
@version	1
_id	AV1g3kWDrZ1B3oM6iSFT
_index	filebeat-2017.07.20
_score	1
_type	syslog
beat.hostname	ip-192-168-1-61.ec2.internal
beat.name	ip-192-168-1-61.ec2.internal
beat.version	5.4.3
host	ip-192-168-1-61.ec2.internal
input_type	log
offset	2,830,900
source	/home/json/world_bank.json
syslog_facility	user-level
syslog_facility_code	1
syslog_severity	notice
syslog_severity_code	5
tags	beats_input_codec_plain_applied
type	syslog

but I want the following fields from the JSON file to be parsed as well:

[{"application":{"id":"d6a19b39-5e40-4cbb-9857-561df066be5b","securityResourceId":"a06e9992-be1f-4355-bb37-69331acf19be","name":"hello Application","description":"","created":1490177768938,"enforceCompleteSnapshots":false,"active":true,"tags":[],"deleted":false,"user":"admin"},"applicationProcess":{"id":"776c883b-7281-43ed-93eb-9dce669675a6","name":"hello App Process","description":"","active":true,"inventoryManagementType":"AUTOMATIC","offlineAgentHandling":"PRE_EXECUTION_CHECK","versionCount":2,"version":2,"commit":54,"path":"applications/d6a19b39-5e40-4cbb-9857-561df066be5b/processes/776c883b-7281-43ed-93eb-9dce669675a6","deleted":false,"metadataType":"applicationProcess"},"environment":{"id":"6741e9d2-8e9a-4f80-b067-16eb96121149","securityResourceId":"13b02668-ad35-4031-aaf5-1dc82a4de72d","name":"helloDeploy","description":"","color":"#00B2EF","requireApprovals":false,"noSelfApprovals":false,"lockSnapshots":false,"calendarId":"d92cabe9-f96b-4637-9d62-d15bb26ec6c8","active":true,"deleted":false,"cleanupDaysToKeep":0,"cleanupCountToKeep":0,"enableProcessHistoryCleanup":false,"useSystemDefaultDays":true,"historyCleanupDaysToKeep":365,"conditions":[]},"id":"0ba25d74-8b9d-4fe9-8d80-f11feeec7ddf","submittedTime":1490246254582,"traceId":"bab5a5bc-4c87-4b9a-9481-fdffb9478384","userName":"admin","onlyChanged":true,"description":"","startTime":1490246255525,"result":"FAULTED","state":"CLOSED","paused":false,"endTime":1490246267119,"duration":11594},

What are you showing here? Where is this from?

The JSON that you posted looks to be incomplete and therefore invalid. Can you share the full file? The expectation is that there is one complete JSON object per line.
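If each JSON object does sit on its own line, Filebeat can decode it directly without Logstash. A minimal sketch of the Filebeat 5.x prospector configuration (the path is taken from the fields shown above; whether it suits your pipeline depends on your setup):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /home/json/world_bank.json
  # Decode each line as JSON and place the keys at the top level of the event
  json.keys_under_root: true
  # Add an error key to the event if decoding fails, instead of dropping data silently
  json.add_error_key: true
```

With this in place the fields from each JSON object should appear directly in the indexed event.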

Hi Andrew,

Yes, I have shared only a part of the JSON file. Sharing the full file below:

https://pastebin.com/CQznjRTn

Please suggest.

I looked at the file. It contains a JSON list that is split across multiple lines. Filebeat expects each object to begin and end on the same line. Do you want the whole list to be contained in one event, or do you want each object in the list to be indexed as its own event? If it's the latter, I would send the events to Logstash, clean each line up by removing any leading [ and trailing , characters, then parse the line as JSON.

Hi Andrew,

Yes, I am looking for the whole list to be contained in one event.

Could you also share the process for parsing it both ways?

Thanks in Advance.

Here are more details on this method.

  1. Use the mutate filter to remove the leading and trailing characters.

  2. Then add a json filter to decode the cleaned-up JSON.
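The two steps above could be sketched as a Logstash filter block like the following (a minimal, untested sketch; the field name `message` is the Beats default, and the patterns assume the array brackets and trailing commas shown in your sample):

```ruby
filter {
  # Step 1: strip a leading "[" and a trailing "," (or "]") from each line
  mutate {
    gsub => [
      "message", "^\[", "",
      "message", "[,\]]$", ""
    ]
  }
  # Step 2: decode what remains of the line as a JSON object
  json {
    source => "message"
  }
}
```

Each cleaned line then becomes one event with the object's fields parsed out.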

Hi Andrew,

Can it be done in one single event?

Thanks & Regards,
Aditya Balhara
DRYiCE Autonomics

I think you could index all the data as one event by using multiline to group all of the lines together, then apply the JSON decoding. Writing arrays of objects isn't usually a good choice if you can avoid it; the data will be hard to work with in Kibana.
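A sketch of the multiline grouping on the Filebeat side (an assumption-laden example: it treats any line not starting with `[` as a continuation of the previous one, which fits a file that opens with a single `[`; you would still decode the combined payload in Logstash):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /home/json/world_bank.json
  # Lines that do NOT match the pattern are appended to the preceding line,
  # so the whole array is shipped as one event
  multiline.pattern: '^\['
  multiline.negate: true
  multiline.match: after
```

Note that a very large array may exceed default event-size limits, which is another reason one object per event is usually easier to work with.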

This topic was automatically closed after 21 days. New replies are no longer allowed.