Best option to parse some data: grok or reindex?

I am collecting machine tool data, and the logs I am getting are in a JSON format that can be sent to Logstash. However, I need to break one of the JSON fields down into several more documents. Should I run this through a grok filter, or manipulate it in Elasticsearch via reindexing/mapping?

This is what I am sending to Logstash now:
{"timestamp": "2020-11-17T19:02:19.352Z","sequence": 2460137,"deviceName": "OKUMA.MachiningCenterMA650-EAST","deviceUUID": "OKUMA.MachiningCenterMA650-EAST.190056","componentId": "Mp1","dataItemId": "Mp1ProgramHeader","Events": { "ProgramHeader": { "name": "p1ProgramHeader", "@@data": "(CIMATRON E13)( FILE NAME:24625_DET11B_BSH)( OKUMA PROGRAM )( rob.mank )( Monday November 9, 2020 - 1:22:43 PM )(slab top)( TOOL NAME: 2.0 INGER .06R 4.0 )( TOOL DIAMETER......: 2. )" } } }

What I need to break down is:
{
  "name": "p1ProgramHeader",
  "@@data": "(CIMATRON E13)( FILE NAME:24625_DET11B_BSH)( OKUMA PROGRAM )( rob.mank )( Monday November 9, 2020 - 1:22:43 PM )(slab top)( TOOL NAME: 2.0 INGER .06R 4.0 )( TOOL DIAMETER......: 2. )"
}

Each item in parentheses needs to be sent to a new document. These parenthesized items come from the header at the beginning of a machine tool program (G-code) and contain information about the job running on the machine.
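
To make it concrete, this is roughly what I was imagining on the Logstash side. Just an untested sketch: it assumes the event arrives as a JSON string in the message field, and header_items is a field name I made up.

filter {
  # Parse the incoming JSON string into event fields.
  json {
    source => "message"
  }

  # Pull each "( ... )" group out of the @@data string into an array.
  # The field path and the header_items name are assumptions on my part.
  ruby {
    code => '
      data = event.get("[Events][ProgramHeader][@@data]")
      unless data.nil?
        items = data.scan(/\(([^)]*)\)/).flatten.map(&:strip)
        event.set("header_items", items)
      end
    '
  }

  # Clone the event once per array element, so each parenthesized
  # item ends up in its own document.
  split {
    field => "header_items"
  }
}

That would keep everything in the pipeline instead of reindexing afterward, though I am not sure grok would handle this cleanly, since the number of ( ) groups varies from program to program.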
