Hello Community!
I am trying to parse the following:
-- New Customer ---------
time customerData1
time customerData2
time date id: 1
time Service: a
time OnlineRequest
time OnlineResponse
time text
time [Status] Code: Text
time date id: 2
time Service: b
time text
time text
time text
time text
time [Status] Code: Text
....
time date id: n
time Service: xyz
time text
time text
time text
time text
time [Status] Code: Text
time customerData4
time customerData5
-- New Customer ---------
...
into something like this:
{
  "customerData1": "value",
  "customerData2": "value",
  "customerData3": "value",
  "customerData4": "value",
  "services": [
    {
      "id": "value",
      "name": "value",
      "onlineRequest": "value",
      "onlineResponse": "value",
      "text": "value",
      "statuscode": "value",
      "statustext": "value"
    },
    {
      "id": "value",
      "name": "value",
      ..
    }
  ]
}
To do that, I send all lines belonging to one customer (everything between one "-- New Customer ---------" separator and the next) to Logstash as a single event, with the lines delimited by "\n".
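Just for illustration, the grouping could be done with something like the multiline codec (the file path is only an example, not my real input):

input {
  file {
    path => "/var/log/customers.log"      # example path only
    codec => multiline {
      # every line that does NOT start a new customer block
      # is appended to the previous event
      pattern => "^-- New Customer"
      negate  => true
      what    => "previous"
    }
  }
}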
But how can I parse the service part n times with the same grok pattern?
My idea was to use a grok filter for the customerData fields, then a split filter on the whole message, and then an aggregate filter for the service part. But as far as I know the aggregate filter only works correctly with a single pipeline worker, and that would be a performance killer.
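Roughly, the chain I have in mind looks like this (the grok pattern, the customerId field and the timeout are only placeholders, not a working config):

filter {
  # 1) pull a customer field out of the whole multi-line message
  grok {
    match => { "message" => "id: %{NUMBER:customerId}" }   # placeholder pattern
  }

  # 2) turn the multi-line event into one event per line
  split {
    field      => "message"
    terminator => "\n"
  }

  # 3) collect the per-line events back into one services array per customer
  aggregate {
    task_id => "%{customerId}"
    code    => "map['services'] ||= []; map['services'] << event.get('message')"
    push_previous_map_as_event => true
    timeout => 5
  }
}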
Is there a good way to repeat a grok pattern n times, or is there another way to solve this?
I hope someone can help.
Jupp