I have been trying to parse a complex log pattern for a while. Using multiline pattern matching in Filebeat, I have been able to group the lines as I need. The sample log format is as follows:
{
[Timestamp] [ClassA] [Group1][Group2][Data]
[Timestamp] [Group1][Data]
[Timestamp] [Group2][Data]
[Timestamp] [ClassB][Data]
}
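For reference, the Filebeat multiline settings I am using look roughly like this (the path and the pattern below are simplified placeholders; I am assuming here that every event starts with a [ClassA] line):

filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/sample.log          # placeholder path
    # any line that does NOT start a new [ClassA] event is appended
    # to the previous event
    multiline.pattern: '^\[[^\]]+\] \[ClassA\]'
    multiline.negate: true
    multiline.match: after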
This is how the individual events are stitched together:
{
[Timestamp] [ClassA] [Group1][Group2][Data]
[Timestamp] [Group1][Data]
[Timestamp] [Group2][Data]
[Timestamp] [ClassB][Data]
}
{
[Timestamp] [ClassA] [Group1][Group2][Group3][Data]
[Timestamp] [Group1][Data]
[Timestamp] [Group2][Data]
[Timestamp] [Group3][Data]
[Timestamp] [ClassB][Data]
}
.....
Now I need to use Logstash to create events with the following fields:
{
data from [ClassA], data from [Group1] of ClassA, data from [ClassB]
}
{
data from [ClassA], data from [Group2] of ClassA, data from [ClassB]
}
This is how the events should be split.
The issue I am facing is figuring out a way to dynamically add the group fields, since the number of groups differs from event to event. The group lines do start with a consistent pattern, which would make them easy to grok, but because their count is dynamic, a single grok pattern does not seem viable. I think the split and clone filters might be more relevant here, but I am not sure how to put this together. Kindly help!
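The closest I have come to a plan is a ruby filter that collects the group lines into an array, followed by a split on that array, roughly like the sketch below. The field names (class_a, class_b, groups) are just placeholders I made up, and I have not verified that this is the right approach:

filter {
  ruby {
    code => '
      # the stitched multiline event arrives as one message with newlines
      lines = event.get("message").to_s.split("\n")
      groups = []
      lines.each do |line|
        if line.include?("[ClassA]")
          event.set("class_a", line)        # placeholder field name
        elsif line.include?("[ClassB]")
          event.set("class_b", line)        # placeholder field name
        elsif line =~ /\[Group\d+\]/
          groups << line                    # one entry per [GroupN] line
        end
      end
      event.set("groups", groups)
    '
  }

  # split turns the array into one event per group,
  # copying class_a and class_b into each resulting event
  split {
    field => "groups"
  }
}

Does something along those lines make sense, or is there a cleaner way to do this?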