How to get multiple entries of similar pattern from the same input line using grok pattern?

Hi Team,

I'm familiar with grok patterns and am able to parse and store data into Elasticsearch using a Logstash configuration file with filter and grok patterns.

For example:
If the data input line is:


field1, field2, and field3 are parsed and stored into Elasticsearch successfully without any problem.

But now I have an input line like the one below:


meaning there are multiple occurrences of the required pattern in the same input line, with Start marking the beginning of each pattern and # separating the occurrences.

Is there any way to fetch all such fields and store them into Elasticsearch?

Kindly help me, and let me know if any further information is required.

You could use mutate+split to convert that to an array using # as the delimiter, then pass the array to grok, which will iterate over the entries.
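
A minimal sketch of that approach, assuming the event text is in the message field and # separates the occurrences (the grok pattern here is a placeholder to be replaced with your own):

```
filter {
  # Split the single string into an array of substrings on "#"
  mutate {
    split => { "message" => "#" }
  }
  # grok iterates over the array entries and applies the pattern to each
  grok {
    match => { "message" => ["Start%{GREEDYDATA:entry}"] }
  }
}
```

Note that this keeps all matches in one event; each captured field becomes an array with one element per occurrence.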

Thanks for the reply.

Could you please share any reference links or an example configuration? I'm not familiar with mutate and split.

mutate+split is documented here. grok here.

Hi @Badger,

I've gone through the documentation links, but I don't understand how to use mutate+split to split the data and then pass the resulting array to grok for further processing.

Any help is appreciated.

Hi Team,

I've tried to use mutate + split and a grok pattern in the Logstash configuration file as below:

mutate {
  split => { "message" => "#" }
}
grok {
  match => { "message" => ["Detailed_Dashboard-%{USERNAME:ThreadId}|%{WORD:FileName}|%{DATA:FilePath}|%{DATA:TableName}|%{DATA:User}|%{DATA:Class}|%{DATA:Method}|%{DATA:Server}|%{DATA:FromTime}|%{DATA:ToTime}|%{NUMBER:ResponseTime:int}|%{GREEDYDATA:IsError}"] }
  remove_field => [ "message" ]
  add_field => { "pattern_type" => "Detailed_Dashboard" }
}

It's not saving the 2 entries into Elasticsearch as separate documents; instead, it's storing the data as 2 elements in each field, e.g. the TableName field contains the data dsta, dsta

Could you please help me understand how to use these correctly in order to save those entries as separate rows?

Any help would be highly appreciated.

If your data has arrays and you want to save the elements as different documents, then you can use a split filter.
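
A minimal sketch of the split filter, assuming the array lives in the message field:

```
filter {
  # Turn one event whose "message" field is an array
  # into one event per array element
  split {
    field => "message"
  }
}
```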

Hi @Badger,

I want to store each pattern occurrence as its own individual record.

For example, take the scenario below:


then I want to store 3 records into Elasticsearch like:

field1 field2 field3
Elasticsearch 100 Database
Logstash 200 Parser
Kibana 300 UI

Is this doable using a Logstash pipeline configuration or not?

Yes, use mutate+split to divide the string into an array using # as a delimiter. Use a split filter to separate the array into three events. Use a csv filter to parse each event into separate fields.
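
The three steps above can be sketched as a single filter block (the field name, column names, and the comma separator in the csv filter are assumptions based on your example, not tested against your actual data):

```
filter {
  # 1. Divide the string into an array using "#" as the delimiter
  mutate {
    split => { "message" => "#" }
  }
  # 2. Separate the array into one event per entry
  split {
    field => "message"
  }
  # 3. Parse each event into separate fields
  csv {
    source    => "message"
    separator => ","
    columns   => ["field1", "field2", "field3"]
  }
}
```

Each entry then becomes its own document in Elasticsearch, with field1, field2, and field3 populated per row.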

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.