Save the first lines of data and make them shareable for all file lines in Logstash

Hello,
I've the following snapshot from a file that I'm using filebeat and logstash to send it to ES

Service Endpoint: http://1.1.1.1:8080/ECM
TestSuite : JenkinsDeploy
SuccessCount:4
FailureCount:0
TestCases:1 AdditionalCustomerProfile_JO.xml, Result: Success
TestCases:2 CustomerProfile_EG.xml, Result: Success
TestCases:3 LookUpRetrievals_EG.xml, Result: Success
TestCases:4 LookUpRetrievals_JO.xml, Result: Success

All I want to do is attach the first four lines to every event sent to ES, noting that I read many files and each file contains different data.
Is there any way to do this?

I've been thinking about this too, and I think I'm going to try the aggregate filter:

https://www.elastic.co/guide/en/logstash/current/plugins-filters-aggregate.html

Since you know what each of those lines looks like, you will have to set up four grok patterns for the first four lines and then aggregate those fields together.

Then, whenever a line starts with "TestCases", add the aggregated fields to that event, but don't define it as an end event, so the map stays available for the remaining lines.
I'm not sure whether this will work, but it's worth a shot.
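A rough sketch of what that pipeline might look like. The field names, grok patterns, and the `[log][file][path]` task key are assumptions based on the sample above (older Filebeat versions put the path in `source` or `path` instead), so adjust them to your actual events:

```
filter {
  # One grok pass; each line should match at most one of these patterns.
  grok {
    match => {
      "message" => [
        "^Service Endpoint: %{URI:endpoint}",
        "^TestSuite : %{WORD:testsuite}",
        "^SuccessCount:%{NUMBER:success_count}",
        "^FailureCount:%{NUMBER:failure_count}",
        "^TestCases:%{NUMBER:case_number} %{NOTSPACE:testcase}, Result: %{WORD:result}"
      ]
    }
  }

  # Header lines: stash their fields in the aggregate map, keyed by file path,
  # so each file keeps its own header data.
  if [endpoint] or [testsuite] or [success_count] or [failure_count] {
    aggregate {
      task_id => "%{[log][file][path]}"
      code => "
        map['endpoint']      ||= event.get('endpoint')
        map['testsuite']     ||= event.get('testsuite')
        map['success_count'] ||= event.get('success_count')
        map['failure_count'] ||= event.get('failure_count')
      "
    }
  }

  # TestCases lines: copy the stashed header fields onto the event.
  # Deliberately no end_of_task, so the map survives for later lines.
  if [testcase] {
    aggregate {
      task_id => "%{[log][file][path]}"
      map_action => "update"
      code => "
        event.set('endpoint',      map['endpoint'])
        event.set('testsuite',     map['testsuite'])
        event.set('success_count', map['success_count'])
        event.set('failure_count', map['failure_count'])
      "
    }
  }
}
```

Note that the aggregate filter only works with a single pipeline worker (`pipeline.workers: 1` or `-w 1`), otherwise events can be processed out of order and the header may not be in the map yet when a TestCases line arrives.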
