Logstash filter based on first line as identifier

I am using a lightweight Python-based forwarder (from GitHub) instead of Filebeat, because the server sending the files runs an old Solaris version that Filebeat does not support. With this forwarder I am unable to add fields the way Filebeat can. My Logstash config needs some information to tell what type of data it is receiving, so it can apply the right CSV filter and index the data. On other (Linux-based) platforms I have achieved this by reading field values defined in the Filebeat config, e.g. if [fields][fieldname] == "xyz", which I can't use with the Python-based forwarder.
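On the Filebeat-fed pipelines the routing looks roughly like this; the field name, the "xyz" value, and the column names below are just placeholders, not my real config:

filter {
  # Route on the custom field added under `fields:` in filebeat.yml
  if [fields][fieldname] == "xyz" {
    csv {
      separator => ","
      columns   => ["col_a", "col_b", "col_c"]   # placeholder column names
    }
  }
}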

As a solution, I was looking for a way to use the first line of the file as an identifier that Logstash can base its decision on. The file is freshly generated every hour and will have the identifier on its first line each time. I want Logstash to determine the type of data by looking at that line and apply the corresponding CSV filter.

Something of this sort:

filter {
   if [line_1] == "type_1" {
      csv { ... }
   }
   else if [line_1] == "type_2" {
      csv { ... }
   }
}

Is something like this possible? Thank you.

If your forwarder sends each line of the file as a separate event then no, it is not possible, since the events from each source are independent. If your forwarder sends the entire file as a single event then yes, you could do a pattern match to determine the type of the event.
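For example, here is a minimal sketch assuming the whole file arrives in [message] and the first line starts with a literal marker; the TYPE_1 / TYPE_2 markers and the column names are placeholders, not anything from your setup:

filter {
  if [message] =~ /^TYPE_1/ {
    # drop the identifier line, split the rest into one event per line, then parse
    mutate { gsub => [ "message", "^TYPE_1.*\n", "" ] }
    split { field => "message" }
    csv { columns => ["col_a", "col_b", "col_c"] }
  } else if [message] =~ /^TYPE_2/ {
    mutate { gsub => [ "message", "^TYPE_2.*\n", "" ] }
    split { field => "message" }
    csv { columns => ["col_x", "col_y"] }
  }
}

The split filter turns the multi-line event back into one event per CSV row before the csv filter parses it.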
