I have a TXT file that I want to import into Elasticsearch via Logstash.
It has fixed columns, which I already split using grok and if conditions, and that part works fine. Now I want to use multiline so that 'n' lines get inserted/consolidated into the same Elasticsearch document.
The first column of each line is the line type (it can be 1, 2, or 3). I would like to consolidate everything below a type-1 line into one Logstash event, closing the event when the next type-1 line appears. Here is an example of lines in the txt file:
In the output it would be like this:
LINES 1, 2, 3, 4 - one Elasticsearch document (_id)
LINE 5 - another document (_id)
LINES 6 and 7 - another document (_id)
LINE 8 - another document (_id)
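For reference, this is roughly what I imagine the multiline codec would look like: a line that does NOT start with "1" gets merged into the previous event. This is an untested sketch, and the file path is just a placeholder:

```
input {
  file {
    path => "/path/to/file.txt"        # placeholder path
    start_position => "beginning"
    codec => multiline {
      pattern => "^1"                  # a type-1 line starts a new event
      negate => true                   # lines NOT matching the pattern...
      what => "previous"               # ...are appended to the previous event
    }
  }
}
```

I am not sure this is the right combination of `negate` and `what` for my case, or how the last event of the file gets flushed.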
I have been researching but I can't find a correct pattern for this in Logstash. Alternatively, should I skip Logstash for this step and instead read the file with Python, consolidating the lines into an array?
Can someone help me?
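For the Python alternative, this is a minimal sketch of what I mean by consolidating into an array, assuming the line type is the first whitespace-separated column (the function name `consolidate` and the sample lines are just for illustration):

```python
def consolidate(lines):
    """Group lines into records: a type-1 line opens a new record,
    any following line (type 2 or 3) is appended to the current one."""
    records = []
    current = []
    for line in lines:
        stripped = line.rstrip("\n")
        if not stripped:
            continue
        line_type = stripped.split()[0]  # first column is the line type
        if line_type == "1":
            if current:                  # close the previous record
                records.append(current)
            current = [stripped]
        elif current:                    # type 2/3 joins the open record
            current.append(stripped)
    if current:                          # flush the last record
        records.append(current)
    return records

# Illustrative input matching the grouping described above
sample = ["1 aaa", "2 bbb", "3 ccc", "3 ddd", "1 eee", "1 fff", "2 ggg", "1 hhh"]
for record in consolidate(sample):
    print(record)
```

Each resulting record (a list of lines) would then become one document to index into Elasticsearch.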