How to merge log events into a single event when frequent updates happen to a file

Data is written to the log file at random intervals while the product is being installed; the whole installation process takes about 10 minutes to finish.

With the multiline settings below, I was able to send the whole file as a single event when no updates happen to the file (the log data is fixed and there are no updates while harvesting).

Is there any way to merge the whole log file's data into a single event when frequent updates happen to the log file? I tried queue.mem (https://www.elastic.co/guide/en/beats/filebeat/current/configuring-internal-queue.html) and it is not working.
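For reference, the queue.mem settings from that page look roughly like this (the values below are illustrative examples, not the exact ones from my setup). As far as I understand, queue.mem only controls how many events are buffered and batched before being sent to the output, so it would not by itself merge lines into one event:

    # filebeat.yml (top level) - internal memory queue settings
    queue.mem:
      events: 4096           # maximum number of events the queue can buffer
      flush.min_events: 512  # minimum number of events before a batch is forwarded
      flush.timeout: 5s      # forward whatever is buffered after this timeout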

Can someone share insights or options for achieving the expected outcome?

Log file data:
STEP 1: Validate Configuration Data File path

STEP 2: Read Configuration Data File

................ some more steps

STEP 100: registered services successfully, services are in running state

Product Installation Successfully Completed

filebeat.yml config:

    filebeat.inputs:
    - type: log
      enabled: true
      paths:
        - C:\Logs\test.log
      fields_under_root: true
      fields:
        product: demo
        application: test
      multiline.pattern: '^STEP'
      multiline.negate: true
      multiline.match: after
      multiline.flush_pattern: '^Product Installation'
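One thing I am wondering about (not verified yet) is whether the multiline timeout is what splits the event between writes, since the installer pauses for random intervals. multiline.timeout and close_inactive are standard options for the log input; the 15m values below are only guesses based on the ~10 minute installation time:

      # added to the same log input as above (values are assumptions)
      multiline.timeout: 15m   # keep accumulating lines across gaps between writes
      close_inactive: 15m      # keep the harvester open while the installer is still writing

I am not sure whether this is enough on its own, or whether the flush_pattern is still needed to emit the final event once the installation completes.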
