How to use flush_pattern

Hi,
I have a log file like this:
12:07:10.698 [ INFO] [aa] MqConsume(): New message "Message108" received from Topic "Topic1"
12:07:10.698 [DEBUG] [aa] MqMessageHeaders(): Processing message headers
12:07:10.699 [ INFO] [aa] MqConsume(): Message headers:
12:07:10.699 [ INFO] [aa] MqConsume(): Message type: Binary
12:07:10.700 [DEBUG] [aa] MqBytesMessage(): Processing binary message body
12:07:10.700 [DEBUG] [aa] MqBytesMessage(): Raw binary message size 679.80 KB (696113 bytes)
12:07:10.711 [DEBUG] [aa] MqBytesMessage(): Decompressing message content
12:07:10.902 [ INFO] [bb] ReadMessage(): New message "Message108" received from source "mq():topic:"Topic1""
12:07:10.902 [DEBUG] [cc] InsertMessage(): Inserting message "Message108"
12:07:19.356 [DEBUG] [cc] InsertMessage(): Succeeded
12:07:19.361 [DEBUG] [aa] MqAckMessage(): Acknowledging message "Message108"
12:07:19.362 [DEBUG] [bb] UpdateMessageStateTemp(): Updating route message state, Status Received -> CheckingDuplicate, Error NoError -> NoError
12:07:19.362 [ INFO] [bb] CheckForDuplicates_PerformCheck(): Checking message "Message108" for duplicates
12:07:19.832 [ INFO] [bb] WriteMessage_PerformWriting(): Routing message "Message108" from "asd:topic:"Topic1""
12:07:19.832 [DEBUG] [dd] Write(): Writing message "Message108" into BOS database
12:07:19.832 [DEBUG] [dd] DoUnserialize(): Unserializing message into data packet structure
12:07:19.832 [DEBUG] [dd] LoadResource(): Loading resource from message "Message108"
12:07:20.058 [DEBUG] [ee] LoadFromStream(): Unserializing resource with utf-8 encoding from JSON format
12:07:24.553 [DEBUG] [ee] LoadFromStream(): Unserialization completed successfully
12:07:24.712 [DEBUG] [ee] StartTransaction(): Starting transaction
12:07:25.746 [ INFO] [ee] LoadRows(): Rows count 8410
12:07:26.365 [ INFO] [ee] Progress 1% (items: 85/8410, time left: 1 minutes 0 seconds)
12:08:27.466 [ INFO] [ee] Progress 100% (items: 8410/8410, time left: 0 seconds)
12:08:27.472 [DEBUG] [ee] CommitTransaction(): Committing transaction
12:08:27.555 [DEBUG] [ee] Load(): Rows loaded
12:08:27.555 [ INFO] [bb] ValidateWriteResult(): Write command result: Success
12:08:27.555 [DEBUG] [bb] UpdateMessageState(): Updating route "Topic1" message "Message108", replace content is "No":
Status: Sending -> Sent
Error: NoError -> NoError
12:08:27.555 [DEBUG] [cc] UpdatePendingMessage(): Updating pending message 1300
12:08:28.561 [DEBUG] [cc] UpdatePendingMessage(): Succeeded
12:08:28.561 [DEBUG] [bb] UpdateMessageStateTemp(): Updating route message state, Status Sent -> Disposing, Error NoError -> NoError
12:08:28.561 [ INFO] [bb] DisposeMessage(): Disposing message "Message108"
12:08:28.561 [DEBUG] [cc] DisposePendingMessage(): Disposing pending message 1300
12:08:30.398 [DEBUG] [ff] Cleanup(): Running operative table cleanup
12:08:30.399 [DEBUG] [ff] Cleanup(): 1 message(s) deleted
12:08:30.399 [DEBUG] [cc] DisposePendingMessage(): Succeeded


But only these three rows are interesting to me:

12:07:10.698 [ INFO] [aa] MqConsume(): New message "Message108" received from Topic "Topic1"
12:08:27.466 [ INFO] [ee] Progress 100% (items: 8410/8410, time left: 0 seconds)
12:08:30.399 [DEBUG] [cc] DisposePendingMessage(): Succeeded

These three rows mean that:

  1. the message was received
  2. the message was 100% processed
  3. the status is Succeeded

Is it possible to use flush_pattern in the filebeat.yml file to send ONLY THESE THREE LINES AS ONE MESSAGE to Logstash?

Hi @jurisn,

Yes, you can use flush_pattern. Please find the documentation links below.

https://www.elastic.co/guide/en/beats/filebeat/master/multiline-examples.html
And
https://www.elastic.co/guide/en/beats/filebeat/6.0/_examples_of_multiline_configuration.html
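
For example, applied to your log it could look roughly like this (a sketch only; the input type, file path, and regexes are assumptions based on your sample, and on 6.3+ the section is called filebeat.inputs instead of filebeat.prospectors):

filebeat.prospectors:
- type: log
  paths:
    - /path/to/your.log                 # assumed path
  # Start a new event on the "New message received" line ...
  multiline.pattern: 'MqConsume\(\): New message'
  multiline.negate: true
  multiline.match: after
  # ... and flush it when the "Succeeded" line arrives
  multiline.flush_pattern: 'DisposePendingMessage\(\): Succeeded'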

Hope this helps.

Regards,

I did it like that (parentheses escaped so the regexes match the literal "()" in the log):

multiline.pattern: 'MqConsume\(\): New message'
multiline.negate: true
multiline.match: after
multiline.flush_pattern: 'DisposePendingMessage\(\): Succeeded'

but it takes ALL rows (30 rows per event) between the first and the last row. That is not suitable for me.

Is it possible to send ONLY THESE THREE LINES AS ONE MESSAGE to Logstash?

As I understand it, the solution is here:
https://www.elastic.co/guide/en/logstash/current/plugins-filters-aggregate.html

It means that Filebeat's multiline can only group consecutive rows into one event; it cannot drop the rows in between or aggregate selected rows, so the aggregation has to be done in Logstash.
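
A rough sketch of that approach (untested; the field names, regexes, and the choice of %{source} as task_id are assumptions): configure Filebeat to forward only the three interesting lines (for example with include_lines instead of multiline), and merge them into one event in Logstash with the aggregate filter:

filter {
  if [message] =~ /MqConsume\(\): New message/ {
    aggregate {
      task_id => "%{source}"            # assumes one message in flight per log file at a time
      code => "map['lines'] = [event.get('message')]"
      map_action => "create"
    }
    drop { }                            # keep only the final, combined event
  }
  else if [message] =~ /Progress 100%/ {
    aggregate {
      task_id => "%{source}"
      code => "map['lines'] << event.get('message')"
      map_action => "update"
    }
    drop { }
  }
  else if [message] =~ /DisposePendingMessage\(\): Succeeded/ {
    aggregate {
      task_id => "%{source}"
      # append the last line and expose all three as one field (10.chr is a newline)
      code => "map['lines'] << event.get('message'); event.set('combined', map['lines'].join(10.chr))"
      map_action => "update"
      end_of_task => true
      timeout => 300                    # seconds to wait before discarding an unfinished message
    }
  }
}

Note that the aggregate filter needs all events of one task to pass through the same worker, so the pipeline has to run with a single worker (pipeline.workers: 1 or -w 1).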
