Filter to consolidate values in multiple logs

Hi there,

I need help optimizing my Logstash setup.

Scenario:
We use IBM DataPower and ship its logs to Logstash.
Logstash then sends them to Elasticsearch, of course.

These logs contain a field named gtid ("globalTransactionId").
One transaction generates about 10 log events, and exactly one of them contains a username.
This username identifies which application caused the transaction.

Now I need to add a new field with the application name to each of the 10 log events.
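
To illustrate what I mean, here is a rough example. The index name, field names, and values below are made up for this illustration only and are not the real DataPower log format:

```bash
# Two hypothetical events of the same transaction, indexed for illustration only.
# The one event that carries the username:
curl -s -X POST 'http://localhost:9200/datapower-logs/_doc' \
  -H 'Content-Type: application/json' \
  -d '{"gtid":"abc-123","username":"svc_billing","message":"request received"}'

# One of the other ~9 events of that transaction, without a username:
curl -s -X POST 'http://localhost:9200/datapower-logs/_doc' \
  -H 'Content-Type: application/json' \
  -d '{"gtid":"abc-123","message":"backend call finished"}'

# Goal: every document with gtid "abc-123" should additionally get something like
#   "application": "Billing"
# derived from the username in the first event.
```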

I can't find any filter that is able to buffer the event stream and add a field depending on the username.

My workaround for now is the following (a rough sketch of the script follows after the list):

  • ship the logs to Elasticsearch as usual
  • a bash script uses curl to fetch the logs from Elasticsearch by their gtid field and the username (taken from a file).
  • it then uses the scroll API with curl, iterates over every document with that gtid, and adds the new field with the application value to each one.
  • then it fetches the next gtid and so on...
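
The script looks roughly like this. This is only a sketch, not the real script: the index name datapower-logs, the field names gtid/username/application, the mapping-file format, and the Elasticsearch 7+ _update endpoint are all assumptions on my side:

```bash
#!/usr/bin/env bash
# Rough sketch of the current workaround (index name, field names,
# and the mapping-file format are assumptions, not the real setup).
ES="http://localhost:9200"
INDEX="datapower-logs"

# username_to_app.txt: one "<username> <application>" pair per line (assumed format)
while read -r USERNAME APP; do
  # 1. collect the gtids of all transactions that logged this username
  GTIDS=$(curl -s -X POST "$ES/$INDEX/_search" \
    -H 'Content-Type: application/json' \
    -d "{\"size\":1000,\"_source\":[\"gtid\"],\"query\":{\"term\":{\"username\":\"$USERNAME\"}}}" |
    jq -r '.hits.hits[]._source.gtid' | sort -u)

  for GTID in $GTIDS; do
    # 2. open a scroll over every document of that transaction
    RESP=$(curl -s -X POST "$ES/$INDEX/_search?scroll=1m" \
      -H 'Content-Type: application/json' \
      -d "{\"size\":100,\"query\":{\"term\":{\"gtid\":\"$GTID\"}}}")

    while true; do
      SCROLL_ID=$(echo "$RESP" | jq -r '._scroll_id')
      IDS=$(echo "$RESP" | jq -r '.hits.hits[]._id')
      [ -z "$IDS" ] && break

      # 3. add the application field to each document on this page
      for ID in $IDS; do
        curl -s -X POST "$ES/$INDEX/_update/$ID" \
          -H 'Content-Type: application/json' \
          -d "{\"doc\":{\"application\":\"$APP\"}}" > /dev/null
      done

      # 4. fetch the next scroll page
      RESP=$(curl -s -X POST "$ES/_search/scroll" \
        -H 'Content-Type: application/json' \
        -d "{\"scroll\":\"1m\",\"scroll_id\":\"$SCROLL_ID\"}")
    done
  done
done < username_to_app.txt
```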

This is very cumbersome and creates a lot of overhead.

If you have any ideas, I would be happy to read them.
