Logstash Pipeline for Filebeat-auditd-(Redis)-LS-Elastic

Dear All,

I'm sending Filebeat auditd data to Redis, and using Logstash to forward the same to ES. Why FB to Redis to LS to ES, when FB can send directly to ES, could be a question. It is because we are enriching the data as part of LS.

Irrespective of the added data, I'm trying to reuse the Filebeat module's ingest pipeline settings to parse the incoming data, but that pipeline contains a Painless script, which is not supported in Logstash. I have to use Ruby instead.
(Refer to https://github.com/elastic/beats/blob/master/filebeat/module/auditd/log/ingest/pipeline.json)

If I understand the script correctly, it does the following:

  1. Removes keys whose values are empty
  2. Sets a property value for x64 processes based on the arch key value
  3. Converts hex-encoded values to ASCII
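As a starting point for the Ruby rewrite, here is a minimal plain-Ruby sketch of those three steps. This is my approximation of the Painless logic, not a drop-in port: the field names (`arch`, the hypothetical `arch_bits` flag) and the exact x64 mapping are assumptions, so please verify them against the pipeline.json linked above.

```ruby
# 1. Drop keys whose values are nil or empty, at any depth.
#    (Logstash's prune filter only handles the top level.)
def remove_empty(obj)
  return obj unless obj.is_a?(Hash)
  obj.each_with_object({}) do |(k, v), out|
    v = remove_empty(v)
    next if v.nil? || (v.respond_to?(:empty?) && v.empty?)
    out[k] = v
  end
end

# 2. Set a property for x64 processes when the arch key is present.
#    (Hypothetical field names; check pipeline.json for the real mapping.)
def tag_arch(log)
  log["arch_bits"] = "64" if log["arch"] == "x86_64"
  log
end

# 3. Decode a hex-encoded string (e.g. an auditd proctitle) to ASCII.
def hex_to_ascii(hex)
  [hex].pack("H*")
end
```

For example, `hex_to_ascii("2F7573722F62696E")` yields `"/usr/bin"`, and `remove_empty` leaves `{"b" => {"d" => "x"}}` from `{"a" => "", "b" => {"c" => "", "d" => "x"}}`.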

For the 1st case I can use the prune filter, but prune works only at the top level.
For the 2nd case I may be able to use mutate, but the arch key is not always present.
For the 3rd case I have yet to work out the Ruby.
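For wiring any of this into the pipeline, a ruby filter block along these lines could host the logic; the field path `[auditd][log]` is an assumption based on the module's output, so adjust it to match the actual events:

```
filter {
  ruby {
    code => '
      log = event.get("[auditd][log]")
      if log.is_a?(Hash)
        # top-level empty-key removal; recurse here for nested fields
        log.delete_if { |k, v| v.nil? || v == "" }
        event.set("[auditd][log]", log)
      end
    '
  }
}
```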

Regardless, I'm trying to check whether there is a version of this pipeline already aligned to the Redis-to-LS flow. Also, are there any other alternatives using filters that are available by default in LS?

Appreciate any guidance/pointer.

Regards
Karthik R
