I am using S3 SQS notifications with Filebeat 8.13.4 to fetch logs from an S3 bucket and send them through an ingest pipeline. Logs are being fetched and the index is created on the cluster. The pipeline is a clone of Filebeat's S3 access ingest pipeline, and I need to do some additional processing in it. However, none of the new fields added by the pipeline's processors (via a `set` processor) show up in the index, even though pipeline simulation shows the processor executing and the field being added with the expected value. The output section of my filebeat.yml contains:
```yaml
output.elasticsearch:
  index: "server_logs-%{+MM.dd}"
  pipeline: test

setup.template.name: "test_clone"
setup.template.enabled: true
setup.template.pattern: "test_server*"
```
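For context, this is a minimal sketch of the kind of `set` processor in my cloned pipeline, along with the simulate call I used to verify it (the field name and value here are illustrative, not my actual pipeline):

```
PUT _ingest/pipeline/test
{
  "processors": [
    { "set": { "field": "environment", "value": "production" } }
  ]
}

POST _ingest/pipeline/test/_simulate
{
  "docs": [
    { "_source": { "message": "sample log line" } }
  ]
}
```

The simulate response shows the new field in `_source`, but documents indexed through Filebeat do not have it.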
I also need to add a grok processor to parse the fields produced by the default pipeline and add the parsed fields to the index. But I cannot get any processor to execute against the incoming documents, and the unparsed data ends up in the index.
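As an illustration, this is the kind of grok processor I am trying to add to the pipeline (the pattern and target field names are placeholders, not my real log format):

```
PUT _ingest/pipeline/test
{
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": ["%{IPORHOST:client_ip} %{WORD:http_method} %{URIPATHPARAM:request_path}"],
        "ignore_failure": true
      }
    }
  ]
}
```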
Any help is welcome.
Thanks
Ajay