I ingest logs from a SaaS solution through the pre-built Elastic Agent integration. The logs are pretty noisy, and I want to reroute them into different namespaces (data streams) to apply different ILM policies.
What are my options?
I have tried to reroute those logs via the *@custom pipeline, keying on different fields, and it broke the integration (at least no logs arrived from the integration until I emptied the pipeline again, i.e. deleted all processors, lol). I am now thinking of adding the reroute processors to the "final pipeline", after the logs are parsed. Is that a good idea at all?
How exactly did you reroute them, and what broke? Can you provide more context?
The last pipeline you have access to change is the @custom one for each integration, so any reroute processor needs to go in this pipeline.
A common issue when using reroute is permissions: by default, each integration only has permission to write to the data streams and namespace configured in the policy. Since 9.1+ there is an option to grant extra permissions so the reroute processor can write to other data streams.
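As a sketch, a reroute processor in the integration's @custom pipeline could look like the following (the pipeline name `logs-mysaas.audit@custom`, the condition field, and the `noisy` namespace are placeholders, not the actual integration's names):

```json
PUT _ingest/pipeline/logs-mysaas.audit@custom
{
  "processors": [
    {
      "reroute": {
        "tag": "route-debug-logs",
        "if": "ctx.log?.level == 'debug'",
        "namespace": "noisy"
      }
    }
  ]
}
```

Documents matching the condition would then land in a data stream like `logs-mysaas.audit-noisy`, which can be matched by an index template carrying a different ILM policy; documents that don't match continue to the default namespace unchanged.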
Which version are you on, and which integrations do you want to reroute?
Everything was working fine; then, after we restarted the integration server, logs just stopped being delivered at some point. The new data streams (one per namespace) were created successfully, it just stopped for whatever reason. When I cleaned up the @custom pipeline, logs resumed delivery to the default index. So the issue is not permissions, I guess. There were no errors in the logs (or I just missed them) and no errors in the Elastic Agent integration.