Hello guys,
I’m working on a solution that uses Elasticsearch as a data lake and exports data to an external SIEM.
As a first requirement, I want to leverage Elasticsearch's integration and enrichment capabilities, so I cannot export data directly via an Elastic Agent output, because the data would not be processed by the ingest pipelines provided by the native integrations. My next thought was to use Logstash with Elasticsearch as an input, but that way Logstash becomes a single point of failure. How can I deploy a solution that addresses this requirement (an alternative without Logstash is also fine)? I would like a solution that scales dynamically and avoids sending the same data twice.
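For context, the Logstash approach I'm considering would look roughly like this minimal sketch (hostnames and the SIEM output are placeholders; the fixed `now-1m` window is just an illustration and by itself would not guarantee exactly-once delivery across restarts or overlapping polls):

```
input {
  elasticsearch {
    hosts    => ["https://es-node:9200"]   # placeholder Elasticsearch endpoint
    index    => "logs-*"                   # data already enriched by ingest pipelines
    schedule => "* * * * *"                # poll every minute
    query    => '{ "query": { "range": { "@timestamp": { "gte": "now-1m" } } } }'
    docinfo  => true                       # keep _index/_id so the SIEM side can deduplicate
  }
}

output {
  # placeholder: whatever output the external SIEM accepts (syslog, http, kafka, ...)
}
```

Even if I run several Logstash instances of this pipeline for availability, they would all poll the same data, which is exactly the duplicate-sending problem I want to avoid.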