I'm looking to forward logs that are already stored in Elasticsearch to an external application.
We're aware that we could use Logstash to split new incoming logs between Elasticsearch and another external destination, BUT we would also like to forward our already existing logs.
While waiting for a response, I did some research and went through using Logstash and the APIs to export logs from Elasticsearch.
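For reference, this is the kind of approach I was experimenting with: a rough sketch that reads existing documents with the scroll-based `scan` helper from the Python client and posts each one to an external HTTP endpoint. The `logs-*` index pattern and the destination URL are just placeholders for our setup, not something from the docs.

```python
# Sketch only: export existing documents from Elasticsearch and forward
# them to an external HTTP endpoint. "logs-*" and DEST_URL are placeholders.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import scan
import requests

es = Elasticsearch("http://localhost:9200")
DEST_URL = "https://external-app.example.com/ingest"  # placeholder endpoint

# scan() wraps the scroll API, so matching documents are streamed in
# batches instead of being loaded into memory all at once.
for hit in scan(es, index="logs-*", query={"query": {"match_all": {}}}):
    requests.post(DEST_URL, json=hit["_source"], timeout=10)
```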
As per your response, "There are many batch and streaming ETL tools that can use elasticsearch as a source." @stephenb, can you please name a few ETL tools that can help me achieve this objective? (Are you referring to queuing tools?)
Thanks for the response @stephenb
Considering that we would export the logs from Elasticsearch by any of the above means, wouldn't this impose overhead on the ELK stack, since it would be querying a huge amount of data across all the indices?
Like any datastore, reads and writes require some level of compute and I/O resources; nothing is free.
That is why teams with this use case often split the ingest feed to both Elasticsearch and the other destination at ingest time.
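To make that concrete, here is a minimal sketch of what dual-writing at ingest time looks like. It assumes the 8.x Python client and a hypothetical external HTTP endpoint; in practice this is usually done in the pipeline itself, for example a Logstash pipeline with an elasticsearch output plus a second output for the other destination.

```python
# Toy illustration of splitting the feed at ingest time: each incoming
# record is written to Elasticsearch and forwarded to the external
# destination in the same step, so no bulk re-read of the indices is
# needed later. The index name and endpoint URL are placeholders.
from elasticsearch import Elasticsearch
import requests

es = Elasticsearch("http://localhost:9200")
DEST_URL = "https://external-app.example.com/ingest"  # placeholder

def ingest(record: dict) -> None:
    es.index(index="logs-demo", document=record)      # store in Elasticsearch
    requests.post(DEST_URL, json=record, timeout=10)  # forward to the external app

ingest({"message": "user logged in", "level": "INFO"})
```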