I'm trying to find a way to stream logs from fastly.com to an Elastic Cloud service. According to Fastly's docs, you can only do this through Logstash (which is not included in the cloud service). It seems impractical to run and manage a Logstash instance just to receive logs from a single source.
Has anyone successfully streamed logs from Fastly directly to Elasticsearch?
It seems the only way to stream logs to Elasticsearch is through Beats.
After looking through the list of supported log services in the docs you linked, two paths might be viable:
1. Use Logstash or Filebeat to forward the syslog stream to Elasticsearch's HTTP API (a minimal Logstash sketch follows this list).
2. Try to (ab)use one of the HTTP-based service integrations to submit JSON docs to an ingest pipeline on your Elasticsearch cluster. If I were to attempt that, I would start with the Sumo Logic integration, because it looks quite generic and configurable. I would point it at a small HTTP server I control to inspect the format and derive a pipeline from that (see the ingest pipeline sketch below).
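For the first path, something like the following Logstash pipeline could work. This is only a minimal sketch, assuming Fastly streams over syslog; the port, Cloud ID, credentials, and index name are placeholders you would replace with your own:

```
input {
  # Point Fastly's syslog logging endpoint at this listener; the port is arbitrary
  syslog {
    port => 5140
  }
}

output {
  # cloud_id / cloud_auth are placeholders for your Elastic Cloud deployment
  elasticsearch {
    cloud_id   => "my-deployment:dXMtZWFzdC0xLmF3cy5mb3VuZC5pbw=="
    cloud_auth => "elastic:changeme"
    index      => "fastly-logs-%{+YYYY.MM.dd}"
  }
}
```

Filebeat's syslog input plus its Elasticsearch output should achieve roughly the same thing if you'd rather not run a full Logstash instance.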
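For the second path, an ingest pipeline on the cluster could reshape whatever the integration posts. A rough sketch, assuming the payload arrives as a JSON string in a `message` field with an ISO8601 `timestamp` — both assumptions you would verify by inspecting the actual requests first:

```
PUT _ingest/pipeline/fastly-logs
{
  "description": "Parse JSON log lines pushed by an HTTP-based Fastly integration",
  "processors": [
    {
      "json": {
        "field": "message",
        "add_to_root": true
      }
    },
    {
      "date": {
        "field": "timestamp",
        "formats": ["ISO8601"]
      }
    }
  ]
}
```

The pipeline name and field names here are illustrative; the real processors depend entirely on the format the integration sends.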