Hello,
We are in the process of migrating an older ELK stack to Elastic Cloud. We are already evaluating pricing, the subscription model, etc., and I am getting ahead of the process with some questions I have.
Right now we use Filebeat to collect logs from IIS servers and send them to Logstash. Each IIS server has several log folders, and in Filebeat we tag them using "fields" so that, depending on the tag, Logstash routes each one to a certain index. In Elastic Cloud I think there is no Logstash — do you know how we could do this so that in Elastic Cloud we still know which index each log corresponds to?
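For what it's worth, Filebeat itself can route events to different indices based on those custom fields, with no Logstash in between. A minimal sketch, assuming a hypothetical folder path, a `log_type` field name, and placeholder Cloud credentials (your actual names and values will differ):

```yaml
filebeat.inputs:
  # One input per IIS log folder; the custom field marks the source.
  - type: log
    paths:
      - 'C:\inetpub\logs\LogFiles\W3SVC1\*.log'
    fields:
      log_type: site1

output.elasticsearch:
  # Conditional index routing based on the custom field.
  indices:
    - index: "iis-site1-%{[agent.version]}"
      when.equals:
        fields.log_type: "site1"

# Elastic Cloud is addressed via cloud.id/cloud.auth instead of hosts.
cloud.id: "<your-cloud-id>"
cloud.auth: "elastic:<password>"
```

This covers the simple "one folder, one index" mapping; anything beyond field-based routing (parsing, enrichment, lookups) is where Logstash would still come in.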
Depending on what you do with Logstash, you could also keep your Logstash server and output the data to Elastic Cloud, since some things are not possible with just Filebeat and Elasticsearch and still need Logstash.
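If you do keep Logstash, its Elasticsearch output can point at an Elastic Cloud deployment directly via `cloud_id`/`cloud_auth`. A sketch, assuming a hypothetical `log_type` field set by Filebeat and placeholder credentials:

```
input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    cloud_id   => "<deployment-name:cloud-id>"
    cloud_auth => "elastic:<password>"
    # Route to an index based on the Filebeat-supplied custom field.
    index      => "iis-%{[fields][log_type]}-%{+YYYY.MM.dd}"
  }
}
```

The only change from a self-managed setup is the output block: the filter logic you already have can stay as-is.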
Filebeat collects the log files from several IIS servers and, depending on the folder, assigns a "fields" value to each one so that Logstash can set an index name for it.
But from what you say, we could replace this configuration, which is the one we use right now on some servers:
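For context, a Filebeat configuration of that shape (per-folder inputs, each tagged with a custom field, forwarding to Logstash) typically looks something like the following — the paths, field values, and host here are hypothetical placeholders, not the actual config:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - 'C:\inetpub\logs\LogFiles\W3SVC1\*.log'
    fields:
      log_type: web
  - type: log
    paths:
      - 'C:\inetpub\logs\LogFiles\W3SVC2\*.log'
    fields:
      log_type: api

output.logstash:
  hosts: ["logstash.example.internal:5044"]
```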