After my company was acquired, we were asked to migrate from our on-prem Splunk to our parent company's AWS-hosted ELK stack. This is all new to me, so I could use some help figuring out the right path forward.
They have us using Filebeat and Winlogbeat to ship logs to a public-facing Logstash server, which forwards them to a Kafka topic; a second Logstash server consumes from Kafka, and the events are then indexed into Elasticsearch (pretty sure I have that flow right). Data is going in successfully; however, it's really ugly. I'm trying to understand how to get proper field extraction/ingest parsing for the various sources we have, like Apache, Cisco network devices, and system logs.
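For context, here's roughly what our Filebeat side looks like; the Logstash hostname and port below are placeholders, not the real ones:

```yaml
# filebeat.yml (simplified; hostname/port are placeholders)
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

# We ship to their public Logstash, not to Elasticsearch directly
output.logstash:
  hosts: ["logstash-ingest.parentco.example:5044"]
```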
As a specific example, I'm using the Filebeat Cisco module for our firewalls (which is working) and have been trying to understand pipelines and dashboards. It appears that, since we can't talk directly to their Elasticsearch instance, filebeat setup --pipelines won't work. Is there another way to get the ingest pipeline in place so we can get the parsing magic from these modules? Similarly for the dashboards (if they're worth using)?
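For reference, this is roughly how I have the module enabled (simplified, and the syslog port here is made up for this post):

```yaml
# modules.d/cisco.yml (simplified; listening port is made up)
- module: cisco
  asa:
    enabled: true
    var.input: syslog
    var.syslog_host: 0.0.0.0
    var.syslog_port: 9001
```

The setup command I tried was along the lines of filebeat setup --pipelines --modules cisco, which, as far as I can tell, needs Filebeat to be able to reach Elasticsearch directly.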
Thanks for any light you can shed on this!