Hello,
We are thinking about migrating some of our ingestion from Logstash to Elastic Agent integrations, but while researching the Elastic Agent documentation we were not able to find anything about using Kafka as an input.
We also looked at some of the integrations we would use, and there is not a single reference to Kafka as an input. For example, the Fortinet FortiGate integration only lists logfile, tcp, and udp as possible inputs:
```yaml
policy_templates:
  - name: fortinet_fortigate
    title: Fortinet FortiGate logs
    description: Collect logs from Fortinet FortiGate instances
    inputs:
      - type: logfile
        title: "Collect Fortinet FortiGate logs (input: logfile)"
        description: "Collecting logs from Fortinet FortiGate instances (input: logfile)"
      - type: tcp
        title: "Collect Fortinet FortiGate logs (input: tcp)"
        description: "Collecting logs from Fortinet FortiGate instances (input: tcp)"
      - type: udp
        title: "Collect Fortinet FortiGate logs (input: udp)"
        description: "Collecting logs from Fortinet FortiGate instances (input: udp)"
```
It is the same for every other integration I checked, such as Cisco FTD, Cisco ASA, and Cisco ISE.
I assumed that Elastic Agent supported Kafka as an input, since Filebeat has a Kafka input, but it looks like it is either not supported or not documented.
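For comparison, the Filebeat input we were hoping to see exposed through Elastic Agent looks roughly like this (a minimal sketch; the broker addresses, topic, and consumer group are placeholders, not our real values):

```yaml
filebeat.inputs:
  # Filebeat's kafka input consumes messages from one or more topics.
  # The hosts, topic, and group_id below are placeholder values.
  - type: kafka
    hosts:
      - "kafka-broker-1:9092"
      - "kafka-broker-2:9092"
    topics: ["fortigate-logs"]
    group_id: "filebeat-consumers"
```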
Does anyone have more information about this? Without a Kafka input we will need to change our approach and either stick with Logstash or use another tool such as Vector.
I have already opened a ticket with support, but maybe someone here can answer faster.