We are currently using Filebeat 8.13.x to ship Zeek and Suricata sensor data via Kafka, with Logstash subscribing to the Kafka topics and pushing into Elasticsearch.
We are in the midst of moving from Filebeat to Elastic Agent and have installed the Suricata and Zeek integrations. However, we are running into this error:
"error"=>{"type"=>"illegal_argument_exception", "reason"=>"pipeline with id \[%{\[@metadata\]\[pipeline\]}\] does not exist".
This is our current Logstash output
output {
  elasticsearch {
    hosts => [ "nodes" ]
    index => "logs-nsm-endpoint"
    action => "create"
    manage_template => false
    pipeline => "%{[@metadata][pipeline]}"
    user => "logstash"
    password => "supersecretpassword"
    ssl => true
    cacert => '/etc/logstash/ssl/elastic-certificate.crt'
    ssl_certificate_verification => true
  }
}
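For what it's worth, one variation we have been sketching (not yet validated) guards the pipeline option with a conditional, on the assumption that the agent-shipped events simply don't carry `[@metadata][pipeline]`, so the unresolved sprintf string never reaches Elasticsearch as a literal pipeline id:

```
output {
  # Only pass a pipeline when the event actually carries one; otherwise
  # Elasticsearch receives the literal string "%{[@metadata][pipeline]}"
  # as the pipeline id, which is the error we are seeing.
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => [ "nodes" ]
      index => "logs-nsm-endpoint"
      action => "create"
      manage_template => false
      pipeline => "%{[@metadata][pipeline]}"
      # ...same user/password/ssl options as above...
    }
  } else {
    elasticsearch {
      hosts => [ "nodes" ]
      index => "logs-nsm-endpoint"
      action => "create"
      manage_template => false
      # ...same user/password/ssl options as above...
    }
  }
}
```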
We have tried changing the pipeline option to the ingest pipeline that was installed with the integration assets, but that just errored out on all the data.
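Concretely, that attempt amounted to hardcoding the pipeline option (the id below is illustrative only; ours was copied from the installed integration asset):

```
# Illustrative pipeline id, not our actual one — integration pipelines
# are named like logs-<dataset>-<package version>.
pipeline => "logs-suricata.eve-2.20.0"
```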
Does anyone have a clear-cut guide, a way forward, or any suggestions?