sFlow data into Kafka?

Hi there,

I'm shipping sFlow data into my Elastic Stack using ElastiFlow's filters and dashboards. The input is UDP with the sflow codec.
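For reference, the input side currently looks roughly like this (simplified; 6343 is the standard sFlow port):

```
input {
  udp {
    port  => 6343    # standard sFlow port
    codec => sflow   # decodes sFlow datagrams into events
  }
}
```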

My architecture has Logstash ingress nodes that do only minimal processing; their primary role is to get events into the Kafka cluster as quickly as possible, so that the Logstash filter nodes can process them before indexing to ES. The filter stage has more CPU and room to scale by adding nodes.

I've moved the ElastiFlow filters behind Kafka onto the filter nodes, kept the sFlow UDP input on the ingress nodes, and this is working.
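Roughly, the current pipelines look like this (broker, topic, and host names are placeholders):

```
# Ingress nodes: decode sFlow at the edge, ship decoded events to Kafka
input {
  udp {
    port  => 6343
    codec => sflow
  }
}
output {
  kafka {
    bootstrap_servers => "kafka01:9092"    # placeholder broker
    topic_id          => "sflow-decoded"   # placeholder topic
    codec             => json
  }
}

# Filter nodes: consume from Kafka, run the ElastiFlow filters, index to ES
input {
  kafka {
    bootstrap_servers => "kafka01:9092"
    topics            => ["sflow-decoded"]
    codec             => json
  }
}
filter {
  # ElastiFlow filter chain goes here
}
output {
  elasticsearch {
    hosts => ["es01:9200"]   # placeholder
  }
}
```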

The issue is that the sflow codec appears to be doing the majority of the processing, and it's causing load problems on the ingress nodes, which were never sized for that kind of work.

What I'd like to do is get the raw sFlow datagrams into Kafka and move the sflow codec to the filter nodes to do the heavy lifting, but I can't for the life of me figure out how to go about it, i.e. which codecs to use on the UDP input and the Kafka output so that the raw sFlow data is preserved.
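The kind of setup I'm imagining (but haven't been able to get working) is sketched below. The `charset => "BINARY"` part is a guess at how to stop Logstash re-encoding the raw datagram bytes as UTF-8, and I have no idea whether the sflow codec can even decode from a Kafka message rather than a UDP packet:

```
# Ingress nodes: pass raw sFlow bytes through untouched (if that's possible)
input {
  udp {
    port  => 6343
    codec => plain { charset => "BINARY" }   # guess: avoid UTF-8 re-encoding of the binary payload
  }
}
output {
  kafka {
    bootstrap_servers => "kafka01:9092"
    topic_id          => "sflow-raw"
    codec             => plain { format => "%{message}" }   # ship only the raw payload
  }
}

# Filter nodes: do the expensive decode here instead
input {
  kafka {
    bootstrap_servers => "kafka01:9092"
    topics            => ["sflow-raw"]
    codec             => sflow   # unclear whether this works outside a UDP input
  }
}
```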

Is this even possible?

Thanks.

Any ideas about this? I really can't find a way around this problem.
