Logstash Netflow Codec Issues

I'm having issues handling netflow data through the logstash codec. I was wondering if anyone had any insight that could help.

Logstash version: 7.6.2 (I'm happy to upgrade if need be; I'm not tied to a specific version)

For reference, I'm adding the Dockerfile as well:

FROM logstash:7.6.2

# Create symlink so we can use paths from production with the logstash docker defaults
USER root

COPY compose/logstash/runner.sh  /tmp/

RUN yum makecache && \
    yum install -y wget && \
    mkdir -p /etc/logstash && \
    ln -s /usr/share/logstash/pipeline /etc/logstash/conf.d && \
    curl -L -O https://artifacts.elastic.co/downloads/beats/filebeat/filebeat-7.6.2-x86_64.rpm && \
    rpm -vi filebeat-7.6.2-x86_64.rpm && \
    # probably not needed (an inline comment after "\" would break the line continuation):
    filebeat modules enable netflow && \
    logstash-plugin install logstash-codec-sflow logstash-codec-netflow

## NOTE: default is port 2055

#USER logstash

VOLUME /var/cache/netsage

ENTRYPOINT ["/tmp/runner.sh"]

Input:

    udp {
      port                 => 2055
      codec                => netflow
      receive_buffer_bytes => 16777216
      workers              => 4
      type                 => "netflow"
    }
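
If I understand the plugins correctly, the netflow codec keeps its template cache per codec instance, and the udp input clones the codec for each worker thread — so with `workers => 4`, a data flowset can land on a worker that hasn't received a template yet. That's an assumption worth verifying against the plugin docs, but here's the variant I'm considering, assuming a single worker can keep up with my flow rate:

```
    udp {
      port                 => 2055
      codec                => netflow
      receive_buffer_bytes => 16777216
      workers              => 1   # assumption: one worker means one shared template cache
      type                 => "netflow"
    }
```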

Filter:

None; disabled for now.

Output:

output {
    file {
        path  => "/data/all.json"
        codec => json_lines
    }
    stdout { codec => rubydebug { metadata => true } }
}

The first issue I'm seeing is messages like this one:

 [2020-07-16T02:13:31,092][WARN ][logstash.codecs.netflow  ][main] Can't (yet) decode flowset id 256 from source id 1, because no template to decode it with has been received. This message will usually go away after 1 minute.
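
As far as I can tell, NetFlow v9/IPFIX exporters only send templates periodically, so the codec can't decode any data flowsets that arrive before the first template — hence the "will usually go away after 1 minute" wording. The codec also appears to support persisting received templates across restarts via `cache_save_path` and tuning template lifetime via `cache_ttl`; the option names below are from my reading of the logstash-codec-netflow docs, so treat this as a sketch:

```
    udp {
      port  => 2055
      codec => netflow {
        cache_ttl       => 4000                  # seconds a received template stays valid
        cache_save_path => "/var/cache/netsage"  # persist templates across restarts (must be writable)
      }
    }
```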

I know there is a Filebeat netflow module, which I don't fully understand how to use yet, but I believe I should be able to simply use the codec. Or is the codec deprecated in favor of the module?

Just an update: Filebeat seems to work flawlessly. The event format is different, but it works.
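
For anyone following along, this is roughly the module configuration I ended up with — a sketch of `modules.d/netflow.yml`, where port 2055 matches the UDP input above and the variable names are from my reading of the Filebeat 7.x module docs:

```
- module: netflow
  log:
    enabled: true
    var:
      netflow_host: 0.0.0.0
      netflow_port: 2055
```

The module itself was enabled with `filebeat modules enable netflow`, as in the Dockerfile above.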

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.