Cisco module: seemingly arbitrary parse errors on nearly identical messages

Hello,

I want to use Filebeat to import the logs from a Cisco router directly into Elasticsearch. The logs are first stored via syslog on an Ubuntu 16.04 server.

However, approximately half of the events end up with this error message in the error.message field:

GoError: failed in processor.convert: conversion of field [event.sequence] to type [long] failed: unable to convert value [022084]: strconv.ParseInt: parsing "022084": invalid syntax

The weird part of this behavior is that the failing and succeeding log lines are almost identical. For example, this line produces the error above:

Jan 13 18:12:31 RO-ROM-VPN-KYOSA 022084: Jan 13 18:12:35.141 LCY: %SEC-6-IPACCESSLOGP: list 101 denied tcp 120.131.176.111(7133) -> 170.257.123.53(7547), 1 packet  

This other line is parsed correctly:

Jan 13 17:12:30 RO-ROM-VPN-KYOSA 021176: Jan 13 17:12:33.168 LCY: %SEC-6-IPACCESSLOGP: list 101 denied tcp 191.128.99.50(43651) -> 170.257.123.53(9943), 1 packet

These are the contents of my cisco.yml file:

- module: cisco
  ios:
    enabled: true
    var.input: file
    var.paths: ["/var/log/logs_cisco/RO-ROM-VPN-KYOSA/*.log"]

Has anyone else experienced the same behavior? Is this a known bug?

Best regards

Hi @arnau_K, welcome to the Elastic community forums!

Summary: It looks like you've uncovered a bug. Thanks! I've created https://github.com/elastic/beats/issues/15513 to track it.

Explanation:
Looking at the Cisco module source code, I believe the failure comes from the bit of code that converts the sequence number field.

Specifically, that code parses the sequence number string as an integer with automatic base detection: it sees the leading 0 and parses the string that follows as an octal (base 8) number. Since base 8 numbers can only contain the digits 0-7, parsing of 022084 fails while parsing of 021176 succeeds.

I think the intent here is to parse the sequence number as a decimal (base 10) number, so I believe you've indeed found a bug; the issue linked above tracks the fix.
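
To make the failure mode concrete, here is a minimal Go sketch of the parsing behavior described above. This illustrates how strconv.ParseInt behaves with automatic base detection; it is not the actual module code:

package main

import (
	"fmt"
	"strconv"
)

func main() {
	// Base 0 lets the string's own prefix pick the base: a leading
	// "0" means octal, so any digit 8 or 9 makes the parse fail.
	_, err := strconv.ParseInt("022084", 0, 64)
	fmt.Println(err) // strconv.ParseInt: parsing "022084": invalid syntax

	// "021176" contains only octal digits, so it parses without
	// error -- but as octal, yielding 8830 instead of 21176.
	v, _ := strconv.ParseInt("021176", 0, 64)
	fmt.Println(v) // 8830

	// Forcing base 10 gives the intended decimal value in both cases.
	v, _ = strconv.ParseInt("022084", 10, 64)
	fmt.Println(v) // 22084
}

Note the second case: if base detection really is in play, the lines that do parse may silently end up with a wrong event.sequence value (8830 instead of 21176), so the fix should force base 10 rather than just tolerate the failures.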

Thanks,

Shaunak

Hello @shaunak!

Thank you for the quick response. Do you think there is any workaround for this problem? We are really interested in this functionality.

Best regards,