Filebeat syslog module problem on year change

I configured filebeat to import data from /var/log/secure into Elasticsearch using the new module data available in the git repo. See here for the pipeline config I used. This worked great until New Year's Day, when I started seeing parsed entries like this:

{
  "_index": "filebeat-2017.01.02",
  "_type": "secure",
  "_id": "AVlhaS3Vpt2V0zqfCLFb",
  "_score": null,
  "_source": {
    "@timestamp": "2016-01-02T23:59:48.000Z",
    "offset": 281345,
    "beat": {
      "hostname": "freud.ahti.mobi",
      "name": "freud.ahti.mobi",
      "version": "5.1.1"
    },
    "input_type": "log",
    "source": "/var/log/secure",
    "syslog": {
      "system": {
        "hostname": "freud",
        "pid": "8397",
        "program": "sshd",
        "message": "Connection closed by 172.31.25.147 [preauth]",
        "timestamp": "Jan  2 23:59:48"
      }
    },
    "fields": {
      "pipeline": "syslog",
      "environment": "extra",
      "role": "influxdb",
      "source_type": "syslog-system"
    },
    "type": "secure"
  },
  "fields": {
    "@timestamp": [
      1451779188000
    ]
  },
  "highlight": {
    "syslog.system.program": [
      "@kibana-highlighted-field@sshd@/kibana-highlighted-field@"
    ]
  },
  "sort": [
    1451779188000
  ]
}

Note that the date string "Jan 2 23:59:48" was parsed as "2016-01-02T23:59:48.000Z", which is a whole year off. I think this indicates a bug in the date processor, but I'm not entirely sure. A restart of Elasticsearch "fixed" it for new entries.
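For context, the relevant date processor in the ingest pipeline looks roughly like this (a simplified sketch, with the field name taken from the parsed document above; the exact format strings in my config may differ). Since the syslog format strings contain no year, the processor has to fill one in itself, which is where the rollover goes wrong:

```json
{
  "description": "Parse syslog timestamps from /var/log/secure",
  "processors": [
    {
      "date": {
        "field": "syslog.system.timestamp",
        "target_field": "@timestamp",
        "formats": ["MMM  d HH:mm:ss", "MMM d HH:mm:ss"],
        "timezone": "UTC"
      }
    }
  ]
}
```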

Hi @pprkut,

this looks like a bug to me; a similar issue happened in Logstash a while back: https://github.com/logstash-plugins/logstash-filter-date/issues/3.
IMO the real problem is that syslog does not include a year by default, but the ingest node should handle this at the year rollover.
Would you mind creating an issue at https://github.com/elastic/elasticsearch/issues ?
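To illustrate what "handling the rollover" means: since the syslog timestamp has no year, the parser has to pick one, and naively using the year that was current when the pipeline was set up produces exactly the off-by-one-year result above. A common heuristic (this is a sketch of the idea, not the actual ingest-node code) is to try the current year and fall back to the previous one if that would put the entry in the future:

```python
from datetime import datetime, timezone

def infer_syslog_year(syslog_ts: str, now: datetime) -> datetime:
    """Parse a year-less syslog timestamp like 'Jan  2 23:59:48'.

    Heuristic sketch: assume the current year; if that puts the entry
    more than a day in the future (e.g. a 'Dec 31' line read on Jan 1),
    assume it was logged in the previous year instead.
    """
    # strptime with no %Y defaults the year to 1900; we override it below.
    parsed = datetime.strptime(syslog_ts, "%b %d %H:%M:%S")
    candidate = parsed.replace(year=now.year, tzinfo=timezone.utc)
    if (candidate - now).total_seconds() > 86400:
        candidate = candidate.replace(year=now.year - 1)
    return candidate

# Reading the log on 2017-01-02: a 'Dec 31' entry belongs to 2016,
# while a 'Jan  2' entry belongs to 2017 -- not to the year the
# pipeline happened to be started in.
now = datetime(2017, 1, 2, 23, 59, 50, tzinfo=timezone.utc)
print(infer_syslog_year("Dec 31 23:59:48", now).year)  # → 2016
print(infer_syslog_year("Jan  2 23:59:48", now).year)  # → 2017
```

The key point is that the year must be chosen relative to the time each entry is *processed*, not captured once when the pipeline is created.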

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.