Logstash 5.6.0 deserialization errors when persistent queues enabled


(Zt Zeng) #1

The same problem was reported in a post that is now closed.

  • Version: 5.5.0 & 5.6.0

  • OS: Linux ubuntu-100 4.4.0-31-generic #50~14.04.1-Ubuntu SMP Wed Jul 13 01:07:32 UTC 2016 x86_64 x86_64 x86_64 GNU/Linux

  • Config file: logstash.yml with

    xpack.monitoring.elasticsearch.url: "http://xxxx:xxxx"
    xpack.monitoring.elasticsearch.username: xxx
    xpack.monitoring.elasticsearch.password: xxx
    
  • Pipeline config: bussiness.conf

  • Sample Data:

    {
           "offset" => 295655,
           "logger" => ".d.s.w.r.o.CachingOperationNameGenerator",
           "source" => "xxx/log_info.2017-09-15.log",
           "thread" => "localhost-startStop-1",
             "type" => "biz-info",
             "tags" => [
          [0] "beats_input_codec_plain_applied"
      ],
       "@timestamp" => 2017-09-15T05:57:15.223Z,
         "logLevel" => "INFO",
         "@version" => "1",
             "beat" => {
              "name" => "server",
          "hostname" => "server",
           "version" => "5.5.2"
      },
             "host" => "server",
      "log_message" => "Generating unique operation named: getAvatarUsingGET_5"
    }
    
  • Steps to reproduce:
Run Logstash with the data and config above. With the persistent queue enabled, it throws the error; with it disabled, everything works.
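
For reference, the persistent queue is toggled in `logstash.yml` via the `queue.type` setting. A minimal sketch (the `path.queue` value here is an assumption; it defaults to `path.data/queue`):

```yaml
# logstash.yml -- minimal persistent-queue settings (sketch)
queue.type: persisted                  # default is "memory"
path.queue: /var/lib/logstash/queue    # assumed path; defaults to path.data/queue
queue.max_bytes: 1gb                   # cap on the on-disk queue size
```

Switching `queue.type` back to `memory` restores the default in-memory queue, which is what makes the error disappear in the reproduction above.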


Logstash Persistent Queues throws exception when trying to read BigInteger values from the queue
(subhasdan) #2

Thank you so much for clearly reproducing this!! This issue was driving us crazy and losing us a lot of logs on random days!

@guyboertje can you please share your thoughts on this :slight_smile:


(subhasdan) #3

@Zt_Zeng also - after you receive this error, does the pipeline continue working/flowing? For us, the pipeline gets stuck on the deserialization error, so I just want to confirm whether you see the same behavior.

(And by the way, the JSON has an extra [0] in the message, so it is not semantically valid JSON; I just want to make sure it is the exact message produced by Filebeat.)


(subhasdan) #4

@Zt_Zeng how can we test the message by sending it directly to the beats input on port 5044?
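
One way to replay a captured event without Filebeat (a sketch, not from this thread): the Beats protocol on port 5044 is a binary protocol, so raw JSON cannot simply be sent to it; a `tcp` input with a JSON codec on a separate port is a common substitute for reproduction. The port 5045 and filename below are arbitrary choices:

```
# test-pipeline.conf -- sketch for replaying a JSON event without Filebeat
input {
  tcp {
    port  => 5045
    codec => json_lines    # one JSON document per line
  }
}
output {
  stdout { codec => rubydebug }
}
```

The event can then be replayed with something like `cat event.json | nc localhost 5045`. Note this bypasses the beats input itself, so it only helps reproduce problems downstream of input decoding (such as the queue serialization here).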


(Zt Zeng) #5

I don't think it is working, because no data is processed from the log and Elasticsearch does not receive any data.

As for the strange [0]: I reproduced the problem, and that is the exact message received by Logstash. However, that output was printed with the persistent queue disabled; when Logstash starts with the persistent queue enabled, I only see the error stack trace and no received events.

The following is the JSON published by Filebeat:

{
  "@timestamp": "2017-09-23T01:19:01.448Z",
  "beat": {
    "hostname": "server",
    "name": "server",
    "version": "5.5.2"
  },
  "fields": {
    "type": "biz-info"
  },
  "input_type": "log",
  "message": "xxx",
  "offset": 268,
  "source": "xxx/log_info.2017-09-21.log",
  "type": "log"
}
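
As a quick sanity check (a sketch added for illustration, not part of the thread): the integer fields in this Filebeat payload all fit comfortably within a signed 64-bit long, which would be consistent with the BigInteger mentioned in the linked topic arising inside Logstash's queue serialization rather than from unusually large numbers in the payload itself.

```python
import json

# The Filebeat event from the post above (message/source shortened as in the original).
raw = """
{
  "@timestamp": "2017-09-23T01:19:01.448Z",
  "beat": {"hostname": "server", "name": "server", "version": "5.5.2"},
  "fields": {"type": "biz-info"},
  "input_type": "log",
  "message": "xxx",
  "offset": 268,
  "source": "xxx/log_info.2017-09-21.log",
  "type": "log"
}
"""

event = json.loads(raw)  # raises ValueError if the payload were not valid JSON

INT64_MIN, INT64_MAX = -2**63, 2**63 - 1

def int_fields(obj, prefix=""):
    """Yield (dotted_key, value) for every integer leaf in a nested dict."""
    for key, value in obj.items():
        path = f"{prefix}{key}"
        if isinstance(value, dict):
            yield from int_fields(value, path + ".")
        elif isinstance(value, int):
            yield path, value

# Collect any integers that would not fit in a signed 64-bit long.
oversized = [(k, v) for k, v in int_fields(event)
             if not (INT64_MIN <= v <= INT64_MAX)]
print(oversized)  # → [] : every integer fits in a signed 64-bit long
```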

(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.