Constant JSON parse errors in logstash output log

This may be more of a general question on how to deal with mixed inputs of normal logs and JSON.

I'm working on upgrading our stack to 5.2. As part of this process I've built a new ELK server; it's up and running and I have some clients sending logs to it with Filebeat. Logs are flowing in fine, but I noticed that I'm getting constant JSON parse errors in the logstash-plain.log file like so:

[2017-03-21T21:31:35,081][ERROR][logstash.codecs.json ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'Mar': was expecting ('true', 'false' or 'null')
at [Source: Mar 21 21:31:29 ps-pdev-db01 os-prober: debug: running /usr/lib/os-probes/mounted/90solaris on mounted /dev/mapper/VOL_GROUP2-DATABASE; line: 1, column: 4]>, :data=>"Mar 21 21:31:29 ps-pdev-db01 os-prober: debug: running /usr/lib/os-probes/mounted/90solaris on mounted /dev/mapper/VOL_GROUP2-DATABASE"}

I think I know what is happening here. On our previous server we were setting the codec for all incoming logs to JSON. Only one of our logs was actually coming in as JSON, but when I first set up that server it was recommended by people on #logstash to set all logs to JSON. So on our new server I stuck with a similar config; here is my conf.d/02-beats-input.conf file:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/ps-dev-elk.plansourcedev.com.crt"
    ssl_key => "/etc/pki/tls/private/ps-dev-elk.plansourcedev.com.key"
    type => "logs"
    codec => json {
      charset => "UTF-8"
    }
  }
}

Only one of the logs we are shipping is in JSON format and the rest are normal syslog-type logs, but this config has worked well for us on the current server and so far seems to work as expected on the new 5.2 ELK server. The only issue I have at the moment is this constant stream of JSON parse errors in the Logstash output log.

Is there a simple way to silence those JSON parse errors? And is there maybe a better way to do what I'm trying to accomplish?

Use a plain codec in your input and then selectively apply a json filter if the input looks like JSON (e.g. if message begins with "{").
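
For reference, a minimal sketch of what that conditional could look like (the leading-brace check is just the heuristic described above):

filter {
  # Only attempt JSON parsing when the line looks like a JSON object
  if [message] =~ /^\s*{/ {
    json {
      source => "message"
    }
  }
}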

Thanks Magnus, that seems to have worked. What I did was change our input like so:

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/ps-dev-elk.domain.com.crt"
    ssl_key => "/etc/pki/tls/private/ps-dev-elk.domain.com.key"
    type => "logs"
    codec => "plain"
#    codec => json {
#      charset => "UTF-8"
#    }
  }
}

and then added another filter specific to the JSON logs like so:

filter {
  if [type] == "rails_json" {
    json {
      source => "message"
    }
    mutate {
      remove_field => "message"
    }
  }
}
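
As a side note, the json filter accepts remove_field directly, and common options like remove_field are only applied when the filter succeeds, so a variant like the sketch below would keep the original message around on lines that fail to parse:

filter {
  if [type] == "rails_json" {
    json {
      source => "message"
      # only removed if parsing succeeded
      remove_field => ["message"]
    }
  }
}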

The rails_json type is set by the Filebeat client when it ships the logs.
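
For anyone finding this later: in Filebeat 5.x that type is set with document_type on the prospector. Something like this (the path is just an example):

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/rails/app.json.log   # example path, adjust to your setup
  document_type: rails_json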
