JSON parse error, original data now in message field

Hi There,

I am new to ELK and need support. I have deployed a new ELK stack [7.8.0] and I am trying to ingest my rsyslog logs into Logstash. I am able to get the logs into Logstash, but with the following errors:

logstash-plain.log ::

`[2020-07-24T06:10:53,644][WARN ][logstash.codecs.jsonlines][main][e6516d264562b138b4d2764ccabf8f84f9868ff0172f7c414529beb343cb84d2] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character ('\' (code 92)): expected a valid value (number, String, array, object, 'true', 'false' or 'null')

at [Source: (String)"\u0000\u0005\u0000\u0004\u0000\x9E\u0000\x9F\xC0|\xC0}\u00003\u0000g\u00009\u0000k\u0000E\u0000\xBE\u0000\x88\u0000\xC4\u0000\u0016\u0000\xA2\u0000\xA3\xC0\x80\xC0\x81\u00002\u0000@\u00008\u0000j\u0000D\u0000\xBD\u0000\x87\u0000\xC3\u0000\u0013\u0000f\u0001\u0000\u0000D\u0000\u0005\u0000\u0005\u0001\u0000\u0000\u0000\u0000\xFF\u0001\u0000\u0001\u0000\u0000#\u0000\u0000\u0000"; line: 1, column: 2]>, :data=>"\u0000\u0005\u0000\u0004\u0000\x9E\u0000\x9F\xC0|\xC0}\u00003\u0000g\u00009\u0000k\u0000E\u0000\xBE\u0000\x88\u0000\xC4\u0000\u0016\u0000\xA2\u0000\xA3\xC0\x80\xC0\x81\u00002\u0000@\u00008\u0000j\u0000D\u0000\xBD\u0000\x87\u0000\xC3\u0000\u0013\u0000f\u0001\u0000\u0000D\u0000\u0005\u0000\u0005\u0001\u0000\u0000\u0000\u0000\xFF\u0001\u0000\u0001\u0000\u0000#\u0000\u0000\u0000"}`

On Kibana Console ::

tags :: _jsonparsefailure, _grokparsefailure

logstash.conf ::

Input

input {
  tcp {
    host => "X.X.X.X"
    port => 10514
    codec => "json"
    type => "syslog"
  }
}

Filter

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    date {
      match => [ "syslog_timestamp", "MMM  d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

Output

output {
  if [type] == "syslog" {
    elasticsearch {
      hosts => [ "127.0.0.1:9200" ]
      index => "livelogs1"
    }
  }
  stdout { codec => rubydebug }
}

Rsyslog Client Conf ::

70-output.conf

*.* @@X.X.X.X:10514;json-template
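For reference, and assuming the same template name, the legacy forwarding rule above should be roughly equivalent to this modern rsyslog action() form (not from the original config; @@ means plain TCP):

    # Forward all messages over plain TCP using the json-template
    *.* action(type="omfwd" target="X.X.X.X" port="10514" protocol="tcp" template="json-template")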

01-json-template.conf

template(name="json-template" type="list" option.json="on") {
  constant(value="{")
  constant(value="\"@timestamp\":\"")     property(name="timereported" dateFormat="rfc3339")
  constant(value="\",\"@version\":\"1")
  constant(value="\",\"message\":\"")     property(name="msg")
  constant(value="\",\"host\":\"")        property(name="hostname")
  constant(value="\",\"severity\":\"")    property(name="syslogseverity-text")
  constant(value="\",\"facility\":\"")    property(name="syslogfacility-text")
  constant(value="\",\"programname\":\"") property(name="programname")
  constant(value="\",\"procid\":\"")      property(name="procid")
  constant(value="\"}\n")
}

/var/log/message ::

Jul 24 06:10:29 omcscefnbuycxm rsyslogd: unexpected GnuTLS error -110 in nsdsel_gtls.c:178: The TLS connection was non-properly terminated. [v8.24.0-52.el7_8.2 try http://www.rsyslog.com/e/2078 ]
Jul 24 06:10:29 abc rsyslogd: netstream session 0x7fc9e00e4700 from 140.87.151.146 will be closed due to error [v8.24.0-52.el7_8.2 try http://www.rsyslog.com/e/2089 ]
Jul 24 06:10:34 xyz salt-minion: [ERROR ] Error while bringing up minion for multi-master. Is master at omcsceforvmjfz-pub.opc.oracleoutsourcing.com responding?
Jul 24 06:10:39 pqr salt-minion: [ERROR ] Error while bringing up minion for multi-master. Is master at omcscefwqmianu-pub.opc.oracleoutsourcing.com responding?

Are you using Beats to ingest into Logstash?

Try having rsyslog log to a file. Let's say you log to the directory /var/log/rsyslog/, for example.
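As a sketch, assuming the example directory above, a minimal rsyslog rule that writes everything to a local file could look like this (the filename is hypothetical):

    # Write all messages to a local file for Filebeat to pick up
    *.* action(type="omfile" file="/var/log/rsyslog/messages.log")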

In Beats try

- type: log
  paths:
    - "/var/log/rsyslog/*.log"
  tags: ["syslog"]
  fields_under_root: true

Then try the setting below in Logstash:

input {
  beats {
    port => 5000
    type => "syslog"
  }
}

Hi Kin,

Thanks for your response.

I am not using Filebeat to ship the log data. I configured the rsyslog OS daemon to forward logs to Logstash on port 10514 (TCP) in JSON format, using a template.

Hmm, I have never tried it directly, as I have always written to a file with rsyslog and collected it with Beats.

Try putting this inside your input:

    codec => line {
      charset => "UTF-8"
    }
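Combined with your existing input, that would look something like this (host and port copied from your config; a sketch, not a tested fix):

    input {
      tcp {
        host => "X.X.X.X"
        port => 10514
        type => "syslog"
        codec => line {
          charset => "UTF-8"
        }
      }
    }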

If it still doesn't work, try a higher log level (use --log.level debug on the command line, or its equivalent) and post several of the log lines before and after the error you're encountering.
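If you prefer a config setting over the command-line flag, the equivalent in logstash.yml should be:

    # Same effect as --log.level debug on the command line
    log.level: debug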

Hi there,

First of all, please use the code formatter tool (</>) when pasting anything that is not plain text; otherwise your message will be difficult to read and understand.

Anyway, can you paste here the results of the following pipeline?

input {
  tcp {
    host => "X.X.X.X"
    port => 10514
    codec => "json"
    type => "syslog"
  }
}

filter {}

output {
  stdout{}
}

This way we should be able to see exactly what logstash is receiving.

I had tried this option, but it didn't work for me.

I will try it and come back with the logs.

Hi Fabio,

Here is output,

{
"message" => "\u0016\u0003\u0001\u0000\xED\u0001\u0000\u0000\xE9\u0003\u0003_)k\u0015:\xE5w\x9B*\xC1W\xA9\u000E\xD5\u0011\et\xA0͖\xD8粣Q\xE40\x8E0E\xF9\x99\u0000\u0000|\xC0+\xC0,\xC0\x86\xC0\x87\xC0\t\xC0#\xC0",
"@version" => "1",
"port" => 53246,
"@timestamp" => 2020-08-04T14:05:06.887Z,
"type" => "syslog",
"tags" => [
[0] "_jsonparsefailure"
],
"host" => "130.61.40.7"
}
{
"message" => "\u0000\u0005\u0000\u0004\u0000\x9E\u0000\x9F\xC0|\xC0}\u00003\u0000g\u00009\u0000k\u0000E\u0000\xBE\u0000\x88\u0000\xC4\u0000\u0016\u0000\xA2\u0000\xA3\xC0\x80\xC0\x81\u00002\u0000@\u00008\u0000j\u0000D\u0000\xBD\u0000\x87\u0000\xC3\u0000\u0013\u0000f\u0001\u0000\u0000D\u0000\u0005\u0000\u0005\u0001\u0000\u0000\u0000\u0000\xFF\u0001\u0000\u0001\u0000\u0000#\u0000\u0000\u0000",
"@version" => "1",
"port" => 53246,
"@timestamp" => 2020-08-04T14:05:06.913Z,
"type" => "syslog",
"tags" => [
[0] "_jsonparsefailure"
],
"host" => "130.61.40.7"
}

Thanks,

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.