LogStash::Json::ParserError: Unexpected character ('t' (code 116)): was expecting comma to separate Object entries

Please help me.
I want to send logs in JSON format from rsyslog to Logstash, and then from Logstash to Graylog.

These are the steps I have done.
Step 1: configure /etc/rsyslog.conf

*.*  action(type="omfwd" target="192.168.163.41" port="514" protocol="udp"
            action.resumeRetryCount="100"
            queue.type="linkedList" queue.size="10000" template="json-template")

Step 2: define the JSON template

template(name="json-template" type="list" option.json="on") {
  constant(value="{")
  constant(value="\"timestamp\":\"")
  property(name="timereported" dateFormat="rfc3339")
  constant(value="\",\"message\":\"")
  property(name="msg")
  constant(value="\",\"host\":\"")
  property(name="hostname")
  constant(value="\",\"severity\":\"")
  property(name="syslogseverity-text")
  constant(value="\",\"facility\":\"")
  property(name="syslogfacility-text")
  constant(value="\",\"syslog-tag\":\"")
  property(name="syslogtag")
  constant(value="\"}\n")
}
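To see what this template produces, here is a small sketch (Python, with made-up property values) that concatenates the pieces the same way the list template does and checks that the result is valid JSON. Note this only holds as long as none of the values contain a double quote:

```python
import json

# Hypothetical property values; rsyslog would fill these in per message.
props = {
    "timereported": "2018-09-09T11:37:02.972094-04:00",
    "msg": "ddddddddddddddddd",
    "hostname": "kafka1",
    "severity": "notice",
    "facility": "user",
    "syslogtag": "root:",
}

# Concatenate exactly as the list template above does.
line = ('{"timestamp":"%(timereported)s","message":"%(msg)s",'
        '"host":"%(hostname)s","severity":"%(severity)s",'
        '"facility":"%(facility)s","syslog-tag":"%(syslogtag)s"}\n' % props)

parsed = json.loads(line)   # parses cleanly for this sample message
print(parsed["message"])    # -> ddddddddddddddddd
```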

Step 3: install Logstash 6.3.2 and configure it.
Logstash config:

input {
  udp {
    host => "192.168.163.41"
    port => 10514
    codec => "json"
    tags => ["rsyslog"]
  }
}

filter { }

output {
  if "rsyslog" in [tags] {
    gelf {
      host => "192.168.163.163"
      sender => "192.168.163.41"
    }
  }
}

Step 4: I send a test message to check:

logger ddddddddddddddddd

Step 5: I get this error:

Sep  9 11:37:02 logread logstash: [2018-09-09T11:37:02,988][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character ('t' (code 116)): was expecting comma to separate Object entries
Sep  9 11:37:02 logread logstash: at [Source: (String)"{"@timestamp":"2018-09-09T11:37:02.971589-04:00","@version":"1","message":"\"2018-09-09T11:37:02.972094-04:00\",\"message\":\"ddddddddddddddddd\",\"host\":\"kafka1\",\"severity\":\"notice\",\"facility\":\"user\",\"syslog-tag\":\"root:\"}","sysloghost":"192.168.163.37","severity":"notice","facility":"user","programname":"{"timestamp"","procid":"-"}
Sep  9 11:37:02 logread logstash: "; line: 1, column: 326]>, :data=>"{\"@timestamp\":\"2018-09-09T11:37:02.971589-04:00\",\"@version\":\"1\",\"message\":\"\\\"2018-09-09T11:37:02.972094-04:00\\\",\\\"message\\\":\\\"ddddddddddddddddd\\\",\\\"host\\\":\\\"kafka1\\\",\\\"severity\\\":\\\"notice\\\",\\\"facility\\\":\\\"user\\\",\\\"syslog-tag\\\":\\\"root:\\\"}\",\"sysloghost\":\"192.168.163.37\",\"severity\":\"notice\",\"facility\":\"user\",\"programname\":\"{\"timestamp\"\",\"procid\":\"-\"}\n"}
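Looking closely at the payload in that error, the outer object's message field contains another copy of the json-template output (the event appears to have been JSON-wrapped twice), and the unescaped quotes in the programname value break the parse. A simplified reproduction (Python; the payload string is trimmed down from the log above):

```python
import json

# Trimmed from the :data string above: programname's value contains
# unescaped double quotes, so the document is not valid JSON.
payload = '{"facility":"user","programname":"{"timestamp"","procid":"-"}'

try:
    json.loads(payload)
except json.JSONDecodeError as err:
    print("parse failed:", err.msg)
```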

Please help me. I'm tired.

I don't think the configuration above is consistent with the results. Why does the event have severity and facility fields if you only have a udp input? Those fields are typically added by the syslog input or a grok filter.

Why does the programname field even exist and why does it contain what looks like the beginning of the JSON payload?

Thanks @magnusbaeck for the answer. I want to understand the overall flow: send logs from rsyslog in JSON format, apply filtering in Logstash, and finally ship them to Graylog.
I changed the template, but an error still occurs.

template(name="json-template" type="list" ) {
  constant(value="{")
  constant(value="\"message\":\"")
  property(name="msg")
  constant(value="\"}\n")
}

Send a test log:

logger testtttttttttttttttttttttttt

Error:

Sep  9 18:18:56 logread logstash: [2018-09-09T18:18:56,906][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character ('m' (code 109)): was expecting comma to separate Object entries
Sep  9 18:18:56 logread logstash: at [Source: (String)"{"@timestamp":"2018-09-09T18:18:56.900373-04:00","@version":"1","message":"\"testtttttttttttttttttttttttt\"}","sysloghost":"192.168.163.37","severity":"notice","facility":"user","programname":"{"message"","procid":"-"}
Sep  9 18:18:56 logread logstash: "; line: 1, column: 188]>, :data=>"{\"@timestamp\":\"2018-09-09T18:18:56.900373-04:00\",\"@version\":\"1\",\"message\":\"\\\"testtttttttttttttttttttttttt\\\"}\",\"sysloghost\":\"192.168.163.37\",\"severity\":\"notice\",\"facility\":\"user\",\"programname\":\"{\"message\"\",\"procid\":\"-\"}\n"}

Where do these parameters come from?
facility, programname, severity, sysloghost
My template only contains a message field.
I tried to filter these fields out.


input {
  udp {
    host => "192.168.163.41"
    port => 10514
    codec => "json"
    tags => ["rsyslog"]
  }
}

filter {
  mutate {
    gsub => [
      "timestamp", "\"@", "",
      "version", "\"@", "",
      "message", "\"", "",
      "sysloghost", "\"", "",
      "severity", "\"", "",
      "facility", "\"", "",
      "programname", "\"", ""
    ]
  }
}

output {
  if "rsyslog" in [tags] {
    gelf {
      host => "192.168.163.163"
      sender => "192.168.163.41"
    }
  }
}

Did I apply the filter correctly?

Also, I used this tutorial, but I still get an error.
(http://rsyslog-logstash-graylog)

[2018-09-09T19:39:34,304][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-09-09T19:39:46,244][ERROR][logstash.codecs.json     ] JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unexpected character ('@' (code 64)): was expecting comma to separate Object entries
 at [Source: (String)"{"@timestamp":"2018-09-09T19:39:46.239257-04:00","@version":"1","message":"\"2018-09-09T19:39:14.023499-04:00\",\"@version\":\"1\",\"message\":\"ddddddddddd\",\"host\":\"kafka1\",\"severity\":\"notice\",\"facility\":\"user\",\"programname\":\"root\",\"procid\":\"-\"}","sysloghost":"192.168.163.37","severity":"notice","facility":"user","programname":"{"@timestamp"","procid":"-"}
"; line: 1, column: 356]>, :data=>"{\"@timestamp\":\"2018-09-09T19:39:46.239257-04:00\",\"@version\":\"1\",\"message\":\"\\\"2018-09-09T19:39:14.023499-04:00\\\",\\\"@version\\\":\\\"1\\\",\\\"message\\\":\\\"ddddddddddd\\\",\\\"host\\\":\\\"kafka1\\\",\\\"severity\\\":\\\"notice\\\",\\\"facility\\\":\\\"user\\\",\\\"programname\\\":\\\"root\\\",\\\"procid\\\":\\\"-\\\"}\",\"sysloghost\":\"192.168.163.37\",\"severity\":\"notice\",\"facility\":\"user\",\"programname\":\"{\"@timestamp\"\",\"procid\":\"-\"}\n"}


I suggest you temporarily use the default codec for your input, comment out all filters and outputs, and add a stdout { codec => rubydebug } output to dump the raw data that you receive. You're obviously not receiving valid JSON, so first we need to figure out why.
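A minimal debugging pipeline along those lines might look like this (illustrative only; the default codec and rubydebug output are just for inspecting the raw datagrams):

```
input {
  udp {
    host => "192.168.163.41"
    port => 10514
    # default codec: no JSON parsing yet, so nothing can fail here
    tags => ["rsyslog"]
  }
}

output {
  stdout { codec => rubydebug }
}
```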

I'm not convinced that the chosen way of getting rsyslog to send JSON is a great idea. What happens if the message contains a double quote? I don't see anything in that configuration snippet that makes sure that it gets escaped.
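For what it's worth, rsyslog can do the escaping itself: the property() statement accepts a format="json" option that JSON-escapes the value. A safer variant of the template (a sketch, shortened to three fields; untested) would be:

```
template(name="json-template" type="list") {
  constant(value="{")
  constant(value="\"timestamp\":\"")
  property(name="timereported" dateFormat="rfc3339" format="json")
  constant(value="\",\"message\":\"")
  property(name="msg" format="json")
  constant(value="\",\"host\":\"")
  property(name="hostname" format="json")
  constant(value="\"}\n")
}
```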

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.