Logstash issue with json input from beats [Solved]

In short, I am getting the following error in the Logstash console output:
JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'new': was expecting 'null', 'true', 'false' or NaN at

My setup looks like this:

Spring boot -> log-file.json* -> filebeat -> logstash -> elastic

The JSON log file is encoded using logstash-logback-encoder.

log-file.json
{"@timestamp":"2017-09-08T17:23:38.677+01:00","@version":1,"message":"new log content","logger_name":"log.tracing.controller.HelloAController","thread_name":"http-nio-8081-exec-2","level":"INFO","level_value":20000,"X-Span-Export":"false","X-B3-SpanId":"9c7519e0c71db8f7","X-B3-TraceId":"9c7519e0c71db8f7"}

filebeat.yml

filebeat.prospectors:
- input_type: log
  paths:
    - /mnt/log/*.log
  json.overwrite_keys: true
  json.keys_under_root: true
  fields_under_root: true

output.logstash:
  hosts: ['logstash:5044']

logstash.conf

input {
  beats {
    port => 5044
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts    => [ 'elasticsearch' ]
    user     => 'elastic'
    password => 'changeme'
  }
  stdout { codec => rubydebug }
}

debug response
JSON parse error, original data now in message field {:error=>#<LogStash::Json::ParserError: Unrecognized token 'new': was expecting 'null', 'true', 'false' or NaN at [Source: new log content; line: 1, column: 4]>, :data=>"new log content"

{
    "X-Span-Export" => "false",
           "offset" => 307,
            "level" => "INFO",
       "input_type" => "log",
           "source" => "/mnt/log/tracing-A.log",
          "message" => "new log content",
             "type" => "log",
             "tags" => [
        [0] "_jsonparsefailure",
        [1] "beats_input_codec_json_applied"
    ],
       "@timestamp" => 2017-09-08T16:23:38.677Z,
      "X-B3-SpanId" => "9c7519e0c71db8f7",
      "level_value" => 20000,
      "thread_name" => "http-nio-8081-exec-2",
         "@version" => 1,
             "beat" => {
        "hostname" => "1ca3ba38c094",
            "name" => "1ca3ba38c094",
         "version" => "5.5.1"
    },
             "host" => "1ca3ba38c094",
     "X-B3-TraceId" => "9c7519e0c71db8f7",
      "logger_name" => "log.tracing.controller.HelloAController"
}

Further notes

  • It's actually working: the logs are parsed correctly and I can view them in Kibana. The error message appears to be a false positive to me, or maybe I have not understood something here?
  • I want to use the timestamp from the original JSON log file, which is why I have json.overwrite_keys and json.keys_under_root enabled.
  • If I rename the "message" JSON field to "msg" in log-file.json, I no longer receive the error message. However, I would prefer to keep the default logstash-logback-encoder field names and not have to change it to msg.

https://www.elastic.co/guide/en/beats/filebeat/current/exported-fields-log.html#_message
Please refer to the link: the message field is documented as "The content of the line read from the log file." Your JSON line carries a key/value pair that conflicts with this default field.
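The mismatch can be reproduced in isolation. Filebeat has already decoded the JSON line, so by the time the beats input's json codec sees the event, the message field holds plain text rather than a JSON document. A minimal sketch using Python's json module (Logstash actually uses Jackson, hence the different error wording, but the failure mode is the same):

```python
import json

# Event as emitted by Filebeat with json.keys_under_root: the JSON line
# is already decoded, so "message" holds plain-text log content.
event = {"message": "new log content", "level": "INFO"}

# The json codec on the beats input then tries to parse that value again.
try:
    json.loads(event["message"])
    parse_failed = False
except json.JSONDecodeError as exc:
    # Parsing plain text as JSON fails, mirroring the
    # "Unrecognized token 'new'" error in the Logstash console.
    parse_failed = True
    error_text = str(exc)

print(parse_failed)  # True: the double parse is what raises the error
```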

The 3rd point in your Further notes is the straightforward solution.

What do you mean by "3rd point in further notes is the straightforward solution"? Is there a Further notes section? I have not spotted it on the page.

The "Further notes" section in your own post. Sorry for the confusion.

If I rename the "message" JSON field to "msg" in log-file.json, I no longer receive the error message. However, I would prefer to keep the default logstash-logback-encoder field names and not have to change it to msg.

I have solved it now: I renamed the message field to msg.

Solution below.

logback.xml (Logback configuration file in the Spring Boot application)

<appender name="stash" class="ch.qos.logback.core.rolling.RollingFileAppender">
    <filter class="ch.qos.logback.classic.filter.ThresholdFilter">
        <level>info</level>
    </filter>
    <file>/home/rob/projects/scratch/log-tracing-demo/build/logs/tracing-A.log</file>
    <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
        <fileNamePattern>/home/rob/projects/scratch/log-tracing-demo/build/logs/tracing-A.log.%d{yyyy-MM-dd}</fileNamePattern>
        <maxHistory>30</maxHistory>
    </rollingPolicy>
    <encoder class="net.logstash.logback.encoder.LogstashEncoder" >
        <includeContext>false</includeContext>
        <fieldNames>
            <message>msg</message>
        </fieldNames>
    </encoder>
</appender>

So now the JSON log lines look like the following.
{"@timestamp":"2017-09-11T14:32:47.920+01:00","@version":1,"msg":"Unregistering JMX-exposed beans","logger_name":"org.springframework.jmx.export.annotation.AnnotationMBeanExporter","thread_name":"Thread-19","level":"INFO","level_value":20000}

Then in Logstash I added the following filter to rename msg back to message after ingestion.

filter {
  mutate {
    rename => {"msg" => "message"}
  }
}
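For completeness, one alternative worth noting (untested on my side, offered as an assumption): since Filebeat already decodes the JSON via json.keys_under_root, the codec => "json" on the beats input re-parses the already-decoded event, which is what trips over the plain-text message value. Dropping the codec and letting Filebeat do all the decoding should avoid the double parse without renaming any fields:

```
input {
  beats {
    port => 5044
  }
}
```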

Good tweak with mutate. Mark your post as the solution; it might help someone.
