Partial logs ingested by logstash causing _jsonparsefailure

We're ingesting logs from a file in JSON format. Here's the config:

input {
  file {
    codec => "json"
    path => "/mnt/log/app/json.log*"
    type => "erlang"
    sincedb_path => "/usr/share/logstash/data/.sincedb_app_json"
  }
}

We're noticing that sometimes partial lines get ingested into Elasticsearch.

As you can see in the "message" field, the ingested line starts somewhere in the middle of the actual line in the file.
When we look at the same line in the actual file, it appears to be valid JSON:

user@hostname:/var/log/app# grep "2019-06-28T00:00:00.230" json.log | jq "."
{
  "node": "node1@hostname.company.domain",
  "pid": "<0.1417.148>",
  "line": 988,
  "function": "some_func",
  "module": "some_module",
  "application": "some_app",
  "some": "593479e7-0a3d-4369-b08a-6dbe3d77f542",
  "more": "f11cfd2f-0e7b-47f6-877f-37626b50ac76",
  "fields": "Blah",
  "message": "Msg Status",
  "@timestamp": "2019-06-28T00:00:00.230",
  "level": "info"
}

Logstash version: 5.5.2

Notably, in all such occurrences it is the beginning of the line that is cut off, never the end.
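
Because it's always the start that's missing, the broken events are easy to spot: they carry the _jsonparsefailure tag and the leftover text in "message" doesn't begin with an opening brace. As a rough sketch (the truncated_line tag name is just something we made up, not part of any plugin), a filter like this flags them:

filter {
  # flag parse failures whose raw payload doesn't start with "{",
  # i.e. lines that were picked up somewhere in the middle
  if "_jsonparsefailure" in [tags] and [message] !~ /^\s*\{/ {
    mutate {
      add_tag => ["truncated_line"]
    }
  }
}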

Any idea about what could be the problem here?
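
In the meantime, to keep these partial documents out of the main index while we dig into it, we're considering routing anything tagged _jsonparsefailure to a file output instead, roughly like this (the file path and the elasticsearch hosts below are placeholders, not our real settings):

output {
  if "_jsonparsefailure" in [tags] {
    # dump the raw failed events to disk so we can compare them
    # against the original log file
    file {
      path => "/var/log/logstash/json_failures.log"
    }
  } else {
    # placeholder for our real elasticsearch output
    elasticsearch {
      hosts => ["localhost:9200"]
    }
  }
}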
