Filebeat Error decoding JSON: invalid character \x00 looking for beginning of value

I have a Python application that runs on k8s, and I want to send its logs to Elasticsearch.

Each of my application pods writes its logs to a different file on a shared PVC, and my Filebeat, which runs on a different pod, reads all of the logs from that PVC and sends them to Elasticsearch.

It’s all working very well, but recently I started seeing the following error in my Kibana:

“Error decoding JSON: invalid character \x00 looking for beginning of value”, with a message that contained the string \u0000 around 50 times along with part of my log. Using the message, I was able to locate some of the logs in the log file itself, and they seemed perfectly fine.

The versions I am using:

Filebeat: 7.11.1

Elasticsearch: 8.4.1

PVC storage class: trident-no-snapshot-nas

The error you’re encountering typically indicates that Filebeat is trying to parse a log entry as JSON but encounters a null character (\x00 or \u0000), which is not valid in JSON. This can happen when the log file contains binary data or has been corrupted. To address it, you could ensure your application emits valid JSON logs, strip any null characters from the logs before Filebeat processes them, or configure Filebeat to handle or ignore such errors more gracefully. Additionally, consider upgrading Filebeat, as newer versions may handle such cases better.
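As a rough illustration of the “strip null characters before parsing” suggestion, here is a minimal Python sketch (the log format shown is an assumption, not the poster’s actual schema):

```python
import json


def parse_log_line(raw: str):
    """Strip NUL characters (invalid in JSON) before decoding.

    Returns the parsed object, or None if the line still isn't valid JSON.
    """
    cleaned = raw.replace("\x00", "").strip()
    if not cleaned:
        return None
    try:
        return json.loads(cleaned)
    except json.JSONDecodeError:
        return None


# Example: a line padded with NUL bytes decodes once they are removed.
line = '\x00\x00{"level": "INFO", "msg": "hello"}\x00'
print(parse_log_line(line))  # {'level': 'INFO', 'msg': 'hello'}
```

The same cleanup could be done upstream (in the application’s logging handler) or, in newer Filebeat versions, via a script processor, so that only clean JSON reaches the decoder.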

First of all, thank you for the answer.
As I wrote in the question, because the error message contained part of the logs that caused the errors, I was able to locate them, and they seemed perfectly fine: they didn’t contain any binary data, and I was able to parse them using a simple Python script.
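One caveat worth noting: many editors render NUL bytes invisibly, so a line can “look fine” on screen while the bytes on disk still contain \x00. A small script like the sketch below (the log path would be wherever the files live on the PVC) can scan the raw bytes directly:

```python
def find_nul_bytes(path: str, limit: int = 10):
    """Return (offset, surrounding bytes) for up to `limit` NUL bytes in a file.

    Reads the file in binary mode so nothing is silently decoded or dropped.
    """
    hits = []
    with open(path, "rb") as f:
        data = f.read()
    for offset, byte in enumerate(data):
        if byte == 0:
            # Keep 20 bytes of context on each side for inspection.
            hits.append((offset, data[max(0, offset - 20):offset + 20]))
            if len(hits) >= limit:
                break
    return hits
```

Calling `find_nul_bytes("/path/to/your/log")` on the affected file would show whether the NUL bytes are really there; if they are, that points at the write path (e.g. how the shared PVC handles concurrent or interrupted writes) rather than at the log content itself.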


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.