Logstash codecs (multiline, json) ParseError: illegal character


#1

Hi,

Elasticsearch 5.5.0
Logstash 5.5.0
logstash-codec-multiline 3.0.5
logstash-codec-json 3.0.3

Another issue when reading a file (on NFS) containing JSON lines with the versions above.

From time to time I get an error like this:

:exception=>#<LogStash::Json::ParserError: Illegal character ((CTRL-CHAR, code 0)): only regular white space (\r, \n, \t) is allowed between tokens

even though the line in the log file is correctly terminated with \n.

Any ideas? Could it be an NFS problem?

Thanks in advance !


(Magnus Bäck) #2

The error message indicates NUL characters in the JSON data, and those aren't valid. You can use a tool like hexdump to inspect what the file really contains, including non-printable characters such as NUL.
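For instance, a quick sketch of that inspection (the file name `sample.log` and the sample line are made up for illustration):

```shell
# Write a sample line with a stray NUL byte (octal \000) for demonstration,
# then dump it: hexdump -C shows offsets, raw bytes, and printable ASCII
# side by side. The NUL shows up as "00" in the hex columns and as "." in
# the ASCII column on the right.
printf '{"msg":"ok"}\000\n' > sample.log
hexdump -C sample.log
```

Running this against the real log file around the offset Logstash complains about will show whether the bytes on disk are actually zero-filled.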


#3

Hi,

Do you think this issue could be related to:

https://discuss.elastic.co/t/logstash-vs-nfs-null-characters/25918

Thank you !


#4

Looks like reading files over NFS can cause this problem:
https://serverfault.com/questions/707034/disable-file-cache-for-mount-point
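For reference, the workaround discussed at that link amounts to disabling the NFS client's attribute cache, so reads revalidate the file size instead of serving stale, zero-padded pages for a file that is still growing. A sketch (the server path and mount point are placeholders, and `noac` carries a real performance cost):

```shell
# Remount the share with attribute caching off (noac); the client then
# re-checks file attributes on every access. This reduces, but may not
# fully eliminate, the zero-filled reads seen while a file is appended to.
sudo mount -o remount,noac nfs-server:/logs /mnt/logs
```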

Is this a known issue?


(Magnus Bäck) #5

This does appear to be related. Storing log files on NFS sounds like a bad idea in general.


#6

Thanks Magnus,

Do you mean that storing logs remotely is a bad idea in general, or only over NFS?
I think most companies use remote log storage, since they run many instances of their applications.


(Magnus Bäck) #7

Do you mean that storing logs remotely is a bad idea in general, or only over NFS?

Logging is a fundamental part of any application, and you always want logging to work. You also don't want the logging itself to break the application. Therefore I'd always want log files to reside on a local volume so that the risk of failure is minimized.

I think most companies use remote log storage, since they run many instances of their applications.

I agree that many companies run many instances of their applications, but I don't think that's a valid reason for storing the logs on NFS.


#8

You're right, logs should be stored as files locally and sent to a centralized system by other means.

The solution I plan to use is to send data (as single-line JSON) from log4j2 to the Logstash TCP or UDP input plugin with the json codec applied on the input.
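A minimal pipeline sketch for that setup (the port number and Elasticsearch host are assumptions, not values from this thread; note that for newline-delimited JSON over TCP the json_lines codec is the usual choice, since the plain json codec expects one complete document per event):

```conf
# Hypothetical Logstash 5.x pipeline: receive single-line JSON from log4j2.
input {
  tcp {
    port  => 5000            # assumed port
    codec => json_lines      # one JSON document per newline-terminated line
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}
```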

So I think we can close this thread.
Many thanks for your explanations !


(Magnus Bäck) #9

The solution I plan to use is to send data (as single-line JSON) from log4j2 to the Logstash TCP or UDP input plugin with the json codec applied on the input.

Then you're still dependent on the network. Write logs to local files and ship those files independently of your application.
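One common way to ship files independently (a shipper is not named in this thread, so treat the choice as an assumption) is a lightweight agent such as Filebeat. A minimal 5.x-era configuration sketch with placeholder paths and hosts:

```yaml
# Hypothetical filebeat.yml: tail local log files and ship them to Logstash
# independently of the application that writes them.
filebeat.prospectors:
  - input_type: log
    paths:
      - /var/log/myapp/*.json          # placeholder path
output.logstash:
  hosts: ["logstash.example.com:5044"] # placeholder host
```

This way the application only ever writes to local disk, and a network outage delays shipping instead of breaking logging.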


#10

Yes, I know. It would be ideal to store the files locally and additionally send them to Logstash for statistics purposes. Unfortunately, that decision isn't up to me.


(system) #11

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.