JSON logs separated by blank lines

I have a JSON log file where each entry is separated by a blank line:

{"@timestamp":1456544565,"valid_domain":true,"ip":"8.8.8.8"}

{"@timestamp":2464564543,"valid_domain":false,"ip":"1.2.3.4"}

{"@timestamp":3454678735,"valid_domain":false,"ip":"9.8.7.6"}

Filebeat refused to parse it, giving this error:

2018-08-08T10:08:43.626-0300	ERROR	reader/json.go:33	Error decoding JSON: EOF

So I tried excluding empty lines with

exclude_lines : ['^$']
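For context, that line sits in the input config roughly like this (the path and the json options here are illustrative, not my exact config):

filebeat.inputs:
- type: log
  paths:
    - /var/log/dns/queries.json    # illustrative path
  json.keys_under_root: true       # illustrative json decoding options
  json.add_error_key: true
  exclude_lines: ['^$']            # skip the blank separator lines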

And it worked! Until... I restarted the service. Then Filebeat refused to start, reporting this error:

2018-08-08T10:09:15.257-0300	ERROR	instance/beat.go:691	Exiting: Error in initing input: When using the JSON decoder and line filtering together, you need to specify a message_key value accessing 'filebeat.inputs.2' (source:'/etc/filebeat/filebeat.logstash.yml')

I don't have (or want?) a message_key.

So, I have 3 questions here:

1- Why aren't the blank lines ignored by default when parsing JSON?
The way I see it, once the closing curly brace } is reached, parsing of that document should end and all empty lines should be ignored until the next opening curly brace { is found.

2- exclude_lines : ['^$'] does work, but after the restart Filebeat makes sure it doesn't. Why? It works! Leave it be...

3- What can I do now?

1- Why aren't the blank lines ignored by default when parsing JSON?
The way I see it, once the closing curly brace } is reached, parsing of that document should end and all empty lines should be ignored until the next opening curly brace { is found.

The current Filebeat engine works on lines, and by default the JSON reader we use assumes that each line is a valid document, so in your case an empty line triggers an error. Each line is parsed individually; maybe in a future version we could change it to emit a new JSON document only when a document is valid (complete), but this is not the case today. Also, I think that even with that change it would still trip on the empty line.

2- exclude_lines : ['^$'] does work, but after the restart Filebeat makes sure it doesn't. Why? It works! Leave it be...

I am not sure; it should have returned the same error every time Filebeat is started.

3- What can I do now?

You have to define a message_key when you use exclude_lines. Filebeat will generate an internal event with the JSON document in the message field, so you have to use that value to correctly extract the document.
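A minimal sketch of what that could look like, following the suggestion above; the path is illustrative and the choice of message as the key is an assumption, so check the json.message_key documentation against your own documents:

filebeat.inputs:
- type: log
  paths:
    - /var/log/dns/queries.json    # illustrative path
  exclude_lines: ['^$']            # drop the blank separator lines
  json.message_key: message        # assumed key; required when combining the json decoder with line filtering
  json.keys_under_root: true
  json.add_error_key: true         # mark events whose line could not be decoded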

But if the log line is perfectly parsed without defining message_key, why make it obligatory?
(As long as I never restart the service, I get exactly the result I want.)

Or did I just get lucky?

But if the log line is perfectly parsed without defining message_key, why make it obligatory?

It's because when we skip lines we have to parse in two steps: apply the line filtering first, then decode the JSON.

Or did I just get lucky?

I would expect the same result with or without a restart.

What application produces blank lines in its JSON logs?
