Error Parsing JSON data in Logstash

Hello everyone,
I am trying to debug an issue related to JSON parsing. When my Filebeat sends data to Logstash, the following error shows up.

My Logstash Filter

filter {
	if "cost_management" in [tags] {
		fingerprint {
			source => "message"
			target => "[@metadata][fingerprint]"
			method => "MURMUR3"
		}
		# Parsing of json events.
		json {
			source => "message"
			tag_on_failure => [ "_grok_error_log_nomatch" ]
		}
		# If the line doesn't match, drop it.
		if "_grok_error_log_nomatch" in [tags] {
			drop { }
		}
		# Set timestamp as per input log events.
		date {
			# date format : 2019-12-23 23:28:42
			match => [ "timestamp", "yyyy-MM-dd HH:mm:ss", "yyyy-MM-dd HH:mm:ss ", " yyyy-MM-dd HH:mm:ss", "ISO8601" ]
			target => "@timestamp"
		}
		# Set index name as per given tags.
		mutate {
			add_field => [ "index_name", "cost_management" ]
		}
		# Remove unwanted tags from sorted logs.
		mutate {
			remove_tag => ["beats_input_codec_plain_applied", "_grokparsefailure", "_geoip_lookup_failure", "multiline", "_jsonparsefailure"]
		}
	}
}
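As a side note, the extra patterns with leading or trailing spaces in the date filter should not be necessary; the core pattern already matches the sample timestamps. That can be sanity-checked outside Logstash with a quick Python sketch (Python's `strptime` format, equivalent to the Joda pattern `yyyy-MM-dd HH:mm:ss` used above):

```python
from datetime import datetime

# Same layout as the Logstash date filter pattern "yyyy-MM-dd HH:mm:ss".
ts = "2021-03-11 16:08:46"
parsed = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
print(parsed)  # 2021-03-11 16:08:46
```

If this parses, the date filter pattern is not the source of the warnings.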

When I use this filter, I receive the following errors on my Logstash servers.

[2021-03-11T12:03:10,631][WARN ][logstash.filters.json    ][main] Error parsing json {:source=>"message", :raw=>"l\": 0.01}", :exception=>#<LogStash::Json::ParserError: Unrecognized token 'l': was expecting ('true', 'false' or 'null')
 at [Source: (byte[])"l": 0.01}"; line: 1, column: 3]>}
[2021-03-11T12:05:11,247][WARN ][logstash.filters.json    ][main] Error parsing json {:source=>"message", :raw=>"}", :exception=>#<LogStash::Json::ParserError: Unexpected close marker '}': expected ']' (for root starting at [Source: (byte[])"}"; line: 1, column: 0])
 at [Source: (byte[])"}"; line: 1, column: 2]>}
[2021-03-11T12:15:10,819][WARN ][logstash.filters.json    ][main] Error parsing json {:source=>"message", :raw=>"24}", :exception=>#<LogStash::Json::ParserError: Unexpected character ('}' (code 125)): Expected space separating root-level values
 at [Source: (byte[])"24}"; line: 1, column: 4]>}
[2021-03-11T12:20:10,851][WARN ][logstash.filters.json    ][main] Error parsing json {:source=>"message", :raw=>"5}", :exception=>#<LogStash::Json::ParserError: Unexpected character ('}' (code 125)): Expected space separating root-level values
 at [Source: (byte[])"5}"; line: 1, column: 3]>}

My JSON data sample

{"timestamp": "2021-03-11 16:08:46", "monthly_total_bill": 94.98}
{"timestamp": "2021-03-11 16:08:46", "last_day_bill": 4.11}
{"timestamp": "2021-03-11 16:08:46", "monthly_service": "AWS Glue", "monthly_service_bill": 0.0}
{"timestamp": "2021-03-11 16:08:46", "monthly_service": "AWS Key Management Service", "monthly_service_bill": 0.0}
{"timestamp": "2021-03-11 16:08:46", "monthly_service": "Amazon DynamoDB", "monthly_service_bill": 0.0}
{"timestamp": "2021-03-11 16:08:46", "monthly_service": "Amazon EC2 Container Registry (ECR)", "monthly_service_bill": 0.06}
{"timestamp": "2021-03-11 16:08:46", "monthly_service": "EC2 - Other", "monthly_service_bill": 40.79}
{"timestamp": "2021-03-11 16:08:46", "monthly_service": "Amazon Elastic Compute Cloud - Compute", "monthly_service_bill": 23.54}
{"timestamp": "2021-03-11 16:08:46", "monthly_service": "Amazon Glacier", "monthly_service_bill": 0.0}
{"timestamp": "2021-03-11 16:08:46", "monthly_service": "Amazon Relational Database Service", "monthly_service_bill": 5.21}

The data is sometimes indexed correctly, but sometimes it is not. Please help me out.
Thank you.

That suggests that some of your JSON objects are wrapped across multiple lines.

Hi. Can you please elaborate a little on that? That would be really helpful.

You say your lines look like

{"timestamp": "2021-03-11 16:08:46", "monthly_service": "Amazon Relational Database Service", "monthly_service_bill": 5.21}

but the error suggests that they look like

{"timestamp": "2021-03-11 16:08:46", "monthly_service": "Amazon Relational Database Service", "monthly_service_bill": 
5.21}

so that the JSON object is parsed in two parts, neither of which is valid.
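This is easy to reproduce outside Logstash. A minimal Python sketch (using only the standard `json` module) shows that the whole line parses fine, while the two halves of a split line each fail with exactly the kind of error seen in the logs:

```python
import json

# A complete line parses without trouble.
whole = '{"timestamp": "2021-03-11 16:08:46", "monthly_service_bill": 5.21}'
print(json.loads(whole))

# If the object is split across two lines, each fragment becomes a
# separate event, and neither fragment is valid JSON on its own.
part1 = '{"timestamp": "2021-03-11 16:08:46", "monthly_service_bill": '
part2 = '5.21}'

for fragment in (part1, part2):
    try:
        json.loads(fragment)
    except json.JSONDecodeError as e:
        print(f"parse error: {e}")
```

The `raw` values in the warnings (`"5}"`, `"24}"`, `"}"`) look exactly like the tail fragments in this sketch, which points at the writer of the log file (or the reader) breaking objects mid-line.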


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.