Despite having skip_on_invalid_json set to true, I keep getting warnings on messages that aren't purely JSON but are a JSON object followed by other content (null, an int, or a string).
Here's my filter config:
filter {
  json {
    source => "message"
    skip_on_invalid_json => true
  }
  mutate {
    lowercase => [ "logGroup" ]
  }
}
And here are some example warnings I'm getting in my logs:
[2020-01-22T22:52:07,697][WARN ][logstash.filters.json ] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"{\"timestamp\":\"2020-01-22T22:51:47.940Z\",\"log_level\":\"INFO\",\"caller\":\"redacted\",\"invoke_id\":\"redacted\"} \"Input Phase is old or ManagedInfo Success is timeout or cancel. [Input Phase=2, ManagedInfo Phase=4,Success=0]\"\n"}
[2020-01-22T22:52:07,707][WARN ][logstash.filters.json ] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"{\"timestamp\":\"2020-01-22T22:51:41.675Z\",\"log_level\":\"INFO\",\"caller\":\"redacted\",\"invoke_id\":\"redacted\"} 2001222251312800\n"}
[2020-01-22T22:55:11,134][WARN ][logstash.filters.json ] Parsed JSON object/hash requires a target configuration option {:source=>"message", :raw=>"{\"timestamp\":\"2020-01-22T22:54:58.479Z\",\"log_level\":\"INFO\",\"caller\":\"redacted\",\"invoke_id\":\"redacted\"} null"}
The events do get indexed into Elasticsearch, but the Logstash logs are filling up with these warnings since this is a very high-volume stream.
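One workaround I'm considering (untested sketch; the field names json_part and trailing, and the grok pattern itself, are my own guesses) is to split off the leading JSON object with grok before the json filter, so the trailing non-JSON content never reaches the parser:

filter {
  grok {
    # Capture the first {...} group as json_part and anything after it as trailing.
    # Non-greedy .*? works here because my JSON objects have no nested braces;
    # it would break on nested objects.
    match => { "message" => "^(?<json_part>\{.*?\})\s*(?<trailing>.*)$" }
  }
  json {
    source => "json_part"
    skip_on_invalid_json => true
  }
}

I'd rather avoid the extra grok pass on a stream this large if skip_on_invalid_json can be made to behave, though.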