When using a JSON filter in a Logstash pipeline, I run into a size problem. I get an error suggesting a maximum size of 32768 bytes (2^15); the parse fails at column 32769:
:exception=>#<LogStash::Json::ParserError: Unexpected end-of-input in VALUE_STRING
at [Source: (byte)"{"fields":{},"level":"info","@timestamp":1647597990820,"message":"{"id":"c4590000-fdc0-da0b-2c67-08da08c6f119","created_at":"2022-03-18T10:06:18.798Z","error":{"error":"","error_description":""},"scanner_information":{"hardware_id":"PRMC3N-OEM-03-203048","certificate_serial_number":""},"document_verification":{"overall_status":"not_passed","auto_checks":{"error":{"error":"","error_description":""},"calculated_risk_value":90,"document_details":{""[truncated 15884 bytes]; line: 1, column: 32769]>}
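For reference, the filter is applied roughly like this (a minimal sketch; the field names are assumptions, my actual pipeline is larger, but the JSON document arrives as a string in one event field):

    filter {
      json {
        # the incoming event field holding the raw JSON string (often > 32 KB)
        source => "message"
        # parse into a separate field so the original string is kept
        target => "parsed"
      }
    }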
Does anyone know if I can increase the maximum size that the JSON filter plugin can handle? I can't find any information on this.