I receive logs from Docker containers. These have a field 'log' that is itself a JSON message and that contains a subfield 'message', which is the part I'm interested in.
I want to filter on the message field with the following filter:
json {
  source => "message"
}
However, I get:
:response=>{"index"=>{"_index"=>"document-verification-000001", "_type"=>"_doc", "_id"=>"likYrH8BR6mWLvAxG3SC", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [log] tried to parse field [log] as object, but found a concrete value"}}}}
I don't know how to build the filter so that I can use the inner 'message' JSON. I also tried other filters, like:
json {
  source => "log"
}
I then don't get the error, but I also don't get any data.
Any help?
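For what it's worth, here is a minimal sketch of how I could check whether that second attempt actually fails or simply finds nothing; the tag_on_failure value and the rubydebug stdout output are only debugging additions, not part of my real pipeline:

filter {
  json {
    source => "log"
    # only tags the event if 'log' exists but cannot be parsed as JSON;
    # if neither the tag nor any parsed fields appear, 'log' probably
    # isn't a top-level field at this point
    tag_on_failure => ["_log_jsonparsefailure"]
  }
}

output {
  # print the full event, including any failure tags, to the console
  stdout { codec => rubydebug }
}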
It appears that if I specify both in the filter:
json {
  source => "message"
}
json {
  source => "log"
}
Then it works: the 'log' element gets JSON-parsed. However, I then run into another problem: the size of the JSON. Although the 'log' element is about 20 KB, I get an error indicating a maximum of 32768 bytes (2^15); the parse breaks off at column 32769:
:exception=>#<LogStash::Json::ParserError: Unexpected end-of-input in VALUE_STRING
at [Source: (byte)"{"fields":{},"level":"info","@timestamp":1647597990820,"message":"{"id":"c4590000-fdc0-da0b-2c67-08da08c6f119","created_at":"2022-03-18T10:06:18.798Z","error":{"error":"","error_description":""},"scanner_information":{"hardware_id":"PRMC3N-OEM-03-203048","certificate_serial_number":""},"document_verification":{"overall_status":"not_passed","auto_checks":{"error":{"error":"","error_description":""},"calculated_risk_value":90,"document_details":{""[truncated 15884 bytes]; line: 1, column: 32769]>}
Does anyone know if I can increase this max size? (I can't find anything about this.)
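For reference, the combined filter section that gets the nested parsing working for me (up to the size problem above) looks roughly like this:

filter {
  # first pass: parse the JSON that arrives in the default 'message' field,
  # which is what produces the 'log' field
  json {
    source => "message"
  }
  # second pass: parse the JSON string now sitting in 'log', so the inner
  # 'message' subfield becomes available
  json {
    source => "log"
  }
}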