Team, I am trying to decompress gzip-compressed data using Logstash, but I am not able to figure out how to do this.
Input JSON:
(In this example it is fed in through a file; in the actual setup it is consumed from a Kafka message that looks similar.)
{
  "id": "238dbf3e-34d6-4a8c-a7cf-39a091642754",
  "version": 1,
  "ttl": 2592000,
  "createdDate": 1678501916554,
  "modifiedDate": 1678501930238,
  "payload": {
    "compressedPayload": "H4sIAAAAAAAACqtWykvMTVVSsFJQysrPyFNIyU9V0lFQykxRslIwNAACICcxJaUotbgYpKhaQSk5s6QSKKkUnJin4JVfDFaenF+aV1IEFg4NdlSqrQUAsk/IEFcAAAA=",
    "compresisonType": "GZIP"
  },
  "type": "single"
}
Desired output:
The decompressed content of payload.compressedPayload (the below was decompressed using an online gzip decompressor), i.e.
{
  "name": "john doe",
  "id": 10000,
  "address": {
    "city": "San Jose",
    "country": "USA"
  }
}
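For reference, the same check can be done outside Logstash with plain Ruby (Logstash runs JRuby, so I assume the same calls are available inside a ruby filter). This is just a standalone sanity-check sketch, not part of the pipeline, assuming the string really is base64-encoded gzip:

require "base64"
require "zlib"
require "stringio"

# payload.compressedPayload from the sample message above
compressed = "H4sIAAAAAAAACqtWykvMTVVSsFJQysrPyFNIyU9V0lFQykxRslIwNAACICcxJaUotbgYpKhaQSk5s6QSKKkUnJin4JVfDFaenF+aV1IEFg4NdlSqrQUAsk/IEFcAAAA="

# base64-decode first, then gunzip the resulting bytes
decoded = Base64.decode64(compressed)
json_text = Zlib::GzipReader.new(StringIO.new(decoded)).read
puts json_text  # should print the JSON shown above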
My Logstash pipeline config looks like this:
input {
  file {
    path => "/usr/share/logstash/sample-kafka-pipeline.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  json {
    source => "[message][payload][compressedPayload]"
    target => "[message]"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "localhost:9200"
    index => "test-docker-logstash"
  }
}
I know my filter doesn't look right, but I would like to know the best way to decompress the gzipped string in my input JSON.
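In case it helps frame the question, this is the rough direction I was imagining (untested; the decompressedPayload field name is just something I made up): parse the whole message as JSON first, base64-decode and gunzip the field in a ruby filter, then parse the decompressed string with another json filter.

filter {
  json {
    source => "message"
  }
  ruby {
    code => '
      require "base64"
      require "zlib"
      require "stringio"

      compressed = event.get("[payload][compressedPayload]")
      unless compressed.nil?
        # base64-decode, then gunzip the resulting bytes
        decoded = Base64.decode64(compressed)
        decompressed = Zlib::GzipReader.new(StringIO.new(decoded)).read
        # store the decompressed JSON string next to the original field
        event.set("[payload][decompressedPayload]", decompressed)
      end
    '
  }
  json {
    source => "[payload][decompressedPayload]"
    target => "[payload][decompressedPayload]"
  }
}

Is a ruby filter like this a reasonable way to do it, or is there a cleaner built-in approach?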
Any help is appreciated. Thanks in advance.