Decompress a gzip-compressed string in Logstash and push to Elasticsearch

Team, I am trying to decompress gzip-compressed data using Logstash, and I am not able to figure out how to do this.

Input JSON:
(in this example it is fed through a file; in the actual setup it is consumed from Kafka as a message that looks similar)

{
  "id": "238dbf3e-34d6-4a8c-a7cf-39a091642754",
  "version": 1,
  "ttl": 2592000,
  "createdDate": 1678501916554,
  "modifiedDate": 1678501930238,
  "payload": {
    "compressedPayload": "H4sIAAAAAAAACqtWykvMTVVSsFJQysrPyFNIyU9V0lFQykxRslIwNAACICcxJaUotbgYpKhaQSk5s6QSKKkUnJin4JVfDFaenF+aV1IEFg4NdlSqrQUAsk/IEFcAAAA=",
    "compresisonType": "GZIP"
  },
  "type": "single"
}

Desired output:
The decompressed content of payload.compressedPayload (the JSON below was decompressed using an online gzip decompressor), i.e.

{
  "name": "john doe",
  "id": 10000,
  "address": {
    "city": "San Jose",
    "country": "USA"
  }
}

My Logstash pipeline config looks like this:

input {
  file {
    path => "/usr/share/logstash/sample-kafka-pipeline.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}

filter {
    json {
        source => "[message][payload][compressedPayload]"
        target => "[message]"
    }
}

output {
    stdout { codec => rubydebug }
    elasticsearch {
        hosts => "localhost:9200"
        index => "test-docker-logstash"
    }
}

I know my filter doesn't look right, but I would like to know the best way to decompress the gzipped string in my input JSON.

Any help is appreciated. Thanks in advance.

You will have to use a ruby filter.

input {
  generator {
    count => 1
    codec => json
    lines => [ '{ "payload": { "compressedPayload": "H4sIAAAAAAAACqtWykvMTVVSsFJQysrPyFNIyU9V0lFQykxRslIwNAACICcxJaUotbgYpKhaQSk5s6QSKKkUnJin4JVfDFaenF+aV1IEFg4NdlSqrQUAsk/IEFcAAAA=" } }' ]
  }
}

output { stdout { codec => rubydebug { metadata => false } } }
filter {
    ruby {
        # Load the stdlib modules once at pipeline startup
        init => 'require "base64"; require "zlib"; require "stringio"'
        code => '
            begin
                # The field holds gzip bytes that were base64-encoded
                p = event.get("[payload][compressedPayload]")
                gzipData = Base64.decode64(p)
                # Wrap the raw bytes in an IO object and gunzip them
                sio = StringIO.new(gzipData)
                gz = Zlib::GzipReader.new(sio, encoding: Encoding::ASCII_8BIT)
                u = gz.read
                event.set("someField", u)
            ensure
                # Close the reader even if decoding raised
                gz&.close
            end
        '
    }
}
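
The begin/ensure block makes sure the GzipReader is closed even if decoding fails; if the code raises, the ruby filter will tag the event with _rubyexception. To sanity-check the decode steps outside Logstash, the same sequence runs in plain Ruby (a quick sketch; the string literal is the compressedPayload value from the sample event):

require "base64"
require "zlib"
require "stringio"

compressed = "H4sIAAAAAAAACqtWykvMTVVSsFJQysrPyFNIyU9V0lFQykxRslIwNAACICcxJaUotbgYpKhaQSk5s6QSKKkUnJin4JVfDFaenF+aV1IEFg4NdlSqrQUAsk/IEFcAAAA="
gz = Zlib::GzipReader.new(StringIO.new(Base64.decode64(compressed)))
puts gz.read            # prints the decompressed JSON string
gz.close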

Running the pipeline above will produce

 "someField" => "{\"name\" : \"john doe\", \"id\": 10000, \"address\" : { \"city\": \"San Jose\", \"country\": \"USA\"}}",

Thank you so much @Badger for the prompt reply.
This worked great after a minor tweak to fit my needs. The filter was flawless.
Still being new to ELK, I found this answer opened up other possibilities for me.