Logstash configuration to globally mutate sub index patterns

I have an ELK stack with some configured index patterns. Due to internal requirements, I need to globally convert a JSON object (which is part of one of those index patterns) to "string".
At the moment, this JSON object is dynamically populated with numerous sub-fields, such as "date", "temps", and many others.
All of these are managed automatically by ELK, but I need every one of them to be processed and converted (cast) to "string".

So the basic configuration is as follows:

type A: "long"
type B: "int"
type C: "string"
type D: "json object"

where type D is a nested JSON object that contains numerous other sub-fields, such as (for example) "typeD.date", "typeD.temps", "typeD.extrastuff", and so on.
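For clarity, the transformation I am after could be sketched in plain Ruby (field names are only illustrative, not my real data): every leaf value under the nested object should end up as a string.

```ruby
# Sketch of the desired transformation: recursively cast every
# leaf value of a nested structure to a string, preserving shape.
def stringify_leaves(obj)
  case obj
  when Hash  then obj.transform_values { |v| stringify_leaves(v) }
  when Array then obj.map { |v| stringify_leaves(v) }
  else obj.to_s
  end
end

result = stringify_leaves({"date" => 20240101, "temps" => [1.5, 2.0]})
```

So `{"date" => 20240101}` should become `{"date" => "20240101"}`, and so on for every sub-field, however deeply nested.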

I need all of the "typeD*" fields to be cast and converted to strings. So far I have tried a .conf file with Logstash, but it did not work. My configuration, after numerous attempts, is pasted below. I tested all of these configurations in my own local Docker environment.

The log file I fed to Logstash is called log_file.json; it contains numerous sample lines, including the fields I want to edit globally with this mutate operation.
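For reference, a line in log_file.json looks roughly like this (values and sub-field names are illustrative, not my real data):

```json
{"typeA": 123456789, "typeB": 42, "typeC": "some text", "typeD": {"date": "2023-11-05", "temps": 21.5, "extrastuff": {"flag": true}}}
```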

input {
  file {
    type => "json"
    path => "/home/container/logstash/log_file.json"
    start_position => "beginning"
    sincedb_path => "/dev/null"
    codec => json
  }
}

filter {
  if [message] =~ /\A\{.+\}\z/ {
    json {
      source => "message"
    }
  }

  json {
    source => "[message][typeD]"
  }

  mutate {
    convert => ["typeD.*", "string"]
  }
}

output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "logstash-test-%{+yyyy.MM.dd}"
  }
}
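One variant I have been considering, on the assumption that mutate's convert option does not accept wildcards, is a ruby filter that walks the [typeD] object and casts each leaf to a string. This is an untested sketch and assumes typeD has already been parsed into an object field on the event:

```
filter {
  ruby {
    code => '
      # Recursively cast every leaf value under [typeD] to a string
      def stringify(obj)
        case obj
        when Hash  then obj.each { |k, v| obj[k] = stringify(v) }
        when Array then obj.map  { |v| stringify(v) }
        else obj.to_s
        end
      end
      typed = event.get("typeD")
      event.set("typeD", stringify(typed)) if typed.is_a?(Hash)
    '
  }
}
```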

Also, after logging in to Kibana, I can see that the .conf file was correctly created and loaded by Logstash, but it still fails to convert those fields to strings.

I would like to know which edits are necessary for this configuration to work. Does the Logstash config file need changes? Is there anything wrong with the setup?