Logstash pipelines and multiple outputs

Morning. I have an issue with the latest version of the Elastic Stack. When I try to use pipelines to send logs to two different outputs, one of them receives the events but the other gets nothing. There are no errors in Logstash (debug mode).

This is my config for sending the logs to Elasticsearch and google_pubsub. When I send directly to Elasticsearch it works fine, but when I use the combined config below, the events only reach Pub/Sub.

filter {
  clone {
    clones => ["elasticcloned"]
  }
  if [type] == "elasticcloned" and [cdr] == "topup-events" {
    mutate {
      convert => {
        "service" => "string"
      }
    }
  }
}

output {
  if [type] == "elasticcloned" {
    if [cdr] == "topup-events" {
      elasticsearch {
        ssl_enabled => true
        hosts => ["https://x.x.x.x:9200"]
        ssl_certificate_authorities => "/etc/logstash/certs/http_ca.crt"
        user => "elastic"
        password => "xxxxxxxxxxxx"
        index => "topup-events-%{+YYYY.MM.dd}"
      }
    }
    else if [cdr] == "topup-master" {
      elasticsearch {
        ssl_enabled => true
        hosts => ["https://x.x.x.x:9200"]
        ssl_certificate_authorities => "/etc/logstash/certs/http_ca.crt"
        user => "elastic"
        password => "xxxxxxxxxxxxxx"
        index => "topup-master-%{+YYYY.MM.dd}"
      }
    }
  }
  else {
    if [cdr] == "topup-events" {
      google_pubsub {
        project_id => "project-mep"
        topic => "project-events-cdr"
        codec => "json"
      }
    }
    else if [cdr] == "topup-master" {
      google_pubsub {
        project_id => "project-mep"
        topic => "project-master-cdr"
        codec => "json"
      }
    }
  }
}

In the Logstash logs there is no evidence of failures, only this:

[2024-05-14T10:19:37,130][DEBUG][logstash.outputs.elasticsearch][mepcdr] Found existing Elasticsearch template, skipping template management {:name=>"ecs-logstash"}
[2024-05-14T10:19:37,133][INFO ][logstash.outputs.elasticsearch][mepcdr] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"master-%{+YYYY.MM.dd}"}
[2024-05-14T10:19:37,133][INFO ][logstash.outputs.elasticsearch][mepcdr] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-05-14T10:19:37,214][INFO ][logstash.outputs.elasticsearch][mepcdr] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-05-14T10:19:37,308][INFO ][logstash.javapipeline    ][mepcdr] Starting pipeline {:pipeline_id=>"mepcdr", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/conf.d/mepcdr/input.conf", "/etc/logstash/conf.d/mepcdr/output.conf"], :thread=>"#<Thread:0xb452c37 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}

Any idea about this strange issue?

I suggest adding unconditional else branches to both conditionals that write to a file with a rubydebug codec. Then check whether the events written to the file actually have the [type] and [cdr] fields you think they do.
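A minimal sketch of what I mean (the file path is just a placeholder; the existing outputs stay as they are):

```
output {
  if [cdr] == "topup-events" {
    # ... existing elasticsearch / google_pubsub output ...
  }
  else if [cdr] == "topup-master" {
    # ... existing elasticsearch / google_pubsub output ...
  }
  else {
    # unconditional catch-all: dump events that matched neither branch
    file {
      path => "/tmp/unmatched-events.log"       # placeholder path
      codec => rubydebug { metadata => true }   # prints all fields, including @metadata
    }
  }
}
```

If events show up in that file, their [type] and [cdr] values will tell you which conditional is not matching.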

Thanks. I'll try to verify whether the conditions are being applied, though I'm not sure it will turn anything up.

But is this configuration the right approach, or is there a better way to do it?
I mean using separate files, changing the order, or using tags instead of "type" as the field.
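For example, something along these lines for the tags idea (just a sketch; the tag name and the abbreviated output settings are placeholders for what I already have):

```
filter {
  clone {
    clones => ["elasticcloned"]
    add_tag => ["elasticcloned"]   # explicitly tag the cloned copies
  }
}

output {
  if "elasticcloned" in [tags] {
    elasticsearch {
      hosts => ["https://x.x.x.x:9200"]
      index => "topup-events-%{+YYYY.MM.dd}"
      # ... ssl/auth settings as in the config above ...
    }
  }
  else {
    google_pubsub {
      project_id => "project-mep"
      topic => "project-events-cdr"
      codec => "json"
    }
  }
}
```

Would routing on [tags] like this be more robust than relying on the [type] field?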