Data enrichment using Logstash with the translate plugin

I am trying to enrich data before it gets indexed. I have tried the two methods below, but neither is currently working.

1) Using the elasticsearch filter plugin
input {
  kafka {
    bootstrap_servers => "xxx.xx.xx.xxx:9092"
    topics => ["topicname"]
  }
}
filter {
  elasticsearch {
    hosts => ["xxx.xx.xx.xxx:9200"]
    index => "lookup"
    query => '{ "query": { "match": { "id": "%{id}" } } }'
    fields => {
      "desc" => "desc"
    }
    user => "elastic"
    password => "******"
    ca_file => "/path/to/elasticsearch-ca.pem"
  }
  json {
    source => "message"
    remove_field => ["message", "@version"]
  }
  mutate {
    lowercase => ["name"]
  }
}
output {
  elasticsearch {
    hosts => ["xxx.xx.xx.xxx:9200"]
    index => "test"
    document_id => "%{id}"
    user => "elastic"
    password => "*******"
    ssl => true
    cacert => "/path/to/elasticsearch-ca.pem"
    ssl_certificate_verification => false
  }
}

The issue I ran into is that ssl_certificate_verification is not an allowed parameter on the elasticsearch filter, so the filter fails to connect to Elasticsearch. I tried generating the certificate using the following link so that I could have a secure SSL connection, but that doesn't work either.
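A minimal sketch of what the filter block could look like instead, assuming a recent logstash-filter-elasticsearch version that supports the standardized ssl_* options, and assuming the Kafka payload is JSON carrying the id field, so the json filter has to run before the lookup. Note also that the filter's query option expects a Lucene query string; a full JSON DSL body has to go into a file referenced by query_template:

filter {
  # Parse the Kafka payload first so that %{id} exists when the lookup runs
  json {
    source => "message"
    remove_field => ["message", "@version"]
  }
  elasticsearch {
    hosts => ["xxx.xx.xx.xxx:9200"]
    index => "lookup"
    # Lucene query string syntax, not a JSON DSL body
    query => "id:%{id}"
    fields => { "desc" => "desc" }
    user => "elastic"
    password => "******"
    ssl_enabled => true
    ssl_certificate_authorities => ["/path/to/elasticsearch-ca.pem"]
    # "none" skips certificate verification for testing; switch back to "full"
    # once the CA file matches the cluster certificate
    ssl_verification_mode => "none"
  }
}

On older plugin versions that only know ssl and ca_file there is no way to disable verification on the filter, so the CA file has to actually match the cluster certificate.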

2) After that I tried the translate filter to achieve the same thing. I get no error, but the lookup against the dictionary is not happening:

input {
  kafka {
    bootstrap_servers => "xxx.xx.xx.xxx:9092"
    topics => ["topicname"]
  }
}
filter {
  translate {
    source => "[code]"
    target => "[desc]"
    dictionary => {
      "1" => "One"
      "2" => "two"
      "3" => "three"
      "4" => "four"
    }
    override => true
    fallback => "Fallback"
  }
  json {
    source => "message"
    remove_field => ["message", "@version"]
  }
  mutate {
    lowercase => ["name"]
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["xxx.xx.xx.xxx:9200"]
    index => "test"
    document_id => "%{id}"
    user => "elastic"
    password => "*********"
    ssl => true
    cacert => "/path/to/elasticsearch-ca.pem"
    ssl_certificate_verification => false
  }
}

The value of id was an integer, but the dictionary source field should be a string, so I even converted it and tried again; there is no error, but the desc field is blank.
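One likely cause is filter ordering: filters run in the order they appear, so translate executes before the json filter has parsed message, and [code] does not exist yet at lookup time. A minimal sketch of the reordered filter block with an explicit string conversion (assuming the JSON payload carries a numeric code field):

filter {
  # Parse the payload first so [code] exists when translate runs
  json {
    source => "message"
    remove_field => ["message", "@version"]
  }
  # Dictionary keys are strings, so convert the numeric source field
  mutate {
    convert => { "code" => "string" }
  }
  translate {
    source => "[code]"
    target => "[desc]"
    dictionary => {
      "1" => "One"
      "2" => "two"
      "3" => "three"
      "4" => "four"
    }
    override => true
    fallback => "Fallback"
  }
}

With this ordering, an event whose payload contains "code": 2 should come out with desc set to "two", and anything outside the dictionary gets the Fallback value.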

Finally, I also set up the enrich index and configuration on the Elasticsearch side, but when I send data through Logstash the events end up in the dead letter queue (DLQ).
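For comparison, a minimal sketch of the output side for the enrich-processor route, where Logstash only has to point at the ingest pipeline (the pipeline name enrich-lookup below is a hypothetical placeholder for whatever the pipeline is actually called):

output {
  elasticsearch {
    hosts => ["xxx.xx.xx.xxx:9200"]
    index => "test"
    document_id => "%{id}"
    # Hand each event to the ingest pipeline that runs the enrich processor
    pipeline => "enrich-lookup"
    user => "elastic"
    password => "*********"
    ssl => true
    cacert => "/path/to/elasticsearch-ca.pem"
  }
}

Events land in the DLQ when Elasticsearch rejects them (for example a mapping conflict or an ingest pipeline failure), so reading an entry back with the dead_letter_queue input and a rubydebug stdout usually shows the exact rejection reason.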
