Logstash integration with Amazon Elasticsearch

Hi Team,

I have set up Logstash to fetch messages from Amazon MSK and push them to Amazon Elasticsearch (with Kibana integration). The setup works smoothly in the QA and Dev environments, but when I use the same Logstash image in the Production environment it stops working. I can see in the Logstash logs that the messages are fetched from MSK, but nothing is pushed to Elasticsearch. The weird thing is that the Logstash logs are exactly the same in QA, Dev, and Prod. I am even using the same config file; only the Elasticsearch endpoint changes. Here is the config file.

input {
  kafka {
    bootstrap_servers => "${MSK_HOST}"
    topics => ["${TOPIC}"]
    group_id => "${KAFKA_GRP_ID}"
    auto_offset_reset => "latest"
    #ssl_truststore_location => "/usr/share/logstash/config/kafka.client.truststore.jks"
    security_protocol => "SSL"
    # Expose Kafka metadata (topic, partition, offset) under [@metadata][kafka]
    decorate_events => true
    codec => json
  }
}

filter {
  ruby {
    # Lowercase the Kafka topic name and store it in metadata for use in the index name
    code => "event.set('[@metadata][kafka][lc_topic]', event.get('[@metadata][kafka][topic]').split(/(?=[A-Z])/).map{|x| x.downcase }.join('') )"
  }
}

output {
  amazon_es {
    hosts => ["${ES_HOST}"]
    ssl => true
    #ilm_enabled => false
    index => "logstash.%{[@metadata][kafka][lc_topic]}.%{+YYYY.MM.dd}"
    manage_template => false
    # Left empty so the plugin falls back to the default AWS credential chain (e.g. instance role)
    aws_access_key_id => ''
    aws_secret_access_key => ''
  }
}
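To confirm whether events actually make it through the filter to the output stage in Prod, one thing I'm considering is temporarily adding a stdout output alongside amazon_es. A minimal sketch (debug only, not part of the real config):

output {
  # Temporary debug output: prints every event to the Logstash log,
  # so we can see whether events reach the output stage at all
  stdout {
    codec => rubydebug
  }
}

If events show up on stdout but never in Elasticsearch, the problem is isolated to the amazon_es output (connectivity, IAM permissions, or endpoint) rather than the Kafka input.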

Any idea why this happens?

We aren't able to help here, sorry; this is a custom plugin that AWS has built.

It doesn't work with the usual plugin either.
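For reference, the stock elasticsearch output variant would look roughly like this (a sketch, not the exact config that was tried; it reuses the same ${ES_HOST} placeholder and index pattern from above):

output {
  elasticsearch {
    # Stock Logstash output plugin pointed at the same Amazon Elasticsearch endpoint
    hosts => ["${ES_HOST}"]
    ssl => true
    index => "logstash.%{[@metadata][kafka][lc_topic]}.%{+YYYY.MM.dd}"
    manage_template => false
  }
}

Note that the stock plugin has no AWS request-signing support, so if the Amazon Elasticsearch domain enforces IAM-based access, requests from it would be rejected even when the amazon_es plugin works.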
