When Logstash initializes, the logs show that no mapping template was found ({:path=>nil}), and template installation fails. Then, when it tries to send data, every bulk request fails with "can't convert nil into Array" errors. Note that this setup uses stunnel to provide HTTPS to the Elastic Cloud node.
Here is the relevant output from the Logstash logs:
[2017-02-06T21:53:24,357][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2017-02-06T21:53:24,769][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>["http://~hidden~:~hidden~@localhost:19200"]}}
[2017-02-06T21:53:24,771][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:url=>#<URI::HTTP:0x4ab00bdd URL:http://~hidden~:~hidden~@localhost:19200>, :healthcheck_path=>"/"}
[2017-02-06T21:53:24,999][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x4ab00bdd URL:http://~hidden~:~hidden~@localhost:19200>}
[2017-02-06T21:53:24,999][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-02-06T21:53:25,060][ERROR][logstash.outputs.elasticsearch] Failed to install template. {:message=>"undefined method `[]' for nil:NilClass", :class=>"NoMethodError"}
[2017-02-06T21:53:25,061][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["localhost:19200"]}
[2017-02-06T21:53:25,157][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>250}
[2017-02-06T21:53:25,159][INFO ][logstash.pipeline        ] Pipeline main started
[2017-02-06T21:53:25,202][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-02-06T21:53:58,318][ERROR][logstash.outputs.elasticsearch] An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {:error_message=>"can't convert nil into Array", :error_class=>"TypeError", :backtrace=>["org/jruby/RubyArray.java:1462:in `concat'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:94:in `join_bulk_responses'", "org/jruby/RubyArray.java:1613:in `each'", "org/jruby/RubyEnumerable.java:852:in `inject'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:92:in `join_bulk_responses'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:88:in `bulk'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:186:in `safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:109:in `submit'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:76:in `retrying_submit'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-5.4.0-java/lib/logstash/outputs/elasticsearch/common.rb:27:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:12:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:42:in `multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:331:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:330:in `output_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:288:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:258:in `start_workers'"]}
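For reference, here is a sketch of the setup: stunnel terminates TLS locally and Logstash points at the local port. The Elastic Cloud hostname, port, and credentials below are placeholders, not my real values (the real ones are hidden in the logs above).

```
; stunnel.conf (sketch; cloud host and port are placeholders)
[elasticsearch]
client  = yes
accept  = 127.0.0.1:19200
connect = my-cluster.example.found.io:9243
```

```
# Logstash pipeline output section (sketch; credentials are placeholders)
output {
  elasticsearch {
    hosts    => ["localhost:19200"]
    user     => "elastic"
    password => "changeme"
  }
}
```

So Logstash talks plain HTTP to localhost:19200, and stunnel forwards that over HTTPS to the cloud endpoint.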