Logstash 2.0 with HTTPS to Elasticsearch

I have installed Elasticsearch 2.0, Kibana 4.2, Logstash 2.0, and Shield.

After configuring Logstash with an input, filter, and output, I am unable to send logs to Elasticsearch.

The error message is:

{:timestamp=>"2015-11-22T15:11:33.744000-0700", :message=>"Attempted to send a bulk request to Elasticsearch configured at '["http://localhost:9200/"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?", :client_config=>{:hosts=>["http://localhost:9200/"], :ssl=>nil, :transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil, :ssl=>{}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore, :logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false, :reload_on_failure=>false, :randomize_hosts=>false}, :error_message=>"localhost:9200 failed to respond",

The config file is:

input {
  lumberjack {
    port => 5043
    type => "logs"
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

filter {
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", "MMM d HH:mm:ss", "MMM dd HH:mm:ss" ]
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
  stdout { codec => rubydebug }
}
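Since Shield is installed and the goal is HTTPS, the elasticsearch output likely also needs SSL and Shield credentials; a plain http connection to localhost:9200 will typically be dropped or rejected. A sketch of what the output block might look like, where the user, password, and CA certificate path are placeholders to replace with your own values:

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    ssl => true
    cacert => "/etc/pki/tls/certs/ca.crt"   # CA that signed the Elasticsearch certificate (placeholder path)
    user => "logstash_user"                 # Shield user with write access (placeholder)
    password => "changeme"                  # Shield password (placeholder)
  }
  stdout { codec => rubydebug }
}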

Any help, please?

Perhaps this: https://www.elastic.co/guide/en/elasticsearch/reference/current/breaking_20_network_changes.html#_bind_to_localhost. As of 2.0, Elasticsearch binds only to localhost by default, so it may not be listening where you expect.
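If the 2.0 binding change is the culprit, the fix is to set network.host in elasticsearch.yml. A minimal sketch, assuming the default package install path; adjust the address for your setup:

# /etc/elasticsearch/elasticsearch.yml (typical path; adjust for your install)
network.host: 127.0.0.1   # or a specific interface address if Logstash runs on another host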