Logstash cannot establish a pipeline to communicate with Elasticsearch

I am trying to collect the logs from Kubernetes pods and ship them to Elasticsearch using Logstash and Filebeat. I am able to deploy Elasticsearch and Kibana in the Kubernetes cluster, but I am not able to deploy Logstash and Filebeat.

My elasticsearch.yaml file is:

apiVersion: elasticsearch.k8s.elastic.co/v1
kind: Elasticsearch
metadata:
  name: elasticsearch
spec:
  version: 7.14.1 # Make sure you use the version of your choice
  http:
    service:
      spec:
        type: LoadBalancer # Adds an external IP
  nodeSets:
  - name: default
    count: 1
    config:
      node.store.allow_mmap: false

My logstash.yaml file is:

apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-configmap
data:
  logstash.yml: |
    http.host: "0.0.0.0"
    path.config: /usr/share/logstash/pipeline
  logstash.conf: |
    input {
      beats {
        port => 5044
      }
    }
    output {
      elasticsearch {
        hosts => "https://External ip:9200"
        index => "testlogs"
        user => "elastic"
        password => "xxxx"
        cacert => "/etc/"
        ssl_certificate_verification => false
      }
    }

---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: logstash-deployment
spec:
  replicas: 1
  selector:
    matchLabels:
      app: logstash
  template:
    metadata:
      labels:
        app: logstash
    spec:
      containers:
      - name: logstash
        image: docker.elastic.co/logstash/logstash:7.14.1
        ports:
        - containerPort: 5044
        volumeMounts:
        - name: config-volume
          mountPath: /usr/share/logstash/config
        - name: logstash-pipeline-volume
          mountPath: /usr/share/logstash/pipeline
      volumes:
      - name: config-volume
        configMap:
          name: logstash-configmap
          items:
          - key: logstash.yml
            path: logstash.yml
      - name: logstash-pipeline-volume
        configMap:
          name: logstash-configmap
          items:
          - key: logstash.conf
            path: logstash.conf

---
kind: Service
apiVersion: v1
metadata:
  name: logstash-service
spec:
  selector:
    app: logstash
  ports:
  - protocol: TCP
    port: 5044
    targetPort: 5044
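
From reading the ECK docs, I believe the cluster's CA is stored in a secret named elasticsearch-es-http-certs-public (for my cluster named elasticsearch), with a ca.crt key, and the elastic user's password in elasticsearch-es-elastic-user. I was considering mounting that CA secret into the Logstash container, roughly like this (the mount path /etc/logstash/certificates is just a path I picked, not anything required):

        # extra volumeMount on the logstash container
        volumeMounts:
        - name: es-ca
          mountPath: /etc/logstash/certificates
          readOnly: true
      # ...and the matching volume alongside the existing ones
      volumes:
      - name: es-ca
        secret:
          secretName: elasticsearch-es-http-certs-public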

My filebeat.yml file is:

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  labels:
    app: filebeat
data:
  filebeat.yml: |-
    filebeat.autodiscover:
      providers:
      - type: kubernetes
        node: aks-agentpool-11954791-vmss000003
        hints.enabled: true
        hints.default_config:
          type: container
          paths:
          - /var/log/pods/$pod_name
    processors:
    - add_cloud_metadata:
    - add_host_metadata:

    output.logstash:
      hosts: ['logstash-service:5044']

After deploying, my Logstash pod crashes with this error:
[ERROR] 2022-10-21 08:54:16.937 [[main]-pipeline-manager] javapipeline - Pipeline error {:pipeline_id=>"main", :exception=>#<Errno::EISDIR: Is a directory - channel: org.jruby.util.io.ChannelFD@2f88e4ba /etc/>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.1-java/lib/manticore/client.rb:642:in block in setup_trust_store'", "org/jruby/RubyIO.java:1158:in open'", "org/jruby/RubyKernel.java:317:in open'", "uri:classloader:/META-INF/jruby.home/lib/ruby/stdlib/open-uri.rb:37:in open'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.1-java/lib/manticore/client.rb:641:in setup_trust_store'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.1-java/lib/manticore/client.rb:629:in ssl_socket_factory_from_options'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.1-java/lib/manticore/client.rb:396:in pool_builder'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.1-java/lib/manticore/client.rb:404:in pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/manticore-0.7.1-java/lib/manticore/client.rb:207:in initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.3-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:26:in initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:319:in build_adapter'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:323:in build_pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:62:in initialize'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.3-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:106:in create_http_client'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.3-java/lib/logstash/outputs/elasticsearch/http_client_builder.rb:102:in build'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.3-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:34:in build_client'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-11.0.3-java/lib/logstash/outputs/elasticsearch.rb:275:in register'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:131:in register'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:68:in register'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:228:in block in register_plugins'", "org/jruby/RubyArray.java:1820:in each'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:227:in register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:585:in maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:240:in start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:185:in run'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:137:in block in start'"], "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x5b2f9d8f@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:125 run>"}
[INFO ] 2022-10-21 08:54:16.940 [[main]-pipeline-manager] javapipeline - Pipeline terminated {"pipeline.id"=>"main"}
[ERROR] 2022-10-21 08:54:16.957 [Converge PipelineAction::Create] agent - Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: PipelineAction::Create, action_result: false", :backtrace=>nil}
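
If I am reading the error right, Errno::EISDIR means the cacert option is being handed a directory (/etc/) where the plugin expects a PEM file. What I had in mind was pointing cacert at the mounted CA file instead, something like this (assuming the secret is mounted at /etc/logstash/certificates as sketched above):

    output {
      elasticsearch {
        hosts => "https://External ip:9200"
        index => "testlogs"
        user => "elastic"
        password => "xxxx"
        # cacert should point to the CA file itself, not a directory
        cacert => "/etc/logstash/certificates/ca.crt"
        # verification can presumably stay on once the CA is trusted
        ssl_certificate_verification => true
      }
    }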

Is it because of the CA certificate path I have specified? If so, can someone share the correct way to reference the CA certificate, and where do I find it in the Kubernetes cluster? Thanks in advance!
