No output in Kibana although I can see logs on the Logstash log screen in Docker

I'm trying to send Airflow logs to Elasticsearch via Filebeat and Logstash so I can monitor them in Kibana. However, although I can see the logs on the Logstash container's log screen in Docker, they never show up in Elasticsearch or Kibana. Where is the error in my configuration and YAML files? Can anyone help? I've been searching for a solution for nearly a day.

logstash-sample.conf is below:

  # Beat -> Logstash -> Elasticsearch pipeline.

  input {
    beats {
      type => "log"
      port => 5044
    }
  }

  filter {}

  output {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "updatedlogs"
      #user => "elastic"
      #password => "changeme"
    }

    stdout {
      codec => rubydebug
    } 
  }
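
One thing I'm not sure about: since Logstash runs in a container, `localhost:9200` may point at the Logstash container itself rather than at Elasticsearch. If the Compose service is named `elasticsearch` (as my logstash.yml suggests), I believe the output section would need to look like this:

```
  output {
    elasticsearch {
      # Use the Compose service name; "localhost" inside the
      # Logstash container refers to the container itself.
      hosts => ["http://elasticsearch:9200"]
      index => "updatedlogs"
    }
  }
```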

logstash.yml is below:

http.host: "0.0.0.0"
xpack.monitoring.elasticsearch.hosts: [ "http://elasticsearch:9200" ]     # Docker service name

filebeat.yml is below:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - C:/test/airflow/logs/*/*/*/*.log

output.logstash:
  hosts: ["localhost:5044"]
  enabled: true

You need to share the Logstash logs. Do they show any errors?

What is the result of running this request in Kibana Dev Tools?

GET updatedlogs/_search
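
If that returns `index_not_found_exception`, it can also help to list all indices, in case Logstash created one under a different name:

```
GET _cat/indices?v
```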

I've been getting the following output for days. Unfortunately, despite my efforts, I'm still facing the same problem.

#! Elasticsearch built-in security features are not enabled. Without authentication, your cluster could be accessible to anyone. See https://www.elastic.co/guide/en/elasticsearch/reference/7.17/security-minimal-setup.html to enable security.
{
  "error" : {
    "root_cause" : [
      {
        "type" : "index_not_found_exception",
        "reason" : "no such index [updatedlogs]",
        "resource.type" : "index_or_alias",
        "resource.id" : "updatedlogs",
        "index_uuid" : "_na_",
        "index" : "updatedlogs"
      }
    ],
    "type" : "index_not_found_exception",
    "reason" : "no such index [updatedlogs]",
    "resource.type" : "index_or_alias",
    "resource.id" : "updatedlogs",
    "index_uuid" : "_na_",
    "index" : "updatedlogs"
  },
  "status" : 404
}

Also, when I was checking the Filebeat logs, I found these ERROR entries:

ERROR	logstash/async.go:256	Failed to publish events caused by: EOF
ERROR	logstash/async.go:256	Failed to publish events caused by: client is not connected
ERROR	pipeline/output.go:121	Failed to publish events: client is not connected

I solved the problem a few days ago. The main issue was in my docker-compose.yml file, where I had mounted logstash.conf into Logstash's config folder. The file has to be mounted into the pipeline folder instead, not the config folder. I've shown the wrong and correct mounts below. I hope this helps anyone struggling with the same problem.

# WRONG! logstash.conf mounted into the config folder:
   - ./logstash/logstash.conf:/usr/share/logstash/config/logstash.conf

# CORRECT! The .conf file needs to go into the pipeline folder:
   - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
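
For context, here is a minimal sketch of the relevant part of my docker-compose.yml (the image tag and host paths are from my setup and are assumptions; adjust them to yours):

```yaml
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:7.17.0
    ports:
      - "5044:5044"   # Beats input, so Filebeat on the host can reach it
    volumes:
      # Pipeline definitions belong under /usr/share/logstash/pipeline
      - ./logstash.conf:/usr/share/logstash/pipeline/logstash.conf
      # Settings files stay under /usr/share/logstash/config
      - ./logstash/logstash.yml:/usr/share/logstash/config/logstash.yml
```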
