Filebeat is sending data to Logstash, but nothing appears in the Logstash logs, Kibana, or Elasticsearch

Hi All,

I am very new to the ELK stack. We are implementing it in our project because of the huge effort currently involved in debugging and reading the logs.

I have configured Filebeat on one of our servers, and the configuration (filebeat.yml) looks like this:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /comp/custom/sys/apache-tomcat/logs/NotificationProcessing/log
  multiline.pattern: ^\[
  multiline.negate: true
  multiline.match: after
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
output.logstash:
  hosts: ["complogserver.corp.com:5044"]
logging.level: debug
logging.to_files: true

I then run Filebeat as below:
./filebeat -e -c filebeat.yml -d "*"

I see some log output on the console, such as:

2018-05-22T11:03:17.127Z DEBUG [logstash] logstash/async.go:142 2 events out of 2 events sent to logstash host complogserver.corp.com:5044. Continue sending

But nothing shows up in the Logstash or Elasticsearch logs.
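
As a sanity check from the Filebeat side, the connection to Logstash can also be verified with the built-in test command (a sketch, assuming this subcommand is available in this Filebeat version):

./filebeat test output -c filebeat.yml

If the output section is correct, it should report the connection to complogserver.corp.com:5044 as OK.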

Logstash is running on the server complogserver.corp.com, and here are its logs:

[2018-05-22T10:58:35,112][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@complogserver.corp.com:9200/"}
[2018-05-22T10:58:35,169][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-05-22T10:58:35,170][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-05-22T10:58:35,171][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-05-22T10:58:35,176][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"default"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-05-22T10:58:35,187][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//complogserver.corp.com:9200"]}
[2018-05-22T10:58:35,600][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2018-05-22T10:58:35,664][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x7ba87f1@/test/ARSystem/logstash-6.2.4/logstash-core/lib/logstash/pipeline.rb:247 sleep>"}
[2018-05-22T10:58:35,674][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2018-05-22T10:58:35,684][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}

I have been stuck here for the past two days without a way to test where it is going wrong.

Can someone please help me with this as soon as possible?

Please post all non-comment lines from filebeat.yml. Format it as preformatted text using markdown notation or the </> toolbar button.

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /comp/custom/sys/apache-tomcat/logs/NotificationProcessing/log
  multiline.pattern: ^\[
  multiline.negate: true
  multiline.match: after
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 3
setup.kibana:
  host: "complogserver.corp.com:5601"
output.logstash:
  hosts: ["complogserver.corp.com:5044"]
logging.level: debug
logging.to_files: true
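
To confirm the file itself parses cleanly, the configuration can also be checked with the test command (again a sketch, assuming the subcommand exists in this Filebeat version):

./filebeat test config -c filebeat.yml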

Here is the simple logstash.conf that I am using.
Logstash is running on the same server as Elasticsearch, but instead of localhost I am using the server name complogserver.corp.com directly.

input {
  beats {
    port => "5044"
  }
}
output {
  elasticsearch {
    hosts => [ "complogserver.corp.com:9200" ]
    user => elastic
    password => welcome1
  }
}
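
If needed, a stdout output can be added next to the elasticsearch output so that every event reaching Logstash is also printed to its console. A sketch of what that could look like, reusing the same input and output settings:

input {
  beats {
    port => "5044"
  }
}
output {
  # print each incoming event to the Logstash console for debugging
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "complogserver.corp.com:9200" ]
    user => "elastic"
    password => "welcome1"
  }
}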

OK, here is the interesting thing I observed after playing around in Kibana.

My Elasticsearch health status is yellow, and I have many indices that contain documents.

These indices are also shown with status yellow, which as I understand it means multiple nodes are required.

But all I have is a single server, and I want to use that same server for everything.

How can I create another node on the same server and finally get all the documents to show up in Kibana?

If you have a single-node setup then you do not benefit from replicas. However, the cluster being yellow will not prevent Kibana from showing you documents from the index.

If having cluster status green makes you happier, then use a template. Something like:

PUT _template/default0replicas
{
  "template": "*",
  "settings": { "number_of_replicas": 0 }
}

But that will not determine whether documents show up in Kibana.
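
If you are not using the Kibana Dev Tools console, the same request can be sent with curl. Note that your cluster has security enabled, so it needs credentials; a sketch using the elastic user that appears in your Logstash output:

curl -u elastic:welcome1 \
  -H 'Content-Type: application/json' \
  -X PUT 'http://complogserver.corp.com:9200/_template/default0replicas' \
  -d '{ "template": "*", "settings": { "number_of_replicas": 0 } }'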

Thanks for the quick and honest reply.

I get the response below when I execute the request :slight_smile:

{
    "error": {
        "root_cause": [
            {
                "type": "security_exception",
                "reason": "missing authentication token for REST request [/_template/default0replicas]",
                "header": {
                    "WWW-Authenticate": "Basic realm=\"security\" charset=\"UTF-8\""
                }
            }
        ],
        "type": "security_exception",
        "reason": "missing authentication token for REST request [/_template/default0replicas]",
        "header": {
            "WWW-Authenticate": "Basic realm=\"security\" charset=\"UTF-8\""
        }
    },
    "status": 401
}

OK, so I disabled security using the parameter xpack.security.enabled: false.

I executed the request you provided above to set the replicas to 0.

But everything is still yellow, and I still see the same message in Kibana.

There are clearly 18 indices, but the major question I have now is why they are not showing up in Discover.

Tomorrow's indexes will be green.

You might want to check a few documents by doing "GET /logstash-2018.05.23/_search".

In Kibana, either the time picker is set to a period that has no documents (e.g. if it is set to Last 15 minutes and all the documents are older than that), or the search you are running matches no documents. If you are parsing the timestamp and not configuring the timezone correctly, that could lead to all documents being hours old.
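
For example, something along these lines (a sketch against the logstash-* indices from your setup) shows whether the indices contain documents at all and what the newest @timestamp is:

GET _cat/indices/logstash-*?v

GET logstash-*/_search
{
  "size": 1,
  "sort": [ { "@timestamp": { "order": "desc" } } ]
}

If the newest @timestamp is hours behind the current time, the time picker or the timezone handling is the likely culprit.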
