Filebeat is collecting logs but not showing in Kibana

Hi Team,

I'm using Filebeat 6.4.
It is collecting and parsing the logs, but they are not showing up in Kibana.

Note: Filebeat is running on my ELK system, and I'm trying to parse some files from the same system.

Hello @Suresh_Pal, to help you out I will need a few more details about your environment and your current configuration.

Do you mind sharing the following:

  • Your Filebeat YAML configuration
  • Some logs from a running Filebeat; I am interested to see whether there are any warnings or errors.

Thanks

Sure @pierhugues,

filebeat.yml

###################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.reference.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html

# For more available modules and options, please see the filebeat.reference.yml sample
# configuration file.

#=========================== Filebeat prospectors =============================

filebeat.prospectors:

# Each - is a prospector. Most options can be set at the prospector level, so
# you can use different prospectors for various configurations.
# Below are the prospector specific configurations.

- type: log

  # Change to true to enable this prospector configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    #- /var/log/*.log
    #N- /home/tomcat/tomcatFCSKY/logs/catalina.out
    #N- /home/tomcat/builds/FCSKY/logs/*/application.log 
    #N- /home/tomcat/builds/FCSKY/logs/*.log
   


    - /home/tomcat/tomcat*/logs/catalina.out
    - /home/tomcat/tomcat*/logs/*.log
    - /home/tomcat/builds/*/logs/Redis/*.log
    - /home/tomcat/builds/FCSKY/logs/accesslogs.log
    - /home/tomcat/builds/FCSKY/logs/application.log
    - /home/tomcat/builds/FCSKY/logs/csrf.log
    - /home/tomcat/builds/FCSKY/logs/dataservicesapi.log
    - /home/tomcat/builds/FCSKY/logs/insightsapi.log
    - /home/tomcat/builds/FCSKY/logs/LoginStats.log
    - /home/tomcat/builds/FCSKY/logs/tomcatlogs.log
    - /home/tomcat/builds/FCSKY/logs/userlogin.log
    - /var/log/httpd/access_log
    - /var/log/httpd/ssl_access_log
    - /home/tomcat/tomcatFCSKY/logs/localhost_access_log.*
    - /var/log/cloud-init-output.log
    - /var/log/solr/*.log
    - /home/solr/solr-6.3.0/server/logs/*
    - /home/tomcat/builds/BaseThreadsFCSKY/logs/*.log
    - /home/tomcat/builds/genericJobsSM/logs/application.log*
    - /home/tomcat/builds/FacebookConnect725/logs/application.log
    #ELK Access LOGS
    #- /var/log/nginx/nginx.vhost.access.log    
  #- c:\programdata\elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java Stack Traces or C-Line Continuation

  # The regexp pattern that has to be matched. This pattern matches lines that
  # start with one of several timestamp formats (ISO date, "Mar 07, 2019 ..." etc.).
  multiline.pattern: '^(([0-9]{4}-[0-9]{2}-[0-9]{2})|([A-Za-z]{3} [0-9]{2}, [0-9]{4} [0-9]{2}:[0-9]{2}:[0-9]{2} (AM|PM))|([0-9]{2}-[A-Za-z]{3}-[0-9]{4})|([A-Za-z]{3} [A-Za-z]{3} [0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}))'
  #multiline.pattern: '^[[:space:]]+(at|.{3})\b|Caused by:' 

  # Defines if the pattern set under pattern should be negated or not. Default is false.
  multiline.negate: true

  # Match can be set to "after" or "before". It is used to define if lines should be appended to a pattern
  # that was (not) matched before or after, or as long as a pattern is not matched based on negate.
  # Note: "after" is the equivalent of "previous" and "before" is the equivalent of "next" in Logstash.
  multiline.match: after


- type: log
  enabled: true
  paths:
    - /home/tomcat/builds/FCSKY/logs/db.log
  fields_under_root: true
  fields:
    type: db_log


  multiline.pattern: '^(([0-9]{4}-[0-9]{2}-[0-9]{2})|([A-Za-z]{3} [0-9]{2}, [0-9]{4} [0-9]{2}:[0-9]{2}:[0-9]{2} (AM|PM))|([0-9]{2}-[A-Za-z]{3}-[0-9]{4})|([A-Za-z]{3} [A-Za-z]{3} [0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}))'
  multiline.negate: true
  multiline.match: after
######
- type: log
  enabled: true
  paths:
    - /var/log/nginx/nginx.vhost.access.log
  fields_under_root: true
  fields:
    type: user_access


  # Match lines that begin with a client IP address (e.g. nginx access log lines).
  multiline.pattern: '^([0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3})'
  multiline.negate: true
  multiline.match: after



#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 3


#============================== Kibana =====================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  host: "localhost:5601"

#----------------------------- Logstash output --------------------------------
output.logstash:

  hosts: ["ELK-PROD-CL-ELB-1500669148.us-east-1.elb.amazonaws.com:5044"]






#================================ Logging =====================================


logging.level: debug
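As a side note, the multiline patterns above can be sanity-checked outside Filebeat. A minimal sketch in Python follows; the sample log lines are made up for illustration, and Filebeat uses Go's RE2 syntax rather than Python's re module, so this is only an approximation of the same pattern:

```python
import re

# Same alternatives as multiline.pattern above: a line starting with any of
# these timestamp formats begins a new log record.
pattern = re.compile(
    r'^(([0-9]{4}-[0-9]{2}-[0-9]{2})'                                        # 2019-03-07
    r'|([A-Za-z]{3} [0-9]{2}, [0-9]{4} [0-9]{2}:[0-9]{2}:[0-9]{2} (AM|PM))'  # Mar 07, 2019 10:09:37 PM
    r'|([0-9]{2}-[A-Za-z]{3}-[0-9]{4})'                                      # 07-Mar-2019
    r'|([A-Za-z]{3} [A-Za-z]{3} [0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}))'       # Thu Mar 07 22:09:37
)

samples = [
    "2019-03-07 22:09:37,322 INFO starting up",            # starts a record
    "Mar 07, 2019 10:09:37 PM org.example.Servlet doGet",  # starts a record
    "07-Mar-2019 22:09:37.322 WARNING slow request",       # starts a record
    "\tat org.example.Foo.bar(Foo.java:42)",               # continuation (stack trace)
]
for line in samples:
    print(bool(pattern.match(line)), line)
```

With multiline.negate: true and multiline.match: after, every line that does not match the pattern (like the stack-trace line) is appended to the preceding matching line.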



Filebeat logs

Before: 1, After: 1, Pending: 0
2019-03-07T22:09:37.322-0500    INFO    [monitoring]    log/log.go:141  Non-zero metrics in the last 30s        {"monitoring": {"metrics": {"beat":{"cpu":{"system":{"ticks":2500},"total":{"ticks":8580,"time":{"ms":4},"value":8580},"user":{"ticks":6080,"time":{"ms":4}}},"info":{"ephemeral_id":"4c43dd52-1b79-4d8a-abe1-263170a9ae23","uptime":{"ms":63000008}},"memstats":{"gc_next":4194304,"memory_alloc":2196264,"memory_total":556744456}},"filebeat":{"harvester":{"open_files":0,"running":0}},"libbeat":{"config":{"module":{"running":0}},"pipeline":{"clients":3,"events":{"active":0}}},"registrar":{"states":{"current":2}},"system":{"load":{"1":4.01,"15":3.58,"5":3.47,"norm":{"1":1.0025,"15":0.895,"5":0.8675}}}}}}
2019-03-07T22:09:40.699-0500    DEBUG   [input] input/input.go:152      Run input
2019-03-07T22:09:40.700-0500    DEBUG   [input] log/input.go:174        Start next scan
2019-03-07T22:09:40.700-0500    DEBUG   [input] log/input.go:404        Check file for harvesting: /var/log/nginx/nginx.vhost.access.log

Okay, I see that you are using Logstash. I would need a longer log excerpt from Filebeat; looking at the monitoring statement, I do not see any files open by Filebeat (both open_files and running harvesters are 0).
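For anyone following along, the relevant counters sit inside the monitoring JSON of that log line. A small sketch of how to read them, with the JSON trimmed to the fields quoted above:

```python
import json

# The {"monitoring": ...} fragment from the Filebeat log line above,
# trimmed to the fields relevant to this diagnosis.
metrics_json = '''{"monitoring": {"metrics": {
    "filebeat": {"harvester": {"open_files": 0, "running": 0}},
    "registrar": {"states": {"current": 2}}}}}'''

metrics = json.loads(metrics_json)["monitoring"]["metrics"]
harvester = metrics["filebeat"]["harvester"]
print("open files:", harvester["open_files"])        # 0 -> Filebeat is not reading any file
print("running harvesters:", harvester["running"])   # 0 -> no harvester was started
```

If open_files stays at 0 while files matching the configured paths exist and are being written to, Filebeat is not harvesting them at all, which points at path/permission/registry issues rather than a problem downstream in Logstash or Kibana.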

Hi @pierhugues,

After downgrading from version 6.4 to 6.2, it is working now. Thanks for the quick response.

Thanks,
Suresh