Kibana is showing only one record on the Discover dashboard

My Kibana Discover dashboard is only showing one record, and that record gets updated every time new data comes in from Filebeat. Also, my _id value is
%{logstash_checksum}
I am using the Bitnami stack:
Filebeat -> Logstash -> Kibana

Seems like it's probably an issue with how you've configured filebeat. I'm going to switch over the labels for this post.

Actually, this sounds more like a problem with the Logstash configuration. The fact that events are 'overwritten' all the time makes it sound like all events get the same ID applied.

Yes, you are correct. But my _id values are literally %{logstash_checksum}, so I think the ID is not getting any value.
My Logstash configuration is as follows:

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
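
An _id of %{logstash_checksum} is usually a sign that some elasticsearch output sets document_id => "%{logstash_checksum}" while no filter ever creates a logstash_checksum field: the sprintf reference never resolves, every event gets the literal string as its _id, and each new event overwrites the previous document. The output block posted above does not set document_id, so this is only a guess about the rest of the pipeline (for example another config file the Bitnami stack might load). As a minimal sketch, assuming that option is in use somewhere, this is how it is normally paired with a fingerprint filter (field names and the SHA1 method here are illustrative, not taken from the posted config):

filter {
  # Hypothetical: derive a per-event checksum from the raw log line so
  # "%{logstash_checksum}" resolves to a different value for each event.
  fingerprint {
    source => "message"
    target => "logstash_checksum"
    method => "SHA1"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    # Without the filter above, this sprintf never resolves: every event
    # gets the literal _id %{logstash_checksum} and overwrites the last one.
    document_id => "%{logstash_checksum}"
  }
}

If per-event deduplication is not needed, simply removing document_id lets Elasticsearch generate a unique _id for each event, and every record shows up in Discover.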

My filebeat.yml file is as follows:

###################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.reference.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html

# For more available modules and options, please see the filebeat.reference.yml sample
# configuration file.

#=========================== Filebeat inputs =============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: true

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /data/optimize-n2/logs/fb-sim/CT_Q_50

#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

#============================== Dashboards =====================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here or by using the `setup` command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:

#============================== Kibana =====================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  #host: "localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["172.31.89.203:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

#================================ Processors =====================================

# Configure processors to enhance or manipulate events generated by the beat.

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~

Please format logs and configs using the </> button in the editor window. I tried to fix some of the formatting for you.

Where do you set the event ID? This is a bare bones config.

Have you tried a Logstash stdout output to check whether new events are actually being sent?
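
For reference, a minimal sketch of such a debug output, assuming you can temporarily edit and restart the pipeline:

output {
  # Temporary debug output: prints each event with all of its fields to
  # the Logstash console/log so you can confirm new events are arriving.
  stdout {
    codec => rubydebug
  }
}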

Have you checked the Filebeat logs for events being sent (run Filebeat in debug mode via -d '*')?
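
For reference, a typical way to run that check from the command line (the -e flag, which sends Filebeat's own logs to stderr instead of files, is an assumption about how you start Filebeat on the Bitnami host):

filebeat -e -d "*"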

Hi, we can close this case. I changed from the Bitnami ELK stack to the Open Distro ELK stack and it works fine. I think the problem was with my Bitnami ELK stack; I used the same Filebeat configuration with Open Distro and it works fine for me.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.