New Install, No Data in Kibana

I am new to the Elastic Stack and have just "finished" my installation, but I am not receiving any data in Kibana. I installed version 5.6.2 of Elasticsearch, Logstash, Kibana, and X-Pack, all on the same host. netstat shows the connections are established, but no data comes through.

I installed Filebeat on a different host and pointed it at my Elastic Stack server's IP and port (I tried both 5043 and 5044).
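For reference, this is roughly how I verified the connection (nc is just one way to probe the port from the Filebeat host):

# On the Elastic Stack host: is the Beats input listening?
sudo netstat -plnt | grep -E '5043|5044'

# From the Filebeat host: is the port reachable?
nc -zv IP_ADDR 5043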

Here are my configurations:

/etc/filebeat/filebeat.yml:

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/*.log
output.logstash:
  hosts: ["IP_ADDR:5043"]

/etc/logstash/logstash.yml:

path.data: /var/lib/logstash
path.config: /etc/logstash/conf.d
http.host: "IP_ADDR"
path.logs: /var/log/logstash
xpack.monitoring.enabled: true
xpack.monitoring.elasticsearch.url: http://IP_ADDR:9200
xpack.monitoring.elasticsearch.username: elastic
xpack.monitoring.elasticsearch.password: changeme
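Those monitoring credentials can be sanity-checked against Elasticsearch directly before starting Logstash:

# Should return the cluster banner, not a 401, if the credentials are valid
curl -u elastic:changeme http://IP_ADDR:9200/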

/etc/logstash/conf.d/elasticsearch-output.yml:

input {
  beats {
    port => "5043"
  }
}
output {
  elasticsearch {
    hosts => ["IP_ADDR:9200"]
    user => "elastic"
    password => "changeme"
    sniffing => false
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}" 
  }
}
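Logstash 5.x can parse-test this pipeline without starting the service; a quick check, assuming the default deb/rpm layout:

# -t (--config.test_and_exit) checks everything under path.config and exits
sudo /usr/share/logstash/bin/logstash --path.settings /etc/logstash -t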

/etc/elasticsearch/elasticsearch.yml:

network.host: IP_ADDR
xpack.security.enabled: false

/etc/kibana/kibana.yml:

server.host: IP_ADDR
elasticsearch.url: "http://IP_ADDR:9200"
xpack.security.enabled: false

In all of the configs above, IP_ADDR is the server's real IP, not 127.0.0.1 or localhost.
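Both server-side services can be spot-checked over HTTP at that address (the -u flag only matters while X-Pack security is enabled):

# Is Elasticsearch up and healthy?
curl -u elastic:changeme 'http://IP_ADDR:9200/_cluster/health?pretty'

# Is Kibana up and able to reach Elasticsearch?
curl 'http://IP_ADDR:5601/api/status'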

What do I have configured wrong?

Everything looks ok from what I can see.

What do the logs for filebeat say when you start things up?

I switched the logging to debug mode. Here is the latest from the filebeat logs:

2017-10-01T16:03:29-04:00 DBG  Disable stderr logging
2017-10-01T16:03:29-04:00 INFO Home path: [/usr/share/filebeat] Config path: [/etc/filebeat] Data path: [/var/lib/filebeat] Logs path: [/var/log/filebeat]
2017-10-01T16:03:29-04:00 INFO Setup Beat: filebeat; Version: 5.6.2
2017-10-01T16:03:29-04:00 DBG  Processors: 
2017-10-01T16:03:29-04:00 DBG  Initializing output plugins
2017-10-01T16:03:29-04:00 INFO Max Retries set to: 3
2017-10-01T16:03:29-04:00 INFO Activated logstash as output plugin.
2017-10-01T16:03:29-04:00 DBG  Create output worker
2017-10-01T16:03:29-04:00 DBG  No output is defined to store the topology. The server fields might not be filled.
2017-10-01T16:03:29-04:00 INFO Publisher name: ubuntu-gnome
2017-10-01T16:03:29-04:00 INFO Flush Interval set to: 1s
2017-10-01T16:03:29-04:00 INFO Max Bulk Size set to: 2048
2017-10-01T16:03:29-04:00 DBG  create bulk processing worker (interval=1s, bulk size=2048)
2017-10-01T16:03:29-04:00 INFO filebeat start running.
2017-10-01T16:03:29-04:00 INFO Metrics logging every 30s
2017-10-01T16:03:29-04:00 INFO Registry file set to: /var/lib/filebeat/registry
2017-10-01T16:03:29-04:00 INFO Loading registrar data from /var/lib/filebeat/registry
2017-10-01T16:03:29-04:00 INFO States Loaded from registrar: 9
2017-10-01T16:03:29-04:00 INFO Loading Prospectors: 1
2017-10-01T16:03:29-04:00 DBG  File Configs: [/var/log/*.log]
2017-10-01T16:03:29-04:00 INFO Start sending events to output
2017-10-01T16:03:29-04:00 DBG  exclude_files: []
2017-10-01T16:03:29-04:00 INFO Starting Registrar
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/bootstrap.log
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/casper.log
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/dpkg.log
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/kern.log
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/boot.log
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/auth.log
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/fontconfig.log
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/gpu-manager.log
2017-10-01T16:03:29-04:00 DBG  New state added for /var/log/alternatives.log
2017-10-01T16:03:29-04:00 INFO Prospector with previous states loaded: 9
2017-10-01T16:03:29-04:00 INFO Starting spooler: spool_size: 2048; idle_timeout: 5s
2017-10-01T16:03:29-04:00 INFO Starting prospector of type: log; id: 17005676086519951868 
2017-10-01T16:03:29-04:00 INFO Loading and starting Prospectors completed. Enabled prospectors: 1
2017-10-01T16:03:29-04:00 DBG  Start next scan
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/auth.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/auth.log, offset: 18306
2017-10-01T16:03:29-04:00 DBG  Resuming harvesting of file: /var/log/auth.log, offset: 18306
2017-10-01T16:03:29-04:00 DBG  Set previous offset for file: /var/log/auth.log. Offset: 18306 
2017-10-01T16:03:29-04:00 DBG  Setting offset for file: /var/log/auth.log. Offset: 18306 
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/dpkg.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/dpkg.log, offset: 1199180
2017-10-01T16:03:29-04:00 DBG  File didn't change: /var/log/dpkg.log
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/fontconfig.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/fontconfig.log, offset: 4204
2017-10-01T16:03:29-04:00 DBG  File didn't change: /var/log/fontconfig.log
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/alternatives.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/alternatives.log, offset: 37199
2017-10-01T16:03:29-04:00 DBG  File didn't change: /var/log/alternatives.log
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/boot.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/boot.log, offset: 1841
2017-10-01T16:03:29-04:00 DBG  File didn't change: /var/log/boot.log
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/bootstrap.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/bootstrap.log, offset: 59400
2017-10-01T16:03:29-04:00 DBG  File didn't change: /var/log/bootstrap.log
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/casper.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/casper.log, offset: 1807
2017-10-01T16:03:29-04:00 DBG  File didn't change: /var/log/casper.log
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/gpu-manager.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/gpu-manager.log, offset: 2107
2017-10-01T16:03:29-04:00 DBG  File didn't change: /var/log/gpu-manager.log
2017-10-01T16:03:29-04:00 DBG  Check file for harvesting: /var/log/kern.log
2017-10-01T16:03:29-04:00 DBG  Update existing file for harvesting: /var/log/kern.log, offset: 116097
2017-10-01T16:03:29-04:00 DBG  Resuming harvesting of file: /var/log/kern.log, offset: 116097
2017-10-01T16:03:29-04:00 DBG  Set previous offset for file: /var/log/kern.log. Offset: 116097 
2017-10-01T16:03:29-04:00 DBG  Setting offset for file: /var/log/kern.log. Offset: 116097 
2017-10-01T16:03:29-04:00 INFO Harvester started for file: /var/log/auth.log
2017-10-01T16:03:29-04:00 DBG  Prospector states cleaned up. Before: 9, After: 9
2017-10-01T16:03:29-04:00 INFO Harvester started for file: /var/log/kern.log
2017-10-01T16:03:29-04:00 DBG  End of file reached: /var/log/auth.log; Backoff now.
2017-10-01T16:03:29-04:00 DBG  End of file reached: /var/log/kern.log; Backoff now.
2017-10-01T16:03:30-04:00 DBG  End of file reached: /var/log/auth.log; Backoff now.
2017-10-01T16:03:30-04:00 DBG  End of file reached: /var/log/kern.log; Backoff now.
2017-10-01T16:03:32-04:00 DBG  End of file reached: /var/log/kern.log; Backoff now.
2017-10-01T16:03:32-04:00 DBG  End of file reached: /var/log/auth.log; Backoff now.
2017-10-01T16:03:34-04:00 DBG  Flushing spooler because of timeout. Events flushed: 15
2017-10-01T16:03:34-04:00 DBG  Publish: {
  "@timestamp": "2017-10-01T20:03:29.133Z",
  "beat": {
    "hostname": "ubuntu-gnome",
    "name": "ubuntu-gnome",
    "version": "5.6.2"
  },
  "input_type": "log",
  "message": "Oct  1 20:03:29 ubuntu-gnome sudo:     root : TTY=pts/0 ; PWD=/var/log/mybeat ; USER=root ; COMMAND=/usr/sbin/service filebeat restart",
  "offset": 18441,
  "source": "/var/log/auth.log",
  "type": "log"
}
2017-10-01T16:03:34-04:00 DBG  Publish: {
  "@timestamp": "2017-10-01T20:03:29.133Z",
  "beat": {
    "hostname": "ubuntu-gnome",
    "name": "ubuntu-gnome",
    "version": "5.6.2"
  },
  "input_type": "log",
  "message": "Oct  1 20:03:29 ubuntu-gnome sudo: pam_unix(sudo:session): session opened for user root by (uid=0)",
  "offset": 18540,
  "source": "/var/log/auth.log",
  "type": "log"
}
2017-10-01T16:03:34-04:00 DBG  Publish: {
  "@timestamp": "2017-10-01T20:03:29.133Z",
  "beat": {
    "hostname": "ubuntu-gnome",
    "name": "ubuntu-gnome",
    "version": "5.6.2"
  },
  "input_type": "log",
  "message": "Oct  1 20:03:29 ubuntu-gnome sudo: pam_unix(sudo:session): session closed for user root",
  "offset": 18628,
  "source": "/var/log/auth.log",
  "type": "log"
}
2017-10-01T16:03:34-04:00 DBG  Publish: {
  "@timestamp": "2017-10-01T20:03:29.133Z",
  "beat": {
    "hostname": "ubuntu-gnome",
    "name": "ubuntu-gnome",
    "version": "5.6.2"
  },
  "input_type": "log",
  "message": "Oct  1 20:03:28 ubuntu-gnome kernel: [ 1021.777817] haswell-pcm-audio haswell-pcm-audio: FW loaded, mailbox readback FW info: type 01, - version: 00.00, build 77, source commit id: 876ac6906f31a43b6772b23c7c983ce9dcb18a19",
  "offset": 116319,
  "source": "/var/log/kern.log",
  "type": "log"
}
2017-10-01T16:03:34-04:00 DBG  output worker: publish 4 events
2017-10-01T16:03:34-04:00 DBG  connect
2017-10-01T16:03:34-04:00 DBG  Try to publish 4 events to logstash with window size 10
2017-10-01T16:03:34-04:00 DBG  4 events out of 4 events sent to logstash. Continue sending
2017-10-01T16:03:34-04:00 DBG  send completed
2017-10-01T16:03:34-04:00 DBG  Events sent: 15
2017-10-01T16:03:34-04:00 DBG  Processing 15 events
2017-10-01T16:03:34-04:00 DBG  Registrar states cleaned up. Before: 9, After: 9
2017-10-01T16:03:34-04:00 DBG  Write registry file: /var/lib/filebeat/registry
2017-10-01T16:03:34-04:00 DBG  Registry file updated. 9 states written.
2017-10-01T16:03:36-04:00 DBG  End of file reached: /var/log/kern.log; Backoff now.
2017-10-01T16:03:36-04:00 DBG  End of file reached: /var/log/auth.log; Backoff now.
2017-10-01T16:03:39-04:00 DBG  Flushing spooler because of timeout. Events flushed: 0
2017-10-01T16:03:39-04:00 DBG  Run prospector

Everything else just states that Filebeat is checking for changes in the logs to push to Logstash.

Based on the logs, Filebeat has already read those files and isn't seeing any new data to pick up.
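As an aside, if you want Filebeat to re-ship those files from the beginning for testing, one common approach is to stop it and wipe the registry. This is destructive (Filebeat forgets all read offsets), so test machines only:

# Force Filebeat to re-read all files from offset 0
sudo service filebeat stop
sudo rm /var/lib/filebeat/registry
sudo service filebeat start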

What does the output from _cat/indices against Elasticsearch show?

I ran curl -u elastic:changeme http://IP_ADDR:9200/_cat/indices:

yellow open .watcher-history-6-2017.10.01     8Rdl1eLmSZy0eLuECUUe_Q 1 1  1530   0   1.1mb   1.1mb
yellow open .monitoring-kibana-6-2017.10.02   SEzSEdGlQAiE3whBUSgNqw 1 1   311   0 210.6kb 210.6kb
yellow open .monitoring-logstash-6-2017.10.02 ZZ3ggIJ_RTq7lJP9WP1aLA 1 1    56   0  48.9kb  48.9kb
yellow open .monitoring-kibana-6-2017.10.01   l_W_1T7hSeuDqzac32YQwQ 1 1  1787   0 827.9kb 827.9kb
yellow open filebeat-2017.10.01               U60vLr92RqSF01YhirHLVA 5 1 40685   0  11.3mb  11.3mb
yellow open .monitoring-es-6-2017.10.01       0GuCntrPTXGhs-1y0UYjhg 1 1 32956 222  21.9mb  21.9mb
yellow open .monitoring-alerts-6              zFVvWwmxTP6O1_xRTxol3g 1 1     1   0  12.5kb  12.5kb
yellow open winlogbeat-2017.09.24             crEtNclCT86lsXFgDYlR3g 5 1     1   0    20kb    20kb
yellow open .watcher-history-6-2017.10.02     Et1FcKRbR4m9H7dzWXLFsA 1 1   255   0   232kb   232kb
yellow open winlogbeat-2017.09.20             rgfBpgFcQ6OWnMoaN5k3hg 5 1    99   0 614.4kb 614.4kb
yellow open .watches                          fsDKAF71T9C913P7xA46Iw 1 1     4   0  39.5kb  39.5kb
yellow open winlogbeat-2017.10.01             PzkYc7F7RfemAAgfLjQU1w 5 1   112   0   1.1mb   1.1mb
yellow open .monitoring-es-6-2017.10.02       YxTUyqFuQFuDf2hJwx3WzQ 1 1  6584 164     5mb     5mb
yellow open winlogbeat-2017.09.30             VFHXWpF3RsW9AQ2A-xfy9g 5 1   518   0   2.1mb   2.1mb
yellow open .kibana                           JvByQiAoQtS8eolOFDmOQQ 1 1     1   0   3.9kb   3.9kb
yellow open filebeat-2017.09.30               oi4pL7mfQACZyITnwKXOpw 5 1 20420   0     6mb     6mb
yellow open .triggered_watches                YfapItCASSCTL2CBugQwag 1 1     0   0  91.3kb  91.3kb

Cool, so there is data in Elasticsearch; you can see that from the winlogbeat and filebeat indices.

Going back to Kibana, have you defined the index patterns? And if you have, is the time range in Discover set wide enough to cover the data?
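One way to double-check what is actually in the index, and how recent it is, is to pull the newest document directly:

# Fetch the most recent filebeat event and look at its @timestamp
curl -u elastic:changeme 'http://IP_ADDR:9200/filebeat-*/_search?size=1&sort=@timestamp:desc&pretty'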

I think that's the problem. I used filebeat-* and winlogbeat-* as patterns and nothing was found.


And Kibana is definitely pointing to the same Elasticsearch cluster as your curl commands from before?

Yup. I just disabled X-Pack security for Elasticsearch and everything is working. I added xpack.security.enabled: false to the bottom of elasticsearch.yml and now I'm able to define an index pattern.

Why would this stop me from receiving data? Authentication?

It wouldn't, because you defined the user and password in the Logstash config.
Is there anything in the Logstash logs about indexing failures due to bad authentication with Elasticsearch?

Not really. The error below was fixed before I disabled X-Pack security.

root@HOSTNAME:/var/log# grep -riE "fail|error|bad" logstash/ | grep -iE "cred|auth|user|pass"
logstash/logstash-plain-2017-09-30.log:[2017-09-30T21:00:12,721][ERROR][logstash.pipeline        ] Error registering plugin {:plugin=>"<LogStash::Inputs::Metrics collection_interval=>10, collection_timeout_interval=>600, id=>\"710374655e8d374e8e681c853584e11b845f7812-1\", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>\"plain_8b7ee491-5b6a-4dfa-b220-c8028c3c4fdd\", enable_metric=>true, charset=>\"UTF-8\">>", :error=>"You must set the password using the xpack.monitoring.elasticsearch.password in logstash.yml"}

I updated the configs in my first message to show what is currently in the files.

The problem was that I was using the built-in kibana user, which doesn't have permission to search any indices. I switched to the elastic user and the data appeared. Now I can start creating roles and assigning them to users.
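For anyone landing here later, a minimal sketch of that with the X-Pack 5.x security API; the beats_reader role and log_viewer user are made-up names, so adjust the indices and passwords to taste:

# Hypothetical role that can read the Beats indices
curl -u elastic:changeme -XPOST 'http://IP_ADDR:9200/_xpack/security/role/beats_reader' \
  -H 'Content-Type: application/json' -d '{
  "indices": [
    { "names": ["filebeat-*", "winlogbeat-*"], "privileges": ["read"] }
  ]
}'

# Hypothetical user with that role, plus the built-in kibana_user role for UI access
curl -u elastic:changeme -XPOST 'http://IP_ADDR:9200/_xpack/security/user/log_viewer' \
  -H 'Content-Type: application/json' -d '{
  "password": "CHANGE_ME",
  "roles": ["beats_reader", "kibana_user"]
}'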
